NerdKits - electronics education for a digital generation


Microcontroller Programming » Reading battery voltage using ADC

January 02, 2010
by NK_EK

Hi,

Maybe a stupid question, but how would I go about reading the voltage of a battery (9v or 12v) using the ADC?

I know the ADC reads in 1024 steps, and in the tempsensor project the reference voltage is set as 5000. Would I now use 9000 instead for a 9V battery and 12000 for a 12V battery, and will the ADC be able to read from a 9V or 12V source?

If so, I think the connection should be as follows:

A)  GND -- 1 Ohm Sense Resistor -----o------- Battery
                                     |
                                    ADC

OR

B)  GND -------o--- 1 Ohm Sense Resistor ---- Battery
               |
              ADC

I want to use this reading to decide if and when to charge the battery (or batteries).

Thanks in advance.

Ernest

January 02, 2010
by NK_EK

Me again,

I read in the datasheet (section 23, from page 245) that you can supply an external reference voltage (AREF), but what I can't find is whether this can be as high as 12V. I cannot see anything to the contrary.

So, if this is the case, the ADC would read values anywhere between 0 (for GND) and 1023 (for AREF, 12V in this case).

Any ideas or pokes in the right direction?

Thanks

Ernest

January 02, 2010
by mongo

Personally, I would leave the reference part alone. Check the battery voltage across a voltage divider of 4.7K and 3.9K (two standard values to keep it simple). Use the math in the program to tune the scaling and multiplication factors.

The only reason I would do that is so that no external voltages are higher than the operating voltage.

They would be connected like this:

Battery + --------/4.7K/-----ADC-----/3.9K/------------Ground

January 02, 2010
by Rick_S

You should not connect any voltage larger than 5V to the ADC of the MCU. Section 23.2 of the datasheet states that "The ADC has a separate analog supply voltage pin, AVCC. AVCC must not differ more than ±0.3V from VCC."

Since the maximum supply voltage of the MCU is 5.5V (typically 5V) and AVCC must not differ from VCC by more than ±0.3V, you should not try 9V or 12V; that would most likely damage your MCU in short order.

You can check a higher voltage, though. To do this, you set up a voltage divider with resistors and feed the divided voltage into the ADC. The math becomes a bit more complicated, but it will work. Depending on the voltage being tested, you have to choose resistor values that keep the voltage fed into the ADC within range.

An example:

GND------/\/\/\/\------+------/\/\/\/\------9VDC
            10K        |         10K
                     4.5VDC
With equal resistors, the center point will be half the voltage. If you increase the value of the resistor on the ground side, the tap voltage will increase; if you increase the value of the resistor on the 9VDC side, the tap voltage will decrease. Ohm's law will help determine what the voltage at that point will be.

Sorry mongo, I didn't fully read what you said before I typed... I agree with mongo, leave the reference part alone and use a voltage divider circuit. ;)

Rick

January 02, 2010
by mjswan

I'm also trying to read battery voltage for replacement/recharging purposes. I'm running a 9V battery through the voltage regulator to produce a 5V reference voltage. If I feed the 5V into the ADC, I get a very stable reading of about 4950mV, as I'd expect.

If I use a voltage divider of 1K and 10K like this (as suggested above)

 9V Battery ----/10K/---ADC---/1K/---Ground

I expect to get 1/11 of the 9V battery's true voltage (Vout = (R2/(R1+R2)) * Vin).

However, what I observe instead is that over time, the ADC is returning smaller and smaller values for Vout. Immediately after powering up, I observe 618mV at the ADC which is 6.8V. After waiting about five minutes, I observe 576mV at the ADC which is 6.3V. My multimeter indicates that the battery voltage is actually 7.78V.

So, two questions:

  1. Why am I seeing the measured voltage decrease over such a short time? I'd expect to see a much more stable Vout value at the ADC. Could this be because I'm using a partially depleted battery?

  2. Why am I seeing a difference between what the ADC reports (6.3V) and what the multimeter reports (7.78V)?

January 02, 2010
by mongo

Try something out to see if the swing actually goes from 0 to 5V.

Temporarily ground the ADC input and see if it reads zero and stays there. Then go the other way and put a known voltage on it and again, watch for stability and linearity.

Chances are, there is some drift going on. That was why I went with the 4.7K and 3.9K resistors. It gives you the full range (maybe just a tiny bit more) and is easier to read with less internal error.

Another thing to remember is the input impedance of the measuring device. It will act as a parallel resistance between the ADC and ground, which can cause error in the reading through the additional loading.

Dave.

January 02, 2010
by mongo

OK, I just tried something that ought to work nicely...

Instead of the resistors, use a 10K potentiometer and a 15K resistor in series. Connect the wiper of the pot to the ADC input, the low side of the pot to ground, and the free end of the resistor to the battery under test.

Change the math in the program where it calls for 5000/1024 to 10000/1024.

Adjust the pot to read as needed on the display to match the actual battery voltage.

Battery------/15K/------/pot/--------Gnd

January 02, 2010
by mjswan

Thanks for your replies Dave.

I did ground the ADC input and read a stable 7.4mV, which is close to zero, as I'd expect. I previously connected the 5V out of the voltage regulator into the ADC and read a stable 4952.6mV which is also what I expected.

I naively hadn't measured the battery voltage when under a load, that is, when the NerdKit was powered up. Of course, powering up will cause a drop in the voltage, which I then measured using the multimeter. When I compared the multimeter voltage to the ADC-calculated voltage through the voltage divider, they were identical to one significant digit. The drift in ADC-calculated voltage that I'd experienced appears to be insignificant over time.

January 03, 2010
by NK_EK

Hi guys,

As always, thank you for the quick replies & suggestions.

I'm using Rick's suggestion of a voltage divider with two 10K resistors in series and it seems to be working fine. Just one odd thing: the reading I get from the ADC is 7.8V (using the math below with an ADC reading of 883), while the multimeter shows 8.6V (even while connected to the ADC).

My circuit looks like this (including multimeter leads - MMB (black) & MMR (red) & Battery GND connected to same ground as MCU):

Common GND -----MMB-----/\10K/\-----o-----/\10K/\-----MMR-----+9V (Battery)
                                    |
                                   ADC

My math is as follows:

#define BATT_FULL 9000.0

double sampleBatteryVolts (uint16_t sample) {

    // conversion ratio: (full battery voltage / 1024 steps),
    // where BATT_FULL = 9000mV and there are 1000mV per 1V
    return (sample * ((BATT_FULL / 1024.0) / 1000.0));

}

Using this function, if the battery is fully charged, the ADC reading should be at or close to 1024. If I use that, I should be able to get to 9V on the dot:

     (1024 * ((9000.0 / 1024.0) / 1000.0));
   = (1024 * (8.7890625 / 1000.0));
   = (1024 * 0.0087890625);
   = 9

So, the math seems to work (works for 4.5V or ADC reading of 512 as well).

I haven't tested the battery under any other load, so could that be the difference that I'm getting? The MCU and rest of the circuit runs off a wall transformer through the Voltage Regulator.

Sorry for the n00b questions, but as you all say "the only stupid question is the one not asked."

Thanks for the help so far and your patience.

Ernest

January 03, 2010
by Rick_S
  1. Make sure the circuit is powered by 5V.
  2. Do not allow more than 10V into your voltage divider input.
  3. Tie your AREF to 5V and use the external reference.
  4. Tie all your grounds together (the 5V power for the MCU as well as your test voltage). This will prevent ground loops that can put higher voltages where you don't want them.
  5. Make sure both resistors in the voltage divider are as exactly equal as possible (check with a meter).

Since the output of the voltage divider is 1/2 the input voltage, your formula should work like this:

Measured_Voltage = ((5/1024)*(2*ADC))

Here (5/1024) is the AREF voltage divided by the total number of ADC steps; multiply that by the ADC reading, then by 2 to double it (because the voltage divider circuit we set up halves the original voltage).

I haven't tried that, but I think it should work out for zero to ten volts. Again, don't put more than 10V into the circuit, as it could damage your microcontroller.

Rick

January 03, 2010
by NK_EK

Rick,

Thanks again. I've included a schematic of my setup below:

Battery Charger Circuit


As you'll see, I added a diode to the GND side of the voltage divider. This made a huge difference in that my reading on the ADC is now very close to the reading on the multimeter (using my original math). Not quite sure why this makes a difference yet, but it seems to be working.

I also added diodes between the MCU and Q1 (for protection of the MCU) and between Q1 and the Battery (so that there will not be any current flowing back into the charging circuit when Q1 is 'off' - maybe overkill, but better safe than sorry).

The rest of the setup on the MCU is standard, i.e. I haven't played around with AREF, etc.

Hope it makes sense?

Thanks

Ernest

January 05, 2010
by mrobbins
(NerdKits Staff)


Hi NK_EK,

There's a lot of great discussions going on in this thread! I just want to jump in and clarify something:

Two messages ago, you said that you had an ADC reading of 883 while your math came out to 7.8V but the multimeter read 8.6V. My math suggests that 883 really does indicate about 8.6V:

883 is the number of ADC steps it measured, where 0 is 0 volts, and 1024 would represent AREF=5.0 volts. So the voltage being measured at the ADC input is about (883/1024)*5.0 = 4.31 volts. Because of the voltage divider (10K and 10K), the "true" battery voltage being measured is double this, or 8.62 volts. Exactly what you said you measured with the multimeter!

The BATT_FULL constant shouldn't really enter your code (at least at this point) -- to determine the voltage being measured at the ADC, all you need to know is the reference voltage, the number of total ADC steps, and the voltage divider ratio: Vbat = ((883/1024)*5.0)/0.5. (When implementing this in code, be careful about not doing integer division.)

Hope that helps!

Mike

January 05, 2010
by Rick_S

Another reason the diode on the ground side made your calculation more accurate is that the diode provides a voltage drop across it. With the diode on the ground side and not the other, the ratio is no longer a 50/50 split; the voltage going to the ADC sits above the midpoint. That makes your calculation more accurate, since your formula assumes a full 1024 reading on the ADC, which only happens when the ADC sees 5V.

With the divider left as is, my formula or Mike's variation of it should work. However, with your circuit modification, your formula is probably very close. The only thing to be cautious about is that if you shift from the midpoint too much, you risk applying too much voltage to your ADC. The datasheet for your diode should tell you what voltage drop to expect across it; then you should be able to use Ohm's law to calculate the drop across the resistors.

Rick

January 06, 2010
by NK_EK

Mike & Rick,

Thanks!

It's slowly but surely beginning to make more sense now :-).

I'll take the diode on the GND side out and use your calculations. I think what threw me was that when you use an ADC reading of 1023 in the calculations, you end up with a voltage of almost 10V (9.99 to be exact), but when I thought about it a bit, I remembered that most 9V batteries will in fact be anywhere between 9 - 10V when fully charged.

Rick, I also forgot about the voltage drop across the diode, so thanks for the reminder.

Now, my next question (yip, you're not getting rid of me that easy ;-) :

Using the 2N7000 MOSFET as a 'switch' to start or stop charging the battery (a pin on the MCU drives the Gate, the +12V supply is on the Drain, and the battery (+9V) is on the Source), I do not get the full 12V through the MOSFET. I get +12V between the Drain and GND, but less than half that (can't remember exactly, I'm at work at the moment) between the Source and GND.

Am I making a n00b mistake again, or have I missed something somewhere? In the meantime, I will go and read up again on MOSFETs, their specs, etc.

Maybe I'll be able to answer myself sometime soon ;-)

Thanks for the help guys!

Ernest

July 20, 2010
by lsoltmann

A few posts ago Rick stated that you should not allow more than 10V into your voltage divider circuit. I'm guessing this is because when you divide 12.6 by 2 you get 6.3, which exceeds the maximum input voltage of the MCU. I'm trying to measure the voltage of a battery that will have a maximum of 12.6V (3S LiPo). Is there another way to measure it with the MCU other than a voltage divider, or is there a way to modify the voltage divider to make it work?
