I've designed and built a prototype PIC based Sealed Lead Acid Battery
Tester, but I'm having confusing and conflicting results and would like some
input on whether the strategy I'm using is sensible ...
It's based around a constant-current load, controllable by the PIC chip.
The current can be set over a range of C/40, C/20, C/10, C/5, C/2 and C (where
C is the nominal Ah rating of the battery). The unit can monitor the
voltage of the battery. All the voltage measuring/current measuring and
current setting is appropriately calibrated, so any problems are down
to my approach or to duff assumptions on my part.
The basic idea is a "brute force" test of the capacity of the battery: Load
it up at C/n and time how long the battery takes to reach a discharged state.
In theory, it will take "n" hours for a 100% battery (aside from the obvious
that discharging faster will reduce apparent capacity etc.)
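(For the record, the sum at the end of a run is nothing cleverer than current
times time. A rough sketch of it in C; the function name and the milli-units
are just for illustration:)

    #include <stdint.h>

    /* Capacity actually delivered: constant load current (mA) times
       time spent on load before hitting the cut-off (seconds),
       converted to mAh.  E.g. 720 mA (C/10 on a 7.2 Ah battery) held
       for about 7.1 hours comes out at roughly 5100 mAh. */
    static uint32_t capacity_mah(uint32_t load_ma, uint32_t seconds_on_load)
    {
        /* A 32-bit intermediate is plenty here: it only overflows
           somewhere past ~1190 Ah of delivered charge. */
        return (load_ma * seconds_on_load) / 3600u;
    }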
From my research so far a fully charged SLA cell should read 2.16v at rest,
and 1.75v when discharged. So for a 12v battery, the useful capacity is
found between terminal voltages of 10.5v and 12.96v. Right?
The remaining capacity of the battery can be estimated from where the off-load
voltage sits in this range, which I'm assuming to be a linear relationship (it
nearly is...). So the first feature on the tester is a battery capacity
estimator, based on off-load voltage.
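In code terms the estimator is just a linear interpolation between those two
endpoints, something like this (a sketch; the name and the millivolt units are
for illustration only):

    #include <stdint.h>

    /* Estimate remaining charge (0-100 %) from the off-load terminal
       voltage, treating it as linear between "flat" (6 x 1.75 V =
       10.50 V) and "full" (6 x 2.16 V = 12.96 V).  Input in mV. */
    static uint8_t soc_from_offload_mv(uint16_t v_mv)
    {
        const uint16_t v_empty = 10500;
        const uint16_t v_full  = 12960;

        if (v_mv <= v_empty)
            return 0;
        if (v_mv >= v_full)
            return 100;
        return (uint8_t)(((uint32_t)(v_mv - v_empty) * 100u)
                         / (v_full - v_empty));
    }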
Then loading the battery at (some constant current) begins the discharge
process, and the voltage starts to fall. I time how long it takes for the
voltage to reach 10.5v, and then stop the timer, and compute the capacity
actually achieved. It should be that simple, however ...
I have a dilemma: When things say "don't discharge the battery below 10.5v"
do they mean on-load or off-load? There is a difference!
I noticed that loading the battery, especially at higher currents, causes
the terminal voltage to fall, due to the internal resistance of the battery.
So I added a feature to try and measure this resistance at the start of the
discharge process. I did this by taking the off-load voltage, the on-load
voltage at C/n, and working out the voltage being dropped across that
resistance. This gives me a figure of <100 mOhm on new batteries, and
anything up to 4-5 Ohms on older "suspect" batteries. It also gives me a
"delta voltage" to adjust the 10.5v limit figure downward by.
My tester currently records two capacities. One: Discharge to 10.5v on load.
Two: Keep discharging until 10.5v MINUS the voltage-delta calculated above.
Obviously capacity 2 is always >= capacity 1. This seems to give reasonable
figures, with capacities that are <= nominal.
The snag is that when this test terminates, the battery really *really*
should be flat. As I've compensated for the voltage-drop across the internal
resistance, I should get 10.5v when the load is taken off, and the battery
is empty, right?
Wrong.
The voltage bobs back up to an embarrassing "You have 60% left" level. That
shouldn't happen: I flattened it to 10.5v on-load. I flattened it further
to 10.5v minus the voltage across the internal resistance ... how can it
still have power in there?
Trying to load the battery up further (repeating the discharge) just causes
it to immediately terminate as the voltage is too low.
Is there anything particularly wrong with this strategy?
Should I even be compensating for the voltage drop across the internal
resistance? I know it's meant to be low, of the order of mOhms, but even that
can cause a significant drop at e.g. a 30A test current on large batteries.
So I think yes, I should be.
And does the internal resistance significantly change, meaning I should be
re-testing it at intervals? I thought the internal resistance was a function
of how sulphated the plates were, and would only vary if the battery was
deep discharged and left, or charged and rejuvenated.
Why does there appear to be useful charge left in the battery (going by the
terminal voltage) and yet there ... isn't!?
If the discharge is genuinely flattening the battery, yet the terminal
voltage comes back up so high, how can "battery fuel gauge meters" ever
reliably indicate the state of charge of the battery? I'm beginning to
think that they can't!
Maybe I should take the load off every so often, wait for the voltage to
bob back up and settle, and if it's > 10.5v, go on a bit longer?
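Something along these lines, perhaps (only a sketch of the idea: the
15-minute rest, the recovery threshold and the helper routines are all
placeholders, not what's in the firmware today):

    #include <stdint.h>

    /* Hypothetical helpers assumed to exist elsewhere in the firmware. */
    void     load_on(uint16_t load_ma);
    void     load_off(void);
    uint16_t read_battery_mv(void);
    void     delay_minutes(uint16_t m);

    /* Discharge, but pause every time the on-load voltage hits the
       cut-off: rest the battery, and only stop for good if the
       recovered off-load voltage also stays below 10.5 V. */
    static void discharge_with_rest(uint16_t load_ma, uint16_t cutoff_mv)
    {
        for (;;) {
            load_on(load_ma);
            while (read_battery_mv() > cutoff_mv)
                ;                       /* keep discharging */
            load_off();
            delay_minutes(15);          /* let the voltage recover */
            if (read_battery_mv() <= 10500u)
                break;                  /* genuinely flat: stop */
        }
    }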
Here's an example of the kind of readings I'm getting. Each capacity is
stated twice, once for "discharge to 10.5v" and once for "discharge
below 10.5v by dV":
Battery: Camden Europa Plus, 20 months old, UPS battery. 7.2Ah nominal.
Internal Res: 94-126 mOhm (tested at 7.2 Amps)
Measured Capacity @ C/1 rate: 2.44Ah/3.16Ah
Measured Capacity @ C/10 rate: 5.11Ah/5.18Ah
Est. Capacity after C/1 test: 67% (UPS software claims 63%)
Est. Capacity after C/10 test: 50% (UPS software says 58%)
Spookily, I seem to have drained 33% out of the battery (2.44Ah) and 67%
is estimated to be left, which means I haven't flattened the battery yet.
Battery: Yuasa NP1512, 5+ years old, 15Ah nominal
Internal Res: 76 mOhm (tested at 15 Amps)
Measured Capacity @ C/1 rate: 9.9Ah/10.5Ah
Measured Capacity @ C/10 rate: 15.15Ah/15.0Ah
Est. Capacity after C/10 test: 18%
Battery: Sonnenschein Dryfit A300, 5+ years old, 3Ah nominal
Internal Res: 270 mOhm (tested at 3 Amps)
Measured Capacity @ C/1 rate: 1.05Ah/1.08Ah
Measured Capacity @ C/10 rate: 2.7Ah/2.7Ah
Est. Capacity after C/1 test: 50%
Est. Capacity after C/10 test: 60%
What have I screwed up?
Mike.
--
--------------------------------------+------------------------------------
Mike Brown: mjb[at]pootle.demon.co.uk | http://www.pootle.demon.co.uk/