
Re: (ET) Battery charger question



Hi,
I’ve been charging my Elec-Traks with the built-in charger since 1980, and I’m NOT a battery expert. A timer failure 15-20 years ago prompted me to replace the timers with Landis controllers. There have been pro/con discussions about them over the years, but I think they work great. I have mine set to 41V (38V is the recommended setting) and average 7 years of functional use out of a pack (typically East Penn). I add water about twice a year. Now, before people say that I should get more life out of them, I cut about 4 acres of lawn a week (for 5-6 months of the year) and the battery pack gets abused.

The built-in charger may be a bit crude (so are the batteries!), but it gives me the convenience to plug in anywhere. Some of the lawn I cut is the neighbor’s vineyard. Being able to plug in on location, if needed, is handy. Maybe you could do the same by mounting a smart charger in the tractor, but if you have multiple tractors, the chargers could get expensive.

The Landis takes the guesswork out of charging a pack. No overcharging or undercharging a pack. No checking battery voltages to see if you set the timer right. Just plug it in and forget it. My tractors are typically left plugged in all season, and I let the Landis do its thing. However, when they sit unused for extended periods of time (like my snowblower tractor), I usually unplug them after a charge. About every two months I’ll plug it in again to bring the pack back up and then unplug it again. That is my charging maintenance mode. It also takes away the worry of overcharging due to a charger failure.

It may not be perfect, but it works pretty well for me.

Regards,

Dean A. Stuckmann
5432 County Road U
Newton, WI 53063



On Oct 9, 2021, at 1:15 AM, David Roden <etpost drmm net> wrote:

The GE charger is actually not an awful choice for the flooded golf car
batteries that most people use.  It has some voltage regulation because
it's a ferroresonant design.  Left to its own devices, it would probably
overcharge, but the timer (if it's working) puts something of an upper
limit on that overcharging.  

Think about it.  Golf car batteries typically are rated for a life of
around 700 80% DOD cycles.  If you use your ET every week, that's about 14
years.  From what I hear, most folks get around 10 years.  So - not awful.

But suppose your GE charger is kaputt, or you just want something easier on
your battery.

Below is a rather long article on smart charging, a piece that I wrote for
the EVDL about 12-15 years ago.  Lead batteries haven't changed much in
that time. :-)  It might help you choose a smart charger for your ET.

One thing I want to underscore in it is the importance of ENOUGH CURRENT.  
You might be tempted to buy a small charger because it's cheap.  Don't do
it!  As explained below, your charger should be able to supply 20-25 amps.  
At the very least, don't go smaller than 10 amps.

Besides, charging at 5 amps takes about a day and a half just to get to 80%
charged, and easily another day or so to finish.  If you have work to do
with your ET, you may not have that kind of time.
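
(To put numbers on that: 80% of a typical 220 Ah golf car pack is 176 Ah, and
176 Ah at 5 A is about 35 hours, before the slow absorption and finish phases
even begin.)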

----

Smart Charger Algorithms - How smart chargers "think"

Why You Need a Smart Charger: As long as you're using simple golf car
batteries, actually, you don't. Golf car batteries are relatively easy to
charge, and fairly tolerant of a few charging errors now and then.

But you might WANT one. Charging manually takes a fair bit of work and
attention. These days, practically every other rechargeable gadget has an
automatic charger, or at least one we don't have to pay much attention to.
So most of us just aren't used to manual charging.

What a Smart Charger Does: It provides basic battery care and feeding,
automated, so you don't have to sweat it. It might have a microprocessor
"brain," or discrete logic, or just linear circuits; but in some way it
tries to figure out the battery's current state of charge, and how to get
it to a full charge as quickly and safely as possible. The rules it follows
in doing that we call a charging algorithm.

Alphabet soup: Probably the most common algorithms in smart chargers are IU
and its variants, IUI and IUU. Here each letter represents one phase or
period of charging, so this is two-phase or three-phase charging.
The I stands for constant current, and the U stands for constant voltage.
I'll explain those terms in a moment.

Charging Phase One: The first phase of the charge is what we call bulk
charging.

Initial Charging Rate: Theoretically, as long as your charger is mindful of
the battery's temperature, in this phase it can stuff in the electrons
about as fast as the battery can dish them out when you discharge it. This
can be in the hundreds of amps for golf car batteries, and some AGM
batteries can handle charging currents in four figures!  But the rule of
thumb on initial charge rate is somewhere between C20 / 10 and C20 / 4.

That looks like some kind of code, doesn't it? C20 is the battery's 20-hour
amp-hour rating -- that is, how many amp hours it can produce if you
discharge it over 20 hours' time. (The faster you discharge a lead battery,
the fewer amp-hours you can get from it. Most battery manufacturers specify
the battery's capacity at two or more discharge rates.) Amp-hours
are not the same as amps, but they're a handy way to express the size of
the battery and its current requirements, so in this case, we use them that
way. Thus for your typical 220 amp-hour (20 hour rate) golf car battery,
you want to use an initial charging rate between C20 / 10 (22 amps) and C20
/ 4 (55 amps).
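
Just to put that rule of thumb in concrete form, here is a minimal Python
sketch (the function name is made up, and 220 Ah is simply the example rating;
substitute your own battery's 20-hour figure):

def initial_charge_current_range(c20_amp_hours):
    # Rule-of-thumb bulk charging current for a lead battery:
    # somewhere between C20/10 and C20/4 amps.
    return c20_amp_hours / 10.0, c20_amp_hours / 4.0

low, high = initial_charge_current_range(220)   # typical golf car battery
print(f"bulk charge at roughly {low:.0f} to {high:.0f} amps")   # 22 to 55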

Why Initial Rate Matters: Lead batteries aren't like the nickel cadmium and
nickel metal hydride batteries you use in flashlights and cameras. Those
little batteries appreciate being charged slowly, and they'll last longer
(for more charging cycles) when they're treated that way. However, lead
batteries actually LIKE and NEED an initial charging rate of at least C20 /
10, some even more.

I'm just a hobbyist, not an electrochemist, so I don't know the
electrochemical reasons behind this. What I do know is that lead
batteries lose capacity (wear out) faster if they're not given this high-
current jolt for at least a few minutes at the start of the charge cycle.

Hawker Genesis AGM batteries from the 1990s were poster children for this.
They could lose half their capacity in under a year without it.  However,
all lead batteries benefit from high initial current. The engineers know
what they're doing when they recommend C20 / 10 as a minimum.

Constant current: When a battery is flat, its voltage is low. This means it
can take (and wants) a huge charging current. As it charges, its voltage
rises, so a fixed-voltage charger's current falls. This slows down the
charge.

But one of your smart charger's missions in life is to charge the battery
as fast as it can. To do this, it sets its own voltage so the charging
current is as high as it and the battery can tolerate. Then, as the
battery's voltage rises, the charger keeps bumping up its own voltage so
the charging current stays high until the last possible minute (we'll see
when that is soon). This process is called constant current charging.
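
If it helps to see the idea in code, here is a conceptual Python sketch (not
any real charger's firmware; the charger object and its measure_current() and
set_output_voltage() calls are hypothetical):

TARGET_CURRENT_A = 22.0   # e.g. C20/10 for a 220 Ah battery
VOLTAGE_STEP_V = 0.05     # small correction each control cycle

def regulate_constant_current(charger):
    # Nudge the output voltage up or down so the measured charging
    # current stays pinned at the target value.
    current = charger.measure_current()
    if current < TARGET_CURRENT_A:
        charger.set_output_voltage(charger.output_voltage + VOLTAGE_STEP_V)
    elif current > TARGET_CURRENT_A:
        charger.set_output_voltage(charger.output_voltage - VOLTAGE_STEP_V)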

During the bulk charging phase, nearly all of the charging energy goes into
the charging reaction. As the bulk phase proceeds, with the current held
constant, the battery's voltage rises. When the battery is about 80%
charged, it reaches the gassing voltage. From here on, more and more of the
charging energy goes into heating the battery and dissociating the
electrolyte's water into hydrogen and oxygen. We'll soon see why this heat
matters.

Gassing voltage depends on the battery's design -- the composition of the
positive and negative grids, and the chemical makeup of the electrolyte.
For a typical flooded golf car battery, it's 2.4 volts per cell (VPC) at
25° Celsius. Other battery types will vary from 2.35 VPC to 2.5 VPC. Check
your batteries' datasheet.

Temperature Compensation: A good charger will adjust this voltage, and all
the voltages that follow below, for battery temperatures significantly
higher or lower than 25° Celsius. You get the adjustment factor from your
battery manufacturer, but a typical one is -3 mV or -4 mV per cell per
Celsius degree deviation from 25° C.
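
As a worked example, here is a small Python sketch (the 2.4 VPC gassing
voltage and the -4 mV per cell per degree figure are the example values from
this article; the 18-cell string assumes a 36 V pack like an Elec-Trak's, so
plug in your own battery's numbers):

def compensated_gassing_voltage(cells=18, vpc=2.40, temp_c=25.0,
                                tc_mv_per_cell_per_deg=-4.0):
    # Temperature-compensated gassing voltage for a series string of cells.
    per_cell = vpc + (tc_mv_per_cell_per_deg / 1000.0) * (temp_c - 25.0)
    return cells * per_cell

print(compensated_gassing_voltage())              # 25 C -> 43.2 V
print(compensated_gassing_voltage(temp_c=35.0))   # 10 C warmer -> about 42.5 V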

This isn't ambient (air) temperature, but rather battery temperature. The
ideal way to read it would be to immerse a temperature sensor in the
battery's electrolyte. However, the usual way is to bury a sensor between
two batteries in the middle of the pack. I've also heard of attaching a
sensor to a battery terminal post.

Temperature compensation (TC) is more important for valve regulated (AGM
and gel) batteries than for flooded ones, but if you have it available on
your charger, there's no reason not to use it with flooded batteries, too.

Charging Phase Two: Reaching the gassing voltage ends the bulk phase and
begins the absorption phase. The battery is now about 80% charged.
Depending on the charger's algorithm, the remaining 20% may take about as
long as the first 80% did!

In the absorption phase of an IU charging algorithm, the charger holds the
voltage steady (constant voltage charging) at the gassing voltage. Remember
how the voltage rose when we held the current steady? Now the charger holds
the voltage steady, so the charging current falls. The charger sits tight
until the charging current has declined to about C20 / 50. For our example
220 Ah golf car battery, that would be 4.4 amps.
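
In code, that cutoff is just a current threshold (a sketch, again using the
220 Ah example rating):

def absorption_finished(charging_current_a, c20_amp_hours=220):
    # Constant voltage (absorption) phase ends when the current tapers
    # down to about C20/50 -- 4.4 amps for a 220 Ah battery.
    return charging_current_a <= c20_amp_hours / 50.0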

At this point the battery is essentially full. The charger can shut off
now, or you can pull the plug manually.

But maybe you shouldn't, at least not every time. That's because although
the battery is full, some of its cells are a little fuller than others.

Cell Imbalance: The cells in a battery vary a bit in how fast they charge.
Part of this is down to slight differences in their manufacturing
tolerances.

A larger factor in cell imbalance is that the cells vary in temperature,
sometimes a lot. In each battery, the inner cell or cells will usually be
warmer than the ones toward the outside of the battery. In a large battery
pack, the inside batteries will also be warmer than the outside ones. So it
isn't unusual for cell temperatures to differ by 10 or more degrees.

Temperature affects a cell's fully charged voltage, and also its charge
efficiency. See the problem?

In the short run, cell imbalance isn't a big deal. But over time, as you
charge and discharge the battery, the differences get wider and wider.
Eventually the lowest cells can end up chronically undercharged. That will
limit how much energy you can get from the battery, not just because of the
cells' lower charge, but because chronic undercharging causes permanent
loss of battery capacity.

You might think that the way to fix this is to charge every cell
individually. In fact, that's how many lithium EV batteries work. The cells
are charged in series, as with any other battery, but each cell has its own
bypass regulator. When the cell is full, the regulator diverts the charging
current around the cell, so the charged cell can kick back while the rest of
the cells finish charging.

Most road EV owners with long strings of AGM or gel batteries use similar
regulators. However, they can't put a regulator on each cell, because
modern lead batteries aren't built so you can access the individual cells.
Thus they can only regulate the charge to each individual battery. That's
better than nothing, but they still need to somehow balance the individual
cells in each battery.

Equalization (Charging Phase Three): Your charger fixes cell imbalance with
equalization -- deliberately, but carefully, overcharging the battery. The
fully charged cells dissipate the wasted energy as heat and as
gassing, and the cells that aren't yet at 100% get topped off.

To do this, instead of shutting off at the end of the constant voltage
absorption phase, the charger switches back to constant current charging.
This time, it uses much lower current -- the same C20 / 50 that signaled
the end of the absorption phase. Now our IU profile has become an IUI
profile. The charger holds that C20 / 50 current steady until the voltage
rises to 2.5 VPC, temperature compensated.

Or not; some engineers say to go to 2.55 VPC. Some say to hold C20 / 50 for
2-4 hours, no matter how high the voltage goes. Some suggest constant
voltage float charging (2.3 VPC) with no limit instead, which is an IUU
profile. So as you can see, some opinion is involved here. If your charger
has configurable equalization options, you might want to ask your battery
manufacturer which one is best for their batteries.
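
To tie the three phases together, here is a rough Python sketch of an IUI
profile as described above (an illustration of the logic, not any particular
charger's firmware; the charger object and its hold_current(), hold_voltage(),
measure_voltage(), measure_current(), and stop() calls are all hypothetical,
the setpoints are the flooded golf car values used in this article, and the
18-cell string assumes a 36 V pack; temperature compensation and safety limits
are omitted for brevity):

import time

def iui_charge(charger, c20_ah=220, cells=18,
               gassing_vpc=2.40, equalize_vpc=2.50):
    bulk_current = c20_ah / 10.0      # C20/10, e.g. 22 A
    finish_current = c20_ah / 50.0    # C20/50, e.g. 4.4 A

    # Phase 1 (I): constant current until the pack hits gassing voltage
    # (roughly 80% charged).
    charger.hold_current(bulk_current)
    while charger.measure_voltage() < cells * gassing_vpc:
        time.sleep(60)

    # Phase 2 (U): constant voltage until the current tapers to C20/50
    # (essentially full).
    charger.hold_voltage(cells * gassing_vpc)
    while charger.measure_current() > finish_current:
        time.sleep(60)

    # Phase 3 (I): low constant current until about 2.5 VPC,
    # to equalize the cells.
    charger.hold_current(finish_current)
    while charger.measure_voltage() < cells * equalize_vpc:
        time.sleep(60)

    charger.stop()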

On the other hand, you might not want to ask the manufacturer how often
to equalize. Some of them -- US Battery is one -- will tell you to equalize
on every single charge. That does give you the absolute maximum amount of
stored energy, and thus the maximum range. However, overcharging stresses a
battery, so too-frequent equalization is about as bad for your battery as
too-infrequent equalization. My recommendation is to equalize only when
it's necessary.

How often is that? Ah, there's the rub. Small differences in cell states of
charge (SOC) can be hard to detect until they become big differences.
Unfortunately, although I know of one high quality (large, expensive)
industrial battery charger that keeps track and equalizes every 7 cycles,
most smart chargers aren't all that smart about equalization. Typically
they either don't equalize at all, or they equalize every time.

If your charger is a never-equalizer, you might equalize your battery every
5 to 15 charges by restarting the charger after it's shut off. If yours is
an always-equalizer, you could try to pull the plug on it when it gets to
the equalization phase most of the time. The problem with these schemes is
that now you're doing some manual charging. You paid good money for a smart
charger so you wouldn't have to do that, no?

Voltage-Based Charging Problems: Equalization strategy isn't the only
weakness in smart chargers. Straight IU and IUI chargers have another one
that not many charger and battery makers own up to.

A battery is a little like the water heater in your house. As your water
heater ages, it starts to build up sediment in the bottom of the tank, so
it holds less water. Well, as a battery ages, it builds up sediment too.
This is not a joke; it really happens: the active material in the grids
crystallizes, falls off, and sinks to the bottom of the battery. With less
active material in the grids, the battery's capacity to hold energy
declines. That also means that its fully-charged voltage declines.

If your charger is programmed for a new battery's fully-charged voltage, it
may overcharge an old battery. In fact, an old battery might never reach
that magical 2.5 VPC above, so the equalization phase can go on too long.
As the battery ages more, eventually it might not even reach 2.4 VPC. If
that happens, the charger can get stuck in the bulk phase. Your battery
will get severely overcharged, aging it even faster.

OTOH, another symptom of a battery in its golden years is that its internal
resistance increases. This can cause the exact opposite problem -- the on-
charge voltage rises very fast, and that fools the charger into stopping
the charge too early. Then the battery is undercharged.

One way to help this situation is to add a safety time or amp-hour limit
to your charger. I'll talk more about this later.

DV/DT and DI/DT Charging: These are a more elegant solution to battery
aging. DV/DT and DI/DT stand for derivative of voltage (or current) with
respect to time. If you took calculus in college, this probably brings back
memories. You might remember that derivatives calculate the slope of a
curve at a given point. In this case, the curves are the rising voltage and
falling current in a charge cycle.

DV/DT charging takes advantage of the fact that as a battery charges at a
regulated (constant) current, the voltage rise slows and eventually stops,
regardless of voltage. DI/DT charging is based on the similar idea that at
a constant voltage, the decrease in current slows and eventually stops.

During the constant current bulk charging phase, in addition to watching
for gassing voltage, the charger's brain watches for the voltage rise to
slow down. A typical value to watch for is between 2.5 mV and 5 mV per hour
per cell.

During the constant voltage absorption phase, in addition to watching for
the current to fall to C20/50, it watches for the current decrease to slow
down. A typical value here is between 0.2 and 0.4 amps per hour.

The battery makers usually specify DV/DT and DI/DT in hour increments (if
they specify them at all). But IMO it's better to sample voltage or current
more often than every hour. Also, the charger should look for the
specification (divided by samples per hour of course) to be met in 2 or 3
consecutive samples, or for the delta (change) to fall to some much smaller
amount. For example, Lester's Lestronic DV/DT chargers check the voltage
slope every 15 minutes.
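
Here is a sketch of that sampling logic in Python (the 2.5-5 mV per hour per
cell threshold and the 15-minute interval come from the text above; the
read_pack_voltage() function, the 18-cell pack, and the consecutive-sample
count are assumptions for illustration):

import time

SAMPLE_MINUTES = 15
SAMPLES_PER_HOUR = 60 // SAMPLE_MINUTES
DVDT_LIMIT_MV_PER_HOUR_PER_CELL = 5.0   # upper end of the 2.5-5 mV range
CONSECUTIVE_SLOW_SAMPLES = 3

def wait_for_dvdt_plateau(read_pack_voltage, cells=18):
    # Return once the pack voltage rise has slowed below the dV/dt limit
    # for several consecutive samples (end of the constant current phase).
    limit_per_sample_v = (DVDT_LIMIT_MV_PER_HOUR_PER_CELL / 1000.0
                          * cells / SAMPLES_PER_HOUR)
    slow = 0
    last_v = read_pack_voltage()
    while slow < CONSECUTIVE_SLOW_SAMPLES:
        time.sleep(SAMPLE_MINUTES * 60)
        v = read_pack_voltage()
        slow = slow + 1 if (v - last_v) < limit_per_sample_v else 0
        last_v = v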

Safety Limits: Whether the charger uses DV/DT or not, it should also have
one or more backup methods that will halt the charge in case of really
weird or dangerous situations.

One of the pesky qualities of lead batteries is that their fully charged
voltage is lower at higher temperatures. This is called a negative
temperature coefficient. It explains why you should use temperature
compensation, but it's also a matter of safety.

I mentioned above that once a battery charger goes beyond the battery's
gassing voltage at 80% charged, an increasing amount of the charging energy
goes into heating the battery and generating hydrogen and oxygen. Well,
when the battery reaches 100% charged, all the energy goes to waste this
way.

It's the heat that causes trouble. As the battery gets hotter, its voltage
falls. That makes a constant voltage charger send more current through the
battery, which heats it up even more, which increases the current more ...
and before you know it, you have thermal runaway.

At best, the result is a hot, overcharged battery. At worst, the battery
can actually catch fire. (Yes, this has happened, though I'm glad to say,
not to me yet.)

To prevent this, a smart charger should check for at least one of two
things. The first is negative DV/DT. If the on-charge voltage falls, the
charger should stop the charge immediately. The second is high battery
temperature. If the charger's already using a temperature sensor to carry
out temperature compensation, it should be able to keep an eye on this, and
stop the charge if battery temperature exceeds something like 50° or 60°
Celsius.
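
A minimal version of those two checks might look like this (a sketch; 60
degrees C is the upper end of the range mentioned above, and the argument
names are made up):

MAX_BATTERY_TEMP_C = 60.0   # upper end of the 50-60 C range above

def must_abort_charge(voltage_now, voltage_previous, battery_temp_c):
    # Negative dV/dT: on-charge voltage falling is an early warning of
    # thermal runaway, so stop right away.
    if voltage_now < voltage_previous:
        return True
    # Over-temperature, read from the same sensor used for
    # temperature compensation.
    if battery_temp_c >= MAX_BATTERY_TEMP_C:
        return True
    return False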

If one of these conditions forces the charger to shut down, it's also a
good idea for it to let you know that something's gone wrong, maybe by
turning on a red warning light. That way, you don't find out the hard way
that your EV isn't fully charged.

Another good safety backup: checking whether the amount of charging so far
makes sense. The charger should keep track of the total amp hours and/or
charging time. If the charger knows what kind of battery it's charging,
it'll know if the total exceeds, say, 150% of that battery's rated amp-hours.  If
it doesn't know, it can still make a guess at a reasonable amount. Either
way, it should stop charging if things look odd, and warn you that
something might be wrong.
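
A sketch of that budget check (150% of rated amp-hours is the figure from the
text; the 24-hour time backstop is just an illustrative guess):

def charge_budget_exceeded(amp_hours_in, hours_elapsed,
                           rated_c20_ah=220, max_hours=24):
    # Stop and warn the user if the charge has delivered more amp-hours,
    # or run longer, than makes sense for this battery.
    return amp_hours_in > 1.5 * rated_c20_ah or hours_elapsed > max_hours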





David Roden - Akron, Ohio, USA

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =
Note: mail sent to the "etpost" address will not reach me.  To send
me a private message, please use the address shown at the bottom
of this page : http://www.evdl.org/help/
= = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =




_______________________________________________
Elec-trak mailing list
Elec-trak cosmos phy tufts edu
https://cosmos.phy.tufts.edu/mailman/listinfo/elec-trak