From: who where on
On Sun, 04 Oct 2009 19:48:53 -0700, Jeff Liebermann <jeffl(a)cruzio.com>
wrote:

>On Mon, 05 Oct 2009 10:17:21 +0800, who where <noone(a)home.net> wrote:
>
>>Ah, I didn't say external ;-) It was an on-board jumper selection,
>>so the user would need to crack the case.
>
>Grumble. That's what I want on my chargers. Actually, what I want is
>complete control over just about every parameter involved in charging,
>but would create its own collection of problems. The market for such
>things is probably fairly small.

Indeed. You'd want one (or two), I'd keep a couple, a few more into
the s.e.d types, and probably a dozen for the rest_of_world.

It was a jumper on pin headers, extending it to the front panel would
be a snap.

The controller I used was the MAX1737. You get a certain amount of
flexibility designing around it, and we (the client and I) preferred
their regime to the others we considered.
>
>>This charger was for
>>"commercial/industrial" users and supplied as a companion device to
>>their custom 1/2/3/4-cell packs. T'was documented for the vendor so
>>he could set it to best suit the end-user application.
>
>I'll bet that the vendor does NOT supply this information to the
>customer. Like I previously mumbled, I haven't seen any Li-Ion
>chargers that give the customer any EOC control.

The selection was vendor-made based on the end user's stated role, and
was selected in_conjunction_with pack sizing. Any time the role
looked like prolonged high SOC and temperatures above 30C, he'd go
with 4v10 and the pack size would then be determined. He didn't want
premature failures, unlike laptop makers who don't give a rats.
>
>>>I think (not sure) that simply disabling the CV part of the charge
>>>cycle would be sufficient to stop charging at about 80% of full
>>>charge.
>>
>>(Checks project report ...) On my 18650 testing, transition occurred
>>at ~59% when charging at 0.55C. As there is obviously a finite ohmic
>>impedance characteristic, transition would occur later at lower rates.
>
>Ok, bad guess on my part. Maybe estimating the time needed for a CV
>charge to get to 100%, and cut it in half to get 80%.

It's linear - CC - in CL mode.
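Since the current-limited (CC) phase is linear, tracking SOC through it is just coulomb counting. A minimal sketch, with the capacity and current values invented for illustration:

```python
def soc_during_cc(soc_start, current_a, elapsed_s, capacity_ah):
    """Estimate SOC while the charger is in constant-current mode.
    Charge delivered is simply I * t, so SOC rises linearly with time."""
    delivered_ah = current_a * elapsed_s / 3600.0
    return min(1.0, soc_start + delivered_ah / capacity_ah)

# Example: 2.2 Ah 18650 charged at 0.55C (1.21 A) for 30 min from 10% SOC
soc = soc_during_cc(0.10, 0.55 * 2.2, 30 * 60, 2.2)  # 0.375
```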

>>>Yep. The math for calculating how far down a Li-Ion battery pack is
>>>discharged is fairly simple if I make a number of assumptions.
>>
>>(snip)
>>
>>Estimating SOC is a *lot* easier, trivial linear calculation.
>
>I wasn't looking for the SOC. That can be done by counting coulombs
>(amps and seconds). What I was calculating was the run time of the
>computer until the battery pack gives up. That's the mysterious
>specification offered by many laptop vendors that reeks of science
>fiction and cooked data. The number of variables involved in an exact
>calculation is sufficiently high that most vendors will simply use an
>empirical number, rounded up to the nearest integer.
>
>>From
>>fully charged (and preferably "rested"), discharge until the PACK
>>shuts off the pooter. Observe run time. Decide what % you want left
>>in your pack, repeat above and terminate when that proportion of the
>>full runtime remains.
>
>No problem except you don't specify what the computer is doing while
>discharging the battery. There's a huge difference between sitting at
>standby keeping the dynamic RAM alive, and beating up the CPU with
>compressed video, spinning DVD drive, and full brightness
>backlighting. It's as bad as the spec for the number of pages a laser
>printer toner cartridge will deliver.

Maybe I missed your objective. I understood it to be determining when
to stop discharge (in the laptop) to achieve a chosen SOC. Any time
you are playing with discharge the laptop activity is fundamental, but
for a known target SOC it isn't hard to invoke a known task (eg screen
saver) and do the linear maths.
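For anyone playing along at home, the "linear maths" reduces to one line: with a repeatable constant-power task, remaining SOC falls linearly with run time. The numbers below are invented for illustration:

```python
def cutoff_time_for_target_soc(full_runtime_h, target_soc):
    """Run time after which the pack is down to target_soc, assuming a
    constant-power task so that discharge is linear in time."""
    return full_runtime_h * (1.0 - target_soc)

# Example: pack runs 4.0 h to shutoff; to stop at 20% SOC, stop after:
t = cutoff_time_for_target_soc(4.0, 0.20)  # 3.2 h
```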
From: who where on
On Sun, 04 Oct 2009 19:35:03 -0700, Jeff Liebermann <jeffl(a)cruzio.com>
wrote:

>On Sun, 04 Oct 2009 10:59:04 +0800, who where <noone(a)home.net> wrote:
>>The pack protection modules we used were preset to open the series FET
>>switch at 3v0. If you look at the discharge curve of Li-Ions at
>>constant current (or with a constant load impedance) you will notice a
>>distinct droop below about (from memory here) 3v3. While cell
>>deterioration starts at/below 2v5 there is very little useful capacity
>>gained by proceeding below 3v0.
>
>Well yes.... any storage device, with a low internal series resistance
>will exhibit a fairly flat discharge curve followed by an abrupt
>droop. See the first graph at:
><http://www.mpoweruk.com/performance.htm>
>The problem is that the sharp knee is somewhere between 5% and as
>little as 1% of capacity.

It wasn't a sharp drop that we saw. More of a curve than a cliff.

>To prevent running the battery into the
>ground (and possibly reverse polarizing some of the cells when
>connected in series), the dropout point is as close to the beginning
>of the droop as possible. That's fine for a new battery, but as the
>battery ages, the same threshold slowly moves up the charge curve as
>the terminal voltage decreases. I don't think any of the SOC chip
>vendors compensate for this.

I'm not aware of any that do. But with the (IIRC) Mitsumi chips we
used, there was less need than with say nickel chemistries. The
module monitored cell voltage differences, and would prevent operation
if the differences exceeded a preset threshold value. With EOD set at
3v0 (average cell), there is no way any cell would get near to 2v5.
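The module logic described above boils down to two checks. The difference threshold below is a placeholder value I've picked for illustration; the actual Mitsumi preset isn't given in the thread:

```python
def pack_ok(cell_volts, eod_avg=3.0, max_delta=0.10):
    """Permit discharge only if the average cell voltage is above the
    EOD setpoint AND the cell-to-cell spread is within the preset
    threshold. max_delta is an assumed/illustrative value."""
    avg = sum(cell_volts) / len(cell_volts)
    spread = max(cell_volts) - min(cell_volts)
    return avg > eod_avg and spread <= max_delta

pack_ok([3.6, 3.62, 3.58, 3.61])  # True: balanced pack, above EOD
pack_ok([3.3, 3.3, 3.3, 2.7])     # False: one cell sagging badly
```

The spread check is what keeps a weak cell from being dragged toward 2v5 while the pack average still looks healthy.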

>>>No problem. However, I'll stand on the Wikipedia 20%/year loss at
>>>100% charge at room temperature for commodity laptop batteries. My
>>>results were even worse. I'll concede that there are new chemistries
>>>that offer substantial improvements in self-discharge and
>>>self-deterioration, but I haven't seen any in laptops.
>>
>>Cost. Commodity chemistries have been around for over a decade and
>>are cheaper than newer solutions that aren't into the same part of the
>>volume/cost curve yet.
>
>Agreed. It does take time for new technology to decrease in price.
>However, there is little incremental benefit in switching to a superior
>chemistry or technology. For a few percentage points increase in
>performance, the exponential increase in cost makes it a bad
>investment. Mediocrity tends to be permanent until a new mass market
>can be found, or until some external influence (environment, scarcity
>of materials, hazards, safety, etc) demands a replacement. Methinks
>we'll be seeing the commodity Li-Ion battery, with its 20% capacity
>loss per year, for quite some time.

Just like the corner of the engine bay on automobiles still features
lead-acid ....

>>Remember that the laptop manufacturer
>>generally sees the battery pack as a non-warranted item (wear and
>>tear), and even when it IS warranted it only has to function for that
>>period without any capacity guarantee. So cheap is good for them.
>
>Generally true but there are exceptions. The Sony manufactured
>batteries full of metal shavings that would catch fire with little
>provocation was covered under various warranties. I had 4 laptop
>batteries (out of maybe 200) replaced under this warranty.

That's an identified manufacturing fault, totally different from a
wear-and-tear situation.

>However,
>for general use, you're correct. There is no battery warranty. About
>10 years ago, I received 4ea Compaq Presario 1620 series laptops, each
>with a spare battery. Most of the batteries died within 5 months
>including the ones that were left in the original packaging and not
>used until tested. Compaq (pre-HP) declared this to be "normal
>battery life" and refused to do anything. 3rd party Li-Ion battery
>packs were somewhat better and lasted about a year. We switched to the
>older NiMH batteries, which were half the price, and lasted 3 years.
>Your horror stories may vary.
>
>>>The user can set the Windoze low battery warning to trip at a much
>>>higher level than the ridiculously low default value of 10%. That
>>>will prevent excessive discharge.
>>
>>Yes (see earlier) but I was referring to end-of-charge setpoint.
>
>Ok, got it. Still, the Windoze low battery warning feature is quite
>useful. I have mine set to warn me at 40% and shut down at 25%. Too
>soon to tell if this will extend the life of the battery pack.
>
>Incidentally, some interesting reading on SOC (state-o-charge)
>technology:
><http://www.mpoweruk.com/soc.htm>

From: Jeff Liebermann on
On Mon, 05 Oct 2009 16:45:09 +0800, who where <noone(a)home.net> wrote:

>It was a jumper on pin headers, extending it to the front panel would
>be a snap.

I was thinking more of something with a built in ethernet or USB port.
All the charge parameters can be setup on a web page. Once one has a
suitable processor, adding features such as battery history, battery
test, counterfeit detection, run time calculation, and fire detection
are mostly software. I'm half way inspired to design and build one
for myself but suspect that I can't make much money on it at consumer
price levels.

>The controller I used was the MAX1737.
<http://www.maxim-ic.com/quick_view2.cfm/qv_pk/2217>
>You get a certain amount of
>flexibility designing around it, and we (the client and I) preferred
>their regime to the others we considered.

Ok, but that's pure analog. Analog is not a problem but it does limit
what weird things can be done with a Li-Ion charger. I was thinking
more in the way of a digital (i.e. PIC controller) design, such as:
<http://www.microchip.com/stellent/idcplg?IdcService=SS_GET_PAGE&nodeId=1406&dDocName=en024090>
<http://ww1.microchip.com/downloads/en/DeviceDoc/51515a.pdf>
and increasing the output current capabilities to run larger battery
packs. For example, when the battery pack is in the charger and
allegedly fully charged, it would be fairly easy to apply a load and
discharge it for perhaps a minute or more. The asymptote of the
terminal voltage curve can be extrapolated to produce an estimated
runtime. Some of the remote battery management systems already do
this quite accurately. One could also include some RAM and add a data
logger and coulomb counter (amp-seconds). The area under the current
curve is the charging and discharging energy. This would give a good
clue as to the battery pack's comparative quality (something that I
suspect the manufacturers would not be interested in supplying).
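The data logger / coulomb counter amounts to integrating sampled current (and power) over time. A rough sketch, with the two sample points invented purely for illustration:

```python
def integrate_samples(samples, dt_s):
    """samples: list of (volts, amps) pairs taken every dt_s seconds.
    Returns (charge in amp-seconds, energy in watt-seconds) by simple
    rectangular integration."""
    charge_as = sum(i for _, i in samples) * dt_s
    energy_ws = sum(v * i for v, i in samples) * dt_s
    return charge_as, energy_ws

# Two fake 1-second samples at ~3.7 V, 1 A
q, e = integrate_samples([(3.7, 1.0), (3.7, 1.0)], 1.0)  # 2.0 A·s, 7.4 W·s
```

Comparing the integrated discharge energy against the charge energy, cycle after cycle, is what gives the quality clue.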

>The selection was vendor-made based on the end user's stated role, and
>was selected in_conjunction_with pack sizing. Any time the role
>looked like prolonged high SOC and temperatures above 30C, he'd go
>with 4v10 and the pack size would then be determined. He didn't want
>premature failures, unlike laptop makers who don't give a rats.

Good plan. However, there's always going to be the customer that
plugs in a new battery pack, runs it as long as possible, and then
proclaims that they're not getting the specified run time.

>Maybe I missed your objective. I understood it to be determining when
>to stop discharge (in the laptop) to achieve a chosen SOC. Any time
>you are playing with discharge the laptop activity is fundamental, but
>for a known target SOC it isn't hard to invoke a known task (eg screen
>saver) and do the linear maths.

It was to see how close to a full charge was being used and where the
battery droop detection was set. That's measured in coulombs
(amp-seconds) or run time (hours). I have a Kill-a-Watt meter that
measures power consumption from the 117VAC line. I run the laptop
only from the charger, with the battery removed, for about 30 minutes
(the limit of my attention span), doing what I consider to be a typical
applications mix. The Kill-a-Watt meter records the watt-seconds
(actually watt-hrs) used. It also compensates for power factor. I
throw in the switcher efficiency of about 85-90%:
<http://www.energystar.gov/index.cfm?c=ext_power_supplies.power_supplies_consumers>
and calculate the energy consumption per hour of use. I use that as
the average load for run time calculations.
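The bookkeeping above, with hypothetical numbers (the 48 Wh pack and 12 Wh wall reading are made up), looks like:

```python
def avg_dc_load_w(wall_wh, hours, switcher_eff=0.875):
    """Average DC load on the battery side, from the Kill-a-Watt's
    wall-side energy reading, derated by the external supply's
    efficiency (85-90% per the Energy Star figures)."""
    return (wall_wh / hours) * switcher_eff

def runtime_h(pack_wh, load_w):
    """Estimated run time for a pack of given energy at that load."""
    return pack_wh / load_w

# Example: 12 Wh from the wall over 0.5 h, 48 Wh battery pack
load = avg_dc_load_w(12.0, 0.5)  # 21.0 W at the laptop
t = runtime_h(48.0, load)        # ~2.3 h estimated run time
```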

--
Jeff Liebermann jeffl(a)cruzio.com
150 Felker St #D http://www.LearnByDestroying.com
Santa Cruz CA 95060 http://802.11junk.com
Skype: JeffLiebermann AE6KS 831-336-2558
From: Jeff Liebermann on
On Mon, 05 Oct 2009 16:57:03 +0800, who where <noone(a)home.net> wrote:

>>Well yes.... any storage device, with a low internal series resistance
>>will exhibit a fairly flat discharge curve followed by an abrupt
>>droop. See the first graph at:
>><http://www.mpoweruk.com/performance.htm>
>>The problem is that the sharp knee is somewhere between 5% and as
>>little as 1% of capacity.
>
>It wasn't a sharp drop that we saw. More of a curve than a cliff.

Sorry. I shouldn't have said "abrupt". It does tend to dribble off
with a rather soft knee. Also, if you have several mixed cells in
series, of different ages, the knee will appear at different points in
the discharge curve for each cell. The knee will also be less
defined. (A good reason not to mix different age cells).

>>To prevent running the battery into the
>>ground (and possibly reverse polarizing some of the cells when
>>connected in series), the dropout point is as close to the beginning
>>of the droop as possible. That's fine for a new battery, but as the
>>battery ages, the same threshold slowly moves up the charge curve as
>>the terminal voltage decreases. I don't think any of the SOC chip
>>vendors compensate for this.
>
>I'm not aware of any that do.

AN AGING MODEL FOR LITHIUM-ION CELLS
<http://etd.ohiolink.edu/send-pdf.cgi/Hartmann%20Richard%20Lee%20II.pdf?acc_num=akron1226887071>
Warning: 278 pages of a grad student's dissertation.
Skipping to Chapter VI - Conclusions, where it says:
A direct correlation was found between the cell capacity and
the open-circuit voltage of a fully discharged cell. Cell
resistance increased at a linear rate throughout the life
of the cells.
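Taking the dissertation's conclusion at face value, linear resistance growth is trivial to model. Both constants below are illustrative placeholders, not values from the paper:

```python
def cell_resistance(r0_ohm, slope_ohm_per_cycle, cycles):
    """Linear resistance-growth aging model: internal resistance rises
    at a constant rate per cycle (per the dissertation's conclusion).
    r0_ohm and the slope are assumed example values."""
    return r0_ohm + slope_ohm_per_cycle * cycles

r300 = cell_resistance(0.050, 0.0001, 300)  # 0.080 ohm after 300 cycles
```

A SOC chip that tracked that slope could, in principle, walk the droop threshold up as the cell ages.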

>But with the (IIRC) Mitsumi chips we
>used, there was less need than with say nickel chemistries. The
>module monitored cell voltage differences, and would prevent operation
>if the differences exceeded a preset threshold value. With EOD set at
>3v0 (average cell), there is no way any cell would get near to 2v5.

The circuit doesn't monitor individual cells. I would think that one
cell with a bad case of premature aging might cause problems. I'm
beginning to think that I'm worrying over a non-problem. Never mind.

From: Jeff Liebermann on
On Tue, 06 Oct 2009 11:26:50 +0800, who where <noone(a)home.net> wrote:

>The brief was "KISS".

KISS is often a euphemism for "cheap".

>The end-users were real industrial users, and
>quite disinclined to fiddle or even treat the pack and charger as
>other than a black box.

True. There are probably some safety issues involved. It would not
do to have the customer twiddle the charging characteristics and
>potentially turn the battery pack into an incendiary or explosive
device.

>If they weren't getting the specified run time, it would be the result
>of improper sizing (vendor fault or user-supplied misinformation), or
>faulty pack or charger (vendor responsibility). Easily resolved.

Nope. There's also the possibility of creative testing. The
applications mix used for testing battery life by MobileMark 2007:
<http://www.bapco.com/products/mobilemark2007/index.php>
has a huge effect on measured battery life. However, there's nothing
standard about the selection of test apps, which could easily be
tweaked by the equipment vendor. I'm starting to see this with
Netbooks, where fairly long battery run times are predicted, but
rarely demonstrated. I had an Acer Aspire One (9" screen) and
currently an Asus 700. Neither has come close to the rated run time
when I use them normally at a local coffee shop.

>Recall that the pack size was chosen AFTER the CV limit was
>determined.

Good. That covers the vendor in case anyone actually tests for the
claimed capacity or run time.

Just thinking about it, there's enough info here to build a table or
graph of the calculated battery life versus run time terminating at
different EOC's. As you note, the closer to depletion I run the
battery pack, the shorter the battery life (measured in charge
cycles).
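Such a table could be cranked out from a rough cycle-life model. The inverse-power relationship and both constants below are placeholder assumptions for illustration, not data from this thread:

```python
def est_cycles(depth_of_discharge, cycles_at_full_dod=500, exponent=1.5):
    """Crude cycle-life estimate: shallower discharge -> more cycles.
    Both constants are illustrative placeholders, not measured data."""
    return cycles_at_full_dod / (depth_of_discharge ** exponent)

# Table of estimated charge cycles vs. depth of discharge
for dod in (1.0, 0.8, 0.5, 0.25):
    print(f"DoD {dod:.0%}: ~{est_cycles(dod):.0f} cycles")
```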

>>... The Kill-a-Watt meter records the watt-seconds
>>(actually watt-hrs) used. It also compensates for power factor. I
>>throw in the switcher efficiency of about 85-90%:
>><http://www.energystar.gov/index.cfm?c=ext_power_supplies.power_supplies_consumers>
>>and calculate the energy consumption per hour of use. I use that as
>>the average load for run time calculations.
>
>The problem with those meters is that - being cheap/chinese - they
>tend to poorly handle the line current "blips" into a rectifier. And
>even if they returned true RMS, that doesn't itself reflect the actual
>power drawn in those circuits.

Well, yes. The frequency and transient response of these meters is
rather lousy. My guess(tm) is that it has to be about 10 times lower
than the 50/60Hz it's trying to measure. That would put it at about
5Hz (200msec), which is not all that horrible. Also, the filter caps
in the typical laptop will smooth out most transient current spikes so
that the meter never sees the spikes. I don't have a power line
impairment tester to check this, but can probably trace out the
schematic to see how it works. Here's the patent with block diagram
and description:
<http://www.google.com/patents?id=G3MDAAAAEBAJ&dq=6095850>

The top photo is the inside of the older 4 button version. The lower
photo is the current 5 button version:
<http://802.11junk.com/jeffl/pics/drivel/slides/kill-a-watt.html>

Litigatory trivia:
<http://greenpatentblog.com/2008/12/24/smartlabs-enjoined-parties-smart-management-focuses-issues-in-energy-meter-litigation/>

--
# Jeff Liebermann 150 Felker St #D Santa Cruz CA 95060
# 831-336-2558
# http://802.11junk.com jeffl(a)cruzio.com
# http://www.LearnByDestroying.com AE6KS