18650 Battery results not expected (check math)

BobinTN

I need help understanding battery math.

I am trying to power my patio lights as a learning exercise before I move on to a Tesla Powerwall or a solar energy storage scenario. So the logic of why I chose this method, or better ways to light my patio, is really not part of the discussion.

I thought I had designed a Li-ion battery to power my patio lights for 4 hours. It only powered the lights for 2 hours and 40 minutes. I am trying to figure out why.

I went with a 24V battery because the lead-acid specifications of most inverters out there (or at least the ones I wish to buy at this time) have max/min voltages close to the specs of a 24V battery constructed from 18650s.

I have constructed what I will call a 480Wh (24V * 20Ah) 18650 battery pack. I am using a 24V inverter (made for lead acid) to create 110V/120V output.
With a Kill-A-Watt meter I determined that my patio lights pull 120 watts. I want to power my patio lights for 4 hours.
So, the above is the simple math of what I am trying to do.
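Here is that simple math as a quick Python sketch (just my target numbers from above; no losses accounted for yet):

load_w = 120.0              # patio lights, per the Kill-A-Watt
hours = 4.0                 # desired runtime
needed_wh = load_w * hours  # 480 Wh needed
pack_wh = 24.0 * 20.0       # 24V * 20Ah pack = 480 Wh on paper
print(needed_wh, pack_wh)   # 480.0 480.0 -- looks like exactly enough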
What I have done:
Battery Construction:
I have constructed three 8V 20Ah batteries (2S10P) using the 5x4 18650 cell holders (20 18650 cells each).
Voltage checks out:
8.4V max
7.4V nom
6.2V min (3.1V per cell is considered full drain; this is the lower setting on my charger for capacity testing)
20Ah
I have wired three of the above 8V batteries in series to construct my 24V battery.
Voltage checks out:
25.2V max
22.2V nom
18.6V min
20Ah
If I calculate from max charge and the amp-hours of the battery, I get:
25.2V * 20Ah = 504Wh (it appears to be plenty for 4 hours of lighting; my 18650 cells' average capacity is actually more like 2.5Ah)
The specifications on the Inverter are:
Input: 24V(20-30V) DC
Output: 110V AC
Continuous Power: 400 Watt
Efficiency: 90%
Output WaveForm: Modified Sine Wave
No Load Draw: <0.5A DC
Battery Low Alarm: 20+/- 0.5 V DC
Battery Low Shutdown: 20 +/- 0.5 V DC


After 2.667 hours the lights went out.
Voltage on the battery pack was 21V (inverter cut-off 20V +/- 0.5).
Each 8.4V (max) battery measured 7.0V, 7.0V, and 7.5V.
If I multiply the hours of runtime by the watts of usage I get:
2.667h * 120W = 320Wh.
I got 320Wh out of a projected 480Wh (504Wh) battery.
If I divide the projected battery watt-hours by the actual watt-hours delivered I get:
504Wh / 320Wh = 1.575
Is 1.575 considered an overbuild multiplier?
If I want a battery that truly provides 480Wh (using 18650 cells and this inverter):
I should multiply my desired 480Wh output by 1.575 (my overbuild multiplier).
480Wh * 1.575 = 756Wh.
I should build a 756Wh battery; then I can get 480Wh of usage (4 hours of patio lighting).
Let's say:
My battery starts at 25.2 volts and my inverter shuts down at 21 volts.
I have used 4.2 volts of my battery's voltage range.
I have not used 2.4 volts of it (21V - 18.6V min discharge).

Where are my mistakes and bad assumptions?
Or am I right-on-the-money using a 1.575 overbuild multiplier?
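And here is the overbuild arithmetic above as a quick Python sketch (my measured numbers only):

runtime_h = 2.667
load_w = 120.0
delivered_wh = runtime_h * load_w         # ~320 Wh actually delivered
projected_wh = 25.2 * 20.0                # 504 Wh projected from max voltage
overbuild = projected_wh / delivered_wh   # ~1.575
target_pack_wh = 480.0 * overbuild        # ~756 Wh pack for a true 480 Wh out
print(round(delivered_wh), round(overbuild, 3), round(target_pack_wh))   # 320 1.575 756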
 
First, you should estimate your battery capacity at nominal voltage, not max voltage. A battery under load drops in voltage quickly down to about 3.8V, then remains relatively flat throughout the rest of the discharge cycle until about 3.6V, and then drops quickly again (google "lithium-ion discharge curve"). Also look at the label on the laptop battery you took apart; it will say 10.8V or 11.1V, not the max voltage.

Next, your inverter cut out before you were fully discharged, so there is some additional unused capacity in the battery.

Also, the inverter is only 90% efficient, so that takes 10% off your initial estimate.
Also, the inverter draws 0.5 amps on its own.

In this situation, your ~1.5 overbuild is probably good, since it is based on the actual components to be used with the battery. But I would consider the battery's starting capacity to be 22.2V * 20Ah = 444Wh, not 480Wh.
Multiply that by 1.5 and you should be pretty close.
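In rough numbers, as a quick Python sketch (the low-voltage cutoff losses come on top of this):

pack_wh = 22.2 * 20.0        # 444 Wh at 3.7 V/cell nominal, not 504 Wh
usable_wh = pack_wh * 0.90   # ~400 Wh after the 90% inverter efficiency
print(round(pack_wh), round(usable_wh, 1))   # 444 399.6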
 
Thanks for the schooling. I will start using nominal voltage for my calculations.

"your inverter cut out before you were fully discharged, so there is some additional unused capacity in the battery"

So, what are you guys using for inverters where you can take full advantage of the 18650 capacity?

I have looked, but have not found what I would call clear answers on this.
 
You will NEVER get full battery usage. You can get close, but never all of it. The reason is that if you try to pull all of the power from a cell, you kill the cell. There are also other factors to consider, such as heat generated during discharge.

Also, you should probably calculate from 3.6V to 4.1V usage of a cell, not 3.4V to 4.2V. Not only will this increase the cycle life of the cells, it will also give a longer, stronger run time. If you look at the lithium curve APD mentioned, you'll see the curve starts to level out around 4.0V to 4.1V and semi-stabilizes at 3.8V. Then it runs at 3.8V for some time until it reaches its next power drop, at which point it falls to 3.4V or lower; at that point you want to stop discharging so as not to over-discharge the cell.
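For your 6S pack, that window works out to these terminal voltages (a quick Python sketch):

series = 6
charge_to = 4.1 * series      # stop charging at 24.6 V
discharge_to = 3.6 * series   # stop discharging at 21.6 V (above the 20 V inverter cutoff)
print(round(charge_to, 1), round(discharge_to, 1))   # 24.6 21.6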
 
Any smart one where you can set the voltages. The more advanced, full-blown systems that include chargers do that. Example: PIP4048.

Regarding your math: I was too slow today and APD had most of the points covered already :)

I suggest you measure the current actually drawn from the batteries instead of on the light's side. As APD pointed out, you have an efficiency factor. But there is also the factor that your watt meter may be wrong, and depending on the source it may not show what's actually drawn from the inverter...

If your light actually draws 120W (resistive load), then at the batteries, based on 3.7V nominal per cell, you would have a draw of about:

120 * (1/0.9) = 134W (adding the loss of the inverter).

134/22.2 + 0.5 = 6.5A => that's 0.65A per cell, so that is fine.

Let's base the following on the calculated 444Wh that I agree on.

The above tells us that we draw 144.4W from the battery. So if you could utilize the full potential, you would get 444/144.4 ≈ 3 hours of runtime.

You got 2h 40min.

Let's say your inverter died at around 20.5V -> that gives us 3.41V per cell at cutoff. Depending on the cells used, that could leave 0-10% of capacity in the pack... Let's say 6%, which some Panasonic cells have.

And we basically had 20 minutes left based on our theory.
Those 20 minutes would be 6.5A * 22.2V * (20/60)h = 48Wh needed for that lamp.
6% of our battery would be 0.06 * 444 = 27Wh.

So basically we are now missing 21Wh, which equals around 9 minutes. You might have more losses in the pack somewhere, but I'm guessing your 120W figure is not actually accurate and you may be drawing more than 6.5A from the battery.

Nevertheless, you need to design your pack at least 20% larger than what you are going to cycle, or else you will kill your pack quickly. 50% bigger is even better :)
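Putting the whole chain above in one Python sketch (the 6% remaining-capacity figure is the Panasonic-style guess from above, not a measurement):

nominal_v = 22.2
pack_wh = nominal_v * 20.0                        # 444 Wh
batt_a = (120.0 / 0.9) / nominal_v + 0.5          # ~6.5 A drawn from the battery
batt_w = batt_a * nominal_v                       # ~144.4 W at the battery
ideal_h = pack_wh / batt_w                        # ~3.07 h if every Wh were usable
missing_wh = batt_w * (20.0 / 60.0)               # ~48 Wh for the missing ~20 min
left_at_cutoff_wh = 0.06 * pack_wh                # ~27 Wh stranded below cutoff (6% guess)
unexplained_wh = missing_wh - left_at_cutoff_wh   # ~21 Wh, i.e. ~9 min unaccounted for
print(round(batt_a, 2), round(ideal_h, 2), round(unexplained_wh, 1))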
 
Thanks, daromer and Korishan;

I am learning.

Appreciate you guys taking the time.
 
Also, one thing to note:
There are two different ways of measuring capacity.

And this is where the last Wh that I missed went...

For instance, if you go ahead and measure your cells at 0.6A you will most likely get 2500mAh out of them... But why can't you use all of that?

That's because of how the test is done. It's done CC-CV. The last juice is taken out of the cell at a lower current!! I.e., many testers stop at 100mA. So the tester actually pulls out the last 10-15% at a lower current. To be able to use that part you need a bigger pack. This is one of the reasons you should have a big pack at low current instead of a small pack at high current.

On my page I did a couple of tests (http://diytechandrepairs.nu/fake-scam-18650-batteries/):

http://diytechandrepairs.nu/wp-content/uploads/2017/01/b11_graf_1.jpg

Look at the current curve, and you will see that the last mAh taken out of the battery are not drawn at the initial 0.5A.


Of course, your calculation of the Ah was also on the low end. But now I think you have all the parameters needed to build your pack :)
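To illustrate, here is a toy Python model with made-up numbers (not real cell data): once internal resistance is in the picture, a 0.5A discharge hits the cutoff voltage earlier than a 0.1A one, and that difference is roughly what the low-current taper digs back out.

R_INTERNAL = 0.15   # ohms, made-up figure
CUTOFF_V = 3.0      # discharge cutoff at the terminals
RATED_AH = 2.5

def ocv(discharged_ah):
    # crude linear open-circuit voltage: 4.2 V down to 3.0 V over 2.5 Ah (illustrative)
    return 4.2 - 1.2 * (discharged_ah / RATED_AH)

def reachable_ah(current_a, step_ah=0.001):
    # discharge at constant current until the loaded terminal voltage hits cutoff
    ah = 0.0
    while ah < RATED_AH and ocv(ah) - current_a * R_INTERNAL > CUTOFF_V:
        ah += step_ah
    return ah

print(round(reachable_ah(0.5), 2), round(reachable_ah(0.1), 2))   # ~2.34 vs ~2.47 Ah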
 
I am starting on my Battery redesign based on my new knowledge.
I guess I have a few conceptual questions.
One thought is "if I should keep my cell usage between 3.6 and 4.1 volts, do I need to do a capacity test to see how many mAHs I have when constrained to these parameters ( I am using 2Ah for the cells 3.1 to 4.2 charge) ? Is it too small to worry about?

I am also constrained by the inverter; I cannot change the 20V cut-off. Would the plan be to overbuild the battery so that it never has a chance to reach the 20V (3.33V per cell) inverter cut-off? I guess the plan would be to still have 3.6V per cell after 4 hours of use.

Also, the inverter can accept up to 30V input. I could make the battery 7S (4.2V max * 7 in series = 29.4V). Does that get me anything in this scenario, say, instead of increasing the amp-hours (adding more cells in parallel to the existing 22.2V nominal battery)?

I apologize if I am missing the point in your responses.
 
No, I don't see the need to retest them unless you personally want to. Just check a state-of-charge chart, base your number on that, and add a factor of 1.5. Don't forget to test it...

Don't change the cut-off; that's fine. Note that one issue you can run into is voltage drop: a battery at 3.6V can easily drop to 3.4V under even the smallest load. So don't forget that resting voltage will be higher than the voltage during operation, and the drop gets bigger the more you stress the cells.


Yes, if you go to 7S you get 1/6th more Wh if you keep each series group the same size, and the inverter will no longer cut off at low voltage.
BUT... then you have no under-voltage protection at all... So if you can live with that, or if you can add under-voltage protection, go to 7S. If not, stay where you are and use the built-in protection in the inverter.
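As a quick Python sketch of the 6S vs 7S trade-off (same 20Ah per series group, my nominal numbers):

wh_6s = 6 * 3.7 * 20.0           # 444 Wh
wh_7s = 7 * 3.7 * 20.0           # 518 Wh, i.e. 1/6th more
print(round(wh_7s / wh_6s, 3))   # 1.167
# But the inverter's cutoff no longer protects the cells:
print(round(20.0 / 7, 2))        # 20 V / 7 = ~2.86 V per cell -- well below safe discharge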
 