BobinTN
I need help understanding battery math.
I am trying to power my Patio lights as a learning process before I start going for a Tesla Powerwall or Solar Energy Storage Scenario. So, the logic of why I am choosing this method to power my patio or better ways to light my patio is really not part of the discussion.
I thought I had designed a Li-ion battery to power my patio lights for 4 hours. It only powered the lights for 2 hours and 40 minutes. I am trying to figure out why.
I went with a 24V battery because the lead-acid specifications of most inverters out there (or at least the ones I wish to buy at this time) have max/min voltage limits close to the specs of a 24V battery built from 18650 cells.
I have constructed what I will call a 480 Wh (24V × 20Ah) 18650 battery pack. I am using a 24V inverter (made for lead acid) to create 110V/120V output.
With a Kill-A-Watt meter I determined that my patio lights pull 120 Watts. I want to power my patio lights for 4 hours.
So, the above is the simple math of what I am trying to do.
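To keep myself honest, here is that target math as a tiny Python sketch (just the arithmetic from above, nothing new):

```python
# Target: run the measured 120 W patio-light load for 4 hours.
load_w = 120.0      # Kill-A-Watt reading for the lights, in watts
runtime_h = 4.0     # desired runtime, in hours

energy_needed_wh = load_w * runtime_h  # watts x hours = watt-hours
print(f"Energy needed at the AC side: {energy_needed_wh:.0f} Wh")  # 480 Wh
```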
What I have done:
Battery Construction:
I have constructed three 8V 20Ah batteries (2S10P) using the 5×4 18650 cell holders (20 18650 cells each).
Voltage checks out:
8.4V max
7.4V nom
6.2V min (3.1V per cell is considered full drain; this is the lower setting on my charger for capacity testing)
20Ah
I have wired three of the above 8V batteries in series to construct my 24V battery.
Voltage checks out:
25.2V max
22.2V nom
18.6V min
20Ah
If I calculate from the max charge voltage and the pack's amp-hours I get:
25.2V × 20Ah = 504 Wh (that appears to be plenty of energy for 4 hours of lighting; my 18650 cells' average capacity is actually more like 2.5Ah)
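For reference, here is that pack math in Python. The 2.0 Ah per-cell figure is my assumption for where the 20 Ah rating comes from (10 cells in parallel), even though my cells average closer to 2.5 Ah:

```python
# Pack layout: three 2S10P modules in series = 6S10P overall.
cells_series = 6          # 3 modules x 2 cells in series each
cells_parallel = 10
cell_ah = 2.0             # assumed per-cell rating behind the 20 Ah figure
v_cell_max, v_cell_nom, v_cell_min = 4.2, 3.7, 3.1

pack_ah = cells_parallel * cell_ah
print(f"Pack: {cells_series * v_cell_max:.1f} V max / "
      f"{cells_series * v_cell_nom:.1f} V nom / "
      f"{cells_series * v_cell_min:.1f} V min, {pack_ah:.0f} Ah")

# Energy estimate depends heavily on which voltage you multiply by:
print(f"Wh at max voltage:     {cells_series * v_cell_max * pack_ah:.0f}")  # 504
print(f"Wh at nominal voltage: {cells_series * v_cell_nom * pack_ah:.0f}")  # 444
```

Already I notice the max-voltage estimate (504 Wh) is 60 Wh more optimistic than the nominal-voltage one (444 Wh).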
The specifications on the Inverter are:
Input: 24V(20-30V) DC
Output: 110V AC
Continuous Power: 400 Watt
Efficiency: 90%
Output WaveForm: Modified Sine Wave
No Load Draw: <0.5A DC
Battery Low Alarm: 20 ± 0.5V DC
Battery Low Shutdown: 20 ± 0.5V DC
After 2.667 hours the lights went out.
Voltage on the battery pack was 21V (inverter cut-off 20 ± 0.5V)
Each 8.4V battery read 7.0V, 7.0V, and 7.5V.
If I multiply the hours of runtime by the lights' wattage I get:
2.667h × 120W = 320 Wh.
I got 320 Wh out of a projected 480 Wh (504 Wh) battery.
If I divide the predicted battery energy by the actual energy output I get:
504 Wh / 320 Wh = 1.575
Is 1.575 considered an Overbuild multiplier?
If I want a battery that truly provides 480 Wh (using 18650 cells and this inverter):
I should multiply my desired 480 Wh output by 1.575 (my overbuild multiplier).
480 Wh × 1.575 = 756 Wh.
If I build a 756 Wh battery, then I should get 480 Wh of usage (4 hours of patio lighting).
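Putting the runtime numbers and that multiplier into the same kind of sketch:

```python
runtime_h = 2.667          # actual runtime before the inverter cut out
load_w = 120.0             # measured load
predicted_wh = 504.0       # the 25.2 V x 20 Ah estimate from above

delivered_wh = runtime_h * load_w
multiplier = predicted_wh / delivered_wh
print(f"Delivered: {delivered_wh:.0f} Wh, overbuild multiplier: {multiplier:.3f}")

# Sizing for the full 480 Wh target with the same multiplier:
target_wh = 480.0
print(f"Pack to build: {target_wh * multiplier:.0f} Wh")  # ~756 Wh
```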
Let's say:
My battery starts at 25.2 volts and my inverter shuts down at 21 volts.
I have used 4.2 volts of my battery's voltage range.
I have not used the remaining 2.4 volts (from the 21V cutoff down to the 18.6V minimum).
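As a rough cut (I know Li-ion voltage is not a linear gauge of remaining energy, so this only approximates the unused fraction):

```python
v_full, v_cutoff, v_min = 25.2, 21.0, 18.6   # pack voltages

window_v = v_full - v_min     # 6.6 V design discharge window
used_v = v_full - v_cutoff    # 4.2 V traversed before the inverter shut down
unused_v = v_cutoff - v_min   # 2.4 V stranded below the inverter cutoff

print(f"Used {used_v:.1f} V of a {window_v:.1f} V window "
      f"({used_v / window_v:.0%}); {unused_v:.1f} V unused")
```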
Where are my mistakes and bad assumptions?
Or am I right-on-the-money using a 1.575 overbuild multiplier?