Be cautious ordering generic blue wrap LFP cells

Rando

Explorer
It sounded to me that you were suggesting a glycol bath for the batteries.

But even if it is some other way of getting the glycol to transfer heat to the batteries, is the complexity of some sort of boiler or heat exchanger, a circulator pump, a thermostat etc. really a reasonable solution here?

My point here is that this appears to be adding a huge amount of complexity to address something that has not actually been established to be a significant issue. Again, if you are into steam punk and would enjoy doing this, go right ahead, but be cognizant of the cost/benefit analysis.
 

hour

Observer
Thanks for the information, graphs and such.

@Rando I'm a little nervous to simulate a BMS disconnect situation but I unplugged the BMV this morning so nothing to lose there.

Cutting off charge input @ BMS level would simulate both an over-voltage event and a too-cold-to-charge event (which would be needed at BMS level if I remove the BMV from the project). As long as the MPPT stays connected then I'm happy, and it seems like you're saying it would, because the BMS can keep discharge enabled while charging is disabled, despite being common port, because of MOSFET stuff inside. Would the MPPT just sense that the battery was incapable of accepting any amps? I'd prefer not to blow it up, so all of this apprehension comes from not wanting 200 watts of solar @ 38v ready to go while the battery @ MPPT suddenly flips connected-disconnected-connected-disconnected, etc.

It got to freezing last night so I set the can outside at 75% SOC with a 30w panel stretched out to where the sun lands first - box kind of in the shade against my house. Around 8am the box was 38.5*F inside (avg of both probes). I adjusted the panel and got it reading above 13v but still too low to charge, and the heat was on within 5 seconds.

Ran for 7 minutes until it hit 45*F average of the probes, with the top probe (inside the BMS) a little warmer but matching the temperature read by the BMV - its sensor is also located atop the cells - and thus matching the MPPT (low temp cut-off set to 41*F). The temp continued to rise for a few minutes, finally settling at 48*F. The heater didn't have to run again before PV input was sufficient and the MPPT started charging, about 45 minutes later with another panel adjustment.
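
For reference, the heater control looks like simple on/off hysteresis. A minimal sketch of that logic (setpoints loosely matching the numbers above; the code is only an illustration, not the actual pad controller's firmware):

# Illustrative thermostat hysteresis, not the real heater controller.
HEAT_ON_BELOW_F = 41.0    # start heating below the MPPT's low-temp charge cutoff
HEAT_OFF_ABOVE_F = 45.0   # keep heating a few degrees past it to avoid short-cycling

def heater_should_run(avg_probe_f: float, heater_is_on: bool) -> bool:
    if heater_is_on:
        return avg_probe_f < HEAT_OFF_ABOVE_F
    return avg_probe_f < HEAT_ON_BELOW_F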

So it worked as intended and I'll do the same test tonight->tomorrow. Wouldn't mind pulling the BMV out though; I just rely on it for sharing temp data with the MPPT to block charging, and for sharing voltage with the MPPT from a more accurate point. The latter just results in overcharging, and you can't disable voltage sharing from the Victron SmartNetwork. So if the BMS gives me accurate SOC information just like the BMV, and can block charge current by temp, pack overvoltage, cell overvoltage, etc. while still keeping the MPPT connected to the battery, I'll yank it out today.
 

Rando

Explorer
It works just fine - try setting your charge under temp to 30C (and release value to 35C). Loads will still work fine, but it won't accept a charge current.
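
Conceptually the BMS is just running two independent enables. A rough sketch of the idea (illustrative only, not any particular BMS's firmware; the limit values are assumptions, with the 30C under-temp being the test value above):

# Two independent enables in a common-port BMS (illustration only).
def charge_allowed(cell_temp_c: float, max_cell_v: float) -> bool:
    # charging is blocked below the under-temp setpoint or above the over-voltage limit
    return cell_temp_c >= 30.0 and max_cell_v <= 3.65

def discharge_allowed(min_cell_v: float) -> bool:
    # discharge stays enabled on its own, as long as no cell is under-voltage
    return min_cell_v >= 2.5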

Your MPPT charger will be fine - it will crank the voltage up to the absorb setting, hold it there briefly, notice the current is zero, think the battery is charged, then drop back to float.
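
A simplified model of that stage behavior (not Victron's actual algorithm; the tail-current threshold is an assumed figure):

# To the charger, a BMS that blocks charging looks exactly like a full battery.
def next_stage(stage: str, battery_a: float, tail_a: float = 0.5) -> str:
    if stage == "absorption" and battery_a <= tail_a:
        return "float"            # current ~0 A, so absorption ends and it drops to float
    return stage                  # bulk -> absorption and re-bulk transitions omitted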

This is how all the drop-in lithium batteries work - and as far as I know, no one is reporting issues with this scheme.
 

john61ct

Adventurer
Trying to get more than 10 years from a battery bank is noble, but most folks aren't building a $30k pack for an off-grid cabin.

5,000 charge cycles with no capacity loss? Not possible. Less than 5%? Maybe, if the cycles are very shallow. 5,000 cycles with less than 10% loss is doable, but calendar aging has to be taken into account.
I think your statements are only true for those treating vendor specs as care recommendations rather than stressful maximums to be avoided with comfortable margins.

Many people claim to cycle between "20-80% SoC", which I consider crazy wasteful of capacity.

3000 cycles with a 2-4% capacity loss from benchmarked peak is bog standard, afaic barely broken in, and still far above mfg rated capacity as sold.
 

luthj

Engineer In Residence
3000 cycles with a 2-4% capacity loss from benchmarked peak is bog standard, afaic barely broken in, and still far above mfg rated capacity as sold.

This is because many batteries are delivered at 5-10% above labeled capacity. Those 3,000-cycle tests are often performed over short calendar times as well. Calendar aging, especially at ambient temps above 75°F, becomes the dominant factor. Even avoiding the top 5%, you can't stop electrolyte and anode breakdown.
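
Rough numbers to show why (the 5% over-label delivery and 4% fade are just assumed examples):

# A "100 Ah" cell that ships over label can fade noticeably and still read above the label.
labeled_ah = 100.0
delivered_ah = labeled_ah * 1.05           # shipped ~5% above label
after_fade_ah = delivered_ah * (1 - 0.04)  # 4% fade from the benchmarked peak
print(after_fade_ah)                       # 100.8 -> still above label despite real degradation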
 

hour

Observer
It works just fine - try setting your charge under temp to 30C (and release value to 35C). Loads will still work fine, but it won't accept a charge current.

Your MPPT charger will be fine - it will crank the voltage up to the absorb setting, hold it there briefly, notice the current is zero, think the battery is charged, then drop back to float.

This is how all the drop-in lithium batteries work - and as far as I know, no one is reporting issues with this scheme.

You are correct. The MPPT was still putting out 4 watts in float but the BMS reported nothing in or out - maybe it was going directly to the loads (4w from router + other gizmos). Once I heated the box up it cleared the low temp alarm and the MPPT started putting out more. This is great.
 

Rando

Explorer
I think your statements are only true for those treating vendor specs as care recommendations rather than stressful maximums to be avoided with comfortable margins.

Many people claim to cycle between "20-80% SoC", which I consider crazy wasteful of capacity.

3000 cycles with a 2-4% capacity loss from benchmarked peak is bog standard, afaic barely broken in, and still far above mfg rated capacity as sold.

What data is this based on?

The actual research (posted before) doesn't support this at all - even under 'mild' conditions the actual measurements are showing 5-10% loss in capacity over a couple of hundred cycles.

If you are basing these on the Chinese manufacturers' 'spec sheets' - there is no way those are based on actual data.
 

luthj

Engineer In Residence

My own LFP bank is now beyond 1200 cycles (took a very long time and lots of work to do that) and has exhibited virtually no capacity loss, and no cell drift. In the first 50 cycles I actually saw a minor bump in capacity.
 

john61ct

Adventurer
He actually hit over 2000 cycles some time ago. Purposefully cycling to 80% every cycle, even when he doesn't need to.

See, most people think you should charge to the "maximum stress" rating of 3.6Vpc, when in fact that's idiotic. And 99.99% of the time there is absolutely no reason for **any** Absorb Hold Time for a CV stage, certainly none longer than it takes to reach endAmps = 0.05C.

The exception is occasionally measuring capacity precisely: if you want to replicate your original benchmark, going to that "mfg theoretical" definition of 100% SoC, say twice a year - e.g. holding 3.65V until a taper as low as 0.02C (2A per 100Ah) - won't do much harm.

Even with a gentler "normal usage cycling" definition of Full - 3.45Vpc, endAmps = 0.05C - that AHT is only needed to reset your battery monitor to 100% Full.

For the cycles when you're not doing that, just use CC-only charge termination.
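
Putting those setpoints together as a sketch (the voltages and endAmps values are the ones above; the function itself and the 100Ah default are just illustration, not any charger's actual firmware):

# Three termination styles as described above.
def charge_done(mode: str, cell_v: float, charge_a: float, capacity_ah: float = 100.0) -> bool:
    if mode == "cc_only":        # everyday cycling: stop once 3.45Vpc is reached, no hold
        return cell_v >= 3.45
    if mode == "monitor_reset":  # hold 3.45Vpc until current tapers to 0.05C, to re-sync the monitor
        return cell_v >= 3.45 and charge_a <= 0.05 * capacity_ah
    if mode == "benchmark":      # occasional 3.65Vpc hold down to ~0.02C for a capacity benchmark
        return cell_v >= 3.65 and charge_a <= 0.02 * capacity_ah
    raise ValueError(mode)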

And only go to Full when loads are ready to run SoC back down; sitting anywhere near Full is very harmful to cycling longevity.

While cycling, set LVC between 3.1-3.2Vpc depending on discharge current rates; there's no reason to allow **resting** voltage to get anywhere near 3.1Vpc.

Keep charging rates low, ideally below 0.2C, and certainly no higher than 0.4C even in warm weather.

But if you get internal cell temps up to around 30°C, then pushing up to 0.5-0.8C, maybe even 1C, may well have little significant impact on cycling lifespan.

Of course once no longer cycling, the cooler the better, but not much below 0°C, depending on cell specs.
 

shade

Well-known member
What I keep seeing is that staying in the middle of a battery's specs maximizes life, but going to the limits occasionally (100% SoC briefly for periodic balancing & meter calibration, <0.3C rates at suitable temperatures, <90% DoD) won't necessarily kill an LFP battery. Routinely going to the limits may decrease longevity, and exceeding limits may do so dramatically. Avoid doing both, and it'll be fine.

Determining the limits seems to be the tricky part. Considering how finely tuned a BMS can be to properly care for an LFP battery, I find that surprising. I wonder if the lack of information is a deliberate attempt to prevent undercutting their promise of a "drop-in" solution.
 

john61ct

Adventurer
Yes, I don't believe in drop-ins at all, so my LFP-related statements likely do not pertain.

If a cheap BMS is how you balance for example, with likely a very low balance rate, you may need to sit holding 3.55V or more for many hours or even days after your cells already passed your desired specification for "Full" SoC.
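
To put rough numbers on "many hours or even days" (a 50 mA bleed current and 1 Ah of cell imbalance are assumed figures, not any particular BMS's spec):

# Back-of-envelope balance time for a passive balancer (assumed numbers only).
balance_current_a = 0.05    # ~50 mA bleed current, typical of cheap BMSs
imbalance_ah = 1.0          # assumed spread between highest and lowest cell
print(imbalance_ah / balance_current_a)   # 20 -> twenty hours parked at the balance voltage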

How do you even know what the "start balance" voltage setpoint is, or when the cells are finished getting balanced, how often it needs doing?

The whole premise is, "don't concern yourself with these details, just trust us, we'll back up our warranty just drop in and go".

IMO drop-ins should only be discussed in threads where they specifically are the topic.

Same with a full packaged system from Lithionics or Victron, each has its own "canned care specs" embedded within their proprietary protective circuitry, get trained by the installer and follow their protocols.
 

Alloy

Well-known member
It sounded to me that you were suggesting a glycol bath for the batteries.

But even if it is some other way of getting the glycol to transfer heat to the batteries, is the complexity of some sort of boiler or heat exchanger, a circulator pump, a thermostat etc. really a reasonable solution here?

My point here is that this appears to be adding a huge amount of complexity to address something that has not actually been established to be a significant issue. Again, if you are into steam punk and would enjoy doing this, go right ahead, but be cognizant of the cost/benefit analysis.

Glycol heating here at 1:55

 

john61ct

Adventurer
Note BTW that the underside of a solar panel gives off a lot of surplus heat - heat that also lowers the panel's cell efficiency.

I've seen systems that boost PV output by cooling the panels and capture significant "passive solar" energy for hydronic heating.
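
As a very rough illustration of how much heat is there (full-sun irradiance of ~1000 W on ~1 m² of panel and ~20% module efficiency are assumed figures):

# Crude estimate of the heat available behind a PV panel (assumed figures, not measurements).
irradiance_w = 1000.0    # assumed full sun on roughly 1 m^2 of panel
module_eff = 0.20        # assumed electrical efficiency
print(irradiance_w * (1 - module_eff))   # ~800 W rejected as heat, in principle capturable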

With decent insulation it may even work in arctic conditions...
 

shade

Well-known member
Yes, I don't believe in drop-ins at all, so my LFP-related statements likely do not pertain.

If a cheap BMS is how you balance for example, with likely a very low balance rate, you may need to sit holding 3.55V or more for many hours or even days after your cells already passed your desired specification for "Full" SoC.

How do you even know what the "start balance" voltage setpoint is, or when the cells are finished getting balanced, how often it needs doing?

The whole premise is, "don't concern yourself with these details, just trust us, we'll back up our warranty just drop in and go".

IMO drop-ins should only be discussed in threads where they specifically are the topic.

Same with a full packaged system from Lithionics or Victron, each has its own "canned care specs" embedded within their proprietary protective circuitry, get trained by the installer and follow their protocols.

You're right. I should've specified "drop-in" batteries with regard to slippery specs & charging requirements, but can't the same be said for some bare cell providers of questionable provenance?
 
