Understanding battery separators versus battery isolators for charging via a car alternator.

Wrathchild

Active member
I’m genuinely curious about my Silverado.
If alternators output lower voltage the warmer they are, and batteries don’t like to sit at 14+ volts, then why does my voltage get the highest after 45 min of driving?
I've got a voltage readout on my cigarette lighter plug, and I see my highest voltage, 14.8-15.3, after I get to work (a 45-minute drive) and let the truck idle for a few minutes. Longer 2-hour drives do the same while still driving. Initial voltage is something like 14.3.

Newer Silverados have "smart" alternators. I think it has set output voltages of 14, 13, and 12-something based on what your actual battery state of charge is. Mine used to always take a several-hour drive before the alternator dropped into the 12s. Now that my solar is topping up the truck battery along with the house, I see 14-ish for the first few minutes, then it drops down to 12-ish.

Your battery current sensor may have died and that’s why you’re not seeing alternator output variations.
 

rayra

Expedition Leader
I’m genuinely curious about my Silverado.
If alternators output lower voltage the warmer they are, and batteries don’t like to sit at 14+ volts, then why does my voltage get the highest after 45 min of driving?
I've got a voltage readout on my cigarette lighter plug, and I see my highest voltage, 14.8-15.3, after I get to work (a 45-minute drive) and let the truck idle for a few minutes. Longer 2-hour drives do the same while still driving. Initial voltage is something like 14.3.
Surface charge on the battery, among other things. Park the car and check it a couple of hours later and you'll get a lower reading.
 

moose545

Active member
There's a big difference between a wire gauge sufficient for passing the max charging amps of your alternator rating and the juice that needs to be passed for a starter (which on most vehicles calls for 1/0 or 4 AWG). I would not rely on 8 AWG to power a starter; I guess it depends on the draw of your starter.

[Attached image: wire gauge/ampacity chart]
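For anyone without the chart handy, here's a rough back-of-envelope way to compare gauges. This is a minimal sketch: the resistance figures are standard handbook values for copper at room temperature (not taken from the chart above), and real cable will vary with temperature and strand count.

```python
# Round-trip voltage drop for a two-conductor copper run (illustrative only).
# Resistance values are standard ohms per 1000 ft at ~20 C.
OHMS_PER_1000FT = {"1/0": 0.098, "2": 0.156, "4": 0.249, "8": 0.628, "10": 0.999}

def round_trip_drop(awg: str, one_way_feet: float, amps: float) -> float:
    """Drop across both conductors of an out-and-back run, in volts."""
    resistance_ohms = OHMS_PER_1000FT[awg] / 1000.0 * one_way_feet * 2
    return amps * resistance_ohms

print(f"4 AWG, 20 ft, 50 A: {round_trip_drop('4', 20, 50):.2f} V")   # ~0.50 V
print(f"8 AWG, 20 ft, 50 A: {round_trip_drop('8', 20, 50):.2f} V")   # ~1.26 V
```

Scale the current up to starter levels and the drop on small wire gets out of hand quickly, which is the point of the chart.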




BTW, I can vouch for these inexpensive 20' 4 AWG jumper cables; they are a true fine-strand copper wire set.

Even counting the bunch of cheap accessories in the kit, at full kit price the wire alone comes to <75 cents/foot.

So the good news is my cables are run to power the AUX in the rear compartment; the bad news is I should've gone 25', not 20', as I'm short now. On a direct run I would've had slack, but I had to run to the end of the truck to use the grommet there, then back forward to the rear of the second-row seats, then to the passenger side where the box will be. When adding the 3-4' extension to the battery, are Anderson plugs the best way to splice the 4 AWG? If not, what would you use? I'm off to get the wire at Home Depot; connectors will be cheaper on Amazon, I'm sure, and I doubt HD even has them. Thanks.
 

burleyman

Active member
The 50 dollar dual-battery thread is golden, and great for simple auxiliary battery charging, but it pays to know what you have to begin with. Either you have a charging system with a high enough voltage or you don't. If you don't, either boost the alternator's charging voltage somehow or use another charging system with the required higher voltage. If you do have the higher voltage, either you have amps flowing or you don't, and you won't know how many charging amps are flowing unless you measure. Installing the 50 dollar special does not guarantee high-current house battery charging from the alternator, even if it has the required high voltage.

Using the alternator for charging an auxiliary battery with a solenoid/isolator OR a DC-DC booster can really tax the alternator.

Probably TL;DR. I'm not a Legitimate ExPo Guy; I'm basically interested in keeping a 12V fridge running with a simple system. This is not for the highly technically skilled posters who've advanced way beyond my needs and experience, and it surely needs editing.

The values below are from over three years ago and old-man memory; they might not jibe mathematically and are meant to illustrate system operation rather than specifics.

I purchased a 1994 Econoline with a factory-installed simple solenoid-connected system that seemed to be the fifty dollar special, and a 130 amp alternator that maxed out at 14.8 VDC. The voltage reduces magically to about 14.0 when both batteries are charged and everything's heated up. Batteries up front on both sides, connected with a #4 copper wire. 100 Ah AGM for the house. Solenoid energized except in OFF and START. I later found out why the batteries aren't connected in START. I installed a toggle switch to turn off the solenoid when desired.

Happy, lucky me until noticing that when the house battery needed a full charge, the voltage measured between the start and house battery positive posts differed by 0.6 VDC or so; the house battery read 0.6 V less than the charging battery.

That made it seem the solenoid's contacts were iffy, causing resistance, reduced amp flow, and power loss. The continuous-duty solenoid is hidden underneath the start battery's box, accessible by removing the headlamp and looking through a fist-sized hole behind it. I finally noticed the smallish (no larger than #10) yellow wire on the right side of the main contact. The other side has the red #4 headed to the house battery. ?????? Picture below.

I “won” a 1994 EVTM (wiring) manual and discovered the yellow wire on the solenoid was fed from a hot-at-all-times 60 amp fuse “T” in the engine fuse panel. DC voltage measured directly across the solenoid's contacts at maximum charging read 0.009 VDC. Good contacts. The 0.6 VDC drop measured from battery to battery was actually due to resistance in that small yellow wire between the fuse box and the solenoid, not across the contacts. It seems the Ford designers deliberately undersized the yellow wire, using its resistance to limit current flow, and the fuse made sure it didn't rise above 60 amps maximum. That 130 amp alternator suddenly seemed much smaller. Diagram picture below.
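Putting rough numbers on those measurements, using the just-under-thirty-amp initial charge current mentioned a couple of paragraphs down (a back-of-envelope sketch, not anything I measured beyond the voltages):

```python
# Implied resistances and loss from the measured voltage drops (illustrative).
charge_amps = 30.0            # approximate initial charge current through the factory path
drop_contacts = 0.009         # volts measured directly across the solenoid contacts
drop_total = 0.6              # volts measured battery post to battery post

r_contacts = drop_contacts / charge_amps                    # ~0.3 milliohm: contacts are fine
r_feed_path = (drop_total - drop_contacts) / charge_amps    # ~20 milliohm, mostly the yellow wire
watts_lost = drop_total * charge_amps                       # ~18 W heating the feed path

print(f"contacts ~{r_contacts*1000:.1f} milliohm, feed path ~{r_feed_path*1000:.0f} milliohm, loss ~{watts_lost:.0f} W")
```

Twenty-ish milliohms doesn't sound like much until you realize it eats roughly 0.6 V out of only 2.6 V of charging headroom.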

Then it got worse. My biggest mistake was not measuring, as step #1, the actual current flow to the house battery when maximum charging was needed. Stupid me. Without knowing actual DC charging amps, everything else is speculation. The importance of measuring amps is mentioned more in older threads, but not so much recently in the DC-DC charging/solenoid/isolator debates. Whatever system is used, device amp ratings or voltages mean little if the amps aren't flowing.

My 100 amp hour AGM, when down to about 12.2 VDC and connected to a 14.8 VDC source with heavy cables, will accept amps in the mid-to-upper forties, but that charge current drops fairly rapidly into the thirties. When using the original factory-supplied small yellow wire feed through the solenoid, the initial charge current is slightly less than thirty amps, and it drops to about sixteen amps in short order as battery voltage/internal resistance rises and, possibly, the yellow wire heats up. Let's hear it for the 14.8 V, 130 amp alternator and the amp-flow-limiting wiring!
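A crude Ohm's-law sketch of why the small wire caps the current. The "effective" battery resistance below is back-solved from the heavy-cable test and lumps real charge-acceptance behavior into one number, so treat it as illustration rather than battery science:

```python
# Lumped-resistance picture of the two charging paths (illustrative only).
V_ALT = 14.8
V_BATT_REST = 12.2
HEADROOM = V_ALT - V_BATT_REST                 # ~2.6 V available to push current

R_BATT_EFF = HEADROOM / 47                     # ~55 milliohm, back-solved from the ~47 A heavy-cable test
R_YELLOW_PATH = 0.020                          # ~20 milliohm, from the 0.6 V drop at ~30 A

i_heavy_cable = HEADROOM / R_BATT_EFF                          # ~47 A by construction
i_factory_path = HEADROOM / (R_BATT_EFF + R_YELLOW_PATH)       # ~35 A, in the ballpark of the just-under-30 A observed

print(f"heavy cable ~{i_heavy_cable:.0f} A, factory yellow-wire path ~{i_factory_path:.0f} A")
```

As the battery's terminal voltage climbs, the headroom shrinks and both numbers taper off, which matches the rapid fall into the thirties and teens described above.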

ALL of that 130 amp alternator's possible output flows through two #12 gauge grayish fusible link wires, also in the diagram below. Go dig around the web and find out what current those fuse links can pass, and for how long, before blowing. That is not just a Ford thing. When they heat up, they become resistors. Try to feed a too-hungry battery or accessory load and, possibly, poof: you're hoping to make it somewhere on battery power alone. I can see why links sometimes get replaced by fuses.

The next experiment was with a 2000 watt inverter connected to the house battery, with a 120 VAC, 500 watt floodlight load. House battery drained to about 12.2 V, not connected to the start battery, engine not running. DC amps from the house battery to the inverter/floodlight were in the upper forties. Fire up the engine, and the amp flow that previously maxed out at thirty amps through the factory solenoid now reads fifty-something. ?????. Getting close to the 60 amp factory fuse rating. That 60 amp fuse also probably explains why the solenoid is not energized in START. Imagine a discharged start battery.
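The upper-forties draw checks out on paper if you assume a typical inverter efficiency somewhere in the high eighties (the efficiency figure is an assumption, not a spec for my inverter):

```python
# DC input current for a given AC load through an inverter (rough sanity check).
ac_watts = 500            # the 120 VAC floodlight
battery_volts = 12.2      # drained house battery
inverter_eff = 0.88       # assumed typical efficiency for a 2000 W class inverter

dc_amps = ac_watts / (battery_volts * inverter_eff)
print(f"~{dc_amps:.0f} A from the house battery")   # ~47 A, i.e. upper forties
```

With the engine running, that load plus whatever the battery itself accepts all has to come through the 60 amp fused yellow-wire path, hence the fifty-something reading.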

It seems the factory system was intended for house battery charging ONLY, and not for the house battery AND significant fixed loads connected to it at the same time. It also seems internal battery resistance was counted on: as it rises, the amp flow quickly falls, maybe for longer alternator life? Warranties?

Next experiment. No load on the house battery. House and start batteries both drained to about 12.2 VDC, and a #1/0 AWG welding cable with clamps ready to connect as soon as the engine started. That ought to solve the voltage loss from the factory system. It did. Got almost fifty initial amps to the house battery, an unknown amount to the start battery, and some protesting from the alternator, engine, and belt at idle. A hungrier battery chemistry accepting more amps could make something angry really fast.

For over three years, when needing faster charging, before leaving camp I let the factory solenoid system bring the house battery up a little to limit the wallop to the alternator from the big cable, then close the manual 100 amp switch and drive. I turn off the big cable when parked. That manual switch will someday be something smart; I'm becoming more forgetful.

Two digital voltmeters, both reading identically when connected to the same source and checked against a calibrated meter, now reside under the dash within eyesight: one connected to start, the other to house. It is depressing sometimes how long it takes for them to equalize. When solar voltage is greater than the alternator's, it's interesting going under the shadow of an overpass. Voltage blip, blip. The batteries equalize sooner via the 1/0 cable. I know how to replace the alternator, as well as the brush/regulator assembly and bearings. They are also quite inexpensive and long-lived, but aggravating to repair on the road.

Measuring amp flow is critical, especially on a simple system without current monitoring. Photo below of an old man's meter collection going back forty years.

When tenths of volts are important, so are accurate voltmeters. Attaching multiple voltmeters to one 12V battery can be a real eye-opener. Most of the readouts on charge controllers and other solar doodadery I've seen are close. Some multimeters have internal adjustments for calibration. My only source for comparison against a bona fide calibrated meter is a battery supplier/starter/alternator rebuilder forty miles away that I occasionally visit.

If there is ever a next time involving house/auxiliary battery charging, step #1 will be to connect the house battery, discharged to an acceptable low voltage, to the start battery with the cables to be used, then measure DC current flow and post-to-post voltage loss while listening for engine/alternator protests. Then maybe install.

If using a DC-DC booster, are the rated amps actually flowing? Not often mentioned is that the current flowing into a DC-DC boost charger can be 125% of its output: a 20 amp rated DC-DC charger could be drawing 25 amps from the alternator, and a 40 amp unit could draw 50 amps. So either an isolator/solenoid or a DC-DC boost charger places strain on the alternator/belts/pulleys, and you don't know how much strain unless amps are measured. The 50 dollar special works for my application, but the charging amps to the battery aren't known without measuring.
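The 125% figure falls out of the voltage step-up plus converter losses. The voltages and efficiency below are assumed typical numbers, not specs for any particular charger:

```python
# Input current drawn by a DC-DC boost charger for a given rated output (rough sketch).
def input_amps(output_amps: float, v_out: float = 14.4, v_in: float = 12.8, eff: float = 0.90) -> float:
    """Conservation of power: output watts divided by input volts and efficiency."""
    return output_amps * v_out / (v_in * eff)

print(f"20 A charger: ~{input_amps(20):.0f} A from the alternator")   # ~25 A
print(f"40 A charger: ~{input_amps(40):.0f} A from the alternator")   # ~50 A
```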



[Attached photos: IMG_2967.jpg, IMG_2966.jpg, IMG_2978.jpg]


Found this YouTube video after typing this. It explains it much better than the blather above.

 

burleyman

Active member
Thanks, Verkstad. I hadn't thought of that. I can't even hear the fridge's low-voltage alarm/shutdown squeal without a hearing aid.

The idea that a typical factory-supplied alternator is a huge bulk charging device that can be harnessed with some big wire and a 200 amp solenoid, so that simply driving will provide the maximum alternator-rated amps, may need reconsidering. Even the 50 amp input to a 40 amp DC-DC charger can load the alternator. Without measuring charging amps, it's all speculation.

If the alternator is heavily loaded, its life is shortened, and there is an increased strain on the belt system.

Instead of heavily loading the vehicle's charging system, it's enough to make me consider running my seldom-used generator to power an inexpensive dumb charger rated for the desired bulk amps while driving.
 

DiploStrat

Expedition Leader
Dude, you are WAY overthinking/worrying this. I have an all-electric camper, my second, and blow through and recover around 125 Ah a night. My Chevrolet with a 200A alternator and my Mercedes with a 100A/24V alternator (and B2B) have handled these loads easily for years. I do have a healthy solar charging system as well, but your vehicle's alternator is a great place for bulk charging.

Things have changed with the adoption of lithium batteries, but a cooler/fridge and a 100 Ah AGM battery is simply not enough of a load to worry about. Really.
 

burleyman

Active member
I'm not worried, and I stated that the alternator system has worked just fine for my measured charging needs for over three years and will continue to. I like it, also in conjunction with solar. My worrisome blather was related to the 100+ amps some posters claim can be had from their stock alternator systems for charging, by basically adding a large wire to the other battery. That could be a problem, especially if those vehicles have smaller-rated alternators and the charging current remains high for a longer period of time.

That overthinking was just a warning that overloading your stock charging system is possible and that ammeter measurements can be valuable for avoiding problems. Know your system's capabilities beforehand.
 

rayra

Expedition Leader
I'm not worried, and I stated that the alternator system has worked just fine for my measured charging needs for over three years and will continue to. I like it, also in conjunction with solar. My worrisome blather was related to the 100+ amps some posters claim can be had from their stock alternator systems for charging, by basically adding a large wire to the other battery. That could be a problem, especially if those vehicles have smaller-rated alternators and the charging current remains high for a longer period of time.

That overthinking was just a warning that overloading your stock charging system is possible and that ammeter measurements can be valuable for avoiding problems. Know your system's capabilities beforehand.

Blather indeed. You're reading things that weren't said, while reading into and giving far greater value to stuff like 'increased load on alternator = drag = increased belt wear.' A $15 / 100,000-mile part, at that.

I've been pretty clear about factory ALT amp ratings, and about aftermarket ALTs with higher ratings. Nor am I even making the blanket statements you infer. I'm describing what works for me, and why I think/believe it is an adequate solution for aux battery charging, and more so a far better solution for charging while driving than some restrictive DC-DC device that adds a stiff $ cost while LIMITING power transfer to an aux battery.

Quite simply, all those vehicle elements are wear parts; they'll need to be replaced if one keeps/uses a vehicle long enough. Not getting the maximal wear-life out of them is an asinine line of argument, especially if the concomitant argument is 'hey, spend hundred$ more on this new charging device that "solves" your mismatched battery chemistry.'


Just as an aside, I bought my next standard grp78 battery today and put it in as my starter/vehicle battery. Bumped that battery to my aux. Bumped my aux to my old pickup. About 18 months between buys. My batteries are relatively fresh heading into the hottest part of the year. I tend to replace stuff before it fails: 'preventative maintenance.' Another reason I find some nebulous argument about increased belt wear in a charging discussion to be, well, ludicrous.
 

fratermus

FT boondocker
It's Cargo Cult nonsense

A user with no higher-voltage charging source (solar, shore, or suitable alternator) to reach battery mfg charging setpoint recommendations might benefit from a DC-DC charger. As for me, I use a regular VSR, since solar handles absorption/float duties. I have no dog in this fight; rather, I am pushing back a bit on the "cargo cult nonsense" angle.


I can't see the use case for these things for a House battery. Other than the need to charge an AGM to the Nth degree.

To my mind, charging a deep cycle battery to mfg specs is "charging correctly" rather than a case of "Nth degree", but maybe that's po-tay-toh / po-tah-to on my part. I am on a very small stipend and live in my camper offgrid full-time. Battery health and performance are critical to me.


during which time you are driving. And instead of the 100A+ you could be putting into that Aux battery from your vehicle alternator, you are instead limited to the ~20A (or 50A or whatever)

AGM batteries charged at their max rate hit typical alternator voltage (high thirteens) from 50% DoD in just a few minutes, like less than 5 minutes. At that point the battery will no longer be accepting that admittedly foxy 100A rate. A smaller AGM, say 200Ah, would only be able to pull ~70A max from the alternator. My 220Ah of flooded GC will pull ~42A max at 50% DoD, right in line with the C/5 rule of thumb. I'm not going to buy a DC-DC charger, but if I did, the 50A would be more than my bank could use.
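For anyone following along, the rule-of-thumb arithmetic (roughly C/5 max acceptance for flooded lead, roughly C/3 for AGM) works out like this; the divisors are rules of thumb, not datasheet values:

```python
# Max bulk-charge acceptance from capacity, using common lead-acid rules of thumb.
def c_rate_amps(capacity_ah: float, divisor: float) -> float:
    return capacity_ah / divisor

print(f"220 Ah flooded GC at C/5: ~{c_rate_amps(220, 5):.0f} A")   # ~44 A, near the ~42 A observed
print(f"200 Ah AGM at C/3: ~{c_rate_amps(200, 3):.0f} A")          # ~67 A, near the ~70 A figure above
```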

But alternator-only charging of lead chemistries in land-based vehicles really isn't about proper charging; it's about the least bad form of undercharging (slowing down the battery murder). Are we better off with sufficient current and insufficient voltage, or sufficient voltage and insufficient current? It's expensive to get both sufficient current and voltage from an alternator, and even if we laid out the $$$, it still wouldn't fix the absorption-duration problem.

Hence adding some solar (as I think we agree on).


Or even more useful, vehicle solar with a charge controller that will properly top off that AGM battery, on top of the bulk charge provided by your vehicle?

Yes, that or a battery chemistry that doesn't care about partial states of charge.
 

fratermus

FT boondocker
So how many amps can a discharged AGM take?

Typically C/3 until the alternator's voltage or controller setpoint is reached. After that the battery takes less and less current until it bottoms out (or you run out of gas).



How long to charge via the alternator?

More time than most folks are willing to drive/idle on a daily basis. Correctly charging (proper absorption and duration) a deeply cycled AGM takes many hours and a proper charger. Typically 5+ hours for a healthy AGM and longer for an old/abused one.
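As a rough illustration of where the 5+ hours comes from, assuming a 100 Ah AGM cycled to 50% (all of the figures below are assumptions for illustration, not datasheet numbers):

```python
# Very rough charge-time estimate for a deeply cycled AGM (illustrative only).
capacity_ah = 100
depth_of_discharge = 0.5
ah_to_return = capacity_ah * depth_of_discharge * 1.1   # ~10% extra for lead-acid charge inefficiency

bulk_fraction = 0.8              # roughly 80% of the deficit goes back during bulk
bulk_amps = capacity_ah / 3      # ~C/3 acceptance while below the absorption setpoint
bulk_hours = ah_to_return * bulk_fraction / bulk_amps

absorption_hours = 4             # the long, low-current tail; usually the dominant term

print(f"~{bulk_hours:.1f} h bulk + ~{absorption_hours} h absorption = {bulk_hours + absorption_hours:.0f}+ hours")
```

The bulk stage is quick; it's the absorption tail that a short daily drive can't provide.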

Charging at normal alternator voltages is not recommended by any lead deep cycle battery manufacturer that I know of. RVs used to do that with single stage converters that basically charged at Vfloat, and it wasn't pretty.

I am torn between using a solenoid, a DC-DC charger, or an inverter and a 110V charger while on the road camping. I have a maintainer where the battery lives 99 percent of its life in the garage. The vehicle is a Subaru Outback.

  • the most robust solution would probably be house battery + solenoid + a bit of solar
  • I'm not a fan of those lithium power pack all-in-one things, but if you are weekending with light power requirements it might be a good fit.
  • Since you get back to shore power regularly, carbon-foam and a regular solenoid would also work. Carbon-foam doesn't care as much about proper charging while you're out, as long as it gets a good charge when you get back.
  • Same for lithium house battery (BB, etc), but that's a lot of capacity you might not need
  • if you are driving for many hours several times a week the DC-DC by itself would be fine-ish
  • if your power needs are very light you could mount a panel on the roof rack and run the charge controller to the starter batt. The loads would come off a low-voltage disconnect so you couldn't run down the starter batt. You'd charge devices, etc., when there was excess power coming in and only run tiny stuff like LED lights and a personal fan at night (rough energy-budget sketch below).
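A rough daily energy-budget check for that last option. Every number here is an assumption for illustration; size it against your actual panel and loads:

```python
# Daily solar harvest vs. light loads for a panel-to-starter-battery setup (illustrative).
panel_watts = 100
sun_hours = 4.5                  # usable full-sun-equivalent hours per day
system_derate = 0.75             # controller, wiring, panel angle, dirt

harvest_wh = panel_watts * sun_hours * system_derate        # ~340 Wh/day in

led_lights_wh = 5 * 4            # 5 W of LED lighting for 4 hours
fan_wh = 10 * 8                  # small personal fan overnight
device_charging_wh = 60          # phone/tablet top-ups
load_wh = led_lights_wh + fan_wh + device_charging_wh       # ~160 Wh/day out

print(f"in ~{harvest_wh:.0f} Wh/day vs out ~{load_wh} Wh/day")
```

If the daily harvest comfortably exceeds the loads, the starter battery mostly just passes power through and the low-voltage disconnect is there as the safety net.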
 

rayra

Expedition Leader
Well, fratermus, I'm coming at it reacting to the sudden rash of DC-DC topics flooding this board, most of them by folks who seem to have picked up the idea somewhere and don't seem to really grasp the technicalities of what they are cobbling together. They aren't even aware of 'first principles' and are only vaguely cognizant of what they are trying to accomplish. Of course it is their money to spend, and I'm not trying to make a virtue out of poverty. I'm just laughing and cursing when I see a person trying to poorly execute a design with a specialty device that costs more than my entire system cost to rig, so I find it difficult not to deride it. I do try to inform and provide rationales and info sources for what I post in these regards. There's just a lot of very overpriced stuff under the 'overlanding' marketing umbrella and seemingly very little grasp of tech and design goals. More like 'hmm, I'll just stick this $500 device in here and totally throttle the available power to the thing I want to have as much power as possible.' It just makes no sense.
 
