Power question: REDARC DC-DC charger? What gauge of wire would be needed?


Wiffleball Batter
This is purely an academic question but I know there are a lot of knowledgeable electric system gurus here so I thought I'd ask this:

I keep seeing ads on Expo for the Redarc DC-DC battery charger. Here's an amazon link, although I'm sure most of you have seen the ad on the forum:

I THINK I know what this is but I'm not 100% sure. So tell me if I've got this correctly:

Right now in my Suburban I have a dual battery setup with both batteries in the engine compartment (because the Suburban has a dedicated spot for a 2nd battery, as most modern GM vehicles do.) My "house" battery is connected with the - terminal going to a chassis ground and the + terminal going to an isolator that is, in turn, connected directly to the "starter" battery. Thus, when the engine is running the alternator charges the starter battery, power flows to the isolator, which is closed because the ignition is on and thus charges the "house" battery. When the ignition is off the isolator is "open" and thus the "house" battery will not discharge the "starter" battery when the ignition is off because the two batteries are not connected. So far so good, right? A basic "isolated dual battery" setup (my "house" battery is pretty much dedicated to the Truck Fridge and I don't use it for much of anything else except maybe occasionally to charge a phone.)

So let's say I have a vehicle that does not have space for a 2nd battery under the hood and I decide I want to run a "house" battery in, let's say, the bed of a pickup (appropriately sheltered, of course.) Certainly I COULD set it up the same way as my current setup, with a direct connection from the house battery to an isolator and then from the isolator to the starter battery, but that might require me to run heavy 1 or 2AWG cable from the house battery to the isolator and then from the isolator to the starter battery - which is both difficult and expensive. And the cable needs to be 1 or 2AWG because of the immense amp load it's carrying. Depending on the size of the truck and the location of the starter battery, that could mean running 10-15' of 1 or 2AWG cable.

Have I got all that right?

So if I'm understanding right, the REDARC DC-to-DC charger is a way of keeping a 2nd battery charged WITHOUT needing a direct connection to the main battery and/or a battery isolator, yes? So in theory, I would not need an isolator (because the batteries would not be directly connected.) I would run a power cable from the starter battery to the bed of the truck, and then a much shorter heavy-gauge cable from the REDARC DC-DC battery charger to the battery.

EDITED TO ADD: And as a bonus the REDARC also seems to function as a charge controller so I should be able to hook a solar panel directly to the REDARC and it will regulate the charge to the battery without the need for a separate solar panel controller? Is that right?

So other than trying to understand the value of a DC-DC charger, my only other question would be: What gauge of wire would be needed to connect the REDARC DC-DC charger to the battery? Presumably it would not have to be the same 1 or 2AWG that I would need for a direct battery connection because it's not carrying that much power (I think the one I linked above is 40a max.) So what gauge would be needed? Isn't there some kind of chart or guideline that says "for X amps and Y length you need Z gauge of wire minimum?"

As I said above, I'm not contemplating doing this, I'm just trying to figure out how this would work.

Thanks in advance for any help!


Wiffleball Batter
Broadly speaking, while there may be an "immense amp load" at first, depending on how far discharged the low battery is and how big it is, that load may not be all that immense after all, or it may not last very long before it drops to a modest load.
Large conductors also reduce the resistance of the charging circuit. It does not take much resistance before the charging voltage drops too low for timely charging.
The Blue Sea Systems website has a good chart. Its suggested values come from the USCG and ABYC. There are others...
So in my hypothetical, let's say I'm setting up a "remote battery" in the bed of a truck that I want to connect to the main battery through an isolator. Let's say the wire distance is 10' (it would probably be less than that, but round up to 10' just to be safe.) Assuming I have a 150a fuse at the battery end, what is the smallest gauge of wiring that I could safely use?

EDITED TO ADD: Using the chart that Dreadlocks posted above, my question would be: How do I figure the maximum amp load between batteries? Do I use the amp rating of the bigger battery?

IOW, let's say my starter battery is a 12v AGM rated at, say, 60AH. The "house" battery is a 12v AGM rated at 68ah. When using the calculator to calculate "amp load", what do I use as the "amp load?" It seems to me I have to assume the possibility of a deeply discharged "house battery" that is going to try and "pull" a lot of amps from the starter battery and/or alternator, right? So what do I use? The AH rating? The CCA rating? Something else?
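The chart logic being asked about boils down to Ohm's law plus an allowable voltage drop. A minimal sketch of that lookup, assuming a 12.8 V system and a common 3% drop target (rule-of-thumb values, not any particular vendor's exact table; the AWG resistances are standard published figures for copper):

```python
# Pick the smallest AWG that keeps voltage drop under a target percentage.
# Resistances are ohms per 1000 ft of copper at ~25 C (standard AWG table).
AWG_OHMS_PER_1000FT = {
    "10": 0.9989, "8": 0.6282, "6": 0.3951, "4": 0.2485,
    "2": 0.1563, "1": 0.1239, "1/0": 0.0983, "2/0": 0.0779,
}

def pick_gauge(amps, one_way_ft, system_v=12.8, max_drop_pct=3.0):
    """Return the smallest listed gauge meeting the drop target (round trip)."""
    max_drop_v = system_v * max_drop_pct / 100.0
    # Walk from highest resistance (thinnest wire) down to thickest.
    for awg, ohms_per_kft in sorted(AWG_OHMS_PER_1000FT.items(),
                                    key=lambda kv: kv[1], reverse=True):
        r = ohms_per_kft / 1000.0 * (2 * one_way_ft)  # out and back
        if amps * r <= max_drop_v:
            return awg
    return None  # nothing in the table is big enough

print(pick_gauge(40, 10))   # -> 6   (a 40A DC-DC over a 10 ft run)
print(pick_gauge(150, 10))  # -> 1   (the 150A-fused direct connection)
```

Under these assumptions, the hypothetical above (150 A fuse, 10 ft run) lands on 1 AWG, consistent with the heavy cable the original post expected, while a 40 A limited charger gets by with 6 AWG.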


Well-known member
Batteries really won't overload wiring the way loads will. If the wiring is too small, the voltage just drops far enough that it's no longer charging the battery, so you can basically limit a battery's charge rate with the wiring. It may get a little warm, but it's not going to catch the cabling on fire like trying to run, say, a motor on too thin a cable. The motor needs, let's say, 200W, and as the voltage drops the amps go up so the wattage stays constant - that's how wiring catches fire. Whereas when charging a battery, the load drops along with the voltage. The wattage is not constant, and if you can't provide a higher voltage than the battery, it's simply not charging.

If you want your AGMs to charge as fast as possible, I believe they want to be set up for 0.4C, but it's not that much faster than 0.2C in the end. 1C = capacity in AH, so a 68AH battery would ideally charge at ~14A-28A.
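The C-rate arithmetic above is just multiplication; a tiny sketch, assuming the 0.2C-0.4C window quoted by the poster (it is their figure, not a universal AGM spec):

```python
def charge_current_range(capacity_ah, low_c=0.2, high_c=0.4):
    """Ideal bulk-charge current window for a battery, given C-rate limits."""
    return capacity_ah * low_c, capacity_ah * high_c

lo, hi = charge_current_range(68)   # the 68 Ah house battery in the thread
print(f"{lo:.1f} A to {hi:.1f} A")  # prints "13.6 A to 27.2 A"
```

That is where the "~14A-28A" figure comes from.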


Did not read all that, but

You do not literally charge from battery to battery. The "Starter" and "House" labels represent the circuits, normally isolated, each with its own bank and charge source(s).

The DC-DC will limit current, so you can size the wire based on that.

If using a VSR instead, then don't worry about amps flowing from bank to bank.

Size wire for the max current from charging, or biggest loads, and/or combined.

The Blue Sea ML version of their ACR can even handle the possible 600A from self-jumpstarting; basically bulletproof.


Expedition Leader
That's almost impossible to get a precise number for. At best you could assume a worst-case scenario.
You would need to know the maximum current a discharged battery could potentially draw at whatever voltage is available. And the available voltage and current are themselves variables, depending on the specifics of the charged battery and/or alternator plus whatever resistance is in the circuit.
That's the crux. Battery charging is complex, and depth of discharge doesn't strictly follow the water-flow analogy everyone uses to explain electricity. A discharged battery isn't exactly like an empty bucket, and the flow of water (electrons) to refill it isn't necessarily huge. The circuit is inherently limiting, whether through the internal resistance of the batteries, the voltage/current relationship of the battery or alternator, or the resistance of the wire.

It's probably easiest to point out that truck manufacturers don't put a 1/0 AWG hot feed in a trailer harness because they know a 10 AWG feed will never overload regardless of 10% or 90% depth of discharge. The current through the harness resistance means a voltage drop that implicitly guarantees that the discharged battery end of the circuit can never see a voltage large enough to force any more than the circuit is capable of supplying.
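The self-limiting effect can be put in numbers with plain Ohm's law. A rough sketch, where the wire, connector, and internal-resistance figures are illustrative assumptions rather than measured values:

```python
def steady_charge_current(source_v, battery_v, wire_r, internal_r):
    """Current a dumb parallel connection settles at: I = dV / R_total."""
    return (source_v - battery_v) / (wire_r + internal_r)

# 10 AWG trailer feed, ~25 ft round trip at ~1 mOhm/ft, plus connectors:
wire_r = 25 * 0.001 + 0.005          # ~30 mOhm total (assumed)
i = steady_charge_current(14.2, 12.1, wire_r, internal_r=0.02)
print(f"{i:.0f} A")                  # prints "42 A"
```

Tens of amps, not hundreds: the harness resistance throttles the current no matter how discharged the battery is, which is why the thin trailer feed never overloads.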

The reason you put in a DC-DC charger is to (1) guarantee a max current and (2) provide a known voltage. By decoupling the source and load in the circuit, you transfer a fixed amount of power. This lets you control the voltage to best utilize the current you do have, which might be higher or lower than with a simple parallel circuit. With a dumb circuit you force current, but there's no way to efficiently charge a battery: it might recover some charge, or it might not. It all depends. With a Redarc or similar device, the aux battery gets a consistent charge.


The wire and fuses will determine the voltage drop, and that combined with the battery will determine max amp load. Generally speaking a low internal resistance AGM will accept C/5 charge rates at ~14V at 50% SOC. This will start to taper pretty rapidly, and by 85% SOC, it will be C/10 or less in some cases.

So, to wire for max charge rates: take your alternator voltage and subtract 14V. Say 14.2 - 14 = 0.2V. Using Ohm's law, your entire charging circuit needs to be R = 0.2/(C/5). So for a 100AH battery that is 0.2/(100/5) = 0.01 ohms, or 10 milliohms.

For 1AWG it is 0.1239 milliohms per ft, so you have 10/0.1239 = 80.7 ft. Now you need to include fuses, terminals, etc., so really you would max out at around 30ft for the 0.2V drop. With 75C insulation, 1AWG can carry 130A continuous, or easily 200A for a minute. So that would handle all but a very large bank connected to a 250A-plus alternator.

You can look up the resistance of common fuses to use in the math. Add 1-1.5 mOhm for each terminal connection. Don't forget to estimate the alternator and engine wiring (if you are using that in your circuit). Repeat the exercise with your run distance/resistance and alternator voltages. You will likely find that a 2-4AWG will limit a 100AH battery to 20A continuous with any considerable run length.
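The arithmetic above can be sketched end to end. The figures mirror the post; the ~6 mOhm fuse/terminal overhead is an assumed allowance to reproduce the "around 30ft" result:

```python
def max_circuit_resistance(alt_v, charge_v, capacity_ah, c_divisor=5):
    """Ohm's-law budget: R = allowable drop / target charge current (C/5)."""
    return (alt_v - charge_v) / (capacity_ah / c_divisor)

def max_cable_feet(r_budget_ohm, mohm_per_ft, overhead_ohm=0.0):
    """Cable length fitting the budget after fuses/terminals are subtracted."""
    return (r_budget_ohm - overhead_ohm) / (mohm_per_ft / 1000.0)

r = max_circuit_resistance(14.2, 14.0, 100)  # 0.01 ohm = 10 mOhm budget
print(max_cable_feet(r, 0.1239))             # ~80.7 ft of bare 1 AWG
print(max_cable_feet(r, 0.1239, overhead_ohm=0.006))  # ~32 ft with overhead
```

The same two functions can be re-run with your own alternator voltage, run length, and fuse resistances, as the post suggests.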

For example I have a 500AH agm bank with very low internal resistance. With 30ft round trip of half 2/0 and 4/0 cable, 2 fuses etc, I can just barely max out my 200A alternator at 50% SOC. I have a 200A ANL fuse that has never blown in this circuit.

Of course this is all moot with a DC-DC charger. With a DC-DC, make sure to have less than 0.5% voltage drop between the DC-DC and the battery. Size the alternator-to-charger cables for less than 2% drop at peak load, fuse to protect the wire, and call it done.
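Those two drop budgets translate directly into resistance limits. A sketch, where the 12.8 V system voltage and the ~50 A input-side current for a 40 A charger are assumptions (a DC-DC unit draws more on its input than it delivers):

```python
def max_run_resistance(system_v, peak_amps, drop_pct):
    """Largest total circuit resistance that keeps drop under drop_pct."""
    return (system_v * drop_pct / 100.0) / peak_amps

# The two budgets suggested above, for a nominal 40 A charger at 12.8 V:
r_batt_side = max_run_resistance(12.8, 40, 0.5)  # DC-DC to battery: 1.6 mOhm
r_alt_side = max_run_resistance(12.8, 50, 2.0)   # alt to DC-DC: ~5.1 mOhm
print(r_batt_side, r_alt_side)
```

Divide each budget by your cable's milliohms-per-foot (from an AWG table) to get the longest run each gauge allows.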


Expedition Leader
Battery to Battery chargers, aka B2B/DC-DC/etc. are great toys when your vehicle's regulator/alternator system does not match the voltage requirements of your auxiliary/camper/house/winch/whatever battery.

Common applications include:

-- Vehicles which run at 13.9v. This includes many Toyotas and some Mercedes. Lead acid batteries typically want 14.4v @ 20C/70F. (Note: This varies with temperature.)

-- Vehicles with new "smart" alternators that may include regen braking, etc.

-- Secondary batteries with very different profiles, e.g. lithium.

A B2B typically works by loading the starter battery so that the voltage drops to a nominal 13v. This tells the voltage regulator to ramp up the amps. The B2B then takes those amps and boosts (or, on occasion, drops) the voltage to whatever is required to charge the secondary battery. It can do this because it draws more amps than it can provide.

A B2B also serves as a form of intelligent isolator: it will not turn on unless certain conditions are met, typically that the starter battery is at a voltage greater than 13v. Some also incorporate an ignition trigger.

So, in your case, the supply wiring has to be able to carry slightly more than the rated output of the B2B - because all B2Bs lose some energy as heat (that is why most have fans and heat sinks) and most trade amps to get volts. Read the owner's manual and check the tools mentioned in this thread.
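The "draws more than it provides" point is just conservation of power. A sketch, assuming a typical ~90% conversion efficiency (an illustrative figure, not a REDARC spec):

```python
def b2b_input_amps(out_amps, out_volts, in_volts, efficiency=0.90):
    """Input current a DC-DC charger draws: P_out / (efficiency * V_in)."""
    return (out_amps * out_volts) / (efficiency * in_volts)

# A 40 A charger boosting a sagging 12.8 V feed up to 14.4 V:
i_in = b2b_input_amps(40, 14.4, 12.8)
print(f"{i_in:.0f} A")  # prints "50 A"
```

So the supply-side wiring for a "40 A" charger should be sized for roughly 50 A under these assumptions, which is why the manuals spec larger wire than the output rating alone would suggest.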

In the case of a Chevrolet and a lead acid battery, there is simply no reason to use a B2B as the Chevrolet alternator/regulator is already a very sophisticated, temperature sensing, system. Install an intelligent relay, like the Blue Sea ACR, properly sized wires, and be happy. If your secondary battery is connected to solar or shore power, then you will also gain the benefits of a maintenance charge for your starter battery(s).

Hope this is useful and, given its length, free of typos and id skids. ;-)


Expedition Leader
One use case may still remain: overcoming voltage drop from very long wiring runs - when large-gauge copper gets past $40 a foot installed, for example.
Eliminating the effect of voltage drop on the supply side is one of the greatest benefits of using a DC-DC power supply.


Expedition Leader
Eliminating the effect of voltage drop on the supply side is one of the greatest benefits of using a DC-DC power supply.
But only to a point. All of the B2B makers spec BIG wires and recommend going larger if there is any question.

Remember a B2B typically has to draw more amps than its rated output in order to have a reserve to boost the voltage. But, as most B2Bs are only 25-50A, you are often dealing with smaller wires.

Also worth noting that voltage drop is related to amperage: as the amps drop, so does the voltage drop. Prove it to yourself - play with the voltage drop calculator linked in this thread.


Expedition Leader
But only to a point. All of the B2B makers spec BIG wires and recommend going larger if there is any question.
It's never going to hurt to oversize or reduce voltage drop. Efficiency and stability might vary, so you should certainly follow the designer's recommendations. But in general, the size of the cables you use is unlikely to critically affect the operation of a DC-DC supply.

So say a supply is rated 40A @ 12.5V (about 500 watts) and you use 10 AWG: it'll probably work fine even with close to a 1 V drop over a 20 foot run. The question is what's safe, though.

This depends on how long the current is applied and what type of insulation you select. Cheap 75ºC PVC under the hood is probably going to warm up enough to melt through after a few minutes of supplying that 500 watts through 10 AWG. My hasty calculation is that this conductor will see 33ºC of temperature rise. But if you use 125ºC EPDM on that 10 AWG, you may be safe indefinitely.
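The 10 AWG numbers in that example are easy to check, assuming the 20 ft figure is the total round-trip conductor length (the ~1 mOhm/ft resistance is the standard 10 AWG value):

```python
def wire_drop_and_heat(amps, round_trip_ft, mohm_per_ft):
    """Voltage drop across the run and total heat dissipated in the copper."""
    r = mohm_per_ft / 1000.0 * round_trip_ft
    drop_v = amps * r
    return drop_v, amps * drop_v  # volts, watts of I^2*R heating

drop, watts = wire_drop_and_heat(40, 20, 0.9989)  # 10 AWG, 20 ft total
print(f"{drop:.2f} V drop, {watts:.0f} W of heat in the wire")
# prints "0.80 V drop, 32 W of heat in the wire"
```

That ~32 W dissipated along the run is what produces the temperature rise, and whether the insulation survives it is exactly the 75ºC-PVC-versus-125ºC-EPDM question above.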

So from a safety standpoint a 500 W supply is better spec'd for something larger, perhaps at least 6 or 8 AWG, because a conductor that size won't warm up as much at ~40 A. Then the lowest common denominator - an overlander using an old extension cord dug out of the garage, with unknown insulation - won't be writing negative reviews on Amazon about his battery charger causing his brand-new Tacoma to burn to the ground when a wire melted through its insulation and into the fuel line he zip-tied it to.

This is the basis for all the rule-of-thumb wire-sizing charts you see, be it from the NEC, USCG, SAE or otherwise. They assume a wire size and current that result in a safe temperature rise for the recommended insulation, then add margin based on the expected environment. The NEC has charts for conductors in free space and for ones bundled into conduit; the marine charts have them for raceways, behind bulkheads, etc.