Hrmmm...I jumped the gun with last night's quickie reply just before I hit the sack. Should have slept on it first. Woke up this morning and realized that stock automotive "ammeters" generally aren't exactly either of the types I described last night (at least not the modern ones).
Duh.
Got a little more time and a fresh cup of coffee now, so any goofs can't be explained by being tired or in a hurry - they'll have to be put down to just plain stupidity.
So lemme see...a stock automotive "ammeter" measures the difference between the alternator and the battery. Looking at the one in my truck (76 E-250) I see it's not really an "ammeter"...what it is, is a "charging indicator". It's only marked D and C and doesn't indicate amps.
Digging out the Chilton's for that truck, I see that the gauge is tapped off the hot wire from the alternator at two points.
So okay, technically it's "shunted" by creating an extra path for current to flow through. The meter doesn't see the full current, because the "meter circuit" (the wires leading from the taps to the meter, plus the meter itself) has a higher resistance than the section of main charging wire it's tapped across.
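To make the "extra path" idea concrete, here's a minimal current-divider sketch: two parallel paths split the total current inversely with their resistances, so the low-resistance charging wire carries nearly everything and the meter circuit sees only a trickle. The resistance values below are made-up illustration numbers, not measured from any actual Ford harness.

```python
def split_current(total_a, r_wire_ohm, r_meter_ohm):
    """Return (wire_amps, meter_amps) for two resistances in parallel."""
    # Current divider: each branch carries a share inversely
    # proportional to its own resistance.
    i_meter = total_a * r_wire_ohm / (r_wire_ohm + r_meter_ohm)
    i_wire = total_a - i_meter
    return i_wire, i_meter

# Say 40 amps total, the tapped wire section at 0.002 ohm,
# and the meter circuit at 0.5 ohm (both values invented):
wire_a, meter_a = split_current(40.0, 0.002, 0.5)
print(f"wire: {wire_a:.3f} A, meter: {meter_a:.3f} A")
```

The point is just that the meter branch carries a tiny, fixed fraction of the total, which is what lets a delicate movement indicate a big charging current.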
But the gauge, on my truck at least, is not a calibrated meter - it just indicates that there is a difference, not precisely what that difference is. My truck has a 100a alternator, an engine start battery and a deep cycle aux battery connected to the engine battery with a split-charge solenoid.
Even when I run my aux battery down very low (which I do frequently), the needle on that charging indicator never gets anywhere near full deflection. Not even close; about 1/4 of the way to full deflection is the most I've ever seen it do.
If the one in your truck is like the one in my truck, then the whole lash-up is so imprecise that it doesn't matter much what you do. You should be able to just tap the charging wire from the alternator at two points (probably at both ends) and get a relative indication of alternator vs. battery on the gauge.
But what if the gauge is calibrated in amps instead of just a dumb "D...|...C" charging indicator like the one in my truck?
Well, for the meter to be accurate, the resistances have to be right. The resistance of the section of alternator charging wire between the two taps has to be a known value, and the resistance of the meter circuit also has to be a known value, and the relationship between the two resistances has to be correct.
The Chilton's I'm looking at, which covers Ford vans '61-'88, doesn't say anything about how that's done. Probably all the wires involved are just of a known length and gauge (known resistance). Change the length or size of the wires and you alter the meter's calibration.
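Here's a rough sketch of why the wire size matters so much: the gauge really responds to the voltage drop across the tapped section of charging wire (V = I * R), so cutting that section's resistance cuts the reading for the same current. The ohms-per-foot figures are approximate standard values for copper wire at room temperature.

```python
# Approximate resistance of copper wire, ohms per foot (standard AWG values):
OHMS_PER_FT = {"#10": 0.000999, "#4": 0.000249}

def sensed_drop_mv(amps, gauge, tapped_length_ft):
    """Millivolts the meter circuit sees across the tapped wire section."""
    return amps * OHMS_PER_FT[gauge] * tapped_length_ft * 1000.0

# Same 40 amps through a 3 ft tapped section (length is an example value):
print(sensed_drop_mv(40, "#10", 3))  # roughly 120 mV
print(sensed_drop_mv(40, "#4", 3))   # roughly 30 mV -- about 1/4 the reading
```

Swap the #10 for a #4 and the meter sees only about a quarter of the signal at the same amperage, which is exactly the kind of calibration shift described above.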
OR, the full amperage has to flow through the ammeter (internal shunt as in my previous post). Basically, the charge wire runs from the alternator to the meter, through the meter, and then to the battery. There is no external shunting (no tap off the charging wire). That's how a lot of older vehicles' ammeters were set up (and they were often disconnected and bypassed as a fire hazard).
(Leaving aside for the moment that the meter is going to be a bit off due to the altered resistances...)
So let's say you've got a meter that reads in amps, and it's done with the same "tap the charging wire at two points" as shown in this Chilton's.
Then you change the charging wire out from a #10 to something fat like a #4 and install a bigger alternator. Let's say the meter is calibrated to a +/-40a range and now, with the bigger alternator and fatter wire, you could exceed that range (needle hits full deflection).
But you want to keep that gauge instead of changing it out for one with a wider range...
Well, then what you'd have to do is alter the resistance of the ammeter circuit to re-calibrate the meter to a different range (BUT, the range wouldn't match the amp scale printed on the face of the meter anymore). You could alter it so that the meter only reads half of the actual value, so if there's 30a showing on the meter, then you know 60a is actually flowing.
That would only get your 40a meter up into the 80a range though, so you'd probably have to set it up for a different ratio. Say you set up the meter to show 25% of the actual value. Then if the meter hits full deflection (shows 40a), you'd know 160a was actually flowing.
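The arithmetic for a deliberately re-ranged meter like that is just division by the fraction it shows. The fractions below match the examples above (1/2 and 1/4); what resistor values actually get you those fractions is a separate question.

```python
def actual_amps(meter_reading, fraction_shown):
    """Convert a re-ranged meter's reading back to real amps.

    fraction_shown is the deliberate calibration ratio, e.g. 0.5 if
    the meter shows half the true current.
    """
    return meter_reading / fraction_shown

print(actual_amps(30, 0.5))   # meter shows 30a at half scale  -> 60a actual
print(actual_amps(40, 0.25))  # full deflection at quarter scale -> 160a actual
```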
That's a bit of a kludge, but could be done.
There's a decent article here about how to add resistors to change an ammeter's operating range:
http://www.allaboutcircuits.com/vol_1/chpt_8/4.html
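For what it's worth, the basic idea behind re-ranging a meter with resistors can be sketched in a few lines: put a shunt resistor in parallel with the meter movement so only a known fraction of the current actually goes through the movement. The movement specs below (a 1 mA, 100 ohm movement) are invented for illustration, not taken from that article or from any automotive gauge.

```python
def shunt_for_range(full_scale_a, movement_fs_a, movement_ohm):
    """Shunt resistance so the movement hits full scale at full_scale_a total.

    At full scale, the voltage across movement and shunt is the same,
    so R_shunt = V_full_scale / (total current - movement current).
    """
    v_fs = movement_fs_a * movement_ohm      # volts across movement at full scale
    i_shunt = full_scale_a - movement_fs_a   # current the shunt must carry
    return v_fs / i_shunt

# Re-range a hypothetical 1 mA / 100 ohm movement to read 0-40 A:
print(shunt_for_range(40.0, 0.001, 100.0))  # roughly 0.0025 ohm
```

Note how tiny the shunt resistance comes out; in a stock setup, a short section of heavy charging wire between the two taps effectively plays that role.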