Here is the thing about wire gauge: it is about controlling voltage drop. And that's why it really doesn't matter so much on an alternator.
Let's say you have a 100A alternator with a 14.7V setpoint. You fire up the rig after you have been listening to the radio for a few hours and the battery is run down to 12.0V. That alternator will be at full output to bring the battery up. Let's say you have a half volt of voltage drop at that 100A load. What's really happening? The wire is acting like a resistor. Half a volt at 100A is 50 watts of power being lost. It goes to heat, about what an upgraded high-beam headlight puts out. That amount of heat is spread out across the whole wire, radiated off the surface, and conducted through the attachment points.
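The arithmetic above, as a quick sketch (the numbers are the ones from the example, not measurements):

```python
# The wire is acting as a resistor: lost power is voltage drop times current.
current_a = 100.0   # alternator at full output (A)
drop_v = 0.5        # voltage drop across the charge wire (V)

power_lost_w = drop_v * current_a      # P = V * I -> 50 W
resistance_ohm = drop_v / current_a    # R = V / I -> 5 milliohm

print(f"power lost as heat: {power_lost_w:.0f} W")
print(f"effective wire resistance: {resistance_ohm * 1000:.1f} milliohm")
```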
As the battery charges, the amperage output drops off, and so does the voltage drop. Let's say you take 40A to drive; that is about what EFI with wipers, headlights, and old-school A/C will draw. I checked it once. That 40A load will reduce the voltage drop to about 0.2V. Now the lost power, and the heat dissipation, is down to about 8 watts.
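Why the heat falls off so fast: the wire's resistance is fixed, so a sketch of the same wire at both loads shows drop scaling with current and heat scaling with current squared:

```python
# Same wire, so resistance is fixed; drop scales with current and
# heat scales with current squared (P = I^2 * R).
resistance_ohm = 0.5 / 100.0   # 5 milliohm, from the 0.5 V @ 100 A example

for current_a in (100.0, 40.0):
    drop_v = current_a * resistance_ohm
    heat_w = current_a ** 2 * resistance_ohm
    print(f"{current_a:5.1f} A -> {drop_v:.2f} V drop, {heat_w:.0f} W of heat")
```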
That is the math behind why a wire does what it does and why it needs to be sized. If it can't handle the heat dissipation, either the amperage needs to drop or the wire gauge needs to increase. But there isn't one magic it-works-or-it-doesn't size for everything.
Go back to original starter motors that can draw 300-400 amps of current, on a dainty little 6-gauge wire or so. It's easy to get a full volt of drop in that wire. That is 300-400 watts of heat that wire must handle. But it is only handling it for a few seconds. The starter upgrade of going to 4-gauge or even 2-gauge wire was to cut the voltage drop to the starter. Not really the added amperage capacity, but the reduced voltage drop. More voltage to the starter, it spins faster. If you look at many aftermarket starter motors you will see them rated in kW. (FYI, 0.746 kW = 1 HP, so you can also do this with starters that have an HP rating with a little conversion.) If you drop 200 watts of heat dissipation, that is 200 watts more for the starter (0.2 kW).
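The starter cable math, sketched out (350 A is just a point inside the 300-400 A range above, and the 200 W recovered is the example figure, not a measurement):

```python
# Heat during cranking, and what a recovered voltage drop buys you.
cranking_a = 350.0
drop_v = 1.0                          # easy to get on a small stock cable
print(f"cable heat while cranking: {drop_v * cranking_a:.0f} W")  # 350 W, for seconds

recovered_w = 200.0                   # drop eliminated by a fatter cable
print(f"extra power to the starter: {recovered_w / 1000:.1f} kW"
      f" = {recovered_w / 746:.2f} HP")   # using 0.746 kW = 1 HP
```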
Amperage through a circuit doesn't change regardless of the length of the wire. Very large buildings with a central HVAC system will often have thermostats that measure temperature and output the reading as a mA current signal. The length of the wire, less-than-perfect splices in the wire, etc. won't matter. Amperage through a series circuit is the same everywhere, regardless of voltage drop.
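That "same current everywhere" idea is why current-loop signaling works; here is a sketch in the style of a 4-20 mA loop (the resistance values are made up for illustration):

```python
# Current-loop signaling: the transmitter regulates the current, so the
# receiver reads the same signal no matter how much wire resistance is
# in the loop; the supply just has to cover the extra voltage drop.
signal_ma = 12.0    # hypothetical mid-scale temperature reading
for wire_ohm in (1.0, 50.0, 250.0):   # short run, long run, long run + bad splice
    receiver_ma = signal_ma                         # series current: same everywhere
    extra_drop_v = (signal_ma / 1000.0) * wire_ohm  # burden on the loop supply
    print(f"{wire_ohm:5.0f} ohm of wire -> {receiver_ma:.1f} mA at the receiver,"
          f" {extra_drop_v:.2f} V extra drop")
```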
That said, there is still a case where a wire can be too small for a charge circuit: the stock 10-gauge charge wire, inside a harness, with a splice at the ammeter. Try running 130A through that and you will have a lot of heat. Buried inside a bundled harness, the heat can't escape. That is where you melt a wiring harness. Drop 2V and you have 260 watts of heating element with no place to let that heat out.
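Same formula, nastier numbers:

```python
# The melted-harness case: sustained high current through the small
# stock charge wire, with nowhere for the heat to go.
current_a = 130.0
drop_v = 2.0        # through the 10-gauge wire plus the ammeter splice
heat_w = drop_v * current_a
print(f"{heat_w:.0f} W of heating element buried in the harness")  # 260 W
```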
If you are running a charge wire from the alternator straight to the battery, 90A on 8-gauge is probably fine if it is left out in the open to stay cool. Want to bundle that into a harness where heat can't escape? You are going to have to go larger. Heat management: don't let the wire generate as much heat when you have it in a bundle that can't get rid of it. Running a lot of high-amp sustained loads, where the alternator won't be backing off to a lower output level? Go a little larger again. How far to go? I want to say an OEM 200A alternator with the charge wire in a harness, as bad as it gets, is about a 2-gauge wire. There is no reason a 90 or even 130A load will need 2-gauge. That is "I don't understand wiring, go bigger and bigger, and someone else told me it should be bigger" internet wire sizing. No engineering went into it. It comes from a worst-case-possible generic wire size chart.
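To put gauge choices side by side, here is a rough comparison using standard annealed-copper resistance figures (ohms per 1000 ft at 20 C); the 6 ft run length and 90 A load are illustrative assumptions, not a spec for any particular vehicle:

```python
# Rough per-gauge drop/heat comparison for a charge-wire run.
OHM_PER_1000FT = {10: 0.999, 8: 0.628, 6: 0.395, 4: 0.249, 2: 0.156}

current_a = 90.0    # sustained charge current (illustrative)
length_ft = 6.0     # alternator-to-battery run (illustrative)

for awg in sorted(OHM_PER_1000FT, reverse=True):
    r_ohm = OHM_PER_1000FT[awg] * length_ft / 1000.0
    drop_v = current_a * r_ohm
    heat_w = current_a ** 2 * r_ohm
    print(f"AWG {awg:2d}: {drop_v:.2f} V drop, {heat_w:.1f} W of heat")
```

The takeaway matches the paragraph above: going up each gauge size roughly cuts the drop and the heat, and past a point the extra copper is buying you very little.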
OK, enough rant. What is the OEM wire size for that size alternator? Probably don't need 5 sizes larger.