In some power generation systems what you describe is true. It is NOT true for an automotive alternator. Instead of generating a constant amount of power and "throwing away" the excess, the excess is never generated in the first place. The field winding (on the rotor, fed through the voltage regulator) is an electromagnet, and the current through that winding controls the strength of the magnet. So, if you reduce the field current, you weaken the magnetic field, which in turn reduces the electromotive force induced in the stator windings, and thus the torque required at the alternator pulley.
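To make that chain concrete (field current → EMF → electrical power → shaft torque), here's a toy model. Every constant in it is invented for illustration; a real alternator has a nonlinear magnetic circuit and a three-phase rectified output, so treat this only as a sketch of the proportionalities:

```python
import math

# Toy alternator model (all constants invented): EMF scales with field
# current and shaft speed; shaft torque follows the electrical power out.

def alternator(field_current_a, rpm, load_ohms, k_emf=0.0012, efficiency=0.6):
    """Return (emf_volts, load_current_a, shaft_torque_nm) for a resistive load."""
    omega = rpm * 2 * math.pi / 60.0       # shaft speed, rad/s
    emf = k_emf * field_current_a * rpm    # EMF ~ field strength * speed
    i_load = emf / load_ohms               # current drawn by the load
    p_elec = emf * i_load                  # electrical power generated
    torque = p_elec / (efficiency * omega) # mechanical power = torque * omega
    return emf, i_load, torque

full = alternator(field_current_a=4.0, rpm=3000, load_ohms=0.14)
half = alternator(field_current_a=2.0, rpm=3000, load_ohms=0.14)
print(f"full field: {full[2]:.2f} N*m at the pulley")
print(f"half field: {half[2]:.2f} N*m at the pulley")
```

Note that into a fixed resistive load, halving the field current quarters the pulley torque, since both the EMF and the resulting current drop together. The power you don't need never has to be supplied by the belt.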
Now you've got me confused again. For a "simple" generator (say, a bundle of wires wrapped around a stick, turned inside a fixed magnetic field), does the current drawn from the output affect how much torque it takes to turn the rotor? IOW, is the stick harder to spin if the generated electricity is being used across some load?
I realize that the alternator in a car does not use a permanent magnet but an electromagnet, the strength of which depends on how much power is needed to charge the battery and handle other electrical housekeeping. In that setting, it makes sense that turning up the current in the electromagnet will make the rotor harder to turn and will generate more power.
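For the fixed-magnet question above, energy conservation gives a quick way to see it: at a given speed the EMF is fixed, so whatever power the load draws must come in through the shaft, and the required torque tracks the load current. A back-of-the-envelope sketch (numbers invented):

```python
import math

# Toy permanent-magnet generator: EMF is set by the (fixed) field and the
# speed, so shaft torque tracks whatever current the load happens to draw.

def shaft_torque(emf_v, load_ohms, rpm, efficiency=0.7):
    """Torque needed to keep the rotor spinning into a resistive load."""
    omega = rpm * 2 * math.pi / 60.0      # shaft speed, rad/s
    i = emf_v / load_ohms                 # load current
    p_elec = emf_v * i                    # power delivered to the load
    return p_elec / (efficiency * omega)  # mech power = torque * omega

rpm = 1200
print(shaft_torque(12.0, 10.0, rpm))   # light load: little torque
print(shaft_torque(12.0, 1.0, rpm))    # heavy load: 10x the torque
print(shaft_torque(12.0, 1e9, rpm))    # open circuit: nearly free-spinning
```

So yes, the stick is harder to spin under load: a short circuit demands the most torque, and an open circuit (no current) costs almost nothing beyond friction.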