Since the manufacturer of the tube preamp I'm using (600 ohm output impedance, balanced) recommends a power amp with a 20K ohm (or higher) input impedance, should I be worried? If it's indeed lower than 20K ohms, which gain select position should I use to help the preamp drive the amps better?
Not that my tube preamp(s) and 7B's don't sound great together right now, but I'm moving to a larger room and switching to speakers that might be harder to drive. So I'm not sure if that theory on the input-to-output impedance ratio would come into play. (It would be 16.7:1 if the input impedance really is 10K ohms; 25:1 at 15K; and 33.3:1 at 20K.)
There's not much to be worried about, and it really has nothing to do with driving your new speakers. It's a frequency response issue. The typical interface between a preamp (the source) and power amp (the load) is a line-level, high-impedance connection known as a voltage bridge connection, where the preamp acts as a voltage source and almost no current is drawn. In fact, a perfect voltage source would have an output impedance of zero ohms at all frequencies. This would result in all the output voltage from the source being dropped across its load, with no voltage being lost to the output impedance of the source.
Output impedance is basically the internal resistance of an amplifier seen at its output. In its simplest terms, you can view the internal resistance of a source and its associated load as a voltage divider. The higher the output impedance of the source, the more voltage will drop across itself instead of its load. This in effect means less voltage will be received at the load. If the input impedance of the load device is not significantly higher than the source's output impedance, the signal will be reduced at the load end and its signal-to-noise ratio and frequency response will suffer.
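If you want to see the voltage divider effect in numbers, here's a minimal sketch (the 600 ohm and 10K/20K ohm figures are just the values discussed above):

```python
# Simple voltage-divider view of a preamp (source) driving a power amp (load).
# The fraction of the source voltage that actually appears across the load
# is z_in / (z_out + z_in).
def fraction_delivered(z_out, z_in):
    """Return the fraction of source voltage dropped across the load."""
    return z_in / (z_out + z_in)

# 600 ohm source into a 10K ohm load: about 94% of the voltage arrives.
print(round(fraction_delivered(600, 10_000), 4))   # 0.9434

# Same source into a 20K ohm load: about 97% arrives.
print(round(fraction_delivered(600, 20_000), 4))   # 0.9709
```

Either way the loss is well under 1 dB, which is why a 600 ohm source into 10K ohms isn't cause for alarm by itself.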
Generally, a high output impedance requires close attention to cable lengths. The concern is that the high reactance (frequency-dependent resistance caused by capacitance) of a longer cable, combined with the high output impedance of the source, creates a low pass filter which adversely affects bandwidth. This attenuation of the higher frequencies increases with higher output impedances. The interconnect's capacitance presents a parallel reactance that will roll off the higher frequencies. To put it simply, your highs will suffer.
Anyway, there's an old rule of thumb that says the input-to-output impedance ratio of a voltage bridge interface should be a minimum of 10:1. That's the minimum to ensure high frequency preservation. So, if I fed a load with a 10K ohm input impedance, I would be OK with a source output impedance of 1000 ohms or less, but I certainly wouldn't want to run extremely long interconnect cables.
If you're concerned, it's not difficult to make rough calculations of the severity of the low pass filter effect. If you take a typically good interconnect with 25 pF/foot capacitance, then a 3 foot cable presents a parallel reactance of about 106 Kohms at 20Khz (worst case). This is quite high and can be considered insignificant in relation to a 600 ohm output impedance and a load impedance of about 10 Kohms. No high frequency rolloff will result.
But, if I used the same cable and ran it 50 feet, then at 20Khz the cable would introduce a reactance of about 6.3 Kohms. In parallel with the 10 Kohm input impedance, that drops the effective load to under 4 Kohms against a 600 ohm source, and now you've broken the 10:1 rule. You can do the voltage divider math.
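The two reactance figures above are easy to reproduce. A small sketch of the calculation, using the same assumed 25 pF/foot cable:

```python
import math

# Capacitive reactance of the interconnect: Xc = 1 / (2 * pi * f * C).
# 20Khz is the worst case in the audio band (reactance falls as f rises).
def cable_reactance_ohms(cap_pf_per_ft, length_ft, freq_hz=20_000):
    c_farads = cap_pf_per_ft * length_ft * 1e-12
    return 1.0 / (2 * math.pi * freq_hz * c_farads)

print(round(cable_reactance_ohms(25, 3)))    # ~106 Kohms for the 3 foot run
print(round(cable_reactance_ohms(25, 50)))   # ~6.4 Kohms for the 50 foot run

# The voltage divider math for the 50 foot case: the cable reactance in
# parallel with the 10 Kohm input impedance loads the 600 ohm source.
xc = cable_reactance_ohms(25, 50)
z_load = (xc * 10_000) / (xc + 10_000)       # parallel combination
print(round(z_load))                         # effective load, under 4 Kohms
```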
brucek