10K Tape Input vs 50K Tape Input


rhart

  • Jr. Member
  • Posts: 214
10K Tape Input vs 50K Tape Input
« on: 4 Jan 2009, 02:27 am »
Hi All. Is there any sonic advantage to using the 10K Tape Input vs the 50K Tape Input for connecting my Rega Planet 2000 to my B60 SST? The Rega manual states that an input with a minimum of 10K is recommended... I know I could just listen for myself (which I'll do), but what is the difference between a 10K input and a 50K input? (P.S. I use low-impedance VDH interconnects.) Thanks.

brucek

  • Full Member
  • Posts: 474
Re: 10K Tape Input vs 50K Tape Input
« Reply #1 on: 4 Jan 2009, 02:47 pm »
Quote
but what is the difference between a 10K input and a 50K input?

The short answer to your question is that it depends on the source impedance. In your case, you would not notice any difference.

The long answer requires a simple take on input and output impedance.

The typical interface between a preamp (the source) and power amp (the load), or in your case between a CD player source and a preamp, is a line-level, high-impedance connection, where the output impedance of the source is usually on the order of 50 ohms to 300 ohms and the input impedance of the load is on the order of 10K ohms to 50K ohms.
 
This is known as a voltage bridge connection where the source is acting as a voltage source and almost no current is drawn. In fact, a perfect voltage source would have an output impedance of zero ohms at all frequencies. This would result in all the output voltage from the source being dropped across its load with no voltage being lost to the output impedance of the source.

Output impedance is basically the internal resistance of an amplifier seen at its output. The value can be slightly frequency dependent because of reactance caused by inductance and capacitance, but either way, a low output impedance is desirable.

In the simplest terms, you can view the internal resistance of a source and its associated load as a voltage divider. The higher the output impedance of the source, the more voltage drops across "itself" instead of its load, which in effect means less voltage is received at the load.
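To put some rough numbers on that divider (a sketch only; the 200 ohm source figure is a plausible ballpark for a CD player's output impedance, not a published Rega spec):

Code: [Select]
import math

def load_fraction(z_out, z_in):
    # Resistive voltage divider: the fraction of the source voltage
    # that actually appears across the load's input.
    return z_in / (z_out + z_in)

z_out = 200.0  # hypothetical source output impedance, ohms
for z_in in (10_000.0, 50_000.0):
    frac = load_fraction(z_out, z_in)
    print(f"Zin = {z_in / 1000:.0f}K: {frac:.4f} of the signal ({20 * math.log10(frac):+.2f} dB)")

Both inputs lose well under a quarter of a dB (about -0.17 dB into 10K, -0.03 dB into 50K), which is why you won't hear a difference between the two tape inputs.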

If the input impedance of the load device is not significantly higher than the source's output impedance, the signal will be reduced or "loaded down", and its signal-to-noise ratio and frequency response will suffer. At the extreme, the load simply becomes too much for the source to drive adequately.

Generally, a high output impedance requires close attention to cable lengths. The concern is that the larger shunt capacitance of a longer cable, combined with the high output impedance of the source, creates a low-pass filter that limits bandwidth. This attenuation of the higher frequencies gets worse as output impedance rises. To put it simply, your highs will suffer.
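As a rough sketch of that low-pass effect (the 100 pF/m cable capacitance is just a typical ballpark figure, not a VDH spec):

Code: [Select]
import math

def corner_hz(z_out, c_total):
    # -3 dB corner of the low-pass formed by the source output
    # impedance and the total cable capacitance: f = 1 / (2*pi*R*C)
    return 1.0 / (2.0 * math.pi * z_out * c_total)

cable_c = 100e-12 * 2.0  # hypothetical 100 pF/m interconnect, 2 m run
for z_out in (200.0, 1_000.0, 10_000.0):
    print(f"Zout = {z_out:6.0f} ohms: corner ~ {corner_hz(z_out, cable_c) / 1000:,.0f} kHz")

With a low output impedance the corner sits up in the megahertz, far above the audio band; raise the output impedance and it marches down toward your highs.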

So, unlike speaker cables (where we don't give a darn about capacitance), with an interconnect capacitance becomes important no matter what the output impedance, although the situation is exacerbated by a higher output impedance. The interconnect's capacitance presents a parallel (shunt) reactance that rolls off the higher frequencies, and it becomes more and more of a factor as the output impedance of the source rises relative to the load.

Anyway, there's an old rule of thumb that says the input-to-output impedance ratio of a voltage bridge interface should be a minimum of 10:1. That's a minimum to ensure the high frequencies are preserved. So if I fed a load that has a 10K ohm input impedance, I would be just OK with a 1000 ohm output impedance source, but I certainly wouldn't want to run long interconnect cables.
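Checking that borderline 1000 ohm into 10K case with the same formulas (again, the 300 pF/m figure is just an assumed high-capacitance cable, not any particular product):

Code: [Select]
import math

z_out, z_in = 1_000.0, 10_000.0  # the 10:1 minimum case
loss_db = 20 * math.log10(z_in / (z_out + z_in))
print(f"divider loss: {loss_db:+.2f} dB")

for meters in (1, 5, 15):
    c_total = 300e-12 * meters  # hypothetical 300 pF/m interconnect
    # Approximating the driving resistance as Zout, since Zin >> Zout.
    f_c = 1.0 / (2.0 * math.pi * z_out * c_total)
    print(f"{meters:2d} m cable: corner ~ {f_c / 1000:,.0f} kHz")

The level loss is still under a dB (about -0.83 dB), but by 15 m the corner is down around 35 kHz, close enough to the audio band to start shaving the top octave. That's why long runs are a bad idea at the 10:1 minimum.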

brucek

rhart

  • Jr. Member
  • Posts: 214
Re: 10K Tape Input vs 50K Tape Input
« Reply #2 on: 4 Jan 2009, 07:06 pm »
Thank you, brucek. Very comprehensive and interesting answer.