Tomy2Tone, you need to find out from Klaus what the input sensitivity of the amplifier is for full RMS power out. For example, my amplifier has a gain of 26 dB and an input sensitivity of 1 volt in for 110 watts RMS out into 8 ohms; it will clip if it sees more than 1 volt on its input. With digital sources having outputs of 2 volts or more, I am always attenuating the incoming signal to less than 1 volt. Because my loudspeakers have a sensitivity of 95 dB, I use a zero-gain active buffer to do this and I never lack adequate SPLs.

Scotty
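If it helps, here is a minimal sketch of that attenuation arithmetic, using the 1 V sensitivity and 2 V source figures from the post above (the function name is just illustrative):

```python
import math

def required_attenuation_db(source_v_rms: float, amp_sensitivity_v_rms: float) -> float:
    """dB of attenuation needed so the source's full output just reaches
    the amp's input-sensitivity voltage (i.e. its clipping threshold)."""
    return 20 * math.log10(source_v_rms / amp_sensitivity_v_rms)

# A 2 V RMS digital source into an amp that clips above 1 V at its input:
print(required_attenuation_db(2.0, 1.0))   # ~6 dB of attenuation needed
```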
As long as the amp's input impedance is low (ie lower than the output impedance of the vast majority of sources), that's about all you need to be really concerned with in that regard. Scotty posted the ratio, but I'm not certain this is an absolute. Low input impedance is basic knowledge to any good amp designer........but, every now and then problems can happen : Sometimes a source might have a low output impedance. Sometimes an amp might have a high input impedance. The result of connecting the two is typically some kind of distortion. I think the most common is the sound being dead and lacking dynamics and drive. Active, buffered, and transformer / autoformer based preamps are designed to alleviate these types of possible mismatches.
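For what it's worth, here is a sketch of the level lost in the voltage divider formed by a source's output impedance and an amp's input impedance, using a purely resistive approximation (my simplification, not from the post); the example figures are ones that come up later in the thread:

```python
import math

def divider_loss_db(source_out_ohms: float, amp_in_ohms: float) -> float:
    """Signal level lost across the (resistive) divider formed by the
    source's output impedance and the amp's input impedance."""
    return -20 * math.log10(amp_in_ohms / (amp_in_ohms + source_out_ohms))

print(round(divider_loss_db(450, 47_000), 3))   # ~0.083 dB -- negligible
print(round(divider_loss_db(1_600, 10_000), 2)) # ~1.29 dB -- a noticeable level drop
```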
A very nice education and enjoyable read guys! Thanks. Might make a good permanent post, no?

Best,
Blu99Zoomer
A while back I had a chat with a forum member about amps with high watt output versus amps with fewer watts but a different design that still deliver excellent bass and slam. Neither one of us understood why an amp that supposedly puts out far fewer watts could perform just as well, if not better, than an amp with twice the wattage.

I guess I've always been under the impression that the more watts the better when it comes to an amp, and would often hear the old "there's no replacement for displacement" when I asked why. I've had some Class D amps over the years, often with at least 500 watts per channel, and recently have had some Crown XLS 1500 amps bridged, putting out 1500 watts per channel. Yet when I inserted a Job 225 stereo amp putting out about 180 watts per channel into 4 ohms, I got dynamics just as good and very comparable, if not better, bass and slam.

Is it just the difference between Class D and Class A/B? Does the overall design philosophy of an amp trump high wattage capability? I think this is what I'm trying to ask... Any comments or thoughts are appreciated, and any informational links are a plus!

Thanks!
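As a side note on the raw numbers (my own back-of-envelope, not from the posts): headroom scales with the logarithm of power, so even a large wattage gap translates into a modest dB difference. A sketch:

```python
import math

def headroom_gain_db(p_high_watts: float, p_low_watts: float) -> float:
    """Maximum SPL headroom gained by moving from the lower-powered amp
    to the higher-powered one, into the same load and speaker."""
    return 10 * math.log10(p_high_watts / p_low_watts)

# Bridged Crown XLS 1500 (~1500 W) vs. Job 225 (~180 W into 4 ohms):
print(round(headroom_gain_db(1500, 180), 1))  # ~9.2 dB more headroom on paper
```

Each doubling of power only adds about 3 dB of headroom, which may be part of why design quality can matter more than the number on the spec sheet.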
My current setup as of right now is a Rogue Perseus preamp that has an output impedance of 450 ohms going into an Aaron No.3 stereo amp that has an input impedance of 47k ohms, and it sounds great with the volume at 12 o'clock.

Looking at Herron's site, the output of the VTSP 3a is 100 ohms, and the input of an Odyssey Stratos mono (no word yet on the Kismet) is 22k ohms.
"As long as the amp's input impedance is low (ie lower than the output impedance of the vast majority of sources), that's about all you need to be really concerned with in that regard. Scotty posted the ratio, but I'm not certain this is an absolute. Low input impedance is basic knowledge to any good amp designer........"This is a completely backward statement.As long as the input impedance of the AMP is a general rule about 10 times HIGHER than the output of your preamp you should be fine. Some say as much as 20 times... In either case with your 450 ohm output on the preamp and 22,000 ohm on the AMP input you are just fine. 22,000 / 450 = 48.8 times greaterYour other combination with a 100 ohm preamp and new kismet at 22k is just fine as well. 22000 / 100 = 220 times greater Any good amplifier designer knows the HIGHER the input impedance the more universal that amp can be, specifically for lower frequencies.... Now really easy to drive solid state OR tube amps have 100,000 ohm input. Most are between 10,000 and 50,000 though. You can run into trouble with older Oddysey design amps because I had one the Stratos plus. Its input was 10,000 ohm. No good with the 1,600 ohm preamp I was running because it sounded "Tinny" with very little low end extension, no fat midbass... With a 10k ohm input like that your better using some Conrad Johnson or Audio research preamps that run from 50 ohm to around 200 ohm output impedance.By the way depending on the preamp and IF your running SUBWOOFER plate amps this does come into play sometimes. Most preamps will PARALLEL the impedance. So if you have a 22,000 ohm amp hooked up to that channel AND you have a 10,000 ohm Plate amp also connected this drops the impedance your preamp handles now down to = 6,875 ohms....This is a far more difficult load so you still will be okay with a 450 ohm preamp = 15 times or 100 ohm preamp = 69 times greater
This section deals with impedance "mismatch", distortion, and frequency response changes: what is the effect of the amplifier's input impedance versus the active preamplifier/source's output impedance? To be more precise, the input impedance (Z) of the amplifier versus the output Z of the preamplifier or source. (The specifications for both impedances can be found in the owner's manuals.) Most recommend a 10:1 ratio. I also recommend a 10:1 ratio to be safe. (The RCA Radiotron Designer's Handbook recommends a 5:1 ratio.) Using a 10:1 ratio, the amplifier input impedance should be 20,000 ohms (20k ohms) with a preamplifier output impedance of 2,000 ohms (2k ohms). However, some claim/market a 100:1 ratio to "reduce distortion". For an amplifier input impedance of 20k ohms, one would then need the preamplifier/source to have an output impedance of only 200 ohms. So does adding a low output impedance buffer stage lower distortion? Understand that teaching the 100:1 ratio attempts to legitimize the use of a buffer stage while implying that those who use a 10:1 ratio are inferior. Not only does adding a buffer stage not significantly reduce distortion, it also deteriorates the musical quality, increases the complexity, increases "crosstalk" problems between channels, and adds to the cost (which increases the profit margin).

Let's check out an example. We have an amplifier with a 20k ohm input impedance (Z). Let's compare a preamplifier with a 100 ohm output Z to one with a 2,000 ohm (2k) output Z. As such, we are decreasing the ratio from 200:1 to 10:1. The total harmonic distortion of a JJ E88CC tube at 2 V RMS output measures approximately 0.01% (-80 dB) using the 200:1 ratio. Changing the ratio to 10:1 raises the distortion by approximately 0.0012%, to -79 dB. So the distortion rises from -80 dB to -79 dB. The extra buffer stage itself would add more distortion than the savings. Other types of stages may give different results, but then other problems are introduced.

How about frequency response changes? This section deals with the high-frequency response of our active preamplifier with and without a buffer stage. We will compare a 50 pF interconnect (IC) with a 250 pF IC. The output impedance with the buffer stage is 100 ohms; without it, 2,000 ohms (2k). First, the high-capacitance 250 pF interconnect with the buffer stage's 100 ohm output: the high-frequency response drops approximately 100 µdB at 100 kHz and approximately 6 µdB at 20 kHz. With an output Z of 2k ohms, the drop is 0.045 dB at 100 kHz and 0.002 dB at 20 kHz. Not much, is it? Now we use the 50 pF interconnect cable. The result is less than a 150 µdB drop at 100 kHz, and 0.05 dB at 20 kHz. With an output Z of 2k ohms, the drop is 0.002 dB at 100 kHz and 75 µdB at 20 kHz. Again, not much different. (Rarely, a longer IC with higher capacitance is necessary because there is no choice.)

As one can see, the added buffer stage not only does not lower the distortion, it also does not appreciably extend the high-frequency response. Yet the additional stage adds cost while degrading the music. If it sounds better adding the buffer stage, then either the IC capacitance is very large, or the previous stage(s) have problems. So the question is: why not just design a single stage with low output impedance, wide bandwidth, and low distortion to begin with, and forget the additional buffer stage with its associated problems and cost to you?
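To make the frequency-response part concrete, here is a minimal sketch of the same kind of calculation, modeling the output impedance driving the interconnect capacitance as a simple first-order low-pass filter. This simplification is my own assumption; it ignores the amplifier's input impedance and any cable inductance, so it illustrates the scale of the effect rather than reproducing the exact figures quoted above.

```python
import math

def rolloff_db(output_z_ohms: float, cable_pf: float, freq_hz: float) -> float:
    """High-frequency loss (in dB) of a first-order low-pass formed by the
    source's output impedance and the interconnect capacitance."""
    c_farads = cable_pf * 1e-12
    f_corner = 1 / (2 * math.pi * output_z_ohms * c_farads)
    return 10 * math.log10(1 + (freq_hz / f_corner) ** 2)

for z in (100, 2_000):        # buffered (100 ohm) vs. unbuffered (2k ohm) output
    for cap in (50, 250):     # low- vs. high-capacitance interconnect, in pF
        loss_20k = rolloff_db(z, cap, 20_000)
        loss_100k = rolloff_db(z, cap, 100_000)
        print(f"{z:>5} ohm out, {cap:>3} pF cable: "
              f"{loss_20k:.5f} dB @ 20 kHz, {loss_100k:.5f} dB @ 100 kHz")
```

Even in the worst case this model allows (a 2k ohm output into a 250 pF cable), the loss at 20 kHz is on the order of hundredths of a dB, which is the same broad conclusion the post reaches: the buffer buys essentially nothing audible at the top of the band.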