Mo watts mo better?


Tyson

  • Full Member
  • Posts: 11481
  • Without music, life would be a mistake.
Re: Mo watts mo better?
« Reply #40 on: 12 Jun 2015, 11:57 pm »
I'm not a big fan of passive preamps either.  But some people love them and who am I to say they are wrong?

undertow

  • Full Member
  • Posts: 925
Re: Mo watts mo better?
« Reply #41 on: 13 Jun 2015, 12:05 am »
12 dB is below average for most preamps, so that's more or less low gain. Most typical preamps I have seen are in the 14 to 20 dB gain range, but again this is less important; it's more dependent on the whole gain chain. The Job at 35 dB with a 12 dB gain preamp is fine with 90 dB or lower-sensitivity speakers. Running 100 dB, or maybe even as high as 95 dB, speakers may not work out too well.

When it comes to active preamps, I am now a big believer in having all sources with their own gain controls as well (i.e. phono amp and DAC); it's far more convenient for getting everything leveled out. The counterargument is that some purists scoff at having any more attenuation in the signal, but honestly, in my experience, having this specific flexibility for matching your system far outweighs the possible 2% downside to transparency with such attenuation.
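undertow's point about the whole gain chain can be made concrete with a quick sketch. Assuming a round 100 W / 8 ohm amp rating (a hypothetical figure, not a spec from the thread), this computes how little source voltage it takes to drive the amp to clipping:

```python
import math

def clip_source_voltage(rated_watts, load_ohms, amp_gain_db, pre_gain_db):
    """Source voltage (RMS) that drives the amp to rated power
    with the preamp volume wide open."""
    v_clip = math.sqrt(rated_watts * load_ohms)            # amp output at clipping
    total_gain = 10 ** ((amp_gain_db + pre_gain_db) / 20)  # pre + amp voltage gain
    return v_clip / total_gain

# 12 dB preamp into a 35 dB amp: only ~0.126 V clips a 100 W / 8 ohm amp,
# so a 2 V DAC has roughly 24 dB of excess gain to attenuate.
print(round(clip_source_voltage(100, 8, 35, 12), 3))
```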

*Scotty*

Re: Mo watts mo better?
« Reply #42 on: 13 Jun 2015, 12:18 am »
Tomy2Tone, you need to find out from Klaus what the input sensitivity of the amplifier is for full RMS power out. For example, my amplifier has a gain of 26 dB and an input sensitivity of 1 volt in for 110 watts RMS out into 8 ohms; it will clip if it sees more than 1 volt on its input.
With digital sources having outputs of 2 volts or more, I am always attenuating the incoming signal to less than 1 volt. Because my loudspeakers have a sensitivity of 95 dB, I use a zero-gain active buffer to do this, and I never lack adequate SPLs.
Scotty
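Scotty's relationship between gain, rated power, and input sensitivity can be sketched in a few lines of Python (figures from his example; published specs are often rounded, so the simple math may not land exactly on the quoted numbers):

```python
import math

def input_sensitivity(rated_watts, load_ohms, gain_db):
    """Input voltage (RMS) that drives the amp to rated output,
    assuming the amp behaves linearly up to clipping."""
    v_out = math.sqrt(rated_watts * load_ohms)  # 110 W into 8 ohms -> ~29.7 V
    return v_out / 10 ** (gain_db / 20)         # divide by the voltage gain

# 26 dB gain, 110 W into 8 ohms -> roughly 1.5 V of input sensitivity;
# either way, a 2 V digital source needs attenuation to stay below clipping.
print(round(input_sensitivity(110, 8, 26), 2))
```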

Tomy2Tone

Re: Mo watts mo better?
« Reply #43 on: 13 Jun 2015, 01:42 pm »
Tomy2Tone, you need to find out from Klaus what the input sensitivity of the amplifier is for full RMS power out. For example, my amplifier has a gain of 26 dB and an input sensitivity of 1 volt in for 110 watts RMS out into 8 ohms; it will clip if it sees more than 1 volt on its input.
With digital sources having outputs of 2 volts or more, I am always attenuating the incoming signal to less than 1 volt. Because my loudspeakers have a sensitivity of 95 dB, I use a zero-gain active buffer to do this, and I never lack adequate SPLs.
Scotty

This is where I wish there was a "how to" manual for selecting components. It seems a lot of manufacturers only post input impedance and not input sensitivity. I also have a Lampi L4 DAC on order and have put in a question to Fred about its output voltage. Looking up an older-generation L4, I read it had an output of 3 V.

So to help me understand, input impedance is different than input voltage? Is one more important than the other?

*Scotty*

Re: Mo watts mo better?
« Reply #44 on: 13 Jun 2015, 02:56 pm »
You need to contact Klaus. The amplifier input impedance is good to know in case your preamp has a high output impedance; a 10-to-1 or higher ratio of power amp input impedance to preamp output impedance is desirable to maintain full bandwidth and avoid a premature high-frequency roll-off.
However, knowing the power amp's input impedance does not tell you its input sensitivity. The specifications on the website are incomplete in this regard.
Scotty
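One simple way to see why the 10-to-1 rule of thumb works: the preamp's output impedance and the amp's input impedance form a voltage divider, and the level loss becomes negligible once the ratio is high (the bandwidth concern Scotty raises additionally involves cable capacitance). A quick sketch:

```python
import math

def loading_loss_db(preamp_out_z, amp_in_z):
    """Level lost across the output-impedance / input-impedance divider."""
    return 20 * math.log10(amp_in_z / (amp_in_z + preamp_out_z))

print(round(loading_loss_db(2_000, 20_000), 2))  # 10:1 ratio -> about -0.83 dB
print(round(loading_loss_db(450, 22_000), 2))    # ~49:1 ratio -> about -0.18 dB
```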

RDavidson

  • Restricted
  • Posts: 2872
Re: Mo watts mo better?
« Reply #45 on: 13 Jun 2015, 02:59 pm »
The result of connecting two components of mismatched impedance is typically some kind of distortion. I think the most common is the sound being dead and lacking dynamics and drive. Active, buffered, and transformer / autoformer based preamps are designed to alleviate these types of possible mismatches.
« Last Edit: 13 Jun 2015, 05:58 pm by RDavidson »

Tomy2Tone

Re: Mo watts mo better?
« Reply #46 on: 13 Jun 2015, 03:12 pm »
As long as the amp's input impedance is low (ie lower than the output impedance of the vast majority of sources), that's about all you need to be really concerned with in that regard. Scotty posted the ratio, but I'm not certain this is an absolute. Low input impedance is basic knowledge to any good amp designer........but, every now and then problems can happen : Sometimes a source might have a low output impedance. Sometimes an amp might have a high input impedance. The result of connecting the two is typically some kind of distortion. I think the most common is the sound being dead and lacking dynamics and drive. Active, buffered, and transformer / autoformer based preamps are designed to alleviate these types of possible mismatches.

My current setup is a Rogue Perseus preamp that has an output impedance of 450 ohms going into an Aaron No.3 stereo amp that has an input impedance of 47k ohms, and it sounds great with the volume at 12 o'clock.

Looking at Herron's site, the output of the VTSP 3a is 100 ohms, and the input of an Odyssey Stratos mono (no word yet on the Kismet) is 22k ohms.

undertow

  • Full Member
  • Posts: 925
Re: Mo watts mo better?
« Reply #47 on: 13 Jun 2015, 04:45 pm »
"As long as the amp's input impedance is low (ie lower than the output impedance of the vast majority of sources), that's about all you need to be really concerned with in that regard. Scotty posted the ratio, but I'm not certain this is an absolute. Low input impedance is basic knowledge to any good amp designer........"

This is a completely backward statement.

As a general rule, as long as the input impedance of the AMP is about 10 times HIGHER than the output impedance of your preamp, you should be fine. Some say as much as 20 times...

In either case, with your 450 ohm output on the preamp and 22,000 ohms on the AMP input, you are just fine. 22,000 / 450 = 48.9 times greater.

Your other combination, with the 100 ohm preamp and the new Kismet at 22k, is just fine as well. 22,000 / 100 = 220 times greater.

Any good amplifier designer knows that the HIGHER the input impedance, the more universal the amp can be, specifically for lower frequencies. Some really easy-to-drive solid state OR tube amps have a 100,000 ohm input; most are between 10,000 and 50,000, though.

You can run into trouble with older Odyssey-design amps; I had one, the Stratos Plus, and its input was 10,000 ohms. It was no good with the 1,600 ohm output preamp I was running: it sounded "tinny," with very little low-end extension and no fat midbass... With a 10k ohm input like that, you're better off using something like a Conrad Johnson or Audio Research preamp that runs from 50 ohms to around 200 ohms output impedance.

By the way, depending on the preamp, and IF you're running SUBWOOFER plate amps, this does come into play sometimes. Most preamp outputs will see the loads in PARALLEL.

So if you have a 22,000 ohm amp hooked up to that channel AND you have a 10,000 ohm plate amp also connected, this drops the impedance your preamp sees down to 6,875 ohms....

This is a far more difficult load, but you will still be okay: a 450 ohm preamp gives about 15 times greater, and a 100 ohm preamp about 69 times greater.
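The parallel-load arithmetic above is easy to check in Python, treating the two amp inputs as simple resistances:

```python
def parallel_z(*zs):
    """Combined impedance of inputs wired in parallel (resistive approximation)."""
    return 1 / sum(1 / z for z in zs)

load = parallel_z(22_000, 10_000)
print(round(load))           # 6875 ohms, as computed above
print(round(load / 450, 1))  # ratio with a 450 ohm preamp: ~15.3
print(round(load / 100, 1))  # ratio with a 100 ohm preamp: ~68.8
```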

RDavidson

  • Restricted
  • Posts: 2872
Re: Mo watts mo better?
« Reply #48 on: 13 Jun 2015, 05:21 pm »
Sorry. :oops:
That's correct: an amp should have high input impedance and low output impedance. It's Saturday. :wink:

I will delete the useless info in my earlier statement to alleviate confusion. Scotty and undertow are on their game much more than I am today. :thumb:

Tomy2Tone

Re: Mo watts mo better?
« Reply #49 on: 13 Jun 2015, 05:48 pm »
Thanks guys!

Blu99Zoomer

  • Jr. Member
  • Posts: 208
Re: Mo watts mo better?
« Reply #50 on: 14 Jun 2015, 06:07 pm »
A very nice education and enjoyable read guys!  Thanks.  Might make a good permanent post, no?

Best,

Blu99Zoomer

Tomy2Tone

Re: Mo watts mo better?
« Reply #51 on: 14 Jun 2015, 06:24 pm »
A very nice education and enjoyable read guys!  Thanks.  Might make a good permanent post, no?

Best,

Blu99Zoomer

Very nice indeed!

I just got a response from Keith Herron about the input sensitivity of his VTSP 3a r03 preamp, and he said it can take up to 25 V. He designed it so it can play with anything. I'll call Klaus sometime this week about the Kismet input.

Steve

Re: Mo watts mo better?
« Reply #52 on: 16 Jun 2015, 09:34 pm »
A while back I had a chat with a forum member about amps with high watt output vs amps with fewer watts but a different design that still provide excellent bass and slam. Neither one of us understood why an amp that supposedly puts out far fewer watts could perform just as well as, if not better than, an amp with twice the wattage.

I guess I've always been under the impression that the more watts the better when it comes to an amp, and I would often hear the old "there's no replacement for displacement" when asked why.

I've had some Class D amps over the years, often with at least 500 watts per channel, and recently have had some Crown XLS 1500 amps bridged, putting out 1500 watts per channel. Yet when I inserted a Job 225 stereo amp putting out about 180 watts per channel into 4 ohms, I got dynamics just as good and very comparable if not better bass and slam. Is it just the difference between Class D and Class A/B?

Does the overall design philosophy of an amp trump high wattage capability? I think this is what I'm trying to ask... :scratch:

Any comments or thoughts are appreciated, and any informational links are a plus!

Thanks!

A lot depends upon the system as a whole, so yes and no. Most systems use less wattage than one thinks. If the design is correct for the "first watt," then most of the time the sound, dynamics, and soundstage (in which I include depth and width) will be fine. If a speaker requires a lot of power, or one listens in the stratosphere, then one may notice a difference. Many amplifiers' distortion figures increase as the power output increases. It may be different for different designs.

Most dynamics occur in the midrange, so high current due to the woofer's capacitance (very low frequencies) may not come into play if the woofer plays the midrange.

Another problem, if a higher-power design does sound of lesser quality, is that the stray, interelectrode, and/or Miller capacitance may be much higher, lowering the high-frequency response. Check the frequency response specs. This will occur with multiple tubes in parallel, or with larger-dimensioned tubes. Higher capacitance will alter the slope of the waveform, the rise time or "attack time," to which the ear is very sensitive.

I decided to just add this: if higher voltages are used, the power supply may have filter capacitors in series, and other differences that will affect the sound.
The output transformer will need more insulation, and thus has more leakage inductance and poorer high-frequency response.

I think you can understand better why I said yes and no; a lot depends upon the tubes and design used.

I did not read any previous posts (busy), so my apologies if one or more points have already been made.

Cheers
Steve
« Last Edit: 25 Jun 2015, 04:51 pm by Steve »

Steve

Re: Mo watts mo better?
« Reply #53 on: 16 Jun 2015, 09:38 pm »
My current setup is a Rogue Perseus preamp that has an output impedance of 450 ohms going into an Aaron No.3 stereo amp that has an input impedance of 47k ohms, and it sounds great with the volume at 12 o'clock.

Looking at Herron's site, the output of the VTSP 3a is 100 ohms, and the input of an Odyssey Stratos mono (no word yet on the Kismet) is 22k ohms.

The only problem you will have, Tomy, is the size of the output capacitor, if one is used. (Assuming the capacitor is listening-tested for true accuracy.)

Cheers
Steve

Steve

Re: Mo watts mo better?
« Reply #54 on: 16 Jun 2015, 09:45 pm »
"As long as the amp's input impedance is low (ie lower than the output impedance of the vast majority of sources), that's about all you need to be really concerned with in that regard. Scotty posted the ratio, but I'm not certain this is an absolute. Low input impedance is basic knowledge to any good amp designer........"

This is a completely backward statement.

As a general rule, as long as the input impedance of the AMP is about 10 times HIGHER than the output impedance of your preamp, you should be fine. Some say as much as 20 times...

In either case, with your 450 ohm output on the preamp and 22,000 ohms on the AMP input, you are just fine. 22,000 / 450 = 48.9 times greater.

Your other combination, with the 100 ohm preamp and the new Kismet at 22k, is just fine as well. 22,000 / 100 = 220 times greater.

Any good amplifier designer knows that the HIGHER the input impedance, the more universal the amp can be, specifically for lower frequencies. Some really easy-to-drive solid state OR tube amps have a 100,000 ohm input; most are between 10,000 and 50,000, though.

You can run into trouble with older Odyssey-design amps; I had one, the Stratos Plus, and its input was 10,000 ohms. It was no good with the 1,600 ohm output preamp I was running: it sounded "tinny," with very little low-end extension and no fat midbass... With a 10k ohm input like that, you're better off using something like a Conrad Johnson or Audio Research preamp that runs from 50 ohms to around 200 ohms output impedance.

By the way, depending on the preamp, and IF you're running SUBWOOFER plate amps, this does come into play sometimes. Most preamp outputs will see the loads in PARALLEL.

So if you have a 22,000 ohm amp hooked up to that channel AND you have a 10,000 ohm plate amp also connected, this drops the impedance your preamp sees down to 6,875 ohms....

This is a far more difficult load, but you will still be okay: a 450 ohm preamp gives about 15 times greater, and a 100 ohm preamp about 69 times greater.

Nice post undertow. I only have two points to address.

1) The RCA Radiotron Designer's Handbook recommends 5 times or greater. However, I also recommend the same as you: 10 times.

2) The point assumes, say, a 20k ohm total input impedance (Z) for the paralleled amps, or higher, and thus enough output capacitance, etc. (the capacitance limits the impedance the stage can drive, i.e., the work performed). Here is a quote from a white paper I wrote years ago concerning the output impedance (Z) of the preamplifier. (It assumes the amp's input Z is 20k ohms and a decent output circuit.)

Quote
This section deals with impedance "mismatch", distortion and frequency response changes.

What is the effect of amplifier input impedance vs active preamplifier/source output impedance? To be more precise, the input impedance (Z) of the amplifier versus the output Z of the preamplifier or source. (The specifications for both impedances can be found in the owner's manuals.) Most recommend a 10:1 ratio.

I also recommend a 10:1 ratio to be safe. (The RCA Radiotron Designer's Handbook recommends a 5:1 ratio.) Using a 10:1 ratio, the amplifier input impedance should be 20,000 ohms (20k ohms) for a preamplifier output impedance of 2,000 ohms (2k ohms).

However, some claim/market a 100:1 ratio to "reduce distortion". For an amplifier input impedance of 20k ohms, one would need the preamplifier/source to have an output impedance of only 200 ohms. So does adding a low output impedance buffer stage lower distortion?

Understand that teaching the 100:1 ratio attempts to legitimize the use of a buffer stage while implying that those who use a 10:1 ratio are inferior. Not only does adding a buffer stage not significantly reduce distortion, it also deteriorates the musical quality, increases complexity, increases "crosstalk" problems between channels, and adds to the cost (which increases the profit margin). Let's check out an example.

We have an amplifier with a 20k ohm input impedance (Z). Let's compare a preamplifier with a 100 ohm output Z to one with a 2,000 ohm (2k) output Z. As such, we are decreasing the ratio from 200:1 to 10:1.

The total harmonic distortion of a JJ E88CC tube at 2 V RMS output measures approximately 0.01% (-80 dB) using the 200:1 ratio. Changing the ratio to 10:1 raises the distortion by approximately 0.0012%, to -79 dB. So the distortion rises from -80 dB to -79 dB. The extra buffer stage itself would add more distortion than the savings. Other types of stages may give different results, but then other problems are introduced.
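The percent-to-dB conversion behind those distortion figures is straightforward to verify:

```python
import math

def thd_percent_to_db(pct):
    """Convert THD in percent to dB relative to the fundamental."""
    return 20 * math.log10(pct / 100)

print(round(thd_percent_to_db(0.01), 1))    # -80.0 dB
print(round(thd_percent_to_db(0.0112), 1))  # -79.0 dB (0.01% + 0.0012%)
```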

How about frequency response changes?

This section deals with the high-frequency response of our active preamplifier with and without a buffer stage. We will use a 50 pF interconnect (IC) vs a 250 pF IC. The output impedance with the buffer stage is 100 ohms; without it, 2,000 ohms (2k).

First, the high-capacitance 250 pF interconnect cable and the 100 ohm buffer stage. The high-frequency response drops approximately 100 µdB at 100 kHz, and approximately 6 µdB at 20 kHz. With an output Z of 2k ohms, the drop is 0.045 dB at 100 kHz and 0.002 dB at 20 kHz. Not much, is it?

Now we use the 50 pF interconnect cable. The result is less than a 150 µdB drop at 100 kHz, and 0.05 dB at 20 kHz. With an output Z of 2k ohms, the drop is 0.002 dB at 100 kHz and 75 µdB at 20 kHz. Again, not much different. (Rarely, a longer IC with higher capacitance is necessary, as there is no choice.)

As one can see, the added buffer stage not only fails to lower the distortion, but also does not appreciably extend the high-frequency response. Yet the additional stage adds cost while degrading the music. If it sounds better adding the buffer stage, then either the IC capacitance is very large, or the previous stage(s) have problems.

So the question is: why not just design a single stage with low output impedance, wide bandwidth, and low distortion to begin with, and forget the additional buffer stage with its associated problems and cost to you?

Unless one is using very high-capacitance interconnect cable from pre to amp, due to long runs or inferior cable, a very low output Z is not needed. In fact, the sonics should be better rid of a sonics-robbing buffer stage: one less stage, and thus simpler.

I forgot to mention: with a passive volume control, the preamplifier gain stage is in the amplifier, making it an integrated amplifier. So one is not really getting rid of a gain stage, as one may think. The audio system requires a certain amount of gain, by definition.

My amp is two stages and the separate preamplifier is one stage, so there are still a total of three stages in my system. However, I do not have the problem of the interconnect capacitance between the wiper arm of the passive control and the integrated amplifier's input stage. This capacitance varies from interconnect to interconnect and, unless very low, will cause an increased loss of high frequencies as the volume is turned up. One is tossing the dice as far as sound quality.
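For reference, the roll-off figures in the quoted white paper come from a first-order RC low-pass formed by the stage's output impedance and the cable capacitance. Here is a sketch of that calculation (the exact numbers depend on the real circuit, so they will not match the quoted values precisely):

```python
import math

def rolloff_db(out_z_ohms, cable_pf, freq_hz):
    """Attenuation of a first-order RC low-pass at a given frequency."""
    fc = 1 / (2 * math.pi * out_z_ohms * cable_pf * 1e-12)  # -3 dB corner
    return -10 * math.log10(1 + (freq_hz / fc) ** 2)

# 2k ohm output into a 250 pF cable: the corner sits near 318 kHz,
# so the loss inside the audio band is tiny.
print(round(rolloff_db(2_000, 250, 20_000), 3))   # ~-0.017 dB at 20 kHz
print(round(rolloff_db(2_000, 250, 100_000), 2))  # ~-0.41 dB at 100 kHz
```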

Cheers
Steve
« Last Edit: 25 Jun 2015, 04:53 pm by Steve »