At a given volume level both amps should be outputting the same watts. A higher wattage amp should handle transients, peaks, and bass better, especially if the volume is high enough that the lower wattage amp is at its rated output.
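As a rough illustration of the headroom point (using the 150 W / 500 W figures that come up later in the thread, purely as an example), here's a quick Python sketch of the extra peak capability in dB:

# Rough sketch, not a claim about any specific amp: extra headroom of a bigger amp in dB.
import math

def headroom_db(big_amp_watts: float, small_amp_watts: float) -> float:
    """Extra peak capability of the larger amp, expressed in decibels."""
    return 10 * math.log10(big_amp_watts / small_amp_watts)

print(f"{headroom_db(500, 150):.1f} dB of extra headroom")   # ~5.2 dB

About 5 dB of headroom isn't huge, but it's the margin that keeps short peaks out of clipping.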
This is starting to look like another measurements vs ears debate.
Well, according to Nagy you are dead wrong, because that's pretty much what I said. Everything you described ends up driving the devices into overheating, which basically causes breakdown.
Again, according to Nagy you are dead wrong, because this is basically what I said. If your amp can handle bass better, then the entire soundstage will benefit.
OP's question is ridiculous to begin with. It's on the same level as when people claim that certain parts in a schematic are "IN" the signal path and some are "NOT" in the signal path.

But to answer his question completely, it goes like this: the speaker's power is rated at a nominal impedance, in this case probably 8 ohms, and probably measured at 1000Hz. These particular speakers are rated at 150 watts. The amp, however, can put out 500 watts at 8 ohms. So... connect the speakers to the amp and run a 1000Hz tone. Raise the volume until the power reaches 150 watts; this is the maximum the speakers can handle. The amp can output more power (500 watts at 8 ohms), so keep turning the volume up and the speakers will eventually fail.
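To put some numbers on that, here's a minimal sketch of the arithmetic, assuming a purely resistive 8 ohm load and a steady 1000Hz sine (both simplifying assumptions, not anything measured in this thread):

# P = V^2 / R  ->  V = sqrt(P * R),  I = V / R  (resistive-load approximation)
import math

def drive_levels(power_watts: float, impedance_ohms: float):
    """RMS voltage and current needed to deliver a given power into a resistive load."""
    voltage = math.sqrt(power_watts * impedance_ohms)
    current = voltage / impedance_ohms
    return voltage, current

for p in (150, 500):
    v, i = drive_levels(p, 8.0)
    print(f"{p:>3} W into 8 ohms: {v:5.1f} V rms, {i:4.2f} A rms")
# 150 W into 8 ohms:  34.6 V rms, 4.33 A rms
# 500 W into 8 ohms:  63.2 V rms, 7.91 A rms

The amp only reaches its 500 watt rating if the volume is pushed well past the point where the speaker's 150 watt limit has already been exceeded.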
pjchappy - Your question is impossible to answer. There are no true 8 ohm speakers; the impedance rating is nominal. So if you're talking about the entire frequency range, the impedance will vary drastically. This question has no useful answer, trust me.
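To show why the nominal rating matters, here's a small sketch assuming the amp behaves like a voltage source and using made-up impedance values (not measurements of any real speaker): as the impedance swings with frequency, the delivered power swings too.

# P = V^2 / Z (resistive approximation); impedance values below are illustrative only.
def power_at(voltage_rms: float, impedance_ohms: float) -> float:
    return voltage_rms ** 2 / impedance_ohms

voltage = 34.6  # roughly the voltage for 150 W into a true 8 ohm load
for freq_hz, z in [(40, 5.2), (200, 12.0), (1000, 8.0), (10000, 6.5)]:
    print(f"{freq_hz:>5} Hz, {z:4.1f} ohms: {power_at(voltage, z):6.1f} W")

Same output voltage, anywhere from roughly 100 W to over 200 W depending on where the impedance curve sits at that frequency.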
My question was not ridiculous... it just wasn't asked very well. Please see my last post above to see what I was getting at.

Paul
Say you have an amp with a power rating of 500 Watts into 8 Ohms / 250 Watts into 4 Ohms. You then use that amp to power 87dB efficient monitors with a max rating of 150 Watts.

Now, would the speakers be the determining factor in how much power the amp is actually putting out? In other words, would the amp be able to drive the speakers to the "loudest" they could handle and still not actually be putting out 500 Watts of power? (Not considering transients, or whatever, here and there.)

Thanks!

Paul
Thanks, folks. My question has been answered. I'm not trying to get more volume, etc., I just wanted to confirm it was "safe" to use a super high-powered amp on low-power "rated" speakers. I know there are some technical aspects to it, but my generic question/concern has been answered.

I understand the issues of impedance changes, etc., too, and am not concerned with that. To clear it up for others, I was simply wanting to confirm this simplified premise:

An 8 Ohm speaker (say, 8 Ohms throughout its entire frequency response) driven by a 1,000 Watt amp using brand X pre-amp will receive the same power at 90dB volume as the same speaker driven by a 100 Watt amp (same "brand" specs/topology as the 1,000 Watt amp) using the same brand X pre-amp at 90dB volume.

Paul
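A back-of-the-envelope check of that premise, assuming 87 dB/W/m sensitivity, one speaker, a 1 m listening distance, and no room gain (all simplifying assumptions):

# Power needed to hit a target SPL, ignoring distance and room effects.
def watts_for_spl(target_spl_db: float, sensitivity_db_1w_1m: float) -> float:
    return 10 ** ((target_spl_db - sensitivity_db_1w_1m) / 10)

print(f"Power needed for 90 dB: {watts_for_spl(90, 87):.1f} W")   # ~2 W

Under those assumptions, both the 100 Watt amp and the 1,000 Watt amp are delivering the same couple of watts at 90dB; the bigger amp just has far more headroom left before clipping.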
Paul, not sure why this wasn't caught earlier... but the hypothetical power amp you speak of would be considered a 'real turd', for the simple fact that it's quite unstable, whereas a stable, current-capable amplifier has the ability to 'double down' (500 watts into 8 ohms & 1000 watts into 4 ohms). Basically speaking, your example shows the power 'cut in half' into the lower impedance. Now, the speakers you describe are not terribly inefficient, but still...
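A quick sketch of what 'doubling down' means in the ideal case (real amps fall short of this, so treat the numbers as an upper bound, not a spec):

# An ideal voltage-source amp holds its output voltage as the load halves, so power doubles.
# The hypothetical amp in the thread (500 W / 8 ohm, 250 W / 4 ohm) does the opposite.
import math

def ideal_double_down(power_8_ohm: float) -> float:
    """Power into 4 ohms for an ideal voltage source rated at power_8_ohm into 8 ohms."""
    voltage = math.sqrt(power_8_ohm * 8.0)   # rail-limited output voltage
    return voltage ** 2 / 4.0                # same voltage into half the impedance

print(ideal_double_down(500))  # 1000.0 W into 4 ohms for an ideal amp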