Given the same decibel level, say 95 dB,
which is more harmful for an extended period of time:
- 20 Hz
- 200 Hz
- 5,000 Hz
- 15,000 Hz
Just thinking out loud that it's not only the level; the frequency content must have something to do with it as well.
Bob
Great question, I wonder this myself. I don't have a handy and certain answer, but my feeling is that the sensitivity of the ear sort of tracks with its sensitivity to damage from noise. Most ears are most sensitive just above 3 kHz, and when permanent hearing loss appears due to overexposure, it is almost universally evident as a notch between 3 kHz and 6 kHz. The greater the damage, the deeper the notch; and the "noise notch" appears in this frequency range whether you are a musician or a machinist -- the source doesn't matter.
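To put a rough number on that frequency-dependent sensitivity, here's a small Python sketch (my own illustration, not taken from any reference here) that evaluates the standard IEC 61672 A-weighting curve at the four frequencies in the question. Keep in mind A-weighting is a loudness-sensitivity correction, not a damage-risk model, so treat this only as a proxy for "how much of that 95 dB the ear actually registers":

```python
import math

def a_weighting_db(f_hz: float) -> float:
    """IEC 61672 A-weighting gain in dB (approximately 0 dB at 1 kHz)."""
    f2 = f_hz ** 2
    ra = (12194.0 ** 2 * f2 ** 2) / (
        (f2 + 20.6 ** 2)
        * math.sqrt((f2 + 107.7 ** 2) * (f2 + 737.9 ** 2))
        * (f2 + 12194.0 ** 2)
    )
    # The +2.0 dB offset normalizes the curve to ~0 dB at 1 kHz.
    return 20.0 * math.log10(ra) + 2.0

if __name__ == "__main__":
    for f in (20, 200, 5_000, 15_000):
        # How strongly the ear "hears" a 95 dB SPL tone at each frequency,
        # expressed as an A-weighted level.
        print(f"{f:>6} Hz: A-weighting {a_weighting_db(f):6.1f} dB "
              f"-> {95 + a_weighting_db(f):5.1f} dBA")
```

At a nominal 95 dB SPL, the 20 Hz tone comes out around 45 dBA and the 200 Hz tone around 84 dBA, while the 5 kHz tone reads slightly above 95 dBA and the 15 kHz tone around 89 dBA -- which lines up with the intuition that the mid-to-upper kHz region is where the ear is most exposed.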
I found something about exposure to low-frequency sounds (you have to skip to section 7). Notice that in some studies the effect is most evident at 2 kHz, far from the spectrum of the noise itself:
http://www.defra.gov.uk/environment/noise/research/lowfrequency/pdf/lowfreqnoise.pdf

It seems that very few events are likely to contain enough very low (and, for that matter, probably very high) frequency content in isolation to cause hearing damage. So, in your example, I'm going to hang my hat on 5,000 Hz, but I wouldn't be shocked to hear that somebody smarter than me says 15 kHz is the real answer.