Some of you may find this article interesting.
http://www.audioholics.com/techtips/audioprinciples/loudspeakers/SpeakerBreakIn.php
d.b.
When I ask a speaker manufacturer to do a burn-in for me, I usually specify 100 hours at varying load. Is this so the speaker will sound better?
Heck no. It's so anything bad that's going to happen to the speaker happens on his floor, not mine. 100 hours of continuous play will shake out driver problems and marginal crossover components, and give the enclosure a good shakedown. By the time it's over, anything that wants to rip, shift, cook, or crack will have.
By the same token, crossovers don't break in. Coils reach their steady-state field in well under a second once a signal is applied; caps in a crossover settle even faster. Resistors shouldn't get warm in a competent crossover design, so thermal effects shouldn't apply at all. "Break-in" happens in under a second, and it happens every time you start playing a signal through the crossover. The only changes a crossover undergoes over time are *bad* ones: caps sometimes degrade as they age.
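To put rough numbers on that, here's a back-of-envelope sketch in Python. The component values are illustrative assumptions (a 1 mH coil and a 10 uF cap into an 8-ohm load), not figures from any particular crossover:

    # Electrical settling times for assumed, typical passive crossover parts.
    L = 1.0e-3   # 1.0 mH series inductor (assumed value)
    R = 8.0      # 8-ohm nominal driver load (assumed value)
    C = 10.0e-6  # 10 uF series capacitor (assumed value)

    tau_rl = L / R   # coil field reaches ~63% of steady state in one tau
    tau_rc = R * C   # cap charge reaches ~63% of final value in one tau

    print(f"RL time constant: {tau_rl * 1e6:.0f} us")  # ~125 us
    print(f"RC time constant: {tau_rc * 1e6:.0f} us")  # ~80 us
    # Even five time constants (essentially fully settled) is well under
    # a millisecond, never mind a second.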
And speaker wire and interconnects do not break in, period. If they did, you could run a blind A/B test on a new cable against a year-old one from the same manufacturer and tell them apart. And you can't.
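For the curious, here's what scoring such a test might look like. This is a hypothetical sketch: the trial count and listener score are made-up numbers, and it assumes an ABX-style protocol where a listener who can't hear a difference is just flipping a fair coin:

    # Hypothetical ABX scoring: over n trials the listener says whether X
    # sounds like cable A or cable B. If the cables are indistinguishable,
    # correct answers follow a fair coin flip.
    from scipy.stats import binomtest

    n_trials = 20    # assumed session length
    n_correct = 12   # assumed listener score

    result = binomtest(n_correct, n_trials, p=0.5, alternative="greater")
    print(f"{n_correct}/{n_trials} correct, p = {result.pvalue:.3f}")
    # 12/20 gives p ~ 0.25: no evidence the listener can tell them apart.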
Anyway, there's a very simple way to dispel the "consumer break-in" myth. Assume, for the sake of argument, that just running a speaker, cable, or amp at some load for some length of time would make it better.
Has anyone noticed that manufacturers of audiophile gear are a little... competitive? These folk are doing constant (and sometimes expensive) R&D to get microscopic improvements in accuracy. (Well, the honest ones are; the rest are in constant R&D to make irrelevant changes that the Marketing folk can write pages of pseudoscience babble about. Though surely I wouldn't be referring to using batteries to charge the /dielectric/ of a speaker cable or anything.)
So you're going to tell me that these folk, desperate for any competitive edge, spending wads of cash in R&D to gain just a handful of audiophile sales, are not going to spend ten cents of electricity on a trivial break-in procedure that makes the product sound better? Get real!
The reality is, the honest ones ship their product having done anything and everything they know how to do to make it as perfect as it can be before you get it. If burn-in helps, they've already done it for you. If turning it upside down and waving a dead chicken over it helped, they'd do that too. Twice, to make sure, just in case their competitors only did it once.
So where's the break-in myth coming from? Two places, I believe:
Consumers. Human ears adapt to what they hear. Anyone who wears glasses knows how this works with the eyes: you get a new pair, it's different from your last pair, and your brain starts screaming about the differences. But a week later your vision is clear and undistorted and "broken in". The glasses didn't change, but some brain wiring sure did.
Ears do the same thing. Break in happens, but it happens between your ears.
Manufacturers. If they can get you to hang on to the speakers during "break-in", your brain wiring will adjust and you'll come to like the speakers: that is, your brain will stop screaming about how they sound "wrong" compared to your last set. And since it's *really* hard to sell the idea that "you should just keep listening - they will grow on you, honestly they will"... the break-in story gets the nod instead.
One caveat: I do believe that huge subwoofers with really big, thick surrounds take longer than 10 seconds to reach final break-in. I'd bet it's more like minutes.