Very interesting thread. I just finished building a pair of speakers from scratch, a project that went on way too long, but there are some conclusions here that deserve a closer look.
Since cross-over networks theoretically should be based on the driver's measured impedance at the desired frequency (say the woofer measures 12.6 ohms at 3,000 Hz), and after a period of time "the woofer breaks in" and the spider and surround really soften up, shouldn't our measured impedance at 3,000 Hz change? The spider becomes more compliant, the surround becomes more compliant, and the measured impedance of the woofer at 3,000 Hz is now less than 12.6 ohms. Which would mean the cross-over point is now screwed up, because the impedance at the desired cross-over frequency has changed.
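Just to put rough numbers on that, here's a quick sketch using the standard textbook 2nd-order Butterworth formulas. The 12.6 ohm and 3 kHz figures are the ones from above; the 11 ohm "after break-in" value is purely made up for illustration.

```python
import math

def butterworth2_lowpass(z_ohms, f_hz):
    """Series L and shunt C for a 2nd-order Butterworth low-pass
    section into a load of z_ohms, corner at f_hz (textbook formulas)."""
    w0 = 2 * math.pi * f_hz
    L = math.sqrt(2) * z_ohms / w0        # henries
    C = 1 / (math.sqrt(2) * w0 * z_ohms)  # farads
    return L, C

def effective_q(L, C, z_ohms):
    """With L and C fixed, the corner stays at 1/(2*pi*sqrt(L*C)),
    but the Q of the knee depends on the load the network actually sees."""
    w0 = 1 / math.sqrt(L * C)
    return z_ohms / (w0 * L)

# Design around the measured 12.6 ohms at 3 kHz
L, C = butterworth2_lowpass(12.6, 3000)
print(f"L = {L * 1e3:.2f} mH, C = {C * 1e6:.2f} uF")

# Pretend break-in drops the 3 kHz impedance to 11 ohms (made-up number)
print(f"Q at 12.6 ohms: {effective_q(L, C, 12.6):.2f}")   # ~0.71
print(f"Q at 11.0 ohms: {effective_q(L, C, 11.0):.2f}")   # ~0.62
```

Even a swing of 1.6 ohms (which would be a lot) only moves the Q of the knee from about 0.71 to about 0.62; the corner frequency itself stays put, because it's set by the L and C you already soldered in.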
To take this scenario further, wouldn't the tuning of the cabinet and port have changed as well, since the resonance frequency is directly related to the woofer's compliance?
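For reference, the free-air resonance is Fs = 1 / (2*pi*sqrt(Cms*Mms)), so it only moves with the square root of any compliance change. A quick sketch with made-up (but ballpark) small-woofer numbers:

```python
import math

def free_air_resonance(cms_m_per_n, mms_kg):
    """Fs = 1 / (2*pi*sqrt(Cms*Mms)) -- driver free-air resonance."""
    return 1 / (2 * math.pi * math.sqrt(cms_m_per_n * mms_kg))

# Purely illustrative numbers, not measured values
cms = 0.6e-3   # suspension compliance, m/N
mms = 0.020    # moving mass, kg

fs_fresh   = free_air_resonance(cms, mms)
fs_brokein = free_air_resonance(cms * 1.20, mms)  # assume 20% softer suspension

print(f"Fs fresh:     {fs_fresh:.1f} Hz")    # ~46 Hz
print(f"Fs broken-in: {fs_brokein:.1f} Hz")  # ~42 Hz, i.e. Fs scales as 1/sqrt(Cms)
```

So even a fairly generous 20% compliance change only drags Fs down by roughly 9%, which is in the same ballpark as normal unit-to-unit tolerances anyway.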
Then the conclusion could only be that every speaker ever made is, by break-in time, so far off the mark in terms of cross-over frequency, cabinet size and F3 that they should all sound like complete crap.
I can only draw one conclusion: if there is any form of "breaking in", it's marginal at best and must fall within the natural tolerances of the drivers, cross-over components, enclosure size and porting.
My 2 cents worth.
Wayner