I'll be honest - what troubles me most about this thread is that we have manufacturers responding here - guys that make their living at this - who do not believe that break-in is a real component of audio design. Hey guys, put down your EE textbooks and pick up your materials science and metallurgy textbooks and study a bit about what happens to a piece of copper, steel, silver, etc. during the manufacturing process. I think you will find that in addition to the typical electrical properties there are also properties of grain structure within these metals which become disoriented during the manufacturing process. It is a proven fact that these grains re-align themselves over time and provide an "easier" path for electron flow.
OK, as far as the "your ears adjust" argument, I have this to say: I have modified hundreds of pieces of equipment over the years using what many of you would consider "boutique" parts, and I have gotten to the point where, during break-in, I first listen to the component immediately after I complete the modification, then place the component in the burn-in room with a CD on repeat for 200+ hours. Amazingly, when I re-install the component in the system after burn-in, it sounds smoother and more articulate, and almost always has better bass depth and definition, amongst other things. The point is, I am obviously not adjusting to the sound over time, because I am not listening to the component during this period.
Like I said, it troubles me that manufacturers will not allow themselves to think "outside the box" on this issue. Rest assured I would never purchase a product manufactured by any of these "shallow thinkers".
Fire away ..
I will be quite honest. It troubles me that people would make such unsubstantiated materials-science statements and then assert that others are shallow thinkers because they do not believe such "hogwash". While they may indeed be "shallow thinkers", you have not posted anything upon which to base that assessment.
I do indeed work with the EE books. I also work with the material ones. As well as the physics ones. And the cryogenic ones.
Conductors do not measurably alter their resistivity through cryogenic processing (at least not down to 1.8 kelvin, the lowest temperature I've worked at). Alteration of the grain structure does not change room-temperature resistivity enough to measure; the effect is roughly eight orders of magnitude too small. It is possible to measure the impact of grain boundaries as well as lattice defects, but that requires taking the conductor down to liquid-helium temperatures to extend the mean free path, where phonon scattering all but goes away, leaving only the defects.
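The reasoning above can be restated with Matthiessen's rule: total resistivity is the sum of a temperature-dependent phonon term and a temperature-independent defect term. A minimal sketch of that arithmetic follows; the numbers are rough, order-of-magnitude values chosen for illustration only, not measurements from anyone in this thread.

```python
# Illustrative sketch of Matthiessen's rule: independent scattering
# mechanisms contribute additive resistivity terms.
# Values are hypothetical, order-of-magnitude placeholders for copper.

RHO_PHONON_300K = 1.7e-8   # ohm*m, phonon-dominated resistivity near 300 K
RHO_DEFECT      = 1.7e-16  # ohm*m, hypothetical defect term ~8 orders smaller

def total_resistivity(rho_phonon: float, rho_defect: float) -> float:
    """Matthiessen's rule: phonon and defect contributions simply add."""
    return rho_phonon + rho_defect

# At room temperature the defect term shifts the total by ~1 part in 1e8,
# far below what ordinary instruments can resolve.
room = total_resistivity(RHO_PHONON_300K, RHO_DEFECT)
print(f"defect share at 300 K: {RHO_DEFECT / room:.1e}")

# Near liquid-helium temperature the phonon term collapses (roughly ~T^5
# in the low-temperature Bloch-Grueneisen regime), so the residual defect
# term dominates and grain-boundary differences finally become measurable.
RHO_PHONON_4K = RHO_PHONON_300K * (4 / 300) ** 5
cold = total_resistivity(RHO_PHONON_4K, RHO_DEFECT)
print(f"defect share at 4 K: {RHO_DEFECT / cold:.2f}")
```

This is why a grain-structure change that is electrically invisible at room temperature can still be detected in a residual-resistivity measurement at liquid-helium temperature.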
But it is not possible to measure a cryo-forced difference at room temperature electrically, nor is it possible to find a difference in the conductors as a result of burn-in. This is from experience. (Note: I mention cryo because it has a large impact on many materials via stress relief of the lattice distortion caused by manufacture. Annealing also affects the lattice, but not many mention annealing of wires to achieve some sonic result. Use of room-temperature current to achieve some difference has historically borne no fruit, at least in the experience of those I work with, nor in my own, from 30 kiloamps down to attoamps, and from kilovolts down to nanovolts.)
The only mechanism that alters a conductor is electromigration. This occurs in aluminum carrying current densities that are not sustainable outside of thin films bonded to silicon (such current density would vaporize normal metals). Superconductors do not suffer this, as there is no impact energy; however, if the end solder joints are not made with sufficient area, the tin/silver solder can be affected (experience). Even at 4.5 kelvin, this level of current density will vaporize normal conductors in about 2 to 4 seconds (again, experience).
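Electromigration lifetime in aluminum interconnects is conventionally modeled with Black's equation. The sketch below shows its strong current-density dependence; the fitting constants (prefactor, exponent, activation energy) are illustrative placeholders, not values from this thread or any specific process.

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def black_mttf(j_a_per_cm2: float, temp_k: float,
               a_const: float = 1e3, n: float = 2.0,
               ea_ev: float = 0.7) -> float:
    """Black's equation for electromigration median time to failure:
    MTTF = A * J^(-n) * exp(Ea / (k*T)).
    a_const, n, and ea_ev are process-dependent fit parameters; the
    defaults here are hypothetical, for illustration only."""
    return a_const * j_a_per_cm2 ** (-n) * math.exp(
        ea_ev / (K_BOLTZMANN_EV * temp_k))

# With n = 2, doubling the current density at fixed temperature cuts the
# lifetime by a factor of ~4, which is why electromigration matters at the
# extreme current densities of on-chip films, not in ordinary audio wiring.
base = black_mttf(1e5, 350.0)
doubled = black_mttf(2e5, 350.0)
print(f"lifetime ratio from doubling J: {base / doubled:.1f}")
```

The same formula also shows why elevated-temperature stress testing accelerates failures: raising T shrinks the exponential term and the predicted lifetime along with it.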
As for your statement of experience with component burn-in: it is indeed possible that what you state is true. I do note, however, that you have made no mention of measuring the components before and after the burn-in.
Was this burn in simply exercising the equipment under normal operating conditions? Or, was it elevated temperature burn in designed to accelerate initial failures due to component reliability? (experience)
I stated early on that the ears do indeed adjust to the localization parameters that are defined by the system. I have also indicated in the past that all recorded material made to date provides localization cues which do not occur in nature, forcing us to adjust to this erroneous stimulus.
Care must be taken to distinguish perceptual drifts, which are a human response to cue distortions, from perceptual changes caused by real shifts in the equipment's behaviour.
To all the posters within this thread:
Calling others "shallow", or idiots, or naysayers, or whatever, has no place in an intellectual discussion, and I assert (perhaps incorrectly) that an intellectual discussion is what most here wish for. Bolstering such accusations with pseudoscience is also objectionable behaviour, and that is what I have taken to task.
Oh, one more thing: no matter how big the "box" you think within, you will find that others have a bigger box.
Cheers, John