Component variation, drift over the lifetime of a part, and environmental conditions are all part of engineering. None of that accounts for "break-in," because the values, and how they change, fall within a known statistical spread. They also change in predictable, describable ways that can be measured.
Break-in, as audiophiles explain it, is not easily tied to a measurement. Sure, you can measure changes in crystalline structure, look at a scanning micrograph and see changes at the atomic level, but none of those things necessarily relate to audibility. They are all in the grass, most of them so far down in the grass you would need a shovel to find them. They don't explain an audible difference as understood by people who do research on the audibility of such things.

When you design a transducer and a loudspeaker, there is natural variation in the process. There is MUCH more unit-to-unit variation in the transducer than you will ever measure from one of these "in the grass" physical changes. When you put that loudspeaker in a room, there are yet more variations, some on the order of 15-20 dB, which are certainly audible. At the same time, for perspective, I've seen people sit and listen to speakers for hours without noticing the tweeter was hooked up out of phase and there was a 20-25 dB hole in the midrange. I've done it myself, although I'll typically notice something is askew fairly quickly.
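For a sense of scale, here's a minimal sketch (Python with numpy/scipy, assuming a textbook 4th-order Linkwitz-Riley crossover at a hypothetical 2 kHz, not any particular speaker) of why a reversed tweeter leaves a deep hole around the crossover region: the two drivers that should sum flat instead cancel.

```python
import numpy as np
from scipy import signal

fc = 2000.0                                   # hypothetical crossover, Hz
w = 2 * np.pi * np.logspace(np.log10(200), np.log10(20000), 500)
f = w / (2 * np.pi)

# LR4 low/high sections = two cascaded 2nd-order Butterworth filters each
b_lp, a_lp = signal.butter(2, 2 * np.pi * fc, btype='low', analog=True)
b_hp, a_hp = signal.butter(2, 2 * np.pi * fc, btype='high', analog=True)
_, lp = signal.freqs(np.polymul(b_lp, b_lp), np.polymul(a_lp, a_lp), worN=w)
_, hp = signal.freqs(np.polymul(b_hp, b_hp), np.polymul(a_hp, a_hp), worN=w)

in_phase = 20 * np.log10(np.abs(lp + hp))     # correct polarity: sums flat
reversed_ = 20 * np.log10(np.abs(lp - hp))    # tweeter wired backwards

i = np.argmin(reversed_)
print(f"reversed-polarity response has a {abs(reversed_[i]):.0f} dB notch "
      f"near {f[i]:.0f} Hz; in-phase response varies by only "
      f"{np.ptp(in_phase):.2f} dB")
```

An idealized cancellation like this is deeper than what real drivers in a real room produce, but the point stands: the hole is large, broad, and sits right in the band where the ear is most sensitive, yet people routinely miss it.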
Scales, microphones, and pretty much any other type of measurement device have to be calibrated against a standard. Why? Because their entire value is in accuracy. Any number of environmental or accidental things can happen to a mic, scale, or other measurement device, and any of them destroys its primary function. That is why they are calibrated, not because some unexplainable break-in is occurring.
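To make the calibration point concrete, here's a minimal sketch (hypothetical file name and layout) of what a measurement-mic calibration amounts to: a table of frequency/deviation pairs generated against a reference standard, which you subtract from every raw measurement.

```python
import numpy as np

def apply_mic_calibration(freqs_hz, measured_db, cal_file="mic_cal.txt"):
    """Correct a raw SPL measurement using a two-column calibration file:
    frequency in Hz, mic deviation in dB relative to the reference standard."""
    cal = np.loadtxt(cal_file)                          # freq_hz, deviation_db
    deviation = np.interp(freqs_hz, cal[:, 0], cal[:, 1])
    return measured_db - deviation
```

Skip that step and whatever the mic has drifted to gets baked into every result, which is exactly why calibration exists.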