Friends,

Seeing that I am now the high-end audio technical support person for NuForce, I suppose I should try to help answer your concerns regarding the amplifier input impedance issue. Please bear with me, as the issue can be a bit complicated.
In a "theoretically" perfect world, the output and input impedances of audio components would appear as purely "resistive" terms. That means there would be no variation of the impedance with respect to frequency. If this were the case, then impedance matching (or lack thereof) would be a relatively very minor issue. In a situation where the output impedance of a preamp is high (above 1K-Ohm) and the amplifier input impedance is low (below 10K-Ohm), the only consequence would be a very slight (approx. –1dB) reduction of voltage being delivered from the preamp into the amplifier. Even in about the worst case where the impedances were the same (as in 600-Ohm professional equipment), the voltage being fed into the amplifier would be reduced by ½ (-6dB) as compared to what would be measured on the output of the preamp when it was not connected to the amp (no load condition).
A loss of –6dB would certainly be significant from a gain standpoint; you would probably have to turn up the preamp volume control considerably. But unless the preamp simply could not provide that much extra gain and began clipping the signal, one would not expect to hear any difference in sound QUALITY. By that standard, a –1dB gain loss should not introduce any negative effects whatsoever. And a –1dB loss, caused by an amplifier exhibiting a lower input impedance (10K-Ohm or so), is about the worst case we run into in high-end audio equipment. Therefore, we would not expect to hear any audible degradation of the sound, even in this "worst case" scenario.
Now… the above only holds true as long as both the output and the input impedances of the connected devices are "flat" from a frequency response standpoint. If a product is advertised as "truly" high-end, one would hope its designers have made provision for such a flat impedance response. For complicated technical reasons, though, it is not typically possible to provide a flat impedance response under all impedance mismatch conditions. This is especially true of a device's output impedance (the preamp side). Nevertheless, good "modern" design techniques suggest that a preamp should be able to meet such a "flat impedance response" requirement when faced with a typical "worst case" load impedance of about 10K-Ohms.
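As one common, concrete example of why a preamp's output impedance is rarely flat with frequency: many preamps (tube designs especially) use an output coupling capacitor, which forms a high-pass filter with the amplifier's input impedance. A small Python sketch, using assumed component values of my own choosing rather than measurements of any particular product:

```python
import math

def highpass_corner_hz(c_farads: float, z_out: float, z_in: float) -> float:
    """-3dB corner of the high-pass filter formed by a preamp's output
    coupling capacitor in series with the amp's input impedance."""
    return 1 / (2 * math.pi * (z_out + z_in) * c_farads)

C = 0.47e-6  # a 0.47uF coupling cap (hypothetical, sized for a high-Z load)

# Into a 100K-Ohm amp input, bass extension is fine:
print(f"{highpass_corner_hz(C, 1_000, 100_000):.1f} Hz")  # ~3.4 Hz

# Into a 10K-Ohm input, the corner climbs into the audible bass region:
print(f"{highpass_corner_hz(C, 1_000, 10_000):.1f} Hz")   # ~30.8 Hz
```

The same preamp that measures ruler-flat into a 100K-Ohm load can lose audible bass into a 10K-Ohm load, which is exactly the kind of load-dependent behavior being described here.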
If a preamp cannot achieve this level of performance, it does not necessarily mean that it will not "sound" good, especially when driving an amplifier that exhibits a higher input impedance. It does mean, though, that the designer has severely restricted the preamp's possible use with other products. There is no "right" or "wrong" in this regard; it is simply a matter of choice on the part of the product's designer. It is clearly a matter of opinion, but we question the wisdom of such a design philosophy from a marketing standpoint. At a minimum, buyers of such a product should understand that their peripheral equipment (i.e., amplifier) options will be limited to those whose input impedances more closely match the requirements of their preamp.
You see… there is strong motivation on the part of the amplifier designer to keep the amp's input impedance as low as reasonably possible. High impedance inputs are far more susceptible to external and internal noise pickup and distortion. Good "low noise" design therefore suggests that the input impedance be kept as low as possible. This is in direct conflict with many (especially tube) preamp designs. Tubes are relatively high impedance devices and require special tube types and/or circuits (a cathode follower output stage, for example) to achieve good low output impedance performance. From a technical standpoint, designing a tube preamp to have a lower output impedance is not a significant challenge, but doing so usually comes at a slightly higher financial cost due to the expense of better tubes and/or a higher parts count. Unfortunately, a common practice is for designers to simply avoid the added cost and parts count and then recommend the preamp be used with amplifiers of higher input impedance. Doing so, though, forces the amplifier designer to make potentially significant compromises in the performance of his product if he wishes to accommodate such a preamp.
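To put a rough number on one part of that noise trade-off: the thermal (Johnson) noise of the input impedance itself sets a floor that grows with resistance. A Python sketch, assuming room temperature and a 20kHz audio bandwidth (this captures only the irreducible thermal component; induced hum and RF pickup also worsen with higher impedance):

```python
import math

K_BOLTZMANN = 1.380649e-23  # J/K

def thermal_noise_uv(r_ohms: float, bandwidth_hz: float = 20_000,
                     temp_k: float = 300) -> float:
    """RMS Johnson (thermal) noise voltage, in microvolts, of a
    resistance over a given bandwidth: v = sqrt(4*k*T*R*B)."""
    return math.sqrt(4 * K_BOLTZMANN * temp_k * r_ohms * bandwidth_hz) * 1e6

print(f"{thermal_noise_uv(10_000):.2f} uV")   # 10K-Ohm input:  ~1.82 uV
print(f"{thermal_noise_uv(100_000):.2f} uV")  # 100K-Ohm input: ~5.75 uV
```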
And so the debate continues… "Whose responsibility is it to achieve optimal performance?" Seeing that a lower input impedance can benefit both tube and solid-state power amplifier designs equally, it would seem the "onus" is on the preamp designer. Good luck trying to tell him that though!
I hope this helps.
-Bob