Can you test an amp and determine what impedance in the LDR3X will work best with it?
Sorry for the slow response but was offline for a week of vacation.
The established guidance when connecting audio devices is either to match impedance (output impedance of the source equals the input impedance of the amp) or to have a sufficiently high bridging impedance ratio (at least 10:1, meaning the amp's input impedance is at least 10x the source's output impedance). Impedance matching may be theoretically ideal, but it's impractical, so in practice most sources have output impedances below 1k and most amps have input impedances of 10k or greater.
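As a quick sanity check, the 10:1 rule of thumb is simple enough to express in a couple of lines. This is a minimal sketch, not anything specific to the LDR3X; the function name and the default ratio are just illustrative:

```python
def bridging_ratio_ok(source_zout_ohms, amp_zin_ohms, min_ratio=10):
    """Rule-of-thumb check: amp input impedance should be at
    least min_ratio (10x by convention) the source output impedance."""
    return amp_zin_ohms >= min_ratio * source_zout_ohms

# A 1k source into a 10k amp input just meets the 10:1 guideline...
print(bridging_ratio_ok(1000, 10000))   # True
# ...but a 2.5k source into a 20k amp input falls short of it.
print(bridging_ratio_ok(2500, 20000))   # False
```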
When you place a preamp between a source and an amp, you have two instances of impedance bridging: between the source (DAC, phono stage, CD player, etc.) and the preamp, and between the preamp and the amp.
When we were able to raise the input impedance of our LDRx preamps to 10k last year (the HiZ upgrade), this improved matters between the source and preamp.
However, as the graph below shows, any passive attenuator employing voltage division (like pots or LDRs) has a varying output impedance that reaches a max of 25% of the input impedance at -6dB (loud!!!). In the case of a 10k unit, that's a max output impedance of 2.5k. For a 20k unit, it's 5k. This would then argue for amps having at least 25k input impedance for a 10k passive attenuator or 50k for a 20k passive attenuator.
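The curve in that graph comes straight from the voltage divider math: if the linear attenuation is a, the divider's output impedance (assuming an ideal, low-impedance source driving it) is a·(1−a)·R, which peaks at R/4 when a = 0.5, i.e. at -6dB. A small sketch of that calculation (the function name is just illustrative):

```python
def divider_output_impedance(total_r_ohms, atten_db):
    """Output impedance of a resistive voltage divider (pot/LDR attenuator)
    at a given attenuation, assuming an ideal (0 ohm) source driving it."""
    a = 10 ** (-abs(atten_db) / 20)        # linear gain from dB attenuation
    r_bottom = a * total_r_ohms            # shunt leg of the divider
    r_top = total_r_ohms - r_bottom        # series leg of the divider
    # Looking back into the output, the two legs appear in parallel.
    return r_top * r_bottom / total_r_ohms

# 10k attenuator at -6dB: output impedance peaks near 2.5k (25% of 10k).
print(round(divider_output_impedance(10_000, 6)))   # ~2500
# Same 10k attenuator at -40dB: output impedance drops to roughly 99 ohms.
print(round(divider_output_impedance(10_000, 40)))  # ~99
```

Running the same function across the -40 to -20 dB range shows why the practical numbers in the next paragraph are so much lower than the -6dB worst case.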
Then again, as a practical matter, few of us listen to music at -6dB attenuation. In the more likely range of -40 to -20 dB attenuation, the passive preamp's output impedance is more likely to be in the 1-2k range, which means an amp with an input impedance of 20k or greater should suffice, though arguably 40-50k or higher may be better (that's "may", not "will").
Beyond this level of general guidance, you get into more subtle, equipment-specific issues.