The whole process of LDR calibration is just one of the benefits of managing the hardware in software. For example, when you specify and save an impedance value, that's the input impedance your source will see at all LDR attenuation levels for that setting. You can store ten impedance presets (preset #1 is fixed at 20 kohm) and select any of your saved settings.
This means you can fine-tune the Tortuga's input impedance to the optimum level for the source. When configuring my LDRxB (balanced) for a new source, I start by saving broad steps (e.g., 10 kohm apart). This gets me into the right ballpark. In my most recent impedance configuration, I found that the 40k to 50k range seemed to sound best. The next step is to fine-tune that range; I find 2k to 4k steps are good for narrowing it down. One last pass in 1k steps usually finds the winner.
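The coarse-to-fine search described above can be sketched in a few lines of Python. This is purely my own illustration of the listening procedure, not part of the Tortuga software; the `candidate_impedances` helper and the specific ranges are assumptions for the example:

```python
def candidate_impedances(low, high, step):
    """Return impedance candidates (in ohms) from low to high inclusive,
    spaced by step -- one value per saved preset slot to audition."""
    if step <= 0:
        raise ValueError("step must be positive")
    return list(range(low, high + 1, step))

# Pass 1: broad 10k steps across a plausible range to find the ballpark
coarse = candidate_impedances(10_000, 90_000, 10_000)

# Pass 2: suppose listening narrowed it to the 40k-50k region (as in the text);
# audition that window in 2k steps
fine = candidate_impedances(40_000, 50_000, 2_000)

# Pass 3: one last cycle in 1k steps inside the best fine-pass window
# (the 44k-48k window here is just an example)
final = candidate_impedances(44_000, 48_000, 1_000)
```

Each pass shrinks both the window and the step size, so only a handful of presets need to be saved and compared by ear at each stage.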
However, that's an idealized end point. I've found that some recordings (golden oldies versus recent releases) may benefit from the subtle difference at a different impedance. Being able to instantly switch from one saved setting to another makes the sonic difference more obvious.
A good example of how the Tortuga impedance control system optimizes source performance came when I upgraded my Lampizator Pacific DAC to the new Horizon DAC. I found that the Horizon had a totally different range of preferred Tortuga input impedance than I had been using for the Pacific. These DACs are also vacuum tube designs, and tube rolling is encouraged by the designer at Lampizator. After trying several tube variants in the Horizon, I settled on a set that optimizes realism in various ways. The point here is that each tube change had its own preferred optimum input impedance setting on my Tortuga.
I'm not aware of any other system that provides for the optimization of preamp/controller input impedance with this flexibility.