Inserting an A/D/A converter in the chain is not the same as playing a Redbook CD. Without having seen the actual paper, it seems the true result would more likely be that the bandwidth limit and quantization noise of a 16-bit, 44.1 kHz signal were not sufficient to let listeners statistically distinguish a signal passed through that bottleneck from one that was not.
I think a more interesting test might be to digitally alter a high-rate 24-bit signal, and determine at what point listeners *are* able to distinguish the altered signal from the original.
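Concretely, one rough way to build such a test ladder (a sketch in Python, assuming NumPy and a float array `master` holding the 24-bit samples scaled to ±1; the function name and bit-depth steps are purely illustrative) would be to requantize the same source at progressively coarser word lengths with TPDF dither:

```python
import numpy as np

def requantize(x, bits, seed=0):
    """Requantize a float signal in [-1, 1] to `bits` bits,
    adding TPDF dither of +/- 1 LSB before rounding."""
    rng = np.random.default_rng(seed)
    q = 2.0 ** (bits - 1)   # quantization steps per unit of amplitude
    dither = rng.uniform(-0.5, 0.5, x.shape) + rng.uniform(-0.5, 0.5, x.shape)
    return np.clip(np.round(x * q + dither) / q, -1.0, 1.0)

# Hypothetical ladder of stimuli from one 24-bit master:
# versions = {b: requantize(master, b) for b in (20, 16, 12, 8)}
```

Listeners could then ABX each version against the original to find the point where the alteration first becomes audible.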
Inserting an A/D/A converter is actually worse than playing a CD: the extra A/D conversion is itself a source of lost fidelity, so a CD would be better than what was used for this comparison.
The proper way to make this comparison would be to apply a 44.1 kHz output digital filter to the high-resolution data stream and dither the result to 16 bits.
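Something along these lines (a minimal sketch, assuming SciPy is available, a 96 kHz source rate, and a float array `hires` scaled to ±1; `to_redbook` is just an illustrative name): resample_poly supplies the low-pass output filter as part of the rate conversion, and TPDF dither is added before the 16-bit rounding.

```python
import numpy as np
from scipy.signal import resample_poly

def to_redbook(hires, seed=0):
    """Filter a 96 kHz float signal down to 44.1 kHz and dither it to 16 bits."""
    x44 = resample_poly(hires, up=147, down=320)   # 96000 * 147/320 = 44100
    rng = np.random.default_rng(seed)
    dither = rng.uniform(-0.5, 0.5, x44.shape) + rng.uniform(-0.5, 0.5, x44.shape)
    return np.clip(np.round(x44 * 32767 + dither), -32768, 32767).astype(np.int16)
```

Both versions would then carry the same master, and any audible difference would have to come from the reduced bandwidth or the raised noise floor.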
You cannot compare an SACD or DVD-A to the same release's Redbook version, because the studios deliberately master them differently to scam the consumer into thinking HD formats are necessary. If they instead took the same data, filtered it to 44.1 kHz, and dithered it to 16 bits, the two recordings would be identical except for noise floor and bandwidth.
The Redbook standard ensures that any error will be below -96 dB or above 20 kHz, so long as the equipment and technique used were up to par.
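Those two numbers fall straight out of the format: an ideal 16-bit quantizer puts its error roughly 20·log10(2^16) ≈ 96 dB below full scale, and a 44.1 kHz sample rate gives a Nyquist limit of 22.05 kHz, just above the 20 kHz audible band. A quick back-of-the-envelope check:

```python
import math

bits, fs = 16, 44100
print(20 * math.log10(2 ** bits))   # ~96.3 dB dynamic range
print(fs / 2)                       # 22050 Hz Nyquist limit
```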
If the playback level is high enough, however, your ears can hear below the noise floor of CDs, and many people can discern signals at frequencies higher than CD can resolve, but that is no more than a simple point of fact.
I would not say that Redbook is the same as SACD or DVD-A, but I would say that more resolution than the Redbook standard provides is completely unnecessary, no matter how "hi-end" you want to go, unless you just want to get silly.