It's likely not the bit depth that matters, whether 16 or 24, as almost no consumer amplifiers have a signal-to-noise ratio of over 100 dB, and almost certainly no tube amps do. So, though a difference may exist, if we can hear it, it can't be explained by the extra bit depth alone, since the amplifier's noise floor swamps it anyway.
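For the number-crunchers: the theoretical dynamic range of an ideal N-bit converter works out to roughly 6.02 × N + 1.76 dB, which is why 16-bit is usually quoted at about 96-98 dB. A quick sketch of just that formula (it assumes a full-scale sine and ideal quantization, and says nothing about any real amp or DAC):

```python
# Theoretical dynamic range of an ideal N-bit quantizer with a
# full-scale sinusoidal signal: DR ~= 6.02 * N + 1.76 dB.
# A sketch of the textbook formula, not a claim about real hardware.
def dynamic_range_db(bits: int) -> float:
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(f"{bits}-bit: ~{dynamic_range_db(bits):.0f} dB")
# 16-bit: ~98 dB
# 24-bit: ~146 dB
```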
It's the sampling rate, Denny/DGO. Redbook's 44,100 samples per second is simply an insufficient sampling of the complex waveforms that make up music. Further, the very real 22 kHz limit (half the sample rate) may also be insufficient to capture the top end of an event and record it for playback.
24/96 or 24/192 with MLP is indeed a very real advance on Redbook, but it has little to do with dynamic-range improvement. It has to do with capturing the nuance of real live music and top-end/treble extension through higher sampling rates.
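To make that 22 kHz ceiling concrete: anything above half the sample rate doesn't just get lost, it folds back down into the audible band (aliases) unless the filter removes it first. A minimal sketch, assuming numpy and an arbitrary 25 kHz test tone:

```python
import numpy as np

fs = 44_100            # Redbook sample rate, Hz
f_tone = 25_000        # a tone above the 22,050 Hz Nyquist limit
n = np.arange(1024)
samples = np.sin(2 * np.pi * f_tone * n / fs)

# The sampled data is indistinguishable from a tone at fs - f_tone:
alias = np.sin(2 * np.pi * (fs - f_tone) * n / fs)
print(np.allclose(samples, -alias))  # True: 25 kHz folds to 19.1 kHz
```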
John, I know you think I simply prefer digital because of convenience. It really has to do with the fact that digital is more transparent and dynamic than analog tape as a recording medium.
Saying 44.1 kHz and its filter cutoff are insufficient is mainly conjecture for logicians. It becomes a purely academic debate. The question is: can it be proved empirically?
Can you hear a difference between analog and digital? Many say yes. I say yes as well. But I say digital is better. I say it's better, because it is more linear, quieter and more transparent.
Many prefer the sound of analog. But is it that they prefer the (in theory only) better waveform reproduction, or is it that they simply like the sound because of a subconscious fondness they've developed for the warm fuzzies analog recreates?
Does human perception require a higher sample rate in order to perceive a continuous analog waveform?
I've read theories asserting that human hearing is not a continuous analog function at all, but a discrete stream of neurons firing one after another, with the brain filling in the gaps, just like digital.
Ok, so let's play logician.
Let's discuss sample rates. One millisecond is one thousandth of a second, and 44.1 kHz means 44,100 samples per second, so roughly 44 samples per millisecond. Each 16-bit sample therefore lasts 1/44,100 of a second, or about 0.0227 milliseconds.

The frame rate for motion pictures is 24 frames per second, so a single frame stays on screen for about 41.7 milliseconds. That's 1,837.5 times longer than one Redbook sample (44,100 / 24), yet humans perceive a continuous analog stream of movement while watching a movie (though there are a few claims of people who can indeed see a strobe effect in motion pictures).
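If anyone wants to check that arithmetic, here it is in a few lines (just the numbers above, no perceptual claims):

```python
# CD sample period vs. film frame duration: pure number-crunching.
fs_cd = 44_100          # samples per second (Redbook)
fps_film = 24           # frames per second

sample_period_ms = 1_000 / fs_cd     # ~0.0227 ms
frame_period_ms = 1_000 / fps_film   # ~41.7 ms

print(f"CD sample period:  {sample_period_ms:.4f} ms")
print(f"Film frame period: {frame_period_ms:.1f} ms")
print(f"Ratio: {frame_period_ms / sample_period_ms:.1f}x")  # 1837.5x
```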
So how important is it that we double the sample rate of the Redbook format?
One thing further: if you want to number-crunch, here's some food for thought. Some years ago a well-respected audio technician named Barry Fox did a mathematical comparison of the relative bit-rates (information-transmission density, if you will) of CD vs. vinyl.
We know that domestic CD audio samples at 44,100 Hz with 16 bits per sample, so its maximum bit-rate is 705,600 bits per channel per second. Mr. Fox then suggested that, if we take some nominal values for things like the minimum cantilever excursion needed to produce an audible signal and the contact area of the stylus/groove interface, vinyl calculates out by the same approach to a resolution on the order of 2 MILLION bits per channel per second!
Again, that's just number crunching. It means analog carries roughly 2.83 times more bits than Redbook, and those extra analog bits are buried in the noise floor and nonlinearities.
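Spelling that comparison out (taking Fox's 2 million bits/s figure as given, since I can't rederive his cantilever assumptions):

```python
# CD vs. vinyl bit-rate comparison. The vinyl figure is Barry Fox's
# nominal estimate as reported above, not something derived here.
fs = 44_100
bits_per_sample = 16
cd_rate = fs * bits_per_sample        # 705,600 bits/channel/second
vinyl_rate = 2_000_000                # Fox's estimate

print(f"CD:    {cd_rate:,} bits/ch/s")
print(f"Vinyl: {vinyl_rate:,} bits/ch/s (estimated)")
print(f"Ratio: {vinyl_rate / cd_rate:.2f}x")  # ~2.83x
```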
Factor in everything required to get a sound onto and off of an analog format: lots of noise, lots of distortion, and lots of channel crosstalk.
How many people are even aware of how Dolby or dbx noise reduction works on analog tape? Encode/decode.
Digital = encode/decode.
Dolby or dbx-type noise reduction has been used on nearly every professional analog multi-track recorder since the sixties.
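For anyone who hasn't looked at how those systems work: Dolby and dbx are companders. They compress the signal's dynamics going onto tape and expand them back on playback, which pushes the tape hiss down with them. Here's a toy 2:1 sketch of that encode/decode idea; real Dolby/dbx circuits are band-split and level-dependent, so treat this as a cartoon, not the actual algorithm:

```python
import numpy as np

# Toy 2:1 compander in the spirit of dbx: compress level before the
# (noisy) tape, expand after. Decode exactly inverts encode.
def encode(x):           # 2:1 level compression, sign preserved
    return np.sign(x) * np.sqrt(np.abs(x))

def decode(y):           # matching 2:1 expansion
    return np.sign(y) * y**2

signal = 0.01 * np.sin(np.linspace(0, 2 * np.pi, 100))  # quiet passage
tape_noise = 0.001 * np.random.randn(100)               # constant hiss

plain = signal + tape_noise                      # no NR: hiss rides along
companded = decode(encode(signal) + tape_noise)  # hiss expanded downward

print(np.std(plain - signal))      # residual noise without NR
print(np.std(companded - signal))  # noticeably lower with NR
```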
Cheers