So suppose I have a 16-bit CD of, say, Beethoven's 7th conducted by Kleiber Jr., and it's about 500 MB uncompressed. Does that become a file of size
(192 / 16) × 500 MB = 6 GB
in the 192-bit version? Your terabyte hard drive would then go from holding 1800 symphonies down to 150. OUCH!
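A quick sanity check on that arithmetic, in Python. The numbers are the same rough ones as above (a ~500 MB rip, and a "1 TB" drive that formats to roughly 931,000 MB is my assumption), so this is just the scaling, not a measurement:

```python
# Rough scaling of the numbers above: file size grows linearly with bits per sample.
cd_mb = 500                      # assumed uncompressed 16-bit rip of the symphony
bits_cd, bits_big = 16, 192

big_mb = cd_mb * bits_big / bits_cd
print(f"192-bit version: {big_mb / 1000:.1f} GB")                  # 6.0 GB

drive_mb = 931_000               # a "1 TB" drive after formatting, roughly
print(f"symphonies per TB at 16 bits:  {drive_mb // cd_mb:.0f}")   # ~1862
print(f"symphonies per TB at 192 bits: {drive_mb // big_mb:.0f}")  # ~155
```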
It would be interesting to see what a lossless compression algorithm would do with this thing. The 16-bit CD should compress to about 300 MB in, say, FLAC. I have a sneaking suspicion that, if it's smart enough, the compression algorithm should be able to take that 192-bit file and compress it into... the same 300 MB? The reason being that 20 kHz is 20 kHz, no matter how many bits we use.
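Here's a toy version of that experiment anyone can run. It's a sketch, not the real thing: zlib stands in for FLAC and a one-second sine tone stands in for Beethoven, so take the exact byte counts with a grain of salt. The point is that the padding bits in the widened file carry no information and should mostly compress away:

```python
import math
import zlib

# Toy sketch (not FLAC): one second of a 1 kHz tone at 44.1 kHz, stored once as
# ordinary 16-bit samples and once widened to hypothetical 192-bit words (the
# same samples, left-shifted so the extra bits are pure padding). Then compare
# what a generic lossless compressor (zlib) does with each stream.

RATE = 44_100
samples = [round(30_000 * math.sin(2 * math.pi * 1_000 * n / RATE)) for n in range(RATE)]

pcm16 = b"".join(s.to_bytes(2, "little", signed=True) for s in samples)           # 2 bytes/sample
pcm192 = b"".join((s << 176).to_bytes(24, "little", signed=True) for s in samples)  # 24 bytes/sample, same information

for name, raw in [("16-bit", pcm16), ("192-bit", pcm192)]:
    packed = zlib.compress(raw, 9)
    print(f"{name:>8}: raw {len(raw):>9,} bytes -> compressed {len(packed):>9,} bytes")
```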
If Shannon's theorem says what we think it says, then when the signal is ultimately taken back to analog, it should not matter whether it was stored in 16 bits or 192 bits, since the 16-bit CD format already covers the spectrum up to 20 kHz. There cannot be a difference between them if we assume that math is an accurate subject.
Shannon's theorem is correct, regardless of the font size used in its proof.
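For the record, the theorem being leaned on here is the Nyquist–Shannon sampling theorem, which is a statement about the sampling rate (CD audio's 44.1 kHz comfortably exceeds twice 20 kHz). The standard statement, in case anyone wants to check the font size themselves:

```latex
% Nyquist--Shannon sampling theorem: if x(t) is band-limited to B Hz,
% i.e. X(f) = 0 for |f| > B, then samples taken at any rate f_s > 2B
% determine x(t) exactly, via sinc interpolation:
x(t) = \sum_{n=-\infty}^{\infty} x\!\left(\frac{n}{f_s}\right)
       \operatorname{sinc}(f_s t - n),
\qquad \operatorname{sinc}(u) = \frac{\sin(\pi u)}{\pi u}.
```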
Now, if we want to convey musical information beyond 20 kHz, that's another matter. Our household pets may have something to bark on the subject...