All non-defective disc readers are 100% accurate (they use oversampling to make sure). Though the Red Book standard allowed for read errors, the drive technology was taken from the computer industry in the 1990s, where bit-exact reads are a requirement. There is 0% error (which is to say that oversampling catches any error that may occur and the correct data is handed down the line) and no room for improvement in read quality (except perhaps in the ability to read through scratches or other defects). The fact that your computer can load a program from a CD and run it tells you that the pit-land read is 100% accurate.
Digital signals don't "get fuzzy" until they approach the point of complete failure. All of the data gets there or none of it does (with exceptions only in a very narrow window between the two). The fact that your computer can load a program from a USB-attached CD drive and run it tells you that the data transfer is 100% intact. Yes, there is error correction, which *may* not be used in PCM audio streams (I'm not sure); but as someone who's spent time looking at the error rates, they approach zero.
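The real mechanism on the disc is CIRC (Reed-Solomon coding), which I won't pretend to reproduce here, but here is a toy Python sketch of the all-or-nothing behaviour I mean. The function and the CRC check are made up for illustration, not how a drive actually works internally:

```python
import zlib

def read_block_with_check(block: bytes, expected_crc: int) -> bytes:
    """Toy stand-in for a drive's error handling: either the block verifies
    bit-for-bit, or the mismatch is caught and flagged. Nothing 'slightly
    wrong' slips through unnoticed."""
    if zlib.crc32(block) != expected_crc:
        raise IOError("read error detected: re-read or correct, don't pass it on")
    return block

# Pretend this is one sector's worth of data read cleanly from the disc.
data = (bytes(range(256)) * 10)[:2352]
crc = zlib.crc32(data)
assert read_block_with_check(data, crc) == data

# Flip a single bit and the error is detected, not passed along as "fuzz".
corrupted = bytearray(data)
corrupted[100] ^= 0x01
try:
    read_block_with_check(bytes(corrupted), crc)
except IOError as exc:
    print(exc)
```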
Jitter is a timing mismatch. The read windows create margins of error, and it is actually this timing issue that affects maximum transmission rates. The rate needed for PCM is about 600MB per hour; the margin of error still allows for transfer rates of 625MB per second. It is a non-issue while the signal is still in the digital domain.
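For anyone who wants to check the "about 600MB per hour" figure, it falls straight out of the Red Book numbers (44.1kHz, 16-bit, stereo):

```python
# Where the "about 600MB per hour" figure comes from: Red Book PCM is
# 44,100 samples per second, 2 channels, 16 bits per sample.
sample_rate = 44_100          # samples per second, per channel
channels = 2
bytes_per_sample = 2          # 16-bit

bytes_per_second = sample_rate * channels * bytes_per_sample      # 176,400 B/s
mb_per_hour = bytes_per_second * 3600 / 1_000_000                 # ~635 MB/hour
mib_per_hour = bytes_per_second * 3600 / (1024 * 1024)            # ~606 MiB/hour

print(f"{bytes_per_second:,} bytes/s -> {mb_per_hour:.0f} MB/hour ({mib_per_hour:.0f} MiB/hour)")
```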
It does not matter how the PCM data gets to your DAC. It could be written on a piece of paper, carried over in a Winnebago, typed into a hex editor, and then dumped over any connection fast enough to feed data at 600MB per hour. It simply makes no difference.
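If you want to convince yourself of that, fingerprint the PCM bytes after delivery; as long as the bits arrive intact, every route produces exactly the same stream and nothing downstream can tell them apart. The placeholder data below is made up purely for illustration:

```python
import hashlib

def fingerprint(pcm_bytes: bytes) -> str:
    """Bit-exact fingerprint of a PCM stream: if two deliveries of the
    'same' audio differ by even one bit, the digests will not match."""
    return hashlib.sha256(pcm_bytes).hexdigest()

# Stand-ins for the same track delivered two different ways
# (ripped from disc, copied over USB, typed in by hand: it doesn't matter).
pcm_via_path_a = bytes(1000)
pcm_via_path_b = bytes(1000)

# Identical bits in, identical fingerprint out; the DAC cannot tell
# which route the data took.
assert fingerprint(pcm_via_path_a) == fingerprint(pcm_via_path_b)
print(fingerprint(pcm_via_path_a))
```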
The first spot where even the theory allows for deviation (without something being defective or misused) is in your DAC, where a clock imperfection would affect the output frequency... anyone know what the margin of error is there?
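To put the question in concrete terms: a clock that is off by some parts-per-million shifts the effective sample rate, and therefore every reproduced frequency, by the same proportion. I don't know what spec real DAC clocks actually hit, so the 100 ppm in this sketch is purely an assumed figure:

```python
import math

# The 100 ppm clock error below is an assumed figure for illustration,
# not a measured spec for any particular DAC.
clock_error_ppm = 100
nominal_rate = 44_100                                # Hz, Red Book sample rate
actual_rate = nominal_rate * (1 + clock_error_ppm / 1_000_000)

# Every reproduced frequency scales by the same ratio as the clock.
test_tone = 1_000.0                                  # Hz, a 1 kHz test tone
shifted_tone = test_tone * (actual_rate / nominal_rate)

# Musical pitch deviation in cents: 1200 * log2(frequency ratio).
cents = 1200 * math.log2(actual_rate / nominal_rate)

print(f"sample clock runs at {actual_rate:.2f} Hz instead of {nominal_rate} Hz")
print(f"a 1 kHz tone plays back at {shifted_tone:.3f} Hz ({cents:.3f} cents sharp)")
```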