Desktop vs. Laptop USB


audioengr

Re: Desktop vs. Laptop USB
« Reply #40 on: 2 Oct 2006, 05:07 pm »
Quote
If the computers are feeding digital signals to the same decoder and subsequent audio pre-amp, amp, and speakers, then the sound will be EXACTLY the same, because there is no sound being fed, only a code - a set of information that tells the DAC how to make the sound properly.  If an error occurs during the transmission of this information, the DAC will (in 100% of cases) detect it and correct it perfectly.  If it can't be corrected perfectly, it will send a message back to the computer telling it to re-send the missed part, and the process will be repeated until it gets it right.  This entire process takes fractions of a second and is imperceptible because the information is buffered before it starts playing.

Welcome to the world of Boolean algebra and digital electronics.  A technology light years ahead of tube amps, chariots, turntables, bow and arrow making, vinyl LPs - you name it.


The data is exactly the same; it's the timing that is not.  Non-real-time computer data is sent without the timing affecting the result, as long as the timing meets the setup and hold requirements (which results in no data errors).  Real-time audio streaming data, on the other hand, needs both content and timing, and the timing affects the playback quality.

You are simply wrong about this.  The clock frequency and jitter are both critical to the DAC chip's performance.

Steve N.
EE with 30 years digital design experience

welwynnick

  • Jr. Member
  • Posts: 31
Re: Desktop vs. Laptop USB
« Reply #41 on: 10 Oct 2006, 11:20 am »
This stuff is fascinating.  What is it about digital signal processing that degrades sound quality?  I use different digital connections at home, and the differences are very apparent, so all digital data is clearly not the same.  For as long as I can remember people have associated sound quality with jitter, but what I am wrestling with is the tangible consequence of excessive jitter.  In what way does the final analogue audio signal differ when there is more jitter?

Jitter is a time-domain perturbation of a digital stream, but it seems to me that everyone assumes this translates to mis-timing in the analogue output.  "The right amplitude at the wrong time is the wrong amplitude". 

As I understand it, the magnitude of jitter is typically a fraction of the duration of a bit.  There are 16 or 24 bits that go to make up just one amplitude sample, and at least two samples are needed for each cycle.  With lower frequencies and higher sample rates, there will be many samples per cycle - perhaps fifteen, for example. 

Now if jitter is equivalent to say, a tenth of a bit duration, then the consequential time perturbation of the associated sample would be a fifteenth of a sixteenth of a tenth of a bit duration.  So the timing for one of the samples in the cycle in this example may be 1/2400 of a cycle early.  And the timing for the following sample may be the same, or correct, or 1/2400 late.  Of course there are several samples in each cycle, and because of the nature of jitter, these will tend to average out over a period of time.
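The arithmetic above can be checked in a few lines of Python.  The figures (a tenth of a bit duration, 16 bits per sample, fifteen samples per cycle) are the illustrative values from the paragraph, not measurements:

```python
# Worked example of the fractions above (illustrative values only).
bits_per_sample = 16
samples_per_cycle = 15
jitter_in_bit_periods = 0.1   # a tenth of a bit duration

# One sample lasts 16 bit periods and one cycle lasts 15 samples,
# so the timing error expressed as a fraction of one audio cycle is:
error_fraction_of_cycle = jitter_in_bit_periods / (bits_per_sample * samples_per_cycle)

print(error_fraction_of_cycle)      # 1/2400 of a cycle
print(1 / error_fraction_of_cycle)  # 2400.0
```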

The consequence of this is that jitter may be considered, like other forms of distortion, to add an unwanted signal on top of the desired signal.  It's a phase distortion rather than an amplitude distortion, but it still corrupts the signal.  It appears to me that this distortion would be small, but we may be very sensitive to it, and we do need to minimise it.

Does that describe what people consider to be the mechanism for corruption of audio by jitter?

Nick

audioengr

Re: Desktop vs. Laptop USB
« Reply #42 on: 10 Oct 2006, 06:06 pm »
Quote
This stuff is fascinating.  What is it about digital signal processing that degrades sound quality?  I use different digital connections at home, and the differences are very apparent, so all digital data is clearly not the same.  For as long as I can remember people have associated sound quality with jitter, but what I am wrestling with is the tangible consequence of excessive jitter.  In what way does the final analogue audio signal differ when there is more jitter?

Jitter is a time-domain perturbation of a digital stream, but it seems to me that everyone assumes this translates to mis-timing in the analogue output.  "The right amplitude at the wrong time is the wrong amplitude". 

As I understand it, the magnitude of jitter is typically a fraction of the duration of a bit.  There are 16 or 24 bits that go to make up just one amplitude sample, and at least two samples are needed for each cycle.  With lower frequencies and higher sample rates, there will be many samples per cycle - perhaps fifteen, for example. 

Now if jitter is equivalent to say, a tenth of a bit duration, then the consequential time perturbation of the associated sample would be a fifteenth of a sixteenth of a tenth of a bit duration.  So the timing for one of the samples in the cycle in this example may be 1/2400 of a cycle early.  And the timing for the following sample may be the same, or correct, or 1/2400 late.  Of course there are several samples in each cycle, and because of the nature of jitter, these will tend to average out over a period of time.

The consequence of this is that jitter may be considered, like other forms of distortion, to add an unwanted signal on top of the desired signal.  It's a phase distortion rather than an amplitude distortion, but it still corrupts the signal.  It appears to me that this distortion would be small, but we may be very sensitive to it, and we do need to minimise it.

Does that describe what people consider to be the mechanism for corruption of audio by jitter?

Nick

Jitter is really just non-linear frequency modulation.  It creates "sidebands" outside the frequency range of the original signal in the frequency domain.  The amplitude of the sidebands is a function of the amplitude of the modulation or jitter.

As for human capability to hear this, I believe it depends on the spectrum of the jitter.  You can imagine jitter that is modulating fairly slowly, say at 500 Hz.  The sidebands this creates would probably be audible even at low amplitude - even with only a nanosecond of jitter.

The other thing about jitter is the measurements.  Jitter measurements on a single component are not very interesting, because there is always a system involved.  A system comprises cables, at least two components, terminations, slew-rates, impedance matching, ground-loops, maybe some EMI effects, etc.  Throw all of these in and you may end up with 2-3 nsec of total jitter, some of it caused by 60 Hz noise on the grounds.  It always amazes me when I read that Stereophile has measured 50 psec of jitter on some component.  I can take that same component and reduce the jitter by probably 75% and hear the difference.
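The sideband mechanism described above can be sketched numerically.  The Python snippet below uses purely illustrative values (a 4 kHz tone sampled at 48 kHz, with 5 ns of sinusoidal jitter at 500 Hz): it samples a sine wave at slightly displaced instants and shows sidebands appearing at f0 ± 500 Hz in the spectrum.

```python
import numpy as np

fs = 48_000     # sample rate, Hz
f0 = 4_000      # audio tone, Hz
fj = 500        # jitter (modulation) frequency, Hz
jitter = 5e-9   # 5 ns peak jitter -- an illustrative value

n = np.arange(1 << 14)
t_ideal = n / fs
# Each conversion instant is displaced sinusoidally: this is the
# "non-linear frequency modulation" view of jitter.
t_actual = t_ideal + jitter * np.sin(2 * np.pi * fj * t_ideal)
x = np.sin(2 * np.pi * f0 * t_actual)

spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
freqs = np.fft.rfftfreq(len(x), d=1 / fs)

# Sidebands appear at f0 +/- fj; their level grows with the jitter amplitude.
for f in (f0 - fj, f0, f0 + fj):
    k = np.argmin(np.abs(freqs - f))
    print(f"{f} Hz: {20 * np.log10(spec[k] / spec.max()):.1f} dBc")
```

Doubling `jitter` raises the sideband level by about 6 dB, which matches the point that the amplitude of the sidebands is a function of the amplitude of the jitter.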

Steve N.

welwynnick

  • Jr. Member
  • Posts: 31
Re: Desktop vs. Laptop USB
« Reply #43 on: 12 Oct 2006, 08:58 am »
Right.  That's what I expected.   But I'm going to switch round and take a different direction. 

Phase or frequency modulation; I see them as much the same thing.  The point is that jitter is supposed to manifest itself as a corruption of the eventual analogue output waveform - in other words, the signal composed of a sequence of re-constructed analogue samples.

Therefore the analogue audio will show the corruption if those samples are at the wrong time.  But it is the bits that are jittered, so there is a process that separates the bit jitter from the sample jitter - the pulse code demodulation, where the bits are re-assembled into sample values (the digital representation of the sample amplitude).  I'm trying not to call this digital-to-analogue conversion generically, because that includes all functions.

Now, I don't know enough about how DACs work, but I assume that it will derive a clock from the bits, or the bit/word clock, to time the output of the new analogue sample values.  And this clock is obviously different to the bit clock in frequency (kHz rather than MHz).  But what I don't understand is how it is related in phase/timing? 

Doesn't that DAC have its own PLL that locks onto, say, the bit clock and tracks that, perhaps smoothing the short-term timing variations with a filter in the phase control loop, but tracking the overall trend?  I think the timing variation in jitter varies from bit to bit, but overall, the frequency is stable.  So isn't the DAC PLL buffering the bit jitter before it becomes sample jitter?  Surely the DAC uses its derived clock to time the samples, rather than simply taking every sixteenth (or whatever) bit as the timing reference?

To understand things better, I think I need to consider bit jitter and sample jitter as being separate and somewhat disconnected.  Have I explained myself sensibly?  This leads onto something else.....

regards,   Nick

audioengr

Re: Desktop vs. Laptop USB
« Reply #44 on: 12 Oct 2006, 04:58 pm »
Quote
Now, I don't know enough about how DACs work, but I assume that it will derive a clock from the bits, or the bit/word clock, to time the output of the new analogue sample values.  And this clock is obviously different to the bit clock in frequency (kHz rather than MHz).  But what I don't understand is how it is related in phase/timing?

If the signal is S/PDIF, then the receiver derives the clock from the data stream.  If it is I2S, then the clocks are all separate.  The DAC only uses the clocks to shift and convert the data into analog.

Quote
Doesn't that DAC have its own PLL that locks onto, say, the bit clock and tracks that, perhaps smoothing the short-term timing variations with a filter in the phase control loop, but tracking the overall trend?  I think the timing variation in jitter varies from bit to bit, but overall, the frequency is stable.  So isn't the DAC PLL buffering the bit jitter before it becomes sample jitter?  Surely the DAC uses its derived clock to time the samples, rather than simply taking every sixteenth (or whatever) bit as the timing reference?

No buffering in DAC chips.  No PLL.  The receiver has a PLL only if it is also upsampling and using a local clock to do this, such as the CS8420.  The Receiver chip generally separates the data and clock from the S/PDIF stream, but it does not buffer or reclock.

Most modern DAC chips use the bit clock to clock in the bits and then also to do the conversion.  They do not use the word clock for this.  The word clock is only a qualifier that tells it where the word boundaries are.
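A toy sketch may help picture the framing described above: the bit clock shifts the bits in one at a time, while the word clock only marks where sample words begin and end.  This is an illustrative Python model (the function name and layout are made up for the sketch), not real DAC code:

```python
# Toy I2S-style deserializer: one bit is shifted in per bit-clock (SCLK)
# edge, MSB first; the word clock (LRCK) is modeled only as a fixed word
# boundary every `word_len` bits -- it times nothing by itself.
def deserialize(bits, word_len=16):
    """Group a serial bit stream into signed (two's-complement) samples."""
    words = []
    for i in range(0, len(bits) - word_len + 1, word_len):  # LRCK boundary
        word = 0
        for b in bits[i:i + word_len]:   # one bit per SCLK edge
            word = (word << 1) | b
        if word >= 1 << (word_len - 1):  # interpret the sign bit
            word -= 1 << word_len
        words.append(word)
    return words

# Two 16-bit samples: +1 followed by -1.
stream = [0] * 15 + [1] + [1] * 16
print(deserialize(stream))   # [1, -1]
```

The point of the sketch is that only the bit-clock edges actually time anything; the word clock merely qualifies word boundaries, which is why bit-clock jitter is what matters for the conversion.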

Steve N.

welwynnick

  • Jr. Member
  • Posts: 31
Re: Desktop vs. Laptop USB
« Reply #45 on: 16 Oct 2006, 03:07 pm »
Thanks Steve, I think I'm starting to see where the benefits in computer audio come from.  You want to avoid having the source impart any timing information into the data.  A player puts timing information in from its own internal clock, but a PC will just give you a data stream, and the Off-Ramp or DAC will be the only source of time (and hence jitter).  That assumes that the streaming data is adequately rate controlled to ensure there is no shortfall or overflow.  You can manage jitter effectively with a USB to I2S convertor, and the DAC then takes its accurate timing information from the convertor.  Do I understand correctly?
 
BR,  Nick

audioengr

Re: Desktop vs. Laptop USB
« Reply #46 on: 16 Oct 2006, 07:07 pm »
Quote
Thanks Steve, I think I'm starting to see where the benefits in computer audio come from.  You want to avoid having the source impart any timing information into the data.  A player puts timing information in from its own internal clock, but a PC will just give you a data stream, and the Off-Ramp or DAC will be the only source of time (and hence jitter).  That assumes that the streaming data is adequately rate controlled to ensure there is no shortfall or overflow.  You can manage jitter effectively with a USB to I2S convertor, and the DAC then takes its accurate timing information from the convertor.  Do I understand correctly?
 
BR,  Nick

Yes, except that there is a PLL involved in the USB to S/PDIF conversion currently.  This is necessary because the USB protocol most commonly used is Adaptive Isochronous.  I have a software project in the works for my next generation that changes this to asynchronous.  This eliminates the need for a PLL.
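The difference between the two modes can be caricatured in a few lines of Python.  This is purely illustrative (function names and numbers are invented for the sketch, and it is not the USB spec): in adaptive mode the converter's PLL slews its clock to follow the host's average delivery rate, while in asynchronous mode the converter runs on its own fixed clock and simply tells the host how much data to send.

```python
# Toy contrast of adaptive vs. asynchronous isochronous transfer.

def adaptive_clock(host_rates, gain=0.1, local=44_100.0):
    """Adaptive mode: the device clock slews toward the host's
    delivery rate via a PLL-like loop filter."""
    history = []
    for r in host_rates:
        local += gain * (r - local)   # clock follows the host's rate
        history.append(local)
    return history

def async_request(local=44_100, interval=0.001):
    """Asynchronous mode: the device keeps its own fixed clock and
    requests exactly what that clock consumes per interval."""
    return round(local * interval)    # samples requested per 1 ms frame

# A host delivering slightly fast: the adaptive clock drifts up to meet it,
# so the host's timing leaks into the conversion clock.
print(adaptive_clock([44_105] * 50)[-1])
# The asynchronous device never moves its clock; it just asks for data.
print(async_request())   # 44
```

This is why asynchronous operation removes the need for a PLL: the conversion clock never has to track anything upstream, at the cost of the device having to manage the buffer by varying its requests.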

In the meantime, I have a more near-term design in layout for a device called the "Pace-Car".  This generates the clock for the Off-Ramp I2S and also an output clock that is very low jitter, essentially the jitter of a Superclock.  I hope to have this finished and functioning for 2007 CES in January.  It is a 2-box solution, but should be the lowest jitter digital on the planet.

Steve N.

welwynnick

  • Jr. Member
  • Posts: 31
Re: Desktop vs. Laptop USB
« Reply #47 on: 16 Oct 2006, 10:01 pm »
Adaptive Isochronous?  That sounds like i-Link, and why I mentioned rate control.  Does asynchronous USB with no PLL mean that the S/PDIF interface will be asynchronous as well?  This is interesting stuff, as I keep repeating ad nauseam.  It sounds like you are trying to remove all sources of timing error, starting at the front and working your way through to the last possible point, so that there are as few synchronous elements in the chain as possible.  Hence minimising jitter.  Please don't feel obliged to reply to every message, but I can't help thinking out loud sometimes.

regards,   Nick

ps. I do notice that some new AV amps are starting to use USB inputs now, presumably to accept media centres.  I suppose it's for convenience rather than quality, but maybe there's some potential there.

audioengr

Re: Desktop vs. Laptop USB
« Reply #48 on: 16 Oct 2006, 11:28 pm »
Quote
Adaptive Isochronous?  That sounds like i-Link, and why I mentioned rate control.  Does asynchronous USB with no PLL mean that the S/PDIF interface will be asynchronous as well?  This is interesting stuff, as I keep repeating ad nauseam.  It sounds like you are trying to remove all sources of timing error, starting at the front and working your way through to the last possible point, so that there are as few synchronous elements in the chain as possible.  Hence minimising jitter.

Yes, the S/PDIF, AES and I2S converters will be asynch.

Actually, most current converters are Adaptive Isochronous, except the Creative Audigy 2 NX, which is asynchronous.  This is why pops and ticks can occur.  I want to create an asynchronous driver so pops will never happen.  This has nothing to do with jitter, just data underrun at the computer, but the side-effect is that the essential PLL causes some jitter, albeit quite low with the right design.

My Pace-Car deals with minimizing jitter with the current Adaptive Isochronous type converters.

Steve N.

welwynnick

  • Jr. Member
  • Posts: 31
Re: Desktop vs. Laptop USB
« Reply #49 on: 10 Nov 2006, 01:54 pm »
Steve,

Sorry in advance for the never-ending stream of stupid questions, but after hearing the Slim Devices Transporter at a London HiFi show last week, I am becoming a convert.

I've been trying to understand I2S better.  The purpose appears to be to keep the timing information away from the amplitude information.  But even then, taking the player's timing info and feeding it forwards to the DAC will inevitably add a small amount of additional jitter.  I would speculate that this would be reduced if the timing came from the DAC, and could be fed back to the player.

I read up on I2S and found that either the source or the sink can have the master clock, suggesting that the clock lines can go either way.  I guess the practical implementation will be constrained by the configuration of the proprietary interface.  However - I don't know what that is! 

So the stupid question is - does the timing info usually go from the player to the DAC, or from the DAC to the player?

Cheers,   Nick

Edit:  I'm starting to think that the integrity of the amplitude data is almost inconsequential - almost any old player and connection can get all the bits right.  Sound quality seems to be dominated by the jitter, or at least the jitter spectrum.  This part of digital links should probably be considered analogue rather than digital.  Digital means the data is coded in numbers, but timing info is still analogue - it's down to exactly when in time the pulse transitions happen to come along.

audioengr

Re: Desktop vs. Laptop USB
« Reply #50 on: 10 Nov 2006, 10:10 pm »
Quote
Steve,

Sorry in advance for the never-ending stream of stupid questions, but after hearing the Slim Devices Transporter at a London HiFi show last week, I am becoming a convert.

I've been trying to understand I2S better.  The purpose appears to be to keep the timing information away from the amplitude information.  But even then, taking the player's timing info and feeding it forwards to the DAC will inevitably add a small amount of additional jitter.  I would speculate that this would be reduced if the timing came from the DAC, and could be fed back to the player.

I read up on I2S and found that either the source or the sink can have the master clock, suggesting that the clock lines can go either way.  I guess the practical implementation will be constrained by the configuration of the proprietary interface.  However - I don't know what that is! 

So the stupid question is - does the timing info usually go from the player to the DAC, or from the DAC to the player?

Usually, there is no clock in the DAC.  The S/PDIF stream is decoded into data and clocks by the "receiver" chip.  The Transport or computer converter generates the clock.  Usually if there is a clock in the DAC, it is only for decoding the sample rate or performing upsampling.  Most of the so-called "reclockers", such as the Monarchy, just use an upsampling chip to asynchronously reclock the data.

A design with the clock located at the DAC chip might do better with jitter.  Most of the designs that implement this use the "word-clock".  There are only a few DACs left that will actually benefit from this.  It turns out that most modern DAC chips use the bit-clock (SCLK) or the master-clock (MCLK) to clock the D/A conversion, not the word-clock.

My newest invention, the Pace-Car, develops a low-jitter clock (non-PLL).  It inserts between the Off-Ramp I2S and a DAC with I2S input.  I seriously doubt if any clock inside a DAC will be any lower jitter than this.  Even though the current Off-Ramp clock is a PLL, it is extremely low jitter.

Your thinking about jitter is right on the money.  This white-paper I wrote should help:
http://www.positive-feedback.com/Issue22/nugent.htm

Steve N.