Using an RCA interconnect vs. a "digital" cable for a DAC


sonicxtc

  • Full Member
  • Posts: 403
Hi,
On the surface, this may seem like a silly question (and maybe it is), but....

I recently received information from a knowledgeable audio industry person. He suggested that an RCA interconnect can be utilized in place of a digital cable. The implication is that "digital cables" generally just "filter" the high frequencies to smooth out the high end.

Has anyone ever tried this?
How does it sound?
I have a Stereovox digital cable (RCA/RCA) with a BNC adapter to fit my DAC, and this would be an interesting option if it works.

Any opinions?

Thank you.

john dozier

  • Jr. Member
  • Posts: 108
Unless your DAC and source both have true 75 ohm connections, it is really academic to even talk about cables. Regards

sonicxtc

  • Full Member
  • Posts: 403
Quote
Unless your dac and source both have true 75ohm connections, it is really academic to even talk about cables. Regards

My source is a mac mini to a hiface usb converter to stereovox digital cable to rca/bnc adapter to Buffalo II DAC.
Does that help clarify the situation? Thanks.

But, I'm also curious if this can work "in general."

Speedskater

  • Full Member
  • Posts: 2733
  • Kevin
I think the person got it upside down (or maybe not).

This is the deal: digital signals are pulses in the low radio frequency range. While the (RCA/RCA) S/PDIF system is very robust, it is best to choose co-ax cables optimized for these low radio frequencies. Good radio frequency co-ax cables have a published "Radio Frequency Characteristic Impedance"; for S/PDIF and many other uses it's 75 ohms. The longer the cable, the more important this number becomes (at 50 feet it's very important, at 3 feet it's not). But from a manufacturing point of view it's easy and inexpensive to make 75 ohm co-ax even if you're not trying to meet a spec, so many analog audio cables will work just fine even if not labeled for digital (remember the part about long cables). One co-ax that you don't want is the RG-6QS (Quad Shield) sold in the big-box stores for cable TV signals.

Maybe the person meant that S/PDIF is a rather low frequency signal in that it is low pass filtered to reduce interference.

At these low frequencies the connectors have almost nothing to do with the signal quality.
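Speedskater's length rule can be put into rough numbers. This is a minimal sketch, assuming a typical 0.66 velocity factor for coax and the common "one tenth of a wavelength" rule of thumb; neither figure comes from the thread, and the bit clock is used as a conservative stand-in for the highest fundamental on the line:

```python
# Rough sketch: estimate when an S/PDIF coax run starts to behave as an
# RF transmission line. The 0.66 velocity factor and 1/10-wavelength
# threshold are common textbook assumptions, not figures from this thread.

C = 299_792_458          # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.66   # typical for solid-polyethylene coax (assumption)

def critical_length_m(sample_rate_hz: float) -> float:
    """Cable length beyond which 75 ohm matching starts to matter."""
    # An S/PDIF frame is 64 bits, so the bit clock is 64 x sample rate;
    # treat it as the highest fundamental frequency on the line.
    fundamental_hz = 64 * sample_rate_hz
    wavelength_m = C * VELOCITY_FACTOR / fundamental_hz
    return wavelength_m / 10

for fs in (44_100, 96_000, 192_000):
    print(f"{fs / 1000:g} kHz: matching matters beyond ~{critical_length_m(fs):.1f} m")
```

Under those assumptions, 44.1 kHz works out to roughly 7 m, consistent with "3 feet doesn't matter, 50 feet does"; at 192 kHz the margin shrinks to under 2 m.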

srb

Quote
I recently received information from a knowledgeable audio industry person. He suggested that an RCA interconnect can be utilized in place of a digital cable. The implication is that "digital cables" generally just "filter" the high frequencies to smooth out the high end.

I would have to dismiss the part of the statement referring to filtering.  The 75 ohm impedance that the S/PDIF protocol specifies should ideally be carried through the connectors as well.  BNC connectors have the proper inner conductor to outer shield distance to maintain this impedance, so a BNC to BNC 75 ohm cable comes closest to maintaining an impedance match.
 
RCA connectors don't have the necessary physical spacing to maintain true 75 ohm impedance, but there are a few available that use a coaxially secured shield connection, like an F connector, to get closer.
 
Analog connections are less sensitive to these small impedance variations so analog interconnects are available both with twisted pair construction (shielded or unshielded) as well as coaxial construction.
 
There is really no difference between one cable of a coaxially-constructed "analog" interconnect pair, a video cable, a subwoofer cable or an S/PDIF cable, presuming they are all constructed with the same 75 ohm coaxial wire.  The distinction is purely for marketing purposes.
 
Steve

Speedskater

  • Full Member
  • Posts: 2733
  • Kevin
The connector is much too short to matter!  This is with respect to the signal's wavelength: a 1 meter long co-ax doesn't act as an RF transmission line at these low frequencies.

randytsuch

Re: Using an RCA interconnect vs. a "digital" cable for a DAC
« Reply #6 on: 11 Jan 2013, 11:32 pm »
Quote
The connector is much too short to matter!

Not true.

ANY impedance mismatch causes a reflection; reflections are bad, they degrade the signal.  This holds regardless of the frequency of the signal.

Ideally you have 75 ohms all the way from the transmitter to the receiver, but unless you can DIY it, that is almost never true (AFAIK).
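The size of a reflection at a mismatch can be quantified with the standard reflection-coefficient formula, Γ = (Z_load − Z0) / (Z_load + Z0). A minimal sketch; the sub-75-ohm RCA impedance figures below are illustrative assumptions, not measurements from the thread:

```python
# Sketch of the reflection coefficient at an impedance discontinuity.
# Gamma = (Z_load - Z0) / (Z_load + Z0); |Gamma| is the fraction of the
# incident wave amplitude that bounces back toward the source.

def reflection_coefficient(z_load: float, z0: float = 75.0) -> float:
    """Voltage reflection coefficient for a Z0 line into a z_load termination."""
    return (z_load - z0) / (z_load + z0)

# RCA connectors are often well below 75 ohms; 50 and 30 are rough
# illustrative values, not measured figures.
for z in (75.0, 50.0, 30.0):
    g = reflection_coefficient(z)
    print(f"Z = {z:g} ohm -> {abs(g) * 100:.0f}% of the wave reflected")
```

A perfect 75 ohm termination reflects nothing; the further the connector strays from 75 ohms, the larger the reflected fraction.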

EDIT:
I remembered a thread a while ago on this subject
http://www.audiocircle.com/index.php?topic=90454.0

It has scope pics which show what happens when you use RCA connectors instead of BNC along with a lot of other information.
Pretty informative, but also very technical.


Randy

pansixt

Re: Using an RCA interconnect vs. a "digital" cable for a DAC
« Reply #7 on: 12 Jan 2013, 02:56 am »
Excerpt from another interesting read: http://www.m2tech.biz/knowledge.html#playback

Except for one WBT model (NextGen), no 75 ohm RCA connectors exist, so it's not possible to obtain a true 75 ohm matched connection with RCAs. BNCs, on the contrary, come with certified 75 ohm impedance, so a true matched connection is possible using BNCs. S/PDIF generally works up to 192kHz. Allowed distance is up to 5m.
S/PDIF actually descends from a professional connection standard called AES/EBU. AES/EBU requires a 110 ohm balanced line with XLR connectors and at least 2.0Vpp on a loaded line. As said before, the voltage measured on an open line will be twice the standard value. Many think that, since AES/EBU is mainly used in professional setups (studios, live stages), it always transports professional-mode streams. That's not a rule: much consumer equipment has AES/EBU ports on which consumer-mode data streams are sourced or received.
Another flavour of AES/EBU requires a 75 ohm BNC connector instead of a 110 ohm XLR. As said before, in the digital domain two channels are carried on a single wire. Nevertheless, AES/EBU also gives the opportunity to carry a single channel on one wire, thus using two wires to carry a stereo stream. This is generally referred to as AES3 dual-wire, or dual AES. This choice allows the data throughput on each wire to be halved, using, for example, two 96kHz connections in place of a 192kHz one. AES/EBU, like S/PDIF, generally works up to 192kHz. Thanks to the balanced line and the high output level, an AES/EBU output port can support long cable runs (several meters).
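The dual-wire arithmetic in that excerpt can be checked directly. A minimal sketch, assuming the standard AES3 frame of two 32-bit subframes and quoting rates before biphase-mark coding (an editorial example, not from the quoted page):

```python
# Sketch of the dual AES throughput arithmetic. An AES3 frame is
# 2 subframes x 32 bits = 64 bits; on a single wire the frame rate
# equals the sample rate, and dual AES halves the per-wire frame rate.

def line_rate_mbps(frame_rate_hz: float) -> float:
    """Data rate on one wire, in Mbit/s, before biphase-mark coding."""
    return frame_rate_hz * 64 / 1e6

single_wire = line_rate_mbps(192_000)  # stereo 192 kHz on one wire
dual_wire = line_rate_mbps(96_000)     # each dual-AES wire runs at a 96 kHz rate
print(f"single wire: {single_wire} Mbit/s, per dual-AES wire: {dual_wire} Mbit/s")
```

The per-wire rate drops from 12.288 to 6.144 Mbit/s, which is exactly the "two 96kHz connections in place of a 192kHz one" trade described above.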

James

JerryLove

Re: Using an RCA interconnect vs. a "digital" cable for a DAC
« Reply #8 on: 12 Jan 2013, 04:02 am »
Quote
ANY impedance mismatch causes a reflection, reflections are bad, they degrade the signal.  This is regardless of frequency of the signal

If the reflection makes the receiver incapable of identifying a 1 vs a 0 then it's important. If it does not then it is not important.

Your scope images focus on the wrong area. The correct test is to send a known binary pattern and see if it's correctly reproduced on the other side. If so, the output sound is as perfect as it is possible to make, and no change in the interconnect will improve it.

If there are digital errors, then we can look at what's going on electrically to try to understand *why*.
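That proposed test is easy to sketch in code. The capture step is the hard part in practice; here it is stood in for by an identity channel, so the setup below is an illustrative assumption rather than a real measurement rig:

```python
import random

# Sketch of a bit-error loopback test: send a known pattern, capture what
# the receiver decoded, and count the differing bits. The "received" data
# here is simulated; a real test would capture it on the DAC side.

def bit_errors(sent: bytes, received: bytes) -> int:
    """Count the number of bit positions that differ between two buffers."""
    assert len(sent) == len(received)
    return sum(bin(a ^ b).count("1") for a, b in zip(sent, received))

random.seed(0)
pattern = bytes(random.randrange(256) for _ in range(4096))
received = pattern  # a bit-perfect link returns the data unchanged
print("bit errors:", bit_errors(pattern, received))  # prints: bit errors: 0
```

Zero errors means the link is bit-perfect and, per the argument above, no cable change can improve the sound; a nonzero count is the cue to start looking at the electrical side.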