I've always been baffled by folks picking one HDMI cable over another. HDMI is a purely digital link: the signal is TMDS-encoded at the PHY layer, and HDCP encrypts the data when the content requires it, so a cable is either compliant with the spec (HDMI 1.0, 1.3, or 1.4) or it isn't. Even the audio travels alongside the video, packed into the horizontal/vertical blanking intervals with the same encoding and encryption as the video, and it's up to the receiver, processor, or TV to re-clock it and send it to a DAC.
At this point it's no different than picking Cat5, Cat5e, or Cat6 Ethernet cable to stream audio to your stereo or download this website: the data either arrives or it doesn't. Try running gigabit Ethernet over a plain Cat5 cable and it doesn't get subtly "worse"; the switch and network card simply negotiate down to 100-megabit Ethernet.
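That "arrives or it doesn't" behavior is the whole argument, and a toy sketch makes it concrete. Nothing below is real TMDS or HDCP; the functions, the CRC-style stand-in check, and the thresholds are all made up for illustration. A digital link with error detection rejects a corrupted frame outright, while an analog link always delivers *something*, just noisier:

```python
# Toy model (NOT real HDMI/Ethernet signaling): digital links fail
# all-or-nothing, analog links degrade gradually.
import random

random.seed(1)  # fixed seed so the demo is repeatable

def send_digital(bits, error_rate):
    """Flip each bit with probability error_rate.

    A stand-in for CRC/error detection: any corruption means the frame
    is rejected (None), never a "slightly degraded" frame.
    """
    received = [b ^ 1 if random.random() < error_rate else b for b in bits]
    return received if received == bits else None

def send_analog(samples, noise):
    """Analog: every sample arrives, but each picks up additive noise."""
    return [s + random.uniform(-noise, noise) for s in samples]

frame = [1, 0, 1, 1, 0, 0, 1, 0]
print(send_digital(frame, 0.0))  # clean cable: exact copy of the frame
print(send_digital(frame, 0.3))  # bad cable: frame rejected, prints None
print(send_analog([1.0, 0.0, 1.0], 0.1))  # analog: always arrives, just noisier
```

The point of the sketch: there's no knob a "better" digital cable can turn to make the ones and zeros more one-ish or zero-ish. Either the receiver recovers the bits or it doesn't.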
I'm not even sure HDMI -> DVI cables would matter much either; they're passive adapters carrying the same digital signal to DVI-D pins. (DVI-A is the analog variant, so a simple passive cable can't convert HDMI to it anyway.)
I've had two different dealers try to sell me $200 HDMI cables, claiming they'd look and sound better. All I can do is smile, nod, and ask for the $40 version. Even that is probably too high, but I do prefer a cable with decent build quality and connectors.