On USB Cables / DACs:
Yes, digital cables will transmit a bit-perfect digital signal that can be reconstructed DESPITE any noise/flaws/jitter, but the story does not end there. I will give you a simple example which you neglected to mention, and it is a valid data point (if it matters in even one case, it should not be dismissed outright). When a USB DAC is bus-powered, i.e. running off the 5V line that comes from the computer's motherboard, noise from the computer is carried over that 5V line in the USB cable, and a ground loop may also form if the USB cable is quite long. In recent years, motherboard makers have introduced dedicated "USB-DAC" ports with extra noise-filtering circuitry to remove the garbage created by the cheap switch-mode power supplies in computers. The D+ and D- pins of the USB cable form a differential pair, which rejects EMI, so the bit-stream can be reconstructed and interpreted properly by the DAC. The issue is that you have forgotten about how the device the DAC sits in was designed: some devices are susceptible to a noisy 5V line inducing noise on the analog signal path downstream of the DAC.
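To make that first point concrete, here is a minimal sketch (Python, with made-up logic levels and noise amplitude, nothing resembling a real USB PHY) of why the data itself survives a noisy link: the receiver only cares whether the voltage is above or below a decision threshold, so analog garbage smaller than the margin is simply discarded when the bits are reconstructed.

```python
import random

# Minimal sketch, NOT a real USB PHY: logic levels and noise amplitude are invented.
# The point: the receiver only decides "above or below the threshold?", so analog
# noise smaller than the margin never makes it into the recovered bits.

bits = [random.randint(0, 1) for _ in range(10_000)]

HIGH, LOW = 3.3, 0.0            # idealized logic levels (volts)
THRESHOLD = (HIGH + LOW) / 2    # receiver decision threshold

def transmit(bit):
    """Turn a bit into a voltage and add random analog garbage to it."""
    level = HIGH if bit else LOW
    return level + random.gauss(0, 0.3)

def receive(voltage):
    """Compare against the threshold; the noise itself is thrown away."""
    return 1 if voltage > THRESHOLD else 0

recovered = [receive(transmit(b)) for b in bits]
print("bit-perfect:", recovered == bits)    # True while the noise stays within the margin
```

The catch, as described above, is that the same garbage also rides into the box on the 5V and ground pins, where no threshold decision ever cleans it up.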
On Ethernet Cables (a copy of my response to someone):
This was going well until he got to the part about "electrically isolate your network cables and go with fiber optics". Clearly he thinks this would help because, in the audio world, optical cables usually serve well to break ground loops, but it is misguided advice HERE, specifically when it comes to networking. A normal (unshielded) network cable isn't actually connected to chassis ground in the first place. Assuming your Ethernet cable runs are functioning well enough that there is 0% packet loss, then as long as the analog electrical impulses reach the destination device's (i.e. the stream box / renderer's) network PHY transceiver chip (Layer 1 of the OSI model), proper digital packet decoding will take place in the MAC (Layer 2 of the OSI model). Any noise or garbage on the Ethernet cable will never propagate further than the PHY. Your audio is not going to sound better because you're breaking a cable run up with a stretch of optical fiber. Any "noise" in the cable would only serve to hinder digital packet decoding, and inconsequentially at that, due to automatic re-transmission by TCP and the balanced twisted-pair nature of the cabling. And if your network is not functioning properly and you did have digital audio packets being lost because of noise/interference, putting a gap of fiber optics AFTER the problem is not going to remedy it. Please note it's hard to explain this in 5 minutes, but I did my best.

To be more technical: any common-mode noise (e.g. from a ground loop) induced onto the balanced twisted pairs in a UTP cable is subtracted out by the transceiver (PHY) as part of differential signalling. Not only that, but since such noise is almost always 50/60 Hz AC hum, it sits far below the signal band of the cable (which is characterized up to hundreds of megahertz) and would not affect data transmission anyway.
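For the "to be more technical" part, here is a rough sketch (Python, invented voltages, not modelled on any actual PHY) of how common-mode rejection works on a balanced pair: the same hum lands on both wires, and the receiver only evaluates the difference between them, so it cancels.

```python
import random

# Rough sketch of differential (balanced-pair) signalling, the idea a UTP
# Ethernet PHY relies on. Voltages here are invented for illustration only.

def send_differential(bit):
    """Drive the pair with equal and opposite voltages: +V on one wire, -V on the other."""
    v = 1.0 if bit else -1.0
    return +v, -v

def add_common_mode_noise(pair, hum):
    """Ground-loop / mains hum couples onto BOTH wires of the twisted pair equally."""
    a, b = pair
    return a + hum, b + hum

def receive_differential(pair):
    """The receiver only looks at the DIFFERENCE, so identical noise on both wires cancels out."""
    a, b = pair
    return 1 if (a - b) > 0 else 0

bits = [random.randint(0, 1) for _ in range(1_000)]
recovered = [
    receive_differential(
        add_common_mode_noise(send_differential(b), random.uniform(-5.0, 5.0))
    )
    for b in bits
]
print("bit-perfect despite huge common-mode noise:", recovered == bits)   # True
```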
Part 2:
Millions of engineers out there can prove you wrong, but they have better things to do. The concept of provably reliable "bit-perfect" streaming is different for each of these transports. As long as you are getting packets from one end to the other without loss (which is easily verifiable thanks to the CRC and other built-in checks in the Ethernet spec), no high-quality Ethernet cable is going to give you better audio. The same can be said for USB/HDMI, but they are harder to prove and there are more ghosts in the machine, so to speak. Also, even for the things we do understand, like jitter over USB, each technology is fundamentally different: what applies to USB does not apply to Ethernet, and so on for HDMI and S/PDIF coaxial. So if you're going to buy a high-quality audio cable, save your money on audiophile-grade Ethernet, which is 100% nonsense, and put it toward high-quality USB, HDMI, or power cables instead.
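On the "easily verifiable" claim: every Ethernet frame carries a CRC-32 frame check sequence, and a corrupted frame is simply dropped (and retransmitted by TCP) rather than delivered with wrong bits. A quick illustration using Python's zlib.crc32, which uses the same CRC-32 polynomial as the Ethernet FCS; the payload here is just stand-in bytes, not a real frame:

```python
import zlib

# Quick illustration of the integrity check behind the Ethernet frame check
# sequence. Real NICs compute CRC-32 in hardware over the whole frame; here we
# just use Python's zlib.crc32 (same CRC-32 polynomial) over stand-in bytes.

payload = bytes(range(256)) * 4           # pretend this is a frame's worth of audio data
fcs = zlib.crc32(payload)                 # sender appends this value to the frame

corrupted = bytearray(payload)
corrupted[100] ^= 0x01                    # a single flipped bit, the kind noise might cause

print(zlib.crc32(payload) == fcs)              # True  -> frame accepted as-is
print(zlib.crc32(bytes(corrupted)) == fcs)     # False -> frame dropped, TCP retransmits
```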