The shops know this. High-end hifi is full of grift. It literally relies on people with more money than sense who can be persuaded to spend a fortune on components and extras that either make no difference to the sound or make it objectively less accurate.
Which is why you get nonsense like directional cables, cable supports to keep cables off the floor, cables with magic insulation, magic rocks that go on top of components, magic knobs, magic creams, "quantum" anything, grounding boxes full of pebbles that don't ground anything, and so on.
Also, most of the people who can afford this hobby are rich old boomers who often can't hear much above 7k on a good day.
I had a friend in college who worked part-time at a shop that sold and installed audiophile gear. He was a music engineering major, so he was aware of how much of what they sold was snake oil. Example: a 3-foot RCA cable that sold for several thousand dollars. The jacks were gold-plated. Not just with ordinary gold, but "special" gold. From a particular mine in South Africa. He told me once "If I ever sell that cable, I don't know how I'll live with myself."
One time he and another guy had just completed an installation in a very wealthy customer's home. Hundreds of thousands of dollars on an ultra-high-end amplifier, CD player, speakers, etc. He started playing one of the customer's CDs on the system. The customer went on and on, lavishing praise on how "warm" and "crisp" his music sounded on the new system. Suddenly my friend's co-worker leaned over and whispered, "Do you hear that?..." My friend knew exactly what he was referring to: they had accidentally wired the speakers out of phase.
My favourite variant of this is high-end digital audio cables.[1] Presumably the ones are really spiky and the zeros are really round when you use such a cable.
[1] Yes, this really does exist as a thing.
If USB-C has taught me anything, it's that higher signal integrity in more expensive cables really is a thing. I have USB-C devices which are so temperamental (Razer Kiyo Pro) that they won't run stable without actual Thunderbolt cables. Even high-end Fasgear 10 Gbps/100 W cables don't have good enough signal integrity; the camera will eventually freeze and crash. The extra Thunderbolt shielding/better termination really does make a difference even if the pairs are not being used.
HDMI and DP cables also have a range of shielding and termination quality, and pushing more data through them will show the difference, especially at longer lengths that are at the limit of the spec. Everything looks good when you are only doing 1080p30; try doing 2160p120 on a TV or UWQHD at 160 Hz on a PC, with a max-length (2 m) cable, and tell me every cable is just as good. Not even every DisplayPort 1.4 cable or HDMI 2.1 cable is "as good", let alone a rando lower-tier cable with the older shielding.
Oculink folks know damn well that not every cable is just as good either. Copper oculink cables are a fucking mess.
I’m not saying Monster wasn’t ripping you off for some gold-plated connectors, especially back in 2000 when HDMI started with 1.x in the name. But the reality is that, if nothing else, we have manifested the world where digital cable quality matters. And really, the “hurr durr digital signals” thing has always been a bit of an oversimplification, even as someone who’s said it before too.
There isn’t a correlation with price at the high end; going from a $30 cable to a $150 cable doesn’t get you anything. But going from a $5 cable to a $30 cable certainly does get you a better-shielded cable and better termination, if nothing else, and yes, it makes a difference when you are actually pushing the spec to the limit. You will definitely find out that not even all 10 Gbps USB-C or HDMI 2.1 UHBR cables are created equal, let alone ye olde HDMI 1.3 bullshit cable.
Digital audio, no matter how high quality and how many channels, uses vastly less bandwidth than even the lowest quality video signal these days. At absolute best, lossless audio with 16 channels is still less total bandwidth than even USB2.0 speeds. It’s just not fast.
Although signal integrity matters, degradation in digital signals is almost never silent; more often it's extremely obvious. So if your cable has problems, you know almost immediately (or at least the moment it happens). Unlike analog audio, it either works or it doesn't. In the marginal case you don't just get worse-sounding audio; you periodically get no audio at all, which is super obvious to the ear.
Just to support your point, there are many professional digital audio products used for recording that quite literally run on USB 2.0. For instance, this well respected interface that supports 32 channels in and out at up to 192 kHz: https://www.sweetwater.com/store/detail/DigifaceUSB--rme-dig...
Why so aggressive? I didn’t make any claims other than less than USB2.
And for reference, I was assuming 384 kHz, 32 bit depth, which would be 196 Mbps at 16 channels. You cannot even do 32 channels over USB2 which would be the next logical step.
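The arithmetic behind those figures is simple enough to sketch. This is a minimal calculation for uncompressed PCM; the USB 2.0 payload figure in the comment is an approximation, since the 480 Mbps line rate only yields somewhere around 280-320 Mbps of usable bulk throughput in practice:

```python
def pcm_bitrate_mbps(sample_rate_hz: int, bit_depth: int, channels: int) -> float:
    """Raw bitrate of an uncompressed PCM audio stream, in megabits/second."""
    return sample_rate_hz * bit_depth * channels / 1e6

# 384 kHz, 32-bit, 16 channels: the ~196 Mbps figure quoted above.
print(pcm_bitrate_mbps(384_000, 32, 16))   # 196.608 Mbps

# Doubling to 32 channels would exceed USB 2.0's practical payload.
print(pcm_bitrate_mbps(384_000, 32, 32))   # 393.216 Mbps
```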
Sorry you got offended, that wasn't my intention. Pointing out inaccuracies is not aggressive, and in this case, I was reinforcing your claim, by saying USB2 is not only capable of tens of audio channels, but hundreds.
I was just adding context, and I'm happy you provided even more. 32/384 is indeed at the extreme top of digital audio signals, beyond what many people consider to be useful. Even in such cases, USB2 can still do the job, but most modern devices will use USB3 or Thunderbolt, simply to avoid issues and ensure enough headroom on the digital bus.
some inexpensive SDR dongles sidestep the question of getting a really good high-bandwidth ADC (still not cheap even today!) by just hooking up an intermediate frequency stage which downmixes RF to AF, and then you just hook it to one of the innumerable sound cards on the market to pull it into the PC for processing etc.
Wider bandwidth is wider bandwidth: the faster your sample rate, the wider the range that you can sample. IIRC it's the Nyquist rate, so with a 384 kHz sample rate you can capture a 192 kHz window.
Are you really taking a discussion about audio and making some sort of niche invented case of making a sound card be the analog to digital conversion from a software defined radio?
Even in that case 32 bit sampling is nonsense, the resolution of detecting voltages is never going to be accurate enough to need 4 billion different levels.
I'm actually quite serious, of the people who are buying really hi-fi audio inputs, it's not unlikely a large number of them are quite interested in the actual bandwidth of the input. They're using that sampling headroom for something else (oversampling).
Regulation and stabilization is a whole ""niche"" area of electronics called "voltage regulation" and yes, you can improve it a lot (and you can only do your best regardless of instability of the base). Actually a lot of audio engineers would be feeding it clean (linear regulated) power in this case as well. My Astron linear supplies are quite clean for radio and would be excellent for audio too.
USB optoisolator + linear regulated supply on a hub on the other side is something I've seriously considered for a nice SDR setup even short of that.
Again, of the people who buy this stuff, yes you will find a lot of people paying attention to the details and buying/building esoteric setups. Hams exist and the hobby almost inherently involves opening up the checkbook lol. Even decent audio gear is generally going to be $500-1000 most of the time.
While I agree with you, you can find 384 kHz 32-bit playback available on many LG cell phones as a marketing feature. Does it matter? No. Does it exist, even in consumer electronics? Yes.
High speed digital signaling over differential pairs requires precise impedance control to handle multi-gigahertz data rates. You're paying for that in a high quality USB3 cable. Audio signals running below 20 kHz are in a completely different electrical regime that doesn't require special manufacturing attention. It's easily doable with 19th century technology.
Right, but in the context of an audio stream, it's not going to matter these days.
Even a hugely overkill uncompressed 8-channel, 32 bit, 192kHz sample rate audio stream[1] is under 50 megabits/second.
Any digital cable and transceiver from USB 2 (480Mbps) or 100BASE-T Ethernet (100Mbps) onwards will handily do that, with headroom for so much forward error correction that you probably won't expect a single bit error in your lifetime.
[1]: good luck getting a clock with low enough sample aperture jitter to actually record that in the first place, it's probably under a femtosecond.
If the sample clock edges aren't very (very very very) regular, on a sample-and-hold ADC, the waveform isn't sampled evenly and that manifests as noise that swamps the detail provided by the higher bit depth.
This is called "sample aperture jitter". Requirements scale linearly with frequency and exponentially with bit depth.
Sure enough, for 32 bits sampling a 96 kHz signal (the Nyquist frequency of a 192 kHz sampling rate), the requirement is about 0.3 fs. At 24 bits, it's more like 100 fs, which is much more doable, but still not easy.
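Those numbers come out of a common rule of thumb for aperture jitter, t_j < 1 / (2π · f · 2^N). Conventions differ by a small constant factor, so take the outputs as order-of-magnitude sketches rather than hard limits:

```python
import math

def max_aperture_jitter_s(f_signal_hz: float, bits: int) -> float:
    """Rule-of-thumb maximum RMS aperture jitter (seconds) before
    sampling a full-scale sine at f_signal_hz loses the LSB of an
    N-bit converter: t_j < 1 / (2 * pi * f * 2**N)."""
    return 1.0 / (2 * math.pi * f_signal_hz * 2 ** bits)

f = 96_000  # Nyquist frequency of a 192 kHz sample rate
print(f"32-bit: {max_aperture_jitter_s(f, 32) * 1e15:.2f} fs")  # ~0.39 fs
print(f"24-bit: {max_aperture_jitter_s(f, 24) * 1e15:.1f} fs")  # ~99 fs
```

Each extra bit halves the jitter budget, which is why the requirement scales linearly with frequency but exponentially with bit depth, as noted above.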
Which is why audio bit depths usually don't go to 32 bits, despite formats like FLAC supporting that.
The practical upshot of this and other noise sources is that higher audiophile-grade bit depths and sampling frequencies are quite likely to have at least some of those bits swamped out by noise on real hardware.
This is just getting the audio recorded. Playing it back as physical sound waves adds something between quite a bit and radically more noise to the signal, even if there's never any lossy compression.
It seems like you are arguing with someone that 32 bits per sample is too much resolution, and I agree, but I'm not sure who you are arguing with or who is saying that.
It was an interesting (to me) footnote to the point that even if you massively overspec your audio stream to the point of physically being unable to record the audio at that quality (the footnote being why it's unfeasible), you can still easily fit many such streams down a single modern-ish digital link.
The point isn't that you shouldn't record audio at 32-bit depth (which you probably shouldn't if you expect it to bring much benefit, but that's by-the-by), it's that even if you did, and you have a 7.1 system with 8 uncompressed streams, you still won't be anywhere near the point where USB 3 cable grades will start to matter.
You're the one who asked for clarification on the footnote specifically.
> Dynamics are better and overall naturalness is improved. Here is a test for all you Silver Rock owners. Try removing the bakelite knobs and listen. You will be shocked by this! The signature knobs will have an even greater effect…really amazing! The point here is the micro vibrations created by the volume pots and knobs find their way into the delicate signal path and cause degradation (Bad vibrations equal bad sound). With the signature knobs micro vibrations from the C37 concept of wood, bronze and the lacquer itself compensate for the volume pots and provide (Good Vibrations) our ear/brain combination like to hear…way better sound!!
I get nerding out on stuff but I don't get doing it at the age of the guy in the story. I thought as one got older one started to see the foolishness in this sort of excess. You can't take it with you. And in the end you see what became of it. Split up and sold off to strangers for a fraction of what it cost.
I had a similar experience, though the other way around.
I was listening to some speakers at a relatively high-end store (they sold $10k CD player power supplies, along with a selection of more reasonable gear), where they'd only have one set of speakers in the room at a time to prevent passive effects on the sound (which is sort of reasonable, but a lot of work). So they bring in the Magneplanars and the (incredibly inappropriate) amp I was interested in, wire them up, and within 5 seconds I know they're out of phase, and then in another 30 seconds, pretty strong clipping.
Sales person was going on about how good they sounded.
Yup. When I used to do studio recordings as a musician, most studios have A/B/C speakers for mixing. They have the nice speakers usually on concrete breeze blocks which is what you do your first mix on, and those are equivalent to entry-level-ish audiophile speakers but not super-expensive, then they have the B speakers which are a bit worse and then they have the depressing C speakers which represent what people usually have at home. Even super-expensive studios that will happily drop 10s of thousands on a valve preamp or Neumann mic don't use any kind of fancy cable or unobtanium plugs or anything. Just regular hard-wearing well-shielded audio/speaker/whatever[1] cables depending on application.
[1] There are some differences based on whether the cable is going to carry just audio or audio+phantom power and also whether it's line level or speaker level or whatever it's called. It's been a while.
When I was a kid, the whine of CRTs was genuinely annoying, and those 'keep pests away' things hurt when they went off. I was looking forward to not having to experience that as I got older.
Now that I'm middle-aged, I can tell there's a little loss around the vocal frequencies... but damn if those old tubes and pest devices aren't as annoying as ever. Ears are weird.
I had a magic skill when I was in high school: I could walk into any computer lab and tell you exactly which monitors were on (when the computer was off). I believe the whine is around 15 kHz.
It looks like my hearing drops off between 13 kHz (noticeably quiet compared to 12 kHz) and 14 kHz (can't hear it at all).
Hah, this reminded me of the many times as a kid I'd walk into the living room with a few people around and turn off the TV (CRT, of course) that was just showing a black screen. It was so loud and obvious to me. Now all the TVs seem to have bizarre screensavers.
I also seem to have beyond normal hearing, and can hear some electrical devices or high pitched sounds where others don't seem to.
Last year I was in a mall where they had a Tesla store and it was emitting a horrible high pitched sound.
Where I live some houses have Seagull scaring devices and they also emit a horrible high pitched squeal which I'm guessing others can't hear, otherwise it would honestly drive you nuts.
Not so sure about that; mixing engineers have to do very precise manipulation of sound, and there's plenty of them working well into their 40s and beyond. While aspects of your hearing of course deteriorate, it should still be pretty simple to pick out the artefacts of compressed audio if you've gotten attuned to them at a younger age. At least for heavier compression; something like a 320 kbps MP3 or 256 AAC is hard to distinguish from lossless, but that's also true for young folks.
I had previously toyed with the idea of marketing "organic" audiophile speaker cables. The ad copy would have been that the cables were harvested from veins of copper dragons that lived in faraway mountains. That the sound would be warmer and richer than ordinary, industrially produced artificial speaker cables, which tend to sound harsh.
It's bad enough that some audiophiles make it a point to not talk about being an audiophile because they don't want to be the associated with that foolishness.
Ah yes, DirectStream Digital. I think the math behind that was solid but there wasn't really any point because PCM was totally sufficient. The company I was working for made AD and DA converters for recording studios and film so we were adjacent to all the insane HiFi stuff but didn't really interact with it. We did get bug reports that were real but effectively not visible, I learned a lot about debugging from watching my boss work on those.
I remember being in an electronics store once and seeing some cables intended for audio that described themselves as being "low oxygen" cables. My god, the nonsense that people fall for!
Out of all the ridiculous things, this one is actually not completely ridiculous. For low temperature physics applications, Oxygen-Free High Conductivity (OFHC) Copper is the only material that is sufficiently thermally conductive at low temperatures. I’m not an audio person, but I have a hard time imagining it would make much of a difference at room temperature; maybe similar performance with a slightly thinner cable?
> High-end hifi is full of grift. It literally relies on people with more money than sense who can be persuaded to spend a fortune on components and extras that either make no difference to the sound or make it objectively less accurate.
Replace "hifi" with any other product and the sentiment remains the same. High-end houses, high-end cars, high-end clothes, high-end food, high-end technology... the ostensible quality of the product is entirely divorced from the price, because the actual product isn't the speaker or the handbag or the smartphone or etc; the real product is social signaling via conspicuous consumption.
To be fair, with most of those, you're paying for a piece of art to some extent. I don't think anyone buying $500 cables thinks the cables themselves are worth that, they're buying them because they expect some tiny increase in sound quality which measurably and provably doesn't exist. Someone buying a $10k sweater isn't buying it for the engineering in the material, but for the perceived artistic value. Of course, there's also a grift involved here, where fashion brands can pass things off as having great artistic value just because they're associated with the brand. But at least the perceived value is subjective and not coming from something that can be measured not to exist.