Lots of my plugin users are resorting to massive oversampling, even when it's not appropriate, simply because in their experience so many of the plugins they use are substantially better when oversampled. 192K is an extremely low rate for oversampling.



I wonder how you measure the quality difference at higher rates: 384K, 512K and beyond? I hear from audiophiles that there is a very distinct difference, but there is absolutely no scientific basis for it.


Not so: this oversampling is mostly about generating distortion products without aliasing, so in the context I mean, the difference is obvious. But it's a tradeoff: purer behavior under distortion versus a dead quality that comes from extreme over-processing. I've never seen an audiophile talk about 384K+ sampling rates unless they mean DSD, and there it's about pushing the format's known unstable HF behavior up and out of the audible range.

Oversampling in studio recording is mostly about eliminating aliasing in software that's producing distortion, and it's only relevant in that context: I don't think it's nearly so relevant on, say, an EQ.
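
As a rough sketch of that point (my own illustration, not any particular plugin's design, assuming a generic tanh waveshaper, an arbitrary 8x factor, and numpy/scipy): run the nonlinearity at a higher internal rate, then band-limit and decimate back down, and the harmonics that would otherwise fold back as aliases get filtered out instead.

    # Minimal sketch: oversample around a nonlinearity so its harmonics
    # don't alias. The tanh shaper and 8x factor are arbitrary choices
    # for illustration, not taken from any specific plugin.
    import numpy as np
    from scipy.signal import resample_poly

    FS = 48_000   # base sample rate
    OS = 8        # oversampling factor (8 x 48 kHz = 384 kHz internally)

    def distort_naive(x):
        # Nonlinearity at the base rate: harmonics above FS/2 fold back as aliases.
        return np.tanh(4.0 * x)

    def distort_oversampled(x):
        up = resample_poly(x, OS, 1)         # upsample (anti-imaging lowpass built in)
        shaped = np.tanh(4.0 * up)           # distort where Nyquist is 8x higher
        return resample_poly(shaped, 1, OS)  # band-limit and decimate back

    if __name__ == "__main__":
        t = np.arange(FS) / FS
        sine = 0.9 * np.sin(2 * np.pi * 5_000 * t)  # 5 kHz test tone, 1 second

        def level_at(y, freq, width=5):
            # Peak spectral level near `freq`, in dB relative to the strongest bin.
            spec = np.abs(np.fft.rfft(y * np.hanning(len(y))))
            k = int(round(freq * len(y) / FS))
            return 20 * np.log10(spec[k - width:k + width].max() / spec.max())

        # The 7th harmonic of 5 kHz (35 kHz) folds down to 13 kHz at a 48 kHz rate.
        for name, y in [("naive", distort_naive(sine)),
                        ("oversampled", distort_oversampled(sine))]:
            print(f"{name}: aliased 13 kHz component at {level_at(y, 13_000):.1f} dB")

Running it, the naive version should show a spurious tone at 13 kHz (the folded 7th harmonic of the 5 kHz input), while the oversampled version should push that component down toward the decimation filter's stopband floor.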


That could simply be that some of these plugins are written to work correctly only at 192 kHz; i.e., it's a matter of matching the format they expect.



