
Er... what? Can you explain that a bit? I think I must be misunderstanding.

What I think of as undershooting or overshooting is relative to the range... and besides that, what is wrong with clamping? It's how computer graphics has always had to deal with these things... limited range simply doesn't exist in that context, and it doesn't harm anything.

When computer games are forced into limited range on consoles, you don't get these problems unless your TV is applying one of those god-awful filters that ruin everything anyway... (I'm still not sure why so many TVs have these; reference monitors never do anything that insane.) But I can tell you what you do get: a subjectively /and/ measurably worse image than from a monitor.

(I don't think I'm alone in this, based on the contents of ITU-R BT.2100 either... which defines a full range as well as a 'narrow' one.)




> What I think of as undershooting or overshooting is relative to the range... and besides that, what is wrong with clamping? It's how computer graphics has always had to deal with these things... limited range simply doesn't exist in that context, and it doesn't harm anything.

As far as I understand it, limited range was historically used so you could use efficient fixed-function integer math for your processing filters without needing to worry about overflow or underflow inside the processing chain. You can't just “clamp back” a signal after an overflow happens.
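
To make that concrete, here's a toy sketch (the filter and the numbers are mine, not from any particular standard's processing chain): a naive sharpen run in plain 8-bit integer math, the way a fixed-function pipeline without guard bits would. In full range, the overshoot at a bright edge overflows and wraps, and clamping the wrapped value afterwards just bakes in the garbage; in limited range (16-235), roughly the same edge overshoots into the headroom above 235, so the clamp at the end of the chain still recovers the intended value.

    # Toy illustration (assumed values, not from any spec): a naive sharpen
    # filter evaluated in raw 8-bit integer arithmetic.
    def sharpen_8bit(pixels):
        out = []
        for i, x in enumerate(pixels):
            left = pixels[max(i - 1, 0)]
            right = pixels[min(i + 1, len(pixels) - 1)]
            val = 2 * x - (left + right) // 2   # overshoots at edges
            out.append(val % 256)               # 8-bit overflow wraps around
        return out

    def clamp(v, lo, hi):
        return min(max(v, lo), hi)

    # Full range: a white edge (230 -> 255) overshoots to 268, which wraps
    # to 12; clamping afterwards leaves a near-black speck in a white area.
    print([clamp(v, 0, 255) for v in sharpen_8bit([230, 230, 255, 255])])
    # -> [230, 218, 12, 255]

    # Limited range: roughly the same edge in 16-235 levels (214 -> 235)
    # only overshoots to 246, which still fits in 8 bits, so the final
    # clamp recovers the intended white.
    print([clamp(v, 16, 235) for v in sharpen_8bit([214, 214, 235, 235])])
    # -> [214, 204, 235, 235]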

Of course, this is pretty much irrelevant in 2016, when floating-point processing is the norm and TVs come with their own operating systems. These days limited range exists mainly for backwards compatibility with existing equipment - a property video standards have tried to preserve as much as possible since the early days of television.



