> Why are televisions interesting? PS3 and Xbox graphics are so many years behind PC now...
They can catch up as newer fab processes make their way into consoles, or into some sort of gaming set-top box that works more like a PC. I believe it's a fairly common assumption that PCs are going niche.
> Have you seen one of the new iPad screens? It's hard to believe that that kind of DPI on computer monitors isn't going to have huge adoption.
I haven't. I'm somewhat into photography (see profile) and would love to see some of my photos displayed at that DPI on an iPad held about 20-30 cm from my face. That resolution should really add a sense of depth to full-screen photos, and maybe some day we'll no longer need prints to fully appreciate them. I know I'll drool in awe when I see one; it happened with the iPhone 4 :P
But... remember that resolution is all about distance: the further away something is, the smaller its pixels appear. According to Apple, "retina display" works out to about 57 arcseconds per pixel, which is (I believe) an iPad 3 at 30 cm. WARNING: I didn't find an easy calculator for these numbers, so please correct me if I'm wrong. A 46-inch 1080p TV at 2.5 m should present you with pixels smaller than that, around 45 arcseconds. A 27-inch 2560x1440 monitor, like the iMac display or the Dell U2711, at 60 cm has a pixel size of around 75 arcseconds. So we're almost there for the best desktop displays, and definitely already there for big TVs at average viewing distances. Since I imagine graphics-intensive gaming will take place mostly on TVs driven by something more like a console than a PC, I don't see a bright future for NVIDIA, but I do see most gaming ending up on Intel SoCs.
Also, to round out my numbers: a 23-inch 1080p monitor at 60 cm should be more in the 85-arcsecond territory... so maybe there's still some leeway for graphics cards, but they'll definitely hit a roadblock soon.
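For anyone who wants to check my arithmetic, here's a rough back-of-the-envelope script for the angular pixel size (the display model and a small-angle approximation are mine, so treat both my numbers above and these as estimates; they come out slightly different, which is exactly why I asked for corrections):

```python
import math

def arcsec_per_pixel(diagonal_in, res_w, res_h, distance_mm):
    """Approximate angular size of one pixel, in arcseconds, for a
    flat display viewed head-on at distance_mm (small-angle approx.)."""
    diag_px = math.hypot(res_w, res_h)             # diagonal in pixels
    pitch_mm = diagonal_in * 25.4 / diag_px        # physical pixel pitch
    angle_rad = pitch_mm / distance_mm             # small-angle approximation
    return math.degrees(angle_rad) * 3600

# 46" 1080p TV at 2.5 m
print(round(arcsec_per_pixel(46, 1920, 1080, 2500)))  # ~44
# 27" 2560x1440 monitor at 60 cm
print(round(arcsec_per_pixel(27, 2560, 1440, 600)))   # ~80
# 23" 1080p monitor at 60 cm
print(round(arcsec_per_pixel(23, 1920, 1080, 600)))   # ~91
```

(Assumes square pixels and measures along the diagonal; for these viewing distances the small-angle approximation is accurate to well under a percent.)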