Not really an accurate statement. They're basing this entirely on theoretical triangle throughput, without comparing pixel-shading power, memory bandwidth, CPU performance (those SPUs on the PS3 actually do get used), etc...
"Given the amount of hardware on this new chip, it's complete overkill for a smartphone. "
No it's not! I can't wait to have a killer smartphone with HD graphics on a high-resolution 5" screen. That would really rock. Smartphones this powerful could replace laptops, as long as they had monitor-out and USB ports.
I could see SmartPhones in 5 years being incredibly flexible with multiple personalities. When you're using it on the go it's an iOS/Android style touch OS. When you dock it to your keyboard/mouse you get a fairly standard Linux desktop. When you dock it to your TV you get something that looks like the Xbox Dashboard. Unfortunately I think this will be limited by battery life. If we can maintain a full day of normal usage that's probably acceptable but realistically 3-4 days is what most people need. I wish my iPhone had the same battery life as my iPad because at that point charging is just an afterthought. You don't have to worry about where the next charge is coming from because chances are you'll be back to your home/work/hotel/etc to charge within 3-4 days.
Most people are not camping all the time. Plug it in before you go to sleep; just as routine as brushing your teeth. (Although this will not work for blackout drinkers.)
The information is pretty vague, but seems to hint that the power draw is too high for a smartphone.
That said, apparently someone has already hacked Symbian to interact with the Wiimote, so the current smartphone hardware is already capable of supporting Bluetooth controllers. If you had S-video out to plug into a TV (or if you supported one of the newfangled wireless video protocols) you could have a gaming console that folds away into your pocket. Oh, the possibilities. :-D
I hate to reply to myself, but here's a product idea:
A Sheeva-plug form-factor device with video output, and Bluetooth support, sold with a set of Bluetooth controllers and a sleek carrying bag with pockets for the cables and controllers. You can carry it to your friend's house in one hand, plug it in to his/her TV set, and be gaming in under two minutes.
Package it with a bunch of casual multiplayer games, and whoever owns one will be invited to every party ever. ^_^ You could even sell it to those little businesses that operate children's birthday parties.
Ahh, the old "triangles as an absolute measurement of power" fallacy. Shame on you Ars, shame shame.
Still, an amusing headline given that among developers the RSX (the PS3 GPU) is widely acknowledged to be a bit lacking. One of the big challenges of PS3 development these days is figuring out how to move basically everything onto SPUs and leave the RSX as a basic rasterizer.
"OpenCL was initially developed by Apple Inc., which holds trademark rights, and refined into an initial proposal in collaboration with technical teams at AMD, IBM, Intel, and Nvidia. Apple submitted this initial proposal to the Khronos Group. On June 16, 2008 the Khronos Compute Working Group was formed with representatives from CPU, GPU, embedded-processor, and software companies. This group worked for five months to finish the technical details of the specification for OpenCL 1.0 by November 18, 2008. This technical specification was reviewed by the Khronos members and approved for public release on December 8, 2008"
Many game companies already have proprietary libraries to manage the SPUs. It's still a struggle to make PS3 games look as good as the 360. In fact, I know there are at least a couple of games where the 360 version is crippled to match the lesser quality the developers could wring out of the PS3.
This micro is not for smartphones; Marvell is not in that business.
Currently Marvell makes the best micros for server-class machines based on the ARM (XScale) architecture; look at the Plug Computer (http://www.marvell.com/platforms/plug_computer/). This latest micro, with three cores, is in the same family (Armada) as the ones used in the Plug Computer and in the OpenRD reference platform (http://www.open-rd.org/), and none of these chips are in any smartphone.
Not for lack of computing power, but because they target a different market (and carry different on-chip peripherals). Marvell's Armada SoCs have Gigabit Ethernet and a SATA interface on board... not exactly something you need on a smartphone (at least not on mine).
Still... Marvell is not in that business, but neither was MOS in the personal computer business when they launched the 6502 ;-)
Accidents happen and the wildest things become real. Besides that, this level of performance will be quite mundane two years from now. In a decade, low-end smartphones will have this kind of capability. Not because they need it, but because nobody makes a simpler part anymore.
Sure, but when the Motorola engineers left to build the 6502 there was no personal computer market. The 6502 (its low price) created it. At $25 it was a bargain against the Intel 8080, which cost about US$150: http://www.commodore.ca/history/company/mos/mos_technology.h...).
What about smartphones? They're already here and they're evolving.
Is this new chip something new and revolutionary? NO
Is it an innovation? YES
"In a decade, low-end smartphones will have this kind of capability. Not because they need it, but because nobody makes a simpler part anymore."
I wouldn't put my money on this. In electronics, price is king, and at large volumes every dime matters. For example, I wouldn't be surprised to find some sort of 6502 in my kid's toys.
Phones are now ARM territory, and the low-cost ones will probably stay on the ARM7TDMI for a long, long time (maybe in some speedier versions, around 300-500MHz).
BTW thanks for the nice "saturday" conversation. I love HN
Have a look at the "Epic Citadel" tech demo on an iPhone 3GS or 4. We're almost there already, subjectively of course (mobile displays run at lower resolution). The future is now, and it really is pretty amazing.
I like the "2.5 core" design. The OS can switch off the fast cores if the load goes below a certain level and resurrect them when needed.
IIRC the Linux kernel has a memory and processor hotplug facility that can be used to dynamically "plug" and "unplug" cores as load conditions change. Was it limited to SPARC?
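For what it's worth, the hotplug interface isn't SPARC-specific; modern Linux kernels expose it through sysfs on x86 and ARM as well, assuming the kernel was built with CONFIG_HOTPLUG_CPU. A quick sketch (the offline/online writes are shown as comments since they need root):

```shell
# Linux CPU hotplug via sysfs; not limited to SPARC (x86 and ARM
# support it too when CONFIG_HOTPLUG_CPU is enabled).

# Show which logical CPUs are currently online (prints e.g. "0-3"):
cat /sys/devices/system/cpu/online

# Taking a core down and bringing it back needs root, and cpu0
# usually cannot be offlined:
#   echo 0 > /sys/devices/system/cpu/cpu1/online
#   echo 1 > /sys/devices/system/cpu/cpu1/online
```

A power-aware governor could do exactly what the "2.5 core" idea suggests: park the fast cores through this interface when load drops and wake them again when it spikes.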
Even if it does, you would need five years to learn how to use it to the full, if that is even allowed and not limited by what the underlying OpenGL ES 2.0 exposes.
Pictures would be pointless, too: you can't tell from a picture how long it took to draw, or whether that drawing speed can be sustained.