Nvidia officially unveils next-generation Tegra 4 SoC (arstechnica.com)
91 points by Garbage on Jan 7, 2013 | 63 comments



That was effortlessly the worst keynote I have ever sat through, with the added bonus of a grotesque "gamer babe" interlude that saw a married 49-year-old CEO pretending to hit on a model at a tiki bar. Rancid 1993 puke.

All that said, their Shield handheld console looks pretty spectacular, if vaporous.

5" ~retinesque~ touchscreen clamshelled onto a full-size game controller with all the bells and whistles. Outrageously powerful, has HDMI, USB, headphones, and holds a 5-10 hour charge. Runs pure Android with full Google Play access and a specially curated library built for the console itself. Can also wirelessly stream games off of a PC, including from Steam Big Picture. Cloud storage of game state comes free.

Not announced: price, storage, availability, who's making it, sales channels, or anything. So whatever. But this -is- the Sony PSP slayer, and it is another sign that all roads lead to Android+OpenGL ES on ARM as the future default game platform.


Haha, I agree that the keynote was at times extremely awkward and hard to sit through, but I have to say I enjoyed how casual it was. Compared to something like an Apple presentation, where it seems like there's 0 room for error or for tangents, this felt more "real".


It was thoroughly amateur. The host/CEO was leaden, the pacing was all over the place, anything involving women was grotesque, it was full of meaningless and half-hearted digressions, nothing shown was placed into an industry context, and their livestream offered an unskippable chat box that was mostly "U FAGGOTS" and "BRING THAT GIRL BACK".

Demo example: "Here are the guys who made the game Hawken." Who? What company? What is Hawken? What does this game mean for the industry? How is it showing off your product in the best light? What does this mean for developers?

No, none of that gets answered. Two cameras drill in through shitty sightlines at three anonymous men playing a game in silence while Jen-Hsun Huang asks things like "where are you guys at right now?" and "so what are you doing?"


Available Q2 it seems. Also more details on the SoC from AnandTech: four Cortex-A15 cores on 28nm HPL (28nm low power with high-k + metal gates) running at up to 1.9 GHz, plus a fifth A15 companion core for low-power work.

http://www.anandtech.com/show/6550/more-details-on-nvidias-t...


While a somewhat valiant effort from a specs/control perspective, I think this is going to be a pretty big flop. Primarily because it doesn't have much of a market, which means that they won't be able to overcome the chicken-and-egg issue.

There's a reason the main gaming console manufacturers also have large in-house development studios. Every new gaming device needs a killer app to launch it past the critical mass where it becomes economically beneficial for outside (3rd-party) developers to also release software for said platform.

nVidia doesn't have game studios (that they've announced, anyway), so they're going to be relying on the broader pool of Android game developers. However, those developers are going to be targeting the hundreds of millions of Android phones and tablets, not an unknown number of Shield owners.

So if very few developers invest the development resources to make games that take advantage of the Shield, how many gamers are going to want to spend the kind of money this will cost when there are MANY other great alternatives? (E3 this year will see the announcement of the PS4 and the next Xbox.)

If the hardcore gamers aren't going to be purchasing this system, that leaves the so-called casual gamers. However, by definition this thing is targeting hardcore gamers, so it doesn't have much appeal to casual gamers at all. That means there's a very narrow market segment that is seriously interested in this.


Android gaming is going to boom this year I think. The market is primed for mobile gaming to take off into more graphically intensive game territory, and the industry has been getting sized up for years.

And this is the Android gaming device: a controller that doesn't require a USB port or finicky Wi-Fi/Bluetooth. That is its killer feature.


As somebody who is really interested in retrogaming (and loves his GP2X Wiz like a beloved pet), I'm definitely in the market for this. I've found that as I'm getting a little older, smaller handhelds are getting harder and harder for me to enjoy. This thing looks perfect, and I already know there's a pretty robust emulator ecosystem for Android.


Nvidia has good relationships with a lot of game developers, and they've created the Tegra Zone store, for which developers optimize their games (well, for the Tegra chips). As long as they keep optimizing them for Tegra 3 and Tegra 4, the Shield should have plenty of games - certainly more than any new console, which usually has something like 15 games at launch. Not to mention these games are a lot cheaper than a typical console game - literally an order of magnitude cheaper.


I think you read a different article than I did. The Ars Technica article doesn't mention the Shield at all, just the Tegra 4.

Anyway, it sounds like NVidia is putting out a device to show people what the Tegra 4 can do. They have to build prototype devices anyway; they might as well sell a few of them while they're at it. They're not expecting to create their own platform: the Shield is basically useful as a controller for your PC or as a box for playing Android games.

There's more to the market than "hardcore gamers" (how hardcore can you be using a portable device anyway?). There are the Windows 8 tablet and Android tablet markets, which NVidia would very much like to win. I also would not be greatly surprised to see an SoC like this in Steam's new box.


Sorry, I got my headlines/articles mixed up.

The Tegra 4 does indeed look quite stellar!


Real-time HDR is great! Hope everyone gets on that bandwagon.

At some point HDR processing should become part of the hardware pipeline in every sensor, and we'll simply have a selectable dynamic range.


You made my postmodern sense tingle!

I have noticed more and more cameras implementing automatic color correction, resulting in orange people and color shifts while filming the same scene. I think HDR looks unnatural.


HDR can look as natural as you want. You're probably thinking of burnt-out, crazily over-saturated Flickr shots like this:

http://www.flickr.com/photos/yury-prokopenko/3561920871/ligh...

Or this (beyond terrible):

http://www.flickr.com/photos/nik-on/4624961812/

But that's the photographer's fault. HDR is just a way of compensating for the lack of dynamic range in a sensor. Used correctly it should make the image more real, not less, by bringing it closer to the human eye's full dynamic range (see the rough merge/tone-map sketch after the examples below). Examples:

http://www.flickr.com/photos/margall69/7496881548/lightbox/

http://www.flickr.com/photos/frankspecht/4954970921/lightbox... (snow and sky would be completely blown out without HDR)

http://www.flickr.com/photos/michaelgcumming/4525129653/ligh...
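
For the curious, here's a rough sketch of what the merge/tone-map step actually does - my own simplification, not any camera's real pipeline; the well-exposedness weighting and the Reinhard curve are just the textbook choices:

    /* Merge N bracketed exposures of one pixel into a relative
     * radiance estimate (assuming a roughly linear sensor response),
     * then compress it back to displayable range with a global
     * Reinhard-style operator. pixel[i] is in [0,1]; exposure_s[i]
     * is the shutter time in seconds. */
    float merge_and_tonemap(const float *pixel, const float *exposure_s,
                            int n, float key)
    {
        float radiance = 0.0f, weight_sum = 0.0f;
        for (int i = 0; i < n; i++) {
            /* Trust mid-tones; blown-out or near-black samples get
             * little weight. */
            float w = pixel[i] * (1.0f - pixel[i]);
            radiance   += w * (pixel[i] / exposure_s[i]);
            weight_sum += w;
        }
        if (weight_sum > 0.0f)
            radiance /= weight_sum;

        /* Reinhard tone mapping: L / (1 + L), with 'key' scaling the
         * overall brightness. The merge itself is neutral; the "HDR
         * look" people complain about mostly comes from far more
         * aggressive (especially local) tone-mapping choices than
         * this one. */
        float l = key * radiance;
        return l / (1.0f + l);
    }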


Although I completely agree with your point - that HDR is a tool, and can be used for both high and low quality shots - I actually really like the second example you gave for what "bad" HDR looks like. It seemed like an artistic use with a rather gorgeous result to me. Your main point stands, though.

(Disclaimer: I'm not a photographer, and the only thing I know about photography is what I learned in college computer vision courses and overheard from friends.)


I'm actually a photographer and I like the over-the-top HDR shots (even the example linked). It really depends on personal taste, and what it is you're after.

A lot of people dismiss them because they're "unrealistic" (which they obviously are), but I ask you: is an oil painting unrealistic? For example, this:

http://snapzlife.com/wp-content/uploads/2013/01/Oil-Painting...

Clearly that is a painting, but if exactly the same picture were produced using HDR, would everyone dismiss it as terrible?

My point is that you can use HDR to get realistic looking pictures, but you can also use HDR for "artistic" reasons, like painting using your camera. A lot of photographers are dismissive of the latter because they want to pretend photography is a practical rather than artistic endeavour.


Wouldn't most people also think that was a pretty bad painting?


Not if you live in Wisconsin.


HDR can look natural when used correctly, but if the camera's doing it automatically I bet we'll start to see lots of casual photographs abusing it.


Sony's Exmor RS sensor already has hardware HDR and can take HDR video in real time. It should ship in many of the high-end smartphones this year.

http://www.dpreview.com/news/2012/08/20/Sony-Stacked-CMOS-se...


I just want higher dynamic range screens.


Here's a really good blog post on the topic: http://19lights.com/wp/2011/11/14/will-hdr-displays-really-w...


Too bad the explanation of how it works was indistinguishable from technobabble.


>In a side-by-side Web page loading test with Google's Nexus 10... Tegra 4-based prototype loaded a set of Web pages nearly twice as quickly... the Tegra tablet appeared to be running the stock Android browser, however, while the Nexus 10 was running Google Chrome

Scumbag Nvidia? I would doubt that either browser makes significant use of more than 2 threads, which leaves most of the performance difference that wasn't due to the 200MHz clock increase explained by the different browser.


Because Chrome for Android is in a pretty terrible state right now, and running the test in Chrome might have made it look even worse than the iPad 4. Chrome for Android does a "good job" of making some of the most powerful chips on the market look pretty mediocre. I really resent Google for this, because they are embarrassing a lot of chip makers and device makers with it. But whatever - they'll probably fix it in Android 5.0 (I hope).

So basically Tegra 4 shouldn't be "2x faster" than Nexus 10, but probably more like 1.5x-1.8x faster, depending on how much more they optimized the stock browser over Chrome.


I am surprised no one mentioned the i500 SDR. Maybe it's hackable; run GNU Radio on it and BAM, Tegra 4 becomes the ultimate portable software radio: 20MHz of bandwidth, both receive and transmit, with ARM NEON/VFP and GeForce GPU power to process the data. It's like a wet dream.


Some cool stuff here. We've been shooting with RED Epic for a while now, and the extended dynamic range of the HDR-X mode is really nice. It's about 18 stops of dynamic range. I can pull almost anything out of high contrast scenes in post-production - we don't have to wait for the magic hour anymore and can generally wave goodbye to blown out skies. Check out londonhelicam.co.uk - there are a couple of HDR video examples in the reel.

I wonder if this Tegra 4 chip will have enough horsepower to encode the extended dynamic range of the HDR video into a usable 16-bit/32-bit video format at good-quality bitrates. What codec would it use? h.265?

Sony has the best dynamic range in consumer sensors; I hope there will be some products combining this into a truly usable and user-configurable package - either in a phone or in a consumer camera.


VP8 and h.264. h.265 and VP9 probably won't be ready until late this year or next year.


It's really a pity that Android makes it so difficult to get down to the metal on this hardware. If I want to take advantage of NEON intrinsics I have to use the NDK and write them by hand. On iOS I just use the Accelerate framework directly.


You have to have some limitations; it's not like every Android device can do NEON.


That's the whole point of having a first-party API for common DSP and compute tasks; Accelerate uses NEON where it's available, uses SSE/AVX where it's available, used AltiVec where it was available, and will use $(FutureISAExtension) where it's available. Most of the time, developers can just call the interfaces without worrying about the details at all.
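
For example, a whole-buffer add is one call, and the same source runs vectorized on ARM and x86 alike (treat this as an illustrative sketch, though vDSP_vadd itself is a real, long-standing API):

    #include <Accelerate/Accelerate.h>

    /* out[i] = a[i] + b[i]; vDSP dispatches to NEON on ARM and
     * SSE/AVX on x86, with no #ifdefs in the calling code. */
    void add_buffers(const float *a, const float *b, float *out,
                     vDSP_Length n)
    {
        vDSP_vadd(a, 1, b, 1, out, 1, n);
    }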


Sure, but that doesn't mean there couldn't be fewer hoops to jump through for the many devices that do. I have an iOS app that does realtime audio FFT without even breaking a sweat. I've considered porting it to Android, but the degree of hassle puts me off.


Sure, it could be made easier, but I am wondering if it is done on purpose. Android encourages developers to stay away from the NDK unless it's really necessary. Making it easier to build incompatible (= NEON-only) apps would surely mean more devs building incompatible apps even when it isn't necessary, hurting the Android ecosystem in general with more fragmentation.

That doesn't mean there shouldn't be some helpers in the SDK along the lines of "if NEON is supported, execute <optimized code path/ARM assembly>, else <slow stuff>".
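
Something like that is already doable by hand with the cpufeatures helper that ships in the NDK - a rough, untested sketch (the vector add is just a stand-in kernel; in a real project the NEON path would live in its own .neon-suffixed source file):

    #include <cpu-features.h>   /* NDK: sources/android/cpufeatures */

    #ifdef __ARM_NEON__
    #include <arm_neon.h>

    /* NEON path: add two float arrays four lanes at a time. */
    static void add_floats_neon(const float *a, const float *b,
                                float *out, int n)
    {
        int i = 0;
        for (; i + 4 <= n; i += 4) {
            float32x4_t va = vld1q_f32(a + i);
            float32x4_t vb = vld1q_f32(b + i);
            vst1q_f32(out + i, vaddq_f32(va, vb));
        }
        for (; i < n; i++)   /* leftover tail */
            out[i] = a[i] + b[i];
    }
    #endif

    /* Plain C fallback for devices without NEON. */
    static void add_floats_scalar(const float *a, const float *b,
                                  float *out, int n)
    {
        for (int i = 0; i < n; i++)
            out[i] = a[i] + b[i];
    }

    void add_floats(const float *a, const float *b, float *out, int n)
    {
    #ifdef __ARM_NEON__
        if (android_getCpuFamily() == ANDROID_CPU_FAMILY_ARM &&
            (android_getCpuFeatures() & ANDROID_CPU_ARM_FEATURE_NEON)) {
            add_floats_neon(a, b, out, n);
            return;
        }
    #endif
        add_floats_scalar(a, b, out, n);
    }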


Perhaps, but the unfortunate consequence is that there are entire niches of apps that flourish on iOS and don't really even exist on Android. Android is a great utilitarian OS, but the really exciting stuff on mobile (IMO) needs to push the hardware hard.


What niche apps would those be?

I mean, the same statement can be made for Android and widget apps, homescreen apps, and whatever else is not possible on iOS. I doubt the number of your niche apps is more than a fraction of those :P

P.S.: My point is that there will always be differences in the available apps simply because the APIs are not the same. I am wondering, though, what apps can be done on iOS that can't be done on Android.


Practically the entire audio field (recording, synthesis, processing, sequencing, etc.) has stayed away from Android because latency is far too high on average, ruining UX; it also varies widely between manufacturers, which doesn't help matters.

To some extent this also impacts all games, since high-latency playback hurts the experience.


I am wondering how true this is with Android 4.1+, because reducing audio latency was one of the major goals for Android 4.1 (and even more so for 4.2), which was supposed to bring it on par with iOS.

See: https://developer.android.com/about/versions/jelly-bean.html (the section is called "Low-latency Audio").

edit: So, even if said apps were not possible in the past, now would be a very good time to port your app (or those apps in general), because this is a niche which is not yet filled, and there is some money to be made by being #1 in that niche :)


http://code.google.com/p/android/issues/detail?id=3434

There's some hope for 4.2 but needless to say there's no point targeting that market yet. None of the devices currently on the market have acceptable audio latency. And Android still has no MIDI support.

Android has its strengths but it can't hold a candle to iOS for these kinds of applications.


Mh, I just tested on my phone ( https://play.google.com/store/apps/details?id=de.darkbloodst... ) and I don't see/hear any latency, certainly nothing above 100ms, so I don't think your remark "NONE of the devices currently on the market.." is true. My Galaxy Nexus is over a year old and works fine. At least I suppose that this app would show what you think is not working.


Latency for pro audio apps should be around 10ms. No Android phone gets close to this but all iOS devices do. You can use the free Caustic app on Android to measure your exact audio latency and then you will understand why none of the pro audio app makers bother with Android despite the large install base.
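
The back-of-the-envelope math makes the gap obvious: latency from buffering alone is roughly buffers x frames / sample rate, before the mixer and audio HAL add their own stages. A quick sketch (the buffer configurations are illustrative, not measurements of any specific device):

    #include <stdio.h>

    /* Latency contributed by output buffering alone. */
    static double buffer_latency_ms(int buffers, int frames, double rate_hz)
    {
        return 1000.0 * buffers * frames / rate_hz;
    }

    int main(void)
    {
        /* A single small buffer, the kind iOS apps can request: */
        printf("1 x 256 @ 44100 Hz = %.1f ms\n",
               buffer_latency_ms(1, 256, 44100.0));   /* ~5.8 ms  */
        /* A few large buffers, the sort of configuration that
         * produces 40ms+ readings: */
        printf("4 x 512 @ 44100 Hz = %.1f ms\n",
               buffer_latency_ms(4, 512, 44100.0));   /* ~46.4 ms */
        return 0;
    }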


I see; I have no clue about "pro" music stuff, so let's hope they fix that... (it says 40ms on my phone).


Between Project Shield and the Ouya, it should be a great year for Android gaming.


I feel like the OUYA is a bit out of luck considering how fast the mobile landscape is moving. Tegra 3 is being surpassed (at least technically) before the OUYA launches, assuming the OUYA even hits its "early 2013" release.

As mentioned on here and elsewhere, maybe development will cater to the lowest-common-denominator as far as SoCs go and the OUYA will be OK, but I feel like the audience for the OUYA won't particularly like being on a last-gen mobile SoC.


I think OUYA would be smart to re-target it at kids under 14, and not really at the more hardcore gamers. At least until they release a version with the latest and most powerful ARM SoC.


Kinda, but isn't it a pain for developers to have to account for all these kinds of Android devices?

I don't have any experience developing for Android, but I can imagine developing for Android handsets is hard enough already.


Can't wait to see that chip in the Nexus 7 2 or however they brand it. Or maybe the Nexus 10 2? A15 cores are such a noticeable boost over A9.

I'm really hoping, once this thing gets benchmarked, that Nvidia finally hits it out of the park on the graphics side; for being the foremost GPU company of the last decade, they sure screwed up ULP GeForce by basing it on 7000-series GPU tech from a decade ago. I think these are Kepler cores, and those have proven themselves fantastic on the desktop.


Are they going to start supporting VP8 hardware decoding there? It's not available in Tegra 3, at least with their Linux for Tegra release.


This will also be great for emulators.


Nvidia will save us from Intel in the long run. Intel just doesn't get multi-core processors.


Ok, that made me chuckle. Intel absolutely understands multi-core; what they don't yet 'get' is third-party access to the inner workings. It's interesting to watch nVidia because they have been on the receiving end of that sort of squeeze (front-side bus patents and all that), and so they tend to lean toward that as a strategy, whereas other ARM processor houses don't seem quite so focused on locking down all of the silicon around them.

The really smart bit here, though, is the LTE capability. If nVidia goes 'all in' on building integrated CPU/GPU/wireless cores, that puts even more pressure on Intel to integrate or leave the market. Intel has not had a stellar track record with regard to wireless, unfortunately.


I'm not sure LTE is much more than a no-brainer for anything handheld going forward from Q3. Somehow it seems the Nexus 4 ended up with two (or possibly even three) LTE implementations on the circuit board, and it's not even an advertised feature.

If US carriers insist on locking 4G LTE down to expensive, multi-year-contract subsidized devices, Intel may not have to worry for another generation. So the real wildcard here is, I think, T-Mobile.


I would say Intel/Infineon is ahead of Nvidia/Icera. At least the Infineon baseband has had some design wins.


That's an excellent point. Makes me wonder if there is an AMD ARM part with a Radeon GPU and some wireless implementation in our future. Seems like all the cool kids are building this particular kind of chip. It almost feels like the old 8080 days, when everyone had kinda-sorta the same 8-bit MCU to throw at the emerging PC market.

It is certainly going to be an interesting decade.


INTC + RF = 0 still holds


Intel of course is aware of the market and its needs. They have their own strategy, R&D and future plans. Case in point: Xeon Phi.


Wow!

If Intel doesn't get multi-core, I wonder how well a design firm that makes chips built for the rendering pipeline fares? What kind of load balancing does CUDA offer? Didn't they implement recursion a few weeks back? The scalar stream processor smacks of the Cray-1.

Intel's lack of understanding has enabled it to capture 75% of the market. Two years ago Intel demonstrated their understanding by increasing clock speed while AMD dumped more cores onto their CPUs, creating the underwhelming Interlagos.


Intel gets multi-core for desktop and server just fine. It's mobile devices and graphics where they have been consistently behind.

There are some (specialized) types of computation for which parts made by AMD/ATI and Nvidia are greatly superior. That Nvidia and other ARM licensees have completely taken over the fast-growing mobile device market with efficient multi-core CPUs and integrated GPUs is really significant to Intel.


> Intel just doesn't get multi-core processors.

Would you like to qualify that statement?


LOL.


The "72 cores" is marketing drivel . The stream processors found in GPUs are not general purpose and certainly not equivalent to a cpu core.

These number-of-cores remarks are starting to remind me of the gigahertz race between Intel, AMD and Cyrix back in the day.


Take a look at the very first image right at the top of the article.

I never saw them mixing up the GPU and CPU cores. They have 4 CPU cores, and they kept saying so.


Well, 5 CPU cores; one is invisible to the OS, though.


The number of cores is not "drivel", and it's also not new. This kind of information is absolutely standard fare when talking about GPUs nowadays.

See for instance the (huge) tables at http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_p... - under the "Config core" column you can see each model's configuration of cores.

You can see that the idea that a GPU has several types of core, in various amounts of each, goes back 5+ years.


I don't think they ever claimed those are 72 CPU cores, or even misled people into thinking so - even at the event. I think pretty much everyone knows they're talking about GPU cores, and not just to say "we have more GPU cores than you do", but to compare against their previous "12 GPU core" Tegra 3. Besides, they were talking about the 4 Cortex-A15 cores the whole time.




