The Xbox One: Hardware Analysis & Comparison to PlayStation 4 (anandtech.com)
102 points by AutocorrectThis on May 22, 2013 | 76 comments



My takeaway:

- The PS4 and Xbox One are going to be similar hardware-wise, with the PS4 having an edge in raw power and the Xbox One having an edge in "cleverness" (e.g. memory caching) and lower power consumption.

- Microsoft is re-using Hyper-V in a very awesome way(!).

- HDMI passthrough is Microsoft's "ace in the hole."

- Developers are going to have a great time this generation thanks to how similar the hardware is (identical in many cases).

I really want to know how Microsoft is able to run two hypervisors and offer GPU acceleration to both concurrently. Is this part of AMD's new CPU+GPU on one die magic, or is this something we'll see on the PC?


Using cleverness to get better performance than your competitor is an edge; using cleverness to get the same performance in the common case but worse performance in the worst case is poor design.

I suspect that HDMI passthrough will be a frustratingly unfulfilled promise, as it is with Google TV.

Recent GPUs are virtualizable (using multiple command queues); this is mostly targeted at VDI or cloud gaming but goes unused in PCs. http://www.nvidia.com/object/enterprise-virtualization.html
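
If you're wondering how that works mechanically, here's a conceptual sketch (all names made up; no real driver exposes this interface) of the per-guest command queue idea:

    #include <cstddef>
    #include <queue>
    #include <vector>

    struct GpuCommand { /* opaque command packet */ };

    class VirtualizedGpu {
        std::vector<std::queue<GpuCommand>> queues_; // one per guest OS
    public:
        explicit VirtualizedGpu(std::size_t guests) : queues_(guests) {}

        // Each guest submits into its own queue, concurrently,
        // unaware the other guests exist.
        void submit(std::size_t guest, GpuCommand c) { queues_[guest].push(c); }

        // The GPU's own scheduler time-slices between queues in hardware,
        // so the hypervisor never arbitrates individual draw calls.
        void dispatchRoundRobin() {
            for (auto& q : queues_)
                if (!q.empty()) q.pop(); // "execute" one packet per queue
        }
    };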


I don't think it makes sense to compare the Xbox One to Google TV. Google TV is very much a side project for Google, which is obvious because they aren't putting much effort behind it. Many people already use their Xbox for streaming Netflix and Hulu; the HDMI passthrough is an extension of that.


I think it makes sense in that the base technology is exactly the same. There is really only so much you can do with HDMI + IR blaster (or, for the lucky few with very new TVs and/or set-top boxes, HDMI with CEC).

There are tons of edge cases you run into when you don't have full control of the tuning stack. As one example (I can think of dozens easily):

Family has a single-tuner old-school Scientific Atlanta DVR (still very popular with cable operators despite being horribly obsolete). They set it up to record a show, it is currently recording said show, and the person interacting with the Xbox One does something to change the channel to one different from the one being recorded... oops! There is no way for the Xbox One to know this is a potentially destructive operation for the DVR. At worst the DVR changes the channel and loses half your show; at best the DVR is smart enough to prompt you about this and stop the channel change, but then how does the Xbox One know this occurred? There are some high-end DVRs with APIs that exist in an attempt to deal with situations like this, but they are few and far between, and there are no standards for them.
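
To make the one-way-channel problem concrete, a rough sketch (hypothetical types, illustration only):

    #include <optional>

    struct IrBlaster {
        // Fire-and-forget: emit the IR code for a channel change.
        // There is no return path, hence no acknowledgment and no error.
        void sendChannelChange(int channel) { (void)channel; /* emit pulses */ }
    };

    // What the Xbox One would need -- and cannot have -- before changing
    // the channel safely:
    std::optional<bool> isDvrCurrentlyRecording() {
        return std::nullopt; // no standard API exists to ask the DVR
    }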

Corner cases like this present themselves all the time with these HDMI-in/HDMI-out devices like Google TV and they are inherent problems with the current state of set-top box technology. No matter how many resources Microsoft throws at this, the end result will still be kind of lame a lot of the time and a far cry from the seamless experience you see in the on-stage demos.


I agree with you, but for discussion's sake...

What if this wasn't classified as a corner case? As you said, most MSOs are running either an old Motorola or an old Scientific Atlanta box as their standard unit. Because headends are either Moto or SA, there are actually very few unique STBs in the field. And those all run versions of the same OS that Moto or SA has been using for a long time.

If MS just threw a bunch of resources at common SA and Moto boxes to enumerate and add special code to handle these cases... could it work? Would it be a better experience? Is it worth it?


[deleted]


Don't they have to decode the HDCP stream anyway? Just getting basic scaling to work would require access to the raw pixel data, right?


Dear god, don't compare the Xbox to Google TV. No. Google TV was shit slow at launch, and that was enough to kill it, imo.


I don't know if I would consider HDMI passthrough as an ace in the hole... I mean I get how it would be cool to be able to overlay your cable box output on the screen, but this also implies that you would have to have your Xbox on every time you wanted to watch television.

Unless, of course, the passthrough works when the device is turned off... However, I haven't heard any mention yet that it does (does anyone know if it will support this?).

Plus, if you had any other multimedia devices (e.g. a Blu-ray player), wouldn't you still need to use the other inputs on the television anyway?


That's the idea. Low power is to help justify leaving it on constantly. You don't ever turn this thing off - it just sleeps at such low power that you write it off like you do with a modern TV or the like.

Microsoft's betting the farm on the idea that a consumer will leave the Xbox on 24/7 in a low-power state and have it as THE hub for all things digital entertainment. Your TV goes through it for advanced features such as TV channel guides, extra content, etc. You play games on it. You chat, take video calls and the like on it via Skype. You stream music/movies to it as your media player. All of it is controlled via Kinect gestures and voice recognition. It is also why they're planning to push updates/etc. through in idle times (e.g. overnight) when the device is left in low power but able to do basic tasks.

Basically, MS wants to take the beachhead they established with the first two generations and expand it. They want the Xbox One to go against the Apple TV and XBMC media centres and the like to become the all-encompassing portal of your entertainment.


But I don't want Kinect. I don't want to pick up the controller when I want to watch TV. Can I get a good remote for the Xbox One?


Almost surely. There were official remotes for previous generation consoles. It is, after all, a Blu-ray player.


> "Developers are going to have a great time this generation thanks to how similar the hardware is (identical in many cases)."

Sure the hardware will be similar, but what use is that if the APIs to access the hardware are totally different? I think the odds of Sony using DirectX in their SDK are pretty slim.


Developers have been writing multi-platform code for 10 years already. When you have API calls isolated from the rest of the code, it takes a couple of man-months, tops, to target another API.

The hard part of the previous generation's multi-platform work was getting to the same level of performance. E.g. the 360 had a pretty weak CPU and a superior GPU compared to the PS3, so the distribution of work between CPU and GPU was completely different between the two targets.
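
As a minimal sketch of that isolation (class names made up, not actual SDK types):

    #include <memory>

    class Renderer {                 // the only surface game code touches
    public:
        virtual ~Renderer() = default;
        virtual void drawMesh(int meshId) = 0;
    };

    class XboxRenderer : public Renderer {
        void drawMesh(int meshId) override { /* Direct3D-flavoured calls */ (void)meshId; }
    };

    class Ps4Renderer : public Renderer {
        void drawMesh(int meshId) override { /* Sony-SDK-flavoured calls */ (void)meshId; }
    };

    // Retargeting the game is then "write one more backend", not a rewrite.
    std::unique_ptr<Renderer> makeRenderer(bool xbox) {
        if (xbox) return std::make_unique<XboxRenderer>();
        return std::make_unique<Ps4Renderer>();
    }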


With completely different abstractions, "similar performance" makes much more sense than "similar hardware". Thanks


Took me a while to discover the meat of the article was hidden inside a drop-down navigation system. I honestly nearly bailed thinking, "This is all there is?"

What an awful idea. Thank goodness I saw it at the last minute.



>Took me a while to discover the meat of the article was hidden inside a drop-down navigation system

I agree! I hate their navigation UI. In general, drop-downs should only be used for familiar lists (countries, states, etc.). However, some of their articles are very detailed and have numerous pages, and they need to provide quick access to individual pages. One workaround is to provide a simple "Next >>" at the bottom of the article. It would make it a lot less confusing.


There are forward and backward links at the bottom of each section.


That's the experience offered by most hardware review sites: AnandTech, DP Review, Tom's Hardware, Extreme Overclocking, etc. Certainly to pump up pageviews, and thus generate more revenue.

Having said that, it is still bad design/usability. They could easily fix it by keeping the model of separate pages for each topic, but moving the content from the drop-down to an index at the bottom of each page, and also using prev/next buttons.


Well, I feel like most of these hardware sites have pretty meaty pages, so it's not terrible that they split up their articles; otherwise they'd be very long pages. Especially AnandTech: some of their reviews are 15-20 pages long, and I wouldn't want to read that in one go. Though sometimes I just stick their articles in Pocket and read them that way on my phone, so maybe I'm rationalizing a practice I've grown accustomed to over the years.


There are previous/next buttons on each page, labeled with the title of the preceding or following page.


This is pretty much standard on hardware benchmarking and review sites. I'm not saying that it's a good design, but it's not unique to this site.


It really needs to pop more though. I noticed the links to the top-left and top-right of the select and really wondered why they aren't forward/back arrows. They could at least have chevrons.


I actually prefer it greatly to either the never ending scroll or the digit-based pagination (1, 2, 3, Next, etc) wherein I cannot tell what's on page 3 or page 4. Often I just want to go to the summary, or to just view GPU info. I like it.


Note that a lot of the best info is actually in the user comments anyway. Some of the arguments there are very enlightening, and uncover good points that the article didn't touch (even if you look at all the pages, the article was pretty brief and short on analysis).


It's really bad in mobile browsers... a thin gray line, very easily missed.


> "The hard partitioning of resources would be nice to know more about. The easiest thing would be to dedicate a Jaguar compute module to each OS, but that might end up being overkill for the Windows kernel and insufficient for some gaming workloads. I suspect ~1GB of system memory ends up being carved off for Windows."

Microsoft actually confirmed that 3GB will be reserved for the OS while only 5GB is available for games. Check out the video at 9:44

http://www.gameinformer.com/b/features/archive/2013/05/21/in...


> Combine the fact that TV is important, with the fact that the Xbox 360 has evolved into a Netflix box for many,

I thought the PS3 was more widely used for streaming?

It's pretty fscking ridiculous that you need to have a Gold membership AND a Netflix membership to stream with the Xbox.


>I thought the PS3 was more widely used for streaming?

Not by a LONG shot.

And you don't need a Gold membership to use the Netflix app on the Xbox. Also, the Netflix app for the Xbox is much better than the PS3 app.


According to Netflix, the PS3 is the most popular streaming device (not just out of the PS3 and 360, but of all TV-connected devices).

http://www.pcmag.com/article2/0,2817,2412856,00.asp


These posters on the official Xbox forums seem to all be under the impression that Gold is required.

http://forums.xbox.com/xbox_forums/xbox_feedback/f/2602/t/37...


As an Xbox 360 user, I can confirm that an Xbox Live Gold membership is required for Netflix.


It looks like the Sony machine will be more expensive (or its profit margins will be a lot lower). The PS4 seems to have a lot more hardware under the hood than the One.

Having said that, as I pointed out elsewhere, it's not how many things your device does, but how well it does them, that'll define its function. If it's an entertainment system, it's how well it entertains people. If it's a videogame console, it's how fun the games it plays are.


Not necessarily -- the Xbox includes a Kinect in the box, which will increase its price/lower its profit margin. The PS4 has a camera, but nothing as sophisticated/expensive as the Kinect.


Total Tangent: I just saw Jaron Lanier speak in SF last night and he was very proud of the work he did on the Kinect.

I find that particularly interesting in light of his sort of decrying the digital-social culture we've developed in the last decade. Strange guy with a lot of really different opinions; was a lot of fun.

The Xbox including the new Kinect definitely lowers margins, but if Kinect becomes an integral part of the gaming experience it will be a sound bet.


> The Xbox including the new Kinect definitely lowers margins, but if Kinect becomes an integral part of the gaming experience it will be a sound bet.

Indeed. Bundling it is the best way to ensure the Kinect is used in every title.


I don't think the goal is to make sure the Kinect is used in every title. Rather, it allows any title to use it.

Currently, the Kinect is a peripheral, which means that developers can't assume that it's there. If they want to use it, they have two options: be a full-on Kinect game that doesn't make sense without the device, or have controller-only options for everything. That's a lot of pain. But if they can assume that the Kinect is always there, there's no tradeoff.
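
In code terms (hypothetical API names, just to show the branch that disappears when the hardware is guaranteed):

    // Hypothetical API, for illustration only.
    bool kinectIsConnected() { return false; } // stub; real code would probe hardware

    void setupInputAsPeripheral() {
        if (kinectIsConnected()) {
            // gesture/voice input path
        } else {
            // controller-only fallback: a second input scheme to design,
            // build, and test
        }
    }

    void setupInputAsBundled() {
        // Kinect is guaranteed present: one code path, no fallback to maintain.
    }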


Exactly: the developer experience is very important, and hardware is only an issue if you're really pushing the limits. Differences in SDKs made a huge difference in popularity between the PS3 and the 360. Since the 360 handled background music, matchmaking, and voice chat natively, devs had a huge head start over their competitors on the PS3. Helping out the devs had a direct effect on the players' experience, which led to the 360's dominance in multiplayer games.


Tangent: In addition to that, I think this generation might be defined by how well each console opens up to smaller/indie developers.

Modern-day indie developers are making products that we would have called AAA games in 1998, and many of them are selling extremely well (just look at Kickstarter). I think they have a HUGE role to play this generation, and holding them hostage in the "app" sandbox would be a massive mistake.

If either Microsoft or Sony opened up their SDK to indie developers and created some kind of arrangement where these developers could sell digitally for a fee, I think they would effectively "win" this round. And I do not mean "apps." I mean full-power-of-the-console stuff.


Yeah... but it does look like MS is backpedaling on support for indie devs on the Xbox One.


OTOH, it seems developing for the One will be easy. At least, as easy as developing for Windows.

Although I personally have little love for Visual Studio, most Windows developers would never leave it.

Were it programmable in Lisp, I suppose developers would use it to read e-mail and chat on IRC ;-)


I believe they used to sell the previous consoles at a loss, because they generate most of their profits through sales of games and accessories.


Yes, the PS3 initially launched at a loss (mainly because of the Blu-ray player, from what I remember). Towards the end of its lifetime it wasn't losing them money.


The extra CUs in the PS4 are offset by the ESRAM in the One; I would expect the die size is similar. Sony has more expensive RAM, though.


Is the eSRAM really that expensive in $ terms? Yield issues?

I understood that SRAM was only really 'expensive' in that it chews up masses of die space (which isn't a bad bet going forward because there'll inevitably be a new version at some point with a smaller process)


Yes, my assumption is cost = die size.
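
For a rough feel of why bigger dies cost disproportionately more (not just linearly), here's a back-of-envelope sketch using the classic Poisson yield model; the defect density is an assumed number for illustration, not vendor data:

    // Poisson yield model: yield = exp(-defect_density * area).
    // Cost per *good* die therefore grows faster than area.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double defectsPerMm2 = 0.002; // assumed, illustrative only
        for (double area : {250.0, 350.0, 450.0}) {
            double yield = std::exp(-defectsPerMm2 * area);
            double relativeCost = area / yield; // silicon paid per good die
            std::printf("%3.0f mm^2 -> yield %4.1f%%, relative cost %4.0f\n",
                        area, yield * 100.0, relativeCost);
        }
        return 0;
    }

At these made-up numbers, 1.8x the area costs roughly 2.7x per good die, which is why trading CUs for ESRAM can come out cost-neutral.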


[deleted]


I was talking about cost, not performance.


And the rest of the ecosystem. I do hope the new incarnation of Live allows for PC/One simultaneous play.


Is it really necessary that the article be split across multiple pages? At first I wondered, "Where is the comparison?" until I found the drop-down menu...


It is "necessary" in the sense that AnandTech would be out of business otherwise.


This feels like a real problem that some startup should solve.

Clearly, websites that do this are shooting themselves in the foot in terms of UX for the sake of pageviews and ad impressions. Is there no better way?


The other model is a paywall.

When I worked for a publisher there was a long debate over UX vs page views and ad impressions.

I think Google also has rules around how you can load ads, so it might not be possible to do anything dynamic. Google has all the ad inventory, so it's the dominant player.


This makes me wonder further. Did you see higher bounce rates when using multiple pages? If so, did the ad revenue brought in by displaying ads across multiple pages outweigh the higher bounce rate? Honestly curious about the pros and cons.


The only reasons I could see for Microsoft making the choices they did are to save on cost, or to make it easier to scale up to tens of millions of units a year by getting higher/cheaper chip yields faster than Sony can with basically identical chips. The only way Microsoft can capitalize on that advantage will be to actually reach a lower price faster, since they don't have a one-year head start this time.

Nintendo's only shot right now is to cut price on the Wii U.

Just remember that historically the magic number for console sales is $199. That's why the Xbox 360's entry price is $199 now and the Wii worked so well at $249 with a bundled game (basically $199 console + pack in game).

For mobile devices the sweet spot is more like $150, which is why the DS did better than the PSP, the 3DS better than the Vita, and so on.

Price makes a huge difference on consumer products and getting to certain price points can be the difference between success and failure other things being basically equal.


So, I have a Linux desktop PC with 8GB of RAM and an AMD A10 processor. I'm running XBMC (or LXDE or Front Row), and running 5 VMs on it under KVM. It cost about $450 when I bought it; the price is about $570 now.

I've had this setup (with a slightly slower AMD proc) for over 2 years, and I've been running BSNES, Dolphin, PCSX2, Nestopia, and MAME fairly regularly. Today's announcement informed me that I've been enjoying much of the next-generation gaming experience already, without any of the DRM Sony and Microsoft promise to include with their systems.


>I've been enjoying much of the next-generation gaming experience already.

You can't be serious. Although you may very soon be enjoying the next-generation gaming experience, what's exciting to me about this as a PC gamer is that all next-gen games will be designed with x86 in mind, which should make porting trivial compared to this generation.


They still don't budget enough time and money for PC ports even today. I played Metro: Last Light recently (released last week), and the secondary weapon selection was basically holding Tab while the menu emulated the mouse like an analog stick to select knives, grenades, etc. They are making huge money on the PC platform but still can't be bothered to make a proper UI or bindings for an entirely different platform.


Sure I'm serious. Steam and Front Row are focusing on Linux. Valve's making Linux x86 the target platform for their console. The Steambox is going to be another x86 PC of comparable speed to what I already run, but much smaller.

I'm an early adopter of the same hardware you're all going to be enjoying this Christmas, and many of the same games. Plus, I get a Virtual Console offering that Nintendo/Microsoft/Sony only wish they could match, and a television experience with Sickbeard/XBMC that not even Netflix can match, let alone whatever soon-abandoned toy Microsoft will be including with the XBO.

You're absolutely right: Microsoft/Sony adopting the x86 platform for this generation is a perfect storm for Valve. Release a Steambox running Linux, and you can cut the cost of porting to 20% of what the same effort would require coming from a PPC architecture.


Porting from console to PC is trivial. Getting it working well on an infinite number of configurations is not and supporting it is expensive.


Often the problem isn't that it doesn't work well; it's the UX. Mouse and keyboard are miles away from a controller.

And these days they mostly don't even bother to do anything...


That's what abstraction layers are for; think DirectX and OpenGL.


Those certainly help port it to the PC, but they don't guarantee it will work well on every combination of components.


Even if you had the same GPU and CPU (which you don't; there were no 8-core/12-CU GCN APUs two years ago, and I doubt you can buy one even now), you would still be far from the console experience, because PC games cannot access the GPU directly and rely on slow, inefficient APIs.
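
For what it's worth, a purely conceptual sketch of that difference (made-up packet format, not any real SDK): a console game can write command packets straight into a ring buffer the GPU consumes, while a PC routes every draw through a validating driver layer:

    #include <cstdint>
    #include <vector>

    std::vector<uint32_t> ring; // stand-in for a GPU-visible command buffer

    // Console-style: the game formats the packet itself; near-zero overhead.
    void drawDirect(uint32_t meshId) {
        ring.push_back(0xC0DE0001u); // invented draw opcode
        ring.push_back(meshId);
    }

    // PC-style: the driver validates state, tracks resources, and
    // translates to whatever GPU is actually installed -- on every call.
    void drawViaApi(uint32_t meshId) {
        // validateBindings(); trackResidency(); translateForVendor();
        drawDirect(meshId); // eventually the same packet gets written
    }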


So, maybe you can point me towards some documentation that states that the AMD A10 is somehow massively outclassed by this new AMD-created CPU/GPU combo?


I don't know what "massively outclassed" means here. I am just saying you don't have the same or similar hardware: there are no 8 CPU cores in the A10 series, and its 6-CU GPU is not even from the same generation (it uses old VLIW cores). And even if you had the same hardware, PC games could not utilize it the way console games can. JFYI, the PS3 runs modern games on an Nvidia 7800 with 256MB of system memory. Try just booting Windows in 256MB on a PC.


Ah, so the XBO/PS4 has two more cores than my PC, and the chip itself has more in common with the Atom-clone E-450 than with the A10. That's maybe a $120 upgrade in December if I want to match it core-for-core, and everything else is already on par. Thanks for agreeing that there's nothing in the XBO that outclasses my little PC.

Can you point me towards any documentation on AMD using VLIW cores? I thought the VLIW movement died with Transmeta. Also, if you could provide me with documentation that AMD isn't using the same-gen GPU cores in the XBO/PS4 that they put in the A10 (Southern Islands - Radeon 7XXX), I'd really appreciate that as well.

I wouldn't dare boot Windows on a 256MB PC; it wouldn't run. However, I could do the very same thing with a Linux PC (you know, a PS3) without much issue. In fact, while typing this, I used Chef to deploy a 256MB VM running Lubuntu 13.04. :)


Seriously? You think 6+2 equals 8 and that 6 is on par with 12 or 18, and then ask me to point you at technical documentation readily available on the AMD website? What are you going to do with it? It's many grades beyond basic arithmetic.


Very interesting. I was looking on AMD's website for info on the Jaguar cores and couldn't find any. Perhaps I'm looking in the wrong place?

I'm not sure what Very Long Instruction Word being implemented in the A10 cores has to do with whether or not I understand basic arithmetic, so I'm going to assume you don't have any information on that and will disregard that part of your grandparent post accordingly.

I'll take your silence about how the A10 is a much faster CPU than the low-power equivalent in the XBO/PS4 as agreement with these statements. It seems all we're doing is haggling over how much faster the GPU(s) in this unit are compared to the A10's. I haven't found anything on AMD's site explaining these cores. I'd ask you to point me toward some documentation, but I expect you to return with another ad hominem.

I'll also take your silence about how much more efficient Linux can be with 256MB of RAM than Windows as agreement with these facts.

Still looking forward to any Jaguar-related documentation you can find on AMD's site that covers my questions. Thanks in advance.


I don't think you can find info on the Jaguar core on their website. Why are you looking for CPU core info now, though? I thought you wanted to find out whether the AMD GPU (note, the letter C is not the same as G) in your PC uses VLIW, yet you somehow seem reluctant to use a web search (like this: http://www.bing.com/search?q=amd+a10+vliw ). I am sorry, I cannot help you with navigating the AMD website; it's dog slow and poorly organized. Also, disregard all I say; you obviously know better. And sure, my silence means I agree with your idiocy, and not that I'm doing something else, like not checking HN comments.


I understand that native Xbox 360 games will not run on the Xbox One, but games developed using XNA should be able to run if Microsoft ports the CLR to the Xbox One, no?


XNA has been discontinued. I wouldn't expect Microsoft to port the CLR.


The Xbox One runs on x86 hardware, though, so porting the CLR would be rather trivial. Indeed, the CLR probably already runs as part of the apps VM.


Edit: Removed. Been quite a while since I played with XNA, totally forgot they discontinued it.


It's an interesting breakdown, but the software will be the ultimate factor.


tl;dr:

Sony PS4 has a better CPU and GPU


There is no way in hell I'm going to buy the Xbox One always-online spy box. Skype has a backdoor.



