The Ouya works, it’s here, and it’s heading your way (penny-arcade.com)
203 points by richeyrw on March 29, 2013 | 108 comments



According to the Kickstarter page, the Ouya was due out in March 2013:

http://www.kickstarter.com/projects/ouya/ouya-a-new-kind-of-...

I'm pretty sure I was one of the many naysayers who said the Ouya would never make that launch date. Hell, I once ordered underwear from a Kickstarter clothing outfit and that came at least three months late. I've supported several Kickstarters and none of them have delivered the physical goods within two months of their promise (I've even had one delayed by a year). The Ouya is late by a couple of weeks, if even that.

Congrats to Ouya for meeting their goal, here's hoping that their system is a success.


The reports of system problems are already rolling in, though. In particular it seems that the controller has perceptible input lag; that's an immediate killer for pretty much any gaming-related product.

They might have shipped, but it remains to be seen if they shipped something that's any good.


I don't know where this is coming from. I can't perceive any lag in my unit when playing all sorts of games. I've hooked up an xbox 360 controller and there's no difference. There's no more input lag on an ouya than on an xbox 360 as far as I can tell.

There are a couple games built using an older version of the ouya Unity plugin that introduced some lag, so that might explain it.

Full disclosure: I am an employee of OUYA. But any developer that has had a dev unit for the past 3 months can chime in. We never got this feedback until a couple people in the press brought it up.


> I don't know where this is coming from.

Polygon's hands-on review mentions this latency, along with a few other criticisms, including the feel of the controller:

http://www.polygon.com/2013/3/28/4157602/ouya-feature


I know where the statements are literally coming from, but not why they're saying it :)


@jader I've got over 8000M in Canabalt, a personal high score on my ouya, much higher than I've ever got on my Nexus 4. If there was significant lag I don't think that'd be possible.

I've talked to the developers of the ouya port and they're baffled by these reports as well.

People subconsciously looking for something negative? No clue.


Doesn't HDMI give you the new and wonderful possibility of display lag? (This is particularly pronounced when playing games using iOS devices, especially older iOS devices, with AirPlay.)

Because the time between a frame being rendered and it appearing on screen can increase, you react more slowly to events on the device, and you perceive that as control lag when in fact it's display lag.


Compliant HDMI shouldn't introduce any latency itself. It's just a digital signal, no different in principle than the signal from a wired game controller. Where latency can creep in is a TV's image processing algorithms – a color adjustment algorithm might add a few milliseconds, an edge enhancement might add a few more; a 120 Hz motion compensation system will add a lot more.

Most TVs will allow these to be disabled (often in a specifically labeled "game mode"), but a few inexplicably don't allow them all to be disabled and are effectively hardwired with a certain degree of latency.

AirPlay, on the other hand, introduces a lot of latency because it's compressing the image prior to transport and uncompressing it on the other end. This generally introduces significantly more lag than any TV image processing.


Here's an article on how to detect and measure the effect I'm talking about.

http://www.pcworld.com/article/183928/how_to_find_and_fix_in...

It's a consequence of using framebuffers in the display device. I mentioned AirPlay because it exaggerates the effect, but any digital display will likely introduce a little extra lag.


I don't think mortenjorck particularly disagrees with you.

I think they are reading [HDMI] specifically as [HDMI] rather than [The use of an external display, via HDMI].


Sorry I can't reply to you directly, but:

EXACTLY. HDMI encodes frames and pipes them to the TV which has to render them. There's no guarantee this will be instantaneous (indeed it's virtually guaranteed that it won't).


Incorrect. HDMI is a standard for transmitting uncompressed video, same as DVI. Raster goes in one end and out the other. AirPlay, on the other hand, does encode frames that need to be uncompressed by a renderer.


Compressed or uncompressed doesn't matter. The point is that the display device is buffering frames and then rendering them later (ideally imperceptibly later).

AirPlay definitely adds more potential bottlenecks, but the principle remains the same.


>Compressed or uncompressed doesn't matter

It most definitely does. With DVI/HDMI each frame sent is just that, a frame: at time X the screen should show Y. If a frame is damaged/lost you have only lost the information from that one frame (or just part of the frame).

However, with compressed data each frame is intertwined with neighboring ones. If you lose a frame or it's damaged, you have now lost X frames until the dependent frames are past.

I'm not sure if you've ever had the experience of a damaged HDMI cable, but you can see the exact pixels that are affected, and they change frame to frame as you twist the cable making it worse or better.

Compressed video is like Netflix or a damaged AVI, where when the corrupted data is hit the entire stream goes wonky for a short while until it suddenly snaps back into clarity when a keyframe is hit.

Uncompressed is as near real-time as you can get; the video is passed straight through. Compressed, you have a buffer, a decoder, etc., and there is more delay/processing.


In practice no digital TV actually works like this. The screen isn't instantly updated the moment the pixel data arrives on the line (which is exactly how analog CRTs work, which is why you have to adjust vertical hold, etc.)

You can still get lag on CRTs — phosphors don't light up instantly — but digital TV adds all kinds of opportunities for screwing up timing, and ironically it's the fancier TVs that do more processing (e.g. interpolating frames, decoding 3D, etc.).


But the buffering is not required because of the HDMI link; a display could update each pixel as it receives it over HDMI, just like CRTs did. LCDs buffer at least one frame for many reasons, but "because HDMI requires it" is not among them.

And compressed vs. uncompressed does matter, especially when you're talking about video compression. Because compression requires some minimum amount of data before it can even begin to compress, let alone start to send the compressed bitstream over a physical link. And then you have more latency as you add a decode step. Not to mention the complete lack of latency guarantees a wifi link has...
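
To put rough numbers on that (back-of-envelope, assuming 1080p at 60 Hz and 24 bits per pixel, ignoring blanking and audio):

    // Why HDMI can afford to ship raw pixels while a wireless link has to compress.
    public class LinkMath {
        public static void main(String[] args) {
            long width = 1920, height = 1080, bitsPerPixel = 24, fps = 60;

            double gbps = width * height * bitsPerPixel * fps / 1e9;
            System.out.printf("Uncompressed 1080p60: ~%.1f Gbit/s%n", gbps); // ~3.0

            // Every buffered frame costs one refresh interval of latency.
            double frameMs = 1000.0 / fps;
            System.out.printf("One frame of buffering: ~%.1f ms%n", frameMs); // ~16.7
        }
    }

~3 Gbit/s is an order of magnitude beyond what wifi of this era sustains in practice, which is why AirPlay compresses and HDMI doesn't have to.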


Not to mention it has to go through your congested wifi network.


I agree that display lag can be an issue. But HDMI is not the cause. Sluggish post-processing in the TV is (if it does color correction, sharpening, noise filtering, etc. etc.).


I know other people have already talked about screen lag, but I just wanted to mention that this is well known in the Super Smash Bros. community. LCD/HD TVs are not usually used in Super Smash Bros. tournaments because of the noticeable lag. http://super-smash-bros.wikia.com/wiki/Lag#Screen_Lag

While it's possible the controller lags, it seems a little more likely that the lag is caused by the transition from the iPad screen to a lagging LCD screen.


Yeah, I sometimes think sensitivity to input lag is very subjective. Also, another factor is the TV and/or receiver (if they have Ouya going through one). But surely Polygon editors would factor this in?

I'm not a game developer (or hardware expert), but are there not tests that can be performed to capture input lag? At least that way it could be ruled in/out at the hardware/software level, assuming you have a baseline reference of any input lag from TV/receiver.


John Carmack has done a bit of research on this; he has a good explanation on SuperUser: http://superuser.com/a/419167

Edit: Also on his blog http://www.altdevblogaday.com/2013/02/22/latency-mitigation-...

tl;dr High speed video camera capturing button input and screen in the same frame to measure the complete cycle. Diagnosing why is harder.
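
Software alone can see part of that loop. On Android, for example, each KeyEvent carries a timestamp from the input system, so you can at least log how stale an event is by the time your handler runs; a minimal sketch (it misses everything downstream of the GPU and the display, which is exactly what the camera method captures):

    import android.app.Activity;
    import android.os.SystemClock;
    import android.util.Log;
    import android.view.KeyEvent;

    public class LagProbeActivity extends Activity {
        @Override
        public boolean onKeyDown(int keyCode, KeyEvent event) {
            // getEventTime() is stamped on the same uptimeMillis() clock,
            // so the delta is input delivery plus our handling time.
            long deliveryMs = SystemClock.uptimeMillis() - event.getEventTime();
            Log.d("LagProbe", "event was " + deliveryMs + " ms old on arrival");
            return super.onKeyDown(keyCode, event);
        }
    }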


> This was most apparent in the Ouya port of Canabalt. On iOS, Canabalt is a game of twitch reflexes. On the Ouya, I needed to react much earlier, knowing it would take an inordinate amount of time for me to get off the ground.

Have you guys not had the same experiences w/ Canabalt, or have you not heard any other reports of this?

I personally have a hard time noticing latency like some others report w/ games, unless it directly impacts gameplay (i.e. music games like Rock Band). I don't notice it on other games like FPSes, but many people are very sensitive to latency in games like this.


I just played Canabalt on my OUYA devkit (I'm making a game for the OUYA), and I didn't experience any latency from the controller. A bunch of people played it last night when I demoed the OUYA at GameCity in Nottingham and no one complained about it then either. I haven't played Canabalt on iOS and I'm not a big console player, so maybe I'm just not attuned to it.


Maybe it is video lag? (i.e. their TVs are adding the lag, not the controllers.)


Since the controller uses Bluetooth, it's sending packet data over 2.4GHz with error correction. Perhaps if you're in a wifi-congested environment, you get more error packets, leading to a reduction in sending speed or more time spent retrying. That could be one difference between the Ouya lab and the laptop/console-filled halls of a gaming website.


I hope these people are carefully considering the many sources of input lag[1], especially display lag, before giving the controller a bad rap.

[1] http://en.wikipedia.org/wiki/Input_lag


This is only for the Kickstarter backers, though. The public launch is in June. If the lag is due to software, it could be fixed by then, for everyone.


So you're saying it's OK to potentially release a poor-quality product to the early adopters who bothered to help fund development of the console, and who will help shape early public opinion about the unit before the full launch.


At this point, Ouya has a choice between shipping beta software on schedule or shipping more polished software late. From this review it sounds like the software is good enough and they're better off avoiding bad press from being late.


Well, it'll also help to shape early internal opinion about the deficiencies and accomplishments of their product and any particular fixes which need to be done to maximize the success of the later public launch.

Hardware MVPs are hard, but this certainly feels like a step in the right direction of faster hardware iteration.


Got any source for that controller input lag? That does sound like a bummer.


"Everything isn't perfect; I noticed a slight bit of lag on the controller, and I can't point to a single game that would make one need to buy a system at launch."

From the linked article.


Not a comment on the Ouya specifically, but we live in an odd moment in computing history when many of us carry powerful computers with us everywhere we go, but connecting them to our existing peripherals is difficult enough that we'd rather just buy another with the right plug.

I wonder how long this will last.


Connectors and physical controllers remain expensive and difficult to make. "Computers" just fall to the base price of etching a circuit board.

This is the new normal. The connector is the valuable part.


This captures it. The cost to add the transistors to put an ARM CPU in silicon is < the cost of the metals in the USB and HDMI connector you use to talk to it.

I gave a talk on the Internet of Things where I tried to communicate this point clearly: for more and more applications the marginal cost to add a computer is nearly zero and the marginal cost to add a network is about $0.35, so a lot of things that wouldn't have had networks or computers in them before suddenly do.

Back when Sun Micro was trying to get everyone in the Java group to write up patents on anything they could think of, James Gosling, in what was a great comment on the process, wrote one up for Java inside a light switch. He reasoned it was the most ridiculous patent ever, since a light switch from Home Depot was $1.50 and there was no amount of "coolness" you could add with Java that would merit building a light switch with its own processor. What he didn't count on was that the cost of the computer that could run Java would drop below the cost to make a mechanical switch.


> The cost to add the transistors to put an ARM CPU in silicon is < the cost of the metals in the USB and HDMI connector you use to talk to it.

This statement amazes me. It may be true, but it is very counterintuitive to my experience.

A micro USB connector is about $0.35/unit (for quantities of 1500). An HDMI connector is about $0.50/unit (similar quantities). These prices are from Digikey, so in larger quantities from larger distributors you could certainly get cheaper prices. The cheapest ARM processor (Cortex-M0) is about $0.78/unit on Digikey.

I understand your point is about licensing the ARM design and integrating it into your silicon, but most low-cost devices I've seen have just been PCBs that integrate these off-the-shelf components. I would have to imagine the cost of hiring engineers to do the VLSI design/integration, the cost of licensing the ARM CPU, and then the cost of fabricating/testing your silicon would have to exceed integration on a PCB. So, I would then assume that the processor, while certainly cheap, is still a very substantial portion of the cost of the device. But I have no data on the cost of ARM licenses to back that up. You're definitely right in asserting the trend is cheaper and cheaper processors, but I don't think we've arrived at the "processors are so cheap they're basically free" world quite yet.


The difference is between what you and I can buy and what can be done in the world of building semi-custom ICs for your consumer gear.

The ARM7/TDMI core is about 100K transistors; that is nominally a square of 316 transistors x 316 transistors, which with a 22 nm process is about 3 microns square. The marginal cost to add 3 square microns to a chip is very nearly 0; as an example, the 'test feature'[1] on a chip at the company I worked at in 2000 was 18 square microns and "wasted space" in the final chip. I say it's 'nearly' zero because while the cost to produce the chip doesn't change measurably, the yield curve does, and the 'cost' is the chips that fail due to this extra core not functioning.

So if you make a consumer electronics gizmo in quantity and it has a semi-custom chip on it, adding a computer to that chip these days won't make your semi-custom chip that much more expensive and by having a programability aspect you can add features without re-spinning the chip.

As for the cost, TSMC offers "add an arm core" to your ASIC as a design service. I've not been part of a wafer start negotiation for over a decade but it would not surprise me in the least if they offer to throw that in for free these days to sweeten the deal.

[1] The "test feature" is a part of the chip that the fab uses to verify the wafer processing worked correctly; it generally can be probed with a simple voltage or current pulse to quickly screen out die which didn't get baked correctly.


That doesn't sound right: 22 nm * 316 = 69 microns, giving 69^2 == ~4800 square microns; a transistor is larger than the feature size, and don't wires take the majority of the area? (Still, that's well under a square millimeter.)

So we haven't quite reached the day Eric Drexler hinted at with "so-called microcomputers still visible to the naked eye".


Nice catch, in my haste I was dividing when I should have been multiplying. Typical transistor size is 4x feature size. So 316 * 2 * .022 ~ 14 microns, and squared it's ~200 square microns.

That said, we are still talking about an incrementally small addition to a chip.
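
For anyone who wants the arithmetic in one place, a rough sketch (square layout, 22 nm feature size, wiring overhead ignored):

    // Die-area estimate for a ~100K-transistor ARM7TDMI core.
    public class CoreArea {
        public static void main(String[] args) {
            double transistors = 100_000;
            double featureUm = 0.022;          // 22 nm process
            double pitchUm = 4 * featureUm;    // 4x feature size per transistor side

            double sideUm = Math.sqrt(transistors) * pitchUm;  // ~28 microns
            double areaUm2 = sideUm * sideUm;                  // ~775 square microns
            // With the 2x pitch used in the multiplication above, it comes out
            // around 200 square microns; either way, a tiny fraction of a die.
            System.out.printf("~%.0f um per side, ~%.0f um^2%n", sideUm, areaUm2);
        }
    }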


Oops, I misplaced a decimal myself. That's amazingly small (still without wiring overhead, of course).


Do you have a link to the video of your talk?


Doesn't look like it, it was for an IEEE symposium in San Jose.


> This is the new normal. The connector is the valuable part.

Right, and I read this as "the connector is the next thing to go". Someday (one hopes) all of this will be done by software radio, anything that needs to be really fast will be over a generic optical interconnect, and the idea of buying a particular computer because it has a bit of copper that mates with your particular display will be humorously antiquated... like steam engines, or cursive handwriting.


I don't think so ... it's going to be infinitely recursive in terms of "middle or connecting pieces that are the new bottleneck".

Some vested interest or IP complication will reassert itself into the same chokepoint as soon as (insert your future state here) comes true...


There's an active and competitive market for, say, CPU architectures that end users are completely (and gladly) unaware of. I just see the walls of the black box expanding, that doesn't mean there's nothing going on inside.


We're seeing the beginnings of this with Apple's lightning connector. It's designed to be as flexible as possible, to the point where the AV cable that attaches to it has an ARM chip in it and runs (a very stripped down) OS X. By moving smarts out of the computer's port and into the cable, they make the port as simple and generic as possible.

I completely disagree on cursive handwriting though. I use that every day.


I'm not sure if your comment was meant to be satirical, but you just pointed out the big danger in the impending move to wireless connections.

> the idea of buying a particular computer because it has a bit of copper that mates with your particular display will be humorously antiquated

Is the next stage going to be everything playing nicely with each other as it should, or are we going to get a range of proprietary wireless standards that won't work with each other in the name of squeezing a few more dollars out of users?


Assuming you meant this for me? See my other comment about architectures. What you're describing sounds like Macintosh with PowerPC; was it ever an advantage? Maybe, but then it became a liability, and they switched. It hasn't exactly gone bad.

I'm sure there will be growing pains, but users will place such a high premium on systems that work together that it will have to happen at a software level even if there's differing technology under the hood.


I think this is only the outlook of casual gamers or (no troll) Apple users that tend to only use Apple products in their ecosystem.

Android is very rich with Bluetooth controllers and micro-USB-to-HDMI adapters like this: http://www.amazon.com/SlimPort%C2%AE-SP1002-Connect-connecto... .

As someone that would rather pay a little more for a real console, I really don't see the appeal of Ouya.


Ouya has no appeal to anyone but kids and shovelware authors who think they'll make a lot of money making crap games. It's obviously a trial run to knock out bugs and hacks in their distribution system and tighten up the security on their custom ARM hypervisor. Then they'll kill the Ouya and license the hypervisor as their product, which they'll be able to sell to a large base of ARM Cortex-based device manufacturers as a field-tested DRM solution.


Many smartphones have HDMI out, and USB for a controller. It seems inevitable that they'll replace consoles (and desktops/laptops), especially with wireless controllers and HDMI.

But I think the significant thing is price: you cannot buy a comparable smartphone for $99. This will always be true, because the smartphone needs a display and a battery, and has to be compact, light, and not drain that battery too fast. What will change is that, eventually, smartphones that are good enough for games will be cheaper than $99 (or whatever price point is then relevant). Although batteries are not improving at Moore's Law rates, this point will probably be reached surprisingly soon.

Bold prediction: the next generation of Xbox/PlayStation/Wii will fail for this reason. (Smartphone GPUs will likely reach Xbox 360 levels this year, and the mainstream hasn't been demanding the more powerful GPUs of PCs, unlike in previous generations.)


Aside from that, even being able to plug things in doesn't guarantee a useful connection.

I can plug a tablet into a computer by USB, but it can't be a keyboard. Even though it contains a keyboard - is a superset of keyboard components - it still doesn't have the software to do it. Nor can it be a USB soundcard, or a USB display.

(But it can be a second display over TCP/IP, and that can be done over a USB-Ethernet dongle, which it does support.)

Precisely what can usefully be plugged into what is a website I've considered creating for a long time.


That's an encouraging review. I am not a backer, because I'm cynical, but that's quite promising.

This paragraph got me thinking: "I can't point to a single game that would make one need to buy a system at launch. Much of the value of the OUYA hardware lies in what you can do with it, from media functions to creating your own games. It's very possible that a breakout game is coming, and we just don't know what it is yet, but at this point it's hard to point to one single game that will get you to buy a unit."

For a traditional console, that would be a huge issue. (Ask Nintendo about the Wii U launch.) Sony, Microsoft, and Nintendo spend a lot of cash making sure there will be excellent launch titles. Traditional wisdom is that launch titles drive console sales.

On the other hand, the current generation of smartphones didn't have big launch titles. Possibly the landscape has changed.

On the third hand, smartphones do other things than play games.

On the fourth hand, it sounds like the Ouya might be a strong media console, depending on how slick that XBMC integration is. Cf. the number of people who bought PlayStation 3s as a Blu-ray player.

In any case, I'm impressed that hardware is shipping and I was wrong to think it wouldn't. I'd keep an eye on those lag reports, though; I would think Penny Arcade and Polygon are smart enough to think about video lag as a possibility.


I haven't received mine yet, but as a backer I was pretty disappointed to learn only yesterday (in their latest email) that I have to put my CC on file to download demos. I know some people may not have a problem with that, but even on my App Store account I can download free apps without a CC.

From their email: "You'll need a credit/debit card to download games. All games are still free to try. Your card will only be charged if you buy content you love. We do want valid payment information for everyone. This is to ensure that game developers can get paid when you love their game."

The article mentions gift cards, but I just don't want to pay for a gift card that I may not use just to download demos. Well, I guess I'll use the hardware as a MAME box and video player.


I have a semi-disposable pre-paid credit card[1] that I load with a small amount of cash and use only for uncertain transactions.

I agree that storing a CC number has a number of problems (including lack of parental controls, as mentioned in the article).

[1] Yes, calling something a credit card when it doesn't allow me any credit, and doesn't provide some of the advantages of credit cards, is annoying.


Are pre-paid credit cards really indistinguishable from credit cards?

I thought that some merchants didn't accept them for various reasons.


The CC is on file in order to streamline the purchase process when you decide to buy something. For OUYA to be a success there needs to be an appealing app ecosystem. That won't happen unless it gains developer interest and developers often invest in markets with a profit motive.


That doesn't make sense. Having a credit card on file does not lead to a better app ecosystem.


Having a credit card on file is neither a necessary, nor a sufficient condition to have a better app ecosystem. But if you had two app ecosystems that were identical aside from the fact that one stored credit cards and the other did not, you would certainly expect a better app ecosystem with the stored cards. Having to enter your credit card is a source of additional friction that decreases people's purchase rates of apps, even if by a small amount (although my opinion is that it's by a large amount). Having an app ecosystem where people spend money and spend it more freely certainly seems to improve the ecosystem, if you take comparisons between the iPhone and Android app stores into account, especially in the earlier days.


Yes, sorry, you're right that having the credit card on file is a good thing; I meant in terms of downloading demos/free apps.


Again, there would be fewer demos/free apps if not for this; it's there to motivate the developers.


Tell that to Apple. One of the reasons for App Store success was that people could buy apps without doing the complex credit card confirmation step for each purchase.

This leads to impulse purchases, and higher sales as the friction on purchase is reduced to a click/tap.


I'm in agreement with you on this. OUYA has proven to be very receptive to community feedback so it's good to voice concerns. It has a good chance of being reassessed.


True; on the comment page of their Kickstarter a lot of people expressed their concerns over the CC requirement. I really hope they will at least communicate on that particular issue soon.


You're in the minority; most people will enter their credit cards, as you say, and they will make lots more money because of this. It's a smart move and they are certainly aware of the cost-benefit ratio. (I bet that you will enter your credit card in the end if you buy one of these things, despite your concerns, since you will eventually cave once an inevitable "must play" game enters the ecosystem.)


I'm really excited to get mine soon! Does anyone have any resources for people who want to dabble in creating a game for Ouya? Is it as basic as just making an HTML5 game with a wrapper of some sort? I would love to see some basic "how-to" type articles.


I was under the impression that Ouya is running Android? So you'd write it in Java ideally. I may be wrong though.


You don't need to write Android apps in Java; you can write them in C, or in any language that can compile to native code and use a C API via FFI.

In any case, for games, you probably want to start learning OpenGL ES.
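
Whichever route you take, the ES bring-up on the Java side is small. A minimal sketch using the stock android.opengl classes, which just requests an ES 2.0 context and clears the screen each frame:

    import android.app.Activity;
    import android.opengl.GLES20;
    import android.opengl.GLSurfaceView;
    import android.os.Bundle;
    import javax.microedition.khronos.egl.EGLConfig;
    import javax.microedition.khronos.opengles.GL10;

    public class GlesActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            GLSurfaceView view = new GLSurfaceView(this);
            view.setEGLContextClientVersion(2);   // ask for an ES 2.0 context
            view.setRenderer(new GLSurfaceView.Renderer() {
                @Override public void onSurfaceCreated(GL10 gl, EGLConfig config) {
                    GLES20.glClearColor(0f, 0f, 0.2f, 1f);
                }
                @Override public void onSurfaceChanged(GL10 gl, int w, int h) {
                    GLES20.glViewport(0, 0, w, h);
                }
                @Override public void onDrawFrame(GL10 gl) {
                    // a real game would draw its scene here, once per vsync
                    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
                }
            });
            setContentView(view);
        }
    }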


The issue with the NDK is that it is still a second-class citizen.

The few Android APIs available are wrappers around JNI calls.

Java is the native language of the platform and Google does not seem willing to change that, even with the ongoing court issues.


You may want to consider going directly to Unity.


I think that actually does sound right, if I remember correctly. So, to modify my question, does anyone have any good tutorials on creating games with Java for Android? I'm just starting to get the hang of making HTML5 games, but it would be nice to see how I could translate that to work for Android devices.


Last I checked, this looked like the way to do it.

http://libgdx.badlogicgames.com/

It apparently can build for iOS and the web (via GWT) too.

Though I can't see any reason they couldn't start supporting some HTML5 games as well, assuming it can support a browser.
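
To give a feel for it, the canonical libgdx starting point is only a few lines; a sketch assuming the standard project skeleton (badlogic.jpg is the sample image the setup templates ship with):

    import com.badlogic.gdx.ApplicationAdapter;
    import com.badlogic.gdx.Gdx;
    import com.badlogic.gdx.graphics.GL20;
    import com.badlogic.gdx.graphics.Texture;
    import com.badlogic.gdx.graphics.g2d.SpriteBatch;

    public class MyGame extends ApplicationAdapter {
        private SpriteBatch batch;
        private Texture img;

        @Override public void create() {
            batch = new SpriteBatch();
            img = new Texture("badlogic.jpg");
        }

        @Override public void render() {
            Gdx.gl.glClearColor(0, 0, 0.2f, 1);
            Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
            batch.begin();
            batch.draw(img, 100, 100);   // draw the sprite at (100, 100)
            batch.end();
        }

        @Override public void dispose() {
            batch.dispose();
            img.dispose();
        }
    }

The same class then runs on desktop, Android, and (via GWT) the browser through per-platform launchers.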


If you happen to know ActionScript 3, you can also use Adobe Flash Builder or FlashDevelop, as Adobe AIR can target Android/iOS devices just fine.


Yes, it does run Android.


Easiest would be to use something like Unity or ShiVa.


I give Ouya 5 million space points for the 'make' option. That's the most incredible part of this whole system. Develop on the box you play on... It's just revolutionary. If this thing can catch on, it might just revolutionize and destroy the gaming industry as we know it.


Commodore 64? Develop on it for it. So mebbe the revolution started without you... in the 80's. :)

I agree that we'll see a lot more games developed in garages and on weekends. And I think that's where a lot of the unique games come from. But this won't destroy the gaming industry at all. It will help usher out folks who shouldn't be in the industry in the first place.


Any early computer. Heck, UNIX was developed so the bored team members could play games on an old minicomputer.


You got a link to back that one up?


No, I read it in a book years ago. (Perhaps "Hackers", by Steven Levy.) The story, as I recall, was that Ken Thompson was working at Bell Labs, bored, with an old PDP-7 (?) that had no software. He wanted something to run space war on, but to get there he needed an OS, compiler, etc.


You are right on that citation. Awesome, awesome book that really opened my eyes (as a 24 year old) as to just how far things had come.



Thank you. I have a better memory for trivial detail than sources!


I didn't say it would "destroy the gaming industry." I said, "It would destroy the gaming industry as we know it." Meaning that it could radically change the economics and the way we play and consume games.


I was initially thinking the Ouya would be cool. But then I actually thought about the console space and realized that what was true of my younger years is no longer the case. In particular, a NES was significantly different from a Windows or Unix workstation. The console was an entirely different platform/ecosystem, and in a time when consoles were reasonably complex compared to other computing platforms.

These days, we're literally running the same software we run on everything else, but in a little box that has an audio/video output and a port for a controller. When I realized that, I immediately realized that the console is mostly dead. The only case where this isn't true is where performance characteristics are consistent: this is why development on platforms like the PS3 or 360 results in shorter dev cycles and higher-quality results, because the hardware is all the same. But that only matters when you're writing software that isn't shielded from the system; with Java, that's a non-factor, making the Ouya nothing special.

I believe the next Playstation, Xbox, and Nintendo will all have their merits -- high-end hardware that is consistent for years, which will allow developers to rapidly build games without having to concern themselves with the lowest common denominator (it's ridiculous to see software designed to run on a 512MB 1 core machine performing horribly on a 24GB machine with 6 cores, 2 GPUs, and 3GB of GPU memory because it was decided by someone that progressive enhancement of features would be too expensive a development cost, or for those high-end features to be completely non-optimized).

For me, this is packaging Java in yet another box that I have to buy, and I am summarily unimpressed and not excited. Why can't I just download an app and play Ouya games on my PC? That's a -1 for Ouya and a +1 for what Valve is doing with Steam.


The question is whether PCs are approaching the advantages of consoles from the other side. Are hardware requirements and iterations stabilizing? Are operating systems stabilizing? Are gaming APIs stabilizing?

I bought a gaming computer last summer and was quite impressed with it until its SSD died (I have to get it shipped to me and replace that SSD at some point). With an ordinary Windows 7 installation it ran a full gamut of emulators, ran the Source engine with the quality settings turned pretty far up, and ran everything else available on Steam with good quality, too. It also played DVDs and downloaded movies in high-def and with good sound, as well.

My real question is: over how many years of usage can I amortize the cost of that gaming PC? Because a lightly-used or "last year" gaming PC costs $600-$800, while a new one with top of the line hardware costs about $1000-$1200. If I can keep it for 6 years like I would with a console, the new console can match the one-year-old gaming PC for price, while the PC has general media functions, retains backwards compatibility via emulation, and gives me choice of peripherals.

Hmmm... but the traditional disadvantage of PCs was having to upgrade your hardware, operating system, and APIs continually to keep up with new features in the gaming world, whereas with a console you'd just drop $200-$300 every 4-6 years for the new system. With a PC, upgrading the graphics card, motherboard or the hard drive might easily cost that much, depending on just how up-to-date you keep it.

Seems to me there's a space for a "Ship of Theseus" model of PC, where the cost of hardware upgrades made every few years can approach the cost of a new console with the same frequency while retaining backwards compatibility.
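
Crunching those numbers (taking the top of each price range and amortizing everything over six years):

    // Rough cost-per-year from the figures above.
    public class Amortize {
        public static void main(String[] args) {
            System.out.printf("console: ~$%.0f/yr%n", 300.0 / 6);   // $300 every ~6 years
            System.out.printf("used PC: ~$%.0f/yr%n", 800.0 / 6);   // "last year" gaming PC
            System.out.printf("new PC:  ~$%.0f/yr%n", 1200.0 / 6);  // top-of-the-line build
        }
    }

So roughly $50 vs. $133 vs. $200 a year, before counting the PC's mid-life upgrades or crediting its extra uses.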


I'm struggling to understand what you are saying here, to the point where I wonder if you've missed the biggest factor for the Ouya, which is this:

It's part of a (potentially, and to a degree actually) huge Android ecosystem, with the same games running on phones, tablets, consoles, smart TVs, media centers, mini-PCs and netbooks.

Doesn't that change things?


The important part is the controller and potentially a market of games for it.

While it sounds nice to run the same game on a console and a touchscreen, I don't see that working too well in reality.


I think the problem is smaller than you think.

Some data points:

1. Many phone/tablet games compromise the controls due to the lack of button or stick controllers. They would actually be improved by a gamepad.

2. If a market exists then people will adapt the games to that market. It's not hard to imagine how many touch-only games could be altered in fairly minor ways to be D-pad friendly.

3. Developers will innovate new approaches to game input if the hardware is out there.


So you are saying that a game must have high-end graphics or else it only applies to the lowest common denominator?

What about the fact that this thing boots up faster than a PC, is dedicated to entertainment around the TV as consoles are, and is priced at $99? Call me a realist, but a PC can do a lot of the things my MP3 player can; I don't want to have to lug around my PC just so I can listen to songs.

Oh, and you do know Steam is also coming out with their own console, right? This only solidifies the notion that there is still a growing market for non-techie individuals who want entertainment in their living rooms. Personally I prefer crowding around a big TV when playing console-style games with friends. It's a bit hard to do that with one computer and one keyboard.


Is it wrong of me that I wouldn't mind turning one into a Linux system and not run Android or play games? (After all this is Hacker News...)


Well, even before I backed it I realized (a) I don't have time to play games and (b) it's been years since I played any game, so I'm not sure if I'll like it. So I just bought it with the idea of running a distro with SickBeard/SabNzbd/XBMC, just like I did with the Raspberry Pi, but that is just a tad too slow. So, no, I wouldn't say you're wrong. It's good to see people are still into this stuff.


An ODROID would probably be better for your purposes.

http://www.hardkernel.com/renewal_2011/main.php


Considering feature sets, an ODROID wouldn't be as cost-effective.


You can get Android on a stick for $24.


Why would you do that? The days of yearning for a small hardware platform on which one can install Linux are over. You can fit Linux on a computer that would fit in your wallet.


Why not? That's the question to ask. :)


> Why would you do that?

There's a tradition of putting operating systems on anything. FreeBSD runs on cameras and game consoles and all sorts of devices that did not intend the user to install an OS.

So it's just a fun gimmicky project.



Personally I would have totally bought the Ouya two years ago, when the average Android phone was still too weak to play anything more demanding than Angry Birds without choking on it. But today, Miracast + N4 + X360 controller and you have something that mops the floor with the Ouya. Game over.

But the biggest irony is that I would really dig this thing if it were a portable: touchscreen controls suck, there's no way around it. The Xperia Play was a fiasco and the Vita is going to die any day now. There are portable consoles running Android, but spec-wise they are all pathetic, and the quality of some is just subpar. If the Ouya were a GBA SP with a Tegra 3 it would be the best portable console ever made, and would blow any phone out of the water when it comes to gaming.


Very excited to get mine. It is funny, though: as I look at the OUYA, the Steambox, and even the GameCube of old, console cubes keep appearing, and I can't help thinking Jobs almost called it too early with the Mac Cube, a bit before its time and not targeting the right area of the house, while the GameCube did around the same time (http://en.wikipedia.org/wiki/Power_Mac_G4_Cube - probably based on the NeXTcube).

Hopefully the game controller for Apple TV isn't really an April Fools' joke, as the Apple TV is possibly close to being the next big console.


I think Roku is closer than Apple TV. It has a "wiimote"-style remote that can be turned sideways as a gamepad or used as a motion-capture remote. Not to mention they actually let people develop for the platform; they just need to offer something other than that terrible BASIC clone, BrightScript.


The opened-up controller looks kind of haggard. The controller in the image here looks more abused than my 6-year-old Xbox 360 controller; hopefully their production models are more durable than this one looks.


FYI, I submitted a game for the Ouya, and the system is not ready.

1) The dev SDK is not yet complete. Try figuring out how to bring up the in-game menu, or "pause", using the controller. It seems logical that's what the middle button should be for, but there isn't any sort of guidance on this in the API.

2) My game is developed using Unity and runs on Android (it's in the Google Play store). We got the game building using the official Ouya plugin, but without dev hardware our ability to test is extremely limited, so we do our best by plugging in an Xbox controller and making sure everything works. .... So we submit the game to Ouya, and they reject it citing "the game takes turns every one second after starting, and the music keeps playing after exiting the game"..... Since my game doesn't actually have "turns", and we use Unity as an engine so don't actually do anything special for exiting, I emailed them back asking for clarification ... and got no response.

Rant: they named the controller's buttons O, U, Y, A. wtf, seriously? They couldn't use ABXY?
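
For what it's worth, the ODK does expose those buttons as named constants. A hedged sketch of handling them in an Activity, based on the OuyaController class as the ODK documents it (the jump/pause handlers are made up for illustration):

    import android.app.Activity;
    import android.os.Bundle;
    import android.view.KeyEvent;
    import tv.ouya.console.api.OuyaController;

    public class GameActivity extends Activity {
        @Override protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            OuyaController.init(this);  // required before handling controller input
        }

        @Override public boolean onKeyDown(int keyCode, KeyEvent event) {
            switch (keyCode) {
                case OuyaController.BUTTON_O:   // bottom face button ("A" on an Xbox pad)
                    jump();                     // hypothetical game action
                    return true;
                case OuyaController.BUTTON_U:   // left face button ("X" on an Xbox pad)
                    pause();                    // hypothetical game action
                    return true;
            }
            return super.onKeyDown(keyCode, event);
        }

        private void jump() { /* ... */ }
        private void pause() { /* ... */ }
    }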


I can't wait to receive my Ouya, as a $99 backer.


I didn't back this project, but it's really exciting to see that they have shipped the Ouya on time. Much props to the Ouya team and all involved!


You can lead a horse to metered water but you can't make him put a quarter in the booth.



