
Not a comment on the Ouya specifically, but we live in an odd moment in computing history when many of us carry powerful computers with us everywhere we go, but connecting them to our existing peripherals is difficult enough that we'd rather just buy another with the right plug.

I wonder how long this will last.




Connectors and physical controllers remain expensive and difficult to make. "Computers" just fall to the base price of etching a circuit board.

This is the new normal. The connector is the valuable part.


This captures it. The cost to add the transistors to put an ARM CPU in silicon is < the cost of the metals in the USB and HDMI connector you use to talk to it.

I gave a talk on the Internet of Things where I tried to communicate this point clearly: for more and more applications the marginal cost to add a computer is nearly zero and the marginal cost to add a network is about $0.35, so a lot of things that wouldn't have had networks or computers in them before suddenly do.

Back when Sun Micro was trying to get everyone in the Java group to write up patents on anything they could think of, James Gosling, in what was a great comment on the process, wrote up one for Java inside a light switch. He reasoned it was the most ridiculous patent ever, since a light switch from Home Depot was $1.50 and there was no amount of "coolness" you could add with Java that would merit building a light switch with its own processor. What he didn't count on was that the cost of the computer that could run Java would drop below the cost to make a mechanical switch.


> The cost to add the transistors to put an ARM CPU in silicon is < the cost of the metals in the USB and HDMI connector you use to talk to it.

This statement amazes me. It may be true, but it is very counterintuitive to my experience.

A micro USB connector is about $0.35/unit (for quantities of 1500). An HDMI connector is about $0.50/unit (similar quantities). These prices are from Digikey, so in larger quantities from larger distributors you could certainly get cheaper prices. The cheapest ARM processor (Cortex-M0) is about $0.78/unit on Digikey.

I understand your point is about licensing the ARM design and integrating it into your silicon, but most low-cost devices I've seen have just been PCBs that integrate these off-the-shelf components. I would imagine the cost of hiring engineers to do the VLSI design/integration, the cost of licensing the ARM CPU, and then the cost of fabricating/testing your silicon would exceed the cost of integration on a PCB, so I would assume the processor, while certainly cheap, is still a very substantial portion of the cost of the device. But I have no data on the cost of ARM licenses to back that up. You're definitely right in asserting the trend is cheaper and cheaper processors, but I don't think we've arrived at the "processors are so cheap they're basically free" world quite yet.
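
To put rough numbers on that off-the-shelf comparison, a throwaway calculation using only the prices quoted above (the rest of the BOM is unknown, so this only compares the connectors against the CPU):

    # Digikey prices at ~1500-unit quantities, as quoted above.
    usb_micro = 0.35
    hdmi = 0.50
    cortex_m0 = 0.78
    connectors = usb_micro + hdmi
    share = cortex_m0 / (cortex_m0 + connectors)
    print(f"connectors: ${connectors:.2f}, cpu: ${cortex_m0:.2f}, cpu share: {share:.0%}")
    # connectors: $0.85, cpu: $0.78, cpu share: 48%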


The difference is between what you and I can buy and what can be done in the world of building semi-custom ICs for your consumer gear.

The ARM7TDMI core is about 100K transistors, that is nominally a square of 316 x 316 transistors, which with a 22 nm process is about 3 microns square. The marginal cost to add 3 square microns to a chip is very nearly 0; as an example, the 'test feature'[1] on a chip at the company I worked at in 2000 was 18 square microns of "wasted space" in the final chip. I say it's 'nearly' zero because while the cost to produce the chip doesn't change measurably, the yield curve does, and the 'cost' is the chips that fail due to this extra core not functioning.

So if you make a consumer electronics gizmo in quantity and it has a semi-custom chip on it, adding a computer to that chip these days won't make your semi-custom chip that much more expensive, and by having a programmability aspect you can add features without re-spinning the chip.

As for the cost, TSMC offers "add an ARM core" to your ASIC as a design service. I've not been part of a wafer-start negotiation for over a decade, but it would not surprise me in the least if they offer to throw that in for free these days to sweeten the deal.

[1] The "test feature" is a part of the chip that the fab uses to verify the wafer processing worked correctly, it generally can be probed to with a simple voltage or current pulse to quickly screen out die which didn't get baked correctly.


That doesn't sound right: 22 nm * 316 = 69 microns, giving 69^2 == ~4800 square microns; a transistor is larger than the feature size, and don't wires take the majority of the area? (Still, that's well under a square millimeter.)

So we haven't quite reached the day Eric Drexler hinted at with "so-called microcomputers still visible to the naked eye".


Nice catch; in my haste I was dividing when I should have been multiplying. Typical transistor size is 4x feature size, so 316 * 2 * .022 ~ 14 microns, and squared it's ~ 200 square microns.
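
For anyone following along, the corrected arithmetic as a quick script (same rough assumptions as above; wiring and everything else around the core excluded):

    # ~100K transistors laid out as a 316 x 316 square, each transistor taken
    # as roughly twice the feature size per side (the rule of thumb above).
    transistors = 100_000
    per_side = transistors ** 0.5        # ~316 transistors per side
    feature_um = 0.022                   # 22 nm process
    side_um = per_side * 2 * feature_um  # ~13.9 microns per side
    area_um2 = side_um ** 2              # ~193 square microns
    print(f"core area ~ {area_um2:.0f} square microns")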

That said, we are still talking about an incrementally small addition to a chip.


Oops, I misplaced a decimal myself. That's amazingly small (still without wiring overhead, of course).


Do you have a link to the video of your talk?


Doesn't look like it; it was for an IEEE symposium in San Jose.


> This is the new normal. The connector is the valuable part.

Right, and I read this as "the connector is the next thing to go". Someday (one hopes) all of this will be done by software radio, anything that needs to be really fast will be over a generic optical interconnect, and the idea of buying a particular computer because it has a bit of copper that mates with your particular display will be humorously antiquated... like steam engines, or cursive handwriting.


I don't think so ... it's going to be infinitely recursive in terms of "middle or connecting pieces that are the new bottleneck".

Some vested interest or IP complication will reassert itself into the same chokepoint as soon as (insert your future state here) comes true...


There's an active and competitive market for, say, CPU architectures that end users are completely (and gladly) unaware of. I just see the walls of the black box expanding, that doesn't mean there's nothing going on inside.


We're seeing the beginnings of this with Apple's Lightning connector. It's designed to be as flexible as possible, to the point where the AV cable that attaches to it has an ARM chip in it and runs (a very stripped-down) OS X. By moving the smarts out of the computer's port and into the cable, they make the port as simple and generic as possible.

I completely disagree on cursive handwriting though. I use that every day.


I'm not sure if your comment was meant to be satirical, but you just pointed out the big danger in the impending move to wireless connections.

> the idea of buying a particular computer because it has a bit of copper that mates with your particular display will be humorously antiquated

Is the next stage going to be everything playing nicely with each other as it should, or are we going to get a range of proprietary wireless standards that won't work with each other in the name of squeezing a few more dollars out of users?


Assuming you meant this for me? See my other comment about architectures. What you're describing sounds like the Macintosh with PowerPC; was it ever an advantage? Maybe, but then it became a liability, and they switched. It hasn't exactly gone badly.

I'm sure there will be growing pains, but users will place such a high premium on systems that work together that it will have to happen at a software level even if there's differing technology under the hood.


I think this is only the outlook of casual gamers or (no troll) Apple users who tend to only use Apple products within their ecosystem.

Android is very rich with Bluetooth controllers and micro-USB-to-HDMI adapters like this: http://www.amazon.com/SlimPort%C2%AE-SP1002-Connect-connecto...

As someone who would rather pay a little more for a real console, I really don't see the appeal of the Ouya.


Ouya has no appeal to anyone but kids and shovelware authors who think they'll make a lot of money making crap games. It's obviously a trial run to knock out bugs and hacks in their distribution system and tighten up the security on their custom ARM hypervisor. Then they'll kill the Ouya and license the hypervisor as their product, which they'll be able to sell to a large base of ARM Cortex-based device manufacturers as a field-tested DRM solution.


Many smartphones have HDMI out, and USB for a controller. It seems inevitable that they'll replace consoles (and desktops/laptops), especially with wireless controllers and HDMI.

But I think the significant thing is price: you cannot buy a comparable smartphone for $99. This will always be true, because the smartphone needs a display and a battery, has to be compact and light, and can't drain that battery too fast. What will change is that, eventually, smartphones that are good enough for games will be cheaper than $99 (or whatever price point is then relevant). Although batteries are not improving at Moore's Law rates, this point will probably be reached surprisingly soon.

Bold prediction: the next generation of Xbox/PlayStation/Wii will fail for this reason. (Smartphone GPUs will likely reach Xbox 360 levels this year, and the mainstream hasn't been demanding the more powerful GPUs of PCs, unlike in previous generations.)


Aside from that, even being able to plug things in doesn't guarantee a useful connection.

I can plug a tablet into a computer by USB, but it can't be a keyboard. Even though it contains a keyboard (it's a superset of keyboard components), it still doesn't have the software to do it. Nor can it be a USB sound card or a USB display.

(But it can be a second display over TCP/IP and that can be done over a USB-Ethernet dongle which it does support).
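
For what it's worth, on hardware that actually exposes USB device ("gadget") mode, the missing software is roughly this kind of plumbing. This is only a sketch assuming a Linux kernel with configfs and the f_hid gadget function enabled and run as root; the gadget name and strings are made up, and most stock tablets don't ship with any of this exposed:

    #!/usr/bin/env python3
    # Sketch: present a Linux device to its USB host as a HID boot keyboard
    # via the kernel's configfs gadget interface (needs libcomposite/f_hid).
    import os
    from pathlib import Path

    G = Path("/sys/kernel/config/usb_gadget/g_kbd")  # hypothetical gadget name

    # Standard USB boot-keyboard report descriptor (63 bytes, 8-byte reports).
    REPORT_DESC = bytes.fromhex(
        "05010906a1010507" "19e029e715002501" "7501950881029501" "7508810395057501"
        "0508190129059102" "9501750391039506" "7508150025650507" "190029658100c0"
    )

    def write(path, value):
        # configfs directories are created with mkdir; attributes are plain writes.
        path.parent.mkdir(parents=True, exist_ok=True)
        with open(path, "wb" if isinstance(value, bytes) else "w") as f:
            f.write(value)

    # Gadget identity (0x1d6b/0x0104 are the Linux Foundation composite-gadget
    # IDs commonly used in examples).
    write(G / "idVendor", "0x1d6b")
    write(G / "idProduct", "0x0104")
    write(G / "strings/0x409/manufacturer", "Example")
    write(G / "strings/0x409/product", "Tablet as keyboard (sketch)")

    # One HID function: boot keyboard protocol, 8-byte input reports.
    write(G / "functions/hid.usb0/protocol", "1")
    write(G / "functions/hid.usb0/subclass", "1")
    write(G / "functions/hid.usb0/report_length", "8")
    write(G / "functions/hid.usb0/report_desc", REPORT_DESC)

    # Bind the function into configuration c.1 and attach the gadget to the
    # first USB device controller (fails if the hardware exposes none).
    write(G / "configs/c.1/strings/0x409/configuration", "keyboard")
    (G / "configs/c.1/hid.usb0").symlink_to(G / "functions/hid.usb0")
    write(G / "UDC", sorted(os.listdir("/sys/class/udc"))[0])

    # "Typing" then means writing 8-byte reports to /dev/hidg0, e.g. tap 'a':
    with open("/dev/hidg0", "wb") as hid:
        hid.write(bytes([0, 0, 0x04, 0, 0, 0, 0, 0]))  # key down, usage 0x04 = 'a'
        hid.write(bytes(8))                             # release all keys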

Quite precisely what can be usefully plugged into what is a website I've considered creating for a long time.



