Hacker News

It seems like the machine basically just needs to be able to decode video at the equivalent resolution (presumably with a software codec; In-Home Streaming probably doesn't leverage hardware MPEG decoders), and that's about it.

On my TV I use a 65W A10 in a mini-ITX case with a 90W PSU. It has on-chip graphics, an HD 7660D, and it handles streaming at 1080p without getting warm (my network, on the other hand...).
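To see why the network ends up being the bottleneck rather than the decoder, a rough back-of-the-envelope comparison of uncompressed 1080p60 against a typical encoded stream helps. (The ~15 Mbps figure below is an illustrative assumption, not Steam's actual bitrate target.)

```python
# Rough bandwidth math for 1080p60 streaming.
# The encoded bitrate is an assumed, typical value -- not Steam's setting.

width, height = 1920, 1080
fps = 60
bytes_per_pixel = 1.5  # YUV 4:2:0 chroma subsampling

# Uncompressed rate: what the wire would carry with no codec at all
raw_bps = width * height * bytes_per_pixel * 8 * fps
print(f"Uncompressed 1080p60: {raw_bps / 1e6:.0f} Mbps")  # ~1493 Mbps

# An assumed H.264 target for in-home streaming
encoded_mbps = 15
ratio = raw_bps / (encoded_mbps * 1e6)
print(f"Encoded stream: ~{encoded_mbps} Mbps (~{ratio:.0f}x compression)")
```

So even a gigabit LAN can't carry raw frames; the codec does roughly two orders of magnitude of work, which is why wireless links at 1080p get dicey long before the decoding CPU does.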

I've also used my laptop, an A6 with HD 8250 on-chip graphics. It does 720p with no problem (I don't know about 1080p; 720p is the panel's native res).

I expect you could go even lower, but if you're building an HTPC, the cost difference between an A6 and an A4 is pretty small.

Edit: if you wanted to go tight budget, you might even be able to use one of these: http://www.newegg.com/Product/Product.aspx?Item=N82E16819113...

Edit2: Apparently it does use hardware encode/decode. I think the most recent 2-3 generations of chips from both vendors all have that, but don't quote me.



FWIW, on my meager Atom/ION setup I had to turn HW decoding off in order for it to work properly.



