Their technical details page seems to push eMMC pretty hard, but their prices, 64GB for $80 [1], seem really high compared to a microSD card. Can anyone with experience comment on whether the eMMC offers anywhere near enough value to justify its price?
Yes, it does. SD cards are not really designed to be used as a root filesystem. They are OK at streaming reads and writes, but once your access pattern goes non-sequential they slow down like you wouldn't believe. Which, yes, I know, is super weird for what should just be some glue logic around a NAND.
I was investigating corruption and performance problems with our root filesystem on an SD card. The cards would be great at obvious benchmark-style tasks, but when I recorded the block access pattern from our driver and replayed it on a Linux desktop with O_DIRECT, we'd not only get under 200KiB/s average access speed (which matched what we were seeing on the embedded system), but we would also get corrupted sectors (i.e. we'd write them just fine, but get an error back from the card when we went to read them) in less than a day of sustained writes (once again at only 200KiB/s). And this was with high-quality SanDisk cards that were then verified by SanDisk to be genuine (for a while we thought that clones had made their way into our supply chain). Using a weird-form-factor eMMC chip in an SD card package got rid of these issues. There are also "industrial" SD cards at around the same price that I suspect are basically the same thing.
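For the curious, the replay idea can be sketched in a few lines. This is a hypothetical minimal version, not the actual tool described above: it issues small writes at random offsets with O_DIRECT so the card itself, rather than the page cache, has to absorb every write (the filename and sizes here are made up for the demo).

```python
import mmap
import os
import random
import time

BLOCK = 4096
BLOCKS = 256  # keep the demo tiny; the real test ran for days
path = "replay_demo.bin"

flags = os.O_RDWR | os.O_CREAT
try:
    # O_DIRECT is Linux-only, and some filesystems (e.g. tmpfs) reject it,
    # so fall back to a buffered open rather than crashing.
    fd = os.open(path, flags | getattr(os, "O_DIRECT", 0))
except OSError:
    fd = os.open(path, flags)
os.ftruncate(fd, BLOCK * BLOCKS)

# An anonymous mmap gives the page-aligned buffer O_DIRECT requires.
buf = mmap.mmap(-1, BLOCK)
buf.write(b"\xab" * BLOCK)

offsets = list(range(0, BLOCK * BLOCKS, BLOCK))
random.shuffle(offsets)  # the non-sequential pattern is what hurts SD cards

start = time.monotonic()
for off in offsets:
    os.pwrite(fd, buf, off)
os.fsync(fd)
elapsed = time.monotonic() - start
os.close(fd)

print(f"{BLOCKS} random {BLOCK}-byte writes in {elapsed:.3f}s "
      f"({BLOCK * BLOCKS / elapsed / 1024:.0f} KiB/s)")
```

On a decent SSD this finishes instantly; pointed at a consumer SD card, the same pattern is where the sub-200KiB/s numbers show up.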
Most SD and μSD cards have embedded microcontrollers: http://www.bunniestudios.com/blog/?p=3554. I wonder how many of the corrupted-sector problems you were getting were really just bugs in their firmware? Or things that could be worked around with better firmware?
Yeah, it was almost certainly firmware bugs combined with already-marginal NANDs. I.e. the NANDs probably would have been fine if they were babysat a little better, but they were probably bottom of the barrel. That's kind of the SD card market in a nutshell, anyway.
That being said, we tried many different SD cards from quality vendors and found pretty heavy bugs and performance-related issues in all of the non-"industrial" versions.
So, yes, the problems don't have to be intrinsic to the form factor, but empirically you're more likely than not to have issues with them.
I bought an Odroid-U3, and the microSD card was pretty often the bottleneck, so I grew tired of it and bought a 64GB eMMC module, and it gets the job done. It is way faster than the best microSD, if that is what you were wondering.
Some eMMC modules claim 150MB/s read/write, i.e. the same speed as hard drives from about 5-7 years ago. Or, put another way, approximately twice as fast as an average USB3 flash drive.
I guess the configuration I'm benchmarking with, and the type of load (basically streaming, but through a journaled-but-otherwise-standard ext4), slows me down a little. But the convenience and standardness of this setup is worth more to me than a factor of 2 in performance (that's not my bottleneck at the moment).
eMMC is worth it. I have the ODroid U2 and am about to purchase the one at this link, and would not skip the eMMC for anything. The SD card bottleneck is real.
Just wanted to post that my interaction with (and purchases from) ODROID have been fantastic. They also have this really cool magazine they release from time to time:
It's a great platform to learn on, with no-fuss ordering (a little waiting), and the platform is cheap (they're one of the rare companies that has reduced the price of a product from one release to the next).
I wonder what kind of "license issues" are present in the current software?
"U-boot/Kernel/Linux source code will be released 15-Dec-2014. Android source code will be published in February after cleaning some license issues."
Looks like the biggest problem with most of these ARM boards (raspberry pi included) is the dependency on some closed-source binary blobs, at least for boot, and sometimes even the entire kernel is a closed branch off some old release.
I've never dealt with an ARM core that needed some magic object-only blob to boot. They're almost all using Denx U-Boot.
If the GPU needs an object loaded to boot, like the RPi, that's another story. But a decent SoC should be able to start without the GPU getting in the way.
It's really not quite fair to describe videocore as a GPU, it's a proprietary SIMD DSP architecture with a really awesome 2D register file.
I've joked before that the arm core on the rpi is really just there for power management ... and considering its share of the transistor budget, that joke almost sounds credible.
Correct. But this is just the first stage bootloader; the GPU boot ROM loads a second stage loader from the SD card, which then loads the kernel and launches the ARM straight into it. None of this is encrypted or signed as is usual on Android devices.
The license conditions do, however, forbid you from using the first-stage bootloader on anything other than a Raspberry Pi. That is, if you somehow manage to get hold of another board with the same Broadcom SoC that's somehow compatible with the bootloader, you're not allowed to boot it with it.
The Raspberry Pi GPU has been fully and openly documented now, so it should in theory be possible to boot it without a binary blob. You just have to reimplement enough of the graphics stack to make the CPU happy (?)
While your avoidance is understandable, given Broadcom's thoroughly disreputable history with regard to free software, it's not justified in this case. The Pi GPU is apparently fully enough documented that you can write your own software for it.
I couldn't care less about the state of the GPU driver source and the typical complaints of RPi developers. Open, closed, whatever.
I'm more skeptical of a SoC architecture that has the CPU dependent on the GPU for its startup sequence. Or is this a matter of the GPU controlling the ARM's clock tree?
Most "bigger" SoCs have a secondary processor for SoC bringup/boot. (It's not uncommon to see a tiny ARM7 or something in this role). Since Broadcom succeeded in making a particularly general purpose core for their GPU (it seems that the GPU runs a full RTOS written in C), it only makes sense (well, to me at least) that it could take over that role as well to save on the transistor budget.
Anybody make something similar with dual ethernet? I'd like to replace my aging Alix-based home router/vpn (http://www.pcengines.ch/alix2d3.htm) with something like this.
If 100M is enough for you, I'm developing a new board with 2 RJ45 ports for industrial ethernet, which has a 40-pin connector compatible with that of the RPi B+, but the circuit is adapted from the BeagleBone Black. It's scheduled for the end of January 2015.
So, $315 and ~15W max. I bet this could serve a small office well. Someone should get this down to $200 with minimal specs (and no video or audio).
It has dual Realtek NICs; it runs pfsense and openVPN well, and you can also run squid and snort if you're into it (I haven't learned how to use them yet, but I plan to).
The mirabox has dual ethernet and arch linux support is good. The box is a little expensive and I have never got wifi to work but as a router/firewall it does the job:
The Novena board looks pretty shiny for this. Mine should come in any day, and I plan to at least try it out for this purpose. Unfortunately it isn't dual gigabit though, so it'll bottleneck my internet a little bit.
It would be interesting to know whether this ODROID is better or worse than a device like the Cubox [1] or CuboxTV [2].
They are more expensive, but they include a case, power supply, etc.
I've heard that software support (hardware drivers and other things) for these lower-volume devices is sparse. Does anybody know more? I assume someone has ported Linux to them, but what about media server software (like XBMC/Kodi) and other stuff?
This should run most of the existing Ubuntu and Android images without any hassle. Having owned an ODROID-U2 for a long while now, I can tell you that these things run Ubuntu 14.04 like a dream (except for the lack of OpenGL - you only get ES) and Android works great:
(Mine is now running Ubuntu only, and the only gripe I have is that the kernel currently lacks the cgroup support needed to run Docker - that's only a recompile away, but I like my uptime...)
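If you want to see what's missing before committing to a recompile, a rough check (assuming a Linux box with /proc/cgroups; the set of controllers listed here is my guess at what Docker wants, not an official list) looks like this:

```python
# Controllers Docker typically relies on (assumption, not an official list).
NEEDED = {"cpu", "cpuset", "memory", "devices", "freezer"}

def enabled_cgroup_controllers(path="/proc/cgroups"):
    """Parse /proc/cgroups; columns are: name, hierarchy, num_cgroups, enabled."""
    enabled = set()
    try:
        with open(path) as f:
            for line in f:
                if line.startswith("#"):
                    continue
                name, *rest = line.split()
                if rest and rest[-1] == "1":  # last column is the enabled flag
                    enabled.add(name)
    except OSError:
        pass  # not Linux, or /proc not mounted
    return enabled

missing = NEEDED - enabled_cgroup_controllers()
print("missing controllers:", sorted(missing) or "none")
```

Anything it reports missing corresponds to a CONFIG_CGROUP_* option to enable in the kernel config before rebuilding.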
I have the Odroid XU3 (which I think runs the same version of their Ubuntu). XBMC/Kodi runs on it, but I get an intermittent crash when fast-forwarding a movie. VLC works perfectly. Chromium and Firefox run superbly. I was unable to get Flash working, so Amazon Instant Video and HBO Go don't work. Wireless with the $8 USB dongle they sell works perfectly.
I'd love to be able to do that kind of thing on the ODROID-C1, especially since its GPU sounds like it's a lot faster! I see that its GPU is designed by ARM rather than Broadcom, and there's a reverse-engineering effort, http://limadriver.org/, that has produced something of a GL implementation for these GPUs, with Quake 3 Arena already running on it, and faster than the binary driver, though not yet playable: https://libv.livejournal.com/23886.html. But that effort seems to have stalled last year. It also sounds like it was, at the time, limited to working from reverse engineering rather than official documentation (https://archive.fosdem.org/2013/schedule/event/operating_sys...), and while ARM has a lot of development documentation on their Mali site, http://malideveloper.arm.com/develop-for-mali/sample-code/, none of it seems to cover the GPU itself, only the OpenGL ES implementation they've written for it.
So what's the deal? Is the GPU really actually totally undocumented officially, with the only available information being a dead open-source project that produced a half-complete free-software OpenGL implementation for it by reverse engineering? Or is there more stuff out there I'm missing?
(In any case, it's pretty incredible that in 2014 you can already buy an 8-processor single-board computer that you can program for US$35.)
It's amazing running Android. The drivers are all there and it can do 1080p playback no worries. Netflix for Android runs perfectly. So does every emulator going from the N64 generation back. Easily the best box short of a media PC to connect to your TV at the moment.
It's not as good with standard Linux. The graphics drivers aren't there at all. 1080p playback doesn't really work. Example link you can read up on yourself- http://forum.odroid.com/viewtopic.php?f=83&t=3214
I think you may have replied to the wrong comment, since what you said has nothing at all to do with what I was asking or saying in the comment of mine to which you attached it.
The SoC seems to use Cortex-A5 cores, while many of the other boards around use Cortex-A9. Looking at the comparison between the two, that may have been one of the trade-offs made, as the A5 targets lower-end devices and has less performance than an A9 at the same clock rate and core count.
It's not necessarily bad, but good to know for better comparison.
I'm sure there are other advantages to AArch64, like a larger set of registers, that could still make it worthwhile. At the very least it might push down the prices on the old armv7 CPUs.
Is it possible to buy a tablet display without the actual tablet computer? I want something inexpensive and lightweight and efficient to pair with this for display and a regular lcd monitor isn't that.
OK, yes, that's it, except I just bought a quad-core tablet and it only cost like $150, so the screen by itself should not be $120. I mean, you can buy tablets for $50.
and a 3D hardware acceleration... this thing is so cool.
For $35 you can get a really decent computer. Makes it really viable for people to run a small unix dev machine or buy a bunch of these to run a miniature server farm.
I'm thinking of upgrading my Pi cluster (http://github.com/rcarmo/raspi-cluster) to a set of these boards. Mostly waiting for some benchmarks to come in and the holiday season to go away...
According to the Technical Detail tab: "5V2A DC input: This is for 5V power input, with an inner diameter of 0.8mm and an outer diameter of 2.5mm. The ODROID-C1 consumes less than 0.5A in most cases, but it can climb to 2A if many passive USB peripherals are attached directly to the main board."
Yes, the higher performance is rather appealing. Of course, one of the reasons for the success of the Raspberry Pis is the low energy consumption, but sometimes it is just too slow. A little headroom would really be good for media centers, or if you want a small server solution.
Of course, best would be a device with low power consumption when there is less work to do, which can ramp up to full power when needed. Multiple cores would be good for that, if the device could deactivate unneeded cores or even reduce the clock speed.
I am looking forward to more power-saving devices, since servers which run all day should consume less than they do today.
There might be an opportunity for lower energy use if the SoC has better dynamic power capabilities; and even without them, it can finish its work earlier and go back to deep sleep for longer.
[1] http://hardkernel.com/main/products/prdt_info.php?g_code=G14...