Mining started out with CPUs, then moved to GPUs, then FPGAs, and now ASICs. What's the next technology to make mining an order of magnitude or two more efficient? D-Wave computers, perhaps? Those cost quite a bit, though.
I'm curious as to where exactly this guy got the funding from. ASICs are not cheap. Even at a µm-scale process node, it would still probably come out to around $1M.
They got money from pre-orders (300 units at $1,300 each). The Avalon team published their partially redacted contracts with TSMC, which showed that the NRE costs were much lower: about $200-300k total. And their chips are fabbed at 110nm.
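A quick back-of-the-envelope check on the figures above (300 pre-orders at $1,300 each against a reported $200-300k NRE cost) shows the pre-order revenue alone covers even the high end of the NRE estimate:

```python
# Figures taken from the thread above.
units = 300
price_per_unit = 1300  # USD per pre-ordered unit
preorder_revenue = units * price_per_unit

nre_low, nre_high = 200_000, 300_000  # reported TSMC NRE range, USD

print(preorder_revenue)               # 390000
print(preorder_revenue >= nre_high)   # True: pre-orders cover the high NRE estimate
```

That leaves roughly $90k of headroom over the high estimate before counting per-wafer production costs, which are billed separately from NRE.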