Design of CMOS-memristor Circuits for LSTM architecture (arxiv.org)
58 points by godelmachine on June 8, 2018 | 15 comments



This is cool stuff. But you guys, the perception that there is no low-hanging fruit anymore is false, and it persists because people oddly obsess over frontier techniques and their combinations instead of innovating from first principles.

Just trying to be the voice of psychological health in this community.


AI is largely a research-dominated field, and as such it emphasizes novelty. Once the average developer starts using it, I think things will shift toward more practical concerns. We're starting to see this already with existing libraries and with ports to non-Python environments.


I'm not that familiar with CMOS-memristors, and the paper is a bit light on background.

Is this just for inference, or for training as well?

Is it digital, or do the inputs take analog voltage levels and only then trigger like a neuron?

Can the network topology and weights be modified or is this fixed like an ASIC?


Interesting concept, but note this can't be used for training an LSTM network.

It's the training that involves far more computation and memory: rather than just storing the latest state of the LSTM cells, backpropagation through time needs to store all past states and activations of the cells.
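
To make that concrete, here's a minimal NumPy sketch (purely illustrative, not the paper's design; the weight layout and names are my own). Backpropagation through time has to keep the per-step cache, while inference only ever needs the latest (h, c):

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_forward(xs, W, H):
        # W: (4H, D+H) gate weights; xs: sequence of D-dim inputs.
        h, c = np.zeros(H), np.zeros(H)
        cache = []                                # needed for BPTT only
        for x in xs:
            z = W @ np.concatenate([x, h])
            i, f, o, g = np.split(z, 4)           # input/forget/output/candidate
            c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
            h = sigmoid(o) * np.tanh(c)
            cache.append((x, i, f, o, g, c, h))   # grows O(T) during training
        return h, cache                           # inference can discard the cache

Dropping that cache is exactly what makes a fixed inference circuit feasible.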

For forward inference, this logic looks good, although I'm not clear on which use cases it would apply to.


It makes much more sense to design ASICs for inference than for training. Any useful inference network will be executed many millions of times more often than it is trained, and so will require far more computation in aggregate. Inference may also need to run in embedded environments, on battery power with no network connectivity, while training can usually run in the cloud.
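
Some rough arithmetic makes the point; every number below is an assumption picked for illustration, not a measurement:

    fwd_flops = 1e9                    # assumed cost of one forward pass
    train_examples = 10_000_000        # assumed examples x epochs
    train_total = 3 * fwd_flops * train_examples   # backward pass ~2x forward

    queries_per_day = 50_000_000       # assumed serving load
    days_deployed = 3 * 365
    infer_total = fwd_flops * queries_per_day * days_deployed

    print(infer_total / train_total)   # ~1800x: inference dominates

Even with a generous training budget, a widely deployed model spends almost all of its lifetime FLOPs on inference.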


Are there any Master's degrees in AI + chip design?


At my university, both machine (and deep) learning and FPGA/chip design are specialties of the electrical engineering department. Earlier this year we launched an industry-oriented AI master's degree. Taking your elective classes in chip design would give you a solid foundation for this kind of work.


Would you kindly provide a link to that program at your university? Thanks :)


The master's program in AI I was referring to is only available in French (Laval University, Québec City).

The chip design expertise is provided by most electrical engineering departments, with courses usually named VLSI design, FPGA/ASIC development or microelectronics.

If you apply for a master's degree (in AI, for example), you can often mix and match specialty classes and ask for those chip design courses to be added to your curriculum.

If you are a hands-on person curious about the subject, you can buy an FPGA (~$50 for an entry-level board) and follow a Verilog or VHDL tutorial online. In short, an FPGA is a chip that can be "rewired" at will, which makes it very useful for learning or for prototyping before building a production chip.


Thanks for the info :)

I hope it will be available in English soon.


Never heard of one. You would have to customize the coursework to suit your goals: basically, take courses in machine learning and advanced digital electronics.


MIT has a course, I think.


Would you kindly post the relevant link?


I think it's an MIT only course, but this is a good starting link: http://eyeriss.mit.edu/


Any interesting algorithm eventually finds its way down into silicon.





