This is cool stuff. But you guys, the perception that there is no low-hanging fruit left is false, and it is perpetuated because people oddly obsess over frontier techniques and their combinations instead of innovating from first principles.
Just trying to be the voice of psychological health in this community.
AI is largely a research-dominated field, and as such it emphasizes novelty. Once the average developer starts using it, I think things will shift towards more practical concerns. We're already starting to see this with existing libraries and with ports to non-Python environments.
Interesting concept, but note this can't be used for training an LSTM network.
It's the training that involves far more computation and memory: backpropagation through time requires storing all past states and activations of the LSTM cells, rather than just their latest state.
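To make the asymmetry concrete, here is a minimal NumPy sketch (the cell implementation and all shapes are illustrative, not taken from the project under discussion). Inference only ever keeps the latest (h, c) pair, while training has to cache every timestep so backpropagation through time can walk back over them:

    import numpy as np

    def lstm_step(x, h, c, W, U, b):
        # One LSTM cell step: input, forget, output and candidate gates.
        z = W @ x + U @ h + b
        i, f, o, g = np.split(z, 4)
        sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
        i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
        c_new = f * c + i * g
        h_new = o * np.tanh(c_new)
        return h_new, c_new

    hidden, n_in, T = 8, 4, 100
    rng = np.random.default_rng(0)
    W = rng.normal(size=(4 * hidden, n_in))
    U = rng.normal(size=(4 * hidden, hidden))
    b = np.zeros(4 * hidden)
    xs = rng.normal(size=(T, n_in))

    # Inference: O(1) memory -- only the latest (h, c) is kept.
    h = c = np.zeros(hidden)
    for x in xs:
        h, c = lstm_step(x, h, c, W, U, b)

    # Training: O(T) memory -- every intermediate (h, c) must be cached
    # (plus gate activations, in a real implementation) so gradients can
    # later be propagated backwards through all T timesteps.
    h = c = np.zeros(hidden)
    cache = []
    for x in xs:
        h, c = lstm_step(x, h, c, W, U, b)
        cache.append((h, c))

The memory for the cache grows linearly with sequence length, which is exactly what a fixed-size inference datapath in hardware avoids.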
For forward inference this logic looks good, although I'm not clear which use cases it would apply to.
It makes much more sense to design ASICs for inference than for training. Any useful network will be executed for inference many millions of times more often than it is trained, and so inference will require much more computation in aggregate. Inference may also have to run in embedded environments, on battery power with no network connectivity, while training can usually run in the cloud.
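A rough back-of-envelope calculation (every number below is hypothetical, chosen only to show the scaling) illustrates why aggregate inference compute dwarfs the one-off training cost for a widely deployed model:

    # Back-of-envelope: aggregate inference vs. one-off training cost.
    # All figures are hypothetical.
    params = 10e6                        # a 10M-parameter network
    flops_fwd = 2 * params               # ~2 FLOPs per parameter per forward pass

    train_examples = 1e6                 # training set size
    epochs = 100
    flops_step = 3 * flops_fwd           # forward + backward is roughly 3x a forward
    training_flops = train_examples * epochs * flops_step

    deployed_queries = 1e12              # e.g. 1M devices x 1M queries each
    inference_flops = deployed_queries * flops_fwd

    print(f"training:  {training_flops:.1e} FLOPs")   # ~6.0e+15
    print(f"inference: {inference_flops:.1e} FLOPs")  # ~2.0e+19
    print(f"ratio:     {inference_flops / training_flops:.0f}x")

Even with these modest deployment numbers inference dominates by more than three orders of magnitude, and the ratio grows linearly with the number of queries served.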
At my university, both machine (and deep) learning and FPGA/chip design are specialties of the electrical engineering department. Earlier this year we launched an industry-oriented AI master's degree. Taking your extra classes in chip design would give you a solid foundation for this kind of work.
The master's program in AI I was referring to is only available in French (Laval University, Québec City).
Chip design expertise is available in most electrical engineering departments, through courses usually named VLSI design, FPGA/ASIC development, or microelectronics.
If you apply for a master's degree (in AI, for example), you can often mix and match specialty classes and ask for those chip design courses to be added to your curriculum.
If you are a hands-on person curious about the subject, you can buy an FPGA (~$50 for an entry-level board) and follow a Verilog or VHDL tutorial online. Put briefly, an FPGA is a chip that can be "rewired" at will, which is very useful for learning or for prototyping before building a production chip.
Never heard of one, if you ask me.
You have to customize the coursework to suit your goals. Basically, take courses on Machine Learning and Advanced Digital Electronics.