The future of electronics based on memristive systems (nature.com)
87 points by stealthcat on April 22, 2018 | 35 comments



I've just started a PhD on neuromorphic memristive systems at EPFL, so if someone wants to ask questions/chat about this I will do my best to serve. Here or via the contact info in my profile :)


I’m finishing a PhD on neuromorphic computing. If you want to ask me about it I will do my best to serve :)


Heh cool, what did you work on? What did you find the hardest? What do you regret not having time for/would you like to add to what you did?


I've worked on HW/SW codesign for mixed-signal chips, initially for convnets and image recognition, and later on a more general DL architecture. My advisor is the guy who "found the missing memristor" at HP in 2008.

Looking back now, I regret going into the HW field. I should have applied to CS rather than ECE and focused on DL algorithms (especially RL), or maybe even something like what Numenta is doing, because my primary interest is AI, not hardware to run AI.

The hardest part has been being the only DL/ML specialist in my group. Everyone else here is more HW-oriented. I only understood this when I did an internship where I worked with people I could learn from/discuss ideas with.

I looked at your CV, and my guess is that for you, the hardest part will be to focus on one thing for the next 4 years.


Thanks for sharing, and this:

>I looked at your CV, and my guess is that for you, the hardest part will be to focus on one thing for the next 4 years.

is an astute observation, although I've found more than enough things in my lab to keep my interest so far ^^"

Could you please link or mail me some of your publications? I'd greatly appreciate it.


Our group page: https://sites.google.com/site/strukov

If you'd like to chat more, send me an email.


Any recommendations for good profs/schools? Also pre-reqs to study before starting PhD?


Will 3D chip geometry be possible soon? Production, thermal balance, etc. Will that alone change the programming paradigm much?


Define soon... In research it is already working, I think, but AFAIK in commercial products it's at most used for memory and CMOS image sensors (which are amenable to it and where the density pressure is highest).

Getting production and thermal balancing right is really tough here, and remember that you need to scale it to insane production runs for it to work as an IC. But I'd not be surprised to see some breakthroughs in the next few years.

And I think memristives for memory will probably help, because they ease the thermal pressure if you manage to eliminate sneak currents. The material science/microengineering wizards in my lab are working on directly using the vias to make memristives, so obviously I already dream of having stacks of memory sandwiching compute layers... but this is still very much research!
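For intuition on why sneak currents matter, here's a toy back-of-the-envelope sketch (assumed resistance values, idealized two-state devices, no selector transistors):

    # Toy illustration of the sneak-path problem in a selector-less
    # memristive crossbar. Resistance values are assumed for illustration.
    R_HRS = 1e6    # high-resistance state (stores a 0), in ohms
    R_LRS = 1e4    # low-resistance state (stores a 1)
    V_READ = 0.2   # read voltage in volts

    # Current through the selected cell when it stores a 0:
    i_cell = V_READ / R_HRS

    # Worst-case sneak path: three LRS cells in series, through a
    # neighbouring row and column, conduct in parallel with the cell.
    i_sneak = V_READ / (3 * R_LRS)

    print(f"cell:  {i_cell:.1e} A")   # 2.0e-07 A
    print(f"sneak: {i_sneak:.1e} A")  # 6.7e-06 A -- swamps the signal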


Which is the most plausible application of neuromorphic memristive systems that has the potential to reach the mass market with minimal hassle?


The one I'm working on for my PhD thesis ;-)

No, seriously: these guys here with their DVS: https://inivation.com/

They are already on the market. It's not compute, but it is an awesomely HDR, low-latency image sensor which gets you a motion gradient for free. The downside is no colour and no static images, but as a supplementary chip for stabilization, for robotics, or as a low-power "wake up" camera, and all those applications, it is in my opinion already worth considering. And if the software and algorithmic side of image processing catches up to event-stream-based sensor data, this will be quite awesome.
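To make "event-stream-based sensor data" concrete, here's a toy sketch; the (x, y, timestamp, polarity) tuple format is an illustrative assumption, not iniVation's actual API:

    import numpy as np

    H, W = 128, 128  # sensor resolution (assumed)

    # A DVS emits per-pixel brightness-change events instead of frames:
    # (x, y, timestamp in seconds, polarity: +1 brighter / -1 darker)
    events = [(64, 40, 0.001, +1), (65, 40, 0.002, +1), (30, 90, 0.003, -1)]

    # Accumulating events over a short window yields a signed "motion
    # image" essentially for free -- static regions produce no events.
    frame = np.zeros((H, W), dtype=np.int32)
    for x, y, t, pol in events:
        frame[y, x] += pol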

Disclaimer: I'm acquainted with some of the people who worked on this and think they are all quite lovely people, so I might be biased.


How high is the dynamic range? It would be great to see some examples; there's nothing on the site.



skummetmaelk: see their examples. What I should have said is that the HDR is for the event sensors, which are black/white in the direction of the event (i.e. motion dark => bright or bright => dark). But if your use case is motion detection or tracking, that is enough as far as I'm concerned.


How, if at all, does the memory device in Intel's Optane XPoint memory relate to neuromorphic systems? Is it a commercial proof of concept for a device that, with relatively straightforward modification, could support in-memory computing in the future?


Based on what we know about the phase-change memory in Optane products, it does not seem like it can be used for in-memory computing.


Do you see new programming paradigms emerging from this?


I am actually working partially on this... I wouldn't call it new paradigms, but dataflow, probabilistic computing, and parallel-by-default paradigms like VHDL or CUDA would be my best guess, if the wildest dreams come true and we actually reroute physical chips to make new models. But only in the lower layers (similar to how XLA already abstracts this away for you in TensorFlow).

In the end, "all" I see happening in the near future is:

* ISA extensions to get accelerators similar to the ones we have for crypto/RNG and especially vectorization (crossbar matrix multipliers <3, see the sketch below)

* custom chips being made using the standard custom/semi-custom design flows, just including memristive and maybe neuromorphic cells

* maybe something akin to FPGA programming, especially since Intel might be integrating their newly acquired Altera FPGAs into consumer devices: https://newsroom.intel.com/news-releases/intel-completes-acq...
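For the crossbar point above: the reason analog crossbars are exciting is that Ohm's law plus Kirchhoff's current law compute a matrix-vector product in a single analog step. A minimal numerical sketch of the idealized model (no sneak paths, no noise, made-up conductances):

    import numpy as np

    # Device conductances at each crosspoint, in siemens (illustrative).
    G = np.array([[1.0, 0.5],
                  [0.2, 0.8],
                  [0.3, 0.1]])

    V = np.array([0.1, 0.2, 0.3])  # input voltages applied to the rows

    # Each column wire sums the currents G[i][j] * V[i] of its devices,
    # so the output current vector is a matrix-vector product:
    I = G.T @ V
    print(I)  # [0.23 0.24]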


Do you see any applications of neuromorphics enabling lower-power, massively parallel molecular simulations, e.g. for protein folding?


I don't know enough about protein folding to say anything authoritative about that. In general: it depends. If it has a memory bottleneck and the computation can be expressed as matmul/matadd, then I can see memristives as accelerators. Neuromorphic stuff... my gut would say no, unless you can map it into some domain it is good at.


What should I study first before starting a PhD in neuromorphic computing?

If it matters, I graduated with an EE degree but am now doing a Master's in CS (Machine Learning).


You are doing about as well as I did... I'm full EE, but worked as a software and AI/ML freelancer. Do some research on labs (there aren't that many) and then develop towards that.


I am wondering what you did previously that got you into neuromorphic computing. I am currently an undergraduate in computer engineering, but the idea of neuromorphic computing is really interesting and I am curious what the academic path looks like.


I did electrical engineering and attended a lecture by our neuroscientific systems prof https://www.nst.ei.tum.de/team/jorg-conradt/ and it kind of developed from there, since I was interested in AI and ML as well (I worked as a freelancer).


What does the neuromorphic part refer to? Is that a specific subset of applications of the tech, or something inherent to memristor-based computation?


Neuromorphic engineering is the ~buzzword that pays for my grant~ a loosely defined field that was basically started by Carver Mead in the late 80s: https://en.wikipedia.org/wiki/Neuromorphic_engineering

It's basically doing for compute what biomimetics https://en.wikipedia.org/wiki/Biomimetics already did for materials and robots (Festo has some amazing stuff here: https://www.youtube.com/watch?v=7-JvyzOddTM): looking at systems in nature when designing VLSI circuits. Examples would be looking at the eye for camera sensors, the ear for audio, the brain for computation, etc.

So it's an orthogonal field that happens to marry nicely with some of the properties of memristives, in that both with them and in the brain you need to account for noise, false firings, etc. It also gets people (like me) excited because it opens up a way to do routing and synaptic plasticity, which has been missing from ML ASICs so far.
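As a concrete example of the kind of plasticity rule people want in hardware, here is a minimal pair-based STDP sketch (a standard textbook model with assumed constants, not tied to any particular memristive chip):

    import math

    A_PLUS, A_MINUS = 0.01, 0.012   # learning-rate amplitudes (assumed)
    TAU_PLUS = TAU_MINUS = 20e-3    # time constants in seconds (assumed)

    def stdp_dw(t_pre, t_post):
        """Weight change for one pre/post spike pair."""
        dt = t_post - t_pre
        if dt >= 0:  # pre fires before post -> strengthen (potentiation)
            return A_PLUS * math.exp(-dt / TAU_PLUS)
        else:        # post fires before pre -> weaken (depression)
            return -A_MINUS * math.exp(dt / TAU_MINUS)

    print(stdp_dw(0.010, 0.015))  # pre -> post: positive update
    print(stdp_dw(0.015, 0.010))  # post -> pre: negative update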


The last paragraph is everything!

"We conclude by noting that biology has always served and will continue to serve as a great inspiration to develop methods for achieving lower-power and real-time learning systems. However, just as birds in nature may have inspired modern aeronautics technology, we eventually moved in new directions and capabilities for faster travel, larger carrying capacities and entirely different fuelling requirements. Similarly, in computing, modern application needs to go beyond those faced in nature, such as searching large databases, efficiently scheduling resources or solving highly coupled sets of differential equations. Interestingly, some of the observed characteristics in memristors may similarly provide ‘beyond biology’ opportunities in computing, taking advantage of the novel device dynamical behaviour and the network topology inspired by biology. In this regard, concepts such as the memory processing unit represent truly exciting opportunities down the road. To achieve these and other new computing systems of the future will require persistent and creative research that goes beyond any single discipline, and must include insights from neuroscience, physics, chemistry, computer science, and electrical and computer engineering, among others."


I don’t understand. Can someone please dumb it down one setting on the dial? What makes this obviously superbiologically capable?


The obvious things are size, speed and integration with existing computers.

A neuron is measured in micrometers, a memristor in nanometers (and the paper suggests even single-atom devices may be possible).

A neuron can fire at most a couple hundred times per second; a memristor has "subnanosecond switching speed", i.e. it operates in the usual GHz operating frequency range of modern computers.

And you can integrate them at the circuit level, so they can draw on the superbiological capabilities of existing computers at native speed.


Downsides:

* routing is a BITCH

* cooling is a BITCH

* noise is a slightly nicer BITCH

* we are very early in building real stuff with this

But yeah, it's cool, and I hope it works out (otherwise I chose a dead end/false start for my PhD :-)


You should be comparing memristors to synapses, not to neurons.


Beyond replacing current transistor logic, I find it theoretically pleasing that memristors fill the gap, completing the pairwise linking of current, voltage, charge, and flux alongside the existing three fundamental passive devices. What would basic circuit analysis lessons look like 10 years after they become commonplace?
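For reference, the four pairwise constitutive relations behind that argument (Chua's 1971 symmetry picture):

    \begin{aligned}
      \mathrm{d}v       &= R\,\mathrm{d}i  &&\text{(resistor)}  \\
      \mathrm{d}q       &= C\,\mathrm{d}v  &&\text{(capacitor)} \\
      \mathrm{d}\varphi &= L\,\mathrm{d}i  &&\text{(inductor)}  \\
      \mathrm{d}\varphi &= M\,\mathrm{d}q  &&\text{(memristor)}
    \end{aligned}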


> What would basic circuit analysis lessons look like 10 years after they become commonplace?

Exactly the same as they do now. Memristors aren't linear time-invariant, so it's extremely unlikely that they would show up in a basic circuit analysis lesson.
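Concretely, a (current-controlled) memristive system in the Chua-Kang sense has a state-dependent resistance:

    v(t) = R\big(x(t),\, i(t)\big)\, i(t), \qquad
    \frac{\mathrm{d}x}{\mathrm{d}t} = f\big(x(t),\, i(t)\big)

Since R depends on the internal state x, which integrates the device's history, the element is nonlinear and history-dependent, which is exactly why it doesn't fit the LTI toolbox of a first circuits course.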


Not even inductors or caps show up in a basic circuit analysis course. But this quadriga of passive components was at least mentioned at the very beginning of the first EE semester, in the components class, which focused on the physical effects (and was notoriously hard).


The "future" should be in quotes. Articles have been selling these BS for 20 years now --- and we're gonna get the same articles for the next 30 years at least....



