'Universal memory' research passes new milestone (techxplore.com)
102 points by lelf on Jan 21, 2020 | 34 comments



This is probably a little better: "Lancaster University shows how InAs/AlSb resonant-tunnelling non-volatile memory consumes 100 times less switching energy than DRAM" http://www.semiconductor-today.com/news_items/2020/jan/lanca...

Sigh, it's a III-V semiconductor. We've been here before--and failed before. I'll believe it when someone actually coughs up an array of these that can hold at least a couple of bytes.

However, to be fair, we might actually start getting some progress outside of silicon now that Moore's Law has broken down. If silicon is at a standstill, other technologies may actually be able to catch up for once.


Original paper (Abstract): https://ieeexplore.ieee.org/document/8948343

I don't have access to the full paper, but I note this sentence of the abstract: "Simulations show that the device consumes very little power, with 100 times lower switching energy per unit area than DRAM, but with similar operating speeds."

So perhaps no actual devices exist, only a concept and simulations?

Edit: I am reading the original via sci-hub and yes, all conclusions are based on (quite elaborate) device models.


Is this not typically (read: always) the case with new technologies, though? Before any production or engineering can be done, a model must be created. The fact that a model exists strongly implies that it is physically possible.


No, as a matter of fact it's extremely rare that a model precedes something physical that makes it to the corporeal world. Especially in device physics land.


My PCBs existed in design software before they were manufactured. I expect that is the more common case today.

I ran my code against a dev database before it went to prod.


The GP isn't talking about products but about the exploitation of physical phenomena. The first PCB was etched with the ancestor of our modern fab process long before CAD software existed. Although the hydrogen theory of acids and bases was discovered a few decades before, AFAIK there still isn't a unified model of acid-metal interactions for different copper alloys, let alone all metals.


Just as a note, you don't need Sci-Hub for this; the link above has the full text because it's open access.


Sci-Hubify bookmarklet for super quick access: https://bookmarkify.it/9864


I’m skeptical of any ‘best of all worlds’ claim for a nascent technology. In this case, durability and read/write speed in one device would indeed be revolutionary, but at what cost? If I were to guess, I’d say it would be around data integrity. If it works using resonance, could neighbouring bits be flipped sympathetically under certain circumstances? Would it have the same durability guarantees from server to mobile device? In environments with high EM radiation? Over millions of writes?

If there are indeed no trade-offs, hooray! Even if there are, it will still have use cases for many scenarios, albeit not ‘universal’.

More generally I wish sci/tech reporters would ask these kinds of open questions, or lead their readers into doing so. Otherwise it feels like vaporware as they presented it, even if the work is valuable and credible.


Haven't read the paper, but persistence and the energy (and therefore transition time) needed to write, i.e. to change a physical state, if not to read, seem like a pretty fundamental tradeoff pair that can't be broken.


> Otherwise it feels like vaporware as they presented it, even if the work is valuable and credible.

Whatever happened to all the graphene hype from 5 years ago? For a while, graphene was the wonder material that was going to cure cancer, give us fusion and make us immortal. Everything was supposed to be made of graphene by now. And then suddenly people and the media stopped talking about graphene.


That progress is happening, and I see it every day around the academic labs I work at. There are improvements in manufacturing larger sheets, it is used as an important step in various nanofabrication protocols, and it has inspired a whole family of more interesting 2D materials.

I suspect the hype died for the same reason it does with any important new discovery: it is overhyped in the beginning by excited scientists because they love what they are working on, then the work continues mostly silently for decades, and then people just do not notice when it is part of the everyday tech we use.


Hush your mouth! Graphene can do everything! (Except get out of the lab)


Aren’t we already at the point where a sufficiently large amount of RAM, with a UPS with just enough capacity to save everything to an SSD in case of power loss, is a sufficiently close approximation to universal memory?
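
A minimal sketch of that idea, assuming a UPS monitoring daemon (e.g. NUT) is configured to run a hook script when mains power fails; "save everything to SSD" then amounts to asking the OS to suspend to disk:

    # Hypothetical hook, invoked by the UPS daemon when line power is lost.
    # Hibernation writes RAM out to swap on the SSD, so the machine's state
    # survives the outage much like non-volatile memory would.
    import subprocess

    def on_power_loss() -> None:
        subprocess.run(["systemctl", "hibernate"], check=True)

    if __name__ == "__main__":
        on_power_loss()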


Yes, and there are few computers sold that don't have either a battery (phones and laptops) or a UPS and on-site generation (servers). Only desktops are affected, and they are the smallest segment by far.

What the world needs is cheaper DRAM, not non-volatile DRAM.


DRAM costs less than 4€ per GB. It's extremely cheap right now. I remember seeing prices above 7€ per GB a few years ago.


Explains why new phones have as much RAM (8GB) as MacBook Pro laptops.


Wouldn't the real use case here rather be battery-powered devices?

While RAM on a modern laptop will only use a few watts at most, which is little compared to other components under load, that still accounts for quite a big chunk of total power usage while most of the system is idle, such as during light usage with the display brightness on low.


The largest memory technology breakthrough I can imagine would be the ability to cost-effectively combine high-density RAM and logic in the same chip.

Bandwidth-intensive operations could happen inside a single RAM chip or between two of them. The CPU would just send instructions with parameters and no data.
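
A rough sketch of that interface, with an entirely hypothetical command format, just to show how little would need to cross the memory bus:

    # Hypothetical processing-in-memory command: the host describes the work,
    # the RAM module does it in place, and no bulk data travels over the bus.
    from dataclasses import dataclass

    @dataclass
    class PimCommand:
        opcode: str    # e.g. "copy", "sum", "filter"
        src: int       # source address inside the module
        dst: int       # destination address inside the module
        length: int    # element count only, never the elements themselves

    def issue(cmd: PimCommand) -> None:
        # On real hardware this would be a write to a command queue in the
        # module's MMIO window; printing stands in for that here.
        print(f"{cmd.opcode} {cmd.src:#x} -> {cmd.dst:#x} (n={cmd.length})")

    issue(PimCommand("sum", 0x1000, 0x2000, 1 << 20))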


Micron dabbled in this: the 2003 Yukon project (shut down in 2004) and again ten years later with the 2013 “Automata Processor”. All pretty much scrubbed from their website; http://www.micronautomata.com/research and https://www.micron.com/about/innovations/automata-processing now 404 or sit in permanent maintenance.

Tinfoil hat would say NSA contracts slurped all the product.


I just find it strange that they haven't put the CPU and memory on the same die; is there a specific reason for that? I mean, over the years more and more components have been integrated into the CPU - north/southbridge (I forgot what those were for), GPUs, etc. Why not RAM? There's L1/2/3 cache on there as well.


Different, incompatible fab processes (diffusion, temperature) are needed for logic and DRAM cells (capacitors).


I have often thought about that; even a limited ISA on RAM could help IMO, but people say it won't work. That said, a company made some PR not long ago about a smart memory module, so who knows.

Maybe a future of memprocessor arrays is waiting.


People always say "it won't work" about everything. They're right a lot, and wrong a lot, with not much predictive power. The trick is to just keep your own counsel on what could work or not.


true

I'm still struggling with the paradox of dreaming of improving a world, which can only happen if you ignore the world's intuition about your attempt.


One way of looking at innovation is through the lens of the efficient market hypothesis. If the market were 100% efficient there would be no opportunities, every idea that could currently work would already be exploited fully. However, the market makes mistakes. People miss opportunities, discount them, etc. Almost by definition then, innovation requires going against the intuition of the market/crowd. Phrased another way, opportunities only exist where most people think they don't. So to be an innovator, you must be a contrarian about the thing you're innovating. Once you realize that, there is no paradox. It could not be otherwise.


I get that, it's just the core human emotions behind it. Fighting against the thing you're trying to help feels odd. "Why should I care?" is something I often think nowadays. And often you end up caring not about improvement but about the pleasure of being the winner for a while and stashing profits.


I think that's fine. There aren't too many places where innovation truly makes the world a better place. I mean negative externalities and other exceptions aside, each innovation must make at least a small improvement to the world, or nobody would pay for it. But SV companies like to claim they're changing the world, when really their app/software is not such a big deal. Salesforce is not saving lives, but maybe it saves people some time and frustration.

If you can do Elon Musk style innovation, that's enviable because you could have a really strong intrinsic motivation to do that.

Most of us just settle for innovating a new job for ourselves that hopefully pays a bit better and is less hateful. I'd settle for that.


ULTRARAM

Sounds like something decided by committee.


With "ultra" as a preposition in Latin, it translates as "beyond RAM".

So the name does seem well thought out, marketing-wise.


Everybody would call it URAM, which is kinda okay.


Which will inevitably be confused with UDIMM


I'd vote for RAD. Random/Rapid Access Drive.


Oh I'd go for Transcendental Processing Storage - that way you could get a TPS report from the OS ;).



