> a single atom of the element holmium carefully placed on a surface of magnesium oxide. A special-purpose microscope uses a tiny amount of electrical current to flip the atom's orientation one way or the other, corresponding to writing a 1 or 0. The researchers then read the data by measuring the atom's electromagnetic properties.
I'm not sure I could have recalled the existence of the element holmium; I've never heard or read much about it. I looked it up and found the likely reason it was used for this research:
"Holmium has the highest magnetic permeability of any element and therefore is used for the polepieces of the strongest static magnets." https://en.m.wikipedia.org/wiki/Holmium
I don't know if we'll see practical atomic storage or if more than one bit per atom is physically possible, but in theory there's enough space in an atom to hold millions of bits. But I think you have to get to black hole density... https://en.m.wikipedia.org/wiki/Bekenstein_bound
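For reference, the bound is proportional to both size and energy: I ≤ 2πRE/(ħc ln 2). Plugging in rough hydrogen-atom numbers (my own back-of-envelope, using the Bohr radius and a proton's rest-mass energy; not from the article) does land in the millions:

    import math

    hbar = 1.0546e-34      # J*s
    c = 2.998e8            # m/s

    R = 5.3e-11            # m, roughly the Bohr radius
    E = 1.67e-27 * c**2    # J, rest-mass energy of ~1 proton

    bits = 2 * math.pi * R * E / (hbar * c * math.log(2))
    print(f"{bits:.2e}")   # ~2e6 bits -- "millions", as claimed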
Just note that in practice nobody wants to keep their data-storing atoms frozen near absolute zero; they'd rather have them at a temperature close to 300 K.
However, to store 1 bit of information at a given temperature, the energy difference between the state corresponding to 0 and the state corresponding to 1 has to be no less than something of order kT (≈ 0.026 eV at room temperature); otherwise the information would quickly be erased by thermal motion. But if we take the maximum energy gap in an atom that might be used for storing information to be upper-bounded by the atom's ionization energy [1], it can't be larger than something of order 10 eV. So it doesn't seem possible to store more than hundreds or thousands of bits per atom at room temperature.
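For concreteness, here's the arithmetic behind those numbers (my own back-of-envelope):

    k_eV = 8.617e-5        # Boltzmann constant in eV/K
    T = 300                # K, room temperature
    kT = k_eV * T          # ~0.026 eV
    E_max = 10             # eV, rough ionization-energy ceiling

    levels = E_max / kT    # distinguishable levels, each separated by ~kT
    print(f"kT = {kT:.3f} eV, ~{levels:.0f} levels")
    # ~387 levels in one degree of freedom, or equivalently a few hundred
    # independent two-level systems if each gap only needs to be ~kT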
This is an interesting point because with the kind of matter density needed to even approach the Bekenstein bound, it seems like achieving near zero temperatures would be increasingly difficult.
Said another way, the Bekenstein bound is a limit based on the amount of information contained not just in a volume, but also with a given amount of energy. IANATP (I am not a Theoretical Physicist) but it seems like, according to the Bekenstein bound, lowering the temperature might reduce the theoretical amount of information available.
Anyway, yeah, the Bekenstein bound is purely theoretical; there is not, and probably never will be, a practical demonstration of it.
> I don't know if we'll see practical atomic storage or if more than one bit per atom is physically possible, but in theory there's enough space in an atom to hold millions of bits.
I think he means that physical space could accommodate such a number of bits before a black hole forms, not that we can tame an atom specifically to hold that information.
I thought he meant that you could encode more than just two states. For a very basic analogy, instead of just - and | representing 1 and 0, you could have - \ | / representing 00, 01, 10, 11, etc. Wifi does something similar with signal phase.
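To make the analogy concrete, here's a toy version of that mapping (my own illustration, loosely in the spirit of the QPSK scheme Wi-Fi builds on; nothing here is from the article):

    # Map bit pairs to four "orientations" (phases), the way the
    # - \ | / analogy packs two bits into one symbol.
    SYMBOLS = {(0, 0): 0, (0, 1): 90, (1, 0): 180, (1, 1): 270}  # degrees

    def encode(bits):
        # Pair up the bit stream and emit one phase per pair
        return [SYMBOLS[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

    print(encode([0, 1, 1, 1, 0, 0]))  # [90, 270, 0]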
While I could (and do) definitely have fun speculating that more than two states might be possible to represent - I can imagine a bunch of armchair physics possibilities - I didn't mean to suggest anything specific. The Bekenstein bound is only an idea; there's no known physical way to get even close.
Maybe ionization states, or bonds using multiple kinds of atoms, or use of radioactive elements, maybe something like that could be used to represent multiple states... I'm sure IBM & other labs are pushing to find out as fast as funding permits.
> I can imagine a bunch of armchair physics possibilities
Normal caveats apply (not a physicist, chemist, lawyer, etc.).
Since atoms are made of multiple components, if you can modify and measure those components individually, then it's at least theoretically possible to encode more than two states per atom. All of the following assumes you want to keep the same atomic number for the duration; if you don't care what type of atom you're storing, then obviously there are many more than two states.
If it were possible to set and count how many neutrons a particular atom has (aka which isotope), then it would be possible to encode more. Even hydrogen has three isotopes, and xenon has nine stable isotopes (and many more unstable). Same for the number of electrons (aka ions).
If there are more properties that could be manipulated for each of those individual components, then it would be possible to have even more states (e.g. electron spin).
For example, a hydrogen atom's 3 isotopes alone give 3 distinguishable states, already more than one bit; add a second binary property (ionized or not, say) and you're past 4 states (2^2, half-nibble, crumb).
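Counting it out (purely illustrative; this assumes the properties really could be set and read independently, which is exactly the open question):

    import math
    from itertools import product

    isotopes = ["protium", "deuterium", "tritium"]  # hydrogen's 3 isotopes
    charge = ["neutral", "ionized"]                 # a second, binary property

    # States multiply: 3 isotopes x 2 charge states = 6 states per atom
    states = list(product(isotopes, charge))
    print(len(states), "states ->", f"{math.log2(len(states)):.2f}", "bits per atom")
    # 6 states -> 2.58 bits per atom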
Things like this were partially why I got interested in electrical engineering & physics. Sadly, 15 years later, my career has deviated to financial software, but I still find articles & progress like this fascinating.
You'd be surprised how quickly it comes back. I recently had to design a simple ~50-node circuit after ~10 years of not doing anything of the sort, and I had it simulated in SPICE and prototyped after maybe 10 hours of work over a few days.
We had to learn SPICE in my ECE program. I found that inexplicable: writing a circuit in SPICE is like using a slide rule. I can derive a circuit on paper (just as I can do math), and I can use an actual circuit analyzer (just as I can use a calculator). Knowing the foundations of simulation programs is justifiable, but using and practicing with them is just excessive. As the circuits got more complex, I "cheated" by using a script to generate netlists, and eventually I just used LTspice because, christ, I have better things to do than type until my fingerprints wear smooth.
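For anyone curious, the netlist-generating "cheat" is only a few lines. Something like this sketch (my own toy example with made-up names, emitting a resistor-chain netlist):

    # Emit a SPICE netlist for a resistive divider chain -- the kind of
    # repetitive netlist that's tedious to type by hand.
    def divider_netlist(n_stages, r_ohms=1000, vin=5):
        lines = ["* generated divider chain", f"V1 n0 0 DC {vin}"]
        for i in range(n_stages):
            lines.append(f"R{i + 1} n{i} n{i + 1} {r_ohms}")
        lines.append(f"Rload n{n_stages} 0 {r_ohms}")
        lines += [".op", ".end"]
        return "\n".join(lines)

    print(divider_netlist(3))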
Storage medium is just part of the equation. The other important and hard part is the mechanism required to read and write data to that medium. Would you use a CD as a storage medium if the CD reader/writer were the size of a washing machine?
> Would you use a CD as a storage medium if the CD reader/writer were the size of a washing machine?
Yes, according to history, if that's all anyone had. :)
IBM's project might be the ENIAC of molecular storage devices. Only time will tell. Keep in mind your example doesn't go far enough to match past history; we used to have much worse than 600 MB / washing machine. We used to have 100 words / warehouse.
"By the end of its operation in 1955, ENIAC contained 17,468 vacuum tubes, 7200 crystal diodes, 1500 relays, 70,000 resistors, 10,000 capacitors and approximately 5,000,000 hand-soldered joints. It weighed more than 30 short tons (27 t), was roughly 2.4m × 0.9m × 30m (8 × 3 × 100 feet) in size, occupied 167m2 (1800 ft2) and consumed 150 kW of electricity."
"In 1953, a 100-word magnetic-core memory built by the Burroughs Corporation was added to ENIAC"
* EDIT: It'd be more fair to use punch cards as ENIAC's storage mechanism to compare against, and punch cards held a lot more than 100 words. Anyway, still, crazy by today's standards, right?
To be fair, there's a difference between long-term storage and working memory. The magnetic-core memory you're talking about is RAM, not storage. Though it is looking like the two may converge some time in the future, until now "working memory" (RAM) in computers has always been far lower-density than storage, but far faster, for use in computations (and also volatile, unlike storage which is non-volatile). It's the equivalent of comparing your brain's short-term memory (when thinking about a problem) to your hand-written notes. Punch cards are indeed the appropriate comparison.
But to get back to your original point, a washing-machine-sized storage machine is perfectly acceptable if that's all your technology allows. In fact, it'd even be acceptable now, if it allowed you to replace what currently takes a whole data center's worth of hard drives. I'm sure Google would be ecstatic if they could store all of YouTube on a single machine the size of a washing machine.
I think it's a good unit of comparison. Virtually everyone has listened to a song to completion, and a very large number of people have done so in the past 24 hours, or even hour. It's also much easier to quantify the length of a song. Whereas with books: well, it's been a couple of months at least since I've finished a new book. And the number of pages in a book is not as meaningful to the human experience as an amount of time. And the amount of time spent on a book -- i.e. minutes per page -- greatly varies per human.
Sure, according to who you ask, a 3-minute Justin Bieber song contains less "data" than a 3-minute Bob Dylan song, but at least the quantifying of time is consistent among different people (um, relatively speaking).
And sure, 26 million songs is still as hard to comprehend as 26 million books. But again, more people can quantify how much of their life a song takes because most people have more recently consumed a song's worth of information.
The variance in data storage between songs (due to length, bitrate) doesn't amount to a meaningful difference in order of magnitude.
To insinuate that the average user is using uncompressed audio is being obtuse. The social context for measuring by song came from iPod-like devices, based on an average of 1 MB per minute of MP3/MP4 audio.
Raw is ~10 MB per minute.
Lossless compressed is ~5.
Everyone using "songs" as a metric is talking about ~1 MB per minute MP3.
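Those per-minute figures roughly check out if you assume CD-quality source audio and 128 kbps MP3 (my arithmetic, not the parent's):

    # One minute of CD-quality audio: 44100 samples/s, 16-bit (2 bytes), stereo
    raw = 44100 * 2 * 2 * 60 / 1e6       # ~10.6 MB/min uncompressed
    lossless = raw / 2                   # FLAC etc. typically ~2:1, so ~5 MB/min
    mp3 = 128_000 / 8 * 60 / 1e6         # 128 kbps MP3, ~0.96 MB/min
    print(f"raw {raw:.1f} MB, lossless ~{lossless:.1f} MB, mp3 {mp3:.2f} MB per minute")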
There was a time something of the sort would have been needed for a popular audience - but contemporary 'normal people' understand the usual measures of digital storage well enough.
People don't choose between the '3,000 song' & '6,000 song' iPhone variations - even though Apple previously offered this comparison much more prominently for its iPod range. They choose 8/16/32/64/128 GB (the modern public is even catching up with us in having the powers of 2 memorised!) or whatever is the current lineup.
Why? For those who know how big a song is, you can easily translate. For those who have no idea, but roughly know how many songs their music player can hold, it's a far more meaningful measure than units of 'bits'.
The atoms are manipulated by a scanning tunneling microscope [1], which allows you to both image and manipulate single atoms, as shown in the movie "A Boy and His Atom" [2], also made by IBM. You can make sure there's only one by just taking an image at a resolution high enough to see single atoms.
This is fundamentally a scanning technique. A very sharp tip, down to a few atoms at the point, sometimes capped with a single carbon nanotube, is scanned across the surface of whatever sample you have, which for a measurement like this must be almost atomically flat. A bias is applied between the sample and the tip, and quantum tunnelling allows electrons to move between the sample and the tip. This current can then be measured and correlated with sample height or electronic properties of the sample. If your scanning step size is smaller than an atom, you can image single atoms by detecting the change in current due to a different species of atom, or due to the change in height between your flat surface and the tip when an atom sticks out of the top of the surface.
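The reason that works at atomic resolution is how steeply the tunnelling current falls off with distance, roughly I ∝ exp(-2κd). A quick illustration with textbook numbers (my own sketch, assuming a ~4.5 eV work function):

    import math

    # Tunnelling decay constant: kappa = sqrt(2*m*phi) / hbar
    m_e = 9.109e-31         # electron mass, kg
    hbar = 1.0546e-34       # J*s
    phi = 4.5 * 1.602e-19   # ~4.5 eV work function, in J

    kappa = math.sqrt(2 * m_e * phi) / hbar   # ~1.1e10 m^-1

    # Current ratio when the gap changes by 1 angstrom (one atom sticking up)
    ratio = math.exp(2 * kappa * 1e-10)
    print(f"~{ratio:.0f}x current change per angstrom")  # roughly an order of magnitude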
To manipulate the atoms, the tip is moved close enough to an adatom that it begins to form a weak bond with the tip. The tip can then move and essentially drag the adatom with it to wherever the researchers want. [3]
As a child (showing my age here) I remember getting a 20MB HDD for our desktop machine (might have been an Olivetti with a 286 CPU, I don't remember). At the time, that seemed an incredible amount of storage - "how will I ever fill that", I thought!
A couple of years ago a Japanese team used an iodine atom to compute a Fourier transform. The computation was faster than a computer, but the ETL was super slow.
IBM has been pushing stuff around with a tunneling microscope for decades.
It's cool, but the press should report the transfer rate.
This website is a piece of shit. It autostarts a second video off-screen even after I swatted the first one that popped up on the right, covering the text of the article.
I was a team lead for an IBM operations team for 6 months in 2016. Aside from my first two days, I worked from home the entire duration of the role, based in Sydney. Most of my staff were in China, with handoff to the next shift in Europe (who were all remote staff as well).
The thought process is that someone gets badly burned by Y, so the association with A becomes very strong - stronger than any other association. Now every time someone mentions A, they bring up Y.
Nope, they recently changed their policy: now your manager has to go through tons of paperwork for you to work from home even one day a month! One of my really good friends works for IBM, and they told me about it.
From your article: "IBM has spent the past couple of years undertaking a massive turnaround effort to transition from its servers and services business model to one focused more on cloud, security, analytics, and mobile. That turnaround has brought with it thousands of job cuts."
While it's obvious that IBM is trying to cross the same CASM (cloud, security, analytics and mobile[1]) as their "West Coast competitors", I'm glad that IBM is still investing in basic research as per the OP submission.