State considered harmful: A proposal for a stateless laptop (2015) (invisiblethings.org)
138 points by peter_d_sherman on Oct 24, 2018 | 65 comments



I've started running Tails on my laptop on a daily basis. It's great. https://tails.boum.org/

It runs off a USB stick and presents a pristine system each time you boot. Modifications to the filesystem go on a ramdisk "overlay", which disappears as soon as you power it off.

It does support "encrypted persistent storage", but there is quite fine-grained control over what things are allowed to go in it. In particular, nothing to do with the web browser persists across a reboot unless you go quite far out of your way to make it persist. And each time you boot, you can opt not to unlock the persistent storage if the work you want to do is either liable to get you infected with malware, or just doesn't need any of your persistent data.

Additionally, all of the networking is done over Tor, and the firewall blocks non-Tor traffic, so you can't bypass Tor by accident. (If you don't want to use Tor, it's probably not for you, although having used Tor in the past, I was pleasantly surprised by how much faster it is now than it was then.)
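
A quick way to sanity-check that everything really exits through Tor, outside of anything Tails itself does, is to ask the Tor Project's check service. A minimal Python sketch, assuming the https://check.torproject.org/api/ip JSON endpoint and its "IsTor" field:

    # Illustrative only: query the Tor Project's check service and report
    # whether the current connection appears to exit through Tor.
    import json
    import urllib.request

    def connection_uses_tor() -> bool:
        with urllib.request.urlopen("https://check.torproject.org/api/ip",
                                    timeout=30) as resp:
            info = json.loads(resp.read().decode("utf-8"))
        return bool(info.get("IsTor", False))

    if __name__ == "__main__":
        print("Traffic appears to exit via Tor" if connection_uses_tor()
              else "Traffic does NOT appear to exit via Tor")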

EDIT: And for those who don't know, OP is the developer of https://www.qubes-os.org/ which I tried out for a while and really like the premise of, but found just slightly too inconvenient to use, so I currently prefer Tails.

Qubes has multiple container VMs. All applications run inside a VM, but the I/O is transparently multiplexed into one desktop environment. The window decorations tell you which VM each application is running in. You can copy and paste, and share files, between different VMs, but only where you explicitly want to. You can create VMs to separate data for different purposes (e.g. one just for email, one just for web browsing, and different ones for working on different projects). The VMs can even run different OSes, and there are VMs included by default that route all of their internet traffic over Tor. It is great.
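
As a rough illustration of that per-purpose workflow, here is a hedged sketch of what it might look like scripted from dom0, assuming Qubes' qvm-create and qvm-run command-line tools (the template and VM names are invented for the example):

    # Sketch only; assumes Qubes 4.x dom0 with qvm-create/qvm-run available
    # and a template named "fedora-26" - adjust for your installation.
    import subprocess

    def make_app_vm(name: str, label: str, template: str = "fedora-26") -> None:
        """Create a per-purpose AppVM (e.g. one just for email)."""
        subprocess.run(["qvm-create", "--class", "AppVM",
                        "--template", template, "--label", label, name],
                       check=True)

    def run_in_vm(name: str, command: str) -> None:
        """Launch a program inside the named VM; its windows get that VM's
        label colour in the window decorations."""
        subprocess.run(["qvm-run", name, command], check=True)

    if __name__ == "__main__":
        make_app_vm("email", "blue")
        make_app_vm("browsing", "red")
        run_in_vm("email", "thunderbird")
        run_in_vm("browsing", "firefox")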


I don't think it's relevant to the article. If your host machine has malware in its CPU firmware, then it doesn't matter whether you boot from a [stateless] USB, because the malware sits between your kernel and, say, the keyboard, network card, or display.


Sounds horrifying.

Can you point to any resources on a few things:

(o) examples of these kinds of malware, either code or writeups

(i) ways of detecting these kinds of malware

(ii) ways of removing these kinds of malware

(iii) ways of preventing these kinds of malware from getting installed



It looks like all of your links are basically only addressing my first list item.

The first link is about modifying a system's C compiler, which the OP's Tails live USB is in fact a solution to.

The second is about altering firmware on hard drives, which is an engrossing writeup, thank you, but it does fall short of suggesting ways to identify and remedy such an invasion.

Your third one is about Intel's ME, which I figured I'd see someone link to.

Unfortunately, the really interesting items in my original post are the preventative and remedial ones.


I would like it if laptops had a semi-persistent storage area that you could only flash after authenticating, and then you could enter a password if you needed to unlock your persistent data. It would be a great way to ensure you don't pick up any viruses, but it seems a bit too inconvenient for everyday use.


Like Apple's secure enclave or Intel's SGX? Or a physical jumper which would require physical access to move?


EEPROM + physical jumpers are what I always recommended. It blocks most if not all software attacks, makes physical attacks more noticeable (e.g. at a cafe), and still leaves control in the hands of the owner. Just a bit of inconvenience, but quite usable.


Hi. A lot of interesting and intelligent comments, but I think that on the whole, we've over-complicated the issue.

To explain what I mean, consider a Commodore 64, Apple IIe, Atari 800, TI-99/4A, Radio Shack Color Computer, or most other home computers from the early 1980s.

None of these computers, NONE OF THEM, had any form of persistent memory (Flash, EEPROM, battery-backed CMOS, on-CPU persistent memory, etc.), so when you unplugged the computer and plugged it back in, nothing, not even a single bit, had changed.

In other words, you unplug, and you start over with the exact same initial STATE as you did the first day when you unpacked the box.

That is, software, legitimate or malware, CANNOT change the initial power-up state of the computer. CANNOT.

That is STATELESS HARDWARE.

Note that I did not include x86 PCs in my list, because they typically have battery-backed CMOS, and onboard FLASH in newer models. Those things (and other places for writable firmware) STORE STATE, and thus they can change the INITIAL power-up state of the computer.

Once a computer can store state between power-offs, you could be dealing with a different machine than the one that first came out of the box.

When that happens, you need all kinds of crazy security machinery to assert that whatever is in there is safe, as opposed to simply powering off, starting fresh, and knowing that you are.


None of those '80s computers had networking that was live at power-on either, which is the most interesting part of the proposal, IMO. Persistent storage isn't the only way to affect the initial power-on state of today's computers.

It would also help to call this whole idea "fixed boot state" or something similar, since "stateless" is misleading even to developers, and not super accurate, which may be part of the reason you're sensing complication.

It is fun to consider how to guarantee that a computer has a fixed boot state. Not all that difficult if you can break down all the components.

The more interesting question to me is whether that truly improves security, once you start working on networks, and whether it could be made usable for the general population, rather than so inconvenient that nobody can adopt it. Is it only something the most extreme high security secret lab air-gapped environment will ever tolerate?


>The more interesting question to me is whether that truly improves security, once you start working on networks, and whether it could be made usable for the general population, rather than so inconvenient that nobody can adopt it. Is it only something the most extreme high security secret lab air-gapped environment will ever tolerate?

Networking a machine that cannot get security updates sounds like a bad idea. Its pristine boot state is bound to get infected seconds after going online.


Exactly!


All of those machines had persistent memory in the form of cassette or floppy.


That is correct. Storage by definition is persistent memory. But, BUT, the storage is in a format which is easily auditable by the end user or a virus-checker program. You can reformat/reinstall and start over. Show me how that's possible with undocumented storage (e.g., EEPROM, CMOS, FLASH, microcode updates, on-CPU storage, etc.).

Once you've got undocumented persistent writable memory holes, neither the end user nor a virus checker can audit them. You're basically at the mercy of the code written by the vendor, or whoever figures out how to shove unwanted stuff there.

Also, you can write-protect your cassettes and floppies, and be fairly certain that your wishes will be respected. Can you do that with undocumented persistent memory locations in your chipset/motherboard/CPU/PC? Do you even KNOW about the undocumented persistent memory locations in your chipset/motherboard/CPU/PC? And... are you sure? How sure? You see how far the rabbit hole goes?
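
To make the contrast concrete: with a dumb, fully readable medium, "auditing" can be as simple as hashing the raw image and comparing it to a known-good value, which is exactly what you cannot do for undocumented on-chip storage. A toy Python sketch (the file name and reference digest are placeholders):

    # Toy example of auditing a fully readable storage image by hashing it.
    import hashlib

    def image_digest(path: str, chunk_size: int = 1 << 20) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    KNOWN_GOOD = "replace-with-your-reference-digest"

    if __name__ == "__main__":
        digest = image_digest("boot_floppy.img")
        print("clean" if digest == KNOWN_GOOD else "MODIFIED: " + digest)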


That's basically what is proposed in this paper, except with a USB stick or SD card.


Other than the Apple IIe's floppy (I think; it has been a while), cassettes and floppies did not boot with the machine and had to be loaded explicitly.


It's frustrating that the author is using the word "stateless", because it usually means something different in the context of computer-related topics. Maybe the title of the paper should have been something like "A proposal for an ephemeral-storage laptop".


Conversely, it's frustrating that you reference some definition instead of just stating it. Are you referring to stateless (web) applications? Wouldn't that also exactly fit your other "ephemeral storage" definition? You surely don't propose that those applications don't keep state while processing a single request. This proposed laptop only keeps state for a single boot.


>It's frustrating that the author is using the word "stateless" because it usually means something different in the context of computer-related topics.

Not really different. State is state.


Well then the premise is wrong. If state is state, then any useful computing device has state. The question is where (and if) that state is stored persistently.


The meaning of the title was not obvious to me either. Stateless in programming means pure functional, so the first thing I imagined from the title is some sort of new pure functional CPU.

This laptop also has a state, the post-boot state is just fixed and repeatable. And, of course, the underlying technology is not different, it makes use of a lot of state, so using the word “stateless” here is certainly not the clearest language possible.


To be fair, though, not even functional programming is truly "stateless." The program has state as it's actually executing a function. It's still remembering the function's arguments, etc. It's just that the state is extremely ephemeral, and is lost by the time the function returns.

I think this is using the word "stateless" in the same way, only over a slightly longer timescale.


Functional is a programming-language paradigm. That's too high a level to think at when you are talking about CPUs and hardware.

There's no linguistic or semantic sugar at this level. Think assembly and lower. Remember, programming languages, paradigms, and compilers are for you, not the computer. It's all just 0s and 1s to it.


The author is a well-known computer security developer, I'm sure they're aware of the alternative meanings of state when deciding on the term.


What different meaning do you have in mind? In computer-related topics, state has always referred to the ability to change, or not, after an interaction has finished. I don't see how he used it differently.


Not clear from your comment what definition of "stateless" you mean, but it is clear from the HN title.


The author actually has to clarify what they meant in the paper, as RAM carries state as well.

> state-carrying (persistence-carrying)

So what would be more accurate, I think, is "Persistence considered harmful".

https://en.wikipedia.org/wiki/State_(computer_science)#Progr...


Well, CPUs and GPUs carry state too; I don't think anything practical is completely stateless, so it's a fair description.


At least in the deeply embedded space, 'state' in this context is commonly used to mean 'persistent state'.


It's stateless the way AWS Lambda is serverless. A bit like calling a taxi ride "carless".


The author isn't a native English speaker; she's Polish.


This is a particularly appropriate link right now, given that the core idea of "saving data and state is something which should be avoided" is seeing a renaissance in both privacy-focused circles and in privacy-focused regulatory policies. I love the idea that saving state is to be avoided, and I hope to see it implemented with increasing granularity, both on other people's computers (e.g. web services) and locally.

Remotely, a trend away from "You must create an account to view this cat meme, now enter your date of birth and present a working email address which we'll send a message to" is particularly appealing. And with hosted services, particularly paid ones, I'd love to see privacy policies focused on how little data is kept, rather than how much. I keep hoping that the regulatory or legal or insurance environment will change such that sooner or later, companies will view stored user data as a liability more than an asset, and 'zero knowledge' will be a desirable thing.

Locally, as per-application permissions and sandboxing grow stronger, I can picture a class of applications which would be hard-restricted from saving data between sessions (e.g. the browser, a calculator, a scratch-pad app). Or perhaps, "the only data you can save must be XML, in this plaintext file, so that preferences are saved, but nothing else." Qubes (OP's baby) does this already, but I think a dash of this in existing OSes would be a small step towards big security.
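
One way to picture that hard restriction, purely as a sketch and not any existing OS mechanism: the application only ever gets a store object that can persist a single XML preferences file and nothing else (the path and key names are invented here):

    # Sketch: a persistence interface limited to one XML preferences file.
    import xml.etree.ElementTree as ET
    from pathlib import Path

    class PrefsOnlyStore:
        def __init__(self, path: Path):
            self._path = path

        def load(self) -> dict:
            if not self._path.exists():
                return {}
            root = ET.parse(str(self._path)).getroot()
            return {el.get("name"): el.text or "" for el in root.findall("pref")}

        def save(self, prefs: dict) -> None:
            root = ET.Element("preferences")
            for name, value in prefs.items():
                ET.SubElement(root, "pref", name=name).text = str(value)
            ET.ElementTree(root).write(str(self._path), encoding="utf-8",
                                       xml_declaration=True)

    if __name__ == "__main__":
        store = PrefsOnlyStore(Path("calculator_prefs.xml"))
        prefs = store.load()
        prefs["theme"] = "dark"   # the only kind of state allowed to survive
        store.save(prefs)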

As an aside, in my estimation, Joanna Rutkowska is one of the most compelling computational thinkers today. Although her work tends to be at the most elaborate-threat-focused edge of computing security, reading it, it often feels like she's already running "where the ball is going", and I wouldn't be shocked to find that in 20 years, some of the 'whoa, crazy' things from Qubes or this stateless approach are actually regularly used in mainstream computing. I have no particular threat, and no particular need, and I suspect that many of the programs I use regularly wouldn't work there, but there's a part of me that would love to spend more time in Qubes.


I actually would like a stateless computer. A pure functional laptop. It could store state in the cloud as a series of changesets, and each boot would be a "checkout" of the top state. Then as you compute, you make new changesets.
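
For what it's worth, the bones of that model are easy to sketch; here is a purely illustrative version with no real storage backend, where boot state is just the fold of all changesets over an empty state:

    # Illustrative only: "boot" reconstructs state by replaying changesets.
    from functools import reduce

    def apply_changeset(state: dict, changeset: dict) -> dict:
        new_state = dict(state)
        new_state.update(changeset)
        return new_state

    def checkout(changesets: list) -> dict:
        """Rebuild the current state from the full changeset history."""
        return reduce(apply_changeset, changesets, {})

    if __name__ == "__main__":
        history = [
            {"wallpaper": "dunes.png"},
            {"editor": "emacs"},
            {"wallpaper": "mountains.png"},   # later changeset wins
        ]
        print(checkout(history))  # {'wallpaper': 'mountains.png', 'editor': 'emacs'}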


You might be in the market for a Chromebook. Keep in mind that this isn't really what the author is talking about, though: a significant part of her paper is a desire for the ability to keep her information private, which includes measures such as removing wireless functionality.


See GuixSD: https://www.gnu.org/software/guix/manual/en/html_node/Featur...

It doesn't store "in the cloud" by default, but there's no reason you couldn't set that up.


Functional has nothing to do with how a computer actually works. It's a semantic convention employed by programming languages to assist programmers by offering an alternative way to mentally model what the computer is doing. Don't conflate the two.


It's called a 'terminal'


Or a Chromebook.


Interesting! I hadn't seen this work before. Honestly, this seems exactly like the kind of service that would get heckled on HN in the classic Dropbox-Show-HN style.

So much so that a couple of my friends and I tried to build something like this a few years ago, essentially outsourcing the end-to-end-encrypted transport of sensitive information to a user-friendly service. We thought a user should be able to designate data for transport, click a button, and have it moved off the machine until some conditions are met at a later time (once you're through the border or whatever), when the data gets moved back onto the machine. Super user-friendly and built for plausible deniability. Our effort sort of trailed off into nothingness in the face of so many other random cool project ideas, in that particular way which probably most of you reading this know quite well.

Though as much as people like to joke and cajole about this, it's actually a pretty interesting problem and an excellent thought experiment. If you shed your expectations, it'll send you down some pretty cool rabbit holes.

PS: We sometimes use "Seems like a great Dropbox Show HN idea" as a kind of tongue-in-cheek endorsement for project ideas when they go on the whiteboard.


I don't see the connection with "HN + Dropbox". Dropbox was/is easy to do by yourself as a techie, depending on your use case.

This is not easy (or possible) at all, and the closest thing I can think of, which has a long way to go in pursuit of this (and isn't pursuing it directly, I think), is something like NixOS - which seems to be quite popular on HN.


That's the joke [0]. But otherwise this is what I meant by "shed your expectations". Even if it were possible to prove something was not possible, how would you build it? There might be some free money in there somewhere.

[0] https://news.ycombinator.com/item?id=8863


I know (though it feels like I'm missing something). It's just that I feel the opposite: this is exactly the type of project that HN would like, and not heckle, because there are no similarities with the Dropbox case.


I mean, at the time of my posting this, literally every top-level comment but two is kind of heckling the idea. I agree that a (working) project would be very interesting to the HN crowd. But it definitely also is an idea that many of those same people would have doubts about, very much like the Dropbox Show HN comment.


> Dropbox was/is easy to do by yourself as a techie depending on your usecase.

Dropbox should have come from the OS vendors. It didn't, because we don't have OS vendors any more - we have ad agencies.


Of course, even with a truly air-gapped computer, people have developed side-channel attacks to exfiltrate information by listening to things not generally considered to be part of the "networking" portion of the computer, such as radio signals generated by the operation of the processor.


Or using distortion in WiFi signals to record keystrokes.

https://www.schneier.com/blog/archives/2016/08/keystroke_rec...


I think I must be missing something fundamental. As far as I can tell, the core notion is to move all laptop firmware to an external storage device that has a write-protect switch.

How is this better than an external write-protect switch for the firmware on the device itself? Considering my normal patterns of device usage, I'm just going to put the laptop-specific Trusted Stick into each of my laptops and leave it there, in which case I'm not seeing why it should be removable at all.

And regardless, I'm not seeing how a write-protect switch for firmware, whether on a stick or in the device, is actually better for the average user. At some point the computer is going to tell them, "Hey, you need to allow this security upgrade." How do they know when to allow writes?


Evil maid can flip your write switch.

Longer version: the boot firmware and possibly the OS boot loader must be unencrypted, and thus attackable by an "evil maid" if left on the device, and the evil maid can also flip the write switch. All other storage can be encrypted and is safer to leave on the device.

Alternative approach: validate the state of the firmware on each boot with a TPM and a second trusted device that you do keep with you, such as a cell phone running Google Authenticator or a hardware security module (HSM). See Trammell Hudson's Heads https://trmm.net/Heads and what's going on with Purism and the Librem Key https://puri.sm/posts/the-librem-key-makes-tamper-detection-...
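
The core of that validation is the TPM's extend-only register model: each boot component is measured into a Platform Configuration Register, so any firmware change produces a different final value that the second device can compare against the expected one. A purely conceptual Python sketch (real TPMs use hardware registers and fixed hash algorithms; the component list here is made up):

    # Conceptual sketch of measured boot: PCR_new = H(PCR_old || H(component)).
    import hashlib

    def extend(pcr: bytes, component: bytes) -> bytes:
        return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

    def measure_boot(components: list) -> bytes:
        pcr = bytes(32)  # PCRs start zeroed at power-on
        for blob in components:
            pcr = extend(pcr, blob)
        return pcr

    if __name__ == "__main__":
        boot_chain = [b"firmware image", b"boot loader", b"kernel"]
        print(measure_boot(boot_chain).hex())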


This was in 2015. Did he ever develop a stateless laptop?

It would be useful to have one. Too many governments want to snoop on your devices at border crossings.


I know it's not exactly the same thing, but you could check out something like the Purism laptop [1], and you could run Tails from a USB key [2].

P.S. I wouldn't assume that someone called Joanna is a 'he' LOL

[1] https://puri.sm/learn/why-purism-computers-are-better-than-p...

[2] https://tails.boum.org/


> P.S. I wouldn't assume that someone called Joanna is a 'he' LOL

There are no girls on the internet.

Joking aside, compared to Tails, Whonix has both advantages and drawbacks:

"Unlike Tails, Whonix is not "amnesic"; both the Gateway and the Workstation retain their past state across reboots. Not being amnesic improves security on the Gateway, by allowing Tor's "entry guard" system to choose long-lived entry points for the Tor network, reducing adversaries' ability to trap users by running malicious relays.

On the other hand, a non-amnesic Workstation could possibly allow attackers, especially operators of Web services, to inject state and associate a user's sessions with one another, despite the Tor Browser's safeguards; for some users, this could be a serious security exposure. It is possible for users to force the Workstation to be partly or wholly amnesic by manually resetting it to old states after use, although the developer does not suggest this. It is also possible to run more than one Workstation VM with a single Gateway." [1]

[1] https://en.wikipedia.org/wiki/Whonix


1) They or she. 2) They are software people. 3) There is a laptop out there with 'open' firmware you can find; flash all the chips to your liking and you'll be somewhat better off than the current situation. That being said, of course you would have to develop your own operating system and user space for it, otherwise it would be useless to go through the trouble... which is the part these people are mainly working on, since it's more practical to begin with.

I hope that with innovations in FPGAs and other types of programmable hardware, we can have some alternatives for low-resource secure computing (for personal communications etc.), while for high-performance computing there is a long way to go before such a vendor would release something completely open and auditable. (Intellectual property protection is unfortunately too important (business-wise, not human-wise) for survival in a market where a company needs to invest years' worth of revenue just to put up an assembly line...)



I can see it now: dumb terminals for everybody, administered by the free country of the US of A.


Pay for everything you want to save: poor people screwed in yet another way by software.



Why not use a normal laptop, but enable it to boot from an encrypted partition of an external drive?


How do you prevent software from compromising the BIOS? Or the SSD firmware? Or whatever other hidden storage is built in?


Chromebook


Then all your data belongs to the web app servers.


I would like a phone/laptop where you can take sexy pictures and be sure they stay on the phone, and be sure they get deleted when you delete them.

Think I can get VC money for this idea?


A camera with an SD card is hardly revolutionary stuff.


Sincerely, with rampant hacking by nation-state actors and uninspired pragmatism winning ground, I would definitely be interested in an article on "The Nation State Considered Harmful". Unfortunately, that's not what the title was about.


Can't this kind of be achieved with Windows and Windows Server with AD? You log in to the server, it downloads the content for the session, and when you log out the content is removed from the local computer.

I remember my school used to have this, which made logins take forever.


That's also the model of the popular anonymity OS Tails, but if you'd read the paper, you'd see that OS-level state is not being discussed at all.



