
This really is just bringing WebAuthn to Safari. I've been using it via Chrome with Touch ID for our corporate Okta SSO and it has been working great.

WebAuthn is really just a way to make public-key/private-key crypto scale. The user never really knows about or interacts with the keys.

The website doesn't store a password; it stores a public key.

The user doesn't know about the private key (paired to the public key); they just know how to unlock it via a YubiKey or biometric device.

The site sends some data (a challenge) to the user to sign, the user signs it with the private key, and the site verifies the signature against the stored public key. Boom: authenticated.
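Roughly, in browser TypeScript, with made-up /webauthn/* endpoints standing in for whatever the site actually exposes (the navigator.credentials.get() call is the real API):

    // Hypothetical sign-in sketch; endpoint names and payload shapes are illustrative.
    async function signIn(): Promise<void> {
      // 1. The site sends a random challenge and the user's registered credential IDs.
      const { challenge, credentialIds } =
        await (await fetch("/webauthn/challenge")).json();

      // 2. The browser asks the authenticator (Touch ID, a YubiKey, ...) to sign
      //    the challenge. The private key never leaves the authenticator.
      const assertion = (await navigator.credentials.get({
        publicKey: {
          challenge: Uint8Array.from(atob(challenge), c => c.charCodeAt(0)),
          allowCredentials: credentialIds.map((id: string) => ({
            type: "public-key" as const,
            id: Uint8Array.from(atob(id), c => c.charCodeAt(0)),
          })),
        },
      })) as PublicKeyCredential;

      // 3. The site checks the signature against the stored public key.
      const res = assertion.response as AuthenticatorAssertionResponse;
      const b64 = (buf: ArrayBuffer) => btoa(String.fromCharCode(...new Uint8Array(buf)));
      await fetch("/webauthn/verify", {
        method: "POST",
        body: JSON.stringify({
          credentialId: assertion.id,
          clientDataJSON: b64(res.clientDataJSON),
          authenticatorData: b64(res.authenticatorData),
          signature: b64(res.signature),
        }),
      });
    }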




How does this work if you need to sign in to a site on a borrowed computer while traveling? Is the private key derivable from a master password or something?


Apple already kind of supports this for iCloud login. If you want to authenticate on a new device, you'll get a prompt on one of your already authenticated devices to confirm the login. You still need the password, but it's really not necessary... if there were also some additional "privilege" system (i.e. the same user account would have fewer privileges if logged in through a less secure method) it would be even safer.


I'm surprised nobody has mentioned this, but you can also use a hardware FIDO2 token, like a YubiKey, for WebAuthn. This is ideal if you use many computers.

If you need to sign in on a borrowed computer, you visit the site, it asks you to insert the YubiKey (if it isn't already inserted), you enter the PIN and touch the key, and that's it.

I believe a YubiKey with a fingerprint reader instead of a PIN (or in addition to one? Information is scarce) is also coming.
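In API terms, the site just requests an assertion with user verification required; the PIN or fingerprint you enter on the key is what satisfies it. A rough sketch (in a real flow the challenge and credential IDs come from the server):

    const assertion = await navigator.credentials.get({
      publicKey: {
        // Illustration only: real challenges are generated server-side.
        challenge: crypto.getRandomValues(new Uint8Array(32)),
        // Would normally list the user's registered credential IDs.
        allowCredentials: [],
        // Forces the PIN or fingerprint prompt on the key itself;
        // the borrowed computer only relays the signed assertion.
        userVerification: "required",
      },
    });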


A few options if you want to maintain MFA:

For my Okta account, they support a push notification to their app on my phone as an alternative authentication factor.

Another approach is a Bluetooth (or NFC) enabled device (like your phone) that actually carries the private key. When using a borrowed computer, the signed attestation data is shared with the borrowed system, but that can only be used once, since the private key is never transmitted.

So basically you can either have a side-channel to another MFA method, or a relay where you don't need to trust the intermediary beyond the current session.


Similar to how you would sign in on devices/browsers without WebAuthn support, or when you don't have a physical token (YubiKey, etc.) with you: use a fallback method provided by the website. This is usually TOTP or a scratch code. The website needs to implement it, though.

One thing to keep in mind is that this is not supposed to be the only factor required to sign in. It should be used as a second factor in a similar way to TOTP (but with much better usability).


With Okta (since the GP mentioned it), you would sign in with a password, then tap the "yes, it's me" button on your phone, or you could use TOTP, or even a YubiKey.


Okta is more of a SAML/OpenID Connect thing with built-in multi-factor authentication than a replacement for WebAuthn, though. Okta could embrace the WebAuthn platform authenticator as one of its authentication factors if a user is unwilling to install an app, but a website isn't expected to use Okta as a second factor in its authentication flow.


OP is referring to how you can already sign in to Okta as a website, not as an authentication mechanism. You can create a sign-in flow that goes like so:

username -> password -> [ WebAuthn | Okta Verify Push ]

This approach can be used on any website, just with a regular TOTP code in place of the proprietary Okta Verify Push.
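If you swap in plain TOTP for the proprietary push, the server-side check is small. A rough sketch of RFC 6238 verification using Node's crypto (real implementations also accept a window of adjacent time steps):

    import { createHmac, timingSafeEqual } from "crypto";

    // RFC 4226 HOTP truncation over an RFC 6238 time-based counter.
    function totp(secret: Buffer, stepSeconds = 30, digits = 6): string {
      const counter = Math.floor(Date.now() / 1000 / stepSeconds);
      const msg = Buffer.alloc(8);
      msg.writeBigUInt64BE(BigInt(counter));
      const hmac = createHmac("sha1", secret).update(msg).digest();
      const offset = hmac[hmac.length - 1] & 0x0f;           // dynamic truncation
      const code = (hmac.readUInt32BE(offset) & 0x7fffffff) % 10 ** digits;
      return code.toString().padStart(digits, "0");
    }

    function verifyTotp(secret: Buffer, submitted: string): boolean {
      const expected = totp(secret);
      return submitted.length === expected.length &&
        timingSafeEqual(Buffer.from(submitted), Buffer.from(expected));
    }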


Ah, I didn't know you could do the Okta push without using their whole SSO suite. Thank you! I guess in this scenario Okta acts more like Authy's proprietary OTP thing, then.


Web Authentication does support being the only authentication mechanism, since it is in itself multi-factor (in this case, physical possession of the phone and biometric confirmation).


It's mentioned in the video as well that websites should always offer this as the faster sign-in option while providing other options alongside it.


To think we could have had this a decade+ ago with TLS client certificates, if web browsers weren't perpetually stuck in the past.


You have the blame misattributed: almost nobody used client certificates because they cost money ($100+/year). That meant there was little demand outside of a few spaces like government, and, absent usage, there was not much pressure for UI improvements.

Client certificates are also worse for privacy and phishing resistance: with a certificate, if I can convince you to click on a link I get your identity. From the site's perspective, I don't have any way to tell whether the person with the certificate is the same person I saw or the person who compromised their computer or convinced a CA to issue a cert for someone else. Requiring key storage to be on a hardware enclave significantly reduces that risk, allows for the stronger attestation requirements mentioned, and also means that you're changing things from “trust anyone who can get a CA certificate” to “trust anyone who can do signatures from a previously-registered hardware key”.


Cost money? Once upon a time, HTML had <keygen>.


How would you do the signing part, though? Most CAs don't do client certificates at all, nor do they issue certificates with signing flags.

Even if a CA does sign client certificates, and the website is expected to store the public key, it exposes some privacy concerns, since the public key is now personally identifiable. If a website must provide its own self-signed CA and require the user to submit a CSR when registering for an account, then it becomes a huge overhead for both the website and the user just to log in.

I work with enterprise customers, and the requirements usually call for client certificates. I've never had a good experience setting it up, even with a limited number of parties.


You don't need a public CA to do client certificate authentication. Hell, your computer doesn't even need to trust the CA that signed the certificate - it's the server on the other end that cares about that.

This is precisely how WebAuthn works - but we figured out that we actually don't need to go through the headache of getting CAs and signing involved at all. Just store a public key attached to a user after they've signed in via traditional means and let the browser/security token manage the keys.
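On the client, that registration step looks roughly like this, after the password login (the endpoints and site name are made up; the navigator.credentials.create() call is the real API):

    async function registerAuthenticator(userId: string, username: string) {
      const { challenge } = await (await fetch("/webauthn/register/challenge")).json();

      const credential = (await navigator.credentials.create({
        publicKey: {
          challenge: Uint8Array.from(atob(challenge), c => c.charCodeAt(0)),
          rp: { name: "Example Site" },                        // credential is scoped to this site
          user: {
            id: new TextEncoder().encode(userId),
            name: username,
            displayName: username,
          },
          pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
        },
      })) as PublicKeyCredential;

      // The server stores the credential ID and public key against the
      // user record; no CA or certificate chain involved.
      const res = credential.response as AuthenticatorAttestationResponse;
      const b64 = (buf: ArrayBuffer) => btoa(String.fromCharCode(...new Uint8Array(buf)));
      await fetch("/webauthn/register", {
        method: "POST",
        body: JSON.stringify({
          credentialId: credential.id,
          attestationObject: b64(res.attestationObject),
          clientDataJSON: b64(res.clientDataJSON),
        }),
      });
    }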


Client certificates suck in a bunch of ways that WebAuthn, specifically designed to solve this problem, does not. Example:

If the certificate used to sign into Hacker News as "sneak" is also used to sign into PornHub, then I can correlate that to discern that "sneak" on HN uses PornHub. Whereas you can't do that with WebAuthn credentials: a separate credential is spun up for every single registration, so it's completely useless everywhere except the one site it was issued for and can't be correlated with other credentials except via a cryptographic attack on the underlying primitives (i.e. breaking WebAuthn itself).


There's no reason a browser couldn't have generated a new self-signed client certificate for each site, though; the fact that they don't offer that as an option is just a browser design decision.


...which means a login is now tied to a browser, and you have to come up with a way to securely export or sync the private key (bad idea?) or a way to link another browser to a login. Also, a user losing access to all their websites by accidentally uninstalling a browser doesn't sound very user friendly.


This is exactly the same situation as software WebAuthn keys, if they are used as a single factor.


There was actually a <keygen> HTML tag, used within <form>s, that generated a keypair whose public half was then supposed to be signed by the server and finally returned to the browser for local installation. At least that's how I understand it. It's been deprecated for a while now.


Oh wow, yeah - I didn't realize that existed, and now I'm sad it never caught on. :(


I think some unlikely security scenarios are neglected, though. You trust your platform to keep your private keys safe, and it requires you to trust it. A vulnerability here would compromise your whole identity if private keys can be extracted, and I don't yet believe that to be impossible. It will solve common problems like phishing, though.


It probably isn't impossible, but it is made much harder with the use of separate hardware, be that a physical key or a secure enclave. A fully compromised laptop still can't get the private keys in a "perfect" system. Of course, the hardware might have design defects, or some debug feature for extracting private keys that shouldn't have been shipped in a production build, or a government-mandated backdoor, etc... But it now requires a compromise of both the client system AND their hardware keystore.


You should pay special attention to the section about attestation, which is not something that is done in a privacy-focused way without an anonymous attestation authority (which is part of the iOS 14 feature).
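For reference, attestation is something the site opts into at registration, and how privacy-friendly it is depends on the conveyance preference it asks for. The values below are from the WebAuthn spec; this is just the one field added to the create() options sketched elsewhere in the thread:

    // "none"     - no attestation statement is returned (most private)
    // "indirect" - the client may route attestation through an anonymization CA
    // "direct"   - the authenticator's raw attestation statement goes to the site
    const attestation: AttestationConveyancePreference = "indirect";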



