Something I've been thinking about lately is how browsers have essentially become a dependency for any sort of auth on the internet. Pretty much everything uses OAuth2, which requires you to be able to render HTML and CSS, and in many implementations JavaScript.
That's ~20M (Firefox) to ~30M (Chromium) lines of code as a dependency for your application, just for auth. This applies even if you have a slick CLI app like rclone: if you want to connect it to Google Drive, you still need a browser to do the OAuth2 flow. All of this just so we have a safe, known location to stash auth cookies.
It would be sweet if there was a lightweight protocol where you could lay out a basic consent UI (maybe with a simple JSON format) that can be rendered outside the browser. Then you need a way to connect to a central trusted cookie store. You could still redirect to a separate app, but it wouldn't need to be nearly as complicated as a browser.
While I absolutely share the philosophical concern, I do wonder how large of an issue this is in practice. On an IoT device, or a retrocomputing/hobbyist platform, you'd likely want to display a QR code and have the user authenticate with their phone, similar to how you log into Netflix on a smart TV.
As an aside, to my knowledge OAuth2 still works in PaleMoon. I just downloaded the source and did a count with CLOC, and it looks like there's "only" ~13.5M lines of code. :)
> On an IoT device, or a retrocomputing/hobbyist platform, you'd likely want to display a QR code and have the user authenticate with their phone, similar to how you log into Netflix on a smart TV.
It's a fair point, and there are specs[0] defined for these uses. Something like rclone could certainly do it this way if Google supports it on their end. But IMO the UX of browser-redirect OAuth is actually pretty dang good. I would like to have that available for CLI apps. What if you could literally import an ncurses library directly into your app and do the flow in-process? I'm not even sure if there's a way to do that securely but it would be sweet.
> As an aside, to my knowledge OAuth2 still works in PaleMoon.
That's going to depend completely on the OAuth2 implementation on the authorization server. It's completely up to the provider whether to require JavaScript or other features in order to render their consent page. Having a stronger specification of how to build those pages would offer more guarantees for interoperability.
> What if you could literally import an ncurses library directly into your app and do the flow in-process? I'm not even sure if there's a way to do that securely but it would be sweet.
I mean, the security of OAuth as a UX flow is that you're entering your username and password on the site you're authenticating with, not the intermediary. (And this is verifiable by looking at the address bar, insofar as that's a reliable option.)
I recall using an Electron app that did OAuth using its own web view, and despite being a fairly well-known app, I had some misgivings because I had no way of knowing if that Google login interface was actually Google's OAuth page or a mock-up generated by the app. (Not trying to spare the guilty here, I can't remember which app. Too many suspects.)
I don't see how you could avoid this problem with an imported ncurses library. Even if the requesting process launched a separate trusted binary to do the authentication flow, verifying that you were actually interacting with that program instead of a mockup is very far into the weeds of tech savvy, and at best is even slower than doing the process manually.
The only option I can see working is to have a dedicated OAuth app. You copy-paste some sort of request token out of the client, it prompts you for username and password, then it gives you a code to enter into the client. Basically the same as how CLI apps negotiate OAuth now, except you never leave the terminal.
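That terminal-only handshake could look something like this sketch in Go. The request token, code format, and `exchangeCode` helper are all hypothetical stand-ins, and the user's paste-back is simulated rather than read from stdin:

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

// exchangeCode stands in for the final step: trading the user-supplied
// authorization code for an access token. Hypothetical -- a real client
// would POST the code to the provider's token endpoint.
func exchangeCode(code string) (string, error) {
	code = strings.TrimSpace(code)
	if code == "" {
		return "", fmt.Errorf("empty authorization code")
	}
	return "access-token-for-" + code, nil // placeholder token
}

func main() {
	// Step 1: the client prints a request token for the user to copy into
	// the (hypothetical) dedicated OAuth app.
	fmt.Println("Request token: req-7f3a91 (paste this into your OAuth app)")

	// Step 2: the user authenticates out-of-band in the OAuth app, which
	// hands back a code. Simulated here instead of reading the terminal.
	pasted := "CODE-1234\n"

	token, err := exchangeCode(pasted)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("Got token:", token)
}
```

The security still rests on the dedicated app being the only thing that ever sees the password, which is the hard part to verify, as noted above.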
Not saying it's a better idea, just that it's the only one.
It's an issue because browsers keep getting bigger and more bloated, which encourages buying faster machines, which encourages consumption, the related production, and all the waste and pollution that come with it.
While this happens to be a great concern of mine, I'd say it's a pretty different issue than the discussion of OAuth.
Although since we're here, discussing numbers of lines code—how much do you think web browsing performance has to do with browser complexity versus the websites themselves? I've always assumed the primary issue was the latter. Does e.g. Hacker News have higher system requirements in Chrome 97 vs Chrome 1.0?
Browser complexity worries me for other reasons—namely, the web isn't really a standard if there's only one implementation that matters, and because browsers are so complex, no one can really hope to create a new one.
> While this happens to be a great concern of mine, I'd say it's a pretty different issue than the discussion of OAuth.
Well, the problem is that OAuth providers will only give you your tokens if you go through impossibly complex websites.
It's 100% the fault of websites - but websites allow themselves to get fat because browsers allow it, and because they allow it websites get even fatter. It's a codependent system. The medium makes the media etc...
I can only share your concern about having the monoculture we have in practice. Gemini is the best tool we have not only to escape this complexity, but hopefully put some sense into the mind of web designers.
> All of this just so we have a safe, known location to stash auth cookies.
> It would be sweet if there was a lightweight protocol where you could lay out a basic consent UI
This also feels like it should be a part of the OS, since it's "select user, input password, maybe 2FA, store auth". It doesn't need a full-blown web browser.
It does, however, need a properly defined protocol.
>Pretty much everything uses OAuth2, which requires you to be able to render HTML and CSS, and in many implementations JavaScript.
I don't think this is true. My first (and last, yuck) foray into golang was forking and improving on a REST API client for Questrade[0]. I used it to write a bot that would alert me via push notifications on my phone when certain options contracts met favorable criteria, and despite the auth being OAuth2 the whole thing was hand-tooled in golang. No browser anywhere near it.
If I'm reading that correctly, it looks like you can download your first refresh token manually and then use that to get more tokens. I agree this is technically OAuth2. A more accurate way to articulate my point would be to say "for the services most people want to use, OAuth2 requires a browser".
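For reference, once you have that manually downloaded refresh token, the recurring request is just a form-encoded POST (the refresh grant from RFC 6749 §6), which is why no browser is needed. A sketch in Go with placeholder credentials:

```go
package main

import (
	"fmt"
	"net/url"
)

// refreshTokenRequest builds the form body for a standard OAuth2 refresh
// grant (RFC 6749 §6). With a refresh token obtained manually, this is the
// only request a headless client ever needs to make.
func refreshTokenRequest(clientID, refreshToken string) string {
	form := url.Values{}
	form.Set("grant_type", "refresh_token")
	form.Set("client_id", clientID)
	form.Set("refresh_token", refreshToken)
	return form.Encode()
}

func main() {
	// Placeholders; a real client would POST this body to the provider's
	// token endpoint with Content-Type application/x-www-form-urlencoded.
	body := refreshTokenRequest("my-client", "stored-refresh-token")
	fmt.Println(body)
}
```

(Confidential clients would also send a client secret; public clients typically use PKCE for the initial grant instead.)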
Totally, which is probably how we got here. But native apps have gotten very popular, and on every platform I'm aware of they have to launch out to a full browser view in order to authorize against anything.
That said: cookies may have been a mistake (opinion). I should back that up with evidence, but until I can put a complete argument together -- that's all I would like to say on the matter for now.
Think of it this way: web browsers are ALLOWING OAuth in the first place. OAuth only works if, when a user has to log in, they do so with the OAuth provider independently - so you need some sort of common dependency that works across all platforms and can execute code (JavaScript). Web browsers are pretty much the only common denominator - unless you want each device (OS) manufacturer to have to provide their own auth, which as a user is a worse solution.
SSL cert auth is recognized by every major HTTP server and used by every major browser. Frankly I don’t understand how this still is or ever was an issue.
That's…not the same question. But yes, I am. We have systems at work built on AWS Cognito and OpenID Connect, which is based on OAuth2. No browser is required, except to interface with the user. It's all HTTP and JSON underneath.
> No browser is required, except to interface with the user.
That seems like it's always required then. I think maybe I'm not understanding what you mean. Can you describe the flow in a bit more detail? I would be very interested in doing OAuth2 without a web browser.
Oh, I see what you mean. Yes, the flow is probably going to require user consent via a browser. But that doesn’t mean the whole app has to be JavaScript, and I believe there are flows that are more suitable for clients that aren’t. And I’m not sure I see this as a downside, the whole point is that the frame of the site you’re authorizing through is trusted. There’s no easy way to replicate the security implications of what it’s doing without a browser.
Most content on the web is static text, audio, and video, and shouldn't need a Rube Goldberg virtual machine to consume. I've been advocating splitting the web into the "document web" (ie the web as it was originally conceived) and the "app web" (which is cool and useful but a different thing) for a while now. There should be two different programs for consuming them.
And to your point, if simple web browsers became useful again, maybe simple operating systems would be viable once more, further reducing the dependencies involved.