Never, ever use access tokens as proof of identity the way you are doing it. It makes you vulnerable to token substitution attacks. OAuth2 is not an authentication protocol.
I've been recently learning my way around web dev. Most of it is straightforward enough, but security is my sticking point. Everything on the web is contradictory, half explained, and rapidly changing. Seems like you know your way around best practices. Can you point to a decent trustworthy tutorial/book on how to handle logins and identity? Seems like lesson one is "don't implement it yourself" and lesson two is never quite spelled out.
Agreed. The token should be used only after the user is authenticated through another channel (e.g., username+password).
Otherwise, an attacker could obtain the target's OAuth token by getting the target to provide the token to a malicious application. The attacker can then easily authenticate through your library.
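To make that attack concrete, here is a minimal sketch of the pattern being criticized (Python, with a placeholder provider URL and the `requests` library; none of this is taken from the library under discussion):

```python
# Sketch of the vulnerable pattern: treating "this access token resolves to
# user X at the provider" as "the caller IS user X".
import requests  # assumed HTTP client, not part of the project

USERINFO_URL = "https://provider.example/userinfo"  # placeholder endpoint

def vulnerable_login(access_token: str) -> str:
    # Any token the provider issued for ANY app will pass this check, so an
    # attacker who collected a victim's token via a malicious app can replay
    # it here and be "logged in" as the victim (token substitution).
    resp = requests.get(
        USERINFO_URL,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()["sub"]  # provider's user id, blindly trusted as identity

# The fix is to authenticate with something bound to your own client, e.g. an
# OpenID Connect id_token whose audience ("aud") is your client_id.
```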
Given that OAuth is all about delegated authorization, meaning the entity using the access token may not be the user but some third-party service using the token on the user's behalf, using it as proof of identity makes no sense.
This point becomes clearer with limited permissions. If an access token is proof of identity, why limit what the user can do when you know it's the user?
In one way or another. Most are vulnerable to bugs in the standard (see sakurity.com/oauth), but every single one depends on a central authority, which is just stupid for auth.
I applaud the goal of simplifying authentication by wrapping with an easy-to-use service. But it's important to keep in mind that authentication is a subtle process, and a critical security component of many applications!
It's essential to follow best practices, and for developers to understand enough of the interaction to be confident in an implementation.
A few points to look out for after scanning the source for the project:
* if your goal is authentication (helping a user sign in to an app using an external identity) you should be sure to use an OpenID Connect flow (requesting a scope like "openid"), and be sure to convey the id_token back to the app so its signature can be verified (see the sketch after this list)
* you should be sure to create a non-guessable "state" parameter when generating an OAuth authorization request, and to verify this parameter upon completion of the OAuth core flow.
* your design needs to prevent a session fixation attack where a malicious user injects her own token into the callback url (tricking a user into signing into an app using the attacker's account, and potentially submitting data that the attacker can then access)
* it's best to avoid inserting an access token into a url (where it becomes part of a user's history, gets cached by proxies, etc) -- the OAuth code flow you're using avoids this, but then you turn around and inject the token into your own redirect
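For what it's worth, here is a minimal sketch of the state handling and id_token verification from the first two bullets (assuming a server-side session dict and the PyJWT library; the issuer, JWKS URL, and client_id are placeholders, not values from this project):

```python
import secrets
import jwt  # PyJWT, assumed dependency
from jwt import PyJWKClient

CLIENT_ID = "your-client-id"                          # placeholder
ISSUER = "https://accounts.example.com"               # placeholder
JWKS_URL = "https://accounts.example.com/jwks.json"   # placeholder

def new_state(session: dict) -> str:
    """Create a non-guessable state value and bind it to the user's session."""
    state = secrets.token_urlsafe(32)
    session["oauth_state"] = state
    return state

def check_state(session: dict, returned_state: str) -> None:
    """Reject the callback unless it echoes the state we issued."""
    expected = session.pop("oauth_state", None)
    if not expected or not secrets.compare_digest(expected, returned_state):
        raise ValueError("state mismatch: possible CSRF / session fixation")

def verify_id_token(raw_id_token: str) -> dict:
    """Verify signature, issuer and audience of the OpenID Connect id_token."""
    signing_key = PyJWKClient(JWKS_URL).get_signing_key_from_jwt(raw_id_token)
    return jwt.decode(
        raw_id_token,
        signing_key.key,
        algorithms=["RS256"],
        audience=CLIENT_ID,   # the token must have been issued for *this* app
        issuer=ISSUER,
    )
```

The audience check is what ties the token to this particular app, which is exactly the property a bare access token lacks.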
Re: all the other things mentioned — I presume the point of Docker in that list is that the system-software packages mentioned are inside the shipped Docker image. It's an "appliance."
"Ships as a Docker image" might, for a lot of projects, just be a roundabout version of "it's a .zip file; unpack it into /opt"—but I think it's justified if your project is essentially a bunch of distro packages with a layer of policy-glue making them interoperate in a special way. Like OpenStack DevStack, or a NAS server appliance, or an "all-in-one" [L]AMP stack.
Fair enough. However, my opinion on this still stands. I think something like a library, or even an executable/process, should try to decouple itself from the deployment process, and should be designed as such. It's very opinionated, I must admit, but it tends to take away a lot of friction from adoption. Also, when we add things like Docker to the mix, it is often quite unclear whether the said microservice can function without Docker (perhaps it uses something Docker-specific and cannot work without it); if it can, let's mention it, otherwise even if it is something I want, I won't spend the time to test it myself (yeah, I know I'm a very picky lad, but I bet this is something a lot of people do).
What are you talking about? Docker is a tool that positively contributes to the deployment experience and reduces friction in adoption. It is one of the tools that directly contributed to the adoption of microservices, because it makes dealing with a large number of services (500+) sane and possible.
You should learn more about Docker and DevOps, because with microservices the focus is increasingly on having developers support their services end-to-end, which means the devs are responsible for deploying and supporting the services they wrote in production.
>Docker is a tool that positively contributes to the deployment experience and reduces friction in adoption.
Well, it's one more thing to install, for starters. Do I need a specific version of Docker to run this? To me it's just an extra layer of complexity.
Well, personally I prefer using something like https://github.com/TykTechnologies/tyk, which is basically an API gateway; in addition to auth (not only OAuth), it also adds things like policies across multiple APIs, rate limiting, and quotas.
https://oauth.net/articles/authentication/
DO NOT USE THIS in its current state. Stop upvoting this.