I find the whole A vs B debate on this to be quite silly.
We have a "monolith" that can also be used in a "serverless" context (Az Functions).
The way it works is we have 99% of the solution in a common .NET 6 DLL that can be built either as an exe that runs on a VM or as an Az Function project that deploys to the cloud. The Az Function piece is maybe 500 lines of code and ultimately just extracts the ClaimsPrincipal, validates request tokens, and invokes the dependent services (just as the exe would via its internal ASP.NET Core pipeline).
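Each function endpoint is basically just glue along these lines (a sketch with made-up names like IOrderService and OrderFunctions, not our actual code):

    // Thin Az Function wrapper over the shared DLL (in-process model).
    using System.Collections.Generic;
    using System.Security.Claims;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;

    public interface IOrderService // lives in the common .NET 6 DLL
    {
        Task<IReadOnlyList<string>> GetOrdersForUserAsync(ClaimsPrincipal user);
    }

    public class OrderFunctions
    {
        private readonly IOrderService _orders;
        public OrderFunctions(IOrderService orders) => _orders = orders;

        [FunctionName("GetOrders")]
        public async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "orders")] HttpRequest req,
            ClaimsPrincipal principal) // populated by the platform auth layer
        {
            if (principal?.Identity?.IsAuthenticated != true)
                return new UnauthorizedResult();

            // Same call the self-hosted exe makes via its ASP.NET Core pipeline.
            return new OkObjectResult(await _orders.GetOrdersForUserAsync(principal));
        }
    }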
Having both options available over the same common code base is pretty damn powerful. We built a messaging interface between the on-prem and cloud components, so it's kind of one big hybrid solution right now.
The biggest reason we started down this path was to explore better ways to achieve compliance with industry regs. Having a bunch of semi-custom installs running on VMs for each customer is becoming quite scary as we grow. We are taking it slow at first and will probably go all the way over time. Being able to carefully select which pieces are on-prem vs which are cloud is going to be critical for us.
Ultimately, there is probably a happy balance somewhere in the middle and you could almost certainly engineer your product to support both modes of hosting at the same time. You can likely retrofit an existing function or monolith project to work either way as well.
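If anyone wants to try the dual-hosting route, the main trick is keeping all the service registrations in the shared project so each host is a thin shell over them. Something like this (again a sketch with hypothetical names, not our exact code):

    // In the shared DLL: one extension method that both hosts call.
    using Microsoft.Extensions.DependencyInjection;

    public interface IOrderService { }
    public class OrderService : IOrderService { } // stand-ins for real services

    public static class SharedServices
    {
        public static IServiceCollection AddCoreServices(this IServiceCollection services)
        {
            services.AddScoped<IOrderService, OrderService>();
            // ...the other 99% of the app gets wired up here...
            return services;
        }
    }

    // Host 1, the self-hosted exe (Program.cs in ASP.NET Core):
    //     builder.Services.AddCoreServices();
    //
    // Host 2, the Az Function project (a FunctionsStartup subclass):
    //     public override void Configure(IFunctionsHostBuilder builder)
    //         => builder.Services.AddCoreServices();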
So are you storing claims about the logged-in user (e.g. their roles) in a cookie, the way ASP.NET's stock authentication system does? I almost went with ASP.NET Core for a project, but I have a problem with that approach to authentication/authorization: it leads to stale claims. I figure it's better to store only the user ID in a signed (and maybe encrypted) cookie and read all the needed info about the user from the DB on demand. I guess I should have used ASP.NET Core but written my own auth system, or used an existing third-party one if I could find one I liked.
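Something like this is the shape I have in mind, in ASP.NET Core terms: the cookie carries only the user ID, and roles are re-read from the DB on each request via claims transformation (IUserDirectory is a made-up DB accessor):

    // ID-only cookie: the signed cookie holds just a NameIdentifier claim;
    // roles are loaded fresh from the DB per request, so they can't go stale.
    using System.Collections.Generic;
    using System.Security.Claims;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Authentication;

    public interface IUserDirectory // hypothetical DB accessor
    {
        Task<IReadOnlyList<string>> GetRolesAsync(string userId);
    }

    public class DbClaimsTransformation : IClaimsTransformation
    {
        private readonly IUserDirectory _users;
        public DbClaimsTransformation(IUserDirectory users) => _users = users;

        public async Task<ClaimsPrincipal> TransformAsync(ClaimsPrincipal principal)
        {
            var id = principal.FindFirst(ClaimTypes.NameIdentifier)?.Value;
            if (id is null) return principal;

            var identity = new ClaimsIdentity(principal.Identity);
            foreach (var role in await _users.GetRolesAsync(id)) // always current
                identity.AddClaim(new Claim(ClaimTypes.Role, role));
            return new ClaimsPrincipal(identity);
        }
    }
    // Registered via: services.AddTransient<IClaimsTransformation, DbClaimsTransformation>();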
Yes - we are using the default AAD authentication layer over our functions, which relies on basic cookies to get the job done.
The stale-claims issue might come up for us, but I believe we could address it in our sensitive functions by querying the source of truth to check that the current claims are still valid.
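Rough shape of what I mean, with a made-up IAccessChecker service backed by our DB as the source of truth:

    // In a sensitive function: ignore possibly-stale claims and re-check
    // authorization against the DB before doing the dangerous thing.
    using System.Security.Claims;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Mvc;

    public interface IAccessChecker // made-up service backed by the DB
    {
        Task<bool> StillAuthorizedAsync(string userId, string requiredRole);
    }

    public class SensitiveOperations
    {
        private readonly IAccessChecker _access;
        public SensitiveOperations(IAccessChecker access) => _access = access;

        public async Task<IActionResult> Approve(ClaimsPrincipal principal)
        {
            var userId = principal.FindFirst(ClaimTypes.NameIdentifier)?.Value;

            // The cookie/token claims may be stale; the DB is authoritative.
            if (userId is null || !await _access.StillAuthorizedAsync(userId, "Approver"))
                return new ForbidResult();

            // ...perform the sensitive work...
            return new OkResult();
        }
    }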
Hmm, I guess using the default AAD authentication layer is useful for outsourcing compliance, which you quite reasonably want to do and which I'd like to do as well.
We have a "monolith" that can also be used in a "serverless" context (Az Functions).
The way it works is we have 99% of the solution in a common .NET6 DLL that can be built either as an exe that runs on some VM, or as an Az Function project that goes to the cloud. The Az Function piece is maybe 500 lines of code and ultimately just extracts ClaimsPrincipal, validates request tokens, and invokes the dependent services (just as the exe would via its internal AspNetCore pipeline).
Having both options available over the same common code base is pretty damn powerful. We built a messaging interface between the on-prem and cloud components, so its kind of 1 big hybrid solution right now.
The biggest reason we started down this path was to explore better ways to achieve compliance with industry regs. Having a bunch of semi-custom installs running on VMs for each customer is becoming quite scary as we grow. We are taking it slow at first and will probably go all the way over time. Being able to carefully select which pieces are on-prem vs which are cloud is going to be critical for us.
Ultimately, there is probably a happy balance somewhere in the middle and you could almost certainly engineer your product to support both modes of hosting at the same time. You can likely retrofit an existing function or monolith project to work either way as well.