I understand the intention was to create a DML interface for DDL, but I don't really understand why. I guess batch DDL operations are a little easier with meta, but it seems like a lot of work for relatively little gain. There is also the issue that expensive DDL operations (e.g. ALTER TABLE) are hidden behind a DML interface, potentially making it easier to shoot yourself in the foot.
Were there problems higher up in the web stack that required the functionality meta provides?
For Aquameta's purposes, we're trying to make it easier for people to build visual interfaces that do programming. Higher up in the stack is a PostgreSQL admin GUI. Without meta, the backend of this interface would be the DDL grammar, so when the user took GUI actions, they would be converted to CREATE statements etc. under the hood using string concatenation and other dreadfulness. With meta, we can instead build against meta's data model, so building this GUI is just as simple as any traditional web data-manipulation interface. It seems simpler and more elegant.
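To make the contrast concrete, here's a rough sketch of what the GUI backend does in each case. The meta view and column names below follow meta's documented data model but may differ by version, so treat them as illustrative:

```sql
-- Without meta: the GUI backend concatenates DDL strings, e.g.
--   execute format('create table %I.%I ()', 'widgets', 'part');

-- With meta: the same action is ordinary DML against updatable views.
insert into meta.schema (name) values ('widgets');
insert into meta."table" (schema_name, name) values ('widgets', 'part');

-- Destructive DDL becomes an equally ordinary delete,
-- which is exactly the foot-gun the parent comment worries about:
delete from meta.schema where name = 'widgets';  -- drops the schema
```

The upside is that the GUI only ever speaks INSERT/UPDATE/DELETE; the downside is that a stray DELETE is now a DROP.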
It is a lot easier to shoot yourself in the foot. You can do very destructive operations through meta. It's more powerful for doing useful things too. Just be careful.
I agree; I'm not 100% sure I see the life-changing utility of their abstraction.
HOWEVER, if they factored out their VIEWS component so that I could have a more grokkable, read-only version of INFORMATION_SCHEMA, that would be useful.
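A minimal sketch of that read-only idea, built directly over pg_catalog rather than taken from meta's actual code (all names here are illustrative):

```sql
-- A hypothetical "grokkable INFORMATION_SCHEMA": plain views over
-- pg_catalog with no insert/update/delete rules attached, so they
-- can never be used to mutate the schema.
create schema readonly_meta;

create view readonly_meta.columns as
select n.nspname  as schema_name,
       c.relname  as table_name,
       a.attname  as column_name,
       format_type(a.atttypid, a.atttypmod) as data_type,
       not a.attnotnull as is_nullable
from pg_class c
join pg_namespace n on n.oid = c.relnamespace
join pg_attribute a on a.attrelid = c.oid
where c.relkind = 'r'          -- ordinary tables only
  and a.attnum > 0             -- skip system columns
  and not a.attisdropped
  and n.nspname not in ('pg_catalog', 'information_schema');
```

This gives you meta's readable data model without the updatable-view foot-gun.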
As interesting as this is, I can see why the original SQL "designers" didn't go down this route and instead kept DDL separate.
The SQL standard, and Postgres's documentation in particular, go to great lengths to be very specific about exactly what behaviour you should expect when performing SELECTs, INSERTs, UPDATEs, and DELETEs (most notably the concurrency behaviour). Once you start operating on the schema, all of those promises go out the window.
So these "queries" would mostly behave as you'd expect, but not entirely.
At best, you'd have to fill the documentation with "oh, except if you're operating on the schema..." caveats. At worst, it could be horribly confusing to users.
I am also writing a microservices API inside Postgres; it's a work in progress. The HTTP request is captured and sent to Postgres as JSON. Postgres does the API authentication, builds the JSON response, and logs requests/responses.
For development purposes I'm using Sinatra (Ruby) and Gin (Go) for the HTTP processing.
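The pattern described above might look something like this in PL/pgSQL. This is a hypothetical sketch, not the commenter's actual code; every schema, table, and function name is made up for illustration:

```sql
-- The web layer (Sinatra/Gin) forwards the raw HTTP request as jsonb;
-- one postgres function authenticates, responds, and logs.
create schema api;

create table api.request_log (
    id        bigserial primary key,
    logged_at timestamptz not null default now(),
    request   jsonb not null,
    response  jsonb not null
);

create function api.handle(request jsonb) returns jsonb
language plpgsql as $$
declare
    response jsonb;
begin
    -- toy auth check: real code would validate the token
    if request->'headers'->>'authorization' is null then
        response := jsonb_build_object(
            'status', 401,
            'body', jsonb_build_object('error', 'unauthorized'));
    else
        response := jsonb_build_object(
            'status', 200,
            'body', jsonb_build_object('echo', request->'body'));
    end if;

    -- request/response logging happens in the same transaction
    insert into api.request_log (request, response)
    values (request, response);

    return response;
end;
$$;
```

The web layer then reduces to `select api.handle($1::jsonb)` plus serializing the result back onto the wire.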