"...it can be easier to change grammar/law/types/schema when not explicit."
Now that's quite an interesting statement. I'd say it's true for humans but false for computers, and it's paradoxical to think about why that is.
If computers were as intelligent as humans, you wouldn't have to worry about giving your program any structure, because the computer could change that structure later. But sadly, spaghetti code isn't all it's cracked up to be.
Similarly, an application where you don't bother thinking about your schema beforehand isn't going to be an application you can change easily later.
It is not just spaghetti code, though, right? It is code that does not constantly check that its input fits an expected pattern. Which is most code. I'm reminded of this talk by Sussman.[1] Essentially, in software we are (understandably) adamant about not modifying the input of our application in "pure" functions. Nature does exactly the opposite, often attempting to make the input work at all costs.
So, whereas a schema-based application that models a user with a single address is unable to cope with being given two addresses, a schema-less approach that just stores blobs simply stores what it was given. If the code that reads this is unable to make sense of multiple addresses, it will raise an error. Not necessarily unlike a user being told to send a package to an address, but given a list of addresses.
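To make the contrast concrete, here's a minimal sketch (all names here, like User, BlobStore, and ship_package, are hypothetical and just for illustration): a schema-based model has nowhere to put a second address, so the problem surfaces at write time, while a blob store happily accepts the data and the error only shows up in whichever reader assumes a single address.

    # Minimal sketch: schema-on-write vs. schema-on-read.
    # All names (User, BlobStore, ship_package) are hypothetical.

    from dataclasses import dataclass
    import json


    @dataclass
    class User:
        """Schema-based model: exactly one address, declared up front."""
        name: str
        address: str


    class BlobStore:
        """Schema-less store: accepts any JSON-serializable blob as-is."""
        def __init__(self):
            self._blobs = {}

        def put(self, key, blob):
            self._blobs[key] = json.dumps(blob)  # no validation at write time

        def get(self, key):
            return json.loads(self._blobs[key])


    def ship_package(user_blob):
        """A reader that assumes one address and errors if it can't make sense of the data."""
        address = user_blob["address"]
        if not isinstance(address, str):
            raise ValueError(f"expected one address, got: {address!r}")
        print(f"shipping to {address}")


    # Schema-on-write: a list of addresses doesn't fit the declared shape,
    # so a type checker or validator flags it before the data is ever stored.
    # User(name="Ada", address=["1 Main St", "2 Side St"])

    # Schema-on-read: the write succeeds; the failure is deferred to the reader.
    store = BlobStore()
    store.put("ada", {"name": "Ada", "address": ["1 Main St", "2 Side St"]})
    try:
        ship_package(store.get("ada"))
    except ValueError as err:
        print(f"reader failed: {err}")

The data isn't lost in the schema-less case, but every reader that needs a single address has to decide for itself what "the" address is, which is exactly the kind of deferred negotiation the comment above is describing.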