> I'm talking about the complexity that namespaces and entities add to the data model
I've worked a lot with XML, and I have no idea what complexity you're talking about. It just wasn't complex or difficult; once you learned what it was about, it became second nature. E.g. I spent a lot of time working with MXML -- an XML format for Adobe Flex markup, similar to XAML and a bunch of other formats of the same kind. It used XML namespaces a lot, but that was the least of my problems using it...
Again, I've never heard anyone who learned how and why to use XML namespaces complain about them. All the complaints about this feature come from people discovering it for the first time.
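For what it's worth, here's roughly what that looked like in practice (a Flex 3-era sketch from memory; the component and package names are made up): one namespace for the built-in mx vocabulary, another mapped to your own package, so two components with the same name can coexist in one document.

```xml
<!-- Sketch only: "com.example.components.*" and MyLabel are hypothetical -->
<mx:Application xmlns:mx="http://www.adobe.com/2006/mxml"
                xmlns:local="com.example.components.*">
  <mx:Label text="a built-in Flex component"/>
  <local:MyLabel/> <!-- your own component, disambiguated by its namespace -->
</mx:Application>
```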
> You can normalise and compare arbitrary pieces of JSON
Dream on. No, you cannot. It depends on the parser implementation. For example, take two 20-digit numbers whose 15 most significant digits are the same. Are they the same number or different numbers in JSON?
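To make that concrete, here's a minimal sketch of the problem in a JavaScript-based parser, where JSON.parse maps every number to an IEEE-754 double (roughly 15-17 significant decimal digits):

```typescript
// Two distinct 20-digit JSON numbers...
const a = JSON.parse("12345678901234567890");
const b = JSON.parse("12345678901234567891");

// ...that this parser cannot tell apart: both round to the same double.
console.log(a === b); // true
```

A parser that keeps arbitrary-precision integers (Python's json module, say) would report them as different, so whether two pieces of JSON compare equal depends on the implementation, not on the JSON text alone.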
The fact that it's 5 pages means nothing... it's 5 pages that define a bad language that creates a lot of problems when used. So what if it only took 5 pages to write? You could probably squeeze the Brainfuck definition into half a page -- so what, it's still a lot harder to use than JavaScript.
I worked with XML extensively for many years, starting back in the 1990s. When I say that namespaces add complexity to the data model, I'm not complaining about them being difficult to use or understand.
> Dream on. No, you cannot. It depends on the parser implementation. For example, take two 20-digit numbers whose 15 most significant digits are the same. Are they the same number or different numbers in JSON?
That's just a mildly interesting interoperability edge case that can be worked around. I agree that it's not good, but it's a problem on a wholly different level. XML elements not being comparable without non-local information is not an edge case, and it's not an oversight that can be fixed or worked around: it's by design.
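Here's a small sketch of what I mean (browser DOM APIs, made-up URN): the prefix on an element is resolved through declarations on its ancestors, so the element can't be interpreted, let alone compared, in isolation, and two textually different elements can be the same element in the data model.

```typescript
// Parse a document and return its first child element.
const firstChild = (doc: string): Element =>
  new DOMParser()
    .parseFromString(doc, "application/xml")
    .documentElement.firstElementChild!;

// <a:item/> and <b:item/> mean the same thing, but only because of the
// declarations on their *ancestors* -- non-local information.
const x = firstChild('<root xmlns:a="urn:example"><a:item/></root>');
const y = firstChild('<root xmlns:b="urn:example"><b:item/></root>');

console.log(x.tagName === y.tagName); // false ("a:item" vs "b:item")
console.log(x.localName === y.localName &&
            x.namespaceURI === y.namespaceURI); // true: the same element
```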
I'm not criticising XML for being what it is. XML tries to solve problems that JSON doesn't try to solve. But in order to do that, it had to introduce complexity that many people now reject.
Edit: I think we're talking past each other here. You are rightly criticising the JSON specification for being sloppy and incomplete. I don't dispute that. I'm comparing the models as they are _intended_ to work. And that's where XML is more complex because it tries to do more.