I've experienced a little bit of this same sentiment when I tell people I am using Flex for my current idea. They look down at me almost, without having ever touched Flex. Most people quickly equate it to Flash (ok, it does run in the Flash player and makes use of ActionScript), but it's vastly different. There is no drawing, no keyframes, etc. Now, I'm not saying it's for everyone, but it does make creating certain things exponentially easier. Ok, done ranting.
There is a small difference here in that the author proposes simplifying the experience of using a database, while requiring your users to have a properly-versioned flash player complicates the user experience. Less-supported OSes and browsers will inevitably have trouble working with proprietary data.
> But the underlying psychological framework is really a fear of irrelevancy. If you make things too simple my expertise will be less important. I will be less important.
Not necessarily. People generally become experts in technology x because it helps them solve problems they need to solve. For people whose problems are consistently solved by relational databases, a new technology that is simpler but less powerful (whether this is true of graph databases or not is worth debating, of course) is irrelevant to solving their problems, because easier to use is pointless if you're already an expert in the older approach.
When you see professional photographers shrug off point-and-shoot cameras, it need not be because of malice or "fear of irrelevancy", but because point-and-shoot cameras are intended for someone else entirely and are worthless to them.
When an expert ignores a new technology that's not relevant to him, sure, that makes sense. But when an expert responds snidely and defensively to a new technology, he apparently doesn't think it's irrelevant.
There's also an aspect, beyond simple ego and fear of irrelevance, to a new technology being merely irrelevant or less useful than an old one.
Consider an example from the original post. For the world of computing at large, GUIs are a giant win. But many expert computer users prefer a command line for many or even most tasks, and some even use windowing systems like xmonad or ratpoison that are optimized for command-line use.
Likewise, while word processors have taken over a lot of the market for creating documents, many programmers are much more likely to use text editors -- often ones that trace their heritage back to the 70s like emacs or vim.
There's nothing wrong with these new technologies, but it's perfectly valid for people to not prefer them, right? Some newer, friendlier technologies may be better for people who don't want to go through a learning curve, but may actually be worse for people who have already gone through the learning curve or are willing to do so in the future. Perfectly normal.
Here's the thing, though: when there is a large community of people using your technology (whether a CLI, a text editor, a relational database, or a programming language), there are a lot more new, useful tools being built on top of that technology. This means that someone producing a new technology will siphon resources away from a technology that its earlier users consider superior for their purposes.
Irrespective of being considered irrelevant, having fewer new tools at your disposal actually reduces what you can get done. Which sucks, right?
His theses in the older article are that A) More complex relationships between more disparate data are becoming more useful; B) It's hard to establish new entity relationships or new entities in a relational database.
B) Not really true. Learn relational theory. Adding an entity or relationship to a relational schema is definitely easier than updating custom data structures, file formats, and all affected routines, which seems to be what he's suggesting instead.
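To make that concrete: adding a brand-new relationship to an existing relational schema is usually just one new table, with everything that already exists left untouched. Here's a minimal sketch (the schema and all names are made up for illustration) using Python's built-in sqlite3:

```python
import sqlite3

# Hypothetical existing schema: users and posts are already in production.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
""")
conn.execute("INSERT INTO users VALUES (1, 'alice'), (2, 'bob')")
conn.execute("INSERT INTO posts VALUES (1, 1, 'hello')")

# Establishing a new many-to-many relationship (say, users bookmarking
# posts) is one new table; existing tables and queries don't change.
conn.execute("""
    CREATE TABLE bookmarks (
        user_id INTEGER REFERENCES users(id),
        post_id INTEGER REFERENCES posts(id),
        PRIMARY KEY (user_id, post_id)
    )
""")
conn.execute("INSERT INTO bookmarks VALUES (2, 1)")

# Querying across the new relationship is an ordinary join.
rows = conn.execute("""
    SELECT u.name, p.title
    FROM bookmarks b
    JOIN users u ON u.id = b.user_id
    JOIN posts p ON p.id = b.post_id
""").fetchall()
print(rows)  # [('bob', 'hello')]
```

Contrast that with a homegrown flat-file format, where the same change means versioning the format and touching every routine that reads or writes it.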
A) True. But absolutely nothing about this is easier to deal with by not using a database than by using a database. Loads of new flatfiles, query APIs, graphs, and datamining algorithms don't mean it's time to throw out the DB. It just means things are harder.