I devoted 10 years to game content distribution, packing, compression, etc. (I'm not in gamedev anymore.)
This is a very easy problem, usually solved by attaching a fairly simple script that is aware of your file formats to any commercial installer system.
Some companies even sell more or less standard solutions for this, but in reality, out of any given 1,000 games, 900 will have very different data formats, and all of them have fairly good reasons for it - universal "patch systems" really create more problems than they solve.
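To make the idea concrete, here is a minimal sketch (Python; the old_build/ and new_build/ directory names and the hash-based comparison are my own assumptions, not any particular installer's API) of the kind of script that decides what a patch needs to carry. A real one would additionally understand your archive formats so it can patch inside them:

    # Hypothetical sketch of a "format-aware patch script": diff two build trees
    # by content hash and list the files an installer would need to ship.
    import hashlib
    from pathlib import Path

    def file_digest(path: Path) -> str:
        """Return a SHA-256 hex digest of a file's contents."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def build_patch_manifest(old_root: Path, new_root: Path) -> list[Path]:
        """List files that are new or changed between two build trees."""
        changed = []
        for new_file in new_root.rglob("*"):
            if not new_file.is_file():
                continue
            rel = new_file.relative_to(new_root)
            old_file = old_root / rel
            if not old_file.exists() or file_digest(old_file) != file_digest(new_file):
                changed.append(rel)
        return changed

    if __name__ == "__main__":
        # Directory names are illustrative; feed the result to whatever installer you use.
        for rel in build_patch_manifest(Path("old_build"), Path("new_build")):
            print(rel)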
I think the "900 different data formats" problem is something that will go away as we move towards better tools which cover all the standard use cases.
Gamedev is riddled with really smart people who reinvent the wheel all the time because they found a way to micro-optimize this or that. They get to do this because until recently there was no "good enough" solution for a wide range of games (or the "solution" was priced with enough zeroes to make Bill Gates cringe).
But you saw how popular Unity got, and how fast. That's the games industry in a nutshell: ripe for solutions that work for more than just one studio.
BTW Unity, for all its excellence, has a really horrible data format for content and patch distribution, and it had and still has huge problems with this. Perhaps it's the legacy of early over-engineering and the struggle to protect games from easy reverse engineering.
Compare that to, say, Quake's simple incremental zips with alphabetical file loading order: a total no-brainer to implement and use. (I have even seen zips with custom LZMA compression!)
So, if anything, someone will have to solve a problem of artificially created obstacles, not a problem per se.
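To show just how much of a no-brainer the Quake-style scheme is, here's a toy sketch (Python; the *.zip naming and directory layout are illustrative, not Quake's actual pak handling): open archives in alphabetical order, let later ones override earlier ones, and a patch is just one more zip.

    # Minimal sketch of incremental archives with alphabetical loading order.
    # Archive and directory names here are assumptions for the example.
    import zipfile
    from pathlib import Path

    class PakStack:
        def __init__(self, data_dir: str):
            self._index: dict[str, zipfile.ZipFile] = {}
            for pak_path in sorted(Path(data_dir).glob("*.zip")):  # alphabetical order
                pak = zipfile.ZipFile(pak_path)
                for name in pak.namelist():
                    self._index[name] = pak  # later paks override earlier entries

        def read(self, name: str) -> bytes:
            """Return the newest version of a file across all loaded archives."""
            return self._index[name].read(name)

    # usage:
    # assets = PakStack("data")          # loads pak0.zip, pak1.zip, ...
    # texture = assets.read("textures/wall01.tga")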
The path forward for games is roughly similar to where digital audio is now: Comprehensive workstation environments with an easing facade through plugins, presets, etc. The coarse elements of a rendering algorithm or a piece of game logic can be reduced to a processing graph, behavior tree, or other convenient abstractions. They can plug into each other by exposing both assets and processing as globally addressable data. Original coding for game logic will still be required for the foreseeable future, but most of the development problem is weighted towards getting assets in the game, and that can be abstracted.
This is done in bits and pieces across existing engines and third-party tools, but there's a lot of room to make it cheaper and easier.
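As a rough illustration of the "processing graph over globally addressable data" idea, here is a toy sketch (Python; every address and name is made up, not from any real engine): assets and processing steps live behind the same addressing scheme, so steps compose by referencing each other's outputs.

    # Toy sketch of a processing graph over globally addressable data.
    from typing import Callable

    class Graph:
        def __init__(self):
            self._producers: dict[str, tuple[Callable, list[str]]] = {}
            self._cache: dict[str, object] = {}

        def asset(self, address: str, value: object) -> None:
            """Register a raw asset under a global address."""
            self._cache[address] = value

        def node(self, address: str, fn: Callable, inputs: list[str]) -> None:
            """Register a processing step whose inputs are other addresses."""
            self._producers[address] = (fn, inputs)

        def resolve(self, address: str) -> object:
            """Evaluate (and cache) whatever lives at an address."""
            if address not in self._cache:
                fn, inputs = self._producers[address]
                self._cache[address] = fn(*(self.resolve(i) for i in inputs))
            return self._cache[address]

    # usage: a texture and a mesh combine into a renderable through the same addressing
    g = Graph()
    g.asset("tex://wall01", "wall01.png bytes")
    g.asset("mesh://room", "room geometry")
    g.node("render://room", lambda mesh, tex: f"draw {mesh} with {tex}",
           ["mesh://room", "tex://wall01"])
    print(g.resolve("render://room"))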
I used to use irrlicht and Ogre. Both have the problem of only really doing graphics and, to a certain extent, input. In comparison, Unity and Unreal offer the whole package: graphics, asset pipeline, audio, networking, and physics.
Speaking from experience, as I'm currently making the jump to Unity for my projects: the time savings from choosing one of the all-in-one engines instead of gluing separate engines together are really substantial.