Java moving to a six-month release cycle is good. OpenJDK is good. Red Hat providing long-term support is good. All of this is good.
Where I'm worried this fails is with dependencies, and with developing plugins for other Java projects: for example, building a plugin for IntelliJ and having to deal with the different versions of IntelliJ users may have.
It's rare that Java the language or the JDK breaks backward compatibility. Code written before generics, for example, still compiles with JDK 10, albeit with a bunch of warnings, so long as you specify an appropriate language level.
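As a minimal illustration (the class is made up, but the behavior is standard javac), this pre-generics style still compiles and runs on JDK 10; the compiler just nags about unchecked operations:

```java
import java.util.ArrayList;
import java.util.List;

public class RawTypes {
    public static void main(String[] args) {
        // Pre-generics style: a raw List with no type parameter.
        List names = new ArrayList();
        names.add("hello"); // "uses unchecked or unsafe operations" warning, not an error
        String first = (String) names.get(0); // explicit cast, as pre-Java-5 code did
        System.out.println(first);
    }
}
```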
The ABI, however, can and does break, usually when one of your dependencies updates to a language level not supported by your toolchain or runtime environment. The common solutions are to update your toolchain and language level, to "desugar" the artifact where possible, or to simply not update the dependency. The Android world has been dealing with this for a long time (since Java 8 was introduced), and it's not a completely terrible situation.
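When the mismatch does bite, the usual symptom is an UnsupportedClassVersionError naming a class-file version newer than your runtime understands. You can check what a dependency was actually compiled for with javap -verbose, or with a small sketch like this (the class here is mine, but the header layout is the documented class-file format):

```java
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class ClassVersion {
    // Usage: java ClassVersion SomeClass.class
    public static void main(String[] args) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
            // Class-file header: 4-byte magic, then minor and major version.
            if (in.readInt() != 0xCAFEBABE) {
                throw new IOException("not a class file: " + args[0]);
            }
            int minor = in.readUnsignedShort();
            int major = in.readUnsignedShort(); // 52 = Java 8, 53 = Java 9, 54 = Java 10
            System.out.println(args[0] + ": class file version " + major + "." + minor);
        }
    }
}
```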
The problem you mention with IntelliJ and its (lack of) stable APIs has more to do with IntelliJ itself than with the languages used or the tools it's built with.
The biggest ABI problem I've run into is with frameworks (Spring etc.) that emit bytecode at runtime.
I tried upgrading our project to Java 10, and Spring 4 and Camel (I can't remember the version) were the main problems because of their runtime-generated proxies. The fix is to upgrade to Spring 5, but that's a whole bunch of other work, and I'm unsure of the scope of the work to upgrade Camel.
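For anyone who hasn't hit this: "runtime-generated proxies" are classes defined as bytecode while the application runs, so their class-file handling comes from the framework's bundled bytecode library rather than from javac. Spring uses CGLIB/ASM for the heavy lifting, but the JDK's own java.lang.reflect.Proxy shows the mechanism in miniature (this sketch is illustrative, not Spring's actual machinery):

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

public class ProxyDemo {
    interface Greeter {
        String greet(String name);
    }

    public static void main(String[] args) {
        Greeter target = name -> "hello, " + name;

        // Delegate every call on the proxy to the real object.
        InvocationHandler handler = (proxy, method, methodArgs) ->
                method.invoke(target, methodArgs);

        // The proxy class is generated as bytecode at runtime; it never
        // existed at compile time.
        Greeter proxied = (Greeter) Proxy.newProxyInstance(
                Greeter.class.getClassLoader(),
                new Class<?>[] { Greeter.class },
                handler);

        System.out.println(proxied.greet("world"));
    }
}
```

Libraries that generate or parse bytecode hardcode knowledge of the class-file format, so a new class-file version (53 for Java 9, 54 for Java 10) makes them fail until the bundled ASM is updated, which is roughly why the fix ends up being "upgrade to Spring 5".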
All that said, the Java ABI is far more forgiving than Scala's. Every Scala artifact has the Scala binary version appended to its name (spark-core_2.11 vs. spark-core_2.12, for example), SBT's '%%' operator exists to pick the right suffix implicitly, and the whole scheme causes me no end of grief.
Last time I looked at upgrading our Spark code to run on JRE 10, Scala was the blocker: JRE 10 support only arrived in Scala 2.12 (though it was recently backported to 2.11), and Spark only supports Scala 2.10 and 2.11 at the moment; apparently the work to support 2.12 is still ongoing.
I remember running into an issue sort of like that with Struts (IIRC). I could use Java 8 anywhere in the codebase except in JSP files, because Struts would choke on the new bytecodes it didn't know were now valid.
This changed somewhat starting with Java 9. Quite a few libraries broke because deprecated classes were finally removed and the version format changed.
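The version format change is easy to underestimate: java.version went from strings like "1.8.0_181" to "9" or "10.0.2", which broke ad-hoc parsers built on the old "1.x" convention. A sketch of the failure mode, runnable on any JDK:

```java
public class VersionCheck {
    public static void main(String[] args) {
        String version = System.getProperty("java.version");

        // Classic pre-9 idiom: versions looked like "1.8.0_181", so code took
        // the second dot-separated field as the major version.
        String[] parts = version.split("\\.");
        if (parts.length > 1) {
            // Prints "8" on JDK 8, but "0" on "9.0.4" or "10.0.2".
            System.out.println("naive major version: " + parts[1]);
        } else {
            // JDK 9 GA reported plain "9": nothing after the first field at all.
            System.out.println("naive parse has nothing to read: " + version);
        }
        // JDK 9 added Runtime.version() as the supported replacement.
    }
}
```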
Java also has a considerable class of libraries and tools that tend to break on every major release: mocking frameworks, dependency-injection libraries, annotation processors, and anything that relies heavily on bytecode generation. I'd say most Java projects contain at least one of these, so migrating to a newer Java is not just a matter of installing a new JDK and changing your JAVA_HOME.
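And it's not limited to bytecode-heavy libraries. Starting with Java 9, the Java EE modules are no longer resolved by default, so perfectly ordinary application code can break too. A small example, assuming plain classpath compilation with no extra flags:

```java
// Compiles and runs fine on JDK 8. On JDK 9/10 the java.xml.bind module is
// not resolved by default, so this fails unless you pass
// --add-modules java.xml.bind, and the module is deprecated for removal.
import javax.xml.bind.DatatypeConverter;

public class LegacyBase64 {
    public static void main(String[] args) {
        System.out.println(DatatypeConverter.printBase64Binary("hello".getBytes()));
    }
}
```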
Java 9 is sort of a special situation, because they finally turned off something that people were never supposed to do in the first place and that they'd been warning people about for a very long time.
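Concretely, what got clamped down on is deep reflection into JDK internals. A minimal sketch (String's private "value" field is a real JDK internal; the class around it is just for illustration):

```java
import java.lang.reflect.Field;

public class DeepReflection {
    public static void main(String[] args) throws Exception {
        // Reaching into a JDK class's private internals: common in serializers,
        // mocking frameworks, and DI containers, but never a supported API.
        Field value = String.class.getDeclaredField("value");

        // Works silently on JDK 8. On JDK 9/10 the default --illegal-access=permit
        // still allows it but prints "WARNING: An illegal reflective access
        // operation has occurred"; the stated plan is for the default to
        // become a hard denial.
        value.setAccessible(true);

        System.out.println(value.get("hello").getClass()); // char[] on 8, byte[] on 9+
    }
}
```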
My entire career has been as a Java developer, and I've never had updating Java to a newer version cause breakage.
Upgrading third-party tools, like Maven or various other things, can certainly do that. But I haven't had it happen with Java itself.
I do wonder if that will slowly change under the new ownership. Java 9 and 10 had some pretty risky changes compared to prior releases. Take a look at the migration guide.
Usually you don't see sentences like 'For every tool and third-party library that you use, you may need to have an updated version that supports at least JDK 9.' or 'Check the websites for your third-party libraries and your tool vendors for a version of each library or tool that's designed to work on JDK 9 or 10.' listed as things to watch for in a Java release.