Pros:
* has faster cycle times and integrates into the compilation workflow
* emits fewer false positives
* active maintainers fix issues
* releases several times per year
Cons:
* FindBugs has a greater breadth of checks
* current error-prone releases only work with Java 8
Also, error-prone works in a fundamentally different way from FindBugs: error-prone is a compiler plugin, whereas FindBugs is an after-the-fact static analyser. This means, for example, you can include dependencies on the codebase you're analysing in your FindBugs checks, but not in error-prone.
This turns out to be really quite relevant if you're building domain-specific static analysis checks, as opposed to just running standard analyses.
Can you provide more information on how this is achieved? As far as I know there is no official, supported, portable API for Java compilers. The only thing I'm aware of is APT and anything beyond that would require significant rearchitecture of the existing compilers.
Technically, it's not a compiler plugin - it actually replaces javac by extending JavaCompiler, wrapping it and applying additional verifications without altering the output. Effectively it's introducing its own API for compiler plugins, with those being the checks, very much akin to APT.
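Roughly, the shape is something like this - a minimal sketch using only the standard javax.tools entry point plus the compiler tree API, not error-prone's actual code:

    import java.util.Arrays;
    import javax.tools.JavaCompiler;
    import javax.tools.JavaFileObject;
    import javax.tools.StandardJavaFileManager;
    import javax.tools.ToolProvider;
    import com.sun.source.util.JavacTask;
    import com.sun.source.util.TaskEvent;
    import com.sun.source.util.TaskListener;

    public class CheckingCompiler {
        public static void main(String[] args) throws Exception {
            JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
            StandardJavaFileManager fm =
                javac.getStandardFileManager(null, null, null);
            Iterable<? extends JavaFileObject> units =
                fm.getJavaFileObjectsFromStrings(Arrays.asList(args));
            // On the reference implementation, getTask returns a JavacTask,
            // which lets us observe compilation without altering its output.
            JavacTask task = (JavacTask)
                javac.getTask(null, fm, null, null, null, units);
            task.addTaskListener(new TaskListener() {
                @Override public void started(TaskEvent e) {}
                @Override public void finished(TaskEvent e) {
                    if (e.getKind() == TaskEvent.Kind.ANALYZE) {
                        // Run extra verifications over the typed AST here;
                        // e.getCompilationUnit() is the root of the tree.
                    }
                }
            });
            task.call(); // bytecode comes out exactly as plain javac emits it
            fm.close();
        }
    }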
This is not close to a plugin. I don't see why the term plugin is used here. This is misleading people into believing error-prone is using a supported API.
I also don't see how it is "very much akin to APT". APT is an API where the A stands for abstract and the I stands for interface; it is therefore portable across compilers and supported. error-prone, by contrast, seems to be tied to the current implementation of javac.
It is as if words don't have meaning anymore; all that matters is how you feel.
I am very reluctant to use such a tool, because to me it looks likely to have similar problems to past Google tools that relied on implementation-internal APIs (e.g. Android or GAE/J). Supporting new versions of Java is going to require serious amounts of effort and is therefore going to be very late, if it happens at all.
The standard annotation processing APIs don't provide enough information to do the analyses we want to do, so we do hook into javac internals.
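To make that concrete: the portable javax.lang.model world stops at declarations, so even an ordinary annotation processor has to reach for the javac-specific Trees bridge just to see inside method bodies. A sketch (illustrative; not how Error Prone itself is structured):

    import java.util.Set;
    import javax.annotation.processing.AbstractProcessor;
    import javax.annotation.processing.ProcessingEnvironment;
    import javax.annotation.processing.RoundEnvironment;
    import javax.annotation.processing.SupportedAnnotationTypes;
    import javax.annotation.processing.SupportedSourceVersion;
    import javax.lang.model.SourceVersion;
    import javax.lang.model.element.Element;
    import javax.lang.model.element.TypeElement;
    import com.sun.source.tree.Tree;
    import com.sun.source.util.Trees;

    @SupportedAnnotationTypes("*")
    @SupportedSourceVersion(SourceVersion.RELEASE_8)
    public class BodyInspector extends AbstractProcessor {
        private Trees trees;

        @Override public synchronized void init(ProcessingEnvironment env) {
            super.init(env);
            // Trees.instance only works when the processor runs inside javac.
            trees = Trees.instance(env);
        }

        @Override public boolean process(Set<? extends TypeElement> annotations,
                                         RoundEnvironment round) {
            for (Element root : round.getRootElements()) {
                // The portable Element model describes declarations only; the
                // tree exposes statements and expressions in method bodies.
                Tree ast = trees.getTree(root);
                // ... match patterns over 'ast' here ...
            }
            return false;
        }
    }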
It does require a lot of effort to keep up with OpenJDK updates, but that's my team's job, and we have to do the work anyway to keep Error Prone running inside Google. We tie Error Prone to a specific version of javac, which matches the one we are currently using inside Google (https://github.com/google/error-prone-javac), so we don't have to support multiple different internal APIs. We are currently using a version of javac 9 from about a year ago, and we're planning a bump to something close to head maybe in Q1.
Eclipse integration would be a pain but would look a lot like what the Checker Framework does.
This is all well and good if you're a Google internal user who gets support from your team. Things are a bit different if you're a user from outside of Google.
The distinction I was trying to make was one between where in the toolchain it lives: error-prone is part of compilation, whereas FindBugs is after-the-fact.
Error-prone is a compiler plugin _mechanism_ - once you have an error-prone compiler, the checks are arguments to the compiler just like APT annotation processors are.
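Concretely, the invocations have the same shape. -processor is a standard javac flag; -Xep is how current error-prone releases toggle individual checks (the wrapper name below is illustrative, and the flag syntax has varied across versions):

    # standard annotation processing: processors are named as compiler arguments
    javac -processor com.example.MyProcessor Foo.java

    # error-prone: checks are compiler arguments too, once you compile
    # with the error-prone-enabled javac (illustrative wrapper name)
    javac-with-errorprone -Xep:DeadException:ERROR Foo.java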
It's built on a standard API - the JavaCompiler mechanism built into the JDK - but yes, the code does have further dependencies on com.sun packages. That is admittedly a little confusing: why expose the compiler for invocation and extension if extensions then have to rely on unsupported code?
So, yes, there are ways where the distinction is important, but they're orthogonal to the point I was trying to make.
Error-prone is somewhat tied to a specific version of the Java compiler - so you need javac 8, but you can set -source to an older version of the language. If the newer javac does not emit bytecode that works with your runtime, you can run two compiles (one error-prone compile for the errors, one production compile with whatever compiler you need).
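As a sketch of that two-pass arrangement (the error-prone entry point name is illustrative; -source, -target, and -d are standard javac flags):

    # pass 1: analysis only, with the error-prone javac 8; discard the classes
    javac-with-errorprone -source 1.7 -d /tmp/ep-discard *.java

    # pass 2: production build with the compiler that matches your runtime
    /path/to/jdk7/bin/javac -source 1.7 -target 1.7 -d classes *.java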
Haven't we learnt several times in the past that this is a bad idea? E.g. with Android or GAE/J. This is going to require a lot of effort for every upcoming Java release, delaying support for a long time - again, see Android or GAE/J. It also makes integration with Eclipse difficult at best.
It depends on what you mean by "bad idea". You've got three choices, basically: analyse source, analyse the AST, or analyse bytecode. There are advantages and disadvantages of each approach, but for what Google's trying to do - allow people to write their own checks - there are clear advantages to analysing the AST.
I've written checks in both error-prone and FindBugs, and error-prone is simply more natural. Part of that is that the FindBugs API is, for want of a better word, godawful; but mostly the AST is simply the best representation of the program for analysis.
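For a flavour of what "natural" means, this is roughly the shape of an error-prone check against the current public API (the check name and the matched method are made up, and the annotation's fields have shifted between releases, so treat this as a sketch):

    import com.google.errorprone.BugPattern;
    import com.google.errorprone.VisitorState;
    import com.google.errorprone.bugpatterns.BugChecker;
    import com.google.errorprone.bugpatterns.BugChecker.MethodInvocationTreeMatcher;
    import com.google.errorprone.matchers.Description;
    import com.sun.source.tree.MethodInvocationTree;

    @BugPattern(name = "NoFooCalls",
                summary = "Calls to legacy foo() are forbidden here",
                severity = BugPattern.SeverityLevel.ERROR)
    public class NoFooCalls extends BugChecker
            implements MethodInvocationTreeMatcher {
        @Override
        public Description matchMethodInvocation(MethodInvocationTree tree,
                                                 VisitorState state) {
            // 'tree' is javac's own typed AST node for the call site.
            if (tree.getMethodSelect().toString().endsWith("foo")) {
                return describeMatch(tree); // report at the offending node
            }
            return Description.NO_MATCH;
        }
    }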
If you're going to analyse the AST, you then have a further two choices: build your own AST, or hook into the compiler. Building it yourself is dangerous here: it repeats a lot of work that's already been done, and, more importantly, you want to be absolutely sure that what you're analysing is what you're actually building. Either way you have to do updates when the language changes, as any static analysis tool must.
Are there costs to this decision? Yep. Is it going to be the right tool for every job? Nope. But that doesn't mean that these decisions don't have reasoning behind them, and compelling reasons to go this way.
I am not questioning the AST approach. I'm questioning the approach of hooking into JDK internals and using JDK internals as an interface for custom check implementations.
Google has done similar things several times in the past (GWT, GAE/J, Android), and every time, updates to the latest Java required a lot of effort, were therefore late, and were eventually abandoned.
https://github.com/google/error-prone
http://errorprone.info/bugpatterns