Looks nice, though less full-featured than NLTK. I'd be interested to see how nicely they'd play together and whether applications could exploit the strengths of both at the same time. The only thing better than one good NLP framework is two good NLP frameworks, after all.
A recurring pattern with frameworks is that people keep bringing out tools: some disappear, some find niche applications, and some become mainstream. Though I haven't heard of many NLP toolkits (but I'm not in that field).
I want to jump into some basic NLP, but I'd like to stick with one or two toolkits. I had heard of NLTK before this, but are there any other comprehensive or reasonably successful frameworks out there one should be aware of? (Either in Python or something else.)
Many smaller components are made to be compatible with IBM UIMA (of Watson fame), so they can be integrated into a pipeline fairly easily. For examples of this in biomedical TM, see http://u-compare.org/ .
People will kill me for saying this, but truly: Python's performance isn't adequate for large-scale text mining, _especially_ if you want to do deep/full parsing. Shallow parsing as shown in this package's demo is more feasible.
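To make the shallow/deep distinction concrete, here's a toy noun-phrase chunker, a minimal sketch of the shallow-parsing idea. It assumes already POS-tagged input (tag names follow Penn Treebank conventions) and just groups runs of determiner/adjective/noun tags; real chunkers use richer grammars or learned models, and the tag set here is a deliberate simplification.

```python
# Toy NP chunker: shallow parsing as flat grouping of tagged tokens.
# Assumes pre-tagged (word, POS) pairs; the "grammar" (which tags may
# appear inside an NP) is an illustrative simplification, not a standard.

def np_chunk(tagged):
    """Group maximal runs of DT/JJ/NN-style tags into flat NP chunks."""
    np_tags = {"DT", "JJ", "NN", "NNS", "NNP"}
    chunks, current = [], []
    for word, tag in tagged:
        if tag in np_tags:
            current.append(word)
        elif current:
            # A non-NP tag ends the current chunk.
            chunks.append(" ".join(current))
            current = []
    if current:
        chunks.append(" ".join(current))
    return chunks

sentence = [("the", "DT"), ("quick", "JJ"), ("fox", "NN"),
            ("jumped", "VBD"), ("over", "IN"),
            ("the", "DT"), ("lazy", "JJ"), ("dog", "NN")]
print(np_chunk(sentence))  # → ['the quick fox', 'the lazy dog']
```

The point is the cost profile: this is one linear pass over the tags, whereas deep/full parsing has to search over nested tree structures, which is where an interpreted language's per-operation overhead really starts to bite.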
I personally find NLTK convoluted, but in its favor, it does have readers for a TON of corpora, which is really nice.
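For anyone who hasn't used those readers: the value is a uniform interface (file IDs, `raw()`, `words()`, etc.) over wildly different corpus formats on disk. Here's a stripped-down sketch of that idea in plain Python; the class and method names are illustrative, not NLTK's actual API.

```python
# Minimal sketch of the corpus-reader pattern: a uniform fileids/raw/words
# interface over a directory of plain-text files. Loosely modeled on what
# corpus readers provide; names here are made up for illustration.
import os
import re
import tempfile

class PlainTextCorpus:
    def __init__(self, root, pattern=r".*\.txt$"):
        self.root = root
        self.fileids = [f for f in sorted(os.listdir(root))
                        if re.match(pattern, f)]

    def raw(self, fileid):
        with open(os.path.join(self.root, fileid), encoding="utf-8") as fh:
            return fh.read()

    def words(self, fileid):
        # Naive \w+ tokenizer; real readers plug in a tokenizer
        # appropriate to each corpus's format.
        return re.findall(r"\w+", self.raw(fileid))

# Usage: build a throwaway one-file corpus and read it back.
with tempfile.TemporaryDirectory() as root:
    with open(os.path.join(root, "demo.txt"), "w", encoding="utf-8") as fh:
        fh.write("Colorless green ideas sleep furiously.")
    corpus = PlainTextCorpus(root)
    print(corpus.fileids)            # → ['demo.txt']
    print(corpus.words("demo.txt"))
```

Writing one of these per corpus format is tedious, which is exactly why having a big library of them already done is such a draw.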
My friends in the natural language field tell me Python and NLTK are more common than Java. Then again, this is at a sort-of Python-centric university (Toronto).