You know, if you're already on the command line, gnuplot and github already provide ~similar functionality. I can do the same in ~the same number of keystrokes if things are aliased properly. Also, keeping things local is nice.
I use gnuplot to work with large datasets I extract from telco carrier logs (for client usage analysis). It's one of the few tools you can throw a 100,000 point dataset at and it graphs it in seconds. Try that with most scripting language graphing libraries, and you'll be there for a few minutes.
The problem I have with gnuplot -- and maybe I'm the limitation, not the software -- is that I find it difficult to use, and the results I get from a given input often seem inconsistent. Not to appear self-deprecating, but I really do think I might be the limitation here. Regardless, I often waste time trying to figure out why gnuplot is behaving the way it is.
I have no idea if chartulo.us is going to choke on my datasets, but I hope I'll be able to use it. I really could use a tool that is a little easier to use than gnuplot.
Tableau does nearly instant plotting of millions of data points with a nice user interface; it may be worth checking out if your job depends on visualizing massive amounts of data.
Nothing fancy. Either a shared folder in my home directory for users on the same machine, or email. Most of my work involves separate individuals showing off their own data rather than working together on one dataset. That could be a cool feature though--project groups or something.
Ahh cool, I see the attraction. I went through a phase of using the Google Chart API instead of gnuplot (you just wget a URL with your data piped into it). Coupled with a big bash history, it works fairly well.
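In case it helps anyone, here's a minimal Python sketch of that pattern (built around the old Image Charts endpoint and its cht/chs/chd parameters, which Google has since deprecated, so treat it as illustrative only; the data values are made up):

    # Build the chart URL and fetch it, the same way you'd otherwise wget it.
    import urllib.parse
    import urllib.request

    values = [3, 7, 4, 9, 12, 8, 15]               # e.g. numbers piped in from the shell
    params = {
        "cht": "lc",                                # line chart
        "chs": "500x250",                           # image size in pixels
        "chds": "a",                                # auto-scale the data range
        "chd": "t:" + ",".join(map(str, values)),   # text-encoded data series
    }
    url = "https://chart.googleapis.com/chart?" + urllib.parse.urlencode(params)

    # Equivalent to: wget -O chart.png "$url"
    with urllib.request.urlopen(url) as resp, open("chart.png", "wb") as out:
        out.write(resp.read())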
I've learned enough R to be dangerous. ggplot2 looks gorgeous without trying. It's my current favourite torch for shining on perf and capacity issues (I dream of one day identifying a perf issue's root cause with nothing more than the correlation function!).
It has a bit of a learning curve, but it's an extremely flexible way to quickly visualize data in a lot of different ways and get a feel for what's going on in your dataset (you should probably already have some R familiarity, since the language has its idiosyncrasies).
At least from my perspective, a major problem is maintaining control of your information. How does Chartulo.us store the data? Where? Can it be removed immediately post retrieval?
I would much prefer generating the graph locally and then uploading to the service - it's very rare that graphs I generate need to be shared rather than embedded in a paper.
Another option is matplotlib, which integrates nicely with IPython notebooks (http://ipython.org/); well worth a look if anyone's interested. Its integration with numpy/pandas etc. is superb, and on top of all the benefits of plotting with matplotlib you get IPython and all its features for free. For example: http://imgur.com/Le8px
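To make that concrete, here's a minimal sketch of the pandas + matplotlib workflow; the file name and column names are made up for illustration:

    # Load a (hypothetical) CSV and plot hourly totals; pandas hands the
    # actual plotting off to matplotlib.
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("usage.csv", parse_dates=["timestamp"])
    hourly = df.set_index("timestamp")["bytes"].resample("1h").sum()

    hourly.plot()
    plt.ylabel("bytes per hour")
    plt.tight_layout()
    plt.savefig("usage.png")    # or just display inline in an IPython notebook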
The animation of the command line is a little annoying because it has all this cruft from the package-installer messages. I got bored halfway through, missed the important part, and had to replay it. Just put the two command lines there and be done with it. :)
When (as in the current case) I don't see a way to download source code, I assume it is a startup or a project that will become a startup if it becomes sufficiently popular.