
I wonder what their stack is. I can't readily determine that, which might just be a function of working too hard today. I don't have time to search for the info. (I was specifically wondering if they use Hadoop.)



They did a StackShare blog post [0] and podcast in February talking about their stack, and it makes for a pretty good read. As for Hadoop specifically, it doesn't show up anywhere on their StackShare page [1].

[0] http://stackshare.io/posts/how-farmlogs-is-building-software...

[1] http://stackshare.io/farmlogs


They use Clojure and, I believe, ClojureScript on the frontend. (I'm not sure about Hadoop; it didn't come up when I spoke with their devs before--but since they're already on the JVM, such tools are a natural fit.)

They're one of the many success stories in the community of startups who have chosen Clojure as their primary weapon. (See also: Climate Corp, now owned by Monsanto.)

Lisp is having its victories here and there, but notably in meaningful and economically real ways.

Edit: They DO NOT use ClojureScript, sadly. Unlike, say, CircleCI or Prismatic, who both have very compelling stories about using Clojure from the frontend to the backend.


Frontend is currently Backbone + React, moving towards more React, written in CoffeeScript.


A bit of a shame it's not ClojureScript, to be honest. What's the rationale for CoffeeScript?


I'm relatively new so I can't speak to the original rationale, but right now it fits our needs. I'd love to do some ClojureScript (or just ES2015 through Babel), but right now we're focusing on building stuff instead of playing with languages :)


> ...but right now we're focusing on building stuff instead of playing with languages

Which is exactly what's so compelling about ClojureScript and, say, Om or any one of the React wrappers out there: you guys already use Clojure, and ClojureScript and its libraries are incredibly pragmatic and designed for "real work". CoffeeScript would seem more like language play than ClojureScript, but of course I'm speaking as an outsider. :)

Also, if you haven't already, take a look at David Nolen's recent talk on Om Next [1]. Personally, if I were using Clojure on the backend, I'd be pushing hard for it on the frontend too.

[1] https://www.youtube.com/watch?v=ByNs9TG30E8


Personally, I'm not a big fan of CoffeeScript and it's something I brought up when I started, but at the end of the day you can write good and bad code in any language. I've watched that talk and found it really compelling, but I'd sooner get rid of Backbone than change languages. I've been following Dan Abramov's stuff pretty closely; the refinements in Redux seem to take the best parts of Om and Flux.


Just want to point out that I personally don't think Babel/ES6/ES2015 is ready for production yet. I just wrote a small library with it and ran into several hiccups with continuous integration, which were fixed by pinning specific node and npm versions (before that, I think CI was just using the default Linux packages). If I ran into problems just scratching the surface, I can only imagine what headaches deeper usage could cause. I realize this is a bit off topic, and I'd be interested to hear from someone who's done it, but I wouldn't want anyone to jump in head first and hit the bottom of the pool.
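
For anyone hitting the same thing, the pinning itself is simple enough. A minimal, illustrative sketch of what I mean by "specifying a specific node and npm version" is an "engines" entry in package.json (the version numbers below are just examples, not a recommendation), plus an .nvmrc for nvm users and engine-strict=true in .npmrc if you want npm to actually refuse to install under a mismatched version rather than just warn:

    {
      "engines": {
        "node": "4.2.1",
        "npm": "2.14.7"
      }
    }

That at least turns the "CI box has some other default node/npm" failure into a loud error instead of a mystery.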


Can you explain why you feel it's a shame? We write a lot of Clojure at FarmLogs but there is certainly a time and a place for it.

We handle iOS push notifications with Ruby, a lot of data and image processing with Python, and even run .NET/Mono in a few areas. Our stack is very diverse, and we do a really great job of considering what the best technology for a particular problem might be.

ClojureScript is certainly interesting, but I wouldn't consider it a shame that our front-end isn't built with it. I love programming, but at the end of the day you gotta remember that code is really just a means to an end.


It seems pretty obvious: You're giving up using one (great) language, which you already know, end-to-end. I can't think of a good reason to do this, especially when an entire UI revolution is being led, right now, in ClojureScript.

What possible reasons could there be for giving that up?

(See my other comment below as well.)


Anecdotal, but whenever I see a lot of diversity in a stack it usually means there are a lot of different preferences on the development team, or that the stack was developed over a long period of time with many members (and their preferences) coming and going. While, yes, any software engineer should be able to pick up and run with any language, it's quite possible their team no longer has the same experience in Clojure, or alternatively has way more experience in their current front-runner (Python, Ruby, whatever). If that's the case, I think it's quite reasonable (at least from a business standpoint) to use alternative tools.


CoffeeScript is a powerful language meant for beginners, but you really need some experience with it to understand its quirks and to know that a lot of it should be avoided.

So basically, it's JavaScript with a compile step that doesn't do static analysis.

But back in 2012 when the first lines of FarmLogs were written, it was pretty cool. And that's a lesson in itself about why it really doesn't matter what stack you use.


They're big fans of Clojure (and are sponsoring clojure/conj).


Many of us will be there this year and will probably be in our borderline-offensive green shirts so please do say hello!


I can't remember seeing them look for anything other than Postgres, which seems less like "big data" than... regular data. Maybe they don't put their heavy-duty stuff in job listings?


Farming is often in the raster world as opposed to the line/row/text/vector world. Some of the things in Postgres might be huge. Farm data sets could easily be adding only a few thousand rows a day, but the objects associated with each row could be several gigabytes. That means the size of the data is bigger than so-called "big data," but the row-analysis tool set looks more like your "regular data." However, there's a lot that goes into the raster analysis that's a whole different beast.
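
To make the shape of that concrete, here's a rough sketch of the pattern in Python with psycopg2. The table, columns, and URI are made up purely for illustration (this is nobody's real schema): a skinny metadata row per raster, with the multi-gigabyte payload living outside the table.

    # Sketch of the pattern described above: small metadata rows in Postgres,
    # each pointing at a large raster stored elsewhere (object store, disk, etc.).
    # Table name, columns, and the URI are hypothetical.
    import psycopg2

    conn = psycopg2.connect("dbname=demo")
    cur = conn.cursor()

    cur.execute("""
        CREATE TABLE IF NOT EXISTS field_rasters (
            id         bigserial PRIMARY KEY,
            field_id   bigint NOT NULL,
            captured   date   NOT NULL,
            raster_uri text   NOT NULL  -- pointer to the imagery, not the pixels
        )
    """)

    # "A few thousand rows a day" of this is trivial for Postgres, even though
    # each raster_uri may reference gigabytes of imagery.
    cur.execute(
        "INSERT INTO field_rasters (field_id, captured, raster_uri) VALUES (%s, %s, %s)",
        (42, "2015-10-12", "s3://example-bucket/fields/42/2015-10-12.tif"),
    )
    conn.commit()
    conn.close()

The relational side stays small and boring; the heavy lifting is whatever reads and crunches the things those URIs point at.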


Huh. Thanks for taking the time to outline this; I don't know why it never occurred to me. I have to admit, I'm unfamiliar with "rasters" in the way you seem to be referencing them. It sounds, though, like the relational bits of the DB are really being used more as a file system than a database, if there's even really a distinction in the first place. If "a couple thousand rows" are basically being used as a metadata store for the rasters, is that an unusual use of the database, or is everybody doing this and I just never had enough data to care?


You'd be surprised at what you can do with PostgreSQL. There are also some really exciting new ways to handle data pipelines using Docker and tiny applications (see Pachyderm) as opposed to classic approaches like HDFS and Hadoop.

Not everything needs to be in an enterprise-grade multi-node C* (Cassandra) cluster to be big data!


Thanks.

I did see a job listing that noted "familiarity with stuff like Hadoop" (paraphrasing). That phrasing seemed to cast some doubt on whether they actually use Hadoop, but it neither confirmed nor denied it.



