Not really. There's much more to it than just compiling it: there's an init script, a logrotate configuration, etc. The OP's script is great, I just wish it had the ability to replace the bundled version of Nginx as well.
Yes, believe it or not, I am. And did you look at the script above at all? It uses the standard package and just custom-compiles it with other modules, and everything works perfectly. To replicate this with FPM, I'd have to put in extra effort and probably keep making changes for each new version of Ubuntu, re-test, etc. Why do that when I just want custom modules baked into the package?!
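For anyone curious what the distro-package approach looks like, here's a rough sketch (paths and the exact package name are assumptions, and the OP's script automates most of this): rebuild Ubuntu's own nginx source package with an extra module, so the init script, logrotate config, and everything else still come from the distro.

```shell
#!/usr/bin/env bash
# Sketch: rebuild the Ubuntu nginx package with a custom module baked in.
# Assumes deb-src lines are enabled in /etc/apt/sources.list.
set -euo pipefail

sudo apt-get build-dep nginx     # install the package's build dependencies
apt-get source nginx             # fetch and unpack the distro source package
cd nginx-*/

# Edit debian/rules and append your module to the configure flags, e.g.:
#   --add-module=/path/to/your-module
# then rebuild the binary packages (unsigned, binary-only):
dpkg-buildpackage -b -us -uc

# Install the resulting .deb; init script, logrotate config, etc. are included.
sudo dpkg -i ../nginx-full_*.deb
```

The upside over a from-scratch compile is that upgrades, file layout, and service management all keep matching what the rest of the system expects.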
What's your project? I've been looking to buy something about that size, and I've found Flippa to be rather disappointing. If you want to chat, you can shoot me an email (address is in my profile).
I might be overlooking something obvious but I don't see an address in your profile (and I'd prefer not to guess). You can reach me at gmail (same user).
I've been in the market for something like this lately, shoot me the URL (dan [at] danwalker.com) and a price - I've already made an offer on another SaaS but if it falls through...
Gracenote: Emeryville, CA (SF Bay Area) - Full time, No remote - on-site only, relocation possible, no visa sponsorship possible.
Interested in working on crawlers and distributed systems? Interested in functional languages like Clojure and Scala? Gracenote is hiring a senior software engineer.
Gracenote is the top provider of entertainment information, creating industry-leading databases of TV, movie, and music metadata for entertainment guides and applications. Our technology serves billions of requests daily to hundreds of millions of devices around the world.
You’ll be working on a set of crawlers responsible for discovering, acquiring, and storing data, and on the applications that make use of that data.
If interested email me at this username at gracenote. No 3rd parties, no recruiters please.
Responsibilities:
- Write well-designed, well-tested code that performs well
- Design, implement, and own new systems – from design to operations
- Occasional on-call operations / support
- Reduce technical debt in existing systems (refactoring, testing, etc.)
- Proactively look for ways to make our software more scalable, reliable and fun
- Help change the way we think about solving problems
Requirements:
- Strong background in Java, Ruby, Python or another OO language (our current stack)
- Solid understanding of the full web technology stack
- Familiarity with a variety of (relational and non-relational) databases/data stores
- Experience with AWS (or another infrastructure platform)
Pluses:
- Experience with web crawling, scraping
- Experience with Clojure, Scala, Hive, or Go
- Experience with functional programming, functional architectures
- Experience with data processing architectures with Kafka, Storm, or Spark.
1. Use Godeps to vendor your deps
2. Use Logrus for logging
3. Figure out a deployment script early on. I have a Rake task that uses chef-api to look up the current production nodes, cross-compiles locally, scp's the resulting binary out, and restarts the process
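A minimal sketch of step 3 (host names, binary name, and service manager are all assumptions; in practice the host list would come from chef-api/knife rather than being hard-coded):

```shell
#!/usr/bin/env bash
# Sketch: cross-compile a Go binary locally, push it to each production
# node, and restart the service.
set -euo pipefail

APP=mycrawler                                 # assumed binary/service name
HOSTS="prod1.example.com prod2.example.com"   # normally looked up via chef-api

# Go cross-compiles from any workstation; production is Linux/amd64 here.
GOOS=linux GOARCH=amd64 go build -o "$APP" .

for host in $HOSTS; do
  scp "$APP" "deploy@$host:/opt/$APP/$APP.new"
  # Move into place atomically, then restart, so the old binary keeps
  # serving right up until the swap.
  ssh "deploy@$host" "mv /opt/$APP/$APP.new /opt/$APP/$APP && sudo systemctl restart $APP"
done
```

The nice part of this workflow is that there's nothing to build on the servers themselves: no toolchain, no git checkout, just a single static binary.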
Gracenote: Emeryville, CA (SF Bay Area) - Full time, No remote, relocation possible, NO visa sponsorship possible.
Gracenote is the top provider of entertainment information, creating industry-leading databases of TV, movie, and music metadata for entertainment guides and applications. Our technology serves billions of requests daily to hundreds of millions of devices around the world.
Interested in working on crawlers and distributed systems? Interested in functional languages like Clojure and Scala? Gracenote is hiring for several positions (junior and senior).
You’ll be working on a set of crawlers responsible for discovering, acquiring, and storing data, and on the applications that make use of that data.
If interested email me at this username at company. No 3rd parties, no recruiters please.
Responsibilities:
- Write well-designed, well-tested code that performs well
- Design, implement, and own new systems – from design to operations
- Occasional on-call operations / support
- Reduce technical debt in existing systems (refactoring, testing, etc.)
- Proactively look for ways to make our software more scalable, reliable and fun
- Help change the way we think about solving problems
Requirements:
- Strong background in Java, Ruby, Python or another OO language
- Solid understanding of the full web technology stack
- Familiarity with a variety of (relational and non-relational) databases/data stores
- Experience with AWS (or another infrastructure platform)
Pluses:
- Experience with web crawling, scraping
- Experience with Clojure, Scala, Hive, or Go
- Experience with functional programming, functional architectures
- Experience with data processing architectures with Kafka, Storm, or Spark.
Gracenote: Emeryville, CA (SF Bay Area) - Full time, No remote, relocation possible, visa sponsorship possible.
Interested in working on crawlers and distributed systems? Interested in functional languages like Clojure and Scala? Gracenote is hiring for several positions (junior and senior).
Gracenote is the top provider of entertainment information, creating industry-leading databases of TV, movie, and music metadata for entertainment guides and applications. Our technology serves billions of requests daily to hundreds of millions of devices around the world.
You’ll be working on a set of crawlers responsible for discovering, acquiring, and storing data, and on the applications that make use of that data.
If interested email me at this username at gmail. No 3rd parties, no recruiters please.
Responsibilities:
- Write well-designed, well-tested code that performs well
- Design, implement, and own new systems – from design to operations
- Occasional on-call operations / support
- Reduce technical debt in existing systems (refactoring, testing, etc.)
- Proactively look for ways to make our software more scalable, reliable and fun
- Help change the way we think about solving problems
Requirements:
- Strong background in Java, Ruby, Python or another OO language
- Solid understanding of the full web technology stack
- Familiarity with a variety of (relational and non-relational) databases/data stores
- Experience with AWS (or another infrastructure platform)
Pluses:
- Experience with web crawling, scraping
- Experience with Clojure, Scala, Hive, or Go
- Experience with functional programming, functional architectures
- Experience with data processing architectures with Kafka, Storm, or Spark.
Disclosure: I'm the founder of snitch.io - a fully automated SSL monitoring service that launched last month.
This is interesting. I suspect it will appeal to a certain type of person / use case - similar to Logstash vs. Papertrail/Loggly. (I use Papertrail and love it - check it out.)
However, I'm not really worried about it since many people want automated monitoring, auditing, and alerting that "just works" without having to roll their own client - and then monitor that client.
Doing this at scale is hard. Doing it with frequency/interval guarantees is even harder. I've put considerable effort into a scalable architecture and self-monitoring.
I wish Ivan the best of luck. On a related note, if you want to learn about SSL/TLS, I highly recommend Ivan's book "Bulletproof SSL and TLS". It is great.
Snitch is already doing a few things SSL Labs isn't (supporting custom ports and IMAPS), and over time the differences between our services will become more and more apparent. I'm very excited about my product roadmap :-)
This is still a pain point for many people and there are many unsolved problems that I'm having a lot of fun working on.
Happy to answer any questions - shoot me an email. This username at currylabs.com