I still don't get it.



Same here. Why can't people talk with concrete examples instead of this high-level vague stuff? This thing is for devs - explain it like that.

The analogies are pointless; they don't convey anything to me.


All I can do is point to my real, practical experience in using Docker.

http://tryrethink.info - over 3,000 happy customers (unique, sandboxed instances) served in 24 hours. With about a day of effort total.

http://nick.stinemat.es - my blog, which is pretty Docker-focused because that's what I've been working on since I decided to start improving my writing.

This covers basic continuous integration and deployment of an application to pretty non-trivial volume.


Would you say it's one of those things that gives you a warm fuzzy feeling after using it? I've been wanting to give it a try with all the hype surrounding it, but I don't see what it's going to do that provisioning a $10 Digital Ocean server wouldn't (albeit with a little bit of hassle).


I could literally type for days about the benefits of Docker over a $10 Digital Ocean server, but no one would read it.

What I will say is this. If you're writing a trivial application that you and only you will ever need to work with, in an environment completely controlled by you, and you have a recipe that works - you're right, Docker probably isn't for you.

If you, like me, work with a huge product suite with many build-time and runtime dependencies (services and applications) and many different runtime configurations, where even automated installation can take 15-20 minutes because of the sheer amount of work going on, there's a massive amount of efficiency to be gained in the dev/test/release/packaging process - let alone the efficiency the ops team gains when working in foreign environments.
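
For a rough sketch of what that turns into (the image name here is made up, and the assumption is that the 15-20 minutes of setup work has been captured in a Dockerfile):

    $ # bake every build-time and runtime dependency into an image, once
    $ docker build -t mycompany/product-suite .

    $ # any dev, test, or ops box with Docker can now start it in seconds
    $ docker run -d mycompany/product-suite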

There are certainly lots of other use cases (PaaS/SaaS are easy ones), and those are valuable business-building tools, but they're less interesting to me personally.


Docker and Digital Ocean servers are different things that can be used together. Docker allows you to distribute applications that come bundled with their own OS-level packages/configuration.

Imagine you wanted to run Wordpress on your DO server. Instead of configuring a LAMP stack and setting up Wordpress on it, you could download and run a Wordpress docker container that came bundled with its own LAMP stack. It would be as simple for you as calling "docker run wordpress". Most importantly, its configuration would be completely isolated from that of the host machine.
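
In practice it's a couple of commands rather than one, since Wordpress also wants a database. Roughly like this (the container names and port mapping are arbitrary, and the exact options of the public wordpress/mysql images may differ):

    $ docker run -d --name some-mysql -e MYSQL_ROOT_PASSWORD=secret mysql
    $ docker run -d --name some-wordpress --link some-mysql:mysql -p 8080:80 wordpress

    $ # browse to http://your-do-server:8080 and finish the usual Wordpress install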


But that's not quite true, is it? You would still have to open up ports on your host, configure security, and somehow do the meta-config for the Dockerfile to make sure that if the host goes down, `docker run wordpress` is called again on startup, right?

I really want to get excited about docker, but I guess I just don't understand it. Any links to more specific use cases?


Imagine if these were real commands:

    $ docker download-install-and-run-my-customized-build-of-postgres

    $ docker start-up-a-wordpress-instance-from-some-online-build-template
    $ # edit some config files, add some plugins
    $ docker pack-that-wordpress-instance-into-a-container

    $ ansible all -a docker download-and-run-your-wordpress-instance-on-all-your-app-nodes
That's basically what docker does; the syntax is only slightly more mundane, and is actually less verbose than that.
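
For the curious, the real workflow would look roughly like this (the repository names and container id are placeholders):

    $ # grab and run a published postgres image
    $ docker run -d postgres

    $ # start a wordpress container and shell in to edit configs, add plugins
    $ docker run -i -t wordpress /bin/bash

    $ # freeze that modified container into a new image and publish it
    $ docker commit <container-id> myname/my-wordpress
    $ docker push myname/my-wordpress

    $ # on each app node: pull and run it
    $ docker run -d myname/my-wordpress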


http://en.wikipedia.org/wiki/Dependency_hell

Stuff like apt-get works great until it doesn't. You've run into this no doubt. Jails/LXC abstract dependencies away in an attempt to avoid this class of problems.

In other words: this helps to avoid mutating your system state with every command you run from the shell. :)
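
A tiny illustration of that isolation (the version tags are just examples): two conflicting runtimes coexist because each lives in its own image, not in the host's package database.

    $ docker run python:2.7 python --version
    $ docker run python:3.4 python --version

    $ # both ran side by side; neither touched the host's python or libraries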



