
To make this manageable, we maintain a custom internal Docker image that everyone uses for their JupyterLab needs; that way, we only have to get everything working once, and everyone else can just docker pull :latest and be good. It's become a rather large image, since it needs to include everything anyone uses, but for a small-ish team this has worked well for about two years now.
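
For the curious, the Dockerfile is basically a Jupyter Docker Stacks base with everything piled on top. A rough sketch (base tag, package list, script name, and repo URL are illustrative placeholders, not our actual setup):

    FROM jupyter/base-notebook:latest

    USER root
    # OS-level dependencies; the real list is much longer
    RUN apt-get update \
        && apt-get install -y --no-install-recommends git build-essential \
        && rm -rf /var/lib/apt/lists/*
    # Startup script: clones the notebook repo, then launches JupyterLab
    COPY clone-and-start.sh /usr/local/bin/
    RUN chmod +x /usr/local/bin/clone-and-start.sh

    USER ${NB_UID}
    # One pinned Python stack so every laptop and VM matches
    RUN pip install --no-cache-dir pandas scikit-learn google-cloud-storage

    CMD ["clone-and-start.sh"]

and clone-and-start.sh is along the lines of:

    #!/bin/bash
    set -euo pipefail
    # Fresh checkout of the team notebook repo on container start
    # (repo URL is a placeholder)
    [ -d "${HOME}/notebooks" ] \
        || git clone https://github.com/our-org/notebooks.git "${HOME}/notebooks"
    # Hand off to the stock docker-stacks launcher
    exec start-notebook.sh "$@"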

Another upside: there is a gcloud one-liner to start a GCP VM running a given Docker image, and the image is designed to work both locally and in a VM. Since the notebooks live in a git repository that gets checked out automatically on container creation (the clone-and-start.sh above), switching from a laptop to a 16-core VM to churn through a large dataset from a bucket is pretty seamless.
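
The one-liner is gcloud's create-with-container, which boots a Container-Optimized OS VM that pulls and runs the container automatically on startup; instance name, zone, and image path below are placeholders:

    gcloud compute instances create-with-container nb-crunch \
        --zone=us-central1-a \
        --machine-type=n2-standard-16 \
        --container-image=gcr.io/our-project/jupyterlab:latest

Tearing it down afterwards is just gcloud compute instances delete.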
