
Hi, are those caches available in parallel?

CircleCI's remote Docker has a restriction that only one job can access the same remote Docker engine at a time. Say job A builds an image, then jobs B and C try to use the same remote Docker, but only one of them has the cache.

Google Cloud Build has no cache at all.

I don't know about GitHub Actions.



Yup, the caches for each architecture are available in parallel, and multiple builds for a single architecture can simultaneously use the same build machine for a single project. So we don't limit concurrency.

I believe Cloud Build has no persistent caching, so you are forced to use remote cache saving and loading, which can incur network latency that slows the build to some extent. Cloud Build with Kaniko also expires the layer cache after 6 hours by default.
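For reference, that Kaniko TTL can be raised via the executor's `--cache-ttl` flag. A minimal sketch of a `cloudbuild.yaml` step, assuming the standard `gcr.io/kaniko-project/executor` image (the image name `my-image` is a placeholder):

```yaml
steps:
  - name: 'gcr.io/kaniko-project/executor:latest'
    args:
      - --destination=gcr.io/$PROJECT_ID/my-image:$COMMIT_SHA
      - --cache=true          # enable the remote layer cache
      - --cache-ttl=48h       # extend past the 6h default
```

The cache still lives in a remote registry, so every build pays the network cost of pulling and pushing layers.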

GitHub Actions is similar, except that it has the ability to store Docker cache using GitHub's Cache API via the `cache-to=gha` and `cache-from=gha` directives. However, this has limitations, like only being able to store a total cache size of 10GB per repository. You also incur network latency loading/saving that cache.
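Those directives map onto the `gha` cache exporter in Buildx. A sketch of a workflow step using the official `docker/build-push-action` (the tag `user/app:latest` is a placeholder):

```yaml
- uses: docker/setup-buildx-action@v3
- uses: docker/build-push-action@v5
  with:
    push: true
    tags: user/app:latest
    cache-from: type=gha          # load layer cache from GitHub's Cache API
    cache-to: type=gha,mode=max   # save all layers, not just the final ones
```

`mode=max` caches intermediate layers too, which eats into the 10GB per-repository limit faster.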

With Depot, the cache is kept on a persistent disk, so there's no need to save/load it or incur network latency doing so. It's there, ready to be used by any builds that come in for the given project.
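From the CLI side, `depot build` is intended as a drop-in for `docker buildx build`, so a multi-platform build against the persistent cache looks like this (the registry and tag are placeholders, and the project is assumed to be configured via `depot.json` or `--project`):

```shell
depot build \
  --platform linux/amd64,linux/arm64 \
  -t registry.example.com/app:latest \
  --push .
```

Each platform builds on its own native machine, and both hit that machine's persistent layer cache directly.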


Sounds great! Thanks.



