Yup, the caches for each architecture are available in parallel, and multiple builds for the same architecture can run simultaneously on the same build machine for a single project. So we don't limit concurrency.
I believe Cloud Build has no persistent caching, so you're forced to use remote cache saving and loading, which incurs network latency that can slow the build to some extent. Cloud Build with Kaniko also expires the layer cache after 6 hours by default.
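If you do go the Cloud Build + Kaniko route, my understanding is the 6-hour default TTL can be raised via a `gcloud` config property (a sketch; check the current Cloud Build docs for the exact property name and limits):

```shell
# Enable Kaniko layer caching for Cloud Build
gcloud config set builds/use_kaniko True

# Extend the layer cache expiration from the 6-hour default to, e.g., 48 hours
gcloud config set builds/kaniko_cache_ttl 48
```

That helps with expiry, but the cache is still loaded/saved over the network on every build.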
GitHub Actions is similar, except that you can store Docker layer cache in GitHub's Cache API via the `cache-to=gha` and `cache-from=gha` directives. However, this has limitations, like a total cache size of 10GB per repository, and you still pay network latency loading/saving that cache.
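For reference, wiring up the `gha` cache backend in a workflow looks roughly like this (a sketch using `docker/build-push-action`; the image name is a placeholder):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/setup-buildx-action@v3
      - uses: docker/build-push-action@v5
        with:
          tags: example/app:latest   # placeholder image name
          # Pull layer cache from / push layer cache to GitHub's Cache API
          cache-from: type=gha
          cache-to: type=gha,mode=max
```

`mode=max` caches all intermediate layers, not just the final stage, which makes the 10GB-per-repo cap easier to hit on large multi-stage builds.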
With Depot, the cache is kept on a persistent disk, so there's no need to save/load it or incur network latency doing so. It's there, ready to be used by any builds that come in for a given project.