Hacker News

Yeah, DO Spaces is all-around awful. Deleting is extremely slow as well. We had to write special code because DO cannot delete 1,000 objects at a time (the API call takes around two minutes to succeed, if it succeeds at all), to the point that we had to resort to deleting entire buckets. The UI also keeps crashing when there are many objects :(
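For context, the S3-compatible `DeleteObjects` call caps out at 1,000 keys per request, so mass deletion has to be batched client-side. A minimal sketch with boto3, assuming a Spaces-style endpoint; the bucket and endpoint names are placeholders, not from the thread:

```python
def chunks(keys, size=1000):
    """Yield batches of at most `size` keys; DeleteObjects accepts 1,000 max per call."""
    for i in range(0, len(keys), size):
        yield keys[i:i + size]

def delete_all(bucket_name, endpoint_url):
    # boto3 imported here so the chunking helper stays importable on its own
    import boto3
    s3 = boto3.resource("s3", endpoint_url=endpoint_url)
    bucket = s3.Bucket(bucket_name)
    keys = [obj.key for obj in bucket.objects.all()]
    for batch in chunks(keys):
        # Quiet mode suppresses the per-key success entries in the response
        bucket.delete_objects(
            Delete={"Objects": [{"Key": k} for k in batch], "Quiet": True}
        )

# Example (placeholders): delete_all("my-space", "https://nyc3.digitaloceanspaces.com")
```

Even batched like this, each `delete_objects` call is a single round trip that the provider may process slowly, which matches the multi-minute latency described above.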



I’ll assume Spaces works like S3, where updates and deletes are eventually consistent rather than immediately consistent:

https://docs.aws.amazon.com/AmazonS3/latest/dev/Introduction...


I recently had to delete a multi-TB S3 bucket and learned that S3 isn't great at deleting tons of files either. The AWS Console just hangs forever. I let it go for hours before finding another solution.


It sounds like you’ve already resolved this, but for the benefit of anyone else who stumbles upon this: my solution for deleting a large bucket is to set a lifecycle rule with a short expiration, after which the objects are deleted automatically.

Set that rule, and come back to a beautifully empty bucket 24 hours later, after Amazon’s gnomes have taken care of the issue for you.
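The lifecycle approach can be sketched with boto3's `put_bucket_lifecycle_configuration`; the rule below expires every object after one day, and S3 then deletes them in the background. The bucket and rule names are placeholders, not from the thread:

```python
def expire_everything_rule(days=1):
    """Lifecycle configuration that expires all objects after `days` days."""
    return {
        "Rules": [{
            "ID": "empty-the-bucket",       # arbitrary rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},       # empty prefix matches every key
            "Expiration": {"Days": days},
        }]
    }

def schedule_bucket_emptying(bucket_name):
    # boto3 imported here so the config helper stays importable without AWS deps
    import boto3
    boto3.client("s3").put_bucket_lifecycle_configuration(
        Bucket=bucket_name,
        LifecycleConfiguration=expire_everything_rule(),
    )
```

Once the objects are gone, the now-empty bucket can be deleted normally; remember to remove the lifecycle rule if you intend to keep the bucket.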


That's what I ended up with as well.


Yeah, S3 is not flawless, but DO Spaces had problems with just 5,000 objects.



