I'm kind of shocked at some of the responses here... everything from outrage, to expressing dismay at how many things could break, to how hard this is to fix, to accusing Amazon of all kinds of nefarious things.
How hard is it for 99% of the developers and technical leaders here to search your codebase for s3.amazonaws.com and update your links in the next 18 months?
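For most projects that step really is one command. A minimal sketch (the demo directory and file are made up for illustration; the pattern also tries to catch regional path-style endpoints like s3.us-west-2.amazonaws.com):

```shell
# Set up a throwaway demo file containing a path-style S3 URL.
mkdir -p /tmp/s3demo
printf 'URL = "https://s3.amazonaws.com/my-bucket/key.txt"\n' > /tmp/s3demo/settings.py

# List files that still reference a path-style endpoint
# (host starting with "s3." or "s3-", followed by a path).
grep -rl 's3[.-][a-z0-9.-]*amazonaws\.com/' /tmp/s3demo
```

Virtual-hosted-style URLs (my-bucket.s3.amazonaws.com/key.txt) won't match, so the hits are exactly the places that need updating.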
> How hard is it for 99% of the developers and technical leaders here to search your codebase for s3.amazonaws.com and update your links in the next 18 months?
I've got a number of hobby projects, some hosted on AWS, that I built ages ago. I have no idea how this change will affect those projects because ... frankly, I just don't remember the codebases. I built them on a weekend, set them up, and now just use them.
It isn't the end of the world. But I'm not really excited about having to dig up old code, re-grok it, and fix whatever a change like this might break.
I suppose that's just the nature of a developer's life. But I think many of us long for a "write once, run forever" world. Horror stories about legacy software aside, it was nice to be able to write software for Windows and then have it work a decade later.
It's a reasonable timeframe, but not all codebases are actively maintained. It's also conceivable that some hidden custom library crafts S3 URIs at runtime, making it near impossible to simply grep the codebase for a certain URL format. So people may have to scour codebases they don't even maintain, looking for code that builds S3 URIs the old way, then fork that project, fix the functionality, publish it, and use the fork. Then they may need to fork every other project that depends on the original, and do the same thing ad infinitum.

If this is a private company, all of that has to happen within some corp-wide, globally available private repo, which means either (1) making that repo public on the internet, or (2) adding it to every security group they have that pulls code. It may even require adding Direct Connect or PrivateLink. So that means a long research project, followed by a project of fixing, testing, and releasing new code, followed by a project to get network access to the custom repos and point the software at them.
So, surprisingly hard, but doable. And from the customer's perspective, a huge pain in the ass, just to save Amazon some pennies on bandwidth.
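For the URI-crafting case, the actual rewrite is mechanical: path-style URLs (s3.amazonaws.com/bucket/key) become virtual-hosted style (bucket.s3.amazonaws.com/key). A minimal sketch of that transformation, written for illustration (it assumes bucket names are DNS-safe and doesn't handle buckets with dots, which have their own TLS wildcard issues):

```python
from urllib.parse import urlparse

def to_virtual_hosted(url: str) -> str:
    """Rewrite a path-style S3 URL to virtual-hosted style.

    e.g. https://s3.amazonaws.com/my-bucket/key
      -> https://my-bucket.s3.amazonaws.com/key
    Non-S3 URLs (and already virtual-hosted ones) pass through unchanged.
    """
    p = urlparse(url)
    host = p.netloc
    # Match s3.amazonaws.com and regional endpoints like
    # s3.us-west-2.amazonaws.com; skip anything else.
    if not (host.startswith("s3") and host.endswith(".amazonaws.com")):
        return url
    bucket, _, key = p.path.lstrip("/").partition("/")
    if not bucket:
        return url  # no bucket in the path; nothing to rewrite
    return f"{p.scheme}://{bucket}.{host}/{key}"
```

Something like this is easy in a codebase you own; the grandparent's point stands that finding every place such a URL gets built, across dependencies you don't control, is the hard part.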