I doubt they'd add these protections for visiting a handful of links at human speed. Correct me if I'm wrong, but crawlers often send hundreds of requests per minute, testing random outlinks and routes they find on the site.
My concern as a webmaster would be serving useless content to users, and as a user, not getting the information from the site.
I probably wouldn't use this feature, since I often deploy static websites that use little to no resources, and the potential harm outweighs the benefit.
I think mister tracerbulletx has drunk the stupid juice. It's not a problem with developers (or the products/apps the developers make); it's with Google, not allowing downloads even when you pay a subscription.
everything's so partisan and not nuanced today - tons of people go with the idea "the [political party] are 10,000% correct, and nothing you say can make me think otherwise!!"
i don't like any one party or group unconditionally. while I have a leaning, holding nuanced opinions on individual issues can be a good thing :)
Reading the commentary, this guy seems unhinged. He thinks he owns literal hex codes.
He sucks at tech and has driven away everyone good at it. I don't use his software, but I hope he gets out of this episode soon (and learns he didn't invent material!)
Someone else described him as a lunatic. But this is a security issue, and you shouldn't assume that someone who is successfully putting malicious code into developers' IDEs around the world is unhinged or a lunatic; more likely they are cunning and deceptive (or a front for an intelligence agency). It's not paranoid to have such suspicions about someone who is getting malicious code into developers' tools.
Also, don't they have an email masking option? I remember seeing some generated @github.com email in commit logs.