This vulnerability was genuinely embarrassing, and I'm sorry we let it happen. After thorough internal and third-party audits, we've fundamentally restructured our security practices to ensure this scenario can't recur. Full details are covered in the linked write-up. Special thanks to Eva for responsibly reporting this.
> We resolved the vulnerability within 26 hours of its initial report, and additional security audits were completed by February 2025.
After reading the vulnerability report, I'm impressed by how quickly you guys jumped on the fix, so kudos. Did the security audit lead to any significant remediation work? If you weren't following PoLP, I wonder what else may have been overlooked.
That was solid. Nice way to handle a direct personal judgement!
Not your first rodeo.
Another way is to avoid absolutes and ultimatums as aggressively as one should avoid personal judgements.
Better phrased as: "we did our best to prevent this scenario from happening again."
Fact is, it could still happen! Nobody likes that reality, and when we think about all this stuff, networked computing is in a sad state of affairs.
Best to just be 100 percent real about it all, if you ask me.
At the very least people won't nail you on little things, which leaves you something you may trade on when a big thing happens.
And yeah, this is unsolicited and worth exactly what you paid. Was just sharing where I ended up on these things in case it helps.
If you think someone is obviously wrong, it might be worth pausing for a second and considering whether you might just be referring to different things. Here, you seem to understand “this” to mean “a serious bug.” Since it’s obvious that a serious bug could happen, it seems likely that the author meant “this” to mean “the kind of bug that led to the breach we’re presently discussing.”
I do not assume anyone is obviously wrong and prefer to ask questions. Most bugs exist in classes, and variants are something you typically consider when a bug results in a production incident.
I'm not sure I read anything that makes me confident this class of bugs could never recur. I could be reasonably confident this _exact_ bug in this _exact_ scenario may not happen again, but that only makes me more concerned about variants that may have equal or more serious implications.
So I'm wondering: which claim did it for you? The pen test was the only concrete action I really saw.
This is the wrong response, because it means the learning would be lost. The security community didn't want that to happen when one of the CAs had a vulnerability, and we don't want it to happen to other companies. We want companies to succeed and get better; shame doesn't help with that. Learning the right lessons does, and resigning means you're learning the wrong ones.
> If you get a slap on the wrist, do you learn? No, you play it down.
Except Dave didn't play it down. He's literally taking responsibility for a situation that could have resulted in significantly worse consequences.
Instead of saying, "nothing bad happened, let's move on," he, and by extension his company, have worked to remedy the issue, do a write-up on it, disclose the issue and its impact to users, and publicly apologize and hold themselves accountable. That right there is textbook engineering ethics 101.
> "we've fundamentally restructured our security practices to ensure this scenario can't recur."
"Yeah it was a problem but it's fixed now, won't happen again"
Sure buddy.
It's not something you fix. When stuff like this happens, it's foundational; you can't fix it. It's a house of cards: you gotta bring it down and build it again with the lessons learned.
It's like a skyscraper built with hay that had a close call with some strong northern winds, and they come out and say, "we have fortified the northern wall, all is good now." You gotta take it down and build it with brick, my man.
I'm done warning people about security, we'll fight it out in the industry, I hope we bankrupt you.
> It's not something you fix. When stuff like this happens, it's foundational; you can't fix it. It's a house of cards: you gotta bring it down and build it again with the lessons learned.
That's the last thing you should ever do with a large-scale software system. Restarting from scratch because "oh, we'll do it better this time" is the kind of thing that bankrupts companies. Plenty of seasoned engineers will tell you this.
I suggest reading one or two of Sidney Dekker’s books, which are a pretty comprehensive takedown of this idea. If an organization punishes mistakes, mistakes get hidden and covered up, and they become no less frequent.
Under what theory of psychology are you operating? This is along the same lines as the theory that punishment is an effective deterrent of crime, which we know isn’t true from experience.
I think you’re misunderstanding my point. The reality is more complicated than that.
There are some people who will be discouraged from committing a crime by the threat of punishment. But many will not. Many people behave well because they’re just moral people, and others won’t because they’re just selfish and antisocial. Still others commit crimes out of desperation despite the risks. If the threat of imprisonment were effective, there would be no crime, because we already have prisons and penalties. But since we do have crime, it follows that the threat isn’t effective.
The other point here is that threat of punishment is not particularly effective as a management strategy in the private sector. It doesn’t incentivize behavior in the manner you might believe. Mostly it makes your reports dislike you and it makes them less productive. It’s a thing you learn pretty quickly as a manager.
There’s a model of a person being a rational thinker, but in reality, people aren’t always rational. (Hell, adolescents are biologically programmed not to be rational and to stress test the limits of nature and society.) You find success in making less-than-rational people work together in harmony and achieve positive outcomes.
When I was younger I used to be much more easily influenced, now you just can't change my mind, I made it up for good thank you.
And it pays off in cases like this, I'll be talking with someone about a topic like the seriousness of a vulnerability, they disagree, that's fine no need to convince me, you won't. And then it turns out they're left-leaning abolitionists who are against the idea of jails.
Many such cases, on the other hand I'll be disagreeing with someone on business strategy, and two lines later they reveal that they think taxation is theft. I can rest easy and ignore them.
> now you just can't change my mind, I made it up for good thank you
Respectfully, that’s not a very “hacker” way of seeing the world. Hackers learn from their mistakes and adapt. (Just like this software company is doing.)
> While I think that resigning is stupid here, asserting that "punishment doesn't deter crime" is just absurd. It does!
Punishment does not deter crime. The threat of punishment does to a degree.
IOW, most people will be unaware of a person being sent to prison for years until and unless they have committed a similar offense. But everyone is aware of repercussions possible should they violate known criminal laws.
Honestly I don't get why people are hating this response so much.
Life is complex and vulnerabilities happen. They quickly contacted the reporter (instead of letting the report email sit in spam) and deployed a fix.
> we've fundamentally restructured our security practices to ensure this scenario can't recur
People in this thread seem furious about this one and I don't really know why. Other than needing to unpack some "enterprise" language, I view this as "we fixed some shit and got tests to notify us if it happens again".
To everyone saying "how can you be sure that it will NEVER happen": maybe because they removed all full-privileged admin tokens and are only using scoped tokens? There's a small misdirection here; they aren't saying "vulnerabilities won't happen," but that "exactly this one" won't.
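To make the scoped-token point concrete, here's a minimal, hypothetical sketch (the token names and scope strings are invented for illustration, not taken from the disclosure): instead of one admin token that can do anything, each token carries an explicit allowlist of permissions, and every action is checked against it.

```python
# Hypothetical sketch of scoped vs. full-privilege tokens (PoLP).
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Token:
    name: str
    scopes: frozenset = field(default_factory=frozenset)

def authorize(token: Token, action: str) -> bool:
    """Allow an action only if the token was explicitly granted that scope."""
    return action in token.scopes

# A build container gets only what the build needs, nothing more.
build_token = Token("ci-build", frozenset({"source:read", "artifacts:write"}))

assert authorize(build_token, "artifacts:write")
assert not authorize(build_token, "secrets:read")  # no secret access granted
```

With tokens like this, a compromised build container can only do what its scopes allow, so "exactly this one won't recur" becomes a claim about removed capabilities rather than a promise about future bugs.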
So Dave, good job to your team for handling the issue decently. Quick patches and public disclosure are also more than welcome. One lesson I'd take from this is to use less "enterprise" language in security topics (or people will eat you in the comments).
Point taken on enterprise language. I think we did a decent job of keeping it readable in our disclosure write-up but you’re 100% right, my comment above could have been written much more plainly.
With privileged access, attackers can tamper with the evidence to achieve repudiation, so while I'd accept "nothing in the logs," not everyone will. Tampering and repudiation are two of the attack categories in the STRIDE threat-modeling approach.
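One common countermeasure to log tampering is a hash-chained (tamper-evident) log. This is a generic sketch of the idea, not anything described in the disclosure: each entry commits to the previous entry's hash, so an attacker who edits or deletes a line breaks verification of everything after it.

```python
# Sketch of a tamper-evident (hash-chained) audit log.
import hashlib

GENESIS = "0" * 64  # sentinel "previous hash" for the first entry

def append(log, message):
    """Append (digest, message), where digest covers the previous digest too."""
    prev = log[-1][0] if log else GENESIS
    digest = hashlib.sha256((prev + message).encode()).hexdigest()
    log.append((digest, message))

def verify(log):
    """Recompute the chain; any edited, removed, or reordered entry fails."""
    prev = GENESIS
    for digest, message in log:
        if hashlib.sha256((prev + message).encode()).hexdigest() != digest:
            return False
        prev = digest
    return True

log = []
append(log, "admin login from 10.0.0.5")
append(log, "token issued: ci-build")
assert verify(log)

log[0] = (log[0][0], "nothing to see here")  # attacker rewrites an entry
assert not verify(log)
```

This doesn't stop an attacker with write access from appending new entries, but it does make silent rewriting of history detectable, which is what the repudiation concern is about.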
Following that logic it would be literally impossible to trust any part of their infra. They had a bad build container, the rest of their stuff was solid.
Annual pen tests are great, but what are you doing to actually improve the engineering design process that failed to identify this gap? How can you possibly claim to be confident this won't happen again unless you myopically focus on this single bug, which is itself a symptom of a larger design problem?
These kinds of "never happen again" statements never age well, and make no sense to even put forward.
A more pragmatic response might look like: something similar can and probably will happen again, just like any other bug. Here are the engineering standards we use ..., here is how they compare to those of peers our size ..., here are our goals ..., here is how we know when to improve ...
Sounds like it was handled better than the author's last article, where the Arc browser company initially didn't offer any bounty for a similar RCE, then awarded a paltry $2k after getting roasted, and finally bumped it up to $20k after getting roasted even more.
Well, for one, it was a gift, so there's no valid contract, right? There are no direct damages because nothing was paid and there's nothing to refund. Wrt indirect damages, there's bound to be a disclaimer or two, at least at the app layer.
If you give someone a bomb, or give someone a USB stick with a virus, or give someone a car with defective brakes, you are absolutely liable. Think about it.
If you give someone a USB stick with a virus, and you don't know about the virus, you aren't liable. Unless maybe you gave them some sort of warranty or guarantee that it was virus-free.
The lesson: don't use USB sticks people give you, unless you have your own way of verifying that they're virus-free.
Also, don't give people bombs. That's usually illegal, unlike giving someone software with unknown bugs in it.