> Risky Biz (and really anything Patrick Gray touches) is awesome. Would recommend.
Patrick Gray is a good interviewer and news source, but he is also a big booster of the American surveillance apparatus, and has spoken against such reasonable reforms as requiring the FBI to get a bloody warrant when it searches for Americans' data in the 702 database (data that is collected specifically for foreign-intel purposes and thus subject to less constitutional scrutiny). He recently defended the NSA's acquisition of netflow data because the NSA needs it to do its work, as if constitutional privacy rights should give way to a spy agency's priorities.
He is just way too trusting of these agencies' ability to police themselves. I swear his quote when they talked about the NSA getting netflow data was something along the lines of "if people only knew how many meetings they had to have, they would understand". Those are both examples I am pulling from memory, so don't take them as gospel. And of course, no source of news / commentary is unbiased.
I listen to and enjoy the Risky Biz podcast. And institutional trust is a legitimate aspect of security, especially in infosec. I just wish he were more skeptical of western law enforcement and intelligence agencies (he is already more than skeptical of non-western law enforcement and intelligence agencies, which is fine; I just wish he did not give the Five Eyes countries a pass because we are "the good guys"). He recently interviewed people at NSA headquarters, for Pete's sake.
I find it somewhat hilarious the way he dismisses the possibility that it could have been a Five Eyes attack, on the grounds that it would be illegal and the Five Eyes guys are too good to let their backdoors be open to a replay attack.
He has written about it before, but listening to the podcast, the most interesting part for me is that the hotspot that led to Freund's discovery had no debugging symbols attached to it. In free software, trying to hide something actually makes it stick out more.
I wonder if a subsequent, more sophisticated attack could add some bogus debugging symbols to better hide its tracks.
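To make the "missing symbols stick out" point concrete, here's a tiny sketch (assumptions: a Linux-ish box with a C compiler, GNU `nm`, and `strip` on the PATH; `demo.c` and `answer` are invented names, not anything from the actual xz code):

```shell
# A normal build carries a symbol table; a stripped one doesn't.
# The latter is the kind of anomaly a perf profile surfaces as a raw
# unresolved address where a function name ought to be.
cat > demo.c <<'EOF'
int answer(void) { return 42; }
int main(void) { return answer(); }
EOF
cc -O2 -o demo demo.c
nm demo | grep answer      # the symbol shows up in the symbol table
strip demo
nm demo || true            # GNU nm now reports "no symbols" and fails
```

Bogus-but-plausible symbols, as speculated above, would defeat exactly this kind of eyeballing, though they'd have to survive comparison against what the source says should be there.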
- A serious SSH backdoor was discovered in the xz Linux compression library, allowing attackers to compromise SSH servers.
- The backdoor was discovered by Andres Freund, a Postgres developer, who noticed suspicious CPU usage and login attempts on his systems.
- The xz backdoor allowed attackers to bypass authentication and gain root access on compromised systems.
- Microsoft faced significant criticism from the CSRB (Cyber Safety Review Board) for a cascade of errors related to a China-based hack.
- Ukraine was able to leverage an old WinRAR vulnerability to hack into Russian systems as part of the ongoing conflict.
- There have been recent "MFA bombing" attacks targeting Apple users, combining push notifications and social engineering.
- A ransomware gang leaked stolen Scottish healthcare patient data as part of an extortion attempt.
- Renowned security expert and author Ross Anderson passed away.
- The episode features a discussion with Andres Freund about his discovery of the xz backdoor.
- The podcast sponsor, Island, discusses how enterprises are moving away from VDI (Virtual Desktop Infrastructure) towards security-focused enterprise browsers.
Some group of coders pissed off because they were rushed into shipping, and a middle manager or two who were trying to make an arbitrary deadline blaming it on incompetent devs.
2024-02-29: On GitHub, @teknoraver sends pull request to stop linking liblzma into libsystemd. It appears that this would have defeated the attack. Kevin Beaumont speculates that knowing this was on the way may have accelerated the attacker’s schedule. @teknoraver commented on HN that the liblzma PR was one in a series of dependency slimming changes for libsystemd; there were two mentions of it in late January.
I think that this is a plausible explanation for a rushed schedule and an actually justified deadline.
(From the "history may not repeat itself but it sure does rhyme" files: Berkeley releasing a new ftpd with fixes for a bunch of buffer overruns (in Fall of 1988, when this was a New Thing) was widely suspected to be a reason the Morris Worm launched when it did, because some of the holes it used might have gotten closed too. I'm not sure if this ever got confirmed but it was part of the earliest responsible disclosure narratives...)
We're within weeks of the cut-off for Ubuntu's next LTS, Red Hat 10, etc. Fail to deliver their code upstream in the next few weeks and they'd be out of luck for getting it into an enterprise distribution for the next 3-4ish years.
There's nothing arbitrary about the deadlines, the clock was absolutely ticking.
You're not wrong; I'm mostly joking here, because it's a little funny to think about the bureaucracy of spycraft and malice.
Not James Bond stuff, no beautiful high-value targets getting seduced, just some guy with ADD who spent about 15 days too many bikeshedding and botched a critical deadline.
Why would they? IMO, signs point to a nation state, or at minimum a very sophisticated APT, so it seems incredibly unlikely they're going to make a mistake like that.
If it was a nation-state, then revealing themselves could cause a serious diplomatic incident. If not a nation-state, then revealing themselves could expose them to criminal prosecution. The perpetrators have almost certainly been busy erasing their tracks and going underground. Unless they're caught in spite of their best efforts, we'll never hear from them again.
Where is there any law that says you can't put any code you want into open source that is completely visible, auditable, and reviewable by everyone who uses it?
The CFAA is quite broad. I wouldn't be surprised if it's broad enough to allow for prosecution in this case. If you interpret the CFAA as narrowly as possible, then you might only be able to prosecute once the backdoor is exercised, but at the very least you could then. And then there's the world outside the U.S., where the laws might be broader.
Why would it matter that the thing being sabotaged is "completely visible, auditable, and reviewable"? Do you have any specific laws in mind that you think would not apply?
If you were referring to the malicious code being contributed, not to the open source project as a whole, I don't think "completely visible" is an accurate description of the deliberately obfuscated chain of m4 gobbledygook and binary blobs that makes up the backdoor.
It just slays me that Andres Freund, absolute bastion of database goodness, is now just "the guy who found the xz bug". Funny old world.