Open source supply chain security at Google [video] (swtch.com)
126 points by mfrw 11 months ago | 25 comments



Interesting. I didn't realize that the Air Force security analysis of Multics in the 1970s contained a description of what we call the Thompson attack (have the compiler insert a back door, and because the compiler for the language is written in the language, have the attack persist even when the compiler is recompiled), and this was almost a decade before Thompson's lecture.


I'm pretty sure he's credited the Air Force report before, but I believe the actual report only resurfaced recently (maybe some time in the last year?).


Yes, the credit is in the original Reflections on Trusting Trust.

> Acknowledgment. I first read of the possibility of such a Trojan horse in an Air Force critique of the security of an early implementation of Multics. I cannot find a more specific reference to this document. I would appreciate it if anyone who can supply this reference would let me know.


I attended/presented at this event and loved this keynote.

I want to push back on the idea that there is no consensus on what an SBOM is: there are two very popular specs (SPDX and CycloneDX)! Yes, quality, content, and completeness do vary, and every SBOM does need to be enriched with vulnerability and runtime context. But those don't need to be in the file itself (and aren't static anyway).
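For the curious, here is an abridged, CycloneDX-flavored sketch of what a component entry records (shown as a Python dict; field names are from my reading of the spec, not verbatim from any tool's output — see cyclonedx.org for the real schema):

    sbom = {
        "bomFormat": "CycloneDX",
        "specVersion": "1.4",
        "components": [
            {
                # one entry per ingredient: what it is, which version,
                # and a package URL identifying where it came from
                "type": "library",
                "name": "log4j-core",
                "version": "2.14.1",
                "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.14.1",
            },
        ],
    }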


The lack of consensus is on what an SBOM is for. Even the NIST recommendation that came out of Executive Order 14028 had little guidance on how to apply SBOMs.

At any sort of scale, it isn't clear how an SBOM shipped with each package can be consumed to any great effect.

A central database of all dependencies, on which queries and analysis can be performed, can be very useful, however. In a large software shop, I've seen one used to rapidly get a very real sense of the company's exposure to events like the Log4j debacle.
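To make that concrete, with a central database the Log4j-style triage becomes one small query plus a version check. A minimal sketch, assuming a hypothetical table mapping services to their resolved dependencies (the schema here is invented for illustration):

    import sqlite3

    VULN_PKG = "org.apache.logging.log4j:log4j-core"
    FIXED = (2, 17, 0)

    def parse(v):
        # naive numeric parse; real tooling should use a proper version library
        return tuple(int(p) for p in v.split(".") if p.isdigit())

    conn = sqlite3.connect("dependencies.db")  # hypothetical central DB
    rows = conn.execute(
        "SELECT service, version FROM dependencies WHERE package = ?",
        (VULN_PKG,),
    ).fetchall()

    for service, version in rows:
        if parse(version) < FIXED:
            print(f"{service} ships vulnerable log4j-core {version}")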


A Debian/Ubuntu status file is a good start (of course you need to dig further for build depends), and it's helpful enough for provenance that at a couple of startups I've found it useful to deploy code as deb packages specifically to be part of that graph. Obviously not perfect, but often good enough to go automatically from CVE -> USN -> upstream package -> what part of our code cares about it. Someone still has to think about the vulnerability, but it reduces a lot of obvious noise and helps you drill down quicker.
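For what it's worth, the status file is also trivially machine-readable. A rough sketch of pulling the installed-package inventory out of it (stanza handling is simplified; dpkg-query is the robust route):

    def installed_packages(path="/var/lib/dpkg/status"):
        pkgs, cur = {}, {}
        with open(path) as f:
            for line in f:
                if line == "\n":  # blank line ends a package stanza
                    if cur.get("Status") == "install ok installed":
                        pkgs[cur["Package"]] = cur.get("Version")
                    cur = {}
                elif ":" in line and not line.startswith((" ", "\t")):
                    key, _, val = line.partition(":")
                    cur[key] = val.strip()
        return pkgs

    # e.g. first step of CVE triage: is the advisory's package even here?
    print(installed_packages().get("openssl"))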


Additionally, from what I can tell, a lot of SBOM tooling is manual/honor-based, and the automated tools don't recurse dependencies well.

Trusting the current state of SBOMs seems sketchy.
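To make the recursion point concrete: even in the easy case — one ecosystem, everything installed locally — a tool has to do something like the sketch below (assuming "requests" is installed), and real resolvers also have to handle extras, environment markers, and cross-ecosystem edges:

    import re
    from importlib import metadata

    def transitive_deps(pkg, seen=None):
        seen = set() if seen is None else seen
        for req in metadata.requires(pkg) or []:
            # crude name extraction; ignores extras and version markers
            name = re.split(r"[ ;\[<>=!~]", req, maxsplit=1)[0]
            if name and name not in seen:
                seen.add(name)
                try:
                    transitive_deps(name, seen)
                except metadata.PackageNotFoundError:
                    pass  # optional dependency that isn't installed
        return seen

    print(sorted(transitive_deps("requests")))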


An SBOM is for the producer at this time, not the consumer. It is about requiring producers to at least try to figure out what they put into your soup: the sort of engineering-101 process that almost all software development lacks. It is about making it possible, though not necessarily easy, for somebody to know what they are making.

The next step is exhaustiveness and automated production. At that point it will be accurate and exhaustive, but not necessarily easy to use. This is about making it possible, though not necessarily easy, for the consumer to know what they are getting.

The next step is then regularity and consistency to make it easy for the consumer to know what they are getting since it follows common, standard rules.

After that point we may get to a full reproducible build/linking manifest that allows automated validation that the SBOM matches the delivery. But this step has a good chance of fizzling out as a standard practice in general industry.
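Concretely, that last validation step could be as small as hashing what was delivered against what the manifest declares. A minimal sketch, assuming the SBOM carries a path and sha256 per file (these field names are invented, not taken from SPDX or CycloneDX):

    import hashlib
    import json

    def verify(sbom_path):
        with open(sbom_path) as f:
            sbom = json.load(f)
        for comp in sbom["components"]:  # hypothetical layout
            with open(comp["path"], "rb") as g:
                digest = hashlib.sha256(g.read()).hexdigest()
            status = "OK  " if digest == comp["sha256"] else "FAIL"
            print(f"{status} {comp['path']}")

    verify("release.sbom.json")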

The first three steps are clearly where this is all going, and they constitute a very minor cost relative to a very outsized benefit. We might not even get to the third step, but even then general software development would at least have reached engineering-101 process sophistication, which is a massive improvement over the status quo.


> An SBOM is for the producer at this time, not the consumer. It is about requiring producers to at least try to figure out what they put into your soup.

> The next step is exhaustiveness and automated production.

There are lots of vendors selling automated SBOM generation tools/services; my company's security team uses one. Is the output correct? I don't know, they don't know, nobody looks at it. But we have SBOMs [checkmark].


Yep, that is the point of the first phase. The next phase is going to be attaching liability for incomplete SBOMs.

The way it will likely play out is that if you were breached due to an undisclosed component in a purchased product, the product will either be deemed defective or the vendor will be held liable. If CISA succeeds at pushing that, you will see SBOMs becoming correct and exhaustive real fast, though likely excessive due to ass-covering.

But at this point the goal is clearly just establishing a paper trail so that it can eventually be audited for consequences. Maybe they will fail at the next step due to industry pushback against actual consequences for shoddy work, but that is clearly where it is trying to go.



For anyone else who has thus far avoided hearing the acronym "SBOM":

> A “software bill of materials” (SBOM)... is a nested inventory, a list of ingredients that make up software components.

https://www.cisa.gov/sbom


Any idea of the minimum amount needed to bribe an open source developer to add nefarious code which will be used in a trusted program?

Is there legal liability for making that change, even when the license says NO WARRANTY, including for FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT? It's not like the developer even delivers the software to anyone else.

Or is it only social standing? If so, I think $2 million would be enough to retire on.

It would seem that this attack vector is relevant for supply chain management. How is it currently evaluated and managed?

Also, at 8:27 (https://youtu.be/6H-V-0oQvCA?t=507), "Supply chain security has to be an industry-wide effort."

Would an unpaid FOSS developer, perhaps doing this as a retirement hobby, really be considered a part of industry? (For example, a statistician who continues to develop and distribute some very useful analysis software developed over her career, simply because she is interested in the approach.)

If so, what if they refuse to follow the demands of for-profit corporations for an SBOM and reproducible builds? If not, what should companies do when they depend on that FOSS software?


As in literally every other engineering discipline, it should be considered software-engineering malpractice to use a component that is neither guaranteed to be fit for purpose nor audited by you to be fit for purpose. If that means you cannot use some whizbang library because you do not know whether it works and you are too lazy to figure that out for certain, tough.

"But I want to use the random steel sheets of unknown quality I found lying around to make an airplane" is not an acceptable position. This does not mean you need to use aircraft-grade steel everywhere, though; you just need to use materials appropriate to your purpose.

If they really must use software that guarantees nothing, then they must verify it works for their purpose or pay somebody to make it fit for their purpose, simple as that.


So, no Python, no gcc, no R, ... even Visual Studio says "The software is licensed “as-is.” You bear the risk of using it. Microsoft gives no express warranties, guarantees or conditions. To the extent permitted under your local laws, Microsoft excludes the implied warranties of merchantability, fitness for a particular purpose and non-infringement." and limits direct damages to $5.00.

What software do you use that is guaranteed to be fit for purpose, or that you have fully audited?


I’m pretty sure you can’t (ever) disclaim liability for wilful actions; i.e., if you deliberately add nefarious code, you are liable no matter what your disclaimer says. The $2m would be eaten up by the legal bill, and the remainder would seem scant compensation when the person was “retiring” in prison.

Secondly, by bribing the person, even if they were covered by the disclaimer, I’m pretty sure you would be liable for civil damages and would probably violate one of many sweeping computer-misuse laws.


What you wrote puts a shiver down my spine. If I can't disclaim any sort of warranty, and my FOSS has a bug which a company incorrectly thinks is nefarious code, what defense do I have should that company decide to sue me? (Or threatens to sue me unless I fix the code to their satisfaction.)

I thought I could point to the explicit disclaimer of warranty&usability, the lack of a contract between us, the lack of payment, and the lack of copyright permission to even use my software unless they agreed to the license terms, as an absolute defense to request the court to dismiss the case.

And now you suggest it might be otherwise, and I might end up paying a lot in lawyer and court costs over years?

Has anything like this ever been taken to court?

For example, was there any legal liability for the people who added the Bitcoin miner to event-stream?

Or the denial-of-service caused by colors.js?

It comes down to what specific law is broken, and where the person is living. Extradition requires the crime be illegal in both countries, yes?

And who says the nefarious code must be hidden? What if it's described in the CHANGELOG and documentation? Because I know few people read those.


> What you wrote puts a shiver down my spine. If I can't disclaim any sort of warranty, and my FOSS has a bug which a company incorrectly thinks is nefarious code, what defense do I have should that company decide to sue me?

I am not a lawyer, you are not my client and this is not legal advice.

My layman’s understanding is that it is not possible to disclaim liability for malicious action which harms another.

If someone believes that you have maliciously harmed him, then he may try to recover damages from you for doing so — this is independent of software (he could do the same if you broke his leg).

> And now you suggest it might be otherwise, and I might end up paying a lot in lawyer and court costs over years?

Yup. The mechanism for sorting out claims of truth and liability is court, and wise plaintiffs and defendants engage lawyers. Unfortunately, this can be expensive, and in some cases the process is the punishment: one might be exonerated, but only after years and years of litigation and worry.

I imagine that generally a potential plaintiff’s lawyers would advise him that, absent clear evidence of malice, he is unlikely to win and would be wasting his money. Particularly if you have no real assets, the game is not likely to be worth the candle. But if he is rich or foolish, he might waste it anyway.


> for malicious action which harms another

Here are things a lawyer might be able to answer.

What specifically is my action that I am legally responsible for? I did not install my software on your machine; you did that. And I explicitly warned you not to trust it. Yet for some reason you decided not only to trust that code base, but to keep trusting me enough in the future to auto-update, likely without even telling me.

What makes it malicious? I can sell you a tool to re-format your hard disk. That is not malicious, because I described it that way.

So how well do I need to document some "malicious" feature before it's no longer legally malicious?

And, if I sell my company to another company, who then carries out these actions, is it my responsibility?

Are these criminal liabilities, or civil? What is the relevant law? What is the maximum penalty?

> Yup. The mechanism for sorting out claims of truth and liability is court

Sure. But clearly invalid cases can get summary judgment for far less than taking the whole thing to trial, and I've been judging my risk on the assumption that the former overrides any of my users' claims of malicious intent.

This doesn't seem like a topic that can go any further without an appropriate lawyer involved.


Firstly, I'm not any kind of lawyer and could be wrong. I was giving you my understanding.

Secondly, I said you can't disclaim liability for wilful actions. If you do something by mistake, or it's an unintentional side effect, of course you can disclaim liability. If you do something on purpose, you can't say you weren't responsible for the harm you caused just because you put a disclaimer in somewhere.

Yes, some of these sorts of things have been taken to court. In the UK, where I am, they are typically prosecuted under the Computer Misuse Act [1], but many countries have similar legislation. See the bit about a person being guilty of an offence if "the person intends by doing the act to cause serious damage of a material kind or is reckless as to whether such damage is caused". There is specific guidance from the UK prosecution service that DDoS is covered [2].

Thirdly, extradition doesn't require the conduct to be a crime in both countries. It requires it to be a crime in the country requesting extradition, there to be an extradition treaty between the two countries, and the conduct and request to meet the standard for extradition in the country where you are. That doesn't necessarily mean it must be a crime in the place you are, and the evidence doesn't necessarily need to meet the standard that would be required where you are. In the UK, for example, you can be extradited to the US if the US shows "probable cause" (the US standard) that you committed an offence in the US, because the extradition treaty says that's equivalent to the standard that would be required to get a warrant for your arrest in the UK.

I didn't say anything about the code being hidden. It would depend on the legislation, but as you can see from the sources I posted, the important standard in the UK is that the act was intended to cause harm, or that you were reckless as to whether it could cause harm; nothing to do with being hidden. If you did hide it, that would probably help to show it was intentional rather than an error, though. Under that legislation the maximum sentence for that part is 14 years, or life if you cause harm to critical national infrastructure.

[1] https://www.legislation.gov.uk/ukpga/1990/18/section/3ZA

[2] https://www.cps.gov.uk/legal-guidance/computer-misuse-act and DPP vs Lennon


Clearly I am not a lawyer, and my understanding of extradition is faulty. I'll have to increase my bribe rate enough to get a lawyer involved. Wikipedia tells me that Namibia has no extradition treaty with the US...

I looked at [1], and I think it comes down to what "authorised" means. If you install one of my non-malicious FOSS packages, does that automatically authorise me on your systems? If not, then I'm clearly already in trouble for doing an "unauthorised act in relation to a computer".

Who authorised me? Who decides if I am unauthorised? How do I know it's unauthorised, as required by (b), when I don't even know you installed it?

My likely naive belief is that when you get a software package from me - one which disclaims even fitness for purpose, and which disclaims all liability for damages, to the fullest extent allowed by law - then you are responsible for vetting the software and authorising its use on your system.

If the documentation says "doing X will wipe your home directory", and you do X, then you've authorised that action, yes?


That’s not what authorised means, and I think you know that and are just being deliberately obtuse. If I install your package, I would need to be authorised, not you.

But as I say, if you deliberately put something in your software package intended to cause damage, you can't disclaim liability for that. It's exactly the same as any other piece of malware: if it gets installed on my computer, you are responsible for the damage it causes.

I’m not going to respond further on this thread now.


I was pointing out that the law you pointed to specifically requires that "the person does any unauthorised act in relation to a computer".

My comment was to highlight that discussion of "authorised" does not make sense, strongly suggesting the specific law you mentioned was not relevant.

You give an example saying that I can't disclaim liability for software which deliberately causes damage. That may be true, but what law am I breaking?

Can I distribute an "rm" tool which will let you do "rm -f /"? Can I disclaim damages for using that tool? I really hope so!

Your "it gets installed" is doing a lot of heavy lifting. Who installed it? You? Why did you install it? Did you read the documentation to learn about any changes? What counts as sufficient notice for the potential of causing damage?

But back to the chilling effect: if I did NOT deliberately put something in my software package intended to cause damage, but a company thinks I did, must I go through a lengthy and expensive lawsuit, or can I point to the lack of warranty, etc., and have the suit dropped?

Because if I can't easily get the suit dropped, then a company can use the threat of a lawsuit to get me to make changes to the code, e.g., to "fix" code they think I maliciously added.


It's way less than $2M, since they don't bribe people to be evil, they just buy the tool and make their own alterations.


Right, but an SBOM (under the big assumption that it's fully accurate) would at least let you detect project ownership changes like that.

Not that I know what you would do with that information.
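If you had periodic SBOM snapshots carrying a supplier/maintainer field, the detection itself is trivial to sketch. Assuming a {component: supplier} mapping extracted from each snapshot beforehand (event-stream's actual handoff was dominictarr -> right9ctrl):

    def ownership_changes(old, new):
        # flag components whose recorded supplier/maintainer changed
        return {
            name: (old[name], new[name])
            for name in old.keys() & new.keys()
            if old[name] != new[name]
        }

    before = {"event-stream": "dominictarr"}
    after = {"event-stream": "right9ctrl"}
    print(ownership_changes(before, after))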

My scenario was to give an example of a supply chain attack where even a fully accurate SBOM would give no signal.

Fundamentally, if you are using FOSS distributed at no cost, and with no contractual relationship with the developers, then how complete and useful will an SBOM really be?

To be absolutely clear, I don't mean to dismiss the idea. But I remember Y2K, when companies sent Y2K compliance requests to vendors in their supply chain, including open source projects that had no obligation to the company, which struck me then as presumptuous. I don't like the assumption that company X's decision to use your free/open source software obligates you to align with industrial requirements.



