Yes, I can. I can picture a product vendor using a modified Linux kernel with that code in a driver, or at least a moral equivalent. E.g., a NAS box that is "only compatible with brand X drives".
Just because the code is available doesn't mean it does no damage. Only that you have the ability to find and fix the code.
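To make the hypothetical concrete, here is a minimal sketch of the kind of vendor-lock check such a modified driver might carry. The function name, the "BRANDX" prefix, and the idea of matching on the drive's model string are all made up for illustration; this is not code from any real product.

    #include <errno.h>
    #include <string.h>

    /* Hypothetical vendor-lock check: refuse to bind to any drive whose
     * model string does not start with the vendor's own prefix. */
    static int brandx_check_drive(const char *model_string)
    {
        if (strncmp(model_string, "BRANDX", 6) != 0)
            return -ENODEV;   /* pretend the device is unsupported */
        return 0;             /* drive accepted */
    }

The point of the sketch is only that such a check is a handful of lines and easy to bury in a large driver.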
A NAS box "only compatible with brand X drives" is nowhere near a NAS box that intentionally bricks non-brand X drives when attached.
As a side note, are there any known cases where a vendor has released open source code that intentionally bricks a device? I'd be surprised if they were not found legally liable if the intent was spelled out so clearly.
There are physical holes on the disk. "The test program writes the sections with a test pattern which generates a change in the pattern of magnetic domains of the medium, a subsection at a time, with a subsection responding to the test pattern only in the absence of indicia thereon, to form a stored pattern on the given section. An expected pattern and the stored pattern are compared at least a subsection at a time to determine if corresponding subsections have a predetermined pattern of magnetic domains."
If you have an "insert key disk now" prompt, followed by a write to the disk and then a verification, a standard (non-key) disk will get corrupted.
That 'bricks' non-brand floppies by design, under the aegis of copy protection.
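A rough sketch of that verify step, assuming hypothetical write_sector()/read_sector() helpers for the raw disk I/O (neither is a real API, and the pattern byte is made up):

    #include <stdbool.h>
    #include <string.h>

    #define SECTOR_SIZE 512

    /* hypothetical raw-I/O helpers, provided elsewhere */
    int write_sector(unsigned sector, const unsigned char *buf);
    int read_sector(unsigned sector, unsigned char *buf);

    static bool looks_like_key_disk(unsigned test_sector)
    {
        unsigned char pattern[SECTOR_SIZE];
        unsigned char readback[SECTOR_SIZE];

        memset(pattern, 0xA5, sizeof pattern);
        write_sector(test_sector, pattern);   /* clobbers whatever was there */
        read_sector(test_sector, readback);

        /* On a genuine key disk the physical hole keeps the pattern from
         * being stored, so the read-back differs.  An ordinary disk stores
         * it faithfully, fails the check, and has already lost that sector. */
        return memcmp(pattern, readback, SECTOR_SIZE) != 0;
    }

Either way the write happens first, which is why an ordinary data disk inserted at the prompt comes back corrupted.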
There has been open source software with back doors in it. E.g., Firebird had one that took about six months to detect after the source code was released.
> Always keen to shave the last few cents from their bill of materials, these manufacturers tend to procure their firmware from low-cost suppliers that have in turn delivered open source software without passing on its licensing terms. It can come as a surprise to the hardware manufacturer to discover a violation of the license terms.
While that's about violations of licensing terms, it shows there's no guarantee that manufacturers are aware of all of the code in the software they sell.
Thanks for the second example. It's a clearer use case than the mischievous do_damage() function I had in mind.
I think intent would definitely play a role in establishing liability for damaged equipment (i.e. the user misusing the device versus the company tampering with the user's equipment).
> Just because the code is available doesn't mean it does no damage. Only that you have the ability to find and fix the code.
Also, your shenanigans become visible. You'd have to code extremely carefully to introduce the breakage with plausible deniability, and even then your name as a developer would forever be mud.
Can you picture a Linux or *BSD driver in which there is an intentional do_damage() function?