
No one can agree on what should actually be part of the regulation, though. It would end up becoming a box-checking exercise, like security compliance, where satisfying the requirements and actually being secure are unrelated.

Can you imagine if governments regulated something like: if your code touches important data, it must be written in a verified memory-safe language like Rust? How would people feel about that?

Traditional engineering is a lot easier to regulate because there are a lot of agreeable rules like “fire escape stairs need to be x cm wide with a step depth of z cm,” and there is no situational context that needs to be taken into account.

Programming best practices, by contrast, are all vague heuristics that should be discarded as soon as they stop helping.



> Can you imagine if governments regulated something like: if your code touches important data, it must be written in a verified memory-safe language like Rust? How would people feel about that?

Considering the history of memory-unsafe bugs and exploits, and the ongoing efforts to rewrite critical software in Rust for memory safety, I imagine this could be a real debate.

In the “real world” we have requirements for materials, tools, and practices because some are less safe. Imagine if a building were welded with a technique known to fail: we’d legislate away from that technique, even when people say “there are ways to make unsafe welding ‘safe enough’.”

Software could maybe use some rigor even if it means “go slow and don’t break things”.


I think regulating against unsafe languages like C might be one of the few agreeable rules. But I'd be concerned it would end up mandating a mountain of useless design patterns and bloat, which could freeze the state of software development.

If these laws had come out 20 years ago, I imagine we'd all still be legally required to use Java with UML diagrams.


Software - or computer systems - are regulated in cases where it matters. Avionics software is regulated by the FAA. Medical devices are regulated by the FDA. One of those "sudden acceleration" suits went against the manufacturer partly because they hadn't followed proper coding standards. Credit card processing is regulated by the industry rather than the government, under the PCI DSS standard. Cloud providers advertise as being HIPAA compliant, so that set of rules must have a technical component too.


I have worked on medical devices and HIPAA-controlled data. I have seen PhDs leading their field working on the same code as a college grad who didn't even have a CS degree. Trust me: you do not want to be operated on by a robot built by the kid fresh out of college.


Software isn't all that different. There are plenty of things that the government could mandate.

For example, there could be coverage requirements. Want to work on software that determines if a bridge will be safe or not? Then you need 100% branch coverage.

You gave as an example something from a building code. That's just a test case. There's no reason why, for common types of software (CAD, fluid simulators), we couldn't have repositories of test cases that you must pass to within some tolerance. You could even make it adversarial and let people propose new test cases.
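A minimal sketch of what such a repository could look like. Everything here (the `BeamCase` fields, the numbers, the `solve(span_m, load_kN) -> deflection_mm` solver interface) is invented for illustration, not taken from any real standard:

```python
# Hypothetical shared test-case repository for structural-analysis software.
# All case data and the solver interface are made up for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class BeamCase:
    span_m: float                  # beam span
    load_kN: float                 # point load at midspan
    expected_deflection_mm: float  # agreed-upon reference result
    tolerance_mm: float            # how far an implementation may deviate

# A real repository would hold thousands of cases, with adversarial
# submissions reviewed before being added.
CASES = [
    BeamCase(10.0, 50.0, 12.4, 0.5),
    BeamCase(20.0, 25.0, 49.6, 1.0),
]

def check_solver(solve):
    """Return a list of failure messages; an empty list means the solver passes."""
    failures = []
    for c in CASES:
        got = solve(c.span_m, c.load_kN)
        if abs(got - c.expected_deflection_mm) > c.tolerance_mm:
            failures.append(
                f"span={c.span_m} m, load={c.load_kN} kN: "
                f"got {got:.2f} mm, expected {c.expected_deflection_mm} mm"
            )
    return failures
```

Certifying a tool would then just mean running `check_solver` against the published cases and requiring an empty failure list.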

Or we could designate some reference implementation that you must either match or exceed. So if you design CAD software, then on any input that both your software and the reference implementation can handle, you must match the reference. That's more of a programmatic, open-ended test case.
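That rule is essentially differential testing, which can be sketched in a few lines. Here `reference` and `candidate` are stand-ins for real tools, and the scalar-in/scalar-out interface is an assumption to keep the sketch short:

```python
# Differential testing against a reference implementation: on any generated
# input the reference accepts, the candidate must agree within a tolerance.
import random

def find_mismatch(reference, candidate, gen_input, trials=1000, tol=1e-9):
    """Return a counterexample input, or None if the candidate matches
    the reference on every generated input the reference accepts."""
    for _ in range(trials):
        x = gen_input()
        try:
            want = reference(x)
        except Exception:
            # The reference can't handle this input, so the rule
            # places no constraint on the candidate here.
            continue
        if abs(candidate(x) - want) > tol:
            return x
    return None
```

A regulator could publish the reference and the input generator; vendors would then be judged purely on whether `find_mismatch` comes back empty.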

If we're worried about discrimination, there's no reason why the government couldn't publish a set of typical profiles (fake names, locations, credit scores, etc.) that your algorithm must treat equally to within some tolerance.
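A toy version of such a published test set. The profile fields, the names, and the `TOLERANCE` value are all assumptions for illustration; no regulator publishes this exact format:

```python
# Hypothetical equal-treatment test set: each pair of profiles is identical
# except for the name, a common proxy for a protected attribute in audit
# studies. All data here is fabricated for illustration.
TOLERANCE = 0.01

PROFILE_PAIRS = [
    ({"name": "Emily",   "zip": "60614", "credit_score": 710},
     {"name": "Lakisha", "zip": "60614", "credit_score": 710}),
    ({"name": "Greg",    "zip": "30301", "credit_score": 640},
     {"name": "Jamal",   "zip": "30301", "credit_score": 640}),
]

def passes_audit(score):
    """True if `score(profile) -> float` treats each paired profile
    equally to within TOLERANCE."""
    return all(abs(score(a) - score(b)) <= TOLERANCE
               for a, b in PROFILE_PAIRS)
```

Any scoring function that looks (directly or indirectly) at the name would fail the audit, while one driven only by the shared fields would pass.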

etc.

We can absolutely regulate software the way we do other engineering disciplines, if we want to and think the resulting increase in cost and slowdown in development is worth it.


Code coverage is not a very useful metric. It's easy to reach 100% coverage without really testing anything. It works as a rough measure of test quality only as long as none of the developers care about the number; as soon as coverage becomes a goal in itself, it becomes useless.
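A toy demonstration of that gap (the function and test are invented for the example): this test suite executes every branch of `classify`, so a coverage tool would report 100%, yet it asserts nothing and would pass even if every return value were wrong.

```python
def classify(n: int) -> str:
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    return "positive"

def test_classify_all_branches_no_assertions():
    classify(-1)  # hits the n < 0 branch
    classify(0)   # hits the n == 0 branch
    classify(1)   # hits the fall-through
    # 100% branch coverage, zero verified behavior.
```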


I think the regulation should happen upstream of the software. Code that goes in a medical device should be subject to different levels of rigor than code that goes into a free to play mobile game.

To use your example of CAD software, certainly the system used to design a bridge should be built differently than the one used to make inputs to a consumer level 3D printer.

There’s no such thing as a “casual” bridge or a “good enough” pacemaker, but software spans that entire spectrum.

Software’s infinite flexibility means that it should be regulated within its context of use.


Code coverage tells you code was executed, not tested. What a worthless metric.


> ... governments regulated something like if your code touches important data, it must be written in a verified memory-safe language like Rust?

You're attacking a strawman.

Having a professional body that people can be ejected from makes a lot of sense in cases of egregious misconduct or malice.

In events like the Boeing 737 MAX crashes, the diesel emissions defeat device, and backdoors & spyware, engineers should have safe channels to report wrongdoing and get legal protection, but should also face expulsion if they stay silent.



