... because if you don't, and someone malicious also discovers this vulnerability, they can use it to do bad things? If I can get a vulnerability patched before it can be exploited, I can potentially prevent a hacker from stealing people's identities, credit card numbers, private data, etc. To have that opportunity and not act seems irresponsible.
I must be misunderstanding. Would you mind expanding on this more?
You are not misunderstanding. I do not in the general case have a duty to correct other people's mistakes. The people deploying broken software have a duty to do whatever they can not to allow its flaws to compromise their users and customers. Merely learning something new about the software they use does not transfer that obligation onto me.
I would personally, in almost every case, report vulnerabilities I discovered. But not in every case (for instance: I refused to report the last CryptoCat flaw I discovered, though I did publicly and repeatedly warn that I'd found something grave). More importantly: my own inclination to report doesn't bind every other vulnerability researcher.
Well, I'm glad you do report the vulnerabilities you find. Maybe it's my own naive, optimistic worldview, but I profoundly disagree with your stance that a researcher is not obligated to report. I think it is a matter of public safety. If you found out a particular restaurant was selling food with dangerously high levels of lead, aren't you obligated to tell someone, anyone, for the public good? If you don't, you aren't as culpable as the restaurant serving this food, but that's still a lot of damage you could have prevented at no real cost to yourself.
I understand morality is subjective, but that's my 2 cents on the matter.
EDIT: as for the vulnerabilities you didn't disclose, I really can't understand why not. Why not just send an email to the maintainer: "hey, when I do X I cause a buffer overflow"? You don't even have to help them fix it. You probably won't answer this, but can you tell me why you wouldn't disclose a vulnerability?
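To make that concrete, the kind of flaw I have in mind could be demonstrated by a snippet as small as this. (A minimal sketch of a classic stack buffer overflow; the function name and buffer size are invented for illustration, not taken from any real project.)

    /* Hypothetical proof-of-concept: parse_name() copies attacker-controlled
     * input into a fixed-size stack buffer with no length check. */
    #include <string.h>

    static void parse_name(const char *input) {
        char buf[16];       /* fixed-size stack buffer */
        strcpy(buf, input); /* no bounds check: any input longer than 15 chars overflows buf */
    }

    int main(void) {
        /* 32 'A's plus the NUL terminator land in a 16-byte buffer: smashes the stack */
        parse_name("AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA");
        return 0;
    }

Paste something like that into an email, name the affected version, hit send. That's the whole ask.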
I do not report all the vulnerabilities I find, as I just said.
I confess to being a bit mystified as to how work I do on my own time, uncompensated by anyone else, which work does not create new vulnerabilities but instead merely informs me as to their existence, somehow creates an obligation for me to act on behalf of the vendors who managed to create those vulnerabilities in the first place.
Perhaps you have not had the pleasure of trying to report a vulnerability: losing several hours just trying to find the correct place to send it, being completely unable to find a channel that doesn't put the plaintext on the Internet in email or some dopey web form, only to get a response from first-line tech support asking for a license or serial number so they can provide customer support.
Clearly, you have not had the experience of being threatened with lawsuits for reporting vulnerabilities --- not for software running on someone else's servers (which, absent a bug bounty, you do not in the US have a legal right to test) but for software you download and run and test on your own machine. I have had that experience.
No. Finding vulnerabilities does not obligate someone to report them. I can understand why you wish it did. But it does not.
I see your point about it being overly difficult to report vulnerabilities, and especially the legal threats; that seriously sucks. I guess I believe you have an obligation to make some effort to disclose, but if a project is just irresponsible and won't fix their shit, or will try to sue you, it's out of your hands.
Somehow my doing work on my own time creates an obligation for me to do more work on behalf of others.
Can't I just flip this around on you and say you have an ethical obligation to spend some of your time looking for vulnerabilities? If you started looking, you'd find some. Why do you get to free-ride on my work by refusing to scrutinize the stuff you run?
> Somehow my doing work on my own time creates an obligation for me to do more work on behalf of others.
To some small extent, yes, though how much work is up for debate. Maintainer's email and PGP public key are right there on the website? Yeah, I think you're obligated. No email you can find, no way to contact them, or they're just outright hostile? No, I think you shouldn't have to deal with that.
But I feel like you agree with that, though maybe not in those exact words. After all, you've had to jump through all kinds of hoops to disclose vulnerabilities, been threatened with lawsuits for doing the right thing, and yet you still practice responsible disclosure in almost every case in spite of the burden of effort and potential risk. Aren't you doing it because you think disclosure is the right thing to do? That's all I mean by obligation.
EDIT: sorry, not "responsible disclosure," "cooperative disclosure" or whatever term you want to use for disclosing the vulnerability to the maintainer.
I think it is a matter of degree. Here - not sure how this is handled in other countries - it is a crime if you come across an accident and do not attempt to help. And to me this is obviously the right thing to do, not only because it is required by law but because there is a moral obligation to do so.
Nobody has to enter a burning car and risk their life, but at least you have to call the emergency services or do whatever you can reasonably do to help. And it really doesn't matter whether you are doing your work delivering packages, whether the accident was the driver's own fault because he was driving intoxicated, whether somebody else could also help, or whatnot.
Discovering a vulnerability is of course different in most respects - the danger is less imminent, the vendor may have a larger responsibility, and so on. But the basic structure is the same: more or less by accident you end up in a situation where there is a danger and you are in a position to help make the outcome better.
So I think one cannot dismiss the possibility of a moral obligation to disclose a vulnerability to the vendor based on the structure of the situation alone. One has to either argue that there is also no moral obligation in the accident scenario, or argue that the details are sufficiently different that a different action - or no action in this specific case - is the morally correct, or at least a morally acceptable, action.
Accidents and vulnerabilities are not directly comparable, so a position on vuln disclosure does not necessarily imply a particular position on accident assistance.
I would feel a moral obligation to help mitigate concrete physical harm to victims of an accident. I feel no such obligation to protect against hypothetical threats to computer systems.
Chances are, you recognize similar distinctions; for instance, I doubt you feel obligated to intervene in accidents that pose only minor personal property risks.
That is also my point of view: severity and other factors matter. But that also seems to imply the same thing for vulnerabilities - discovering a remote code execution vulnerability in Windows might warrant a different action than a hidden master password in obscure forum software no one has really used in a decade. The danger is still more abstract, but it can still cause real harm to real people.
I would personally disclose RCE in Windows, not least because I think Microsoft does a better-than-average job in dealing with the research community.
But I need to be careful saying things like that, because it is very easy for me to say that, because I don't spend any time looking for those kinds of flaws. Security research is pretty specialized now, and I don't do spare-time Windows work. I might feel differently if I did.
I would not judge the (many) researchers who would not necessarily disclose that flaw immediately.
If there is a vulnerability, it might already be in use by hackers. People need to know about it immediately so they can defend themselves (by closing a port, or switching to a different server, or something). Companies need to be encouraged to find and fix this kind of thing themselves, without waiting for someone to embarrass them by finding it.
There is no such thing as responsible disclosure. The concept is nonsensical. Also, you're overestimating the consequences of a single bug. The boring reality is that bugs rarely matter.
When you say obligation, do you actually mean that? An obligation is enforced by some sort of penalty, either legal (ultimately a threat of violence) or social (public shaming). There is no incentive for meeting an obligation outside of avoiding punishment, so why would individuals and private enterprises do any infosec work?
Then you misunderstood your own logical conclusion...
You said (and I quote):
Can't I just flip this around on you and say
you have an ethical obligation to spend some
of your time looking for vulnerabilities?
No. No, you can't. Unless you could convince me that my Dwarf Fortress skills have a similar magnitude of real-world effect as the vulnerabilities I've discovered on my own and decided to pocket for one reason or another.
No. By my logic, you are better off not doing vulnerability research in your spare time if you have to worry about the legal ramifications of your actions.
The ethical conundrums are unavoidable, and those calculations are indeed difficult.
The legal consequences are an artifice, and by submitting to them (while ignoring the externalities and not going public), you are likely putting others at risk.
Forget the legal consequences. Reporting vulnerabilities is work. By your logic, by doing some work in my spare time, I am morally obligated to do more work for others. I'm better off just picking something else to work on.
Perhaps so... As long as you recognize that your work is inherently dual-use (it has effects beyond your initial intent), and you don't intentionally hide that fact from yourself or others, I have no problem with what you do.
To your second question: because some projects are fundamentally irresponsible, and providing vulnerability reports to them means making an engineering contribution, which decreases the likelihood that the project will fail.
I must be misunderstanding. Would you mind expanding on this more?