Read your question again, and realize that you just asked whether it would be possible to engineer a virus that kills all of humanity, with no possible immunity.
See the problem?
This is why people hate (software) engineers: they are oh-so-dumb on the "empathizing with humanity" front. In most cases their pie-in-the-sky ambitions are more likely to produce shitty solutions replacing ones that worked well enough, to blow up (potentially catastrophically), or to bring on some humanity-ending apocalypse than to actually solve the problem... and what do they have to say for themselves?
The notion that there exists Knowledge Man Is Not Meant To Know is quaint. If it's something humans can find out, it's better that it be right out in the open so the danger can be met head-on.
Can you explain where the uncertainty is coming from?