The argument from those in power against encryption is mind-bogglingly stupid, and I don't know if it is due to extreme ignorance or to a lust for the power and leverage that mass surveillance grants a governing entity (I suppose it could be both?)
Regardless, using the argument of child pornography and sex trafficking is an emotional play, and it is solely designed to resonate with those who also do not understand the technology.
If this same argument took another form, e.g. if this were an attempt to ban walls made out of non-transparent material because opaque walls allow child abuse to occur hidden from sight, the obvious violation of privacy would be evident to the average person.
What inevitably happens with laws that create such a drastic power imbalance between your average citizen and the governing entity is that those with power and status are exempt.
> using the argument of child pornography and sex trafficking is an emotional play
It is, and it's used every time something like this comes up. But take it for what it is: most of the lawmakers don't fully understand what they're deciding on, so they depend on lobbyists for their info, and unfortunately the lobbyists are always just interested in making money, so the lawmakers get a skewed view and some nice talking points. I'm not sure whether to feel bad for them or just be confident that the next generation of life long politicians might be people like us who are aware of this problem and enact laws to protect privacy.
> just be confident that the next generation of life long politicians might be people like us who are aware of this problem and enact laws to protect privacy.
Do you want a pay cut? I don't want a pay cut. Unless we take a pay cut and go get elected, it won't be people like us who are the next generation of life long politicians.
You don't have to take a pay cut! We at MegaCorp value excellent politicians like yourselves; if you get elected, we will have many speaking and consulting opportunities for you. Just make sure that you keep doing the right thing and stand for the truth! (Unlike our previous guy, who started believing in lies about MegaCorp -- we had to discontinue his contracts.)
I don't think it's pay that's the issue. I'm sure you can survive perfectly well on a politician's salary. Some of us make little or nothing anyway, because we don't want to play the corporate games. The issue is that rational people who value established facts will have difficulty getting elected when running against people who rely on manipulating emotions and telling people what they want to hear.
Edit: also some parts of the world, like the US and UK, have district-based voting systems that favour large parties. You can't actually get anywhere in politics without the backing of such a party, and it would be difficult to convince the party to make you a candidate when you aren't playing the same political games.
A good life tactic, used by most of the elite postgraduate tech people I know, is to treat their careers as if they were sports stars'.
The big time early years have massive rewards. One in a handful go on to be the team coach / manager. A few more graduate to coaching support staff at a much lower salary. A very lucky few land a job for life at ESPN or BBC Sport. Some don’t need the money. Some do.
The majority move on though and retrain as gym coaches, teachers, parents, therapists, firefighters. They take a pay cut and a role with much lower global impact and a much higher local impact. Often because the latter is genuinely very rewarding.
It’s something to look forward to as you get older but a drop in salary is going to be a fact of life unless you have exceptional transferable skills that put you at the top of the game in your chosen second career, be that in the private or public sector.
> It’s something to look forward to as you get older but a drop in salary is going to be a fact of life unless you have exceptional transferable skills that put you at the top of the game in your chosen second career, be that in the private or public sector.
I can't imagine how this would ever become the norm, unless your early career is in crazy SV startup world or some similar environment, where VC funding leads to hugely inflated salaries but ageism is rampant. And if you do go into that environment but you haven't then earned enough to retire and/or learned enough to start your own business by the time the gravy train runs out, you're probably doing something wrong.
That might be their official salary (I'm assuming?) but unless senior politicians in the US are different to almost everywhere else, that will likely only represent a proportion of their total income as a result of their political career. Between the perks that come with the job while it's happening and the consultancies and board positions and speaking engagements that come afterwards or even during, a politician who reaches that kind of level is probably going to make far more money from it than just their salary.
People who are at the top of whatever they do also have many other forms of income -- consultancies, speaking engagements, board positions -- and they often simply get paid better too.
...and except for media celebs are usually not in the public limelight. Being in the public limelight frequently stinks.
That depends very much on what they do and what "at the top" means.
There are few professions where those in the upper levels can command the kind of auxiliary income that a high-profile former politician can. Even in fields like tech, it's not usually the case except in the US.
Very explicitly: everyone should try to become a member of Congress - they are exempt from SEC insider trading laws. Your friend at a FAANG who has the quarterly results early? Take their info and short TSLA and face no repercussions from the SEC!
There are only 535 members of Congress. If you're good enough to be in the top ~500 in the country at something, you could _probably_ make more money (lobbying/extracurriculars excluded...) doing something else.
Probably very true for the laughable amounts of money I generally see in news stories about political corruption. It really seems like it should take more to buy a politician than a 20k bribe for a pardon for example (https://www.courier-journal.com/story/news/politics/ky-gover...)
Well, if you want the top 500 people at benefiting society: if they earn much more than the median societal income, they're almost certainly not in the top 500 people at benefiting society.
If you want the top 500 people at benefiting you, and only people like you, and making sure that society as a whole doesn't benefit, even by preserving the planet for them ... yeah, sure, pay oodles.
Personally, no. I'll be the first to put my hand up and say I don't have the patience to try to deal with the bureaucracy, but I do believe that there are some people who go into politics to genuinely make a difference and improve society. As cynical as we can be about politicians, there are surely people out there who would just like to 'fix' things. Usually the roadblock to that is the interest bodies with influence. It's annoying at best, but I hold out hope.
Someone once said to me that every politician was a class president somewhere. As I think back to every class president I've ever had, it explains pretty much everything I've ever hated about any given politician.
Under many circumstances, that would be a likely explanation, but who outside government is lobbying for this kind of invasion of privacy and what do they hope to gain from it? It's certainly not the tech firms, nor other established big businesses like retail and finance that rely on encryption to operate with reasonable security. It's obviously not the human rights and civil liberties crowd. Is it clueless "think of the children" types, and the manipulative media that serve them?
ALL the governing entities, or any entity with bad intentions, because whatever magical backdoor is made available will surely leak, and in that case everyone is at the mercy not just of their own government (let's say they're responsible folks) but of other governments who do not care what happens to anyone outside their borders...
Why can't people who also understand technology be vulnerable to emotional pleas too? Basically all of my tech friends are passionately against guns despite never having been victims of guns, knowing someone who was a victim of guns, or guns being especially problematic by the numbers in the US. And yet they're drawn to the position because of the emotional overtones that are ubiquitous in coverage of the subject.
Not that the situations are perfect parallels, but I think it's worth reflecting on since this ultimately seems like consternation born from hubris. Everyone is vulnerable to emotional appeals in the same way that everyone is vulnerable to appeals to logic and authority. Highlighting the rhetorical nature of a specific appeal without making a counter appeal doesn't do anything productive to shift the dialogue.
It's the same reason that the 'nothing to fear if you have nothing to hide' crowd haven't lost much ground in the mainstream despite years of the digital privacy movement trying different tactics. Most people in the mainstream, even if they accept the expanded definitions of privacy unique to the digital landscape, don't feel like they've suffered in a significant enough way to be worth resisting the emotional appeal of helping the 1% of victims who the current system fails.
> And yet they're drawn to the position because of the emotional overtones that are ubiquitous in coverage of the subject.
or because they are tools made to kill?
Let's be honest, this is not a good comparison (to stay polite). Not having guns at worst would force a group of people to find a new and less dangerous hobby, while not having encryption pretty much puts people's lives at risk in many countries and makes having a democracy much more difficult.
Surely you see that this is the same kind of emotional appeal the original commenter is decrying. "Tools made to kill" and "less dangerous hobby" aren't dispassionate statements of fact.
I don't think they are really. Guns are tools made to inflict kinetic damage. They can inflict damage on humans, damage on animals, or damage on targets.
That may seem like pedantry, but a huge number of guns aren't designed to be used in anger, fired at a living thing, or purchased with the intent of ever firing at a living thing. I own a shotgun exclusively for shooting at clay targets, for example, and I have no intention of ever pointing it at a living thing. Firearms used for sport are frequently explicitly designed for use in sport rather than for combat, self defense, or hunting.
What if you become a paedophile? Your current intentions (regarding cryptography) will be subverted by your own emotional state. Guns and Crypto are dangerous tools and should not be in the hands of the ordinary citizen, since citizens can turn dangerous. Think of the children!
Even assuming the ridiculous premise, most paedophiles don't abuse children. (And not all child abusers are paedophiles, either, but that's getting into the really dark side of human psychology.) Support structures are important, as with all such things. https://www.bbc.com/news/uk-41213657
Scratches head. I see, good to know that you agree that most (nearly all) gun owners don't shoot people. Actually, I suspect that a smaller percentage of gun owners shoot people than the percentage of paedophiles who abuse kids, but sadly I don't think I can get any sort of citation for that.
What I wrote doesn't imply that. I mean, I also believe that most gun owners don't shoot people, but that's certainly not implied by what I wrote (except in that both are probably-true statements).
Bringing paedophiles into the conversation in the first place was misdirection. And I resent your attempt to use my correction of an implication you introduced in doing so to support your argument.
Becoming suicidal is something that happens to a lot of people. Spontaneously gaining a shitty paraphilia is not. Spontaneously gaining a paraphilia and losing your morals is a completely fictional situation constructed purely to dodge a valid point; that's disingenuous arguing.
I can well believe that people going through a depressive episode, who are contemplating suicide, are more likely to commit suicide if they have easy access to a gun, above and beyond other potential weapons such as knives, blunt objects, water and high places. Guns take minimal planning, and don't involve overriding nearly as many instincts.
If you want to be right, instead of just winning an argument, focus on the weakest parts of your argument and the strongest parts of your opponents' arguments. You dodged from suicides to "shoot[ing] people" via a deft "think of the children", which is not arguing in good faith. You want to think yourself already right; you don't want to become right if you're not already.
For me, protecting people from themselves should never be a policy priority. I have struggled with depression, probably still do, but I genuinely don't think the ideal solution to gun related deaths is to take away the right to own a gun. Even if it's effective. We're hitting a point where we are becoming capable of protecting people despite their own wishes due to technological advances, and we're going to have to make some decisions wrt how far we want to take it.
Personally, I'd rather err on the side of freedom than safety. There are alternatives available to us that don't infringe on freedoms so much, such as universal health care and increased treatment options for mental illness, which I think would be very much preferable. I think this argument extends to encryption such that encryption can be used for nefarious ends, just as guns can be used for nefarious ends. There are definitely parallels. How much freedom do we want to trade, how far do we want to take this thing?
You need to also consider knives, rope, large bodies of water, tall structures, etc. They all can be used for suicide.
If you want perfect safety from suicidal tendencies, consider going under a 24/7 watch and having your hands tied. For some extremely severe cases, this may be warranted.
But there's no reason to subject the rest of the population to the same measures.
I see. So personal explosives would be fine by you then? I mean, explosives are used for construction, mining, rock-quarrying, excavation, materials research - all deeply vital aspects that keep human civilisation operating.
No, not at all, nor is concealed or open carry fine by me, for exactly that reason. If we want to say that the state doesn't get a monopoly on violence, I don't see any logical reason to limit that simply to a handgun. IMO every person, in such a world, should be allowed to arm themselves with nuclear weapons. Absurd, but I see no difference between that and guns other than the scale of destruction (and the scale is still enormous for a gun).
Sure, if a law-abiding citizen was trained and followed the military nuclear safety procedures, could actually afford said nuclear weapon and implementation of all applicable regulations along with appropriate inspections and all applicable oversight and background checks, I honestly wouldn't care if they had a WMD.
At that point, there is little difference between a military officer having access and a citizen having access.
I happen to have access to a biomedical research lab and medication / chemicals that could kill me with a very high success rate if I decided to go that route, so it's not a relevant concern.
What's the difference between a kitchen knife and one that can be used to kill someone? Nothing, you can kill someone with a kitchen knife—but that isn't what it was designed for. There are guns designed to kill unsuspecting people at a distance (sniper rifles), there are guns designed to be carried for self-defense, and there are guns designed for hunting or target sports. Any of them could be used to commit a murder, but that doesn't mean they are all equally suited for it. In the end a gun is just a tool. It's the intent of the person using the tool that counts.
I think you pretty much nailed the bottom of that argument. I'd like to point out one thing, speaking from a country¹ in which firearms are heavily restricted (non-law-enforcement can't carry): there is little stigma, if any, on recreational shooting², and it's certainly a right to own such firearms at home. You just can't take them to work or to go shopping, only to the shooting range or equivalent, with precautions like partial disassembly, unloaded, in a bag... Normal practice.
So when the rest of the world is puzzled by this American debate, it's really not about weapons in and of themselves, nor about the supposed cruelty of everyday people (nobody believes that). It's really about the fact that carrying a gun to work is a very slippery slope, even if the gun is in the car. Same idea with not carrying dangerous explosives if you can help it: the risk is too high compared to most perceived benefits.
I think the self defense argument is very much biased by the fact that once others have guns, you may feel threatened not carrying yourself; conversely if no one carries you'd rather it stayed that way... It's a snake eating its own tail from both sides.
The truth is, it's actually not normal people carrying that kills a lot in the US (although child accidents are statistically too high compared to e.g. Europe or Asia, iirc). It's really the problem of gangs etc. Most lethal shootings are statistically related to someone's lifelong "job", not everyday honest people. But removing guns from wide circulation means that nowhere else in the world do we have e.g. teens shooting others at anywhere near the same magnitude, which is a troubling fact. Indeed, it's the person that holds the gun that counts, and young minds shouldn't have access to guns in that regard. Not enough self-control yet; it's a biological fact.
[1]: France, but it's the same culture in most western EU countries afaik. Not sure about those closest to the Russian federation but I'm inclined to think they generally agree with us on the matter.
[2]: Hunting is certainly mildly popular here in rural areas, and those who voice criticism are 99% about the animal cruelty angle; they couldn't care less if the killing was done with knives or arrows instead. The gun angle is just not a thing in most countries where guns are effectively banned from regular society but obviously totally accessible for sports: it's OK, really.
> I think the self defense argument is very much biased by the fact that once others have guns, you may feel threatened not carrying yourself; conversely if no one carries you'd rather it stayed that way...
Guns are not a prerequisite for feeling threatened. The self-defense argument is based on the fact that two arbitrary people armed with guns are much more likely to be evenly matched than two disarmed individuals. In particular, habitually violent individuals tend to be much more experienced at, and prepared for, unarmed combat than the general public. Skill with firearms also benefits from practice, of course, but almost any armed individual would at least stand a chance of winning against a determined attacker, whereas someone without extensive martial arts experience would be unlikely to successfully defend themselves in hand-to-hand combat. Guns represent an equalizing force.
I'm not entirely sure what you are asking. If you're asking "what's the difference between a shotgun optimized for sport and a shotgun optimized for military / police / self-defense use?" then the answer is configuration mostly. Shotguns are pretty crude instruments so they don't vary much in terms of actual mechanism between use cases. The most obvious difference would be ammunition capacity. Sport shotguns (depending on the sport and exceptions apply) typically carry between 2 and 4 shells because trap / skeet only require you to fire 2 successive shots at any given time. "Combat" shotguns have much higher ammunition capacities so you aren't reloading as frequently.
Additionally sport shotguns typically have long barrels (I believe because pushing the sight farther away from the shooter's eye has been shown to improve accuracy among other things), are heavy because weight is less of a concern, and typically lack accessory mounting points (eg for ammunition holders, flash lights, other shit) because they are unnecessary and throw off the balance of the gun which may reduce accuracy. Competition guns are also frequently configured to fire a different type of ammunition which produces less recoil for the user and puts less strain on the shoulder over long bouts of practice. This ammo may do less damage to what it hits, so it would be less appropriate if you are trying to kill a person or a large game animal.
If you are asking "what's the difference between any gun optimized for sport vs one optimized for killing things?" then what you're effectively asking is "what's the difference between a computer optimized for hitting an overclocking record vs a server optimized for running your mission critical thing in production?" Nuanced rifle differences are out of my wheel house as I don't shoot rifles much, but the biggest difference from a design philosophy perspective is the performance:reliability trade off. If your rifle fails in a competition setting, that sucks but you're not going to die. If your rifle fails in combat while someone is shooting at you, you have real problems. Like servers, rifles designed for military or self defense use are therefore designed to operate correctly under a much wider range of conditions because they can't fail, and they sacrifice accuracy to meet that requirement. The most obvious example would be the AK-47 which is notoriously reliable to the point where it's a meme due to its simple mechanism and loose tolerances, and not nearly as accurate in standard configuration as many other rifles, even other rifles used by armed forces.
The other obvious difference is fire modes in a military setting (this is obviously context dependent). There is no need for burst fire (one trigger pull firing multiple bullets) or full auto fire (holding down the trigger yields continuous fire) in a competition setting because those fire modes are mostly used for scaring your enemy and getting them to stop shooting at you. You're not interested as much in hitting them, just suppressing them. You don't need to suppress a paper target because a paper target isn't shooting back at you.
If I look at your question literally, it's a hard question to answer because anything can be used to kill someone. What's the difference between a tank and a truck that a terrorist uses to run over people?
"Black people are more likely to score lower on IQ tests" and "illegal immigrants are more likely to be involved in drug trafficking" are also factually correct. It is possible for a statement to be factually correct and politically inflammatory at the same time.
The rub is in definition, though. You've assumed far too much with your "factual statements," specifically about what "black people" means, what an "IQ test" even is, or what an "illegal immigrant" or "drug trafficking" is. That's why those are not really factual statements but are quite politically inflammatory. The people who feel they're wrong may think they're facts but can't figure out why they aren't, not realizing that they aren't actually facts at all.
The statements are true under common and reasonable definitions of those terms.
And you can do the same thing for anything. I could easily contend that "less dangerous hobby" is factually incorrect because there are plenty of more dangerous hobbies not being prohibited, like drag racing or skydiving. And we had the whole discussion in the other thread about "tools made to kill."
Sport (marksmanship), crime deterrent (possible without actual violent use), a store of value (physically small high-value durable good with generally stable value over time), craftsmanship (manufacturing), collecting, psychological security blanket for vulnerable individuals, etc.
Shooting at a target isn't killing anything, it's a competition the same as golf or basketball or archery. People who collect firearms commonly keep them in a display case like vintage toys or sports memorabilia; it's like arguing that the sole purpose of a baseball signed by Babe Ruth is to play catch with. In many cases people manufacture firearms for the exclusive purpose of making a political statement about the ease of doing so (e.g. with a 3D printer) and the manufactured product is never actually intended to be used.
If you want some economic evidence of purpose other than killing, notice that the vast majority of firearms are never actually used to kill anyone, nor do their owners desire to kill anyone with them. Then explain how their owners nonetheless derived enough value from them to justify paying hundreds to thousands of dollars for them.
> People assault for sport too.
Assault is already independently illegal.
A law against killing people with guns is redundant (killing people is already illegal), but a law against not killing people with guns is incoherent, so what evil is left to prohibit that isn't already illegal?
> And people would make all kinds of illegal things for craftsmanship if they could.
But they can, that's sort of the point. Since individuals can manufacture them on their own regardless, isn't it better that they be available to the people who follow the law and not just the people who don't?
> Store of value seems particularly like a stretch.
There seem to be a fair number of second hand firearms dealers who make their living from it.
It was a light-bulb moment when I learned that in medieval warfare, battles were relatively rare, and the primary mechanism of military force was the siege: camp outside and block trade/supplies, until the enemy runs out of food and gives up. Even if one has overwhelming force, fighting is expensive and risky, whether for an army or an individual [0]. This pattern replicates throughout nature: many animals develop signals to proxy their fighting strength without having to fight, due to the risk it would incur (such as growling as a signal of chest cavity size).
It's perfectly cogent to own a gun, not with any intent to kill, but to establish a power dynamic, such that one could respond with deadly force if necessary [1]. This is how America projects its military power across the world, through 400+ bases and several aircraft super-carriers, with the majority of that force going unused. It's still a projection of power, and still subject to moral scrutiny; but having a military base parked outside Qatar, just in case, is not the same thing as "that military base is a tool for invading Qatar".
I get your core point; weapons being deadly is the whole point, and even weapons acquired purely for deterrence can lead to a positive feedback loop of escalation, resulting in violence that would not have occurred otherwise. And humans are not purely rational actors; there's a simple numbers game, where the more guns are in a populace, the deadlier a small number of maniacs or extremists are going to be. It's not a problem we should ignore, and it's frustrating that NRA hardliners seem to be fine with doing so.
I don't own a gun, and I'm in favor of something resembling "common sense gun control", as well as other harm reduction interventions (particularly universal mental health care); at the same time, I consider effective self-defense to be an inalienable human right (I don't declaw cats, either). But to say that guns exist only to kill is a little overly simplistic: to take another example, North Korea acquired nukes not to use them, but to dissuade the U.S. from ever thinking about instigating regime change. They know using them can only result in their immediate obliteration; yet owning them tilts the game-theoretic dynamic in their favor.
[0] Aircraft and drones somewhat change the dynamic on this, but we can consider those out of scope in a 2A debate.
What you're getting at, I think, is that the statement "guns are a tool to kill" is a little too simplistic to be helpful.
You might even continue such a statement with something like "guns are a tool to kill, but it's not clear that their existence has led to more killing than if they didn't exist."
Comparing one country to a completely different country doesn't really tell you anything. There are a lot more firearms murders in the US than most European countries, but there are also a lot more non-firearms murders in the US than most European countries, so all you really know is that the US has a lot more murders. (Which are incidentally concentrated in some specific cities.) The proportion of murders that use firearms also doesn't tell you much, because first you'd have to know what proportion of murderers would have just used a different weapon if they didn't have a gun.
The interesting data is what happens following the passage of gun control legislation. The proponents are always happy to point out that the number of murders involving the specific weapons being prohibited goes down, but no kidding. The real question is the effect on the overall number of murders (i.e. the ones that didn't just use a different weapon), and in particular the effect over and above the existing trendline. (You don't get to just take credit when the existing long-term trend of declining violent crime rates continues, you have to move the needle more than it was already expected to move.)
But the effect turns out to be little if anything. It turns out murders tend to be caused by things like drugs, gangs, domestic disputes or revenge moreso than access to firearms. People will use a gun if they have it, but there are a hundred different ways to kill a man and taking away one doesn't change much. Also, a disproportionate number of murders are committed by gangs with no qualms about using prohibited weapons anyway.
It actually has a more significant effect on suicides, because some of the most popular alternative suicide methods aren't as effective (as opposed to the most popular alternative homicide methods which mostly are). But we already separate known suicidal people from guns (and shoelaces etc.), and it seems like the better answer there should have more to do with addressing the fact that so many people are suicidal so that the question of which method they might use becomes irrelevant.
To be fair, take a look outside the media's favorite cities to pick on for gun violence and start looking at the per capita instances of gun violence, particularly against women.
America definitely has a serious problem with violence in general and we can't just blame Chicago or whatever other flavor of the day the NRA has picked.
Cities like Chicago and Detroit really do represent a disproportionate number of homicides in the US. Baltimore is at more than 10 times the (already high) US national average, meanwhile states like Iowa and New Hampshire have a lower homicide rate than Canada.
More than 77% of homicide victims in the US are male.
The things you said all revolve around the gun's primary purpose and only reason for existing, which is to kill.
Target practice is just practicing getting better at killing.
Crime deterrent is threatening to kill.
It's valuable because it's good at killing.
It's a well crafted killing machine.
It makes one feel less vulnerable because you hold the ability to instantly kill someone.
Why deny the gun's purpose is to kill? It seems to imply you think it's bad to kill people, as if we could demonstrate that yes guns are only good at killing people, it might risk your guns being taken away? Seems the strongest rhetorical position is one that argues in favor of the gun's ability to kill and why people should be allowed to have that ability.
I'd argue that target practice isn't just getting better at killing (but perhaps it is for many people). And, as a result, "well-crafted killing machine" isn't necessarily all there is to it -- though only specific kinds of guns are good for shooting targets and not killing people. All your other points are valid.
Yes, there are target only guns (Olympics comes to mind), but I bet you still would follow all tenets of gun safety while handling one of those guns...
I enjoy target and clay shooting but I believe they are simply metaphors for the gun's original purpose which is to shoot living things.
We're kinda off track the original topic here though :p
I mean, I follow all tenets of gun safety even when I'm handling a nail gun, or a pressure washer, or anything else with a trigger that may or may not seriously injure someone if I accidentally pull the trigger while it's pointed in one's direction (including in my own direction). Taking a nail to the face ain't pleasant.
That is to say, whether or not you do something safely has no bearing on whether or not something is designed for killing.
> Yes, there are target only guns (Olympics comes to mind), but I bet you still would follow all tenets of gun safety while handling one of those guns...
Explosives are used by the military to kill the enemy, but you follow all the tenets of explosives safety when you're blasting on a construction site too.
> Target practice is just practicing getting better at killing.
Seems pretty farfetched given that nearly all of the people who shoot targets neither intend to nor actually do ever kill anyone.
Would you argue that the purpose of a baseball is killing people because it's practicing getting better at throwing a rock? To say nothing of javelin.
> Crime deterrent is threatening to kill.
Would you say that the purpose of the criminal justice system is to put people in jail and it fails if it manages to deter crime and then doesn't actually have to put people in jail?
> It's valuable because it's good at killing.
Why can't it be valuable because it's good for target shooting or for deterring crime?
> It's a well crafted killing machine.
That's just assuming the conclusion. If it's a killing machine then it's a well crafted killing machine, but if its purpose is to look pretty (or look scary) or satisfy local cultural norms or make a political statement, then it's a well crafted political statement.
> It makes one feel less vulnerable because you hold the ability to instantly kill someone.
Which is a similar situation to serving as a deterrent -- it succeeds even when you don't use it to kill anyone. Especially then.
> Why deny the gun's purpose is to kill? It seems to imply you think it's bad to kill people, as if we could demonstrate that yes guns are only good at killing people, it might risk your guns being taken away? Seems the strongest rhetorical position is one that argues in favor of the gun's ability to kill and why people should be allowed to have that ability.
Killing might be a purpose of a gun, but it's being alleged that it's the only purpose. Which still doesn't make sense given that it's mostly not what they're actually used for in practice.
Killing people isn't even a purpose in general, or if it is then it's a bad one. A purpose is a motive, not a means. Nobody sane has a motive of killing for no reason. Plenty of sane people have a motive of winning a sporting event or not getting robbed.
This is why "guns are for killing" is political rhetoric. Killing is bad and everybody knows it, so if guns are only for killing then guns are bad. But if guns are for deterring crime or similar, deterring crime is good and not deterring crime is bad. It's a much harder motive to argue against because it's a legitimate motive, whereas killing for no reason is just a strawman.
These baseball arguments always fall flat in the face of actual danger to population imo. Yes, a gun is designed to kill, and regardless of caliber, until you get down to a pellet gun, it is especially good at it.
The baseball comparison doesn't stand: in the hands of an adult, a great deal of work is involved in killing someone with a baseball. Threatening to kill someone with a baseball doesn't immediately give you power of life and death over them - they can fight back or run. And in the hands of a child, the baseball is harmless, no matter the harm the child wants to mete out with it.
A gun is none of those things. A toddler can kill in an instant with a gun, and this has happened, and will continue to happen.
Guns are for killing. Of all the things just about any American can handle, they are the best at killing. If we stop letting people walk around with guns, there's nothing they could carry instead with even close to that level of accessible (a toddler could use it) killing power.
Like I said before, maybe try acknowledging that and arguing from their killing-power perspective? I think there are strong 2A arguments regarding the ability of minorities to defend themselves that center around the gun's design making it extremely easy to kill people.
Guns aren't "killing for self defense". They're killing machines which lie around for a long time, and may have many effects one of which is self defense.
> ... and makes having a democracy much more difficult.
Are you unaware of the fact that this is the exact argument made by those who argue for the 2nd amendment? While you could argue that they are fear-mongers, with the tree-of-liberty talk, you can't then go on to talk about a threat to democracy by way of the government's constant violation of citizens' other rights (the 4th in this case). Emotional appeals are much easier for the confiscation proponents, because nobody expects internal consistency from someone waving a bloody shirt.
Just because two people use the same argument does not mean both arguments have the same validity.
Almost all democracies give the government an exclusive right to violence. The government can order the army and police to shoot people, while every citizen is forbidden to make a similar decision. The exceptions that exist are narrowly defined, and it is up to the legal system to decide, case by case, whether a citizen's decision to kill can be forgiven based on the circumstances.
Almost no democracy gives the government an exclusive right to private communication and private secrets. Countries whose governments do claim exclusivity in this area are called totalitarian, and many see that as a contradiction in terms with the definition of democracy.
Yes, I understand that logical correctness is different from universal truth. But using the exact same formula to argue two mutually exclusive points in the same breath is a very good way of demonstrating why emotionally driven arguments yield poor results.
> ...exclusive right to violence.
I know that duty to retreat is much more common outside of the US, but I don't think any democracy demands that you just die in the face of a determined attacker - which would be required in your "exclusive right" characterization. While the classic way of describing it is a "monopoly on violence", the scenario you describe would be better characterized as an exclusive right to classify murder and manslaughter.
> ...exclusive right to private communication and private secrets.
Because that would be impossible, as they can't exclusively have a right to information that you generated - at worst it would be a shared right. The US does claim shared rights to everything that is possible though: the moment you share that information with anyone they claim that right - 3rd party doctrine.
Your two points would have better symmetry if you added that bit about the definition of democracy onto the end of both. That would make it easier to spot the fact that you've just made the exact same "threat to democracy" argument I just replied to.
I would argue that, for the US, the textually unqualified 2nd-amendment right to bear arms is clearly a more exclusive right than the citizen's ‘right to privacy’ in person, possession, or communication that is granted by the 4th amendment.
The 4th amendment was written with the express intent that it only protects you when the government (embodied as the judicial branch) feels it should. Also, given the past 200+ years of jurisprudence on the 2nd and 4th amendments, I'd say that the 2nd amendment is absolutely a stronger guarantee than the 4th. There may be some right to privacy read into the text of the Bill of Rights, but that right is not universal and not inalienable by the government.
> The exceptions that exist are narrowly defined, and it is up to the legal system to decide, case by case, whether a citizen's decision to kill can be forgiven based on the circumstances.
That's exactly right. It's a rare, exceptional case that someone really has to defend their life with lethal force. If you must but legally can't, how free are you?
It's an even rarer case that you would absolutely need unbreakable end-to-end encryption. So yeah, it's the same argument, and you're right that one is less valid than the other.
The number of injuries per year due to firearms (including homicides and suicides) in America is approximately 106,000. The percentage of deaths from firearms, including suicides, is slightly less than 2% of total deaths. Those are barely significant figures. Roughly 0.03% of Americans are affected by those 106,000 firearm incidents. There were 39,773 firearm deaths total in 2017, while there were 47,600 opioid-caused deaths in the same year. And that is with an estimated 393 million small arms in the country.
I’m not claiming that these deaths do not matter, I’m just saying that the impact of firearms is much more prominently discussed than other causes of death. I find it very much in line with the emotional pleas against encryption.
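For anyone who wants to sanity-check those percentages, here is a minimal back-of-the-envelope sketch; the US population and total-deaths figures below are my own rough assumptions for illustration, not numbers taken from the comment above:

```python
# Back-of-the-envelope check of the figures quoted above.
# Population and total-deaths values are assumptions for illustration only.
firearm_incidents = 106_000      # injuries + deaths per year (quoted above)
firearm_deaths_2017 = 39_773     # quoted above
opioid_deaths_2017 = 47_600      # quoted above

us_population = 328_000_000      # assumption: rough 2017 US population
total_deaths_2017 = 2_800_000    # assumption: rough total US deaths in 2017

print(f"Share of population in a firearm incident: {firearm_incidents / us_population:.3%}")
print(f"Firearm deaths as a share of all deaths:   {firearm_deaths_2017 / total_deaths_2017:.2%}")
print(f"Opioid deaths per firearm death:           {opioid_deaths_2017 / firearm_deaths_2017:.2f}")
```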
Are you willing to take a bet on what fraction of guns sold to civilians have been used in a homicide?
If you’re going to insist on a teleological argument, you might as well use the same standard we use for other tools: how people actually end up using them. No rational person describes knives as “tools to stab people” even though this is part of the historical basis for their development. Instead, the vast majority of knives are used to cut food or spread butter or open boxes. Indeed, these are all more plausible descriptions of what a knife is “for”, just like target shooting or hunting food is a much more plausible description of what guns are for.
The earliest known use of cryptography is found in non-standard hieroglyphs carved into the wall of a tomb from the Old Kingdom of Egypt circa 1900 BC. These are not thought to be serious attempts at secret communications, however, but rather to have been attempts at mystery, intrigue, or even amusement for literate onlookers.
Some clay tablets from Mesopotamia somewhat later are clearly meant to protect information—one dated near 1500 BC was found to encrypt a craftsman's recipe for pottery glaze, presumably commercially valuable.
I don't trust your source. 1900 BCE is Middle Kingdom, not Old Kingdom.
Also, a lot of those older cryptic hieroglyphs turned out to be in an early Northwestern Semitic language. The idea was apparently that you're going to have spells effective against snakes coming in on ships from Byblos, the spells should invoke the deities they know in the language they know.
> "Not having guns at worst would force a group of people to find a new and less dangerous hobby"
No, not having guns at worst would deprive minorities of the ability to protect themselves. It is not only a hobby; it is also an effective means of self-defense.
Not having a gun puts your life at risk, if you encounter a deadly threat beyond your ability to defend against.
For a 110 pound woman this might be a large man who wants to rape her, or for a 200 pound man capable of reasonable self-defense this might mean somebody attempting to rob him at knife point.
I wonder what things they have in common, if guns don’t seem to be a factor.
Maybe we could figure out whatever that is and work on that, instead of focusing on gun ownership, which demonstrably does not lead to more violence, and in most cases, leads to less violence.
Those countries tend to already have better economic safety nets / welfare systems and mental health systems, thus mitigating the reasons why someone might be driven to engage in violent crime (let alone use a gun in said engagement) in the first place.
That is: you're arguing that wet streets cause rain.
I think most people agree that a country needs a well regulated, armed militia to protect itself from nasty invading forces. But arming amateurs and everybody around is about as safe as everybody rolling out their own encryption algorithm.
You are right to say that geeks are emotional like everybody else.
The answers you have received to your comment are proof of this, and I, for one, felt a strong emotional urge to respond :)
Yet I don't personally know anybody who has been killed by a crane or a forklift, and I still want those to only be in the hands of trained professionals.
Some tools multiply the potential for a single human to do damage so much that I don't want any individual to be able to casually wield them.
You cannot have an encryption accident in the same way you can have a car accident. It's OK to use Signal drunk; it's no more dangerous than drunk IRC. It's OK to use HTTPS if you are uneducated or enraged; it's no more dangerous than HTTP. You can put a fully encrypted VeraCrypt file in the hands of children. Nobody is going to kill you with a PGP key. Your encrypted hard drive poses no more storage risk than a regular one, even on a plane or in a facility with a fire hazard.
All in all, encryption works by making it hard to do damage to you. It's passive. Weapons work by promising that attempting to do damage to you will result in damage to the attacker as well. It's an important nuance.
That being said, I go to the shooting range myself. I like it. It's fun, and it's knowledge I don't want an elite to be the sole owner of. But I want my ability to do so to be heavily supervised.
It's true that the US has more gun violence than other first world countries, but we also have more violence and crime in general.
Within the US, there is no statistically significant correlation (r = -0.02) on the state level between firearm ownership and firearm homicide. (There is a statistically significant correlation between firearm ownership and firearm deaths, but this is due to suicides.)
Make of this what you will, but the claim that restricting gun ownership would have a strong impact on homicides is not substantiated by the facts.
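To be concrete about what a state-level r value like that refers to mechanically, here is a minimal Pearson-correlation sketch; the ownership and homicide numbers are made-up placeholders, not the actual data behind the quoted figure:

```python
# Minimal sketch of a state-level Pearson correlation between household gun
# ownership and firearm homicide rate. The numbers below are placeholders
# for illustration only, not real survey or CDC data.
from statistics import mean, stdev

ownership_rate = [0.61, 0.45, 0.20, 0.52, 0.34]   # share of households owning a gun
homicide_rate  = [4.1, 5.3, 3.8, 2.2, 6.0]        # firearm homicides per 100k

def pearson_r(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

print(f"r = {pearson_r(ownership_rate, homicide_rate):+.2f}")
```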
> It's true that the US has more gun violence than other first world countries, but we also have more violence and crime in general.
These statements are misleading as they stand because the variation in the US is so high. Basically, for all types of violent crimes, the US is divided into two very different regions: (1) particular large urban and dense suburban areas, which have rates of violent crime higher than any other developed country; and (2) the rest of the country, which has rates of violent crime lower than almost any other developed country. Even looking at the stats by state doesn't fully capture this dichotomy.
So what you’re saying is, the real problem is that Americans are just a violent bloodthirsty lot.
Maybe the rest of the world should stop telling Americans to control gun ownership (i.e. "ban guns", despite nobody ever suggesting a total ban) and instead we should be promoting control of Americans (i.e. "ban Americans", although nobody would suggest a total ban. I'm sure zoos would want some).
> the real problem is that Americans are just a violent bloodthirsty lot.
More like "Americans are more subject (relative to, say, Europeans) to the economic and mental health hardships that typically beget violent crime".
That is, I'm far less motivated to rob a store at gunpoint if I can actually feed myself and my family through legitimate means. I'm far less motivated to kill someone in anger if I have access to mental health resources that can help me address that anger less destructively. I'm far less motivated to join a gang if my life - and that of my family - doesn't depend on me doing so (or more precisely, if that gang is no longer able to convince me of that).
Guns are tools. Taking away the tool does not take away the desire to commit crime. I'd expect Hacker News, of all places, to understand the importance of root cause analysis.
A car is a tool. A truck is a tool. A 747 is a tool. All have the capacity to very quickly cause a death and injury to a unit of people colloquially known as "a fucking lot" when operated incorrectly.
Guess how society generally tries to overcome the problem of death and injury caused by "incorrectly" operating those tools?
Education, combined with testing (i.e. government-enforced licensing for operation of the tool).
A gun is a tool. A goodly number of the models favoured by Americans have the capacity to very quickly cause a death and injury to a unit of people colloquially known as "a fucking lot" when operated correctly.
Guess how society (outside of America) tries to overcome the problem of death and injury caused by "correctly" operating those tools?
Education, combined with testing, and in many cases, restrictions on which kind of "tools" are deemed an acceptable "tool" for one of the tasks it can achieve, outside of death and injury to other humans.
No one needs an AR-15/etc for hunting deer or bears or whatever other lesser animal needs to die, or sport shooting, or anything really, other than laying down covering fire against the "Charlies in the trees". It'd be like if you decided to buy a 400ton mining truck, and drive it on the road.
At 9m+ wide, with a turning circle of 42m, it's safe to assume the level of damage to the roads and infrastructure wherever you go, could be measured with a unit colloquially known as "a fucking lot".
Gun control laws do not equal a ban on guns. Most US states require no licensing to own and no permit to purchase a firearm, and a number require no special license to carry it on your person in public, either concealed or otherwise. Those that do impose an age limit generally set it at 18 - so you can buy a tool to murder your neighbours but you can't buy a bud light.
This is an interesting philosophical argument, but it doesn't actually stack up to the facts. While it may feel righteous to regulate them more, it is all but guaranteed to have next to no effect on gun violence.
The AR-15 undoubtedly has a military background, although it isn't the exact same weapon as the M16. Does this mean firearms with a less militaristic style, like the Ruger Mini-14[0], should be allowed?
This weapon wasn't banned during the assault weapons ban, for example. Yet from a purely mechanical point of view, bullets come out of the operational end when you pull the trigger. It is no less unpleasant to get shot by a sport or hunting rifle, even if the designer did not intend for it to ever be used against humans.
> While it may feel righteous to regulate them more, it is all but guaranteed to have next to no effect on gun violence.
.... You realise that there are dozens of examples of other countries that have enacted gun control, and then seen gun violence go down, right?
> The AR-15 undoubtedly has a military background, although it isn't the exact same weapon as the M16. Does this mean firearms with a less militaristic style, like the Ruger Mini-14[0], should be allowed?
Ok, so a few things here.
(a) I just referenced AR-15s because most people have heard of them. A Mini-14 is no more appropriate for sport shooting than the aforementioned AR-15.
(b) The rifle you mentioned is so named because it resembles a previous military rifle (the M14), and is itself used by a number of military and law enforcement agencies around the world, so it's still "military style" anyway
(c) My point was that semi-automatic rifles aren't needed for hunting, or target shooting, regardless of how "militaristic" they look. 12 people being shot in a school aren't going to feel any less shot because the gun doesn't look like the military use it.
> This weapon wasn't banned during the assault weapons ban,
Given that the law in question required a weapon to have 2 features from a list that includes grenade launchers to be considered an "assault weapon", that's not really surprising. It's a semi-automatic rifle. It also wouldn't be affected by a ban on trans fats either.
> It is no less unpleasant to get shot by a sport or hunting rifle
If someone is shooting into a crowd with a bolt action rifle, you are a lot less likely to be shot in the first place. That's literally the whole fucking point: it's a much slower rate of fire.
> .... You realise that there are dozens of examples of other countries that have enacted gun control, and then seen gun violence go down, right?
Abroad, I'm only familiar with Australia, which didn't see any significant effects either way. There was a secular decline[0] in firearm violence which continued through the ban, but it doesn't appear as if the ban had anything to do with it - the USA saw an even larger decline, and it doesn't seem to have dropped any faster after the ban[1].
It seems like the easiest way is to just look at the USA, where there is no correlation at all. So on a state level, these regulations are utterly ineffectual. It does not seem to follow that more of the same would have any effect.
It is sort of like metal detectors in airports: On some level, it makes intuitive sense that they should help, but in reality they are just useless security theater - it's possible to build a grenade from stuff you can buy after airport security, and people have accidentally brought guns on planes without knowing it.
> If someone is shooting into a crowd with a bolt action rifle, you are a lot less likely to be shot in the first place. That's literally the whole fucking point: it's a much slower rate of fire.
On the other hand, the opportunities for serious terrorism are arguably worse[2].
> and in many cases, restrictions on which kind of "tools" are deemed an acceptable "tool" for one of the tasks it can achieve, outside of death and injury to other humans.
I'm not okay with this. Who are you to declare from on high which tools are acceptable for a given task?
> No one needs an AR-15/etc for hunting deer or bears or whatever other lesser animal needs to die
No one needs an AR-15 to kill people, either. Indeed, the vast majority of gun violence (among civilians at least) is with guns that are very much not "an AR-15/etc". If you actually care about reducing death by guns, you'd be going after the Glocks, not the ArmaLites.
Bringing up the AR-15 betrays an opinion driven mostly by emotion and irrationality. It's a gun that "looks scary" despite being no more lethal than any other semiautomatic .223 rifle (among which there are a lot, and despite your implied belief to the contrary, the .223 is a very common caliber for hunting/varmint rifles, and semiautomatics - while not as common as bolt-actions - are still pretty mainstream for hunting).
Again: who are you to declare from on high which tools are acceptable for a given task?
> It'd be like if you decided to buy a 400ton mining truck, and drive it on the road.
No, it'd be like if you decided to buy a Tundra instead of a Tacoma (the Tacoma here being, say, a Ruger Mini-14). A "400ton mining truck" in this context would be something more like the GAU8/A.
> Gun control laws do not equal a ban on guns.
Not yet, but anyone with a basic understanding of what the Overton Window is can see the writing on the wall.
> Those that do impose an age limit
I can't think of a single place where there's not an age limit to buy either the firearm itself or the ammunition thereof. I don't require a permit to purchase here in Nevada, for example, but I still had to present ID and go through a background check. And Nevada's among the most gun-friendly states in the US, even after the Las Vegas shooting.
> so you can buy a tool to murder your neighbours but you can't buy a bud light.
You can do a lot of things before you can buy a Bud Light that you can't do before you're an adult. Quite a few over-the-counter drugs, for example, fall into that category. I fail to see how that's relevant (unless you're arguing to lower the drinking age, in which case I'd have no real objection).
Edit: I realised after writing a heap of this that my clarification about why I mentioned the AR15 at all was a reply to another user, not to you, so I'll edit accordingly.
> I'm okay with this, to an extent.
Why would you not be fully on board with educating people when they want to use a deadly "tool"?
Edit: removed snippiness.
> Who are you to declare from on high which tools are acceptable for a given task?
Well, it isn't me though is it? I mean I'm writing the text but what I'm writing is what other governments have implemented successfully, based on simple logic.
Perhaps you could try less attacking me, and more explaining why a semi-automatic is required to hunt deer or shoot targets?
> If you actually care about reducing death by guns, you'd be going after the Glocks, not the ArmaLites.
... Still a semi-automatic dude.
> Bringing up the AR-15 betrays an opinion driven mostly by emotion and irrationality.
As I mentioned in a reply to another user: I mentioned it because most people know it. That's all. In reality the point is about all semi-autos, either long barrel or pistols.
> the .223 is a very common caliber for hunting/varmint rifles
I don't know whether you're unaware that .223 is used in bolt-action rifles too, you just want to ignore that aspect because it makes your argument stronger, or you actually just meant .223 semi-automatics are common for that "task" - it's irrelevant. My point is that it's unnecessary - you don't need a semi-automatic to hunt, unless you're fucking shit at it.
> who are you to declare from on high which tools are acceptable
Again: you need to explain why a semi-automatic is required for hunting or some other non-people-killing activity. Unless Americans uniquely have decided that "hunting" now means mowing into a crowd of deer/what have you in some kind of perverse attempt to justify the use of a semi-auto for "hunting".
> No, it'd be like if you decided to buy a Tundra instead of a Tacoma (the Tacoma here being, say, a Ruger Mini-14).
Well given that both are semi-autos, and you just compared two 'pickup trucks' I guess at least the analogy is somewhat correct but you've missed the point, and keep somehow obsessing about a different semi-auto rifle just because it's less popular with Americans. That doesn't make it not a semi-auto.
> Not yet
Seriously, slippery slope argument?
> I can't think of a single place where there's not an age limit to buy either the firearm itself or the ammunition thereof.
Well federal law stipulates 18, but I have zero clue how you enforce that on private sales without any kind of licensing or permits required.
> I don't require a permit to purchase here in Nevada, for example, but I still had to present ID and go through a background check. And Nevada's among the most gun-friendly states in the US, even after the Las Vegas shooting.
Well you (Nevada) need a permit for concealed carry and background checks are required (by the state, it seems some counties are ignoring that law), so no you aren't among the most 'gun-friendly'. Plenty of states require no background check, no permits for concealed carry, etc.
> Why would you not be fully on board with educating people when they want to use a deadly "tool"?
To clarify: I'm fully on-board with education. I'm wary about licensing/permits primarily because they have potential for discriminatory abuse against "undesirables" (e.g. ethnic/religious/sexual minorities, political dissidents, etc.), especially in places with "may issue" rules instead of "shall issue" (like seriously, how exactly is one supposed to judge "good moral character" without making personal biases inevitable?).
> Perhaps you could try less attacking me
That's fair. Sorry for my own snippiness.
> and more explaining why a semi-automatic is required to hunt deer or shoot targets?
To be clear, very few people are hunting deer with .223 (that I know of); that's more the purview of, say, the .30-06 or .308 (which are better-rated for medium/large game). .223 (and .22LR, on that note - another possible chambering for AR-15s, by the way) are more common for the smaller end of game (think rabbits) or livestock-attacking pets (think coyotes).
As for why a semi-automatic would be desirable, that should be obvious if you're dealing with multiple rabbits or multiple coyotes or multiple whatever else you're hunting (hell, in the case of defending yourself or some other human being against coyotes, having a semi-automatic could be a literal lifesaver).
And as for targets, well, you generally want to practice shooting the guns you'd actually use in the field. That's how you learn to use them safely, responsibly, and effectively. Even with, say, competitive target shooting, competitions specifically for semi-automatic firearms are not unusual.
In general, a semi-automatic is arguably easier to use than, say, a bolt-action or break-action or lever-action; it's therefore not unreasonable for people to prefer them.
Also, ARs specifically happen to be both highly modular/customizable (making them flexible enough to fill a lot of different niches - including hunting) and relatively affordable.
> ... Still a semi-automatic dude.
Well yeah, the vast majority of handguns owned by Americans are. You're still not addressing my point, though: why are we fixated on semi-automatic rifles when semi-automatic pistols seem to be the preference for those actually using guns to commit violent crimes?
(That's partially meant to be a rhetorical question - I'd argue that it's specifically because an AR-15 "looks scary" and because mass shootings with rifles, despite being a negligible proportion of gun deaths, tend to get higher-profile news coverage - but if you have another explanation I'd be interested in discussing that)
> Seriously, slippery slope argument?
Just because a slippery slope exists does not make it automatically fallacious, especially when you have politicians like Beto O'Rourke or Donald "take the guns first, due process later" Trump betraying the end-goal of repeated "compromise".
Again: the end goal should be patently obvious to anyone with a basic understanding of how the Overton Window works.
> but I have zero clue how you enforce that on private sales without any kind of licensing or permits required.
Maybe the same way you'd enforce similar laws against adults giving cigs or booze or pot to minors?
> so no you aren't among the most 'gun-friendly'
I meant culturally, not necessarily legally. Even legally, though, there are relatively few restrictions on the types of guns one may own (no "assault weapon" ban), lots of reciprocity with other states re: concealed carry permits, state-wide "shall issue" policy for concealed carry permits, state preemption of open carry laws (i.e. counties/cities can't impose further restrictions, with the sole exception of designating "safe discharge" areas), and state preemption of firearm registration laws (which eliminated Clark County's firearm registration program), among many other factors.
Really the only significantly gun-unfriendly policy is the recent "red flag" law included in AB 291 (which also included a bump stock ban and safe storage requirements, both of which IMO are reasonable, even if I personally disagree with the former). I'm concerned about the risk of abuse of such a system, and hope that Nevada's implementation can resist the seemingly-inevitable risk of treating skin color or sexual orientation or political activism as one of those "red flags".
> Plenty of states require no background check
Literally all states require background checks for sales through licensed dealers (that's a federal law). It's the so-called "gun show loophole" that states currently have discretion on. You're right that Nevada closed that loophole, but IMO that ain't really much of a dent in its gun-friendliness given the above IMO-significantly-more-impactful factors.
> they have potential for discriminatory abuse against "undesirables"
Perhaps worry about addressing the issue of discrimination then? Do ethnic or sexual minorities currently have difficulties obtaining a drivers licence?
> that should be obvious if you're dealing with multiple rabbits or multiple coyotes or multiple whatever else you're hunting
I've been rabbit (or maybe they were hares? Never seen rabbits that big before) hunting with a group of friends (only one had serious experience and owned the rifles used), and literally the only way I can see that a semi-auto action would have helped was because no one (except the owner) had any real experience shooting. Unless the animals are in a fucking pen, there's no way they're not moving in every fucking direction they can the moment they hear the first shot, so unless you're against a militant bugs bunny in a fox-hole, or you're a professional hunter (i.e. you're being paid to hunt the animals en-masse), I don't buy this angle, sorry. Can you use a semi-auto to hunt rabbits? Of course. You could use a fucking shovel or a kitchen knife or a god damn bazooka to hunt rabbits if you're determined enough. The point is that the risk to other humans increases greatly when the general population has relatively easy access to semi-automatic weapons, of any kind.
> why are we fixated on semi-automatic rifles when semi-automatic pistols seem to be the preference for those actually using guns to commit violent crimes?
"We" aren't. I'm sorry if my posts gave that impression. Semi-auto pistols (so, essentially, pistols, unless you're gonna carry around a Derringer?) are indeed likely more of a problem in overall gun violence, and their legitimate usefulness for anything besides killing/injuring another person are even more limited than semi-auto rifles.
Despite what people may think, semi-automatic rifles and pistols are not banned in Australia. They're heavily regulated.
Want to use a semi-auto rifle? You need to prove that you're using it for some serious animal control. Some actual farmers may still be using them but my understanding is that it's mostly 'professional' shooters now, e.g. controlling/culling kangaroos, camels, wild pigs, etc from helicopters.
Want to use a semi-auto pistol? Become a cop, become an armed security guard, or join a target shooting club and leave the gun at the club. That's pretty much the only way to get a Cat.H license now, to my knowledge. You can of course go to a shooting range and shoot some of the above stuff supervised without a licence or gun of your own, but I somehow doubt that even classifies as "the real thing" for Americans does it?
> Perhaps worry about addressing the issue of discrimination then? Do ethnic or sexual minorities currently have difficulties obtaining a drivers licence?
We've been working on that for nearly 250 years now. That doesn't happen overnight.
Re: discrimination with drivers' licenses, said licenses are "shall issue", so there's a lot less room for discrimination than, say, a concealed carry permit in a "may issue" county in California; the criteria are based solely on (ostensibly) objective assessments of driving ability rather than nebulous factors like "being of good moral character".
> there's no way they're not moving in every fucking direction
Yet another reason why a semi-automatic rifle makes hunting them easier.
> You could use a fucking shovel or a kitchen knife
Unless you're Usain Bolt that probably ain't gonna work too well.
> or a god damn bazooka
That... defeats the whole point of hunting rabbits in the first place, lol (unless you're doing it solely as pest control, e.g. because they're eating your crops, but 1) that seems excessively cruel and 2) that seems like it'd leave your crops worse off).
> so, essentially, pistols, unless you're gonna carry around a Derringer?
Or a single-action revolver.
> and their legitimate usefulness for anything besides killing/injuring another person is even more limited than that of semi-auto rifles.
High-caliber pistols are useful for defense against predators like bears (non-lethal deterrents - e.g. bear spray - are obviously preferable, but plenty of hunters carry a handgun with them for this purpose nonetheless). There are also hunters who hunt with pistols for the challenge of it (similarly to why hunters hunt with bows beyond reasons of traditionalism or legality).
----
I think, however, you've pre-assumed that killing or injuring someone (or being able to threaten to do so) is not also a legitimate purpose of these tools, even for civilians. I'd argue on the contrary; self defense is a very valid reason to use these tools for what you claim is their primary purpose. They are obviously the absolute last resort, but they are a resort that sometimes must be taken to defend oneself or others from the imminent and tangible threat of violence, and I for one would much rather have that tool available in my toolbox, so to speak.
There are numerous reasons why reliance on law enforcement to do this is inadequate, chief among them being
1) Law enforcement officers in the US have a track record of racial discrimination (and while I'm white, I have siblings and nieces/nephews and other family members - not to mention friends and colleagues - who are not, and for whom reliance on racist police to keep them safe is a crapshoot at best). Like I mentioned above, the US has been chipping away at the problem of systematic racism for its entire existence, and it ain't a problem we'll solve overnight; in the meantime, minorities still need the right to defend themselves against violence.
2) Unless there's a law enforcement officer on every street corner (and even then), it's highly improbable that they are in any position to actually stop a violent crime from happening; at best, they're frequently put in the position of after-the-fact investigation and enforcement. This is a factor even in urban areas (a police officer being a minute away is cold comfort when you already have a gun or knife in your face), let alone in rural areas where police response time might be on the scale of hours.
> Despite what people may think, semi-automatic rifles and pistols are not banned in Australia. They're heavily regulated.
Ain't that the same country where bikies are somehow getting their hands on rocket launchers?
> Ain't that the same country where bikies are somehow getting their hands on rocket launchers?
I assume you're referring to the 10 RPGs that were marked for destruction, but were instead sold to criminals by the Army officer responsible for them, 17 years ago?
I sympathise with the issue of racial discrimination, but as I said before, if the issue is racial discrimination, in particular when dealing with a member of law enforcement, adding a gun to the mix doesn't really sound like a smart solution. The smart solution would be to solve discrimination. Perhaps start with a simple policy of not hiring racists as cops, and firing those who are already cops? Radical idea I know.
If your argument for why the average man-on-the-street needs a gun is "well someone else might have a gun" and you don't see how that is a clearly active 'slippery slope', I can't help you.
Elected officials in your country have suggested - seriously - that the solution to "gun attacks in schools" is... "give the teachers guns", because apparently those officials live in some fantasy world where teachers are perfect and infallible, and students are all on the honour roll without a single infraction for misbehaviour. I mean seriously this idea is fucking insane. I imagine if they had a house in the path of a wildfire, and for some reason had a bunch of highly flammable material, their solution would be "let's stack it up around the house".
This is why I made the original comment: if violence (and apparently discrimination) is so ingrained in American culture, and gun control won't work (either because of lack of trying or because, unlike in every other country, it actually doesn't work in the US), then the problem is clearly not the guns but the culture, and if you won't control the guns, we (the rest of the world) should simply control the culture.
> I assume you're referring to the 10 RPG's that were marked for destruction, but instead sold to criminals by the Army officer who was responsible for them, 17 years ago ?
Sounds about right. And yeah, that might sound like a one-off issue from a long time ago, but if even Australia ain't immune to improper disposal, the US' prospects don't seem like they'd be much better.
More to the point: the ACIC estimates Australia to still have hundreds of thousands of guns in illicit circulation (i.e. not corresponding to authorized civilian ownership): https://www.gunpolicy.org/firearms/citation/quotes/13159
> The smart solution would be to solve discrimination.
And as I said before, that doesn't happen overnight.
> Perhaps start with a simple policy of not hiring racists as cops, and firing those who are already cops?
And neither does that, especially when the very people hiring those cops are themselves racists/homophobes/fascists or otherwise sympathetic with them, or (far enough up the chain) are outright elected by them.
Besides, that still doesn't address the issues with response time; even if all cops are perfectly just and rational and moral and do their jobs perfectly, until we invent teleportation and/or build a pool for some telepathic teenagers to swim around in and predict crimes before they happen, they are very unlikely to actually stop crime before it's already happened, at which point it's too late and the best you can do is hope the perpetrator gets caught before committing further crime.
> If your argument for why the average man-on-the-street needs a gun is "well someone else might have a gun"
No, there's no "might" in that argument. Criminals already have guns. They will continue to have guns even if we were to ban civilian gun ownership entirely (thankfully nobody of political significance is suggesting going that far, at least not yet). Some of those criminals, in fact, happen to wear badges and uniforms, and are exempt from such restrictions (and will continue to be for as long as they wear those badges and uniforms).
> Elected officials in your country have suggested
Elected officials in my country have suggested plenty of ridiculous things, like mandating that pi = 3 or banning end-to-end encryption.
That said...
> I mean seriously this idea is fucking insane.
Do you have a specific objective reason for that belief?
If a teacher wants to carry a firearm in the defense of oneself and one's students, and is voluntarily trained and certified to do so safely and responsibly, I don't fundamentally see a problem with that. There's not really an objective reason to object to it, with the possible sole exception of "what if a student snatches the gun off the teacher's holster" (which is applicable to law enforcement and security personnel, too, if not more so, and yet rarely happens, especially with modern holsters being designed specifically to prevent that).
And frankly, I'm more inclined to trust the average teacher to wield a firearm than I do the average police officer. The latter is statistically more likely to just end up killing the students one's ostensibly there to "protect and serve". The former is statistically more likely to actually care about the students' well being.
You're right that outright mandating it as a job duty is insane, though; nobody (or at least no civilian) should be forced to carry a firearm, especially when one is not comfortable or experienced/practiced with using one, for the same reason nobody should be forced to vote in an election or speak a politically-dissenting opinion or otherwise exercise one's Constitutional rights.
> then problem is clearly not the guns but the culture
Or, like I originally said (and to which you seemingly haven't really responded), the economic and mental health factors that are the much more visible and obvious and Occam's-razor-compatible difference between the United States and the rest of the "West". Guns don't magically induce criminal intent, nor do they magically induce mental health problems. People with criminal intent or mental health problems still have those problems regardless of whether or not they have legal access to guns. If we fix those problems, then they wouldn't feel as strong of a desire to hurt people in the first place, let alone with guns.
Those problems don't fix themselves overnight, either, but even partially fixing them makes things a lot better for a lot more people than even perfectly-executed gun control does.
Guns are an offensive implement, while encryption is a defensive one. A better parallel would be arguments in favor of banning bulletproof vests for civilians, because a criminal might use them during an attack.
You may be onto something. It took my wife years to accept my gun ownership. In her case it was very emotional response based mostly on never having seen one ( and I guess seeing all sorts of movies and breaking news on a regular basis ). She recently started talking of getting her own.
I may have said before on this site, but we, as a group, suck at framing issues. We try to appeal to logic and reason, while other groups play full-blown manipulation campaigns. It also does not help that as a group we are not that cohesive.
I don't think E2EE and guns are exactly the same but I can't help but notice that the arguments that always get brought up by E2EE hardliners are pretty similar to those of gun supporters. Namely:
1. It's vilified, by the numbers more people use this safely than for harm
2. Sure it can be used for harm but it's the only thing stopping the government from taking over
3. It's so simple that you could build it at home, and banning it would amount to banning math/geometry
4. The bad guys will have it anyways
5. Bringing up the 2nd/1st amendment a lot
6. Violence/identity theft will increase
Anyways, not to pass value on any of the arguments, just wanted to note that I've seen a pattern in the types of arguments made.
Your analogy to guns would make sense if other developed nations didn't have civilized gun policies. Seeing as they overwhelmingly do, it's nonsense. Stop using a legitimate issue to pry nonsense about guns into it.
You're almost certainly trolling, but rest assured everybody does not want to go to the US, especially those from other prosperous nations. There are plenty of places just as bad as the US in terms of violence and public disenfranchisement, for whom America is a trade up.
> Basically all of my tech friends are passionately against guns despite never having been victims of guns
Because the statistics are against you for the vast majority of people who don't live in crime prone areas.
For people in relatively low-crime areas, the mere presence of a handgun increases their extremely low probability of death by an inordinate amount, because of gun accidents. The probability is still low, but is vastly higher than if a gun weren't present at all.
You can argue that this is due to stupid gun owners. Perhaps, but as technology folks, we also understand that you set up systems so incidents don't occur in the first place.
Every single person I know who has a handgun has had at least one "accident" over 20-30 years. I know of zero who actually used a handgun against a criminal.
I will also point out, that I know a lot of people with rifles and shotguns, and those almost never have "accidents". I think I know of one over the last 30 years. Draw your own conclusions.
Obviously, if I lived in a high crime area or in a profession where it mattered, that's different.
> You can argue that this is due to stupid gun owners. Perhaps, but as technology folks, we also understand that you set up systems so incidents don't occur in the first place.
You set up technologies for this, yes. In this particular case, you design handguns that are much harder to have accidents with.
You do not set up legal systems to restrict everyone's liberty because some people have accidents.
Sure you do. See speed limits, driving licenses, etc. They were not in existence until people started getting hurt by cars. Before that anyone could drive.
> Sure you do. See speed limits, driving licenses, etc.
Speed limits are revenue sources and bear little if any relation to actual road safety. The extremely low level of enforcement of them is proof enough of that.
Driving licenses don't stop people from driving, so they are not a good analogy to what anti-gun people want to do with legal restrictions on guns. Driving licenses are analogous to permissive gun licenses in "shall issue" states, where the government can't deny you a license unless it has abundant evidence that you are simply incapable of properly handling the relevant technology (cars or guns). Considering how easy it is to pass a driving test, that's a pretty low bar. And roughly the same number of people in the US are killed each year by cars and by guns (although a much higher percentage of gun deaths are suicides, so if we just consider people killing others, cars are worse).
But I can do all sorts of dangerous stuff already and it isn't illegal.
Guns are like vaccines, I think. If only some people own guns for self-defense, as you say, they will not have too much use for them. But if some double-digit fraction of the population is packing heat at any given time, there is a sort of "herd immunity" effect. While the chance that any single individual will get to use them is still low, it will create powerful disincentives against robbery and free up our prisons.
Encryption isn't like transparent walls in my opinion.
In the real world, the police can get into anywhere, and get basically any physical object, once they have a warrant. Most people seem to agree that is reasonable -- I don't think there is a big push for an easy way for people to hide physical objects from police.
It's reasonable because police forces simply don't have enough resources to abuse this system. As a result, innocent people are generally left alone and don't live in constant anxiety that their homes could be randomly searched by armed operators. That's exactly how things should work.
Compare that to remote backdoor access to everyone's encrypted communications. All they have to do is push a button in order to instantly get access to someone's mail, calls, instant messages, social media, phone location history, etc. Abusing this power is a trivial matter and there is little if any oversight. The Five Eyes intercept, decrypt and store even their own citizens' communications as a matter of course and will tip off law enforcement so they can parallel construct a case using means of investigation that are actually legal. Intelligence employees have been known to abuse their powers to spy on their significant others, a practice that became known as LOVEINT. People who are a threat to those in power -- political opposition, whistle blowers, journalists, dissidents -- will probably be targeted despite their rights to privacy.
In order to obtain encrypted communications, police forces should have to get a warrant and obtain the physical storage media that contains the messages they want.
>It's reasonable because police forces simply don't have enough resources to abuse this system.
Could this principle be applied to encryption? A cryptosystem for personal communications carefully designed to be crackable using a certain amount of resources. So getting access to one person's data would cost, say, $1000 of cloud resources. And first you have to obtain an escrowed hash to crack using a warrant.
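For what it's worth, here's a rough sketch of what "crackable with a certain amount of resources" could look like (Python, toy parameters, very much not a real proposal): the escrowed value alone isn't enough, and recovering each message key costs a tunable amount of brute force.

    import os
    import hashlib

    SUFFIX_BITS = 20  # toy value (~1M trials); a real scheme would size this so
                      # that one crack costs roughly the intended dollar amount

    def make_key(escrow_secret: bytes) -> tuple[bytes, bytes]:
        """Sender derives a message key from the escrowed secret plus a short
        random suffix, then throws the suffix away. hash(key) is stored with
        the ciphertext so a cracker can recognise the right key."""
        suffix = (int.from_bytes(os.urandom(4), "big") % (1 << SUFFIX_BITS)).to_bytes(4, "big")
        key = hashlib.sha256(escrow_secret + suffix).digest()
        return key, hashlib.sha256(key).digest()

    def crack_key(escrow_secret: bytes, key_hash: bytes) -> bytes:
        """With a warrant, investigators get escrow_secret but still have to
        brute-force the suffix: ~2**SUFFIX_BITS hashes, i.e. a predictable cost."""
        for candidate in range(1 << SUFFIX_BITS):
            key = hashlib.sha256(escrow_secret + candidate.to_bytes(4, "big")).digest()
            if hashlib.sha256(key).digest() == key_hash:
                return key
        raise ValueError("suffix space exhausted")

    secret = os.urandom(32)
    key, key_hash = make_key(secret)
    assert crack_key(secret, key_hash) == key

The obvious catch is that anyone who steals the escrow database gets the same "affordable" access the police have, which is the usual objection to schemes like this.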
It’s not only the factor that police raids cost money; raids are also very visible, and there aren’t enough police to conduct very many raids all the time, so there’s an inherent limiting factor. Both these factors make it impractical to conduct blanket raids on everyone all the time.
> Compare that to remote backdoor access to everyone's encrypted communications. All they have to do is push a button in order to instantly get access to someone's mail, calls, instant messages, social media, phone location history, etc. Abusing this power is a trivial matter and there is little if any oversight.
Much as you can build a physical policing system with or without oversight and with or without covert powers, you can do the same for encryption. It is absolutely possible to build a system where the company, the police, and the courts all need to agree in order to hand over access — in fact it's more possible than with physical security. It's also possible to build regulatory bodies that make relevant details publically visible.
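As a minimal illustration of the "all parties must agree" idea (a toy XOR split in Python, not a production escrow design), an access key can be divided so that no strict subset of the parties learns anything on its own:

    import os
    from functools import reduce

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def split_key(key: bytes, n_parties: int) -> list[bytes]:
        """Split `key` into n shares; ALL shares are required to reconstruct it."""
        shares = [os.urandom(len(key)) for _ in range(n_parties - 1)]
        shares.append(reduce(xor, shares + [key]))
        return shares

    def combine(shares: list[bytes]) -> bytes:
        return reduce(xor, shares)

    key = os.urandom(32)
    company, police, court = split_key(key, 3)
    assert combine([company, police, court]) == key   # all three together recover it
    assert combine([company, police]) != key          # any strict subset gets only noise

A real design would likely use something like Shamir's secret sharing so that k-of-n parties suffice and a single lost share doesn't brick the system, but the governance question (who holds the shares, and what stops them colluding) is the hard part, not the math.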
Analogies make shit arguments when we can discuss the matter at hand directly instead. Systems with central access controls are vulnerable to at least two separate sorts of attacks: one procedural and one technical.
On the procedural side it's trivial to degenerate from
* the subject's privacy is breached when the lawfully constituted authorities present proof to a judge of reasonable suspicion that the subject has engaged in antisocial and immoral acts against society or its members
to
* it would be awfully nice if we had access to everyone's personal data so that we could punish anyone we like for anything we like
In the United States we have already standardized on breaching everyone's privacy to the greatest degree possible with little or no recourse for the citizenry. Once you have all the information on who speaks or believes in a certain way, acting against them is a relatively smaller step.
On the technical side a centralized system is impossible to secure. When (not if) it is compromised, to the victors go the spoils. It is comparatively harder to attack millions of clients; even if in theory a central node could be used to compromise clients, in practice that can trivially be made harder to exploit, for instance by not allowing the server to push to clients and by not updating every second. Attacks may need to remain undetected for a substantial length of time before a substantial portion of the user base is poisoned, and the chance of detection climbs towards 100% the more parties you attack.
Furthermore, if you become 100% effective at detecting illegal porn shared via WhatsApp, doesn't it follow that users will switch to a decentralized means of transmitting illegal information? We would be substantially disadvantaging the population as a whole for little gain.
We ARE talking about porn; I'm not sure how you can be confused on that point. Nobody is defending the act of abusing kids or scumbags sharing media of same. Both acts are evil.
The biggest point is that backdooring everyone's communication will do exactly jack to prevent child abuse. It won't even make it much harder for scumbags to share it; even more of them will just use privacy-preserving p2p applications. In fact, if you fuck with everyone's sense of privacy, you will be liable to push a much, much larger portion of the population to use such tech.
The greater the number of normal people who aren't drug dealers, crypto nerds, or pedos using stuff like TOR, the harder it will be to pick out the weirdos. Meanwhile, most of the harm to children won't be captured on digital media, because it will keep happening when trusted adults abuse minors they ought to be protecting instead of harming.
Attacking the sexual abuse problem in America by backdooring communication platforms is about as effective as reducing deaths due to traffic accidents by hanging out in parking lots.
> Meanwhile most of the harm to children wont be captured on digital media because it will keep happening when trusted adults abuse minors they ought to be protecting instead of harming.
Can you explain what you’re getting at here? I don’t get how this fits into your argument.
The problem they are highlighting is child abuse. Child abuse is indeed a horrible thing. It's also a very hard thing to combat because it largely happens when a trusted adult abuses a child and mostly doesn't document it for posterity or share it on the internet. The victims face the double whammy of shame and social pressure not to accuse the victimizer.
Attacking the perverts sharing pictures is worthy but does almost nothing to prevent child abuse. The best case scenario is you expose a small portion of perverts and thus prevent idiots from trusting their kids with them.
Unfortunately attacking messaging services will do little to combat perverts sharing pictures as they can trivially switch to using slightly better technology.
The effect is multiplicative. Imagine you start with 1% of victimizers sharing media online. Imagine that you catch 1/10 of 1% of them before the rest figure out centralized sharing is not the best idea. You are now stuck with permanent downsides and will catch few additional people.
0.01 * 0.001 = 0.00001
The logical conclusion is that this isn't a very great way to combat child abuse.
Great! But you seem to be the one who’s confused. I was commenting on semantics. The “illegal porn” being referenced above is not porn. It is abuse.
Carry on with your discussion of why backdooring encryption is bad. I wasn’t commenting on that matter. This entire thread has no real discussion of how to solve the “encryption-while-scanning problem.” With the massive growth in CSAM sharing in the last few years, enabled by easy to use services like facebook and dropbox, it’s clear that solving this dilemma is just as important as protecting some normal person’s ability to use TOR. I’m extremely privacy focused, but I’m curious to learn if crippling fb/dropbox/etc is worth it for the sake of solving this issue.
Also, “traffic accidents” isn’t preferred, either! Do you hate me now? Language is important. I’m a transportation planner who focuses on safety, so I can talk about this all day.
You CAN'T solve the encryption-while-scanning problem. AI is capable of misidentifying a naked baby picture as porn. Nobody will want to run a piece of software that might accidentally report them to the FBI and have their life ruined. So long as users have the ability to write and distribute software from regions outside your jurisdiction, and users have the ability to install software of their choosing, you can't stop people from communicating information you don't approve of. This is equally true of political discourse you and I both agree ought to be freely disseminated as it is of evil material you and I would both desire to ban.
Maximal crazy is implementing 1984 in order to pretend to stop child abuse while it continues to go on all around us.
> In the real world, the police can get into anywhere, and get basically any physical object, once they have a warrant.
Objects/documents aren't analogous to walls.
If you were required to have transparent walls then the police could put cameras everywhere in the street and use them to record everything that happens in your house. If your walls are opaque then they can't. Even if they get a warrant, there is no way to go back and look at a recording of what happened inside your house the day before, since the opaque walls prevented any such recording from being made. It's well analogous to transport encryption.
Consider this: If encryption is suddenly turned off for all communications, and police have access to all of it via warrants, that's like getting a warrant for every home in the US. Obviously if the police had a perpetual warrant to your home, you wouldn't want that, why is it reasonable they have one to your data?
And it has been proven many times that not only will the access be abused, the secret backdoor will not remain secret and now every unscrupulous person can potentially have the same access.
The physical world is tangible and can be accessed by pure force. The mind and thought, and by extension our intangible data, are not subject to the same rules and are (and hopefully will remain) a safe haven of privacy.
There will be no more freedom if we can't keep our thoughts secret from the authorities.
I feel like there is a difference between having a thought, and expressing that thought in (entirely tangible, in terms of electromagnetic signals) data over a wire that someone else owns. If you want your thoughts to stay yours, you’re free to keep them to yourself; just don’t transmit them through electromagnetic media that other people currently control by force.
To make this analogy work, the government is asking everyone to install new front door locks. They have a master key. They will always have a master key. They can stop by whenever they please, if they suspect a crime. If you run a red light, they can search your house (no such thing as a "limited" backdoor -- they can't partially decrypt a hard drive. They just clone the whole thing).
Except that it's more like: In addition to the locks that they can access anytime, they're also installing CCTV, microphones and other sensors so they can be sure you're fully compliant at all times.
I do not see it that way. In years past, if one did something he thought the government might find offensive, he could hide the evidence. Which was perfectly fine -- one would not be jailed for refusing to provide self-incrimination. "Go search, but I will hide it; I never had it, by the way".
Perhaps a better analogy: we haven't seen legislation that attempts to outlaw physical safes that are capable of destroying the contents in response to an attempt to tamper with or force them open, have we?
Such safes are very uncommon. If some major bank announced that it's going to install destroy-on-tamper on all its safety deposit boxes, I'd definitely expect some reactive legislation about it.
Whether or not the tech companies have too much power does not sound like the most relevant question here. The lack of power of citizens to communicate privately is the more worrisome thing. And maybe some random person is right to say that all of his communications are not something to hide. It becomes something else if lawyers and journalists can't protect their communications. If the state cannot suffer that, it is rather likely that it is engaging in dirty business it wants to hide, as pretty much all of them are.
The funny thing is that true end-to-end encryption actually weakens the power a lot of tech companies have. That content isn't available for them to scrape.
Mandating that the encryption can't be end-to-end guarantees these large companies access to private data they wouldn't otherwise have. Data they can then use for their own gain. When people complain that they have that access, they'll now have the excuse that the government made them do it.
> If this same argument took another form, e.g. if this were an attempt ban walls made out of non-transparent material because opaque walls allow child abuse to occur hidden from sight, the obvious violation of privacy would be evident to the average person.
Someone should move to amend this act to mandate new construction be transparent.
Just sell the counter-argument as... Do you really want China/Russia to know the inner details of the airplane you're flying on? Or to be able to intercept communications with the military and congress?
Which is, frankly, a much more honest pitch than the one the govt is actually using to sell weakened encryption.
> the obvious violation of privacy would be evident to the average person.
I keep on thinking the violations of privacy we have been enduring would spark some pushback but it has not. I know some very intelligent people who realize the implications, but they eventually just give in.
With politics it's ALWAYS an emotional play. That's the only way you're going to achieve any kind of support in a large portion of your constituency. Politicians are only going to try plays that they think will work, and logical arguments have to be so blindingly obvious that they are hardly logical at all -- because there will always be someone on the other side, with a playbook that knows how to shoot down anything.
There is another reason it's not really effective to try to ban or even limit e2e encryption. If your intentions are nefarious, and you have a group that you need to communicate with, you will just implement the encryption yourself. Or buy burner phones, etc.
Honestly, with my cynical hat on, I feel this is actually being pushed by the marketing and advertisement lobbyists, such as Facebook, Google and others, in order to data-mine your communications for advertisers.
Maybe my tinfoil hat is a little bit too big, but I really use encrypted communications so that at least there is one dialogue that's not being warped into advertisements for me.
I genuinely fear for future generations' privacy, in all regards. It's worrying and it really does deserve more attention. It's so crazy to my younger nieces and nephews that when I was their age, I didn't have a phone. It blows their minds. And I'm only in my 30s.
This assumption comes up again and again: that criminals are somehow motivated and competent enough to use encryption even when the easiest ways to do it are no longer available.
There’s no reason to believe that. Criminals are just as lazy and stupid as everyone else. If e2e requires an extra step, fewer will use it. If it comes with their phones’ built-in messenger, more will use it.
Every unencrypted E-mail, SMS, ICQ chat among criminals of the past is evidence here: they could have all used PGP, or one-time pads. But they didn’t. For the same reason nobody in this thread has ever used a custom one-time pad implementation for any real communication: it’s fun to imagine when you’re in the shower planning your evil empire, but it sucks in reality.
Note that I agree that this law is a particularly stupid idea, and that e2e encryption should be considered a universal right. It’s just this argument I don’t buy.
The thing you're overlooking is financial motive. If there's money to be made selling illegal secure communications software to criminal enterprises, then someone will make that software available and easy to use.
You and I don't use custom e2e software because there are good products already available.
To somewhat counter the point about unencrypted communication between criminals, see the leaked opsec guides of some terrorist organizations. They're fairly sophisticated and it seems unlikely that lack of legally available solutions would prevent them from acquiring replacements.
Exactly, the problem for the baddies has been solved since at least 1882 (Frank Miller iirc; Edit: Wiki says yes). So this supposed argument about security is obviously really about what information they're getting.
You can't blame someone for not knowing what they don't know, but lawmakers are supposed to enact laws based on the benefit for their citizens. Seems not to be the case here.
I've seen OTPs described as being a replacement for e2e encryption before, but I just don't see them as actually usable outside of large established organizations (NSA, cartels, etc.) for passing short text messages. And even then, the usability seems cumbersome enough that they'd only be used when strictly necessary.
I don't see why it's limited to short text messages.
Your pre-shared OTP can be a rack of 8TB hard drives delivered to an embassy by the Marine Corps which covers a whole lot of documents and media before it's exhausted.
Something I really wish existed: a very cheap, pocket key chain size, battery powered USB device that stores crypto keys and OTP pads. The idea is that you set it up (plug it into a PC or phone) with an identity and contact information (in the spirit of vCard). The key feature of the device is that touching two of the devices together should automagically exchange identify information, generate and exchange public keys, and continue filling all available storage with shared noise usable for OTP. Each side generates their own random[1] stream of data and shares it with the other device; both sides then XOR the two streams of data together and stores it locally.
After exchanging keys and pads in person, plug the device into a computer like a flash thumbdrive, and (with hypothetical software support) both people can now use the keys for end-to-end encryption without having to worry about authentication, and a utility could perform OTP while keeping track of how much random pad is remaining. If they want more pad, leave the devices plugged in longer. The actual encryption should be performed on the device, so the host computer never sees the keys/pad.
As long as crypto is hard to use, people will rely on centralized men in the middle. A key chain dongle that you could simply connect to your friend's dongle for a few seconds (or longer, if desired) is easy. Instead of trying to solve the entire authentication problem with PKI or web-of-trust, you let people solve the authentication problem themselves, using the social skills they already have. Yes, this isn't useful for communicating with someone you cannot meet physically; use some other solution for those situations.
(Imagine if this became popular and you could simply go to the local branch of your bank and plug your crypto dongle into a kiosk that generates a few months worth of random pad data so all of your online banking is secured by OTP)
[1] OTP requires truly random data, which probably requires some type of hardware entropy generator. Perhaps something like this: http://holdenc.altervista.org/avalanche/
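The pad-exchange step described above (each device contributes its own noise, and the shared pad is the XOR of the two streams) plus the pad bookkeeping are simple enough to sketch. This is just the logic, in Python with toy sizes; PadDevice is a made-up name, and the real thing would run on the dongle rather than the host:

    import secrets

    class PadDevice:
        def __init__(self, pad_bytes: int = 1024):
            self.my_stream = secrets.token_bytes(pad_bytes)  # locally generated noise
            self.pad = b""
            self.offset = 0  # how much pad has been consumed so far

        def exchange(self, other_stream: bytes) -> None:
            # Both sides XOR their own stream with the peer's, so neither
            # side's RNG alone determines the shared pad.
            self.pad = bytes(a ^ b for a, b in zip(self.my_stream, other_stream))

        def otp(self, data: bytes) -> bytes:
            # One-time-pad encrypt/decrypt (same operation), consuming pad as we go.
            if self.offset + len(data) > len(self.pad):
                raise RuntimeError("pad exhausted - touch devices again for more")
            chunk = self.pad[self.offset:self.offset + len(data)]
            self.offset += len(data)
            return bytes(a ^ b for a, b in zip(data, chunk))

    alice, bob = PadDevice(), PadDevice()
    alice.exchange(bob.my_stream)
    bob.exchange(alice.my_stream)
    ct = alice.otp(b"meet at noon")
    assert bob.otp(ct) == b"meet at noon"  # works only while offsets stay in lockstep

Keeping the two offsets synchronized (and never reusing pad) is exactly the kind of bookkeeping the dongle would have to get right, which is part of why OTPs are fiddly in practice.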
The Yubikey has similar storage features and is the perfect size and shape. Unfortunately, it's missing the main feature I'm talking about: easy inter-device communication.
> What purpose would you need it to be battery powered if it's USB?
It needs to be self-powered because
>> The key feature of the device is that touching two of the devices together
I want people to be able to protect their communication with someone simply by meeting them in person and touching their USB crypto devices together. It should be a device someone stores on their person along with their other security devices like their house/car keys. The goal is to make the crypto easy so the individual can use it in situations where they already solved the authentication problem. If exchanging keys with someone depends on something complex like a computer or phone, a lot of people won't use it.
In Vernor Vinge's A Fire Upon the Deep, traders traffic in cubes of material that acts as a super dense source of pad data. Your communication partner on another ship would have the twin cube, and the two would be synced up and then provide the carrier data stream for video and other content. When your cubestuff is exhausted your secure authenticated comms cease.
I think what you're missing is that the comment you replied to never suggested OTPs are a replacement for e2e encryption. He only said that we can't take OTPs away from the bad guys.
I've mentioned this before, but I'd like to write a children's book that teaches one-time-pads. The book ends with little Jimmy getting hauled away by the feds for doing illegal math. Fun for children and parents!
The point, of course, is to make clear how futile encryption restrictions are, and to perhaps make people wonder if it's really "the land of the free" when a child can commit a crime by doing some math. Fortunately, I think we're still a ways off from encryption being a crime.
> I've mentioned this before, but I'd like to write a children's book that teaches one-time-pads. The book ends with little Jimmy getting hauled away by the feds for doing illegal math. Fun for children and parents!
I absolutely love this. Animal Farm comes to mind.
Please write this, our descendants need those echoes from the past.
The one-time pad is not a practical solution in any sense. The key distribution becomes so impractical that there is basically no justification to ever use the one-time pad. Also, it's a symmetric encryption algorithm, and what you really need for end-to-end encryption in the sense they talk about here is an asymmetric encryption algorithm.
I don’t know, is it that impractical for a group of conspirators to meet in person a single time, exchange a bunch of 1TB hard drives with identical one-time pads, and then conduct their business accordingly? Surely that’s enough for a substantial amount of communication. Impractical for large-scale communities of illicit behavior but definitely practical for small groups of people
Quite a few "asymmetric" encryption schemes are only asymmetric for as long as it takes to get both ends communicating via symmetric encryption. This is, most notably, how SSL/TLS works.
In this case, you'd only need the asymmetric encryption to distribute the one-time pads.
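To make the "asymmetric only long enough to agree on a symmetric key" point concrete, here's a toy Diffie-Hellman-style handshake feeding a hash-based keystream (Python; deliberately tiny, insecure parameters chosen for readability, whereas real TLS uses authenticated 2048-bit+ groups or elliptic curves):

    import hashlib
    import secrets

    P = 0xFFFFFFFB  # toy modulus; real deployments use much larger groups
    G = 5

    def keystream(shared: int, n: int) -> bytes:
        """Expand the agreed secret into n bytes of symmetric keystream."""
        out, counter = b"", 0
        while len(out) < n:
            out += hashlib.sha256(shared.to_bytes(8, "big") + counter.to_bytes(4, "big")).digest()
            counter += 1
        return out[:n]

    a = secrets.randbelow(P - 2) + 1
    b = secrets.randbelow(P - 2) + 1
    A, B = pow(G, a, P), pow(G, b, P)            # public values exchanged in the clear
    shared_alice, shared_bob = pow(B, a, P), pow(A, b, P)
    assert shared_alice == shared_bob            # both ends now hold the same secret

    msg = b"symmetric from here on"
    ct = bytes(x ^ y for x, y in zip(msg, keystream(shared_alice, len(msg))))
    pt = bytes(x ^ y for x, y in zip(ct, keystream(shared_bob, len(ct))))
    assert pt == msg

The asymmetric math is only used for the handshake; everything after that is cheap symmetric crypto, which is the pattern TLS follows.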
lol, your suggestion is to send a key through asymmetric encryption (which must by definition be as long as the plaintext). Then you send an encrypted message. If someone eavesdropped on your key (sent through asymmetric encryption), the message is compromised. You might as well send the complete message through asymmetric encryption then. You just wasted bytes and gained nothing in security. The problem with the one-time pad is that it opens up a myriad of side-channel attacks on the key.
Maybe. It really depends on the actual use case. I remember reading a case about a drug ring that used half of a dollar bill to ensure that the person the courier deals with is the right person.
I think this is tinfoil hat free territory. Barr has been public about his feelings on encryption and this proposed law (unless it's been changed in the last month) specifically gives the attorney general effective carte blanche authority over how it's implemented and interpreted.
No tinfoil hat needed unless we ask why Barr has made this a personal crusade.
I hear what you're saying, but I live in the EU, where at least we have some more protections by way of privacy. (Not perfect by any means)
To speak to your point on Barr, and I'll try to be neutral here, his motivations might very well be positive to protect the US people but the method is really panning out to be damaging long term.
>what happens when messaging with ciphertext becomes illegal
Back in the late 90s, Ron Rivest proposed Chaffing and Winnowing [0]. In that scheme, no ciphertext is transmitted and therefore no encryption is involved; only plausible plaintext is transmitted. The receiver filters out the chaff, leaving only the intended message. Perhaps one could construct an end-to-end winnow-chaff scheme (E2EWC) to replace existing E2EE schemes.
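A minimal sketch of the wheat-and-chaff mechanic (Python, assuming a pre-shared authentication key; in Rivest's scheme the chaff would be plausible-looking plaintext rather than the obvious decoys used here):

    import hmac
    import hashlib
    import secrets
    import random

    KEY = secrets.token_bytes(32)  # shared authentication key, assumed pre-shared

    def mac(seq: int, payload: bytes) -> bytes:
        return hmac.new(KEY, seq.to_bytes(4, "big") + payload, hashlib.sha256).digest()

    def send(words: list[bytes]) -> list[tuple[int, bytes, bytes]]:
        packets = []
        for seq, word in enumerate(words):
            packets.append((seq, word, mac(seq, word)))          # wheat: valid MAC
            packets.append((seq, b"decoy" + secrets.token_bytes(4),
                            secrets.token_bytes(32)))            # chaff: bogus MAC
        random.shuffle(packets)  # everything goes over the wire in the clear
        return packets

    def winnow(packets: list[tuple[int, bytes, bytes]]) -> list[bytes]:
        wheat = [(seq, payload) for seq, payload, tag in packets
                 if hmac.compare_digest(tag, mac(seq, payload))]
        return [payload for _, payload in sorted(wheat)]

    pkts = send([b"attack", b"at", b"dawn"])
    assert winnow(pkts) == [b"attack", b"at", b"dawn"]

Nothing is ever encrypted; the receiver simply discards packets whose MAC doesn't verify, which is why the scheme is awkward for a law to classify as "encryption".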
Technically one can argue it's not encryption. But it wouldn't be that difficult for law makers to add a clause that "any similar method of information hiding" is equal to encryption under this law. E.g. "A donkey is regarded as a horse under this law."
I'm not sure what you mean about it being outdated; you might be replying to another post, but:
> Anyway, what happens when messaging with ciphertext becomes illegal?
Those who have nefarious intentions don't really care what the legality is. Also, those who are nefarious and encrypt aren't identified, precisely because they're encrypted. That's why this argument seems very... moronic at best.
Defending against these attacks on encryption requires fighting the same laws in ALL countries in the Five Eyes spying partnership (Australia, New Zealand, the United Kingdom, the United States and Canada). Failure to do so will ensure these laws come into effect in time, either A) through the use of the Five Eyes pact to leverage the laws of other countries within the pact, or B) through constantly pointing to the 'standard' set in Australia (or any country which does pass such laws; I use Australia because it has already implemented legislation which enables metadata collection and requires participation in weakening encryption).
The Australian legal/political system IS the backdoor here.
I get downvoted every time I suggest that a ton of what we see coming out of government, the news, and popular culture, all at the same time, is being orchestrated by a focused and very powerful few, and that most of what we actually see is nothing but theater intended to distract us from their real purpose (like the "impeachment trial").
But honestly... when the same idea tears across the planet at the same time from a dozen "independent" sources... how is that so hard to believe?
I have a feeling that the EARN IT act will pass, based on how cleverly it disguises its ability to ban end-to-end encryption... I imagine most elected officials will hear something like this:
Person explaining: "It establishes a committee to make sure people are using best practices to ensure child pornography etc isn't being distributed on their platforms"
Elected official: "Hmm, it actually sounds like this will help the children!"
Maybe it's just me shrug, but I have little faith in our elected officials to parse out the ramifications to encryption based on how the act is written.
As much as guns have grown deadlier since 2A was penned, they ultimately scaled linearly; we don't currently have weapons available to citizens which can destroy a city. If the police need to counter a citizen's misuse of guns, they can easily deploy 10x guns of their own.
This used to be true of secrets: if there was a legitimate reason for police to crack a safe, it was difficult, but possible. But with encryption, you no longer need 10x force; you need closer to (10^100)x force (not an expert, but close enough for illustration).
What's neglected/forgotten is that weakened crypto creates a power asymmetry in the other direction: while there's no way to scale safe-cracking to (10^100), such that the feds can auto-crack every safe at will, that is absolutely feasible for digital locks, if those locks are forced to be arbitrarily weak. The NSA would have the resources to pre-crack and cache every single encrypted signal, "just in case".
The nature of the mathematical asymmetry leaves no middle ground: either every citizen has access to unbreakable locks, or no one will [0] (except for the Feds themselves, of course): https://www.youtube.com/watch?v=VPBH1eW28mo
[0] There is one cogent comparison to the arguments of 2A advocates here: "if you outlaw crypto, only outlaws have crypto". In fact, it's even worse: hypothetically the feds can track gun-running, and guns are (currently) non-trivial to manufacture oneself. But all it takes is a small snippet of GitHub code and/or a white paper to craft an unbreakable lock, and disguise the data as noise, regardless what violence governments threaten. The only people affected by restrictions would be law-abiding citizens.
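Some rough numbers behind that asymmetry, assuming a hypothetical (and generously fast) cracking rig doing 10^15 key trials per second:

    TRIALS_PER_SECOND = 10**15          # hypothetical rig; assumption, not a real benchmark
    SECONDS_PER_YEAR = 3.15e7

    for bits in (40, 56, 128, 256):
        expected_trials = 2 ** bits / 2              # on average, half the key space
        years = expected_trials / TRIALS_PER_SECOND / SECONDS_PER_YEAR
        print(f"{bits:>3}-bit key: ~{years:.1e} years to brute-force on average")

40-bit ("export grade") keys fall instantly and can be pre-cracked in bulk; 128- and 256-bit keys outlast the age of the universe, which is the "no middle ground" point above: any lock weak enough to be searched at scale is weak for everyone.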
I might be playing Captain Obvious here, but anyway, this worldwide coronavirus pandemic state of panic is the best possible scenario in which corrupt politicians in any country could enact restrictive laws or do nasty things while their citizens look elsewhere.
If they were waiting for a good mass distraction weapon, well, this is it.
Governments don't have to wait for such natural distractions; we already live in a world where nefarious law changes are snuck into bills with positive-sounding policies when major holidays occur... Look at net neutrality, the Patriot Act, etc.
Making effective encryption that lets lawmakers access our data isn't just "hard" in the traditional sense. I'm willing to bet it's impossible (at least with the kinds of technologies we'll have in my lifetime).
Yes the cops can get a warrant for our doors, why not our computers? Because cryptography isn't an efing wooden door. They are not the same. Get it.
It's not like all of Silicon Valley hates lawmakers (although many do); it's that they're asking for unicorns and pixie dust.
Because they aren't the same kind of "thing". Data isn't manifested in physical space in the same way as "my house" is.
For one, my home is in one physical place. A bad actor needs to at least get to me, to... you know, get to me. With my data, if it's open to be seen, it's open to people in China, Russia, and on the moon as well. It's not limited to my government.
Second: When the police are given a warrant to my house, I can be there, and I can observe that only actual police officers are entering my home. Data doesn't have any kind of global logging and auditing system like that. Any particular system may or may not log or audit access, but it's not a sure thing. Once it's accessible, once again, I have no way of knowing who actually accessed it.
Third: As WiseWeasel mentioned, I would argue that in the modern world, having access to my data isn't like having access to my home. It's like having access to my mind. My phone knows everything: My intimate conversations with my wife, where I go shopping, what I buy, what my political opinions are, every single photo I've taken in the last 10 years (Which in large part means: Knowing every single place I've been in the last 10 years), my financial transactions, and much more.
If the cops cannot get a search warrant to forcibly read my mind, I don't think it's reasonable they should be able to see my data either.
Fourth: It encroaches on the privacy of others. Generally speaking a warrant for my stuff, is because I am suspected of things. My neighbor isn't also searched because they happen to be near me. But plenty of the data in my life (Especially in the context of conversations) isn't just my data, it's the data of my conversation partner too. Who is now having their data investigated, without knowing it at all (And depending on where they are from, do the local cops in my own even have jurisdiction over the data of my friend in Europe?).
I could go on, but I think I've made my point: Data is not "a thing", it's an entirely different plane of existence. And we need lawmakers who understand that, and are ready to tackle the real challenges it undoubtedly presents, to write reasonable legislation about data.
What we currently have is a bunch of people trying to fit this new square data peg into their existing round hole of legislation.
Perhaps it’s a gross oversimplification, but I imagine there will come a day when we can finally interface our brains directly with computers. When that day comes, what will prevent those in power from getting a warrant for your brain? This is the logical end of the road breaking encryption leads to. Just like the metaphor of opaque walls, this too should have an obvious answer to the layman.
The argument against backdoors by law enforcement is a very clear and understandable one - "misplace" the key and now everyone has access. Also, if they can use it for getting data from an individual under warrant, there would be nothing stopping them from getting all data from everyone in aggregate when no one is looking, "to protect the children" and the like. And lastly, even if those two are not the case, there are also foreign governments.
OK, so this stems mostly from the fact that real warrants are bound by locality - if you get one warrant, it's physically impossible to just use it for everyone. That reduces the risk of a wrongly issued one affecting everyone else on the planet.
The question is - is there a way to effectively replicate something like that with encryption / maths? I mean obviously not with plain public / private key encryption, but we do have more complex constructions already, as cryptocurrencies show. Maybe there's a way to make using a backdoor a very expensive and public operation, so that everyone knows a backdoor was used, it is expensive to employ (in money and resources), and each backdoor can only be used once?
I'm genuinely curious, though I'd wager the answer would be that even if it could be made, it would be too expensive and impractical...
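For what it's worth, the closest existing building block I can think of is threshold secret sharing: split an escrow key across several independent parties so that no single actor can open anything alone, and any use requires visible cooperation. Here's a minimal sketch (my own illustrative example, assuming a hypothetical 3-of-5 escrow arrangement; it says nothing about making access public or expensive, which is the harder part of your question):

    # Shamir secret sharing over a prime field: a 3-of-5 split of a
    # hypothetical escrow key. Purely illustrative, not anything the bill
    # actually proposes.
    import secrets

    PRIME = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte share

    def split_secret(secret: int, shares: int, threshold: int):
        """Split `secret` into `shares` pieces; any `threshold` recover it."""
        coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
        def f(x):
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, shares + 1)]

    def recover_secret(points):
        """Lagrange interpolation at x=0 recovers the secret."""
        secret = 0
        for i, (xi, yi) in enumerate(points):
            num, den = 1, 1
            for j, (xj, _) in enumerate(points):
                if i != j:
                    num = (num * -xj) % PRIME
                    den = (den * (xi - xj)) % PRIME
            secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
        return secret

    # Example: escrow key split among 5 oversight bodies, any 3 must cooperate.
    key = secrets.randbelow(PRIME)
    shares = split_secret(key, shares=5, threshold=3)
    assert recover_secret(shares[:3]) == key

The math part is easy; the hard part is exactly what you raise - forcing every use to be logged, public, and costly, which is a policy and infrastructure problem, not a cryptographic one.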
Is there any real evidence that restricting distribution of CSAM creates better outcomes for victims? More to the point is there evidence that increased aggression in enforcement of CSAM possession laws produces proportionately better outcomes? I would expect the ROI curve to go flat pretty quickly.
It's one of the most powerful weapons authorities can swing at anyone at anytime.
It's dead simple to poison some website, insert a thumb drive, or just flat out lie and put it on a device after the fact and make any target an instant socially repugnant felon. The ROI on that is fantastic.
There's no evidence of it at all. In fact, logically, it doesn't make any sense that there would be: the deed has already been committed, all you are preventing is the possibility for what is called "re-victimization" and the only evidence that that ever happens is when authorities do it themselves by forcing victims to confirm they are the ones in the photos/videos.
It's a hot button for lawmakers because they want the privilege reserved for themselves. That is why these psychopaths lust after power over others, right? To partake in what money cannot buy.
If one of them is behind on encryption relative to others, it's a competitive disadvantage. If all of them are forced to abolish encryption, it's a welcome opportunity for better ad targeting. Additionally, this increased regulatory and technological burden is a good start-up deterrent. What's not to like?
The bill is specifically going after Section 230 on the surface, though, which I would think would be of more immediate concern to them. Most blog posts even note that the encryption aspect is being targeted as a run-around.
I'm willing to entertain the logic, sure, just not sure I agree with it. Feels like there's more at stake for them (collectively) here.
This, if passed, will only accelerate private citizens in various countries into launching their own clandestine “internets” outside of traditional channels.
This is shaping up to be the end of a golden age of internationally-open communication.
I agree; in this context it's likely an emotional play to go against encryption (especially if you look at who is behind the law). I'd like to know more about what tech companies mean when they say they have their own plans to tackle this problem.
To me, it seems the requirements that:
1) you should be unable to access the data
and
2) you should monitor the data for certain content
seem strictly mutually exclusive:
Even if you developed some magic AI technology that could detect child porn in an end-to-end encrypted connection, well congrats, you broke the encryption, but then it's not end-to-end encrypted anymore. (And you could likely use the same technology to detect other things)
The only way I can see how you could keep both promises in some sense would be to move the detection into the client, scanning for content before it's encrypted.
This would require that you tightly control all clients, making any kind of tampering or use of alternative clients impossible. I'm not sure that's a good vision for the future either. (It would also render the encryption mostly useless, because the vendor could simply instruct the client to extract whatever data they are interested in. I can get the same level of security with plain HTTPS and a vendor promising not to look at the data.)
If this is the alternative, maybe a controlled, traceable way to intercept connections would be the lesser evil.
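To make the client-side idea concrete, here's a minimal sketch of what "scan before encrypting" boils down to (the blocklist, helper names, and transport hooks are my own illustrative assumptions, not any vendor's actual design):

    # Client-side scanning before end-to-end encryption: check outgoing
    # plaintext against a vendor-supplied blocklist of known hashes.
    import hashlib

    # Hypothetical blocklist of SHA-256 digests pushed to the client.
    KNOWN_BAD_HASHES = {
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",  # placeholder (SHA-256 of the empty string)
    }

    def client_side_scan(payload: bytes) -> bool:
        """Return True if the payload matches the blocklist."""
        return hashlib.sha256(payload).hexdigest() in KNOWN_BAD_HASHES

    def send_message(payload: bytes, encrypt, transport) -> None:
        # The scan happens on plaintext, *before* encryption, which is exactly
        # why whoever controls the blocklist can in principle be told to look
        # for anything, not just CSAM.
        if client_side_scan(payload):
            raise ValueError("payload matched blocklist; refusing to send")
        transport(encrypt(payload))

Note that the scanning logic and the blocklist live entirely inside the client, which is why the whole scheme only holds up if the vendor controls the client completely.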
Obviously, as TFA describes, this bill is a huge end-run designed to create a problem for encryption.
But, would this leave a loophole for text-only and/or highly bandwidth-limited communications to remain end-to-end encrypted?
If you cannot send a photo, audio, or video, it's kind of hard to send CSAM, yet end-to-end encrypted real-time SMS-type messages are still somewhat useful in many instances (better than nothing).
Good point, but if you bandwidth-limit and message-size-limit the channel to something like the speed of a world-record typist, sending a Base64 encoded pic of any good resolution would take days rendering it essentially useless for that sort of thing.
What is your argument - that "think of the children" therefore ban all end-end encryption? If so, then any form of encryption should be absolutely banned. So should carrier pigeons, as they are very hard to intercept, and a leg band could carry a chip w/gigabytes of CSAM images.
My point is that within a regime of highly restricted encryption (for the purposes of CSAM prevention), there should be space for a text-based conversational system that is very useful for text comms, but highly impractical for CSAM.
Limiting it to 30KB/day of transmission leaves plenty of room for useful conversational communication. That's roughly 30 minutes of world-record typist speed on a full keyboard (17 chars/sec), or about 10 pages of single-spaced text -- a more than adequate secure conversation channel.
Yet transmitting a single 1MB image in Base64 encoding would take 45 days. One could literally be 2x faster by carrying it on a bicycle from Los Angeles to NYC.
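A quick back-of-the-envelope check of those numbers (using the 30KB/day cap and typing speed assumed above):

    # Sanity-checking the bandwidth-limited channel figures; the cap and
    # typing speed are the assumptions stated above, not fixed values.
    DAILY_CAP_BYTES = 30_000      # 30 KB/day channel limit
    TYPING_SPEED_CPS = 17         # ~world-record typist, chars/second

    minutes_of_typing = DAILY_CAP_BYTES / TYPING_SPEED_CPS / 60
    print(f"{minutes_of_typing:.0f} minutes of top-speed typing fills the daily cap")

    image_bytes = 1_000_000       # a single 1 MB image
    base64_overhead = 4 / 3       # Base64 expands data by ~33%
    days_to_send = image_bytes * base64_overhead / DAILY_CAP_BYTES
    print(f"~{days_to_send:.0f} days to push one 1 MB image through the channel")

That works out to about 29 minutes of typing per day and roughly 44-45 days per megabyte image, consistent with the figures above.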
My argument is that limiting online data transmission to text or whatever still doesn't solve the problem. Hell, just get two transmission sources and two receivers and parallelize the data transmission to get CSAM twice as fast.
Yes, I fully understood from the beginning that there are workarounds that mean that this can technically still be used for transmitting images.
The point is that this remains a pragmatically useful solution.
While it could technically be used to transmit images by parallelizing it, etc., the goal is to make it bad enough for those purposes that practically, other solutions are better.
By creating a really bad channel for images, many other solutions become pragmatically better (e.g., snail mail & encrypted USB drive, carrier pigeon, etc), even though it remains mathematically possible to use this.
The analogy is a common home front-door lock set. They are good enough to prevent casual break-ins, but can be cracked in minutes by a professional burglar. The lock does not need to be perfect, only good enough that it becomes pragmatically easier to break in by other routes (e.g., crack a window, etc.)
The question is, what is your solution? Ban this too because it is mathematically imperfect -- i.e., ban all encryption -- or what else?
The risk of backdoors in encryption has changed since the last crypto wars.
With Huawei entering the 5G market, all ostensible law enforcement encryption backdoors now become de-facto Chinese communist party backdoors because of the pervasiveness of their equipment and that nation's interception capabilities.
The UK and Canada have approved Huawei to supply critical networks, and now end-to-end encryption on our personal devices is the only thing preventing interception by Beijing.
It also explains why the US president was so angry with Bojo over approving Huawei, because it means if the U.S. allows Huawei, it must also allow end to end encryption for citizens to protect themselves. The national security priority of mitigating that aggressive foreign interception capability for every business in the country should outweigh the special interest of law enforcement using victims groups as human shields.
That does not mean that the government has approved it. In fact, the telcos are freaked out right now that the government will say no and force the telcos to replace the Huawei stuff.
> With Huawei entering the 5G market, all ostensible law enforcement encryption backdoors now become de-facto Chinese communist party backdoors because of the pervasiveness of their equipment, and that nations interception capabilities.
How is this different from those backdoors being CIA/NSA backdoors, when that equipment was made by US vendors?
Nothing fundamentally changed. We should be building protocols that don't require us to trust the underlying network.
World-wide? It doesn't. In the US? It does. Also, once Google obeys and pulls Signal from their store, usage will decrease (and usage isn't wide currently to begin with).
Short-term effects? Power back to surveillance / power grabbers. They don't care about CSAM, they care about money/power.
Long-term effects? Like the dark web, Signal (or something similar) will be used mostly by criminals, so EARN IT will fail its "honorable" goal 100%, while achieving its hidden goal (stripping privacy from ordinary citizens).
Also this will accelerate Splinternet. The future looks bleak, welcome to it.
Why does it need them in the US? From what I understand, Section 230 protects a company from lawsuits related to content posted by their users; Signal is not a content platform, as it does person-to-person communication.
Section 230 isn't only for services that handle public posts. It covers any "interactive computer service" [0], which includes internet forums, internet messaging applications, ISPs, and even public libraries with public computers. Check out the cases in which judges cited section 230 [1].
Thanks, the most relevant case for Signal appears to be Delfino v. Agilent Technologies, where immunity was upheld for an employer whose email system was used by an employee to send a threatening message.
It seems strange for something like this to hinge on Section 230, though; were email providers liable for such things before it was passed? What about cellular carriers for SMS content?
We’ve already had the Cryptography is Free Speech test, so why is this not the same argument - you can’t tell a person or company what it can or can’t say, including how it says something (plaintext or encrypted)?
While I do appreciate the sentiment, and would prefer to see a bit more sanity in terms of blanket protections... Leaving this definition as obtuse as it is, is worse than actually creating a limited definition.
Realistically, having requirements similar to a DMCA notification would be more prudent. As well as requiring a contact form and/or email address that does not require login for notification.
Negotiation. Open with something ludicrous. Scale it back in exchange for concessions.
Also known as bullying when round one is “end all end to end communications” but round two is “ok we’ll drop that if you pay more tax”.
Taxing FAANG is the perennial government concern about big tech, the one they salivate over the most. How can they get their hands on more profits! (At least, that’s the case outside the US, for the US corporations.)
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
> So in short: this bill is a backdoor way to allow the government to ban encryption on commercial services.
I'm all for this. Suddenly people in tech would have to start taking free & open source decentralized services seriously instead of lazily relying on Google and Facebook while complaining about how evil they are.
We want decentralized systems so that people will be more free. Giving up freedom in order to get them is completely counterproductive.
Decentralized systems aren't nothing; they're more resilient against these types of attacks than the alternatives. However, decentralized systems don't welcome these attacks any more than anybody else does. These attacks still hurt us, and they still make our life harder -- China isn't more free because its centralized services are all back-doored.
There's a (thankfully fringe) group of people who keep saying that if Section 230 goes away everyone will just switch to more Open protocols and it'll be fine. But this is painfully naive; a pseudo-requirement to backdoor communications will make it harder to build Open platforms and onboard users, because no commercial host will want to touch a platform that exposes them to liability. Getting rid of Internet freedom turns the people using these systems into criminals, which will make Open platforms much more dangerous to use, much more risky to sponsor, and much harder to advertise or develop.
What happens when you go to host your private encrypted email on Linode, and Linode says, "no, because then we can't scan your server for CSAM"? What happens when the Matrix org tries to set up a free server to onboard new users and the government prosecutes them? What happens when every user running a Tor exit node becomes liable for content traced back to that IP? What on earth makes you think the DOJ won't prosecute hosts of Open services?
There are so many ways this law can go wrong, and so many ways it can be expanded from here to shut down the projects you think it's helping.
> There are so many ways this law can go wrong, and so many ways it can be expanded from here to shut down the projects you think it's helping.
It'd have to be expanded by a ridiculous amount. To the point where it's practically speaking illegal for anyone to run non-approved software on their computers, if it can send messages over the internet. I'm not saying they wouldn't try that, but in practice it's completely unrealistic. Which means, at best, it'll be a law that criminalizes everyone and nobody really cares.
Hilariously, the same tech behemoths that people give up their freedom to are doing everything they can to push us into the same exact situation, with their walled gardens and power asymmetry that allows them to squeeze out competition.
> But this is painfully naive; a pseudo-requirement to backdoor communications will make it harder to build Open platforms and onboard users, because no commercial host will want to touch a platform that exposes them to liability.
If you play your cards right, these "platforms" are more like internet routers and load balancers that facilitate message exchange between computers. You build your platform on top of this technology. Banning the technology would be akin to banning TCP/IP+TLS or UDP+DTLS. I don't see hosting providers having a case for banning encrypted transport protocols. And the law in question doesn't go that far; again, for it to go there we'd need a law that virtually bans all encrypted message exchange on the internet.
> What happens when the Matrix org tries to set up a free server to onboard new users and the government prosecutes them?
Once decentralization becomes a thing techies care about because they need it, I'm sure we can spread software by word of mouth just like we did back in the early days of Kazaa, DC++, torrents, etc. In fact, that's pretty much how software gets adopted today. You only need faddy & flashy onboarding when you're trying to growth-hack a product that has no intrinsic demand for it. We'd be way past that point.
> What happens when every user running a Tor exit node becomes liable for content traced back to that IP?
That's already something you should worry about if you're about to run a Tor exit node. I don't recommend it.
Exit nodes are just bowing down to the centralized clearnet. A proper decentralized network is one where the exchange stays within, and doesn't rely on a single point source as a hosting node that can be taken down. If a message is in the network, it can be anywhere, and it can and will replicate itself if requested.
> What on earth makes you think the DOJ won't prosecute hosts of Open services?
The easy way out for them is that everyone uses the same handful of services provided by a handful of tech behemoths. Then they only need to prosecute those, if they don't play along. That's largely where we've been headed, and I think it really sucks, because I lose my freedom and privacy both to these tech companies as well as their government.
If individual people started participating in a network and hosting their own nodes, the situation becomes much harder. If they seriously tried to prosecute everyone, it'd end up being much like the neverending war on drugs (or piracy or similar). Except that in this case, you wouldn't be up against just potheads and kids downloading movies, you'd also be up against professionals (and not only in tech) keeping their comms confidential. But everyone, individually, is a small fry, so fighting a legal battle against everyone is very nonproductive, unlike slapping one megacorp with millions in fines they can actually pay.
The status quo is terrifying, and it is not getting better because even technical users are too lazy to care if they don't have to. If I move to decentralized services that try to provide freedom and anonymity, I'm just isolating myself from everything else and simultaneously painting a target on my back. That's because these things do not have enough mindshare, and they will never have enough mindshare when techies just encourage everyone to keep using google & fb & co.
> It'd have to be expanded by a ridiculous amount.
> If you play your cards right, these "platforms" are more like internet routers and load balancers that facilitate message exchange between computers.
I think it's important to point out that your model of how a decentralized world should work isn't the current reality or even direction that most projects in this area are going. In the Fediverse we heavily rely on homeservers/instances that coordinate with each other. Tor is the only completely blind system that springs to my mind right now, because it turns out that for a lot of stuff (Peertube, Matrix, Mastodon, Email, PixelFed, etc...) instances are a really good, scalable solution.
What that means is that if you run a popular Peertube/Mastodon/Matrix instance, you're in the same legal position as YouTube/Twitter/Slack. This law will apply to you in its current form without any expansion. If a public Mastodon server has E2E encrypted messages, they're in the exact same situation as Twitter. If a PeerTube instance isn't doing whatever scanning the DOJ decides is the standard, then they're in the same situation as YouTube.
The only hope you have is that these instances will be small enough that the DOJ won't care. But of course the DOJ will care. This law is an effort to block encryption. The only reason the DOJ doesn't currently care about these systems is because nobody uses them. If everyone migrated to Matrix today, the DOJ would absolutely start prosecuting individual servers just to make an example of them. Because the entire point is to scare people away from using encryption. The DOJ will care about wherever people go.
----
> You only need faddy & flashy onboarding when you're trying to growth hack a product that has no intrinsic demand for it.
And on the subject of popular homeservers, my Matrix example wasn't theoretical. Matrix's current free homeserver is doing a ton of really valuable work in this space to onboard new users right now, because setting up servers is stinking hard. See also Mastodon, which got a huge amount of early traction because there was a way for ordinary people to quickly join.
I run my own Peertube instance, I'm working on setting up Matrix servers for communities that I host. But in the meantime, I'm coordinating my family on Matrix's "official-ish" homeserver. This stuff is genuinely useful if you want to get ordinary people interested in decentralized systems -- it's not just marketing fluff. Nobody's going to install a server to try out Matrix. They're going to create an account on an existing service, sign up for an existing community, and then once they get used to the protocol they might migrate their identity/communities to a new self-hosted homeserver.
----
> It'd have to be expanded by a ridiculous amount.
> Once decentralization becomes a thing techies care about because they need it
All the stuff I said above is kind of a side discussion. Here's the main problem with your philosophy:
If you really want to argue purely on the merits of this bill in isolation, and you want to argue that we can take away a few freedoms and it won't quickly become a slippery slope, and that the DOJ won't make the exact same arguments about hosts like Linode, fine. But in the world you describe, E2E encryption is not something that most techies will care about. It's something that privacy advocates will care about, but privacy advocates are a minority.
If Facebook got rid of E2E encryption in WhatsApp, and none of Facebook's competitors like Signal were able to roll it out either, people would not migrate off of Facebook onto distributed alternatives. They'd shrug their shoulders and go on with their lives. It would not trigger the mass exodus you want. So if you genuinely believe that the DOJ isn't going to go farther than this bill and they're not going to use it as a springboard to lock down encryption in general, then it's still a bad bill -- because it's not currently restrictive enough to get ordinary people (or even technical people) off of Instagram or Twitter.
If you're a techy, and you can't get your friends to switch from Twitter to Mastodon right now, even when Mastodon has objectively better user-freedom guarantees, better privacy, better data export tools, better moderation tools, better APIs, a more responsive development team, and more focused communities -- do you really think Mastodon rolling out E2E encrypted DMs would be enough to make those friends switch?
----
> If they seriously tried to prosecute everyone, it'd end up being much like the neverending war on drugs (or piracy or similar).
Quick side note, these are really bad examples. The war on drugs has been awful. The war on piracy has been awful. I don't think it's reasonable to look at the war on piracy and say, "this has actually helped Internet freedom when you think about it." The war on piracy has been used to clamp down on ordinary people's speech, it's killed entire businesses, it's greatly reduced adoption of torrent technology for ordinary people. Sure, torrents have still survived, for some people. Yes, people still break the law and pirate media. But the situation we're in today has not been improved by making piracy into a felony; the Digital Millennium Copyright Act isn't something that Open Source advocates like.
The current laws you allude to have real, negative consequences, and they are aggressively wielded against advocates. Ordinary people have had their lives ruined as an example to the rest of us, and that has a chilling effect on the Internet as a whole. Think about people like Aaron Swartz, effectively killed by the Computer Fraud and Abuse Act. Selectively enforced laws that can be easily brought down on specific targets are far more terrifying than anything Facebook is doing today.
For whatever it's worth, I am pretty optimistic about the future of decentralization and federation; I am realistic and pragmatic about the work required, but I don't share your cynicism. I think that for people paying attention to this space, the status quo is currently getting better (slowly but surely). We don't need to slide towards totalitarianism to speed things up. Always remember that the goal isn't decentralization; it's user freedom. Decentralization is a means to an end. So again, it is counterproductive to take away user freedom in the pursuit of decentralization.
If you give E2EE to the masses, then the endpoints will need to remain vulnerable by design, or LE/IC won't be able to do their jobs fighting criminals.
If the endpoints are designed to be as free of vulnerabilities as possible (which isn't the case anyway - consumer phones and computers are still Mickey Mouse by design), and provide E2EE at scale, criminals will be able to operate with impunity at scale. This isn't a desirable solution.
I'd rather see a trend towards locking down endpoints, but allowing Exceptional Access for communications at scale. Allow the math to exist (code for encrypted comms can exist on GitHub, for instance) but disallow it to be distributed at scale (walled garden App Stores, large Social Networks for instance). Reduce the (growth of) entropy.
Guns aren't sold via the App Store, and Signal shouldn't be given to the masses.
The community here seems more or less unified in the belief that essentially unbreakable E2EE at scale, distributed by GOOGLE, FACEBOOK, and APPLE, is always a good idea. I don't agree with this at all.
Few people in this neck of the woods are willing to argue the counterpoint - the risks of E2EE at scale.
Somewhat related: I'd personally rather see a move towards better cooperation between social network service providers, internet service providers, government agencies, and device manufacturers. Apple, for instance, won't get involved at all if your device is hacked. Rather, it would be nice to see a trend towards designing devices to have automatic cooperation between the various parties to both prevent and investigate hacks.
> criminals will be able to operate with impunity at scale.
This isn't even remotely true. At some point criminals have to go actually commit crimes that leave a detectable impact in the real world. That is where they can be caught. There is no need to surveil the communication of everyone on the planet just to catch the small minority of people who commit crimes. The cost is not remotely worth the benefit.
They would disagree that the cost outweighs the benefit?
Of course, they are not the ones bearing the cost of having their privacy invaded despite doing nothing wrong. I'm talking about the cost/benefit to society as a whole, not one particular actor. We don't need to rearrange all of society to make life convenient for the SEC or any other single agency.
I'm sure it would also be convenient for law enforcement if they could conduct warrantless searches and detain suspects indefinitely without access to counsel, but you know there's a reason we don't allow that.
That argument kind of died when we found out we were being surveilled en masse, with all our communications being kept to be used against us at any point in a Stasi fantasy.
You can't claim trust when you've shown yourself to be totally and utterly untrustworthy.
Moreover if the public servants can get your communications so can organised crime.
Law enforcement can do their job without it. Because they've abused it so thoroughly they're going to have to.
Improving on the design of Exceptional Access systems isn't just about math, nor about considering the problems of key escrow. Rather, it's about considering how to reduce the risks for the issues you've raised.
Fighting on principle is fine, but if the laws are changed to require exceptional access for these systems, it would behoove everyone to work towards a better compromise. Otherwise, your concerns will remain.
> If you give E2EE to the masses, then the endpoints will need to remain vulnerable by design, or LE/IC won't be able to do their jobs fighting criminals.
I don't buy this argument. Law enforcement still has plenty of other ways of going after criminals. All the E2EE in the world won't stop an informant from turning over decrypted versions of communications to the cops. In fact, E2EE makes that evidence more valuable since it's harder for the person at the other end to claim they didn't send it when they're the only one with that private key.
> LE/IC will have a more difficult time fighting crime
I'm not even sure that's true. LE/IC will have to rely more on different methods of fighting crime, but it's not at all clear that those methods are less effective than snooping on everyone's communications. Snooping on everyone's communications sounds easy until you realize how tiny the signal to noise ratio is--that is, if you're actually trying to find real criminals instead of just finding reasons to mess with more people in general.
Hey, I looked at some of your other comments and your profile.
I'm sure you're not posting here wanting to pathologize you, but you really seem like you're having a hard time. I doubt my comment will help, but if I were ever in your shoes I'd want someone to at least try...
The idea that there is a bright line between sane and not sane is a fallacy. Instead, all our beliefs about the world are approximations -- at best. We make up some line and say beliefs that are consistent enough with observations are sane and others aren't, but the position of that line is largely arbitrary. Some of these approximations are more helpful than others; some create feedback that can make us less healthy and happy than we could otherwise be, even in the same situation.
It seems to me that you have found yourself surrounded by beliefs which make your life more difficult and you're having trouble escaping from them.
You can get help for your problems and you will be happier for it, almost certainly. I really hope you do.
It's unclear why you've chosen to respond to my comment about my personal situation (which I'm not hiding at all), but I suspect you're trying to conflate something about my credibility with my argument. If that's the case, please stop. In fact, I'd rather you not bring up off topic issues in response to my comment.
We likely have orthogonal life experiences, and I'm simply trying to share my experiences and views. I'd like to stay on topic in doing so for a particular comment. Thanks.
I don't agree with your argument, but I think that disagreement is really entirely unimportant compared to your personal challenges. You're entitled to your views.
HN doesn't have a mechanism for private messages, or I would have addressed you that way.
To the extent that I would want any public effect from my comment being here, it would only be to remind other people that you're a human being and should be treated with a modicum of kindness. ... unlike that rude commenter who called you an FBI shill. :)
I think even you know the problems with the argument you're making. The problem with power is, it's only good for two things:
1. Using it
2. Using it to get more of it
And its mere existence will make humans do both of those things.
The only solution is not to have it. A weak government is by design, not an accident or an unfortunate side effect. Nothing in the US constitution (or any decently democratic constitution) makes or should make LE's job easy. Why? Because they'll use that power to get more power. That's what humans do.
> The new bill would make it financially impossible for providers like WhatsApp and Apple to operate services unless they conduct “best practices” for scanning their systems for CSAM.
I'm okay with that, as long as e2e isn't actually banned.
As the author states, encrypting your data impedes scanning it, so any encryption that is actually worth anything immediately puts them at fault, so it won't be applied.
Well let's be clear here: I do not expect e.g. the phone system to scan my calls, nor the carriers of my IP traffic to scan my packets. Carriers are not and should not be liable for the traffic they carry.
On the other hand, online "clubs" or services like FB etc. are not neutral carriers, they are content providers or "publishers" who source a lot of their content from the people who use them.
Right or wrong, that's the mental model I'm operating from: you're a carrier and you just take bits from here to there, or you're a kind of club, like a "Sam's Club" but for Internet ( https://en.wikipedia.org/wiki/Sam%27s_club ) and you're responsible for the stuff you stock on your shelves.
Now then, my opinion (such as it is) is that "I won't make enough money" is not a valid reason for not scanning for and preventing CSAM on your internet clubhouse.
Messaging apps like Signal or WhatsApp are not providing content in the way that you describe. They are just carrying messages from one user to another, and would be rendered useless by this bill.
I don't actually know much about WhatsApp, so I can't really speak to that.
Let's talk about FTP instead. If I write and sell an FTP client and/or server I do not expect to be liable for the content the users of that software exchange. This is true whether or not my software supports e2e.
If, however, I set up my own network, invite people to use it, and let them exchange CSAM, then I should be liable. This is true whether or not my software supports e2e.
Yeah, but how are they going to implement that while preserving e2e encryption? They would basically have to either handle all of it on the client, or send along a hash of whatever file was being shared, encrypted with a key Apple controls. This seems ripe for abuse, as it's going to allow for "phoning home" any match of an arbitrary file, and it's not going to be immediately clear if the content being searched for is legitimately CSAM or if it's being abused to search for something else.
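To make the "phone home a hash" variant concrete, here's a rough sketch using standard primitives (the vendor key and helper names are my own illustrative assumptions; I'm not claiming this is how Apple or anyone else actually does it):

    # Client hashes the shared file and encrypts the digest so only the
    # vendor can read it; the vendor matches it server-side against whatever
    # database it likes. Uses RSA-OAEP from the `cryptography` package.
    import hashlib
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # Hypothetical key pair; in practice only the public key ships in the client.
    vendor_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    vendor_public_key = vendor_private_key.public_key()

    def phone_home_report(shared_file: bytes) -> bytes:
        """Compute the file hash and encrypt it for the vendor."""
        digest = hashlib.sha256(shared_file).digest()
        return vendor_public_key.encrypt(
            digest,
            padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                         algorithm=hashes.SHA256(), label=None),
        )

    # The abuse concern: the user cannot tell from the ciphertext *which*
    # hashes the vendor matches against on the server side, CSAM or otherwise.
    report = phone_home_report(b"attached image bytes")

The key point is that the matching happens against an opaque server-side database, so the same plumbing works for any content the vendor (or whoever compels it) wants to look for.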
Who's going to be the first startup out the door to try to sell "EARN IT" compliant encryption and lobby the government to recommend their product for "best practices" implementation?
If you think identity theft is bad now, wait until this gets passed! Not to mention embezzlement, corporate espionage, etc. They'll need to set up a recovery fund in addition.
How long before someone extracts the algorithm and turns it into an application to test images before they are shared? It could even be used as a benchmark for software meant to hide from authorities.
One thing to remember is that "The Net interprets censorship as damage and routes around it." applies regardless of how agreeable the censorship is.
The US regulators are painfully aware of device rooting and how it allows users to subvert any client-side security measures.
Disclosure: I have been dealing with certain state regulators and their morbid fear of forged geo-location data for a number of months. It would be inane to assume other regulators would be any less informed about the threat vector.
No, by definition, allowing a 3rd party to derive any information about the semantics of the content means it's not end-to-end encryption, or not a two-way communication (the 3rd party is assumed to be trusted).
Fully homomorphic encryption doesn't provide an ability to operate on encrypted data and get a decrypted output. It is also absurdly slow, with order-of-magnitude performance of a minute per AND gate in the operation being performed.
But let's forget the terms you used and consider the question of "can fancy crypto do something here?"
A protocol could be created using a zero knowledge proof and a private set intersection that could do the following: I compute the hash of a file, blind it, and then submit it for you to query against a secret database of naughty hashes (Private set intersection / Private information retrieval). Then I encrypt the file, send it, the opened intersection result, and a ZKP that the encrypted file has a hash corresponding to the query.
The server only learns if the encrypted file was a hit on your database, it doesn't even learn the file's hash if it wasn't a hit. If the private intersection scheme is setup right the user doesn't learn if it was a hit or not.
Assuming the naughty-hash database was reasonably small (like tens of thousands of items), and users were assumed to be on very fast smartphones or desktops... then this could actually have workable performance with existing tech -- on the order of tens of seconds of processing on the client, milliseconds on the server.
But this kind of scheme is pointless: I just make a one-bit change to every file I send and it'll never match. You could invoke some kind of fuzzy match, but then false positives are a real problem, the fuzzy hashing is a lot more expensive to perform inside the ZKP (now you need every user on an 8-core desktop), and the fuzzy matching is prohibitively expensive inside the private intersection (so the server side scales poorly).
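To illustrate the one-bit-change point, here's a tiny demo of why exact-hash matching is trivially defeated (illustrative data only):

    # Flipping a single bit yields a completely unrelated digest, so the
    # altered file never matches a database of known hashes.
    import hashlib

    original = bytearray(b"some media file contents...")
    tweaked = bytearray(original)
    tweaked[0] ^= 0x01                          # flip one bit in the first byte

    print(hashlib.sha256(original).hexdigest())
    print(hashlib.sha256(tweaked).hexdigest())  # entirely different digest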
You could go further and make it so that the server could decrypt the entire file if and only if there was a fuzzy match (and still, the user still can't tell if a match happened)-- but even that would create really bad incentives to stuff the database with false-positive producing data (or just loads of legally protected speech which they'd like to covertly monitor). You couldn't make the database public and transparent without distributing the naughty-data yourself and without making it easy for users to self-censor anything that might match.
... and that kind of supercharged scheme is also still easily defeated by just pre-encrypting the data.
So you'd have a system that was absurdly expensive to create, expensive to operate (both for the client and server), extremely non-transparent due to its complexity (even if it was open source) and private database, and would have extremely limited ability to do its job. Users would be subjected to an uncertain and non-transparent level of non-privacy. To me that seems more dystopian than a transparent "we're gonna watch everything you do", at least with a simple surveillance state you know where you stand and you'll refrain from complaining about Dear Leader online, where it might result in you ending up in a prison camp.
The whole discussion misses the point that the real goal of these systems isn't to protect children, stop child porn, etc. (which, as awful as it is, is essentially a rounding error in the risks we face); the real purpose is to subject the population to pervasive whole-take, retroactively accessible surveillance.
Such a shame that this conversation is always so one sided because the 'other' side is not vocal or portrayed clearly in the media.
There is a controlled middle ground where the alternative to end to end encryption isn't a full surveillance state but gives safety and protection to society at the expense of trusting the other side with some of our information.
The majority?? of people want govts/intelligence to monitor, detect and pre-emptively manage:
Terrorism plots
Paedophiles
Large scale criminals - drugs, human trafficking etc.
Kidnapping
Lost person location
Rescue coordination
If that cannot be done without viewing content (and I'm not sure that statement is true) then personally I'm ok to trade some of my theoretical privacy for the security.
Let's face it, how likely is it that the other side would be interested in my cat video content?
But also don't be fooled that encrypted content means your privacy is secure.
Just knowing when and to whom messages are sent is a powerful tool already and not something easily protected.
App providers already exploit this in the surveillance capitalism practiced by the likes of Facebook, Google, etc.
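To make that concrete, here's a toy sketch of how much falls out of bare (sender, recipient, timestamp) records alone -- names and data entirely made up:

    # Metadata-only analysis: no message content is read, yet the contact
    # graph and habits of the participants are immediately visible.
    from collections import Counter
    from datetime import datetime

    # Hypothetical metadata log: who messaged whom, and when.
    log = [
        ("alice", "bob",    datetime(2020, 3, 1, 23, 10)),
        ("alice", "bob",    datetime(2020, 3, 2, 23, 45)),
        ("alice", "clinic", datetime(2020, 3, 3,  9,  0)),
        ("bob",   "alice",  datetime(2020, 3, 3, 23, 30)),
    ]

    # Frequent late-night contact pairs jump out without reading any message.
    pairs = Counter(
        tuple(sorted((src, dst))) for src, dst, ts in log if ts.hour >= 22
    )
    print(pairs.most_common(1))   # [(('alice', 'bob'), 3)]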