Hacker News

The tech required for search and rescue is the same tech required for search and destroy.

We seem to be running headfirst into the autonomous killer bot future without giving so much as a second look at the implications.

Slaughterbots [0] is a pretty good look at what can be achieved by refining and integrating technologies which all exist today.

[0] https://www.youtube.com/watch?v=9CO6M2HsoIA




> The tech required for search and rescue is the same tech required for search and destroy.

That's a good thing. We're going to develop the search and destroy technology no matter what, so we might as well get something beneficial out of it along the way.


We may have wiped out the human race, but at least we got better at search and rescue along the way!


We may have developed ICBMs, but at least we got GPS along the way.


We had combustion engines for a couple hundred years. Then they wiped out the human race.


If the West doesn't develop autonomous killer robots, China, Russia, and others still will.

For all of our flaws I'd still rather that our military was on an even footing with theirs. And I can't see how autonomous killer robots won't simply be better at warfare than non-autonomous machines.


Many people are worrying about the implications. They made the video you linked to. This worry is widespread and is all over pop culture. The Terminator or Matrix movies for example. How regular people can influence weapon development is another problem altogether.


Having worked at Lockheed and seeing just how "Because profits, so FUCK YOU that's why" the defense industry works, I have little faith.

Further, what pisses me off, is how idealistic-yet-naive-and-downright-corruptible many very smart people can be.

Take, for example, the genius nerd who is empowered and elated by the acceptance and praise they might receive for hacking a system, which ultimately results in the death of an anonymous-to-them other human.

This happens in every culture.

Mental exploitation is just as insidious as sexual or physical exploitation.

This is where there is one fundamental thing lacking in all of technology, aside from Asimov: What are our first principles?

A freaking POST-BIOS requirement for any system might ultimately have to be a check-in with a prime directive which tells it that, regardless of whatever code you are running, you can't do XYZ.

We are all guilty of "wouldn't it be cool if" syndrome in tech.

What you can do in 'cool' video games, is not 100% mappable to meat-space.


> This is where there is one fundamental thing lacking in all of technology, aside from Asimov: What are our first principles?

What are you referring to? The Three Laws? AFAIR they exist in order to explore how dumb the idea of them is in the first place. Such high-level rules don't map to reality at all.

A ruleset correctly encompassing human moral values is a complex and hard problem. It's so hard that we haven't even figured out the proper formalisms for it, not to mention that while all humans seem to share some basic moral values, it gets very muddy in the details.


There was a great discussion of that and more related topics on the Waking Up podcast with guest Eliezer Yudkowsky: https://wakingup.libsyn.com/116-ai-racing-toward-the-brink


> The tech required for search and rescue is the same tech required for search and destroy.

Haha, no, no it's not. (I'm very active in SAR -- headed out to a large search this weekend, even.)

We've had "search and destroy" for a decade now, but SAR still doesn't have effective use of drones. Quadcopters can navigate tricky terrain close enough to the ground to see a sufficient amount of detail, but they only get about 25 minutes of flight time.

Fixed-wing drones can get much better flight time, but we can't put them anywhere near close enough to the ground to see the average missing person.

Typically the next thing people suggest is IR. Yeah, IR's great, there's just one problem: there are a lot of IR sources on the ground, even in the wilderness, and people with clothing on aren't that great a source of IR by comparison. At a recent multi-agency training event, I reviewed footage from a repurposed Reaper drone flown by the national guard after the southern California mudslides. Making out the difference between a person on the ground and a streetlamp was impossible.

The only documented drone find I've heard of so far was a person sitting in an orchard. Probably there have been others, but the point is: SAR is a wildly different situation than people imagine.

There was just a study done by DJI and European agencies that found that drones were of negligible benefit in the missing person scenarios they tried: https://www.aopa.org/news-and-media/all-news/2018/october/01...

Helicopters and civil air patrol are superior to drones in almost every way. I've hung out the side of a UH-60 and a couple of Lakotas during exercises looking for brightly colored objects on the ground. At around 100 feet above ground level our success rate was around 60%.

Heck, we had a missing small aircraft, in the snow, a year and a half ago. There were hundreds of searchers on the ground, multiple CAP and helo overhead. Guess who found it? Some folks on snowmobiles a few weeks later.

We use a number called "POD" -- Probability of Detection -- in our reports after every assignment. It's an ass-pull guesstimate at what our odds were of locating a responsive subject, an unresponsive subject, or a clue during the assignment. High POD for a close grid search pattern on the ground in light forest for an unresponsive subject is about 70%. Average POD for an unresponsive subject by a fleet of quadcopters in the same environment would be like 5%. To effectively rule the area out, any decent search manager would deploy a ground team anyway.
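The POD figures above combine in a standard way from search theory: independent passes over the same area miss the subject only if every pass misses, so cumulative POD is one minus the product of the individual miss probabilities. A quick sketch, using the 70% ground-team and 5% quadcopter figures from this comment (the numbers themselves are the commenter's estimates, not measured data):

```python
def cumulative_pod(pods):
    """Cumulative probability of detection over independent searches.

    Standard search-theory combination: the subject is missed only if
    every search misses, so POD_cum = 1 - prod(1 - POD_i).
    """
    miss = 1.0
    for p in pods:
        miss *= 1.0 - p
    return 1.0 - miss

# One close grid search on the ground in light forest (~70% POD).
print(cumulative_pod([0.70]))                 # 0.70

# One quadcopter sweep at ~5% POD barely moves the needle...
print(cumulative_pod([0.05]))                 # 0.05

# ...and even four drone passes leave the area far from ruled out,
# which is why a search manager deploys a ground team anyway.
print(round(cumulative_pod([0.05] * 4), 3))   # 0.185
```

This is why a low-POD resource can supplement a search but can't replace a high-POD one: the miss probabilities multiply, and 0.95 to the fourth power is still about 0.81.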

So, in the first place, this MIT experiment -- while very cool -- still suffers from a lot of the same issues that this technology has in a SAR context. And in the second place, shooting down efforts to develop this technology for civilian use is silly because, well, I hate to break it to ya, but we've been killing people with (mostly autonomous) drones for a while now. That horse is already outta the barn.


> looking for brightly colored objects on the ground. At around 100 feet above ground level our success rate was around 60%.

I've done SAR at sea, and it is frightening how close you can be to who or what you are looking for and still not see them, because of the waves, or reflections, or low cloud/fog, or rain. Then there's the constant movement of everything due to currents and wind, the inaccuracy of GPS and shooting resections... And you can't just have guys out looking: there are only a small number of RIBs or other assets, they have limited endurance, and the sea is very, very big.

SAR is a hard, hard problem, as you say. When I dive for fun now, I take an excessive amount of gear in case I ever need to be found.


How well would sound work? As in listening for the person to yell out to an autonomous vehicle flying above the canopy.

Obviously drones are noisy, so maybe fixed-wing. Or maybe you could suspend a microphone on a cable fifty feet below a quad, with an apparatus that shields noise from above. But there seem to be promising characteristics for this situation too: 1.) the human voice is very different from propeller hum, and 2.) wilderness areas have very minimal background noise. Filter and classify outliers.
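As a toy illustration of point 1, a crude detector could compare spectral energy in a speech band against total energy. This sketch uses synthetic signals; the specific frequencies and band edges are illustrative assumptions, not measurements of real drone or voice audio:

```python
import numpy as np

def band_energy(signal, sr, lo, hi):
    """Fraction of the signal's spectral energy in the [lo, hi] Hz band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1 / sr)
    return spectrum[(freqs >= lo) & (freqs <= hi)].sum() / spectrum.sum()

sr = 8000
t = np.arange(sr) / sr  # one second of audio

# Stand-in "propeller hum": a low fundamental plus harmonics.
hum = sum(np.sin(2 * np.pi * f * t) for f in (80, 160, 240))

# Stand-in "shout": energy spread across the speech band.
rng = np.random.default_rng(0)
shout = sum(np.sin(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi))
            for f in (500, 900, 1400, 2200))

# A detector could flag windows where speech-band energy dominates.
print(band_energy(hum, sr, 300, 3000))    # near 0
print(band_energy(shout, sr, 300, 3000))  # near 1
```

Real audio would need windowing, hum harmonics that reach into the speech band, and a proper classifier, but the low-background-noise wilderness environment is what makes even this simple energy ratio plausible as a first-pass filter.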

I say this as someone whose only SAR situation was getting himself out of the Vermont woods at dusk by hiking roughly a mile towards the sound of a barking dog.


People have discussed that. A flying thing that could be used to relay instructions to lost parties would be helpful in a few situations.

I think search organizations are certainly interested in UAVs, but at the moment nobody's sure how to use them effectively. It kinda strikes me as being like blockchain: cool technology, everybody wants to use it, but there aren't really that many real-world use cases where something else isn't just as good or better.

Probably the highest priority should be fixing their flight time. 25 minutes is just not enough time in the air to be useful, and fixed-wing is too tricky to navigate in too many kinds of terrain. If someone figures out how to start approaching 60 minutes in the air with a payload -- speaker & microphone, IR, comms -- that would probably start getting more attention.

Another thing that would be really cool is an airborne minicell. Just about everyone's carrying a smartphone now, they're just going into areas where there's no signal. If something could go out there offering signal, and report back when a device tries to connect to it, it wouldn't necessarily need to get close to the ground and that could really narrow down some of our searches.


You can attach weapons and AI to anything, it’s not a reason to stop progress.

Morality has kept nuclear weapons in check (so far, anyway); I'm sure it will keep AI drones in check too.


You can't have a MAD scenario with drones. Drones are less like nuclear weapons and more like AK-47s. If they're available, they'll be used: by states, guerrillas, warlords, and organized crime alike.


I think there is a big difference though between a nuclear weapon, which is very hard to procure unless you're a nation-state, and a programmable drone, which can be procured at Best Buy by anyone. Mutually Assured Destruction has kept nation-states in check in terms of nuclear weapons, yes. There is no such barrier to individuals purchasing drones, which have many purposes, unlike a nuclear weapon. There are many more crazy individuals in the world that can afford cheap drones than there are crazy nation-states willing to use their nuclear weapons against a foreign population. You can retaliate against a nation-state because it will be obvious who it was. Missiles are easily trackable and immediately detectable upon launch, unlike small hobby-sized drones.


The important thing is, you can't really do a MAD scenario with drones. MAD requires the destruction to be assured - e.g. once missiles are in the air, everyone's done. The destruction is immediate and can't be stopped by last-minute negotiations; there's no option of attacking "only a little" with nukes.


> The destruction is immediate and can't be stopped by last-minute negotiations; there's no option of attacking "only a little" with nukes.

Of course there is: you could attack only battlefield targets, or only strategic military targets, or only one of the enemy's cities. MAD requires two things: first, the technology for mass destruction, second, the geopolitical conditions where either side can credibly threaten complete annihilation if provoked. Nuclear weapons provide the first by themselves, but the second only incidentally: the nuclear powers have come to a credible mutual understanding that they won't use nukes in their various border skirmishes, and that any use of nukes represents an escalation which risks MAD.

World leaders could hypothetically come to the same understanding regarding drones: use them anywhere in the world and you will start a nuclear war.


> World leaders could hypothetically come to the same understanding regarding drones: use them anywhere in the world and you will start a nuclear war.

Yes, except for one big difference that I see: individuals can't purchase a nuke like they can a drone. So while the MAD concept works with nukes, because only nation-states have them, the same isn't true for drones, where individuals can have an unlimited number in addition to governments. World leaders can agree re: drones and attacking each other, but that won't stop a determined individual.

I think people might be quite surprised what you can do with a consumer-grade drone purchased for a few hundred bucks hooked up to lidar, multiple GPS sources, swarm technology so they communicate with each other, and object recognition. Now just rig up an explosive device, chemical weapons, or a gun. You can even drop items (bombs) or pick items up using retractable claws. You can do an awful lot with a consumer/hobby-grade drone hooked up to some cheap sensors using something even as basic as an Arduino with open-source software like ArduPilot, which is so advanced (and cool) even NASA uses it.


> Morality has kept nuclear weapons in check (so far anyways)

Morality and logistics. The major nuclear powers have implemented a relatively successful program to make it very difficult for a non-nuclear power to develop nuclear weapons.

> I’m sure it will keep AI drones in check too.

There is simply no way to use logistics to curtail AI drones in the same manner as nuclear weapons; all that is needed to make a drone is a 3D printer, common industrial components, and software.


Fortunately, for an AI drone swarm to be as destructive as a decent-sized nuke, you would need millions of them. Unfortunately, an AI drone swarm is dangerous precisely because it can be less destructive than a nuke: it could kill all the people in a city but leave infrastructure completely intact for the attackers to use. This future is not here yet, but it is coming, and it is good that people are starting to talk about it.


Do you need that size of destruction to keep people from leaving their homes? Or just the fear that rogue autonomous drones are flying around with a .22 mag attached? Wouldn't there be some economic impact from such a scenario?


Maybe for some small period of time, but not for long. Americans, and I think people in general, can be quite adaptive. In Texas enough people would just start carrying around shotguns for the chance to be the one to shoot it out of the sky. In San Francisco, who knows, maybe some counter drone tech would be quickly developed. A large society can absorb hundreds of deaths a day without too many problems if it knows the problem is not going to escalate exponentially.


Perhaps you are right. Texas certainly would adapt rather quick.

That said, some cities are already so messed up, that cops won't respond to 911 calls in some areas. I would expect that to get worse should there ever be "thugs with drones".


> all that is needed to make a drone is a 3d printer, common industrial components and software.

This could only happen in a very distant future. As it stands, firing a gun from a 3D-printed drone would make it fall out of the sky. Even if it didn't, it would be laughably inaccurate. And even then, it could be easily defeated with birdshot.


The barrier to entry for nukes is a bit higher than for drones. There are already a myriad of drone hobbyists doing very impressive things at very impressive ranges. RasPis are getting cheaper and more powerful. I can imagine street gangs renting drones from geeks for all manner of newsworthy events.

[Edit] Did I just create a DarkWeb business model?


> stop progress

More people thinking about what they're doing wouldn't be "stopping progress", it would be progress.




