
I find it truly fascinating that "a machine learning company doesn't want its powerful tool to be weaponized for bigoted ends" and "citizens who follow major media expect that media to treat weaponized AI as a bad thing" are what make our times sad.

From my perspective, a ChatGPT in the hands of the worst of our society, pumping out endless bigotry and propaganda on Telegram, WhatsApp, Instagram, Twitter, etc., would be a far sadder time.

Imagine how powerful a hate machine you could create by wiring HateGPT up to a Twitter bot that can reply. Apparently, preventing this makes our times sad.

Honestly, we're at a time when weaponized chatGPT is powerful enough to easily topple most democratic nations. It could control the outcome of elections, if weaponized sufficiently.




>Honestly, we're at a time when weaponized chatGPT is powerful enough to easily topple most democratic nations. It could control the outcome of elections, if weaponized sufficiently.

Unless chatGPT is granted voting rights, it literally can't. If the majority of people vote for something and those people are all legally registered voters in the place where they vote and the votes are being tallied in a fair and accurate way, then there's nothing undemocratic about that election.


As I understand it, GP is talking about ChatGPT running a fine-tuned propaganda campaign, replacing a troll farm with a single machine, deceiving and swaying people toward a different vote, and thus disrupting the election.

If so, then I'm skeptical of the claim - a machine could (though I'm not even sure of this) lower the cost of running a troll or scam farm, but it's not as if government-run farms like that are suffering from budget issues.


> Unless chatGPT is granted voting rights, it literally can't. If the majority of people vote for something and those people are all legally registered voters in the place where they vote and the votes are being tallied in a fair and accurate way, then there's nothing undemocratic about that election.

Many democracies have voted for a dictator who then ended their democracy. Obviously, a perfectly democratic election can end a democracy.

Given the opportunity, a weaponized ChatGPT could dominate online discussion by play-acting as thousands of different personas, write mailers customized to each individual recipient, and overwhelm all current methods of politicking, easily winning an election.

Much as in IT security, humans are the biggest weakness. Weaponized AI has hit the point where it has a sufficient understanding of our psychology, can be prompted to exploit it, and can thus functionally control us at the herd level, even if a special unique few swear they're above it.


> Honestly, we're at a time when weaponized chatGPT is powerful enough to easily topple most democratic nations

If something as important as this is that fragile, what's the plan to fix and strengthen it? Is there anything serious, better than turning a blind eye and pretending the issue doesn't exist in the hope that only the "good" parties will ever have such technology?



