
Does anybody else feel glued to the back of their seat by the accelerating centrifugal forces of the singularity?



I think tidal forces are a better analogy. As change accelerates, basically any pre-existing organisational structure will feel tension between how reality used to be and how reality is.

Things will get ripped apart, like the spaghettification of objects falling into a black hole.


"Tidal" implies cyclical, so no. This is accretive in a sense, but not in any gradual way.


I am not referring to tides, but to the tidal forces that generate them. Tidal forces are the result of a gradient in the field of gravity.

When you are close to a black hole, the part of you that is closest experiences a stronger force of gravity than the rest of your body. This tension rips things apart. Likewise, some parts of our civilization will be more affected by AI than others, causing change there to accelerate. This causes tension with the rest of civilization.
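
For the curious, a minimal sketch of the physics behind this (the standard Newtonian tidal approximation, assuming a point mass M and a small body of extent \Delta r at distance r, none of which is spelled out in the thread): the difference in gravitational pull across the body is roughly

  a_{tidal} \approx \frac{2GM}{r^3} \, \Delta r

which is just the gradient of the field g = GM/r^2 multiplied by the body's extent. Because it grows as 1/r^3 rather than 1/r^2, it becomes overwhelming near compact objects, which is what drives spaghettification.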


> Tidal forces are the result of a gradient in the field of gravity.

Thanks for clarifying that you aren’t referring to the cyclical temporal variation.

I see your point. AI advances are not accessible or “felt” the same way across civilizations, cultures, industries, nations, or people.


Centrifugal "forces" also imply cycles. The other commenter is correct that it's tidal forces, not tides.


One of the 15 or so "risks" that OpenAI supposedly tested for[1], listed below things like "Hallucinations" and "Weapons, conventional and unconventional," was "Acceleration."

I thought this was a really interesting topic for them to cover. The section devotes exactly one paragraph to it, saying they're still working on it. Guess it wasn't, uh, much of a concern for them...

[1] https://cdn.openai.com/papers/gpt-4-system-card.pdf


I had to laugh about this:

"taking a quieter communications strategy around the GPT-4 deployment (as compared to the GPT-3 deployment)"

...maybe if people don't notice we're deploying something potentially dangerous... Full marks for effort, but are they serious?


Can you name an innovation that had no danger?


If you promote it less you decrease its "acceleration" dangers, I guess.


I've admittedly had very little free time these days, but as someone who's trying to get caught up with the field, I feel like it moves faster than I can keep up with.


I work with AI PhDs who have the same complaint.


no



