
> Who is we exactly?

When I read these kinds of threads, I believe it's "enthusiast" laypeople who follow the headlines but don't actually have a deep understanding of the tech.

Of course there are the promoters who are raising money and need to frame each advance in the most optimistic light. I don't see anything wrong with that; it just means there will be a group of techie but not research-literate folks who almost inevitably become promoters themselves and talk about how such-and-such headline means a big advance is right around the corner. That is what I believe we're seeing here.



Can someone please explain like we are fifteen why AGI is impossible, at least right now? Or if not AGI, then something similar to a cat/etc mind?

As far as I imagine it, current models are pipelines of various trained networks (with more traditional filters in the mix) that operate like request-reply. Why can't you just connect a few different pipelines in a loop/graph and make an autonomous, self-feeding entity? By different I mean not looping GPT back to itself, but different like object detection from a camera vs. emotion estimation from a picture based on some training data. Is it because you don't have data on what is scary or not, or for a completely different reason?
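Mechanically, the loop the question describes is easy to wire up. Here's a toy sketch: every name and behavior below is an invented stand-in (no real vision or emotion models), just request-reply functions passing a shared state dict around a cycle so the outputs feed back as the next inputs.

```python
from typing import Callable, Dict, List

# A "pipeline" in the question's sense: state in, state out (request-reply).
Pipeline = Callable[[Dict], Dict]

def object_detector(state: Dict) -> Dict:
    # Stand-in for a vision model: camera frame -> detected objects.
    state["objects"] = ["cat"] if state.get("frame") == "cat_frame" else []
    return state

def emotion_estimator(state: Dict) -> Dict:
    # Stand-in for an affect model: detected objects -> "emotion".
    state["emotion"] = "curious" if "cat" in state.get("objects", []) else "neutral"
    return state

def planner(state: Dict) -> Dict:
    # Stand-in for a policy: the chosen action changes what the
    # detector sees next, closing the loop.
    state["frame"] = "cat_frame" if state.get("emotion") == "neutral" else "empty"
    return state

def run_loop(pipelines: List[Pipeline], state: Dict, steps: int) -> List[Dict]:
    # Push the state around the cycle of pipelines, recording each pass.
    trace = []
    for _ in range(steps):
        for p in pipelines:
            state = p(dict(state))
        trace.append(state)
    return trace

trace = run_loop([object_detector, emotion_estimator, planner],
                 {"frame": "cat_frame"}, steps=3)
```

Nothing stops this loop from running forever; with these toy stand-ins it just oscillates between "curious" and "neutral". The replies below get at why this isn't the hard part: the difficulty is not wiring pipelines together but that we don't know what the components would need to compute for the whole to constitute intelligence.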


> Can someone please explain like we are fifteen why AGI is impossible, at least right now

Probably not. I did attempt to give a high-level explanation in another comment, but I think there is this naive belief that complex problems can be distilled into terms that laymen can understand. The problem is so complex and so ill-defined that even experts argue about it. I'm not sure there's a good "explain it like I'm an undergrad who's taken ML courses" explanation that can be concisely summed up in an HN comment.


>but I think there is this naive belief that complex problems can be distilled into terms that laymen can understand

Naive people like Richard Feynman, who said that if you can't explain an idea to an 8-year-old, you don't understand it? Can you tell us why you think Nobel Prize winner Richard Feynman is naive?


He isn't; you are. Let's look at some other Feynman quotes which we can actually attribute (I can't find a real source for your claim, though I've also heard it attributed to Einstein).

> Hell, if I could explain it to the average person, it wouldn't have been worth the Nobel prize.

> I can't explain [magnetic] attraction in terms of anything else that's familiar to you. For example, if I said the magnets attract like as if they were connected by rubber bands, I would be cheating you. Because they're not connected by rubber bands … and if you were curious enough, you'd ask me why rubber bands tend to pull back together again, and I would end up explaining that in terms of electrical forces, which are the very things that I'm trying to use the rubber bands to explain, so I have cheated very badly, you see.

> we have this terrible struggle to try to explain things to people who have no reason to want to know. But if they want to defend their own point of view, they will have to learn what yours is a little bit. So I suggest, maybe correctly and perhaps wrongly, that we are too polite.

My best guess is that this misattribution comes from a quote ABOUT Feynman and a misunderstanding of what is being conveyed.

> Once I asked [Feynman] to explain to me, so that I can understand it, why spin-1/2 particles obey Fermi-Dirac statistics. Gauging his audience perfectly, he said, "I'll prepare a freshman lecture on it." But a few days later he came to me and said: "You know, I couldn't do it. I couldn't reduce it to the freshman level. That means we really don't understand it." - David L. Goodstein

Which doesn't mean what you're using your "quote" to mean. As I stated before, we (the scientific community) don't even know what intelligence is. We definitely don't understand it, so I'm not sure how you'd expect us to explain it to an 8-year-old. Lots of things can't be explained to an 8-year-old. Good luck teaching them Lie algebras or gauge theory. You can have an excellent understanding of these advanced topics, and the only way that 8-year-old is going to understand them is if they are a genius prodigy, well beyond a layman. This quote is just illogical, used only by people who think the world is far simpler than it is and are too lazy to actually pursue its beauty. They only want to sound smart, and they only will to those who know less than they do.

Stop saying this and get off your high horse and hit the books instead.

Maybe Feynman was right: we're being too polite. There are a bunch of people in this thread who are not arguing in good faith, pretending to be smarter than the experts, and performing mental gymnastics to prove it (you are one, but not alone). If an expert is telling you that you are using words wrong, then you probably are. Don't just assume you're smarter than an expert. You don't have the experience to justify that ego.

https://en.wikiquote.org/wiki/Richard_Feynman

https://www.sciencealert.com/watch-richard-feynman-on-why-he....


You will be hard-pressed to have an interesting discussion about AGI because the way the term is defined makes it uninteresting. It's like trying to have a discussion about aviation by asking how close we are to flying exactly like birds. It's not really relevant to our ability to design good planes.

Then any discussion will be highly handicapped by the fact that most people still view human intelligence as something special. The field is still very much awaiting its urea-synthesis moment.


I don't view it as such, and am open to anything experts could suggest: minds much further apart than a cat and a human, not even on the line where those two reside, but different on a sci-fi level. The fact that human intelligence is seen as special bothers me too.


The gorilla has a brain 1/3 the size of a human brain with a very similar evolutionary history.

The sperm whale has a brain that is several times larger than ours.

What do you do differently in your AGI design to get a human, gorilla, or whale brain?


I'm not sure of the point of this question: a brain larger than a human's doesn't matter much if the extra size isn't going to the right places. From what I can tell, what matters is the size of the portion dedicated to higher cognition (the cerebral cortex) and the "power" of the neurons it contains (humans spend relatively more energy on their brain than an animal with a brain of the same size does). The answer to your question (how to make an AGI dumber than a human) seems to be: fewer neurons in total, fewer neurons doing higher cognition, and less powerful neurons.


Cool, make it happen. Don’t know how? Neither does anyone else. That’s why it’s not happening right now.


I spent a month studying AGI at a hobby level, and the answer, from what I can tell, is that it isn't possible right now because we can't even model how human-level intelligence works. There are also several ways to approach developing AGI, and we don't really have a good idea of which approach is best. I'm not convinced current human-level intelligence is enough to ever figure out this model, but I believe future genetically modified humans may have a much better chance.


No one can explain why AGI is impossible because you can't prove a negative. But so far there is still no clear path to a solution. We can't be confident that we're on the right track towards human-level intelligence until we can build something roughly equivalent to, let's say, a reptile brain (or pick some other similar target if you prefer).

If you have an idea for a technical approach then go ahead and build it. See what happens.



