Hacker News

Vinge is a retired math professor, I.J. Good was an accomplished mathematician, Kurzweil makes hard-to-swallow predictions but is still an accomplished technologist, and I happen to very much enjoy Vinge's writing. Regardless, the idea is worth considering independent of who is saying it.

I am also sure you know that words - such as variety and polymorphism - have different context-specific meanings. Singularity in this case is as in the kind of thing you can find on a variety but not on a manifold.

The idea of infinite recursive Moore's law fueled intelligence explosions leading to super human intellects by 2030 is something I assign a low probability to. I don't find it hard to believe that there is some point in the future - say 2131 - such that if anyone alive today or previously were transported there, they would never be able to understand what was going on and everyone from that time would think circles around them.



Maybe you didn't realize this, but the science fiction author bit wasn't meant as slander. Many high-quality people, Kurzweil among them, are science fiction authors.

What I was getting at was "you realize they're writing books to make people happy for money, not doing legitimate science on that day, right?"

.

"words - such as variety and polymorphism - have different context specific meanings."

Sure. All handwaving about the rules of language notwithstanding, though, none of The Singularities has merit or an underlying measurement, even if you want to talk syntax and grammar to create an appearance of academia by proxy.

.

"The idea of infinite recursive Moore's law"

... is nonsense. What would "recursion" be in the context of Moore's law? Have you even thought this over?

What, Moore's Law solves itself by going deeper into itself until the data structure is exhausted?

.

"fueled intelligence explosions"

The science fiction part. I mean, you might as well say "fueled by warp drives," because there's no evidence those are going to happen either. Or unicorns.

.

"is something I assign a low probability to."

This suggests that you don't know what probabilities are. Probabilities are either frequentist, which cannot apply here because we have no knowledge of the rates involved (this would be like calculating the frequentist probability of alien life - it's just making numbers up), or Bayesian, where you draw probabilities from observed events, at which point the probability is exactly zero.

So, is it undefined or zero that you're promoting?

.

"I don't find it hard to believe that there is some point in the future - say 2131"

(rolls eyes)

.

"they would never be able to understand what was going on and everyone from that time would think circles around them."

It seems you don't even need to be transported into the future for that.


1.) Okay.

2.) Singularity as in breakdown not as in single. You purposely muddled the meaning to make your quip work.

3.) Moore's law fueled as in AI gets interest on their intelligence. Recursive as in AI makes smarter AI makes smarter AI...

4.) Science fiction or not, I find it unlikely.

5.) Bayesian. Look up prior.

6.) 2131 was tongue in cheek.

7.) Thanks ;) You actually never address my main point though.


"2.) Singularity as in breakdown not as in single. You purposely muddled the meaning to make your quip work."

This is a blatant falsehood. I have solely and exclusively used it as the title of Kurzweil's concept. It has no meaning; it's a name. I have muddled nothing. It is inappropriate for you to make accusations like this without evidence.

.

"3.) Moore's law fueled as in AI gets interest on their intelligence"

Yes, that's what I said at the outset: this whole thing is driven by the false belief that intelligence is a function of CPU time. There is no experimental evidence in history to support this, and there are 65 years of counter-examples.

Repeating it won't make it less wrong.

.

"Recursive as in AI makes smarter AI makes smarter AI..."

Oh.

This gets to a different false presumption, namely that both the ability to create an intelligence and the power of the intelligence created are linear functions of the prior intelligence.

This whole treating everything like it's a score, like it's a number you tweak upwards? It's crap.

You can't make an AI with an IQ of 106 just because you have a 104, and the guy who made the 104 had a 102.

This is numerology, not computer science.
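The 102 → 104 → 106 chain can be written down explicitly as a toy model. To be clear, the linear `next_generation` step below is exactly the disputed assumption, not an established result, and every name and number is illustrative:

```python
# Toy sketch of the "recursive self-improvement" model under debate.
# The key (disputed) assumption is that each generation's ability to
# build a successor is a simple increasing function of its own score.

def next_generation(iq: float, increment: float = 2.0) -> float:
    """Assumed linear model: each AI builds a successor 'increment' smarter."""
    return iq + increment

def run_chain(start_iq: float, generations: int) -> list[float]:
    """Run the assumed chain: AI makes smarter AI makes smarter AI..."""
    chain = [start_iq]
    for _ in range(generations):
        chain.append(next_generation(chain[-1]))
    return chain

print(run_chain(102.0, 2))  # [102.0, 104.0, 106.0]
```

Spelled out this way, the model's entire content is the hard-coded increment: nothing in it explains where the increment comes from.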

.

"4.) Science fiction or not i find it unlikely."

I can't even tell what noun you're attached to, at this point. What do you find unlikely?

.

"5.) Bayesian. Look up prior."

What about Bayesian, sir? I don't need to look up "prior"; I used it, correctly, in what I said to you. You're just telling me to look things up to pretend that there is an error there, so that you can take the position of being correct without actually having done the work.

There are zero priors of alien life, sir. That was my point in bringing up what you're now blandly repeating back at me, one word at a time, in your effort to gin up a falsehood where none actually exists.
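For concreteness, the machinery both comments are invoking can be sketched with a standard conjugate Beta-Binomial update; whether the answer comes out zero depends entirely on the prior chosen, which is what this exchange reduces to. The parameters below are illustrative, not measurements of anything:

```python
# Beta-Binomial sketch of a Bayesian update with zero observed events.
# A Beta(a, b) prior updated on k successes in n trials gives the
# posterior Beta(a + k, b + n - k); its mean is (a + k) / (a + b + n).

def posterior_mean(a: float, b: float, k: int, n: int) -> float:
    """Posterior mean of a Beta(a, b) prior after k successes in n trials."""
    return (a + k) / (a + b + n)

# With a uniform Beta(1, 1) prior and zero successes in 10 trials,
# the posterior mean is nonzero (1/12), driven purely by the prior:
print(posterior_mean(1.0, 1.0, 0, 10))  # 1/12 ≈ 0.0833
```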

.

"7.) Thanks ;) You actually never address my main point though."

You don't appear to have one.

Maybe you've forgotten that you were replying to someone else, who already said that to you?



