
> - each timestep, self-replicate the most recently made objects twice each, and append true, false to the array

> - after N timesteps, every possible set of values for the boolean variables will be in memory

Creating that takes O(2^N) time, hence this algorithm is not polynomial. It's especially not linear, as claimed.
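
For concreteness, here is roughly what that timestep loop looks like on a single machine (a minimal sketch; the variable names are mine, not the parent's):

   N = 20
   assignments = [[]]                       # timestep 0: one empty assignment
   for _ in range(N):
     next_gen = []
     for partial in assignments:            # "self-replicate" each object twice,
       next_gen.append(partial + [True])    # appending true to one copy...
       next_gen.append(partial + [False])   # ...and false to the other
     assignments = next_gen
   print(len(assignments))                  # 2**N == 1048576

The loop body runs 2 + 4 + ... + 2^N times in total, so the work is O(2^N) no matter how few stack frames a debugger reports.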




I disagree. Did you run debugging tools to check how many frames this code takes to execute? On what basis are you claiming it takes O(2^N)? You are just asserting that, but my code is running in time O(N) according to the debugger, so I'm not sure why you are claiming that


"In computer science, big O notation is used to classify algorithms according to how their running time or space requirements grow as the input size grows" - https://en.wikipedia.org/wiki/Big_O_notation

It does not refer to the number of high-level execution frames, which appears to be what you measured.

Consider that in Python:

   a = []
   for i in range(500_000):
     a.append(i)
   a.reverse()
takes about 0.1 seconds, while

   a = []
   for i in range(500_000):
     a.insert(0, i)
takes about a minute to produce the same result. That's a factor of 600, even though the first one actually executes more Python code.

This is because the first uses an amortized O(N) method, while the second uses an O(N^2) method.

Hence, you need to measure run time, not the number of frames.
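
If you want to check the numbers yourself, here is a minimal timing harness (absolute times will vary by machine):

   import timeit

   def fast(n=500_000):
     a = []
     for i in range(n):
       a.append(i)      # amortized O(1) per call
     a.reverse()
     return a

   def slow(n=500_000):
     a = []
     for i in range(n):
       a.insert(0, i)   # O(len(a)) per call: shifts every existing element
     return a

   print(timeit.timeit(fast, number=1))
   print(timeit.timeit(slow, number=1))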

I don't need to try it out because you wrote "every possible set of values for the boolean variables will be in memory", which requires O(2^N) values in the worst case, and O(2^N) time for the CPU to touch each of those memory cells.

Also, try something with 10,000+ variables. Or one of the benchmark test sets from http://www.satcompetition.org/ .


Thanks for this. After much research I have concluded that your reply here is accurate and my previous reply contained either misleading or wrong information. Your comments about space and high-level execution frames are on point imo. Thanks for taking the time to explain your thinking here, I really appreciate it!

I only care about the execution time for my research purposes. See this Math SE post [1] where I explain this algorithm as a system of people carrying out instructions. Curious for your thoughts?

Imagine there is a party with an infinite number of mathematicians at it, and I walk up to my friend with the following instructions:

"every 1 minute, find 1 person at the party who has not been told this message yet, and repeat this message to them"

It's clear that after N minutes, provided these instructions are followed exactly, there will be 2^N people who have heard this message.

I will now describe an algorithm which can solve the boolean satisfiability problem in N variables in N minutes.

1 - find 2 people at the party, hand one of them a paper that says "True" and another a paper which says "False" and has a boolean expression on it

2 - tell them these instructions: "in 1 minute, find 2 people at the party who have not been told this message yet, and repeat this message to them; give 1 of these people your paper but add True to the list; find a new piece of paper and copy over the list + expression from the paper I gave you, but add False to the list"

3 - after N minutes, whoever has a piece of paper in their hands has to compute the solution of the boolean expression (we are assuming, since all the people at the party are mathematicians, that they can all check this in less than 30 seconds) and if it's solvable they have to shout out "ITS SOLVABLE EVERYONE"

Then given any boolean expression involving N variables, I can use this algorithm to solve the boolean satisfiability problem in N minutes and 30 seconds

[1] https://math.stackexchange.com/q/3605352/1417
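
To make the proposal concrete, here is a single-machine sketch of the party procedure (the example expression and the satisfies() check are illustrative stand-ins, not part of the original post):

   def satisfies(expr, assignment):
     # stand-in for "each mathematician evaluates the expression in 30 seconds"
     return expr(*assignment)

   def party_sat(expr, n_vars):
     papers = [()]                    # minute 0: one paper with an empty list
     for _ in range(n_vars):          # each minute, every paper splits in two
       papers = [p + (v,) for p in papers for v in (True, False)]
     return any(satisfies(expr, p) for p in papers)   # 2**n_vars checks

   print(party_sat(lambda x, y: x and not y, 2))      # True: x=True, y=False

Note that on one CPU this loop does O(2^N) work; the N-minute wall-clock bound only holds if the number of workers really doubles every minute.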


"in 1 minute, find 2 people at the party who have not been told this message yet"

What if they are not able to do so?

Let's suppose the party has 1 million people, and you start in the center.

At first it's easy to find 2 people. As the instructions spread out, it gets harder and harder, because the number of new people you can reach within a minute goes from quadratic towards linear.

"there will be 2^N people who have heard this message"

Big-O complexity assumes an upper bound exists on the amount of compute power available. If the number of processors can grow exponentially and without limit, then it's outside of the usual P vs. NP question - many problems become linear if an unbounded amount of CPU power is available and can scale exponentially.


> What if they are not able to do so?

Good question! Since you have already corrected one of my mistakes, I don't want to start off by disagreeing lol

In my head, this is just a silly word problem, the purpose of which is to show that the mathematical theory of computation requires a new axiom

Here is a map which describes my idea:

Party with infinite number of mathematicians --> theory of computation

a mathematician --> Turing Machine

getting a new piece of paper --> a new axiom that gives Turing Machines a new self-replication feature

According to this new theory of computation which I am trying to invent, and which I have named The Theory of Self-Reproducing Machines, it would not be an issue that "the number of new people you can reach within a minute goes from quadratic towards linear"

As a counterexample to your idea: consider Twitter and the abstract concept of a social network graph, where it's quite possible to send messages instantly for all practical purposes

Curious for your thoughts


If you have access to an infinite number of CPUs then every problem can be solved in finite time. That conclusion is clearly obvious (excepting perhaps problems which involve transfinite numbers, in which case you need to pin down which infinity you mean).

In the real world, communication is limited to the speed of light.

If your message is going out to the "infinite number of mathematicians", then after time T your message has reached a volume of 4/3 π (cT)^3 of space, which is polynomial in T. You therefore cannot contact an exponential number of mathematicians, which is what you would need in order to demonstrate the trivial reduction of an NP problem into P.
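
To put numbers on that (the one-mathematician-per-cubic-meter density is an arbitrary assumption of mine):

   from math import pi

   c = 3.0e8                              # speed of light, m/s
   for minutes in (1, 60, 120, 150):
     radius = c * 60 * minutes            # farthest the message can have traveled
     reachable = 4 / 3 * pi * radius**3   # volume swept; ~people at 1 per m^3
     needed = 2.0**minutes                # people the doubling scheme requires
     print(minutes, f"{reachable:.2e}", f"{needed:.2e}")

Somewhere before the 150-minute mark, the required head count exceeds everyone inside the light sphere.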

See https://en.wikipedia.org/wiki/Limits_of_computation for other real-world limitations.

You mention "twitter ... where it's quite possible to send messages instantly for all practical purposes."

There are fewer than 10 billion people in the world. That's a finite number. Even a quadratic problem remains quadratic if there is a bounded number of compute resources. (Any finite number of people/CPUs runs into the same problem, even if the entire galaxy were populated by humans and/or computronium.)

Consider https://en.wikipedia.org/wiki/Ramsey%27s_theorem#Ramsey_numb... with the famous description attributed to Erdős:

> Erdős asks us to imagine an alien force, vastly more powerful than us, landing on Earth and demanding the value of R(5, 5) or they will destroy our planet. In that case, he claims, we should marshal all our computers and all our mathematicians and attempt to find the value. But suppose, instead, that they ask for R(6, 6). In that case, he believes, we should attempt to destroy the aliens.

Even with the entire mathematical population of the world working on it, Erdős doesn't think R(6, 6) is solvable in reasonable time. Now try solving R(n, n). Can it really be solved in linear time even if the entire world becomes proficient in mathematics?
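
A quick sense of the brute-force cost behind that quote: checking R(n, n) directly means examining every 2-coloring of the edges of K_n. (18 is the known value of R(4, 4); 43 and 102 are the known lower bounds for R(5, 5) and R(6, 6).)

   from math import log10

   for n in (18, 43, 102):
     edges = n * (n - 1) // 2     # edges of the complete graph K_n
     digits = edges * log10(2)    # 2**edges colorings to examine
     print(f"n={n}: 2**{edges} colorings, about 10^{digits:.0f}")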


How interesting. You have given me quite a lot to think about. I am still unable to get past the following, but I will continue to consider everything you have said in our discussion very carefully.

Issue: I carried out theoretical physics energy calculations on the work done by cells as a result of cell self-replication in the human nervous system (recall the nervous system primarily functions to send signals at ~the speed of light from one part of the body to another)

The thing I am unable to get past is arguments of the type you made, e.g. when you stated:

> "There are fewer than 10 billion people in the world. That's a finite number. Even a quadratic problem remains quadratic if there is a bounded number of compute resources"

The issue is: the human body does not completely fall apart; it all works fine, and there are no issues like this. There is some kind of bound from being a physical object in the real world, but physically, and quite practically for all purposes, the human nervous system is capable of expending an exponential amount of energy

Question: Since the human body does not fall apart despite the fact that it's doing cell self-replication, and given how theoretical physics shows there truly is an exponential level of energy expended during cell replication, how are these types of issues that you raise, quite legitimately from a pure-math/theoretical comp-sci perspective, not also a major issue for the human nervous system?

Thanks again! This discussion has been enlightening


The nervous system does not work at the speed of light. https://en.wikipedia.org/wiki/Nerve_conduction_velocity says:

> Nerve impulses are extremely slow compared to the speed of electricity, ... however, it is very fast compared to the speed of blood flow, with some myelinated neurons conducting at speeds up to 120 m/s (432 km/h or 275 mph).

> the human nervous system is capable of expending an exponential amount of energy

I don't know what that means. Exponent in what dimension? Time? What's the exponent? Do you have any demonstrable evidence?

At some point the human nervous system will be limited by the amount of food needed. Tour de France cyclists consume some 7000 food calories per day. Another limit is the need to keep cool, because that consumed power must go somewhere.

> given how theoretical physics shows there truly is an exponential level of energy expended during cell-replication

Here again you'll need to explain what you mean. 1.0 is an exponent, so in some sense you are trivially correct. But I strongly suspect that's not what you meant.

I think you mean that at the beginning there is 1 cell, the zygote, then 2, then 4, etc. But this exponential growth phase is short. It must be, otherwise it would quickly consume all available energy in its environment.
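
As a back-of-envelope check on how short that phase is (the cell-count figure is a commonly cited estimate, not from this thread):

   from math import log2

   cells_in_adult = 3.7e13        # rough estimate of cells in a human body
   print(log2(cells_in_adult))    # ~45: an adult is only ~45 doublings from one cell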

> not also a major issue for the human nervous system?

Again, I don't follow.

I am not aware of humans being able to solve NP-hard problems in polynomial time. We certainly can solve some NP-hard problems when N is small, but how do you know it scales polynomially?


> The nervous system does not work at the speed of light

I think it's ok to call the limiting maximum velocity for the transmission of signals, by definition, "the speed of light in the human body". Your point is well taken about the actual speed of light and how fast the nervous system works

> I don't know what that means. Exponent in what dimension? Time? What's the exponent? Do you have any demonstrable evidence?

I meant exponential in time, and yes, I have evidence: theoretical physics energy calculations I did on the work done, according to the principles of theoretical physics, as a result of cell self-replication. (Consider a single particle to have kinetic energy 1/2 mv^2, and then allow for self-replicating particles.) What I did was read a lot of biology textbooks and look at the experimental evidence of how the human body actually works. Hopefully I didn't make any major mistakes in these calculations, but to me it's quite clear that all of the experimental evidence has tended to show that the human nervous system is capable of expending an exponential amount of energy in time

> Again, I don't follow.

> I am not aware of humans being able to solve NP-hard problems in polynomial time

I understand your objection to be that this is unrealistic. I have tried to show why I believe it is quite realistic, and to explain why I feel so strongly that this is a practical question for a computer program I am writing, so I can figure out the best way to design the software.

When I said "not also a major issue for the human nervous system?" what I meant was this:

Make an analogy where the human brain is a master CPU, and it has an object-oriented programming language where each cell is an Object, and self-replication is the new operator.

I felt like your objections about "not being realistic" could be answered on the basis that these ideas come from physics and have real-world applications in engineering for a new type of self-replicating 3d printer. In my view, it would be rather useful to further develop the theory of these types of systems

I believe you have made an extremely strong point about the speed of light and the reasons this kind of approach needs to be bounded in a strong form or else will clearly not work

Hope that clears it up!


> I think it's ok to call a limiting maximum

It's the rather trivial limit. I mean, it's pretty clear the limiting maximum for my walking speed is also the speed of light. Yet it's not useful to consider that detail when figuring out how long it will take me to walk to the store.

> Consider a single particle to have kinetic energy 1/2mv^2, and then allow for self replicating particles

That short summary appears to violate the conservation of energy. How does a self-replicating particle get the mass and energy to replicate a new particle, much less one with the same kinetic energy?

> its quite clear that all of the experimental evidence has tended to show that the human nervous system is capable of expending an exponential amount of energy in time

Can you point to any of the evidence? Especially evidence which shows it's unbounded? How can a human body produce 1 MW of power? (A simple estimate using the Stefan–Boltzmann law shows the body temperature would be over 1,000 degrees C.)
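
That estimate, spelled out (the ~2 m^2 of radiating surface area is an assumed round number):

   sigma = 5.67e-8        # Stefan-Boltzmann constant, W/(m^2 K^4)
   area = 2.0             # rough adult skin area, m^2 (assumption)
   power = 1e6            # the hypothetical 1 MW
   temp_k = (power / (sigma * area)) ** 0.25   # black-body radiating temperature
   print(temp_k - 273.15)                      # ~1450 degrees C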

> Make an analogy ...

Your analogy doesn't support the ability to add new CPUs, only new objects. Your theory requires the ability to self-replicate the master CPU, which the human body does not have.

I have a graduate degree in physics. There's no need to use an argument by analogy with me if you can present the theoretical physics energy calculations.


> It's the rather trivial limit. I mean, it's pretty clear the limiting maximum for my walking speed is also the speed of light

I was very intrigued that you were using relativistic arguments to support your point of view. I thought this was a very insightful approach and assumed you knew what I meant by "speed of light in the human body"

In Landau volume 2 on the theory of relativity [1], chapter 1, section 1, it states:

"experminent shows that instantaneous interactions do not exist in nature. Thus a mechanics based on the instantaneous propagation of interactions contains within itself a certain inaccuracy"

The entire theory of relativity, which forms the basis of your arguments involving the speed of light as far as the real world is concerned, takes this concept of a maximum limiting velocity very seriously.

You can speak, and the sound travels at the speed of sound. In many parts of physics, there is a "speed of X" based on the properties of the medium.

I don't appreciate you talking down to me about the theory of relativity, or the physics of light. I don't think you appreciate that I am a serious experimental and theoretical physicist and have published papers on my original research in PLR on applications of bio-physics to experimental nanotechnology.

> That short summary appears to violate the conservation of energy.

Because the cells in the human body consume protein for energy. Both of my parents are biologists and research scientists, so that's where I learned about all of this.

> Can you point to any of the evidence? Especially evidence which shows it's unbounded?

I explicitly stated in my previous reply (direct quote):

"I believe you have made an extremely strong point about the speed of light and the nature of the reasons this kind of approach needs to be bounded in a strong form or else will clearly not work"

I guess you didn't read that?

> Your analogy doesn't support the ability to add new CPUs, only new objects.

Thanks for all your help. Your condescending attitude is not helpful anymore. I think you are being very close-minded. I have appreciated your point of view, and we can agree to disagree

[1] https://books.google.com/books?id=X18PF4oKyrUC&printsec=fron...


I apologize for my hostile response. You have been extremely helpful and this was out of line.

Please see this recent HN post from today which is extremely relevant: https://news.ycombinator.com/item?id=22839035

Also please consider that in Newtonian mechanics the potential energy of interaction assumes an instantaneous propagation of signals between 2 particles, and there is no issue in practice with this approximation, because the speed of light is so large compared to the time scales of macroscopic phenomena. The ideas we have been discussing are in the same type of regime, where it's an asymptotic limit, and where these types of assumptions are quite physical and very realistic.

I have been working so hard for the past 3 weeks on a generalization of John Conway's Game of Life, and he just died, so I am pretty emotional right now.

I'm sorry we had to come to this point, where you have provided enough information for me to continue my work, but we had to have such a confrontational fight over who understands the theory of relativity better. I can assure you, I understand the point you are making about the speed of light.

Let me remind you, the speed of light is only constant in a vacuum. In air the speed is not the same as in water, and gravitational fields can bend the path of light.

I hope in the future we can have a physics discussion as equals instead of getting into a fight while I have been crying over the death of one of my absolute greatest personal heroes



