
I had a CS professor who used to hold up a piece of string roughly that long and talk about how that's how far a bit of data can travel at the speed of light during a clock cycle, or something. Honestly, I don't remember the point he was trying to make.


Probably trying to recreate this lecture by Grace Hopper [1]

[1] https://www.youtube.com/watch?v=9eyFDBPk4Yw


I still have my nanowire, received directly from Grace herself during one of her last lectures, which I attended in the '80s.

Of course, it’s in among about a thousand other wires and cables and nonsense.

One of these days I should sort it out and try to identify it by length.

She had a very firm handshake, and a very definite glint in her eye as she handed those out to her star-struck fans...


I'm sure that's what it was. I probably should have remembered that, but it was such a small part of one of his lectures it didn't resonate as deeply as it should have.


That's a different thing, namely the distance a signal travels in a nanosecond, roughly. This is about the 21 cm RF wave that glows from the sky - https://en.wikipedia.org/wiki/Hydrogen_line. One of the (hyper)finest names of things in nerddom: the "hyperfine transition".


Admiral Hopper[1] used to use string to demonstrate how long pieces of time are:

https://www.youtube.com/watch?v=9eyFDBPk4Yw

[1] https://en.wikipedia.org/wiki/Grace_Hopper


She didn't use string, she used wires, and would sometimes hand them out after lectures.


Woah. Imagine extrapolating that for life. What does it mean to throw away a day?


Or to be given one more


I suppose it's interesting to think about. At today's clock rates, the distance between the CPU and RAM actually adds a small but still significant delay.
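A rough back-of-envelope sketch, assuming a ~20 cm signal path and a 5 GHz clock (illustrative numbers, not any specific board):

    # Rough numbers only: flight time of a signal between CPU and RAM
    c = 3.0e8                    # speed of light in vacuum, m/s
    v = 0.5 * c                  # assumed signal speed in PCB traces, roughly half c
    path_m = 0.20                # assumed CPU-to-DIMM-and-back path length, metres
    clock_hz = 5.0e9             # assumed 5 GHz core clock

    delay_s = path_m / v
    cycles = delay_s * clock_hz
    print(f"{delay_s * 1e9:.2f} ns of flight time, ~{cycles:.0f} clock cycles")
    # roughly 1.3 ns, i.e. several cycles spent just on propagation,
    # before the DRAM itself even starts responding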


It's ultimately what killed having the memory controller on the northbridge of a motherboard. Having the CPU talk to a separate chip just to reach the RAM simply added too much latency to the whole process.


And it may end up making CAMM2 the next standard. The physical layout of the chips on the board means the traces can be shorter - leading to lower latency and higher stability.
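A quick sketch of how trace length maps to propagation delay, assuming signals on a PCB move at roughly half the speed of light (illustrative numbers, not CAMM2 measurements):

    # Rough rule of thumb: propagation delay per centimetre of PCB trace
    v = 1.5e8                        # assumed on-board signal speed, m/s (~0.5c)
    ps_per_cm = 0.01 / v * 1e12      # delay per cm, in picoseconds
    shorter_by_cm = 3                # hypothetical trace-length reduction
    print(f"~{ps_per_cm:.0f} ps per cm, ~{shorter_by_cm * ps_per_cm:.0f} ps saved")
    # ~67 ps/cm; the absolute latency saving is small, but shorter traces
    # also make it easier to keep signals clean at high transfer rates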


I really hope CAMM2 takes off. It'd be a rare standard that could be used for both laptops and desktops. Having upgradable memory in a laptop again would be great. Using the same standard as a desktop would make it easy to find sticks as time goes on.


And capacitance, etc., too.


Well everyone knows if you want your network to be twice as fast, just cut all of the cables in half.


The point was how fast computers are, and why you need to make them smaller to make them faster. Think about the bus between the CPU and GPU; it's not much shorter than that. Information cannot travel faster than the speed of light, so there is a hard constraint on how quickly the GPU can respond to commands. The same is true for RAM, and even within the CPU itself signals take time to propagate. The total length of your circuitry for a single instruction can't be longer than 21 cm if that's how far light travels in a clock cycle.
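A quick sanity check on that figure, assuming propagation at the vacuum speed of light (real signals in copper or silicon are slower):

    # How long does light take to cover 21 cm, and what clock rate does that imply?
    c = 299_792_458        # speed of light in vacuum, m/s
    d = 0.21               # 21 cm, in metres
    t = d / c              # one-way travel time, seconds
    print(f"{t * 1e9:.2f} ns per 21 cm, one cycle at ~{1 / t / 1e9:.2f} GHz")
    # ~0.70 ns, so 21 cm corresponds to one light-travel time per cycle at ~1.4 GHz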


https://youtu.be/9eyFDBPk4Yw: Admiral Grace Hopper Explains the Nanosecond



