
Machine learning people use "tensor" to just mean an N-dimensional array of numbers. The term is divorced from its meaning in Physics and Mathematics, which caused me some confusion when I started looking at machine learning papers, coming as I do from physics.


Again, it doesn't seem so divorced: "N-dimensional array of numbers" pretty much works in Mathematics too, as far as I understand.


But the energy transported to Earth from your space power plant still creates waste heat when it is used to do work (and also when it is transported to Earth). You cannot beat the second law.


They don't even decouple at high material standards of living. Recent increases to GDP produced emissions too, but those new emissions were offset by reductions in the emissions of existing industries.

This "decoupling" gets us basically nothing because it's not like we can just stop emissions tomorrow since GDP and emissions are "decoupled".


Counterexample: if we can make computation 1000x more efficient per teraflop, we could use computers to trivially design drugs to cure cancer etc. (just a random example), and yet our energy consumption would not change. GDP may be the wrong measure, but there would be economic growth, or growth in standard of living, with no change in energy consumption.


This has already happened at a higher multiplier.

An Apple Watch has more processing power than a Cray 2, but it uses a rechargeable battery instead of a 150kW power supply.

The problem is that cycles expand to fill the space available, so another 1000X gain in efficiency would mean new kinds of applications rather than merely affordable supercomputing.

While individual computers are far more powerful and use far less energy, the power consumed by computing on Earth as a whole is far higher than it was. (Not even counting energy vampires like crypto.)

There's no reason to assume that trend would stop. Displays could easily have much higher resolutions (possibly holographic), IoT could be truly ubiquitous, AI could be in everything, entertainment could be social, interactive, and immersive, and so on.


> if we can make computation 1000x more efficient per teraflop, we could use computers to trivially design new drugs to cure cancer etc

You're assuming two things: that cancer and disease can be cured by any feasible increase in computing power (not that outrageous an assumption), and that the energy savings won't be offset by keeping an even older population of disease survivors comfortable and alive. Reality is not as simple as "cure all our ailments and we're good to go".


How does UDP work if you're also using delta compression? I would naively expect that the accumulation of lost diff packets over time would cause game state drift among the clients.


The simplest way I've done it: say the client and server start on tick 1, and that's also the last acknowledgement from the client that the server knows about. So the server sends a diff from 1 to 2, then 1 to 3, then 1 to 4, until it gets an ack for, say, tick 3. Then the server sends diffs from 3 to 5, 3 to 6, etc. The idea is that the diffs are idempotent and will take the client to the latest state, as long as we can trust the last ack value. So if it's a diff from 3 to 6, the client could apply that diff in tick 3, 4, 5 or 6, and the final result would be the same.

This is done for state that should be reliably transmitted and consistent. Stuff that matters less if it gets lost (explosion effects, or whatnot) is usually included in the packet but not retransmitted or accounted for once it goes out.

This is different from (and a lot more efficient than) sending the last N updates in each packet.
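
Roughly, in Python (a minimal sketch with names I made up, assuming the game state is a flat dict of latest values so diffs are last-write-wins and thus idempotent):

    def diff(old, new):
        """Keys whose values changed between two state snapshots."""
        return {k: v for k, v in new.items() if old.get(k) != v}

    class Server:
        def __init__(self, state):
            self.tick = 1
            self.history = {1: dict(state)}  # one snapshot kept per tick
            self.last_ack = 1                # latest tick the client confirmed

        def on_ack(self, tick):
            self.last_ack = max(self.last_ack, tick)
            # snapshots older than the last ack can never be needed again
            for t in [t for t in self.history if t < self.last_ack]:
                del self.history[t]

        def advance(self, state):
            self.tick += 1
            self.history[self.tick] = dict(state)
            # always diff against the last *acked* tick, not the previous tick
            return diff(self.history[self.last_ack], state)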


> The idea is that the diffs are idempotent and will take the client to the latest state, as long as we can trust the last ack value. So if it's a diff from 3 to 6, the client could apply that diff in tick 3, 4, 5 or 6, and the final result would be the same.

Can you elaborate or give an example of how this works?


Imagine the following changes each tick:

    1: x = 1
    2: x = 2
    3: x = 3, y = 5
    4: x = 4
    5: x = 5
    6: x = 6
    7: x = 7, y = 1
Diff from 2 to 4 would be "x = 4, y = 5".

Diff from 3 to 6 is "x = 6", which will always be correct to apply as long as the client is already on ticks 3~6. But if you apply it at tick 2, you lose the "y = 5" part. This can't happen in bug-free code, because the server only sends diffs from the latest tick it knows for sure the client has (because the client sends acks).
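
Restating that in runnable form (same numbers as the table above):

    # Cumulative state at each tick, from the example above:
    states = {
        2: {"x": 2},
        3: {"x": 3, "y": 5},
        4: {"x": 4, "y": 5},
        5: {"x": 5, "y": 5},
        6: {"x": 6, "y": 5},
    }
    diff_3_to_6 = {"x": 6}  # y hasn't changed since tick 3, so it's omitted

    for tick in (3, 4, 5, 6):
        client = dict(states[tick])
        client.update(diff_3_to_6)
        assert client == states[6]  # same final state from any of these ticks

    # Applied at tick 2, though, the client never learns y = 5:
    stale = dict(states[2])
    stale.update(diff_3_to_6)
    assert stale == {"x": 6}  # wrong, hence diffing only from acked ticks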


Cool thanks, that makes sense! In my head I was thinking the diff from 2 to 4 would be something like "x += 2, y += 5", and 3 to 6 would be "x += 3, y += 0"... which of course wouldn't be idempotent and wouldn't allow you to apply the update to different client states.


You can extend it to practical use by imagining these terms as:

    entity[123].active = true
    entity[123].x = 4
    entity[123].y = 8
Then later...

    entity[123].active = false
And with special rules such that if `active = false` is encoded, no other properties of the entity need to be encoded. And if `active = true` is decoded, it sets all properties to their default values. Then you get a fairly simple way to transmit an entity system. Of course you'd want to encode these properties in a much smarter way for efficiency, but the basic idea is there.
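
A decode-side sketch of those rules (the defaults and names here are purely illustrative, not a real protocol):

    DEFAULTS = {"active": False, "x": 0, "y": 0}

    def decode(update, current):
        """Apply one entity update under the special rules above."""
        if update.get("active") is False:
            return {"active": False}  # despawned: other fields don't matter
        if update.get("active") is True:
            base = dict(DEFAULTS)     # just (re)activated: reset to defaults
        else:
            base = dict(current)      # already active: ordinary delta
        base.update(update)
        return base

    # spawn: decode({"active": True, "x": 4, "y": 8}, {}) -> entity at (4, 8)
    # later: decode({"x": 5}, current)                    -> only x changes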


That is a fascinating use of idempotence, bravo!


If you get your data small enough to fit multiple updates into a single packet, you can send the last N updates in each packet.

If your updates are bigger, you'll probably end up with seqs, acks, and retransmitting of some sort, but you may be able to do better than sending a duplicate of the missed packet.


Exactly: you assign a sequence number to each update, have the client send acks to convey which packets it has received, and the server holds onto each unacked update and includes it in the next packet to the client (this is an improvement over blindly sending the last N updates each time; you don't want to send updates that you know the client has already received).

If the client misses too many frames, the server can send it a snapshot (that way the server only holds a bounded number of old updates in memory).
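
A minimal sketch of that per-client bookkeeping (hypothetical structure, not from any particular engine):

    class Outbox:
        def __init__(self, max_unacked=64):
            self.seq = 0
            self.unacked = {}              # seq -> update payload
            self.max_unacked = max_unacked

        def push(self, update):
            self.seq += 1
            self.unacked[self.seq] = update

        def on_ack(self, acked_seqs):
            for s in acked_seqs:
                self.unacked.pop(s, None)  # acked updates are dropped for good

        def next_packet(self):
            if len(self.unacked) > self.max_unacked:
                return ("snapshot", None)  # client too far behind: full state
            # resend everything not yet acked, newest update included
            return ("updates", sorted(self.unacked.items()))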


You just described TCP


It's close, but TCP will retransmit frames rather than packing multiple updates into a single frame.

It's common for people to build this kind of retransmission logic on top of UDP (especially for networked games); it's sometimes referred to as "reliable UDP".


It’s not TCP, it’s TCP without head-of-line blocking, which makes it much more suitable for real-time games.


TCP forces sequencing across all packets, SCTP is a bit closer.


You don’t delta compress everything, only a significant part of the payload. Each diff is referential to a previous packet with a unique id. If you don’t have the previous packet, you just ignore the update.

Every 30 frames or so, you send a key frame packet that is uncompressed, so that all clients regain a consistent view of world state if they fall behind.

Using sentTime lets clients ignore old data and interpolate to catch up if behind as well.

It does work; I wrote one from scratch for a multiplayer space RPG, and the bandwidth savings were incredible.
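
Client-side, the handling can be as simple as this sketch (field names here are illustrative, not from the actual game):

    class ClientState:
        def __init__(self):
            self.state = None
            self.base_id = None    # id of the packet our state was built from
            self.last_time = 0.0

        def on_packet(self, pkt):
            if pkt["sent_time"] <= self.last_time:
                return                            # old data: ignore it
            if pkt["kind"] == "keyframe":
                self.state = dict(pkt["state"])   # full state, every ~30 frames
            elif pkt["base_id"] == self.base_id:
                self.state.update(pkt["delta"])   # diff against a packet we hold
            else:
                return                            # missing base: wait for keyframe
            self.base_id = pkt["id"]
            self.last_time = pkt["sent_time"]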


It's $100M to develop tech that would allow carbon to be pulled out of the atmosphere at a price of $100-$500 per ton of CO2. Even if the tech is successful, it would still cost $100B-$500B per year to pull out the 1 billion tons of CO2 per year by 2050 that the IPCC has built into its models.
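
For scale, the arithmetic spelled out:

    tons_per_year = 1e9                # IPCC-modeled removal target by 2050
    for cost in (100, 500):            # USD per ton of CO2
        print(f"${cost}/t -> ${cost * tons_per_year / 1e9:.0f}B/year")
    # $100/t -> $100B/year
    # $500/t -> $500B/year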


The tech for Direct Air Capture (DAC) is already quite efficient at removing CO2 from the atmosphere (80%+ depending on the particular process). At best, new innovation on the chemistry of the removal process can only increase the efficiency by 25%.

The real issue with DAC is that it is incredibly difficult to innovate around the fact that CO2 in air is just immensely dilute. You need to process enormous amounts of air to remove an appreciable amount of CO2 and, even worse, as the plant operates and recirculates processed air, the local air around the plant becomes more and more devoid of CO2, leading to a decreased amount of CO2 captured per unit volume of air.
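
A back-of-the-envelope on just how dilute that is (rough numbers of my own, not from any particular plant):

    air_density = 1.2                 # kg/m^3 near sea level
    ppm = 400e-6                      # CO2 fraction by volume
    mw_co2, mw_air = 44.0, 29.0       # molar masses, g/mol
    co2_per_m3 = air_density * ppm * (mw_co2 / mw_air)  # ~0.0007 kg/m^3

    capture_eff = 0.8                 # the 80%+ figure above
    m3_per_ton = 1000 / (co2_per_m3 * capture_eff)
    print(f"{m3_per_ton:,.0f} m^3 of air per ton of CO2")
    # roughly 1.7 million cubic meters of air per ton captured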

The only real improvement I could see in this space is a process that creates less back-pressure against the fans pumping in unprocessed air, which could bring down the energy cost per ton of CO2 removed. But even then, the back-pressure in a DAC process normally comes from flowing air through a porous catalyst, which is essential for high efficiency. So there's a trade-off there as well.

Ultimately I am not very sanguine about DAC and I have been disappointed to see news agencies reporting as if DAC is even in the top 10 technologies most important to reduce carbon emissions.


Staggered extraction cycles, where each round leads to a 10X increase in concentration, are one option. If you look up how D2O (heavy water, where the hydrogens are replaced by deuterium) is produced, the first step is often done that way. The initial stage is the concentration of HDO from H2O, which is naturally found at about 1 part in 3200. 400 ppm CO2 is 1 part in 2500, so it's comparable.
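
As a rough illustration of the stage count, assuming ~10X enrichment per pass:

    conc = 400e-6        # starting CO2 concentration (400 ppm)
    stages = 0
    while conc < 0.9:    # until the stream is ~90%+ CO2
        conc = min(conc * 10, 1.0)
        stages += 1
    print(stages)        # 4 stages: 0.04% -> 0.4% -> 4% -> 40% -> ~100%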

Proof-of-process for Starship fueling would be generating a stream of pure CO2 from air at a rate sufficient to feed into a methane production facility (which would require a similar stream of hydrogen from water processing) to produce around 500 tons of liquid methane in a reasonable amount of time. Possible?
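
For a sense of the feedstock involved, the Sabatier stoichiometry (CO2 + 4 H2 -> CH4 + 2 H2O), assuming perfect conversion, which is generous:

    ch4_tons = 500
    mol_ch4 = ch4_tons * 1e6 / 16.0    # grams over g/mol
    co2_tons = mol_ch4 * 44.0 / 1e6    # ~1375 t of CO2 from air
    h2_tons = mol_ch4 * 4 * 2.0 / 1e6  # ~250 t of H2 from water
    print(co2_tons, h2_tons)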


This is the best comment in the thread and it's somehow being downvoted.

Physics grad student here, and I agree with this comment whole-heartedly. I love the idea of nuclear power, but I also understand that it requires enormous CapEx and that the time to get a reactor up, running, and carbon neutral is much too long to address the climate crisis. We absolutely shouldn't be shutting down nuclear plants, but any money spent on new plants is money that could otherwise be spent generating lower-cost wind/solar in a shorter period of time.


Taking the view for a moment of somebody who thinks we won't have a solution until we have plenty of nuclear power,

> the time to get a reactor up, running, and carbon neutral is much too long to address the climate crisis

In that particular sense, it's not a crisis: if you think N megadeaths will occur in time T as a result of climate change (making no comment on that because I don't know much about N as a function of T), there are still (10000 - N) million humans around at time T to suffer from lack of a solution. In that situation, it doesn't make sense to say "it's a crisis, so a solution that helps after time T can't possibly help"?

So for people who think that we won't have a solution until we build a lot of nuclear power, the slogan "climate crisis" seems likely to badly damage those 10000-N million humans because it gives us an excuse to never solve the problem. People could reasonably disagree with the antecedent ("won't have a solution until we build nuclear"), but I hope you see a bit how the other side sees it?

Greetings from an ex-Physics grad student on the other side by the way! Other side both of this debate and of being a grad student I suppose. Have a nice day over there :wave:


Have you noticed how the mythical future where the bulk of energy is supplied by wind and solar is always at an indeterminate point in the future? There is currently no gigawatt-scale power grid anywhere in the world that supplies more than 20-30% of annual energy from variable energy sources like wind and solar.

Meanwhile, France enjoys a low carbon grid together with low energy prices, as do many parts of Canada, parts of Eastern Europe, and so on.

It is wrong to frame the argument as wind/solar versus everything else. The argument should be carbon versus no carbon. Period.


Look no further than South Australia, with 65.7% renewable utilization for 2021. Sure, you can shift the goalposts to include total energy usage instead of simply electricity, but that also makes it easier, since now you have a whole load of possible smart consumers in, for example, the transportation industry.

https://www.climatecouncil.org.au/resources/record-year-rene...


South Australia does not scale. It is a region with a tiny population, almost 75% of which lives in a limited area.

South Australia is also incredibly sunny and has the advantage of being an arid desert. Again, that does not scale well.


With the exponential lowering of the costs of wind power, solar, and storage, that band should increase every year, right?

South Australia sits at 25 to 37 degrees south. The longest HVDC line (in China) is 3300 km; that is about 30 degrees of latitude.

Based on existing HVDC lines and South Australian circumstances, we can say that we should reliably be able to supply everywhere up to 55 to 67 degrees from the equator, in general. Do you know what is at 66 degrees north? The Arctic Circle. That is how far north we end up.
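
As a sanity check on the km-to-degrees conversion:

    km_per_degree = 40_075 / 360   # Earth's circumference over 360 degrees
    print(3300 / km_per_degree)    # ~29.6 degrees of latitude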

This is not even considering that the wind resources are much better closer to the poles, especially in wintertime.

The 4 million people living north of the Arctic Circle might have a harder time. But that would seem to be an edge case that is trivially easy to solve once the rest of the world has a solution.

https://en.wikipedia.org/wiki/Arctic_Circle#Human_habitation

This of course discounts any geopolitical concerns, which makes it harder. But on a US or EU continental scale we easily have the technology today.


South Australia gets rather more sunshine than the regions of the world where people actually live.


Ireland is gigawatt-scale, over 36% wind energy, and growing rapidly. Well beyond your stated 20-30% figure.

https://en.wikipedia.org/wiki/Wind_power_in_Ireland


> Well beyond

Is it? This adds nothing to the discussion. If it were actually 50% and I had claimed 30%, that would be a distinction worth pointing out.

Since we are being pedantic, you should really look up the figures for the energy share from wind in Ireland in 2021. It fell to 29%, which completely negates your claim of it growing rapidly.


> It is wrong to frame the argument as wind/solar versus everything else. The argument should be carbon versus no carbon. Period.

I am framing it as carbon vs no carbon. A new nuclear plant's timeline to carbon neutrality, compared to the counterfactual where an equal capital investment is made in solar/wind, is well over a decade. Given the time crunch we are under to lower emissions, I simply do not think we have the time to waste building new plants. That said, we should not be decommissioning plants that are still operable.


How many massive grid batteries could be built within 10 years?


I realize that you won't be impressed, but 20% of electricity in Texas came from wind in 2019 (plus 1% solar) [1].

Texas has been building wind and solar like crazy since then, such that electricity in Texas in 2021 was 24% wind + 4% solar [2].

Texas is building 6.1 GW of solar in 2022 and 3.8 GW of wind [3]. So that indeterminate point in the future where ERCOT is more than 30% wind+solar is probably 2023 or 2024. It might even be 2022.

[1] https://comptroller.texas.gov/economy/fiscal-notes/2020/augu...

[2] https://www.ercot.com/gridinfo/generation

[3] https://www.eia.gov/todayinenergy/detail.php?id=50818


Why are you complaining that the author didn't talk about tensors as they are used in TensorFlow? TensorFlow is never even mentioned in the piece.

The author is perfectly clear in the first sentence that the piece's focus is the usefulness of tensors in a physics context.


Huh, you’re right, not sure why I thought it was TensorFlow-related.


I have to question the accuracy of the numbers presented in this article.

I own a Chevy Bolt in Maine, and the figure listed in this article is far off from what I have experienced over several winters here. At a temp of 20 F my Bolt loses ~20% of its range from the EPA estimate, going from 236 to about 190 miles of range. Conversely, my friend owns a Tesla Model 3 up here, and his range degrades severely during winter, on the order of 30%.

edit: He owns a Model 3, not a Model Y.


Replying to my own comment with an update.

The temperature today in my area varied between 9 F at 9AM and 25 F at 3PM. My girlfriend happened to need to drive to several different towns/cities for her work, so her total trip was 135 miles. Our Bolt says it has 50 miles left.

I only charge to 90% and just received a new 259-mile-range battery as part of the Bolt recall. So the EPA range of my Bolt at 90% charge is 233 miles. If we take the Bolt's claim of 50 miles left at face value (which I have found to be overly pessimistic), then it got 185 miles of range today out of the expected 233, or a ~20% drop from the EPA stated range.
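
Spelled out, the same numbers:

    usable_range = 259 * 0.90          # 90% charge -> ~233 miles
    observed = 135 + 50                # miles driven + estimated miles left
    print(1 - observed / usable_range) # ~0.21, i.e. roughly a 20% winter loss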

So I just don't believe the numbers provided in the linked article. They are completely out of whack with my everyday experience owning a Bolt through three winters in Maine.


This same class did the same thing to me at Princeton except I was an engineering major at the time. I had done well on the Calc AP exams in high school which the engineering department said placed me in MAT 202. On the first day of class the professor started summarizing "what we already knew" from high school about linear algebra and I had seen basically none of it with the exception of basic matrix multiplication. The rest of the week was ego-destroying as I attempted to get help from the professor and was repeatedly told (almost berated) that I should already know the answers to my questions from my high school courses.

I eventually dropped the course, left engineering entirely, and majored in a biological science. But the joke's on them, because I self-studied a shit ton of math after I graduated and eventually went back to grad school doing ML+Physics. It turns out that I'm actually pretty good at Linear Algebra after all.

