ginko's comments

Just ban children from using the internet.

Not to be a stickler (ok I like being a stickler) but temperature deltas, especially deltas between degrees Celsius, should be given in kelvin. A 1.8K difference makes sense. A 1.8C difference would be 274.8 kelvin!

This is probably the most ridiculous comment I've read in the history of this website.

There is no difference in the amount of energy a 1 degree Celsius delta and a 1 kelvin delta represent.

The only (and I really mean only) difference is how zero energy is defined. It is not possible to have negative energy, and that zero Celsius represents the freezing point of water is an artifact of convenience, not of absolute definition.


Also, the way the kelvin is defined necessitates that both degrees are identical. If 10 degrees Celsius defined the boiling point of water at 1 atmosphere (or whatever the actual definition is), then the kelvin would be smaller by a factor of 10. And this applies to both negative and positive Celsius values.

Rankine, Celsius, and centigrade have degrees. Kelvin is a base unit: absolute, and with no degree!

Taking differences between degrees Celsius values is absolutely fine.

Ratios are undefined because the Celsius scale has no absolute zero while the Kelvin scale does.

See: https://en.wikipedia.org/wiki/Level_of_measurement


>A 1.8C difference would be 274.8 kelvin!

Categorically and factually incorrect.

A 1.8 degree C difference would be 1.8 kelvin. The two scales have different zero points, but one degree Celsius and one kelvin are identical in magnitude.


Celsius is not an absolute scale, but that isn't a problem for deltas: (10C - 5C)=5C, (10K-5K)=5K. Celsius is only problematic when multiplying or dividing. 10C is not twice as hot as 5C.
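The arithmetic above can be sketched in a few lines of Python (the helper name `c_to_k` is mine, just for illustration):

```python
# Deltas agree between the two scales because the 273.15 offset
# cancels under subtraction; ratios only make sense on the
# absolute (Kelvin) scale.
def c_to_k(c):
    return c + 273.15

a_c, b_c = 10.0, 5.0
delta_c = a_c - b_c                   # 5.0 degrees Celsius
delta_k = c_to_k(a_c) - c_to_k(b_c)   # 5.0 kelvin, same magnitude
assert abs(delta_c - delta_k) < 1e-9

# "Twice as hot"? Only the Kelvin ratio is physically meaningful:
ratio_c = a_c / b_c                   # 2.0, but meaningless
ratio_k = c_to_k(a_c) / c_to_k(b_c)   # ~1.018, the actual ratio
```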

> A 1.8K difference makes sense. A 1.8C difference would be 274.8 kelvin!

I think there was some insight here that went off on a bad tangent leading to a math word-problem mistake, confusing these two:

1. A difference... between [X] and [Y], which is a delta of 1.8°C

2. A difference... between [0 K] and a reading of [1.8°C], which is a delta of 274.95 K.


That makes no sense. A difference between a reading of 37C and 38.8C is still 1.8C.



Dude, you are just completely making shit up, and it makes no sense.

So what if Celsius and Kelvin have different 0 points - they are still valid scales and you can talk about differences between 2 measurements.

According to your logic it would be impossible to state that two Fahrenheit measurements differ by some number of degrees F - why, I have no idea.


I'm not entirely sure what point you are trying to make, but this is absolutely false from a scientific perspective.

If you believe otherwise, please provide some citations to your beliefs so we can understand what you are trying to say.


Saying something is false and then asking for citations doesn't seem that helpful to me.

To support your argument, take the following example:

Let's take some water at 273.15 kelvin and add enough energy to raise it by 1 kelvin. The water is now at 274.15 kelvin. The difference is 1 kelvin.

If we had the same amount of water at 0 degrees Celsius and added enough energy to raise it by 1 degree Celsius, the water would now be at 1 Celsius.

Converting these values leaves us with 273.15 kelvin and 274.15 kelvin respectively.

You can repeat this experiment (ignoring latent heat) for any value of kelvin or Celsius, therefore kelvin and Celsius are interchangeable for temperature comparisons.


I believe any chemistry or physics textbook will state (possibly indirectly) how temperature deltas work.

But I think it's sufficient to just say that Kelvin and Celsius have the same scale magnitude and just a constant offset.


To be a stickler, communication requires respect for your audience. The vast majority of everyone understands a 1.8 degree C delta. I would argue that very few people anywhere would understand a temperature delta given in kelvin.

How is expecting readers to not understand what a kelvin is respecting the audience?

You misread.

Most people do not understand temperature on the Kelvin scale. As such, you should not use it to communicate in a general setting such as this.


The same way expecting you understand what a Kelvin is isn't respectful to you.

Kelvin and Celsius use the same unit magnitudes. It would be a 1.8° difference either way.

"A 1.8C difference" expands as "A difference of 1.8C" expands as, and here's the ambiguity, either:

"An absolute difference of 1.8C, or 274.8K, measured between A and B"

or

"A relative difference of 1.8C, or 1.8K, is added/subtracted to A/B in order to reach B/A"

I don't think the context-free variant with K will improve understanding and decrease confusability in this discussion context, but I appreciate the pointer about it in general. I'll take a lot more care around it in a future thread about space apparel!


No it doesn't. The absolute difference[1] of 1.8°C is the same as 1.8K; they have the same scale. The subtraction of values cancels out the offset.

A relative difference[2], usually given in percent change, has problems with a unit that has an offset zero like Celsius, but that isn't what anybody is using here. It's more than simple subtraction; you have to divide by the reference value.
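A quick sketch of that distinction (variable names are illustrative):

```python
# Two Celsius readings and their Kelvin equivalents.
t1_c, t2_c = 5.0, 10.0
t1_k, t2_k = t1_c + 273.15, t2_c + 273.15

# Absolute difference: the offset cancels, same answer in both units.
abs_diff_c = t2_c - t1_c                # 5.0
abs_diff_k = t2_k - t1_k                # 5.0

# Relative difference divides by a reference value, so the zero
# point matters and the Celsius number is misleading:
rel_c = (t2_c - t1_c) / t1_c            # 1.0 -> "100% hotter"
rel_k = (t2_k - t1_k) / t1_k            # ~0.018 -> ~1.8% hotter
```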

[1] https://en.wikipedia.org/wiki/Absolute_difference#Applicatio... [2] https://en.wikipedia.org/wiki/Relative_difference


You're just confused by terminology. While 0 C is 273 K, 1 degree Celsius is 1 degree Kelvin.

See, a degree is not an absolute unit of measure like a Celsius or a Kelvin, it's a relative difference between two absolute units of measure. When discussing the difference between two separate temperature readings measured in Celsius, degrees Celsius is entirely appropriate.

Think of it like time: there is a difference between meeting at 2:00 and meeting two hours from now.


During that peak Vienna’s housing situation was infamously bad though. You’d have single rooms shared by multiple families and beds being used by multiple people on a timetable.

>The math for Bezier curves is usually a bit beyond me, but this seems to use a simple lerp (linear interpolation) to split. Why is that valid?

This can be explained through the Bezier curve's polar form (aka blossom). There's plenty of literature on this. (For instance, see slide 40 here[0].)

I generally find it interesting that articles on Bezier curves/surfaces usually get upvoted on HN even though they tend to be extremely surface level. Any introductory applied geometry course or textbook will go much deeper within the first chapter or two.

[0] https://resources.mpi-inf.mpg.de/departments/d4/teaching/ss2...
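For what it's worth, the lerp-based split the article asks about is the de Casteljau construction; here's a minimal sketch for a cubic curve (function names are mine):

```python
# Splitting a cubic Bezier at parameter t using only lerps.
def lerp(p, q, t):
    return tuple(a + (b - a) * t for a, b in zip(p, q))

def split_cubic(p0, p1, p2, p3, t=0.5):
    """Return the control points of the two halves of the curve."""
    a, b, c = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    d, e = lerp(a, b, t), lerp(b, c, t)
    f = lerp(d, e, t)              # the point on the curve at t
    return (p0, a, d, f), (f, e, c, p3)

left, right = split_cubic((0, 0), (1, 2), (3, 2), (4, 0))
```

Both halves share the point `f`, which is exactly the curve evaluated at `t`; that shared point is why the split is valid.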


Thank you, there's a lot of interesting material there.

I think for many of us, Bezier curves were our first introduction to curve geometry. They were certainly mine. In the late 90s, early 2000s, Rhino3D came out, and Quake 3 had curved surfaces, and suddenly splines were everywhere. For those of us of that generation, they are somewhat magic and back then I never saw a good explanation how they work -- but 20+ years later, this thread has provided multiple!


Honestly, I find the impact of the Columbian exchange on Old World cuisine overblown. Tomatoes, potatoes, and corn sure are great, but you can do without them. Italian cuisine was different, but most of the modern elements were in place. I'd say the role of tomatoes in Italian cooking isn't as big as people make it out to be.

On the other hand, it's almost impossible to imagine what food was like in the Americas before Columbus. No wheat, no pork/beef/chicken, no dairy, no onions, no cabbage, no apples/figs, no oranges or any other citrus, and much, much more.


One of the most praised recent restaurants in the United States is based on an attempt to reconstruct pre-Columbian cuisine from the Americas: https://owamni.com/, https://www.newyorker.com/magazine/2022/09/19/how-owamni-bec....


In that list, I think I'd only really miss apples and dairy (really just cheese) by their own virtue. Pork/beef/chicken due to familiarity (which is to say, they had other meat sources, which I'm sure were just as good; if I'd grown up on venison I'm sure it would just taste like cow to me).

Potatoes and corn, losing though would be absolutely tragic. Also avocados.


> if I’d grown up on venison I’m sure it would just taste like cow to me

Having grown up on plenty of both wild venison and farmed cattle, they are pretty different, not to mention that different types of venison are also quite different from each other. So I'm not sure I would consider venison and beef interchangeable simply by familiarity. White tailed deer and gemsbok, specifically, I find the best tasting and much better than beef.


Venison is very different from beef. The most beef-like thing I've had is ostrich (which you wouldn't expect), even though it has subtle differences.

> no dairy

They couldn't find one mammal from which to obtain milk? It's a pretty obvious thing to try, for obvious reasons.


The vast majority of the human population is lactose intolerant, both historically and today. Genetically intolerant populations in South and Central Asia have microbiotic help with their dairy-heavy diets, but for people who didn't spend thousands of years developing a culture around it, dairy is just a quick road to an upset stomach and/or food poisoning.


That makes some sense. Given the historical scarcity of food, the pressure of starvation, and the widespread availability of milk, I would think people would adapt to it.

I guess that lactose-intolerant people today would drink milk rather than starve - do they get zero nutrients from it? - and that evolution would select for those who could survive that way.


Not going to get into the social darwinism stuff. We can empirically measure an apparent selective pressure for lactase persistence, but it's an open question without clear answers what the factors driving that are.

I think you're missing why milk is useful though. Dairy allows you to take resources that aren't calorically useful like grasslands and turn them into food. You can consume it either immediately or later via preservation techniques like cheese. Even if you consume it immediately, milk is a seasonal product.

Dairy also isn't the only way of turning unusable resources into food though. You can eat the animal, for example. That's less efficient if you're limited to a single species, but cattle and other large livestock suitable for the scale of milk production you're talking about are so phenomenally inefficient that you're likely better off if you consume more efficient animals instead.


> social darwinism

There is none of that in my comment.

> I think you're missing why milk is useful though.

? I was saying it is useful, and therefore I expect Homo sapiens would adapt to it.

After writing the GP I was told that humans, and some or all mammals, have a gene that shuts off lactase production when they reach the stage of life where they no longer need milk. A minority of humans have a mutation that stops that process, making them lactose-tolerant.

Why haven't we evolved to consume milk lifelong, given its obvious advantages (or why have we evolved to become lactose-intolerant past early childhood)?

A guess: Obviously milk consumption is inherited from mammal ancestors. That provides plenty of time (66 million years +) and population to evolve lifelong lactose digestion.

But other mammals don't have much need for that adaptation - for the most part, they can't figure out obtaining milk from another species as a regular food source. Human ancestors didn't figure out tool use until 2.6-3.3 million years ago; would we have figured it out then?

My guess is that it required domestication of animals ~12 thousand years ago before non-childhood milk consumption was commonplace. 12,000 years isn't much time to evolve much.


> On the other hand it's almost impossible to imagine what food was like in the Americas before Columbus.

Not at all. Many pre-Columbian foods remain popular today, like tamales. Corn, beans, squash, fish, nuts, and tropical fruit were all staple foods in pre-contact Mesoamerica. Central American islanders were big on grilling fish over coals.

I don't think it was a miserably plain diet by any means.


Depends on the area. German-speaking areas and Eastern Europe do use lots of potato. Even the colloquial name for Germans is "potato".


I'm Austrian myself. There's plenty of potato dumplings etc., but they're just variants of other flour/cheese based dumplings. Potatoes are important but certainly not indispensable.

Compare that to pork for instance. Remove that and you've removed like 50% of Austrian cuisine.


no beef? bison were ubiquitous, though.


I still think someone should set up a voice chat bot that answers to "Computer!" and has Majel Barrett's monotone voice.


My fan theory of the original Star Trek is that the computer voice is something they arrived at AFTER trying more naturalistic personalities. They intended to have the control interface be a cold monotone.

In fact, there is an episode where the computer voice becomes sultry, and Kirk complains.



I'm mainly team monorepo because working with git submodules is such a needlessly miserable experience.

At work we have a pretty large project with many teams having moved to using nested git submodules for their stuff.

Whenever you check out a commit you basically have to do a `git submodule update --init --recursive` and pray there are no random files left over because some submodule has moved and git-submodule thinks it's your job to clean up its mess. This becomes really annoying when you want to bisect something.

Surely there must be a saner way to deal with trees of git repos. I guess AOSP uses its own `repo` tool to do multirepo stuff which might be better. But honestly this _should_ be fixable in git-submodule itself if they just make it behave sanely.


Notice how they moved the ok & cancel buttons to the bottom right since it’s the more logical location to put them.

Meanwhile gtk now puts those on opposite sides of the window title bar by default.


Separating them is good for avoiding misclicks.

Decades ago, MacOS properly had the close box for windows on the opposite side from the minimize etc. widgets, so the one destructive window action could be reasonably safe without confirmation. Then Windows started gaining popularity and nobody ever did it the right way by default again. A pity for the sharp minds at Xerox PARC.


I don't mind ok and cancel being on opposite sides. It's mainly ok not being bottom-right that bothers me.


Command Q and Command W are still beside each other though


How is it more logical? Upper right places them close to the other window controls. Also continues the down-then-right order of most of the other controls.

In fact, putting buttons along either _side_ of the windows would be a better fit on the wider aspect screens we use nowadays.


>And that's why it's probably not China. I mean, why would they make it that obvious?

That's just what they want you to think!

