
What problem here would be solved by ratifying a constitution?

Like -- ISTM that the relevant property here is the ability of the courts to overturn ordinary legislation for incompatibility with basic human rights provisions. But the EU already has this: the Charter of Fundamental Rights of the EU (which is pretty much a superset of the European Convention on Human Rights) is incorporated into the Lisbon treaty, and all EU legislation must be compatible with it. EU courts have overturned legislation for incompatibility with the CFR, e.g. Digital Rights Ireland [0].

The collection of member state treaties is for ~all intents and purposes a constitution, just not in a single document, and without the word "constitution" at the top.

[0] https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A...


> What problem here would be solved by ratifying a constitution?

Admittedly, a narrow one: clearly delineating unconstitutional behaviour and allowing it to be called out.

> collection of member state treaties is for ~all intents and purposes a constitution, just not in a single document

That morass makes it difficult for the public to cleanly digest when something is blatantly unconstitutional. (Britain has a similar problem.)


> That morass makes it difficult for the public to cleanly digest when something is blatantly unconstitutional

I'm not convinced that's a relevant issue here. For some parts of EU treaty law, sure, but the context here is disapplying EU legislation that's incompatible with fundamental human rights. Those parts are all in one document in one treaty: the Charter of Fundamental Rights [0], which was incorporated into the Lisbon treaty.

(besides, whether in the EU, somewhere with a formal constitution like the US, or the UK, the vast majority of the work of figuring out whether something breaches treaty/constitutional provisions is always going to be analysing case law)

[0] https://www.europarl.europa.eu/charter/pdf/text_en.pdf


...No...? If someone puts paint on a 280-million-year-old rock, that doesn't change the age of the rock.


Earth rocks are about 4.54 ± 0.05 billion years old. The atoms comprising those rocks are about 13.787 ± 0.020 billion years old.


The atoms themselves are far older, as they (generally) originated from some nucleosynthesis event in a star somewhere else.

Rocks on Earth are rarely that old, as they're formed by sedimentation (i.e. they get weathered, then deposited), metamorphism (compacted and modified by heat and pressure), or cooling from a lava flow (basalt, etc.). [https://en.m.wikipedia.org/wiki/List_of_rock_types]

Each of those forms/types changes fundamental properties of the rocks, which allows for dating.
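
(The dating arithmetic itself is simple. Here's a toy sketch in Go for an idealised parent -> daughter decay system -- assuming no initial daughter isotope and a single decay path, both of which real dating methods have to correct for:)

    // Toy radiometric dating: for a closed system that started with no
    // daughter isotope, t = ln(1 + D/P) / lambda, where lambda = ln2 / half-life.
    package main

    import (
        "fmt"
        "math"
    )

    // age returns years since the rock "closed", given the measured
    // daughter/parent ratio and the parent isotope's half-life in years.
    func age(daughterParentRatio, halfLifeYears float64) float64 {
        lambda := math.Ln2 / halfLifeYears
        return math.Log(1+daughterParentRatio) / lambda
    }

    func main() {
        // Illustrative numbers: Rb-87 -> Sr-87 (half-life roughly 48.8
        // billion years) with a measured daughter/parent ratio of 0.004.
        fmt.Printf("%.2f billion years\n", age(0.004, 48.8e9)/1e9) // ~0.28
    }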

Some rocks are literally only decades old, or were even formed yesterday -- like basalt formed from lava at active volcanoes.


[flagged]


You wish! Pure homegrown human here.

Of course, that is what an AI would say, eh?


Yup, lol


Most Earth rocks are much younger than that. They get melted down, mixed with other melted rocks, etc., and form new rocks all the time.


Hydrogen and helium on Earth would be 13.7 billion years old, but many of the other atoms on Earth are more like 4.6 billion years old.


https://en.wikipedia.org/wiki/Helium

> On Earth, it is relatively rare—5.2 ppm by volume in the atmosphere. Most terrestrial helium present today is created by the natural radioactive decay of heavy radioactive elements (thorium and uranium, although there are other examples), as the alpha particles emitted by such decays consist of helium-4 nuclei. This radiogenic helium is trapped with natural gas in concentrations as great as 7% by volume, from which it is extracted commercially by a low-temperature separation process called fractional distillation. Terrestrial helium is a non-renewable resource because once released into the atmosphere, it promptly escapes into space. Its supply is thought to be rapidly diminishing. However, some studies suggest that helium produced deep in the Earth by radioactive decay can collect in natural gas reserves in larger-than-expected quantities, in some cases having been released by volcanic activity.


It's glass fibre reinforced concrete (GFRC). I very much doubt it's heat-reactive; I suspect it's just oil & dirt on a somewhat porous surface (so cleaning properly is labour-intensive, and stickers are cheaper).


I'm a reasonably new Go programmer; I've only been doing it full-time for a few months. But "great concurrency ergonomics" has... not been my experience.

What I've been finding is that the primitives it gives you are easy to use, but extremely hard to use _correctly_. My first major PR in Go had a week of back-and-forth as more experienced Go programmers on my team pointed out the many, many places where my concurrent code was buggy: places where I was receiving without selecting over the context being cancelled or some close channel; places where I was sending from a goroutine whose main loop had to be non-blocking, without guarding the send with a default clause; places where I was using an unbuffered channel where pathological goroutine scheduling could result in a deadlock; places where I was using non-thread-safe data structures that could theoretically be mutated by multiple goroutines, which had to be fixed without spamming mutexes everywhere; etc. etc.
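
To make that concrete, here's a minimal sketch (hypothetical names, not code from the actual PR) of two of those guards: selecting over ctx.Done() on the receive, and a default clause on a send that must never block the loop:

    package main

    import (
        "context"
        "fmt"
        "time"
    )

    func worker(ctx context.Context, in <-chan int, out chan<- int) {
        for {
            select {
            case <-ctx.Done(): // without this, the receive can block forever
                return
            case v, ok := <-in:
                if !ok {
                    return // in was closed
                }
                select {
                case out <- v * 2: // deliver if the consumer is ready...
                default: // ...but never block this loop if it isn't
                }
            }
        }
    }

    func main() {
        ctx, cancel := context.WithTimeout(context.Background(), 50*time.Millisecond)
        defer cancel()
        in, out := make(chan int), make(chan int, 8) // buffering out avoids one class of deadlock
        go worker(ctx, in, out)
        in <- 21
        fmt.Println(<-out) // 42
    }

The compiler is perfectly happy to let you omit every one of those guards; nothing flags the buggy versions until they deadlock or race in production.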

And sure, I'm relatively new to Go. I'll learn these things and get better. But the contrast with some previous work I'd done in Elixir was striking. In Erlang/Elixir, _the obvious first thing you try is generally actually correct_. And it gives you standard ways of working built on top of the primitives (GenServer calls etc.), so you don't have to buggily reinvent the wheel every time you want to send a message from one goroutine to another with a reply, or need to tear down a bunch of goroutines simultaneously.

I have a highly concurrent Elixir service I wrote several years ago, by myself (early-stage company, no code review), still in production. It pretty much never has problems and basically just worked from the start. If I'd tried that with Go, without a bunch of experienced Go programmers to point out all the subtle race conditions, I'd probably still be dealing with a long tail of data races and deadlocks years later.


The fact that the 95% confidence intervals of two variables have some overlap doesn't mean there's a >5% chance that the expected values of the two variables are the same.

Consider two independent random variables X and Y; the chance that (a sample from X is above the 90th percentile of the true distribution of X) is 10%, but the chance that (a sample of X is above the 90th percentile of the true distribution of X AND a sample of Y is below the 10th percentile of the true distribution of Y) is 1%.

(disclaimer: with actual science the stats are a lot more complicated and you can't just assume they're independent and multiply the two; it's just a simplified example to give intuition about why overlapping confidence intervals don't imply what the parent thought. IANAstatistician)
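
(If you want to sanity-check that 10% -> 1% arithmetic, here's a quick Monte Carlo in Go -- a toy simulation of the independent case above, not the study's actual statistics:)

    package main

    import (
        "fmt"
        "math/rand"
    )

    func main() {
        const n = 1_000_000
        const q90 = 1.2816 // 90th percentile of the standard normal
        joint := 0
        for i := 0; i < n; i++ {
            x, y := rand.NormFloat64(), rand.NormFloat64()
            if x > q90 && y < -q90 { // 10th percentile is -q90 by symmetry
                joint++
            }
        }
        // Each event alone happens ~10% of the time; both together ~1%.
        fmt.Printf("joint frequency: %.4f\n", float64(joint)/n)
    }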


Mostly false.

Overlapping confidence intervals do not mean there is a >x% chance that the two variables' expected values are the same. If the intervals overlap, the difference is not statistically significant.

Your example about random variables is largely misinformed. You're talking about things as if they were individual values, but we're talking about sample means. The probability that a sample mean for a large sample is above the 90th percentile is massively lower than 10%, and depends on n. The joint probability of getting two sample means above some threshold is irrelevant.

Confidence intervals don't tell you the probability that the true mean is above some value. They tell you, bluntly, the range of values where the true mean could be, with 95% confidence ("if I were to do this experiment 100 times, based on the results I got, I would expect the true mean to be within this range").

You can play with some numbers and methods, but you can rest pretty sure that a material effect size is probably not rigorously evidenced if the intervals overlap.


> If the intervals overlap, the difference is not statistically significant.

Demonstrably false. Obvious counterexample: the study in the OP, which has overlapping confidence intervals and a statistically significant difference.

Proof: just calculate the 95% confidence interval for the difference between the two means. You can figure out what the stddev was from half the confidence interval divided by the z-score for a 95% confidence interval, 1.96, and you get 1.02 and 1.30 for the two groups. The confidence interval for the difference is then (10.4 - 6.3) +/- 1.96*sqrt(1.02^2 + 1.30^2) = [0.86, 7.34]. This does not include 0; therefore the difference is significant.
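
Same arithmetic in code, if you want to check it (the means are from the study as quoted in this thread; the 1.02 and 1.30 are the standard errors recovered above, not taken from the paper directly):

    package main

    import (
        "fmt"
        "math"
    )

    func main() {
        const z = 1.96        // z-score for a 95% CI
        m1, se1 := 10.4, 1.02 // group 1 mean and recovered standard error
        m2, se2 := 6.3, 1.30  // group 2 mean and recovered standard error
        diff := m1 - m2
        seDiff := math.Sqrt(se1*se1 + se2*se2) // SE of a difference of independent means
        fmt.Printf("95%% CI for the difference: [%.2f, %.2f]\n", diff-z*seDiff, diff+z*seDiff)
        // Prints [0.86, 7.34]: the individual CIs overlap, but the
        // difference CI excludes 0, so the difference is significant.
    }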

> The probability that a sample mean for a large sample is above the 90th percentile is massively lower than 10%, and depends on n.

I was trying to give a basic intuition about normal distributions with a simple example; the distribution of one sample is a simpler example of a different normal distribution. Yes, obviously the distribution of an estimate of X given lots of samples is not the same as the distribution of a single sample; I never claimed it was.


> You can figure out what the stddev was from half the confidence interval divided by the z-score for a 95% confidence interval, 1.96, and you get 1.02 and 1.30 for the two groups.

I'm not really interested in double-checking your math, but you cannot derive the standard deviation behind a sample-mean confidence interval without considering the sample size. You seem to be making the same mistake again, confusing the z-score of a single value with the z-score of a sample mean. The standard deviation is of course going to be much larger. Why? Because you're actually looking at a difference of proportions, where the values are either 1 or 0, so the standard deviation is going to be much larger than 1%.

Ignoring that, and assuming you meant to say standard error (where your math appears to work at a glance): in general, sure, overlapping confidence intervals don't mean that statistical tests of mean difference won't be significant. But if you don't have that, your effect size is probably pretty small. I would not put a lot of faith in these particular results as strong evidence of anything.

I would advocate for people to just look for overlapping curves.

> Yes obviously the distribution of an estimate of X given lots of samples is not the same as the distribution of a single sample, I never claimed it was.

Not number of samples. The sample size.


...No, not wanting to indefinitely maintain arbitrarily old versions of a free and open-source security library is not "planned obsolescence".


The devil is in the details.


Could you elaborate on the details? It seems like a pretty silly thing, to engage in duplicitous business practices to sell more… free code.


Usually, it is non-pertinent ABI/API breakage.

And companies that make a business of upgrading components like OpenSSL are the ones that would be targeted by a planned-obsolescence crackdown.

Usually it is C++ ABI issues (those are usually a massive pain), or manic use of glibc symbol versioning, since there are rarely API/ABI breakages in many crypto libs.


The monarch is theoretically above the law, yes.

But theory is not practice. In practice, if King Charles shivved someone in Trafalgar Square tomorrow, crowing about how he can't be prosecuted, what would happen would probably be something like:

- Parliament would try to pass a law saying that we were a republic now (or that Harry becomes king, or whatever)

- Charles would refuse royal assent

- Parliament would amend the bill to remove the requirement for royal assent for primary legislation, and then claim to have passed it under the bill's own provisions

- people would point out that this is clearly invalid and self-referential

- it would go to the UK Supreme Court, who would twist themselves into knots to conclude that it's actually fine, because they know as well as anyone else that that's the only conclusion that wouldn't result in riots and the collapse of the state as a liberal democracy

- all the institutions who matter would agree that we're a republic now


You are taking an extreme example. How about if he sexually assaulted an underage girl, a la Prince Andrew, and denied it happened? The same thing that happened to Prince Andrew would happen, i.e. nothing. The royalty is above the law unless they do something unbelievably, stupidly, obviously bad and admit it. And in that case they would just claim that one particular royal is crazy and give the power to the next in line.


Charles I tried that argument; it didn't go well for him.


Sure, but that was overturned 4 years later and caused many people to die.

I wouldn't assume anyone to be as bold as Mr Cromwell was.


> It reminds me of how lawyers are happy to accept signatures by fax

The purpose of a lot of these sorts of requirements is not authentication. It's ensuring that if you do do it, you trigger the statutory requirements for some particular criminal offence. For example, a jurisdiction might have a crime of forgery which is substantially easier to prosecute than fraud (perhaps fraud would need the prosecution to prove intent to make financial gain, whereas forgery might be satisfied as soon as you can prove the signature was forged -- hypothetical example, it will vary by jurisdiction, and IANAL).

These sorts of statutes might have been written before computers or even faxes, and there might be case law to the effect that forging someone's signature and sending it by fax does satisfy the requirements of the offence, but none yet for just writing your name at the bottom of an email; things like that.


If you fax a fake signature and there's any real or potential gain at all -- boom, in the US it's no longer a simple state court case; you're now guilty of (federal) wire fraud.


Many possible reasons.

E.g. Ryanair operates a completely uniform fleet, 100% 737s, and their current operating model relies on all of their pilots and crew being able to operate whatever aircraft is available. Adding a few Airbuses would mean a split fleet, with significant logistical and training costs. And their customers are evidently happy to prioritize flight cost over any aversion to the 737 MAX, so why would they switch?

Also, an aircraft being on the market doesn't mean you can actually order one and get it anytime soon. Airbus has like a 10-year backlog for A32x deliveries.


...and just to make sure that any possible aversion is, er, averted, they have renamed their 737 MAX to "737-8200" (and rely on journalists publishing press releases without research to improve its reception: https://www.essexlive.news/news/essex-news/stansted-airport-...).


That was always the model designation though; "737 MAX" is just branding, and -8200 is just one of its specific variants.

For example, the 737 NG was sold as the 737-600 through 737-900 (different variants).

They did indeed drop the MAX branding and replaced it with the numeric designation after MAX got a bad name following their screwup. But they didn't invent the new name; it was always there.


I don't understand why anyone would fly Ryanair ;)


Sometimes there is no choice, e.g. Prague-Malaga: the alternative routes on sane airlines require changing planes. Otherwise I agree with the sentiment :)

