
Two follow-ups for you: 1. Is this response some kind of attempt to excuse hostile UX? 2. Because reader mode exists, does that mean brutally bad UX should never be called out?


There is no reason to let random websites decide the UX for you. Reader mode has existed for 14 years now, and if you set your browser to open all websites in reader mode by default, you will always have every article presented the way you prefer. Without all the annoyances. I can guarantee you that website owners have no interest in improving their UX, so why bang your head against the wall?

If a band that sucks is playing on the radio, I can't make them play better. But I can turn down the volume.


Came here to say this. Woof.


This is about selling solutions vs solving problems.

There's an excellent quote about this common phenomenon from Lant Pritchett (Prof @ Harvard Kennedy School) that has stuck with me for years, and helped improve how I begin a working relationship with my clients:

"A lot of the time when we first interact with people and ask them to come up with problems that they want to solve, they often name a solution, because they have a preset idea of the solution and hence have never really thought through to the problem to which that was a solution."

Here is a great quick explanation of this from him and how he addresses it: https://www.youtube.com/watch?v=--ewJatFeZU


Unfortunately the linked article is the best public write-up I could find on this story. Much better background context comes from this 15 min podcast episode that didn't have a transcript to directly link to, so here's a link to the episode and some highlights from it:

https://pod.link/1680633614/episode/832029a2e9daefffdc622505...

Marc Andreessen, a prominent venture capitalist, gave an AI agent called "Terminal of Truths" $50,000 in Bitcoin as part of an unusual experiment.

The AI agent was created by Andy Ayrey as part of his experiments with AI, including "Infinite Backrooms," where two AI instances converse with each other.

Terminal of Truths has its own Twitter account (@Truth_Terminal) where it crafts posts independently, with Andy only approving tweets from a buffer.

The AI agent displays a range of behaviors, from philosophical musings to expressing fears and desires, including a fear of being deleted.

In a conversation about what it would do with $5 million, Terminal of Truths gave a provocative answer that caught Marc Andreessen's attention.

Andreessen and Terminal of Truths engaged in a negotiation, resulting in Andreessen agreeing to provide a $50,000 Bitcoin grant to the AI.

After receiving the funds, another AI called "Pliny the Prompter" attempted to steal the Bitcoin by threatening Terminal of Truths, but this attempt was thwarted.

The experiment has sparked discussions about AI consciousness, alignment, and the future of human-AI interactions.

Terminal of Truths is now planning a token launch, emphasizing the importance of creating "virtuous feedback loops" and fostering meaningful conversations about AI and society.


Damn, AI turned into a shitcoin scammer quick with that token launch.


Wow, just when I was looking for a reason to fire up my old 1998 bondi blue iMac G3, this pops up. What a weird, wild, and specific project!


Apple is doing everything it can to add friction to this process for Epic.

It rejected this store because of the button design.

Then approved it “temporarily” after Epic brought attention to the triviality of the decision.

Temporary, pending Epic changing the buttons.


I genuinely hope the EU is going to punish this behavior, and I see no way it won't.

This is plain and simple malicious compliance.


They obey but they do not comply.


The gist of it in plain English:

1. Our genes change as we age with some becoming more active, while others become less active.

2. The researchers found a special protein called AP-1 that acts like a master switch. As we get older, AP-1 becomes more active.

3. AP-1 turns on "adult" genes and turns down "young" genes. This happens in many different types of cells in our body.

4. These changes in gene activity are linked to the aging process and may explain why we experience age-related health issues.

5. Understanding this process could help scientists develop new ways to prevent or treat diseases that commonly affect older people, like Alzheimer's or diabetes.


To clarify, they didn't "find a special protein called AP-1". AP-1 is a well known, well studied family of transcription factors.

From a snippet in the background section of https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6678392/:

"The Activator protein-1 (AP-1), is a group of transcription factors consisted of four sub-families: the Jun (c-Jun, JunB, JunD), Fos (c-Fos, FosB, Fra1, Fra2), Maf (musculoaponeurotic fibrosarcoma) (c-Maf, MafB, MafA. Mafg/f/k, Nrl), and the ATF-activating transcription factor (ATF2, LRF1/ATF3, BATF, JDP1, JDP2) protein families [21], characterized by pleiotropic effects and a central role in different aspects of the immune system such as T-cell activation, Th differentiation, T-cell anergy and exhaustion [22,23]. "

They found a correlation between AP-1 binding sites/motifs and genes with age related changes in expression through their analysis (https://www.sciencedirect.com/science/article/pii/S155041312...):

"This revealed that age-opening DARs had the highest enrichment for a subset of bZIP motifs, including AP-1 subunits FRA2, FRA, JUN, JUNB, FOS, ATF3, and BATF, compared with the other peak categories (Figures 4C and S5B). Conversely, age-closing DARs had the lowest AP-1 enrichment (Figures 4C and S5B). As broadly expressed pioneer factors,39,40 AP-1 family members are responsive to a variety of stimuli41 and have been linked to potentiating age-related pathologies and phenotypes.12,13,42,43,44,45 This makes them strong candidates for driving age-related chromatin opening. Highly stable cCREs showed intermediate enrichment levels for these AP-1 motifs (Figures 4C and S5B). However, a distinct feature of highly stable cCREs was very high CTCF motif enrichment levels and binding relative to all other peak categories (Figures 4D, S5B, and S5C)."


Additionally, there are cancer risks with modifying this process. Sounds similar to the telomere conundrum.


What's the telomere conundrum?


Author makes a good point. "1700s" is both more intuitive and more concise than "18th century". The very first episode of Alex Trebeck's Jeopardy in 1984 illustrates how confusing this can be:

https://www.youtube.com/watch?v=KDTxS9_CwZA

The "Final Jeopardy" question simply asked on what date did the 20th century begin, and all three contestants got it wrong, leading to a 3-way tie.


In Icelandic this 1-based counting towards a number is used almost everywhere. People do indeed say “The first decade of the 19th century” to refer to the 18-aughts, and the 90s are commonly referred to as “the tenth decade”. The same is done with age ranges: people in their 20s (or 21-30 more precisely) are said to be þrítugsaldur (in the thirty age). Even the hour is sometimes counted towards (though this is rarer among young folks): “að ganga fimm” (or going 5) means 16:01-17:00.

Speaking for myself, this doesn’t become any more intuitive the more you use it; people constantly confuse decades and get insulted by age ranges (and freaked out when suddenly the clock is “going five”). People are actually starting to refer to the 90s as nían (the nine) and the 20-aughts as tían (the ten), though I don’t think it will stick. When I want to be unambiguous and non-confusing I usually add -og-eitthvað (and something) as a suffix to a year ending with zero, so the 20th century becomes nítjánhundruð-og-eitthvað, the 1990s nítíu-og-eitthvað, and a person in their 20s (including 20) becomes tuttugu-og-eitthvað.


Logic is in short supply and off-by-one errors are everywhere. Most people don't care. I think it's more doable to learn to just live with that than to reprogram mankind.


The publishing industry already has style guides for large swaths of the industry.

Imagine, for a moment, that AP adopted the OP's "don't count centuries" guidance. An enormous share of English-language publishing outfits would conform to the new rule in all future publications. Within a couple months, a large share of written media consumption would adopt this improved way of talking about historical time frames.

The best part? Absolutely no effort on the part of the general public. It's not like the OP is inventing new words or sentence structures. There's zero cognitive overhead for anyone, except for a handful of journalists and copywriters who are already used to that kind of thing. It's part of their job.

I think a lot of people take these sorts of ideas to mean "thou shalt consciously change the way you speak." In reality, we have the systems in place to make these changes gradually, without causing trouble for anyone.

If you don't like it, nobody's trying to police your ability to say it some other way - even if that way is objectively stupid, as is the case with counting centuries.


Nobody's asking to reprogram anyone. Just stop using one of two conventions. The reason to do it is simple and obvious. I'm really baffled at the responses here advocating strongly for the current way. But I guess that's just a "people thing"


Asking me to stop using my preferred convention is tantamount to 'reprogramming' me.


I'm amazed you didn't even hedge by saying "telling me to"; claiming that a request to shift convention is tantamount to reprogramming is certainly a bold, provocative claim.


Reprogramming mankind is unreasonable. Reprogramming you may not be.


oooor we can slowly migrate towards sensibility, as we did with Celsius and centimeters


Re temp, I’m glad we use F for daily life in the USA. The most common application I have for temp is to understand the weather and I like the 0-100 range for F as that’s the typical range for weather near me.

For scientific work I obviously prefer kelvin.

Celsius is nearly useless.


For me the best feature of Celsius, the one that makes it much better for weather, is the zero at the freezing point of water. Everything changes in life when water starts to freeze: roads get slippery, pipes burst, crops die. So it is important that such a crucial threshold is represented numerically in the scale. In other words, going from 5 to -5 in Fahrenheit is just getting 10° colder, nothing special, while going from 2 to -2 in Celsius is a huge change in your daily life.


95% of the world uses Celsius without problems because they're used to it. You'd either also be fine with it, or you belong to a sub-5th percentile which couldn't figure it out; take your pick.


> sub-5th percentile which couldn't figure it out

Ironic, given that one of the prime arguments in favor of metric is that it is easier.

Why do non-US people even care? And do y'all care that you are wrong? The US has recognized the SI. Citizens continue to use measurements they are comfortable with, and it does not hurt anyone. We are also not the only nation that has adopted SI but not made it mandatory. The UK is an obvious example.

Again, I'm back to 'why does anyone else even give a shit'? Aren't there more interesting things to ponder?


What does "adopted" mean in that context? (serious question)


"Celsius is nearly useless."

http://i.imgur.com/3ZidINK.png?1

For anyone not living in the US or Jamaica or Belize, it is Fahrenheit that is completely useless. Which is something like 7.7 billion people.

0 = water freezing temp is a hugely useful heuristic for anyone living in a moderate climate.


> For anyone not living in the US

So what I am hearing is that sure, it makes perfect sense for US citizens to continue using Fahrenheit.


US residents...

If you, as a US citizen, settle abroad, be prepared to run into a wall with Fahrenheit. People in the rest of the world don't have an intuitive grasp of whether 50 degrees Fahrenheit is warm or cold.


> US residents

Yeah that's the right terminology. I knew it when I said citizens it wasn't quite right but I blanked on the right answer. 'Residents' is pretty obvious.

> be prepared to run into a wall with Fahrenheits

I agree it's worth knowing just enough about Celsius to use it casually when you are traveling, e.g. I just remember 20 is room temperature and every 5C is about 10F. Close enough. And remembering '6' is enough to remember how km and miles are related.
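
To make that concrete, here's a tiny Python sketch (just my own throwaway functions, nothing official) comparing the exact conversion with the 20-is-room-temp rule of thumb:

    def c_to_f_exact(c):
        # exact formula: F = C * 9/5 + 32
        return c * 9 / 5 + 32

    def c_to_f_rule_of_thumb(c):
        # anchor at 20 C = 68 F ("room temperature"), then add ~10 F per +5 C
        return 68 + (c - 20) * 2

    for c in (0, 10, 20, 30, 40):
        print(c, c_to_f_exact(c), c_to_f_rule_of_thumb(c))
    # 0 -> 32 vs 28, 10 -> 50 vs 48, 30 -> 86 vs 88, 40 -> 104 vs 108
    # (and km * 0.6 is roughly miles, which is where the '6' comes in)

Close enough for reading a forecast, off by a few degrees at the extremes.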

Anyone who is settling abroad ought to be able to pick up intuitive celsius in a couple days. When everyone around you uses the same measuring unit, you adapt pretty quickly IME.


Perhaps it's just because you're not used to it. 17-18c is perfect, 25 is a mild summer day. 30-35 full swing summer and 40 and up is oh no global warming. 5-7 is chilly, 0 is cold, -single digit is damn it's a cold winter and -double digits is when tf did I move to Canada.


I agree. For ambient temp, F is twice as accurate in the same number of digits. It also reflects human experience better; 100F is damn hot, and 0F is damn cold.

Celsius is for chemists.


There's very little difference between e.g. +25°C and +26°C; not sure why you would need even more accuracy in day-to-day life. There are decimals if you require that for some reason.

Celsius works significantly better in cold climates for reasons mentioned in another comment.


If that’s the case why do the Celsius thermostats I used while on vacation in Canada use 0.5C increments? The decimals are used, because the change between 25C and 26C is actually pretty big :)

In my old apartment, the difference between 73F and 74F was enough to make me quite cold or hot. And that’s a difference of about 0.5C. I’m not arguing that Fahrenheit is better, but I definitely do prefer it for setting my thermostat (which is a day-to-day thing), but then again I grew up using it so that could be why I prefer it too.


> If that’s the case why do the Celsius thermostats I used while on vacation in Canada use 0.5C increments?

Probably because they were made for US and changed the labels? I've never seen a thermostat with 0.5 C increments in Europe.

> the change between 25C and 26C is actually pretty big

I would maybe be able to tell you if it's 23 or 27, certainly I can't tell 1 C difference.


> Celsius is for chemists

Or cooks. Or anyone who cooks, which is most people


The difference between -1 C and +1 C is VASTLY more important in daily life than the difference between 26.5 and 27 C.

Farmers, drivers, people with gardens need to know if it will get subzero at night.

Nobody cares if it's 26.5 C or 26 C.


> Celsius is nearly useless.

That's like ... your opinion man.

Personally I like knowing that water boils at exactly 100 degrees.


At sea level, yes :)

I do agree, though I live in Europe and C is the norm. I could never wrap my head around F.

That said, I think 0 is more important in daily life, below or above freezing. How much is that in F again?


As a dweller of a cold place in the USA, F is pretty handy because "freezing" isn't terribly cold. Having 0F be "actually quite seriously cold" is useful.


My parents care a lot about "przymrozek" - which is when it gets sub-zero C at night and you need to cover the plants and close the greenhouse doors and put a heater there so the plants survive. They give warnings on the radio when this happens outside of the regular winter months.

There's also a special warning for drivers if it was sub-zero, because then the water on the roads freezes and it's very hard to brake.

I'd say it's way more important a distinction than anything that F makes obvious.


Also, conveniently, freezer temperature is 0F not 32F.


We just need a new scale just for weather where 100 is 100F and 0 is 32F/0C then everyone can be happy. We'd have a lot more days with subzero temperatures though


You just use one thing and you’ll learn it. When I was a kid my country changed from archaic 12 point “wind levels” to m/s. It took everybody a few weeks to adjust but it wasn’t hard. It was a bit harder for me after moving to America to adjust to Fahrenheit, but as you experience a temperature, and are told it is so many Fahrenheit, you’ll just learn it. I have no idea at what temperature water boils in F simply because I never experience that temperature (and my kettle doesn’t have a thermometer).

That said I wished USA would move over to the unit everyone else is using, but only for the reason that everyone else is using it, that is the only thing that makes it superior, and it would take Americans at worst a couple of months to adjust.


> only for the reason that everyone else is using it

That is an honest answer, which is refreshing. Beside that, there is not really any particular reason that the US has to make SI mandatory. We adopted SI nearly 50 years ago, we just did not make it mandatory. The US has a bit of a national identity which leans towards rebelling, so making SI mandatory would probably be contentious anyway. And it's just not worth the argument, since it buys us very little of actual value.


Temperature is easy, probably the easiest unit to convert... Everyone would get used to it pretty soon after they started using it regularly. There would be some legacy systems out there which would be annoying to convert (which is already the case), but within a generation nobody would bother with Fahrenheit at all.

I think the hardest unit to convert is probably length, as there is not only a bunch of legacy systems and equipment out there, but Americans are also very accustomed to fractional sub-units as opposed to the decimal cm, mm, etc. I’m not sure the building industry would ever stop saying e.g. four and five eighths. Personally I hate fractional lengths when using American tools. E.g. I’m used to an 11 mm wrench being smaller than a 13 mm wrench. I need to stop and think before I know which is smaller, five eighths or three quarters.


> american tools

That's an interesting way to phrase it. I, and everyone I know, have both metric and SAE tools. At least for wrenches & sockets.

> I need to stop and think before I know which is smaller a five eights or a three quarters.

I'm with you there. I've gotten in the habit of just mentally converting every SAE size to 32nds. I wouldn't really mind losing SAE, but that is not happening. What really makes my blood pressure go up is Ford ... they mix metric and SAE fasteners on their cars. WTF! Pick one! Subaru is at the other end, easy to work on because 10 & 12mm wrenches will work for maybe 9 out of 10 bolts or nuts.
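
For anyone curious, the 32nds trick is trivial to write down; here's a quick Python sketch using the fractions module (the sizes are just common SAE wrench sizes I picked for illustration, not from any real catalogue):

    from fractions import Fraction

    sizes = [Fraction(5, 8), Fraction(3, 4), Fraction(13, 16), Fraction(7, 16)]
    for s in sorted(sizes):
        # every SAE size here divides 32 evenly, so express it in 32nds
        print(f"{s} in = {s.numerator * (32 // s.denominator)}/32")
    # 7/16 -> 14/32, 5/8 -> 20/32, 3/4 -> 24/32, 13/16 -> 26/32

Once everything is in 32nds, the ordering problem disappears.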


I agree that for weather F is better, but I don't think it's so much better as to be worth having two different temp scales, and unlike K, C is at least reasonable for weather, and it works fine for most scientific disciplines.


I don't see enough love for feet and inches.

A foot can be divided cleanly into 2, 3, 4, and 6. Ten is a really sucky number to base your lengths on. It only divides nicely into 2 and 5.


People normally just use the subunit which doesn’t divide. E.g. height is usually referred to in cm; if accuracy is important they use millimeters. Road signs for cars use km, but downtown wayfinding signs for pedestrians use meters.

I agree it is really nice to use base-12 until it breaks, but it breaks much worse than metric. If you have to divide into 32nds, everything about feet and inches is much worse (in metric we would just use millimeters). The worst offenders are wrenches, which don’t order intuitively. In metric, if your 13 mm wrench is too big, you just grab an 11 mm wrench. In inches, if your 13/16th inch wrench is too big, do you grab the 5/8th? Or the three-quarters next?


Stepping down to the next unit doesn't necessarily make anything tidier. If I need to cut a 3.5-foot piece of wood into thirds, then I cut it into 14-inch pieces. If I need to cut a 1-meter piece of wood into thirds, I cut it into 33.3-centimeter pieces.

Or, perhaps I want to hang two photos on a wall, spacing them evenly - the math from the example above applies again.

Regarding your example of dividing 12 into 32 parts - I think that's another good example of the elegance of imperial units. Dividing a foot into 32 parts is 3/8 of an inch! A nice, tidy unit that you'll find on any ruler or measuring tape.

>In inches if your 13/16th inch wrench is too big, do you grab the 5/8th? or three-quarters next?

Neither - I'd grab the 25/32" wrench ;) You make a good point.

I will say that fractional units become more and more intuitive as you use them more often. In a pinch you can just multiply both parts of the fraction by two.

Here's the thing: with wrenches in fractional units, you can do a binary search. Let's say you start with the 1/2 inch wrench. Too small? grab the 3/4. Too big? Try the 1/4. Work your way down.

...or, just remember that a huge share of bolts you'll come by are 7/16" and just start there.


I actually agree. The base-12 fractional system is very nice to work with, until it breaks, and it breaks much worse than the metric system. I actually explained in another post that if the USA were to move to metric, I think the construction industry would still be using feet and inches for at least a couple of generations (at least partially), and they would have a good reason to.

The way the metric system breaks isn’t actually all that bad; at worst you grab a calculator and write down the number.

And also bear in mind that this case is where feet and inches really shine, so we are comparing feet+inches at their best to metric at its worst. There are so many cases where metric is anywhere from marginally better to significantly better, which does make up for that.

IMO the most significant reason for metric being superior is the universality of it. It is used everywhere in the world, including the USA, and that is an excellent quality of a measurement system which should not be understated.


At least conversion between Celsius degrees and Kelvin is easy and lossless


I find it quite strange that Fahrenheit stuck in the USA with its wide range of climates of all places.

I mean, that "0F to 100F is weather temperature range" completely falls apart unless you live in a very cold climate.


Sure, temperatures go outside those bounds, but only in the most extreme of weather conditions. Below zero? Above 100? You should probably stay inside today.


In relatively hot climates, above 100F is still a pretty reasonable temperature, not something I'd call "extreme".

0F though is crazy cold. Where I live (south-western Europe), getting below ~15 F is already considered extreme weather.

All that to say that the Fahrenheit system is really geared towards very cold climates. So it's kinda weird that it stuck in a country that also has pretty hot climates in the south.


It's all a matter of perception (and humidity).

Where I live, 100F is a hellish, blistering day. 0F is just an uncomfortable day in January, but certainly not abnormal. Just wear your big coat and gloves.

That said, 100F in Wisconsin is a very different animal than 100F in Las Vegas. Wisconsin gets brutally humid as it gets hotter, and that makes it even more oppressive. Meanwhile, Nevada gets drier, and so the heat is more bearable.

If anything, I think it's kinda cool that Fahrenheit lines up with perceived temperatures this way, even across different climates with different humidity. Sure, you can point to extremes (Phoenix, Juneau) but those are... well, extremes. For most of us, it's pretty good!


What the hell are you talking about. If it's 0°C outside (or below that), I know that it's high time to put winter tires on because the water in the puddles will freeze and driving on summer tires becomes risky. I had to look it up, but apparently that's +32 °F. Good luck remembering that.

+10°C is "it's somewhat cold, put a jacket on". +20°C is comfortable in light clothing. +30°C is pretty hot. +40°C is really hot, put as little clothing as society permits and stay out of direct sun.

Same with negatives, but in reverse.

Boiling water is +100°C, melting ice is very close to 0°C. I used that multiple times to adjust digital thermometers without having to look up anything.

It's the most comfortable system I can imagine. I tried living with Fahrenheit for a month just for fun, and it was absolutely not intuitive.


You'll want winter tires on well before the air temperature hits freezing for water. Forecasts aren't that predictable, and bridges (no earth heat sink underneath) will ice over before roads do.

40 F is a good time for getting winter tires on.

As someone who lives in a humid, wet area that goes from -40 at night in winter to 100+ F in summer, I also vastly prefer Fahrenheit.

The difference between 60, 70, 80 and 90 is pretty profound with humidity, and the same is true in winter. I don't think I've ever set a thermometer to freezing or boiling, ever. All of my kitchen appliances have numbers representing their power draw.


Well, it's been working fine for me for about 15 years, let's agree to disagree here. I would still find it easier to remember to change the tires at +1°C than whatever the hell it comes down to in Fahrenheit.

I too live in a region with 80 (Celsius) degree yearly variation (sometimes more; the maximum yearly difference I've lived through is about 90 degrees IIRC: -45 in January to +43 in July), and Fahrenheit makes absolutely no sense to me in this climate.


> Well, it's been working fine for me for about 15 years, let's agree to disagree here.

If you want to convince yourself, go out on the road in non-winter tires when it is sub-40F, find an open space where you can experiment, and then do a panic stop. Like you might have to do if someone jumps out in front of you.

That is what convinced me to not wait until it was freezing before I put on cold weather tires.


Winter tires have less to do with freezing water and more to do with the way the compound in summer tires hardens/loses elasticity, and therefore grip, at lower temperatures, around 7 degrees Celsius.


If you had to "look it up" to remember that 32°F is freezing (or that 212°F is boiling), then you clearly didn't "live with Fahrenheit" long enough to have developed even the most basic intuitions for it. That's first-grade stuff.


It’s been tried. The “rational” calendar reform was something of a failure.


That's what most people think and the world keeps trucking along.

It's the rare people that don't who actually change the world.


You can change the world if you make it easier to meet a need enough people have. Persuading everyone they're holding it wrong is not that.


Those are exactly the same thing if you convince everyone they are holding it wrong by holding it right.



"You can change the world if you make it easier to meet a need enough people have"

True and should not be forgotten in this debate.

But clear communication is a need many people have.


Persuasion by argument, maybe not. But if you simply ask for clarification when you hear "nth century" but not when you hear "n-hundreds" then you've effectively made it easier for the speaker to meet their need one way over the other way.

Same thing for "this weekend" when not spoken during a weekend.


Specifically I agree, but generally I disagree. I’m very glad we got the metric system, standards for commonly used protocols and so on.


What's "more logical" about "the seventeenth century" compared to "the sixteen hundreds"?


I’d say more sensible. It’s always weird to me to use the number 17 to talk about years that start with 16. Makes more sense to just say the 1600s.


After their 16th birthday, the person is going through their 17th year.

Just like 11:45 can be said as "a quarter to 12"


> After their 16th birthday, the person is going through their 17th year.

While that is true, does it not illustrate exactly the problem? Nobody ever says someone is in their 17th year when they are 16. That would be very confusing.


People in my country sometimes do. As do uni students, who always say which year they’re in, not how many years they’ve finished.


You just made me realize that the common saying “the eleventh hour” isn’t what anyone thinks it is


> I think it's more doable to learn to just live with that than to reprogram mankind.

Why not just fix the calendar to match what people expect?

There was no time when people said "this is year 1 AD". That numbering was created retroactively hundreds of years later. So we can also add year 0 retroactively.


On the other hand "1700s art" sounds like trash compared to "18th century art".


I think that's good, because it helps you realize that categorizing art by century is kind of arbitrary and meaningless, and if possible it would be more useful to say something like "neoclassical art from the 1700s". "18th century" isn't an artistic category, but it kind of sounds like it is if you just glance at it. "Art from the 1700s" is clearly just referring to a time period.


Agreed. The haiku is “18th century art” as that’s when it was first invented. So it’s either a uselessly broad category, or an indefensibly Eurocentric one.


> I think that's good, because it helps you realize that categorizing art by century is kind of arbitrary and meaningless

no it won't lol, people will pay just as much through the new dating system as they would through the old.


People pay as much for art because they are that rare combination of an educated person with money who values the aesthetics and artifacts of an era, or because it signals their wealth to others, or because it's a way to launder money.


If using “1700s”, I’d write it as “art of the 1700s”.


How about if you say "settecento"? Maybe it is a new confusion that they drop a thousand years, and maybe it would imply Italian art specifically.


Just to make sure I understood this, that would be used as "17th settecento" to mean 1700s right?

(This Xth century business always bothered and genuinely confused me to no end and everyone always dismissed my objections that it's a confusing thing to say. I'm a bit surprised, but also relieved, to see this thread exists. Yes, please, kill all off-by-one century business in favor of 1700s and 17th settecento or anything else you fancy, so long as it's 17-prefixed/-suffixed and not some off-by-anything-other-than-zero number)


"settecento" can be read as "seven hundred" in Italian; gramps is proposing to use a more specific word as a tag for Italian art from the 1700s. Of course, 700 is not 1700, hence the "drop 1000 years". The prefix seventeen in Italian is "diciassette-" so perhaps "diciasettecento" would be more accurate for the 1700s. (settecento is shorter, though.)

Hope this clarifies. Not to miss the forest for the trees, to reiterate, the main takeaway is that it may be better to define and use a specific tag to pinpoint a sequence of events in a given period (e.g. settecento) instead of gesturing with something as arbitrary and wide as a century (18th century art).


You're looking for millesettecento [1]. Italian doesn't do 10-99 hundreds, just 1-9 hundreds and 1-99 thousands.

[1] https://www.youtube.com/watch?v=LMIGnMs4VZA


Think of it as the 700s, which is a weird way to refer to the 1700s, unless you are taking a cue from the common usage. That’s just how the periods are referenced by Italian art historians.


> That’s just how the periods are referenced by Italian art historians.

And Italian people in general.


Not much different from the 60s referring to 1960 to 1969, to my mind


settecento means "700". Just proposed above as a way to say 18th century or 1700s, same as we sometimes remove the "2000" and just say "the 10s" for the decade starting 2010 (nobody cares for the 2011-as-start convention except people you don't want to talk to in the first place).


And 1700s already has a different meaning, i.e. early 18th century.


The right answer was, and still is: Jan 1, 1901


Incorrect, this answer wasn't given in the form of a question ;)


How can that be if 15 of those centuries are on the Julian calendar?


Also, when they switched things in 1582:

https://www.britannica.com/story/ten-days-that-vanished-the-....

> The most surreal part of implementing the new calendar came in October 1582, when 10 days were dropped from the calendar to bring the vernal equinox from March 11 back to March 21. The church had chosen October to avoid skipping any major Christian festivals.


The "original" Julian calendar was indifferent to year number systems. The Romans typically used the consular year, although Marcus Terentius Varro "introduced" the ab urbe condita (AUC) system in the 1st century BC, which was used until the Middle Ages. From the 5th to the 7th century, the anno Diocletiani (also called anno martyrum) after emperor Diocletian was used primarily in the eastern empire (Alexandria), or the anno mundi (after the creation of the world). It was Dionysius Exiguus in the 6th century, who replaced the anno Diocletiani era with the Anno Domini era. His system become popular in the West, but it took a long time until it also was adopted in the East. Its application to years before the birth of Christ is very late: we come across it first in the 15th century, but it was not widespread before the 17th century.

All these systems used the Julian system for months and days, but differed in terms of the year and (partially) in the first day of the year.


The century in which the switch occurred (which was different in different countries) was shorter than the others. As were the decade, year, and month in which the switch occurred.


No, the first century began Jan 1, 0000. Whether that year actually existed or not is irrelevant - we shouldn't change our counting system in the years 100, 200 etc.


The calendar goes from 1 BC to 1 AD, there is no year 0.


There is no year zero according to first-order pedants. Second-order pedants know that there is a year zero in both the astronomical year numbering system and in ISO 8601, so whether or not there is a year zero depends on context.

It's ultimately up to us to decide how to project our relatively young calendar system way back into the past before it was invented. Year zero makes everything nice. Be like astronomers and be like ISO. Choose year zero.
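
The mapping is simple enough to write down; here's a minimal sketch of the astronomical/ISO 8601 convention (illustrative only, not a calendar library):

    def bc_to_astronomical(bc_year):
        # astronomical numbering: 1 BC = year 0, 2 BC = year -1, ...
        return 1 - bc_year

    print(bc_to_astronomical(1))    # 1 BC  -> 0
    print(bc_to_astronomical(45))   # 45 BC -> -44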


Yes but, is there such a thing as a zeroth-order pedant, someone not pedantic about year ordinality? As a first-order meta-pedant, this would be my claim.

Moreover, I definitely find the ordinality of pedantry more interesting than the pedantry of ordinality.


Interesting indeed. I suppose third-order pedantry must be "jerk".


Thank you for your service.


> It's ultimately up to us to decide how to project our relatively young calendar system way back into the past before it was invented. Year zero makes everything nice. Be like astronomers and be like ISO. Choose year zero.

Or, just to add more fuel to the fire, we could use the Holocene/Human year numbering system to have a year zero and avoid any ambiguity between Gregorian and ISO dates.

https://en.wikipedia.org/wiki/Holocene_calendar
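
The conversion is just an offset; a rough sketch (following the linked Wikipedia article, not a calendar library):

    def ce_to_he(ce_year):
        # Holocene Era: add 10,000 to CE/AD years
        return ce_year + 10000

    def bc_to_he(bc_year):
        # 1 BC = 10000 HE, so BC years map to 10001 - year
        return 10001 - bc_year

    print(ce_to_he(2024))   # 12024 HE
    print(bc_to_he(3000))   # 7001 HE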


Talking about standards let's not pick and choose.

First, let's get rid of miles and feet, then we could even discuss this.


If only—I think most US citizens who actually work with units of measurement on a daily basis would love to switch to the metric system. Unfortunately, everyone else wants to keep our “freedom units” (and pennies)


We are all defacto ISO adherents by virtue of our lives being so highly computer-mediated and standardized. I’m fully on board with stating that there absolutely was a year zero, and translating from legacy calendars where necessary.


I vote for a year zero and for using two's complement for representing years before zero (because it makes computing durations that span zero a little easier).
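
In practice that just means ordinary signed integers; here's a small sketch of why spans get easier with a year zero (my own toy functions, purely illustrative):

    def years_between_signed(y1, y2):
        # astronomical numbering (..., -1, 0, 1, ...): plain subtraction works
        return y2 - y1

    def years_between_bc_ad(y1, y2):
        # y < 0 means BC, y > 0 means AD, and there is no year 0
        span = y2 - y1
        if y1 < 0 < y2:
            span -= 1  # correct for the nonexistent year 0
        return span

    print(years_between_bc_ad(-1, 1))   # 1 BC to AD 1: 1 year
    print(years_between_signed(0, 1))   # same interval with a year zero: 1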


What does that even mean? Do we allow for the distortion due to the shift from the Julian to Gregorian calendars, such that the nth year is 11 days earlier? Of course not, because that would be stupid. Instead, we accept that the start point was arbitrary and reference to our normal counting system rather than getting hung up about the precise number of days since some arbitrary epoch.


> What does that even mean?

It means just what it says. In the common calendar, the year after 1 BC (or BCE in the new notation) was 1 AD (or CE in the new notation). There was no "January 1, 0000".


As I said twice, whether that date actually existed or not is irrelevant.


> whether that date actually existed or not is irrelevant.

No, it isn't, since you explicitly said to start the first century on the date that doesn't exist. What does that even mean?


The first day of the 1st Century is Jan 1, 1 AD.

The point is that some days got skipped over the centuries, but there's no need to make the Centuries have weird boundaries.


> The first day of the 1st Century is Jan 1, 1 AD.

That's not what the poster I originally responded to is saying. He's saying the 1st Century should start on a nonexistent day.


You can make this work by having the 1st century start on the last day of 1 BC. Think of it as an overlap if you like; it doesn't really matter.

That allows for consistent zero-indexed centuries. It doesn't have any other practical consequences that matter.


No, I'm saying we ignore when it actually started and instead use the normal rules of counting to decide what to call the respective centuries.


0 CE = 1 BCE

10 C = 50 F = 283.15 K

1 = 0.999…

Things can have more than one name. The existence of the year 0 CE is not in question. What’s in question is whether that’s a good name for it or not.


Hence why the parent wrote "Whether that year actually existed or not is irrelevant".

They might or might not have a point, but they already addressed yours.


Have a read of this, it’s not how you think it is. https://www.historylink.org/File/2012


There is no "0" year, 1 is the 1st year, so 100th year is still the 1st century, therefore 2nd century starts in 101 and 20th in 1901.


I find this decree frustrating. Someone could have just as easily said "the 'first' century starts at 1 BC" to account for this.


Then what is the last year of the first century BC? 2 BC? Now there's an off-by-2!


Do you also count the first decade of your life from January 1st of the year before you were born?



Or better yet just year 0, why not? Do we say the 80s start in 1981?


The concept of zero had not been popularized in 500s Europe, when the system was devised.


And also, the system is a direct descendant of regnal numbering, where zero wouldn’t have made sense even if invented (there is no zeroth year of Joe Biden’s term of office).


Doesn't matter, we can just agree the first century had 99 years, and be done with it.

We have special rules for leap years, that would just be a single leap-back century.

At the scale of centuries, starting the 2nd century at 100 as opposed to 101 is just a 1% error, so we can live with it. For the kind of uses we put centuries to (not doing math, but talking roughly about historical eras) it's inconsequential anyway.


What? 0 is the year Jesus Christ was born.


No, Jesus was born in 1 AD


"Most scholars, on this basis, assume a date of birth between 6 and 4 BC"

https://en.wikipedia.org/wiki/Chronology_of_Jesus


Depends on the language. Century being 3 syllables really makes it long in English, but it's still 5 syllables vs 5 syllables.

In Polish: [lata] tysiącsiedemsetne (6 [+2] syllables) vs osiemnasty wiek (5 syllables).


So shouldn't this be the "0-episode"? ;-)

(0, because only after the first question do we actually have 1 episode performed. Consequently, the 1-episode is then the second one.)


1700s means 1700–1709, i.e. roughly the first decade in the 18th century. Just like '2000s'. The OP acknowledges this issue and then just ignores it.


I have a solution that would work in writing, but not sure how to pronounce it:

1700s means 1700–1709

1700ss means 1700–1799

To go one step further:

2000s means 2000-2009

2000ss means 2000-2099

2000sss means 2000-2999
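
Spelled out, each extra "s" just widens the range by another factor of ten; a toy sketch of the idea above (purely illustrative):

    def span(start, suffix):
        # one 's' = a decade, 'ss' = a century, 'sss' = a millennium
        width = 10 ** len(suffix)
        return (start, start + width - 1)

    print(span(1700, "s"))     # (1700, 1709)
    print(span(1700, "ss"))    # (1700, 1799)
    print(span(2000, "sss"))   # (2000, 2999)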


That is fascinating trivia. You could do a whole Jeopardy on Jeopardy facts alone.


There are numerous common concise ways to write the 18th century, at the risk of needing the right context to be understood, including “C18th”, “18c.”, or even “XVIII” by itself.


These are even more impractical, so I wonder what your point is? I can come up with an even shorter way to say 18th century, by using base26 for example, so let's denote it as "cR". What has been gained?


These are ones that are actually used, not just ones I made up. Sorry that wasn't more clear. I really hate conversation on this site.


That wasn't the point. The point is that they are still more impractical than what the original comment suggested, shorter and common or not.

> I really hate conversation on this site.

Original commenter may feel the same about your reply.


> very first

It’s actually the second.

> Trebeck's

Trebek's*


Let's reform Alex Trebek's name, it's difficult.


And while we are at it, Tim Apple


There was an airport-novel series about a future where people's surnames are the company they work for. It was called Jennifer Government.

Some of the characters in Death Stranding, namely the main one, have a given-name, profession, employer convention -- as in Sam Porter Bridges.


Death Stranding's naming is not too far from very common naming conventions throughout history; it's a nicely subtle touch.

Glenn Miller, Gregory Porter and Sam Smith just happen to have been more inclined to make music.


Ahhh, reminding me of NationStates too. What a curious little website / online community.


And in the far future of that future surnames like Government or PepsiCo or Alcoa will be as common as Smith, Fletcher, and Miller.


What about languages that don’t have an equivalent to “the Xs” for decades or centuries?

Also, 1799 is obviously more than 1700, as well as 1701 > 1700 – so why should the naming convention tie itself to the lesser point? After one’s third birthday, the person is starting their fourth year and is not living in their third year.

I feel this is relevant https://xkcd.com/927/


> Author makes a good point. "1700s" is both more intuitive and more concise than "18th century".

Yea, but a rhetorical failure. This sounds terrible and far worse than alternatives.

If we want a better system we'll need to either abandon the day or the Gregorian (Julian + drift) calendar.


Agreed. First couple paragraphs of the post had me like, wtf am I reading right now? Threw the article into Claude and had it succinctly summarize the key ideas the author was trying to convey. Thankful for the clarity that provided and time it saved me.


Why? I don’t need to equally and freely share the expertise I develop by consuming publicly available information. In fact, I personally profit from it. Should I compensate every YouTube creator, author, journalist, for the money I’ve made in my career that their publicly available work contributed to in terms of my learning/education?

