Two follow-ups for you:
1. Is this response some kind of attempt to excuse hostile UX?
2. Because reader mode exists, does that mean brutally bad UX should never be called out?
There is no reason to let random websites decide the UX for you. Reader mode has existed for 14 years now, and if you set your browser to open all websites in reader mode by default, you will always have every article presented the way you prefer. Without all the annoyances. I can guarantee you that website owners have no interest in improving their UX, so why bang your head against the wall?
If a band that sucks is playing on the radio, I can't make them play better. But I can turn down the volume.
This is about selling solutions vs solving problems.
There's an excellent quote about this common phenomenon from Lant Pritchett (Prof @ Harvard Kennedy School) that has stuck with me for years, and helped improve how I begin a working relationship with my clients:
"A lot of the time when we first interact with people and ask them to come up with problems that they want to solve they often name a solution because they have a preset idea of the solution and hence they never really have thought through to the problem to which this that was a solution"
Unfortunately the linked article is the best public write-up I could find on this story. Much better background context comes from a 15-minute podcast episode that doesn't have a transcript to link to directly, so here's a link to the episode and some highlights from it:
Marc Andreessen, a prominent venture capitalist, gave an AI agent called "Terminal of Truths" $50,000 in Bitcoin as part of an unusual experiment.
The AI agent was created by Andy Ayrey as part of his experiments with AI, including "Infinite Backrooms," where two AI instances converse with each other.
Terminal of Truths has its own Twitter account (@Truth_Terminal) where it crafts posts independently, with Andy only approving tweets from a buffer.
The AI agent displays a range of behaviors, from philosophical musings to expressing fears and desires, including a fear of being deleted.
In a conversation about what it would do with $5 million, Terminal of Truths gave a provocative answer that caught Marc Andreessen's attention.
Andreessen and Terminal of Truths engaged in a negotiation, resulting in Andreessen agreeing to provide a $50,000 grant in Bitcoin to the AI.
After receiving the funds, a pseudonymous prompt hacker known as "Pliny the Prompter" attempted to steal the Bitcoin by threatening Terminal of Truths, but this attempt was thwarted.
The experiment has sparked discussions about AI consciousness, alignment, and the future of human-AI interactions.
Terminal of Truths is now planning a token launch, emphasizing the importance of creating "virtuous feedback loops" and fostering meaningful conversations about AI and society.
1. Our genes change as we age with some becoming more active, while others become less active.
2. The researchers found a special protein called AP-1 that acts like a master switch. As we get older, AP-1 becomes more active.
3. AP-1 turns on "adult" genes and turns down "young" genes. This happens in many different types of cells in our body.
4. These changes in gene activity are linked to the aging process and may explain why we experience age-related health issues.
5. Understanding this process could help scientists develop new ways to prevent or treat diseases that commonly affect older people, like Alzheimer's or diabetes.
"The Activator protein-1 (AP-1), is a group of transcription factors consisted of four sub-families: the Jun (c-Jun, JunB, JunD), Fos (c-Fos, FosB, Fra1, Fra2), Maf (musculoaponeurotic fibrosarcoma) (c-Maf, MafB, MafA. Mafg/f/k, Nrl), and the ATF-activating transcription factor (ATF2, LRF1/ATF3, BATF, JDP1, JDP2) protein families [21], characterized by pleiotropic effects and a central role in different aspects of the immune system such as T-cell activation, Th differentiation, T-cell anergy and exhaustion [22,23]. "
"This revealed that age-opening DARs had the highest enrichment for a subset of bZIP motifs, including AP-1 subunits FRA2, FRA, JUN, JUNB, FOS, ATF3, and BATF, compared with the other peak categories (Figures 4C and S5B). Conversely, age-closing DARs had the lowest AP-1 enrichment (Figures 4C and S5B). As broadly expressed pioneer factors,39,40 AP-1 family members are responsive to a variety of stimuli41 and have been linked to potentiating age-related pathologies and phenotypes.12,13,42,43,44,45 This makes them strong candidates for driving age-related chromatin opening. Highly stable cCREs showed intermediate enrichment levels for these AP-1 motifs (Figures 4C and S5B). However, a distinct feature of highly stable cCREs was very high CTCF motif enrichment levels and binding relative to all other peak categories (Figures 4D, S5B, and S5C)."
Author makes a good point. "1700s" is both more intuitive and more concise than "18th century".
The very first episode of Alex Trebek's Jeopardy! in 1984 illustrates how confusing this can be:
In Icelandic, 1-based counting towards is used almost everywhere. People do indeed say "the first decade of the 19th century" to refer to the eighteen-aughts, and the 90s is commonly referred to as "the tenth decade". The same is done with age ranges: people in their 20s (or 21-30, more precisely) are said to be þrítugsaldur (in the thirty age). Even the hour is sometimes counted towards (though this is rarer among young folks): "að ganga fimm" (or going 5) means 16:01-17:00.
Speaking for myself, this doesn't become any more intuitive the more you use it; people constantly confuse decades, get insulted by age ranges (and freaked out when suddenly the clock is "going five"). People are actually starting to refer to the 90s as nían (the nine) and the 20-aughts as tían (the ten), though I don't think it will stick. When I want to be unambiguous and non-confusing I usually add -og-eitthvað (and something) as a suffix to a year ending with zero, so the 20th century becomes nítjánhundruð-og-eitthvað, the 1990s nítíu-og-eitthvað, and a person in their 20s (including 20) becomes tuttugu-og-eitthvað.
Logic is in short supply and off-by-one errors are everywhere. Most people don't care. I think it's more doable to learn to just live with that than to reprogram mankind.
The publishing industry already has style guides for large swaths of the industry.
Imagine, for a moment, that AP adopted the OP's "don't count centuries" guidance. An enormous share of English-language publishing outfits would conform to the new rule in all future publications. Within a couple months, a large share of written media consumption would adopt this improved way of talking about historical time frames.
The best part? Absolutely no effort on the part of the general public. It's not like the OP is inventing new words or sentence structures. There's zero cognitive overhead for anyone, except for a handful of journalists and copywriters who are already used to that kind of thing. It's part of their job.
I think a lot of people take these sorts of ideas to mean "thou shalt consciously change the way you speak." In reality, we have the systems in place to make these changes gradually, without causing trouble for anyone.
If you don't like it, nobody's trying to police your ability to say it some other way - even if that way is objectively stupid, as is the case with counting centuries.
Nobody's asking to reprogram anyone. Just stop using one of two conventions. The reason to do it is simple and obvious. I'm really baffled at the responses here advocating strongly for the current way. But I guess that's just a "people thing"
I'm amazed you didn't even hedge by saying "telling me to"; claiming that a request to shift convention is tantamount to reprogramming is certainly a bold, provocative claim.
Re temp, I’m glad we use F for daily life in the USA. The most common application I have for temp is to understand the weather and I like the 0-100 range for F as that’s the typical range for weather near me.
For me the best feature of Celsius, the one that makes it much better for weather, is the zero at the freezing point of water. Everything in life changes when water starts to freeze: roads get slippery, pipes burst, crops die. So it is important that such a crucial threshold is represented numerically in the scale. In other words, going from 5 to -5 in Fahrenheit is just getting 10° colder, nothing special, while going from 2 to -2 in Celsius is a huge change in your daily life.
95% of the world uses Celsius without problems because they're used to it. You'd either also be fine with it, or you belong to the sub-5th percentile which couldn't figure it out; take your pick.
Ironic, given that one of the prime arguments in favor of metric is that it is easier.
Why do non-US people even care? And do y'all care that you are wrong? The US has recognized the SI. Citizens continue to use measurements they are comfortable with, and it does not hurt anyone. We are also not the only nation that has adopted SI but not made it mandatory. The UK is an obvious example.
Again, I'm back to 'why does anyone else even give a shit'? Aren't there more interesting things to ponder?
If you, as a US citizen, settle abroad, be prepared to run into a wall with Fahrenheits. People in the rest of the world don't have an intuitive grasp of whether 50 degrees Fahrenheit is warm or cold.
Yeah, that's the right terminology. I knew when I said citizens that it wasn't quite right, but I blanked on the right answer. 'Residents' is pretty obvious.
> be prepared to run into a wall with Fahrenheits
I agree it's worth knowing just enough about celsius to use it casually when you are traveling. e.g. I just remember 20 is room temperature and every 5C is about 10F. Close enough. And remembering '6' is enough to remember how km and miles are related.
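A minimal sketch of those rules of thumb, in Python (using the "20 is room temperature", "5C is about 10F", and "remember 6" anchors above; the exact conversions are noted in comments):

    # Rough mental-math conversions, not exact: anchor 20 C at 68 F,
    # add ~10 F per 5 C, and use ~0.6 for km -> miles.
    def f_from_c_rough(c):
        return 68 + (c - 20) * 2      # exact: c * 9/5 + 32

    def miles_from_km_rough(km):
        return km * 0.6               # exact factor: ~0.621

    print(f_from_c_rough(25))         # 78 (exact: 77)
    print(miles_from_km_rough(100))   # 60.0 (exact: ~62.1)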
Anyone who is settling abroad ought to be able to pick up intuitive celsius in a couple days. When everyone around you uses the same measuring unit, you adapt pretty quickly IME.
Perhaps it's just because you're not used to it. 17-18c is perfect, 25 is a mild summer day. 30-35 full swing summer and 40 and up is oh no global warming. 5-7 is chilly, 0 is cold, -single digit is damn it's a cold winter and -double digits is when tf did I move to Canada.
I agree. For ambient temp, F is nearly twice as fine-grained in the same number of digits. It also reflects human experience better; 100F is damn hot, and 0F is damn cold.
There's very little difference between e.g. +25°C and +26°C; not sure why you would need even more precision in day-to-day life. There are decimals if you require that for some reason.
Celsius works significantly better in cold climates for reasons mentioned in another comment.
If that’s the case why do the Celsius thermostats I used while on vacation in Canada use 0.5C increments? The decimals are used, because the change between 25C and 26C is actually pretty big :)
In my old apartment, the difference between 73F and 74F was enough to make me quite cold or hot. And that's a difference of about 0.5C. I'm not arguing that Fahrenheit is better, but I definitely do prefer it for setting my thermostat (which is a day-to-day thing), but then again I grew up using it, so that could be why I prefer it.
As a dweller of a cold place in the USA, F is pretty handy because "freezing" isn't terribly cold. Having 0F be "actually quite seriously cold" is useful.
My parents care a lot about "przymrozek" (a light overnight frost), which is when it gets sub-zero C at night and you need to cover the plants, close the greenhouse doors, and put a heater in there so the plants survive. Warnings are given on the radio when this happens outside of the regular winter months.
There's also a special warning for drivers if it was sub-zero, because then the water on the roads freezes and it's very hard to brake.
I'd say it's way more important a distinction than anything that F makes obvious.
We just need a new scale just for weather where 100 is 100F and 0 is 32F/0C then everyone can be happy. We'd have a lot more days with subzero temperatures though
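That proposed scale is just a linear rescale; a quick sketch of it (assuming 0 maps to 32F and 100 maps to 100F, as described):

    # Hypothetical "weather scale": 0 at 32 F (freezing), 100 at 100 F.
    def weather_scale(f):
        return (f - 32) * 100 / 68

    print(weather_scale(32))    # 0.0
    print(weather_scale(100))   # 100.0
    print(weather_scale(0))     # about -47, hence many more "subzero" days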
You just use one thing and you’ll learn it. When I was a kid my country changed from archaic 12 point “wind levels” to m/s. It took everybody a few weeks to adjust but it wasn’t hard. It was a bit harder for me after moving to America to adjust to Fahrenheit, but as you experience a temperature, and are told it is so many Fahrenheit, you’ll just learn it. I have no idea at what temperature water boils in F simply because I never experience that temperature (and my kettle doesn’t have a thermometer).
That said, I wish the USA would move over to the unit everyone else is using, but only for the reason that everyone else is using it; that is the only thing that makes it superior, and it would take Americans at worst a couple of months to adjust.
> only for the reason that everyone else is using it
That is an honest answer, which is refreshing. Beside that, there is not really any particular reason that the US has to make SI mandatory. We adopted SI nearly 50 years ago, we just did not make it mandatory. The US has a bit of national identity which leans towards rebelling, so making SI mandatory would probably be contentious anyway. And it's just not worth the argument, since it buys us very little of actual value.
Temperature is easy, probably the easiest unit to convert... Everyone would get used to it pretty soon after they started using it regularly. There would be some legacy systems out there which would be annoying to convert (which is already the case), but within a generation nobody would bother with Fahrenheit at all.
I think the hardest unit to convert is probably length, as there is not only a bunch of legacy systems and equipment out there, but Americans are very accustomed to fractional sub-units as opposed to the decimal cm, mm, etc. I'm not sure the building industry, for example, would ever stop saying "four and five eighths". Personally I hate fractional lengths when using American tools. I'm used to an 11 mm wrench being smaller than a 13 mm wrench; I need to stop and think before I know which is smaller, five eighths or three quarters.
That's an interesting way to phrase it. I, and everyone I know, have both metric and SAE tools. At least for wrenches & sockets.
> I need to stop and think before I know which is smaller, five eighths or three quarters.
I'm with you there. I've gotten in the habit of just mentally converting every SAE size to 32nds. I wouldn't really mind losing SAE, but that is not happening. What really makes my blood pressure go up is Ford ... they mix metric and SAE fasteners on their cars. WTF! Pick one! Subaru is at the other end, easy to work on because 10 & 12mm wrenches will work for maybe 9 out of 10 bolts or nuts.
I agree that for weather F is better, but I don't think it's so much better as to be worth having two different temp scales, and unlike K, C is at least reasonable for weather, and it works fine for most scientific disciplines.
People normally just use the subunit which doesn't divide. E.g. height is usually referred to in cm; if accuracy is important they use millimeters. Road signs for cars use km but downtown wayfinding signs for pedestrians use meters.
I agree it is really nice to use base-12 until it breaks, but it breaks much worse than metric. If you have to divide into 32nds, everything about feet and inches is much worse (in metric we would just use millimeters). The worst offenders are wrenches, which don't order intuitively. In metric, if your 13 mm wrench is too big, you just grab an 11 mm wrench. In inches, if your 13/16th inch wrench is too big, do you grab the 5/8th or the three-quarters next?
Stepping down to the next unit doesn't necessarily make anything tidier. If I need to cut a 3.5-foot piece of wood into thirds, then I cut it into 14-inch pieces. If I need to cut a 1-meter piece of wood into thirds, I cut it into 33.3-centimeter pieces.
Or, perhaps I want to hang two photos on a wall, spacing them evenly - the math from the example above applies again.
Regarding your example of dividing 12 into 32 parts - I think that's another good example of the elegance of imperial units. Dividing a foot into 32 parts is 3/8 of an inch! A nice, tidy unit that you'll find on any ruler or measuring tape.
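To make the arithmetic above concrete, a small sketch using exact fractions (the 42-inch board and the 12/32 case are from the comments above; the metric side shows the repeating decimal):

    from fractions import Fraction

    board_in = Fraction(42)            # 3.5 ft = 42 in
    print(board_in / 3)                # 14 -> three 14-inch pieces

    print(Fraction(12, 32))            # 3/8 -> a foot split 32 ways is 3/8 in

    board_cm = Fraction(100)           # 1 m = 100 cm
    print(float(board_cm / 3))         # 33.333... cm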
>In inches, if your 13/16th inch wrench is too big, do you grab the 5/8th or the three-quarters next?
Neither - I'd grab the 25/32" wrench ;) You make a good point.
I will say that fractional units become more and more intuitive as you use them more often. In a pinch you can just multiply both parts of the fraction by two.
Here's the thing: with wrenches in fractional units, you can do a binary search. Let's say you start with the 1/2 inch wrench. Too small? grab the 3/4. Too big? Try the 1/4. Work your way down.
...or, just remember that a huge share of bolts you'll come by are 7/16" and just start there.
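The ordering (and the search) gets trivial once you treat the sizes as fractions; a sketch with a hypothetical SAE set:

    from fractions import Fraction
    import bisect

    # Hypothetical wrench set, sorted by size.
    sizes = sorted(Fraction(s) for s in
                   ["1/4", "5/16", "3/8", "7/16", "1/2",
                    "9/16", "5/8", "11/16", "3/4", "13/16"])

    def next_size_down(too_big):
        # The wrench you just tried was too big: step down to the next smaller one.
        i = bisect.bisect_left(sizes, Fraction(too_big))
        return sizes[i - 1] if i > 0 else None

    print(next_size_down("13/16"))     # 3/4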
I actually agree. The base-12 fractional system is very nice to work with, until it breaks, and it breaks much worse than the metric system. I explained in another post that if the USA were to move to metric, I think the construction industry would still be using feet and inches for at least a couple of generations (at least partially), and they would have a good reason to.
The way the metric system breaks isn't actually all that bad; at worst you grab a calculator and write down the number.
And also bear in mind that this case is where feet and inches really shine, so we are comparing feet+inches at their best to metric at its worst. There are so many cases where metric is anywhere from marginally better to significantly better, which does make up for that.
IMO the most significant reason for metric being superior is the universality of it. It is used everywhere in the world, including the USA, and that is an excellent quality of a measurement system which should not be understated.
Sure, temperatures go outside those bounds, but only in the most extreme of weather conditions. Below zero? Above 100? You should probably stay inside today.
In relatively hot climates, above 100F is still a pretty reasonable temperature, not something I'd call "extreme".
0F though is crazy cold. Where I live (south-western Europe), getting below ~15 F is already considered extreme weather.
All that to say that the Fahrenheit system is really geared towards very cold climates. So it's kinda weird that it stuck in a country that also has pretty hot climates in the south.
Where I live, 100F is a hellish, blistering day. 0F is just an uncomfortable day in January, but certainly not abnormal. Just wear your big coat and gloves.
That said, 100F in Wisconsin is a very different animal than 100F in Las Vegas. Wisconsin gets brutally humid as it gets hotter, and that makes it even more oppressive. Meanwhile, Nevada gets drier, and so the heat is more bearable.
If anything, I think it's kinda cool that Fahrenheit lines up with perceived temperatures this way, even across different climates with different humidity. Sure, you can point to extremes (Phoenix, Juneau) but those are... well, extremes. For most of us, it's pretty good!
What the hell are you talking about. If it's 0°C outside (or below that), I know that it's high time to put winter tires on because the water in the puddles will freeze and driving on summer tires becomes risky. I had to look it up, but apparently that's +32 °F. Good luck remembering that.
+10°C is "it's somewhat cold, put a jacket on". +20°C is comfortable in light clothing. +30°C is pretty hot. +40°C is really hot, put as little clothing as society permits and stay out of direct sun.
Same with negatives, but in reverse.
Boiling water is +100°C, melting ice is very close to 0°C. I used that multiple times to adjust digital thermometers without having to look up anything.
It's the most comfortable system I can imagine. I tried living with Fahrenheit for a month just for fun, and it was absolutely not intuitive.
You'll want winter tires on well before the air temperature hits freezing for water. Forecasts aren't that predictable, and bridges (no earth heat sink underneath) will ice over before roads do.
40 F is a good time for getting winter tires on.
As someone who lives in a humid, wet area that goes from -40 at night in winter to 100+ F in summer, I also vastly prefer Fahrenheit.
The difference between 60, 70, 80 and 90 is pretty profound with humidity, and the same is true in winter. I don't think I've ever set a thermostat to freezing or boiling, ever. All of my kitchen appliances have numbers representing their power draw.
Well, it's been working fine for me for about 15 years, let's agree to disagree here. I would still find it easier to remember to change the tires at +1°C than whatever the hell it comes down to in Fahrenheit.
I too live in a region with 80 (Celsius) degree yearly variation (sometimes more; the maximum yearly difference I've lived through is about 90 degrees IIRC: -45 in January to +43 in July), and Fahrenheit makes absolutely no sense to me in this climate.
> Well, it's been working fine for me for about 15 years, let's agree to disagree here.
If you want to convince yourself, go out on the road in non-winter tires when it is sub-40F, find an open space where you can experiment, and then do a panic stop. Like you might have to do if someone jumps out in front of you.
That is what convinced me to not wait until it was freezing before I put on cold weather tires.
Winter tyres have less to do with freezing water and more to do with the way the compound in summer tyres hardens and loses elasticity, and therefore grip, at lower temperatures, around 7 degrees Celsius.
If you had to "look it up" to remember that 32°F is freezing (or that 212°F is boiling), then you clearly didn't "live with Fahrenheit" long enough to have developed even the most basic intuitions for it. That's first-grade stuff.
Persuasion by argument, maybe not. But if you simply ask for clarification when you hear "nth century" but not when you hear "n-hundreds" then you've effectively made it easier for the speaker to meet their need one way over the other way.
Same thing for "this weekend" when. Not spoken during a weekend.
> After their 16th birthday, the person is going through their 17th year.
While that is true, does it not illustrate exactly the problem? Nobody ever says someone is in their 17th year when they are 16. That would be very confusing.
> I think it's more doable to learn to just live with that than to reprogram mankind.
Why not just fix the calendar to match what people expect?
There was no time when people said "this is year 1 AD". That numbering was created retroactively hundreds of years later. So we can also add year 0 retroactively.
I think that's good, because it helps you realize that categorizing art by century is kind of arbitrary and meaningless, and if possible it would be more useful to say something like "neoclassical art from the 1700s". "18th century" isn't an artistic category, but it kind of sounds like it is if you just glance at it. "Art from the 1700s" is clearly just referring to a time period.
Agreed. The haiku is “18th century art” as that’s when it was first invented. So it’s either a uselessly broad category, or an indefensibly Eurocentric one.
People pay as much as they do for art because buyers are that rare combination of an educated person with money who values the aesthetics and artifacts of an era, or who wants something to signal their wealth to others, or a way to launder money.
Just to make sure I understood this, that would be used as "17th settecento" to mean 1700s right?
(This Xth century business always bothered and genuinely confused me to no end and everyone always dismissed my objections that it's a confusing thing to say. I'm a bit surprised, but also relieved, to see this thread exists. Yes, please, kill all off-by-one century business in favor of 1700s and 17th settecento or anything else you fancy, so long as it's 17-prefixed/-suffixed and not some off-by-anything-other-than-zero number)
"settecento" can be read as "seven hundred" in Italian; gramps is proposing to use a more specific word as a tag for Italian art from the 1700s. Of course, 700 is not 1700, hence the "drop 1000 years". The prefix seventeen in Italian is "diciassette-" so perhaps "diciasettecento" would be more accurate for the 1700s. (settecento is shorter, though.)
Hope this clarifies. Not to miss the forest for the trees, to reiterate, the main takeaway is that it may be better to define and use a specific tag to pinpoint a sequence of events in a given period (e.g. settecento) instead of gesturing with something as arbitrary and wide as a century (18th century art).
Think of it as the 700s, which is a weird way to refer to the 1700s, unless you are taking a cue from the common usage. That’s just how the periods are referenced by Italian art historians.
settecento means "700". Just proposed above as a way to say 18th century or 1700s, same as we sometimes remove the "2000" and just say "the 10s" for the decade starting 2010 (nobody cares for the 2011-as-start convention except people you don't want to talk to in the first place).
> The most surreal part of implementing the new calendar came in October 1582, when 10 days were dropped from the calendar to bring the vernal equinox from March 11 back to March 21. The church had chosen October to avoid skipping any major Christian festivals.
The "original" Julian calendar was indifferent to year number systems. The Romans typically used the consular year, although Marcus Terentius Varro "introduced" the ab urbe condita (AUC) system in the 1st century BC, which was used until the Middle Ages. From the 5th to the 7th century, the anno Diocletiani (also called anno martyrum) after emperor Diocletian was used primarily in the eastern empire (Alexandria), or the anno mundi (after the creation of the world). It was Dionysius Exiguus in the 6th century, who replaced the anno Diocletiani era with the Anno Domini era. His system become popular in the West, but it took a long time until it also was adopted in the East. Its application to years before the birth of Christ is very late: we come across it first in the 15th century, but it was not widespread before the 17th century.
All these systems used the Julian system for months and days, but differed in terms of the year and (partially) in the first day of the year.
The century in which the switch occurred (which was different in different countries) was shorter than the others. As were the decade, year, and month in which the switch occurred.
No, the first century began Jan 1, 0000. Whether that year actually existed or not is irrelevant - we shouldn't change our counting system in the years 100, 200 etc.
There is no year zero according to first-order pedants. Second-order pedants know that there is a year zero in both the astronomical year numbering system and in ISO 8601, so whether or not there is a year zero depends on context.
It's ultimately up to us to decide how to project our relatively young calendar system way back into the past before it was invented. Year zero makes everything nice. Be like astronomers and be like ISO. Choose year zero.
Yes but, is there such a thing as a zeroth-order pedant, someone not pedantic about year ordinality? As a first-order meta-pedant, this would be my claim.
Moreover, I definitely find the ordinality of pedantry more interesting than the pedantry of ordinality.
> It's ultimately up to us to decide how to project our relatively young calendar system way back into the past before it was invented. Year zero makes everything nice. Be like astronomers and be like ISO. Choose year zero.
Or, just to add more fuel to the fire, we could use the Holocene/Human year numbering system to have a year zero and avoid any ambiguity between Gregorian and ISO dates.
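If anyone wants to play with it, the Holocene conversion is just an offset; a sketch (assuming the usual "add 10,000 to CE years" rule):

    # Holocene Era (HE): CE year + 10,000; BCE year n maps to 10,001 - n
    # (because the BCE/CE scheme has no year zero).
    def to_holocene(year, era="CE"):
        return year + 10000 if era == "CE" else 10001 - year

    print(to_holocene(2024))          # 12024 HE
    print(to_holocene(44, "BCE"))     # 9957 HE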
If only—I think most US citizens who actually work with units of measurement on a daily basis would love to switch to the metric system. Unfortunately, everyone else wants to keep our “freedom units” (and pennies)
We are all defacto ISO adherents by virtue of our lives being so highly computer-mediated and standardized. I’m fully on board with stating that there absolutely was a year zero, and translating from legacy calendars where necessary.
I vote for a year zero and for using two's complement for representing years before zero (because it makes computing durations that span zero a little easier).
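In practice that just means signed year numbers, i.e. astronomical numbering where 1 BC is year 0; a sketch of why spans across the epoch get easier:

    # Astronomical numbering: 1 BCE -> 0, 2 BCE -> -1, ...
    # so durations across the epoch are plain subtraction.
    def astronomical(year, era="CE"):
        return year if era == "CE" else 1 - year

    def years_between(y1, e1, y2, e2):
        return astronomical(y2, e2) - astronomical(y1, e1)

    print(years_between(5, "BCE", 5, "CE"))   # 9, not the 10 naive subtraction suggests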
What does that even mean? Do we allow for the distortion due to the shift from the Julian to the Gregorian calendar, such that the nth year is 11 days earlier? Of course not, because that would be stupid. Instead, we accept that the start point was arbitrary and refer to our normal counting system rather than getting hung up on the precise number of days since some arbitrary epoch.
It means just what it says. In the common calendar, the year after 1 BC (or BCE in the new notation) was 1 AD (or CE in the new notation). There was no "January 1, 0000".
And also, the system is a direct descendant of regnal numbering, where zero wouldn’t have made sense even if invented (there is no zeroth year of Joe Biden’s term of office).
Doesn't matter, we can just agree the first century had 99 years, and be done with it.
We have special rules for leap years, that would just be a single leap-back century.
At the scale of centuries, starting the 2nd century at 100 as opposed to 101 is just a 1% error, so we can live with it. For the kind of uses we put centuries to (not doing math, but talking roughly about historical eras) it's inconsequential anyway.
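The off-by-one in question, as a sketch of the two conventions:

    # Strict convention: years 1-100 are the 1st century (no year zero).
    def century_strict(year):
        return (year - 1) // 100 + 1

    # "Hundreds" convention: 1700-1799 all count as the 18th century.
    def century_hundreds(year):
        return year // 100 + 1

    print(century_strict(1700))     # 17 -- strictly, 1700 is still the 17th century
    print(century_hundreds(1700))   # 18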
There are numerous common concise ways to write the 18th century, at the risk of needing the right context to be understood, including “C18th”, “18c.”, or even “XVIII” by itself.
These are even more impractical, so I wonder what your point is? I can come up with an even shorter way to say 18th century, by using base26 for example, so let's denote it as "cR". What has been gained?
What about languages that don’t have an equivalent to “the Xs” for decades or centuries?
Also, 1799 is obviously more than 1700, as is 1701 > 1700 – so why should the naming convention tie itself to the lesser point? After one's third birthday, a person is starting their fourth year and is not living in their third year.
Agreed. First couple paragraphs of the post had me like, wtf am I reading right now? Threw the article into Claude and had it succinctly summarize the key ideas the author was trying to convey. Thankful for the clarity that provided and time it saved me.
Why? I don’t need to equally and freely share the expertise I develop by consuming publicly available information. In fact, I personally profit from it. Should I compensate every YouTube creator, author, journalist, for the money I’ve made in my career that their publicly available work contributed to in terms of my learning/education?