At NASA I wrote a set of Python scripts to help with planning and telemetry for a Mars mission. I had to convert a lot of time data points between UTC and a Mars timezone/clock (specific to the rover itself). There was an officially blessed tool written in Java that ran from the command line for time conversion. It was too slow to call out for each data point, so I figured it would be generally useful to port the conversion to Python.
First I dug into the Java source, which it turned out called out to a C library. I believe the C library linked out to a Fortran library. It was a lot more complicated than a simple scale and calendar - there were corrections for special and general relativity. Converting times actually required propagating orbits and estimating some non-closed-form quantities. In the end it was more work than it was worth, so we just did an approximation ("only" good to a few milliseconds - fine for our case).
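For the curious, the non-relativistic approximation is compact enough to sketch. Something like this (a rough cut of the published Allison & McEwen formulas behind tools like NASA's Mars24, not the tool I ported; the leap-second constant only holds from 2017 onward, and this yields Coordinated Mars Time at the prime meridian, not a rover-local clock):

```python
import time

def mars_mtc_hours(unix_seconds):
    # TT - UTC = 37 leap seconds + 32.184 s (TT - TAI); valid since 2017
    tt_minus_utc = 69.184
    jd_tt = unix_seconds / 86400.0 + 2440587.5 + tt_minus_utc / 86400.0
    # Mars Sol Date: Martian days elapsed since a conventional epoch
    msd = (jd_tt - 2405522.0028779) / 1.0274912517
    return (msd % 1.0) * 24.0  # Coordinated Mars Time, in hours

print(mars_mtc_hours(time.time()))
```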
So I guess item number 100 on this list should be "time is experienced the same way by all observers" and maybe 101 "simultaneity depends on the reference frame" :)
1. Dang is subject to the normal rules of time and space.
FALSE: Dang is a cosmic entity of Lovecraftian proportions who exists outside of and beyond our pitiful human conceptions of "time" and "space." Dang is eternal, dang is everywhere, dang is everything.
Obviously nobody believes most of these falsehoods. Presumably the OP means that a lot of programmers create bugs that seem to imply programmers believe these falsehoods.
But this one jumped out at me: "Any 24-hour period will always begin and end in the same day". It would be a lot more plausible if "always" were replaced with "never". Add 24 hours to a time, and it will (almost) always be the next day.
There are the things you say you believe, and there are the things your actions show you believe. This extends well past simple matters of time in computers.
Some of my favorite things lie in the fractal fringe categories between those, such as, "the things you didn't realize you believed (by your actions) until someone pointed them out to you" and "the things you didn't realize you should believe (in any sense) until someone pointed them out to you, at which point they seemed blindingly obvious and you can't believe you missed it".
Agreed on all counts. The “falsehoods programmers believe about X” articles are cautionary lists of edge cases to think about, with an unfortunately aggressive naming scheme.
Or “boundaries users have to stay within”. I’m aware of a lot of edge cases. When you start building with those in mind, it adds complications. Extra steps or more UI or whatever, to ensure the “right” thing happens, assumptions are obvious, etc.
More often than not I’m pushed back with “that doesn’t happen” or “don’t spend time on that”. “We’re never open past 9, don’t worry about a day” but then they acquire a system in a different time zone and … boom. Things are broken.
When I start at a new company, it's always interesting to see how they handle dates and times.

1) Company A stores timestamps in a MySQL database, no timezone, but implicitly on US Pacific time. The system timezones are all Pacific. Many weird bugs around DST transitions.

2) Company B stores timestamps in a database as Unix timestamps (ints). Tons of code converting back and forth between ints. Code was a mess.

3) Company C stores them as Postgres timestamp with timezone, which was always UTC in production. Code was reasonably sane.
> Tons of code converting back and forth between ints. Code was a mess.
Never experienced this myself, it’s trivial in all the languages I can think of. Sometimes you have to multiply or divide by 1000 if JavaScript is involved.
I always prefer Unix epoch time. What was the language and stack causing the mess so I know to avoid it?
The code was a mess for other reasons beyond the int/timestamp conversions. In some places it was stored as milliseconds, in other places as seconds. It was Python 2, if it matters.
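A taste of that mess, as a minimal sketch (the threshold heuristic here is illustrative): once some rows store seconds and others milliseconds, every read site ends up guessing.

```python
from datetime import datetime, timezone

def from_epoch(value):
    # Heuristic: anything above ~10^12 can't plausibly be seconds
    # (that would be the year ~33,000), so treat it as milliseconds.
    if value > 10**12:
        value /= 1000
    return datetime.fromtimestamp(value, tz=timezone.utc)

print(from_epoch(1660000000))     # stored as seconds
print(from_epoch(1660000000000))  # stored as milliseconds; same instant
```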
Time would be much less of a problem if Terrestrial Time was used internally everywhere. That is: if TT (realized as TAI) were used as the fundamental definition of time, rather than UTC; and UTC were treated as just another timezone.
Then, there would be only three problems, which decompose nicely:
1. Trying to keep the system time accurate, and accounting for the possibility it isn't.
2. Having up-to-date time zone information
3. Converting TT to/from a date/time in some particular format in a particular timezone.
(1) is fundamentally unavoidable. (3) is complicated but well-defined. (2) should be handled by the system. All that's left is calculations on time values, which if they're in TT (i.e. actual time) are very well behaved.
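As a toy sketch of what treating UTC as "just another timezone" could look like (the 37-second constant is TAI - UTC since 2017; a real implementation would consult the full leap-second table, which is exactly problem (2)):

```python
from datetime import datetime, timedelta, timezone

# TAI - UTC has been 37 seconds since 2017-01-01; a real system would
# look this up in a leap-second table keyed by date.
TAI_MINUS_UTC = timedelta(seconds=37)

def tai_to_utc(tai: datetime) -> datetime:
    # UTC becomes a derived representation of the real timescale
    return (tai - TAI_MINUS_UTC).replace(tzinfo=timezone.utc)

def utc_to_tai(utc: datetime) -> datetime:
    # expects an aware datetime in UTC; returns naive TAI
    return utc.replace(tzinfo=None) + TAI_MINUS_UTC
```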
Ultimately this is the fault of the standards bodies. POSIX defines time in terms of UTC. NTP tries to keep the system clock synced with UTC. Postgres "timestamp with timezone" stores UTC. Zone files state offsets in terms of UTC, and even worse, transition times are stated with reference to a timezone (see tzfile(5) and RFC 8536), which is completely insane.
This could change. Existing standards can't, but new standards could be introduced to succeed the old ones and exist side by side. Maybe, instead of proposing to change UTC because they find leap seconds inconvenient, an organization like Facebook could actually do something useful and push for those successors.
The "opportunity for bugs" I didn't realize until I was over 10 years into my professional career:
10 AM
11 AM
12 PM <- AM/PM and calendar cycle is here.
01 PM <- n%12 cycle is here, an hour later.
02 PM
03 PM
I see large software systems for things like airlines and trains sometimes make this mistake, for instance a 1-hour trip on a ticket that says "11:30AM - 12:30AM", which is actually negative 11 hours.
I have a collection of photos somewhere of every time I've caught it. If I could change anything about time that probably nobody would care about, it would be to align those two cycles.
The fact that I catch it every couple of months on widely deployed systems, I interpret as a signal that basically nobody notices or cares about it, since everyone knows a -11-hour train ride isn't how time works.
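The conversion itself is a one-liner, as long as you remember the two cycles are misaligned. A minimal sketch:

```python
def to_24h(hour12, meridiem):
    # 12 AM -> 00 and 12 PM -> 12: the n % 12 cycle lags the AM/PM flip
    # by exactly one hour, which is where the -11 hour trips come from.
    hour = hour12 % 12
    return hour + 12 if meridiem == "PM" else hour

assert to_24h(12, "AM") == 0   # midnight
assert to_24h(12, "PM") == 12  # noon
assert to_24h(1, "PM") == 13
```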
I would argue that "12:30 PM" reads as exactly that: there is the midpoint, and then you add 12 hours and 30 minutes. So 12:30 PM would be 00:30 in military time according to that logic.
As a non-native am/pm user: it's confusing. Even with that rule.
I missed a plane flight once because an admin for the company I was working with booked a midnight flight accidentally and I thought it was at noon (I think she thought so as well). I think it was in the 90s sometime.
Not very long ago I showed up to an appointment at 7:00 am and was told it was really at 19:00 — and our computer system uses 24 hour time so obviously someone just screwed up the conversion.
Midnight appointments are interesting, since the system doesn’t have a “no appointment time” value, so they use 00:00 for that and 00:01 for a real midnight appointment, and if you’re really, really tired…
I got one for OP, too: UTC is not a timezone in the first place [1].
> UTC is not a time zone, but a time standard that is the basis for civil time and time zones worldwide. This means that no country or territory officially uses UTC as a local time.
The difference is subtle, but a standard is not subject to government whims while a timezone is.
Sure, but governments can and do change timezones more often than programmers can keep up with [1]. In 2014, Egypt's new government changed their timezone with only a week's notice.
Updating standards almost never happens without a name/identifier change. The whole point of standardized measures is their immutability.
> The system clock will always be set to the correct local time.
I work on a sometimes offline educational product for children. Kids for some reason love to mess with their system clocks.
The hoops we've tried jumping through to get a reasonable time frame of events from iPads that were offline for any period of time is hilarious.
We gave up, and if the time of events is unreasonable, we just shift the entire set waiting to be synced so the first one is sync time. It's not perfect, it's not even really good, but it's gotten the least complaints.
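In rough pseudocode terms, the shift looks like this (a minimal sketch; the function name and tuple shape are illustrative, not our actual code):

```python
def shift_to_sync(events, sync_time):
    # events: [(untrusted_client_timestamp, payload), ...] recorded offline.
    # Anchor the earliest event at the trusted sync time, preserving only
    # the relative spacing between events.
    base = min(ts for ts, _ in events)
    return [(sync_time + (ts - base), payload) for ts, payload in events]
```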
> Kids for some reason love to mess with their system clocks.
There are plenty of mobile games that will time-gate in-game rewards (e.g. wait x hours to unlock y). Often, messing with the system clock skips these delays. You're welcome.
My dad has some insomnia problems and used to play a lot of Candy Crush.
His iPad was like 2 centuries ahead because moving the system time would give him extra lives. He couldn't practically move it back because then he'd have to wait another 2 centuries to get lives again. So his iPad was 2 centuries ahead until he completed Candy Crush.
Presidential proclamations are signed and dated in two calendars, "the year of our Lord", and "the year of the Independence of the United States of America":
IN WITNESS WHEREOF, I have hereunto set my hand this twenty-fifth day of October, in the year of our Lord two thousand twenty-one, and of the Independence of the United States of America the two hundred and forty-sixth.
The former uses January 1st as the start of each new year; the latter, July 4th.
In England, Lady Day (March 25th) was the turnover of a new year, for about 6 centuries.
The point of these "falsehoods" threads is that programmers may not care about these niche anthropology facts, but users may care a lot.
It's not crazy to think that a user may, say, want to create a database of historical records, like when people were born, married or buried. Now I want to find out the age people were when married... suddenly, the existence of Lady Day and the time and place the dates were recorded become very important.
I think they mean that if someone says "remind me in a month" on December 3rd, the reminder would be next year.
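Any calendar-aware library handles this; a quick sketch with dateutil's relativedelta (a third-party package):

```python
from datetime import date
from dateutil.relativedelta import relativedelta

print(date(2022, 12, 3) + relativedelta(months=1))  # 2023-01-03: next year
print(date(2022, 1, 31) + relativedelta(months=1))  # 2022-02-28: clamped, not March
```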
I've never met a developer who believed that, but I've met plenty who would forget about the edge case and write a buggy time library. As is the case for most of these.
I think they are talking about the End Of Period for a month. Whatever batch or closeout type processing happens during that time period can flow into the next year, despite it being flagged/tagged/marked with the last month of the prior year. So, be careful with using dates! (unless you are a Time Druid)
It could also be a reference to the start of the calendar, but that isn't bound to happen again.... Well, unless it does.
Then, on the other hand (because Time is funky that way) your time may not be the same as my time, since we could be in different time zones.
The biggest falsehood a programmer could believe about time is that they have the ability to roll their own time manager. Just use a battle-hardened library and hope for the best. Dealing with time in code is crazy-making in a way I never expected.
I agree. I'd wager most programmers don't need to actually worry about calendars, but most occasionally need to worry about time. Just use a library and read the docs for what to expect to happen when you do something like getCurrentTime. What is the resolution, does it monotonically increase, etc.
If you actually need to worry about calendars and user input, you have my sympathy.
Is it just me, or does anybody else hate these "Falsehoods programmers believe about..." lists?
Every abstraction at some level is leaky. Even the atom is a leaky abstraction, and maybe even matter itself is a leaky abstraction at some level.
What matters is how well the abstraction matches your use case, and where the leaks are.
Just having a list of how a particular abstraction leaks without any context is not super helpful, IMO, other than conveying a sense of complexity (and perhaps a sense of looking down on those unwashed masses who believe these "falsehoods").
I think I understand why you feel this way, but I personally find them to be a helpful format. For example, last time I had to figure out how to store addresses in a database I made a point to check out the falsehoods programmers believe about addresses document (as well as other sources).
Ugh.. I personally hate these lists. Absolutely no value added, nobody is going to remember these, they're gonna learn them the hard way and most importantly, if they don't work directly with the timekeeping infrastructure, probably never. And if they do work on the timekeeping infra, they likely know all of this.
Sure, you might run into some quirk at some point in your career but they will end up being just another weird war story.
I worked directly with NTP. The nitty-gritty details and the leap second smears and all of that, for an absolutely huge network and I find this list patronizing, smug and worthless. It's some sort of intellectual masturbation or just another instance of https://news.ycombinator.com/item?id=32335165.
- `pytz.timezone(timezone_name)` will give you the current offset for that timezone
- well, it will at least be some consistent offset
- surely, every timezone has the same reference starting point?
What `pytz.timezone(name)` actually does is initialize a timezone object at the point in time of the first entry in the tz database for that zone. For example, New York had an offset of -4:56:02 prior to 1883 November 18, 12:03:58 [1]. That is the "starting point" for America/New_York and US/Eastern (which I think is an alias of New_York before a certain date). Which is why you get that odd -04:56 offset if you attach the zone to a datetime directly instead of localizing it.
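A minimal demonstration of the pitfall (assuming pytz with a reasonably recent tzdb; the exact pre-standardization offset varies by zone):

```python
from datetime import datetime
import pytz

tz = pytz.timezone("America/New_York")
naive = datetime(2022, 6, 1, 12, 0)

# Wrong: attaching the zone directly picks up the first tzdb entry (LMT)
print(naive.replace(tzinfo=tz))  # 2022-06-01 12:00:00-04:56
# Right: localize() resolves the offset actually in effect on that date
print(tz.localize(naive))        # 2022-06-01 12:00:00-04:00 (EDT)
```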
Can probably add another falsehood: that timezone information (in particular DST & UTC offsets) will never change. In reality, timezones are very much political and can change at any time with sometimes very little notice. Last major change US folks may remember occurred around 2007, when daylight saving time was expanded by 6 weeks. I know of clocks that are still wrong from that change.
At a previous job, I spent over a year working on a library for timezones built on top of Boost. The handling of historical timezones was a particular pain in the ass.
We're developing a new date/time library from scratch for Hare. So far I think we're doing a pretty good job. The person leading this effort (Byron Torres) has written a blog post about it here:
Skimming through it I didn’t see my common question of such libraries: can it handle approximate dates?
Twice in my career I’ve had to implement code to handle concepts like “August 2022”, or “1pm”, where it was important to track the level of precision offered by the source material for later comparisons. “1pm” is not the same as “1:00:00.000”, nor is “August 2022” the same as “20220801T00:00:00.000Z”.
Too bad that most stdlibs leave it to third parties. Wildcard periods are essential for some applications (even ones as simple as a periodic reminder) and are also hard to do correctly. I’d say a period is actually more important than a date, because we mostly work with periods, and our “datetimes” are just integer coordinates of pixels of time, so to speak.
Well, Hare is not a high-level programming language, and the standard library has a specific, finite scope. It's also not generally used in the kinds of situations where this is called for (think C -- there's not really a similar library for C, either). But that's not to say it isn't important -- I definitely think that there should be a library which provides this.
In the U.S., falling back to standard time from daylight saving time is rough. Those days have 25 hours - and the interval from 1:00 - 2:00 repeats (you reach 2:00 and fall back to 1:00 and repeat the interval). Makes power scheduling difficult - which 1:00 - 2:00 interval are you talking about?
Also, the offsets between the timezones change. Eastern Daylight Time falls back from 2:00 to 1:00 Eastern Standard Time, which coincides with 1:00 for Central Daylight Time. So Eastern Time and Central Time are the same for one hour interval. Of course in the Spring you have the opposite problem - Central Time will be 2 hours behind Eastern Time for a one hour interval. So much for your interval time calculations! Also, the U.S. has changed the transition dates for the time change.
I'm so glad to be working on a system where I no longer have to worry about this crap!
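For what it's worth, PEP 495 gave Python a way to name the two copies of the repeated hour; a minimal sketch with the stdlib zoneinfo:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/New_York")
# Fall-back, 2021-11-07: clocks reach 2:00 EDT and return to 1:00 EST,
# so every wall-clock time between 1:00 and 2:00 occurs twice.
first = datetime(2021, 11, 7, 1, 30, tzinfo=tz)           # fold=0: the EDT pass
second = datetime(2021, 11, 7, 1, 30, fold=1, tzinfo=tz)  # fold=1: the EST pass

print(first.tzname(), second.tzname())  # EDT EST
```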
One of the interesting time-related applications I worked on was involved with medicines management. If a patient was to receive a medicine around the daylight savings change over we had to make sure it wasn't either doubled-up or missed completely.
I also worked on event planning software with plenty of timezone fun with events having a timezone, the planner a potentially different timezone and it also handled flights for speakers etc who could be coming from any number of other timezones. That was mostly straight forward with decent libraries, but it certainly taught me that you need to think in (and store) timezones and not just offsets.
During the last “fall back” event I experienced the 1 o’clock hour three times since I randomly happened to cross a time zone at exactly 2:00am…or maybe 1:00am?
The worst part was I had an appointment at 02:00, somewhere where they won’t let you in if you show up too early, in the other time zone and hadn’t had to deal with daylight savings time for a good long while having lived in Arizona where they don’t deal with such silliness. Trying to figure out what time to leave to time my arrival was very difficult.
> having lived in Arizona where they don’t deal with such silliness
Oh god, I'm simultaneously delighted at the idea of not having to deal with DST and terrified of writing software that might have to keep track of whether or not the user was in Arizona during a DST transition.
Parts of Indiana have the same issue. Notice I said parts and not all - there are parts of Indiana that observe DST because they border states that observe DST, and it's to their advantage to stay in the same time zone. What a mess! Just makes writing software that much more difficult!
That's why in the utility business it's common to use the language of standard time, daylight saving time, and prevailing time. For the Eastern timezone they write it as EST, EDT, EPT respectively. The important point is EST and EDT never vary - they're fixed offsets from UTC (UTC-05:00 and UTC-04:00, respectively). EPT is the squirrelly one - sometimes it matches EST and other times it matches EDT. At least when processing a time interval you know whether you have to take clock movement into account.
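In code, those labels are just fixed-offset timezones, which makes interval math unambiguous; a small Python sketch:

```python
from datetime import datetime, timedelta, timezone

EST = timezone(timedelta(hours=-5), "EST")  # fixed, never varies
EDT = timezone(timedelta(hours=-4), "EDT")  # fixed, never varies

# The repeated 1:00-2:00 hour on fall-back night, disambiguated by label:
start = datetime(2021, 11, 7, 1, 0, tzinfo=EDT)
end = datetime(2021, 11, 7, 1, 0, tzinfo=EST)
print(end - start)  # 1:00:00, despite identical wall-clock readings
```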
We should just define a year to be 360 days long and made up of 12 30 day months.
Then we can elect some druids or whatever to arbitrarily, at the start of each year, define which dates the seasonal borders will land on. Events which are truly dependent on weather can be defined in terms of "Days after the season starts."
Or we could define a 5.5±0.5 day holiday between the beginning and end of a given year. Those days will be declared to not belong to any year. We will turn off all our computers for those days, and pretend they didn't happen. If you are born within them, you get a special hat or something.
Should we also turn off computers in hospitals and in other important facilities? And then how can we qualify what's important?
Maybe you and I have the privilege to turn off computers for almost a week without a negative effect, but not the majority of the world, everything uses a computer to operate and coordinate.
Think about food production, aviation, shipping, sailing... And much more at both small and big scales.
And why do only people born within them get special hats? Maybe you and I have the privilege of going about hatless, but not the majority of the world.
The Icelandic calendar used to be something like that before the adoption of the Gregorian calendar in the 18th century (Iceland actually skipped the Julian calendar entirely, although I believe the old calendar was actually heavily inspired by the Julian).
All years were 12 months of 30 days plus 4 extra days of summer (Sumarauki in Icelandic), making the year exactly 52 weeks. Leap years had one extra week added to sumarauki making it a total of 11 days and leap years exactly 53 weeks. This has the benefits that each month starts on the same day of the week (e.g. the first day of the first summer month (Harpa) is still celebrated in Iceland and always lands on Thursday).
Given that the month after Sumarauki is named Heyannir which literally means busy season during haying, which I’ve heard from farmers is their busiest time of the year, perhaps you are right.
Why not 364 days, 13 28-day months, with one special day (probably New Year's Eve), two during a leap year. This would have the interesting feature that a given date always falls on the same day of the week for the duration of the year.
That's what I thought. This list mixes "falsehoods" that nobody with a bit of common sense - even children - would believe, and pedantry that could be useful to 0.001% of developers. For most programs, Time starts in 1970.
IMO, this list is only good at... wasting time. Except for the multiple statements (that could be one or two) that point out that there's not such a thing like "identical clocks" and "same time on two clocks".
Most programs don't deal with birth dates and people. It's like stating that most "computers" are neither on a desktop nor in a pocket. Technically correct, as counter-intuitive as it is.
I suppose it depends on if you think about a “fact month”, that is the series of days people experience, or the formal calendar months as the things being handled by the system.
The daylight savings time transition doesn’t “shorten” or “lengthen” the day, it merely transforms the clock. The day isn’t 25 hours or 23 hours. This is essentially the same thing as the Julian/Gregorian swap. The day didn’t really change, just the system we use to understand what day it is.
The span of time users experienced was weird, but this wasn’t the unit of time being shorter or longer. It’s an artifact of the transition not anything happening to the time.
A lot of these falsehoods on the list can be side stepped by distinguishing between the dimensions and the facts about representing time.
As the replies to that comment point out, 1752 is only for one specific country. All countries switched at different times (and a few even switched back and forth a few times). Greece, in particular, seems to have switched as late as 1923.
Russia is a big one too, especially for early 20th century history nerds.
They used the Julian calendar until after the Revolution, so when reading accounts of, say, most of WW1, Russian sources have dates that are 13 days off from Western sources. This leads to much confusion when authors don’t make it clear they’ve converted the dates (or not).
Amusingly, one of the first casualties of this was the October Revolution, which in the new (Gregorian) calendar actually occurred on November 7th.
I'd expect that in a lunar calendar system? (where 12 lunar months are too short to fill a year, and I think some of them fix this by inserting an extra half-month)
But I wish this list came with examples, and if it really means a different calendar system it's IMHO not as interesting, because it's IMHO fairly obvious that different calendar systems follow different rules. Or maybe during the switch to Gregorian calendar? That you'd at least encounter with dates for things long ago - although that transition is very "here be dragons" and very local.
Lunar month is a little over 29 days on average. And most lunar calendars do not align with solar year but rather have leap months added. E.g. Hebrew calendar [1].
Ok, you made me look up an example: "The Ethiopian calendar has twelve months of thirty days plus five or six epagomenal days, which comprise a thirteenth month."
If I am not wrong, this holds true for the switch from the Julian calendar to the Gregorian one. Some months were shortened to synchronise with the new system. And if I am not mistaken, the Jewish and Arab calendars do not use the same systems we use for months and years.
For an added twist this changeover occurred in different years, at different times in different countries .. and not always with the same number of days difference.
Actually looking at that tells me it is a mess... Like Belarus and Lithuania changing back to Julian calendar in 1800... And then again to Gregorian in 1918.
If you are creating, for instance, an idle game where the user can pay to skip time; that's a problem.
If you are doing cryptographic checks dependent on time, that's a big deal (eg: how do we handle when the client or service goes "wtf, no. That's the wrong time")
In that case, the best you can hope for is to centralize time on the server. The IoT devices can keep local differential time or contact the server (or a log server or somewhere else) at the time of the event. It's kind of messy, since you are asking to trust a client's data, which is untrustworthy.
Have had plenty of problems with WPA-Enterprise auth refusing to join a wifi network on account of a wildly wrong computer time. Which in turn means that the computer can't get a correct time on account of not being able to ask an NTP server.
If you ever have the pleasure of working with the Jewish religious calendar, it defines day boundaries by sundown and rise of the first star, which are not simultaneous. This leads to a phase during which two days co-exist. And of course it's not continuous over the year and by location.
The proposed moral of this story, and those like it, is to avoid re-implementing general-purpose libraries.
Yes, good. But not all applications need a “platonic ideal” understanding of time. Often when you type just the code you need, the system performs excellently, and the code is minimal.
One of the hardest programs you could ever have to write would be one that handles historical times, taking into account all the time zone and Daylight Saving Time changes over the years. It might be impossible. Just look at Indiana's history with time zones and DST.
Many years ago, a user filed a bug report that the timezone information we used for historical places in Indiana was incorrect. I don't remember exactly what the user was doing, but I think it had something to do with train schedules or something.
I did not fix that bug. (But I did learn that the 'TZ' database has this information, though I think 'America/Indiana/*' had been pruned out of the copy our JVM was using.)
Also, one thing I have noticed is that system clocks are actually pretty bad; I would expect them to be much better. This is easy to notice when you don't have active time sync on a machine: the drift is multiple seconds in not that many days.
I've worked with calendar applications and todo managers, alarms, recurring events and all that. Our timekeeping mechanics are incredibly complex.
24hs before 15:00 is USUALLY 15:00. But in some timezones, it can be (about once a year): 14:00, 16:00 or 14:59:59.
It's currently Monday 11:02 here, but it's likely still Sunday in some parts of the world (or is it already Tuesday somewhere? Not sure which is right). So, right now "today" has no single meaning; it depends on where you are.
Oh, and does your online meeting repeat at 14hs CET every Monday? Then for people in Argentina it will be at 9hs for six months a year, and at 10hs for the other six months (because CET observes DST, but Argentina does not).
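The only sane way to model that recurrence is to pin it to the organizer's zone and convert per occurrence; a quick sketch with zoneinfo (using Europe/Paris as a stand-in for CET):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

paris = ZoneInfo("Europe/Paris")                    # CET, observes DST
bsas = ZoneInfo("America/Argentina/Buenos_Aires")   # UTC-3, no DST

for month in (1, 7):  # European winter vs. summer
    meeting = datetime(2022, month, 4, 14, 0, tzinfo=paris)  # "14hs CET"
    print(meeting.astimezone(bsas).time())  # 10:00 in January, 09:00 in July
```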
Isn't leap year calculation one of the very first things you do in most programming tutorials/schools? I know that a lot of people don't know this, but most programmers should, r-r-r-right?
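For reference, the full Gregorian rule, which tutorials sometimes truncate to just the first clause:

```python
def is_leap(year):
    # Divisible by 4, except century years, except every fourth century
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

assert is_leap(2024) and is_leap(2000)
assert not is_leap(1900) and not is_leap(2023)
```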
If you handle dates and times yourself on that level, prepare to be in a world of pain.
Handling time and dates correctly has a similar difficulty to writing your own cryptographic primitives: if you don't know exactly what you are doing, you will shoot yourself (and potentially countless others) in the foot at one point or another.
Tbh, for half of my life I handled them on a financial platform which couldn’t care less about the subject list or standards. It has just a ‘wall date’ like yyyy-mm-dd and a ‘wall time’ strictly in the 00:00:00-23:59:59 range, without a timezone. Two separate types. Never encountered or even heard of any time-related bug there.
I find this simplified date/time very useful, unless you have to manage “continuous and/or real” time somehow. 99% of applications are okay with it, because they’re facing users with exactly the same mental model. Also, albeit not correct scientifically, it reflects many developers’ mental model as well. This is much better than a model that just doesn’t match, which is a world of pain. Someone adds 86400 but it’s the same day, good luck debugging it.
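Python's stdlib happens to mirror this model: date and time are two separate naive types, and the "add 86400 seconds" class of bug is unrepresentable because time supports no arithmetic at all. A trivial sketch:

```python
from datetime import date, time

trade_date = date(2022, 8, 15)  # 'wall date': yyyy-mm-dd, no timezone
trade_time = time(17, 30)       # 'wall time': 00:00:00-23:59:59, no timezone

# trade_time + timedelta(seconds=86400) would raise a TypeError:
# datetime.time deliberately defines no arithmetic.
```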
I don’t think this is a cryptographic-level issue.
My uninformed guess would be that a financial platform mostly just wants a monotonically increasing clock with which to order transactions. So this would I guess avoid quite a bit of the annoying date related stuff. (?)
Anyway if you queried something to get the current date-time and clock-time, you could be farming out the annoying edge cases to some arbitrarily complicated library, right? Which is the right way to do things.
Unimportant sidenote: I don't like the naming of these "falsehoods programmers believe about X" lists. Surely programmers are more likely to be conscious of these not being true than non-programmers.
The point is these are falsehoods whose belief can affect a programmer's work when they get embedded into the systems they create. The point is to correct yourself, not tu quoque your brother.
well basically it's shorthand for "things that do not apply to every locale or instance of X, which programmers have decided to implement as if they do apply to every locale or instance, either through not realizing they do not apply, or because the non-applicability in some situations is such a bizarre outlier in relation to the rest of the world that it has been deemed an acceptable bug that they will not fix" but that's not very pithy.
But of course what happens is the outliers have to deal with these systems that treat them as bad data possessing impossible properties, and then they curse and say: hey, these stupid programmers don't realize that the world is not just like X1 or Y1, but that us rare instances of X2 and Y2 also exist!
And then an expert with sufficient knowledge about all the outliers compiles a list entitled Falsehoods Programmers believe about {X}.
I still encounter forms on the web or input fields in software that prevent me from entering truthful information or that require me to input information that doesn't exist. There are definitely still programmers out there that make assumptions that don't hold in all cases.
No, they are not. Beyond leap years and leap seconds, I know absolutely nothing about time, how to handle all the "daylight savings" and other kind of bullshit. And I actually do not care, at all. Luckily for me, people who do care and know these things have made libraries that can handle it all while expecting only minimal knowledge from me, such as being aware that time zones exist at all.
The reason why you use those libraries is that you know time is difficult and you need to be cautious. You do not blindly believe those statements about time. If pressed to answer whether they are right or wrong, you would say that you are not sure.
IIRC, that event is the reason SQL Server's datetime type is only good back to 1753, and thus why you should use datetime2 for dealing with older historical dates.
Which Outlook in particular seems to get direly wrong. I have lost count of the number of times I have been sent an email from someone using Outlook containing a supposed calendar event in summer which declares that it is at a particular time GMT. It's wrong, and if I were to turn up at the time it stated, I would be an hour late.
Britain uses GMT only during winter. In summer it's BST (British Summer Time). The fact that calendar clients can't determine a timezone for a given jurisdiction on a given day is actually something that I hate.
In principle a calendar program could geocode the event's location, then use the tzinfo DB to get the correct timezone at that location and the specified time.
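Both pieces exist today; a rough sketch gluing them together (assuming the third-party timezonefinder package and Python 3.9+'s zoneinfo):

```python
from datetime import datetime
from timezonefinder import TimezoneFinder
from zoneinfo import ZoneInfo

tf = TimezoneFinder()
zone_name = tf.timezone_at(lat=51.5074, lng=-0.1278)  # central London -> "Europe/London"

event = datetime(2022, 7, 1, 14, 0, tzinfo=ZoneInfo(zone_name))
print(event.tzname())  # BST in summer, GMT in winter
```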
I bet you could get me to give the wrong answer to most of these questions if you worded it cleverly in informal conversation, but the answer to anything time and date related in a technical setting is "Ugh, we'll have to look up the stupid edge cases here," right?
Even if you use a library, it's easy to bake in an "obvious" assumption into your program.
Such as storing location + date + time, because that should be unambiguous, right?
Or confusing "same time tomorrow" with now + 24h. Are you even sure which one you need?
Any examples of when you need "now + 24h" instead of same time tomorrow?
For example, daylight saving may start the next day, and time will be adjusted... and everything will be done 1 hour earlier than today despite the clock reading the "same time" as today. So what? Other than your body needing to readjust its sleep cycle, no one else seems to actually care.
Meanwhile, when I setup cron to do something daily, I don't care about this at all, so what if on one day, the interval will be 23 or 25 hours?
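The difference is easy to demonstrate; a minimal sketch across the US fall-back weekend:

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/New_York")
now = datetime(2021, 11, 6, 15, 0, tzinfo=tz)  # the day before fall-back

# "now + 24h": 24 elapsed hours, computed on the UTC timeline
plus_24h = (now.astimezone(timezone.utc) + timedelta(hours=24)).astimezone(tz)
# "same time tomorrow": wall-clock arithmetic within the local zone
same_time = now + timedelta(days=1)

print(plus_24h.time())   # 14:00 -- an hour "early" on the wall clock
print(same_time.time())  # 15:00
```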
Or, back when I was doing airborne geophysical surveying across the Fijian archipelago, the plane would flit from -179 to +179 longitude a few times every hour, with a potential change of a full day if you were naive about time zones.
Data acquisition instrumentation needs to use a lapsed epoch (to avoid UTC leap seconds) with a calibration at the start and end of projects to adjust for drift, etc.