Falsehoods programmers believe about time (infiniteundo.com)
178 points by rrampage on Aug 22, 2022 | 204 comments



At NASA I wrote a set of Python scripts to help with planning and telemetry for a Mars mission. I had to convert a lot of time data points between UTC and a Mars timezone/clock (specific to the rover itself). There was an officially blessed tool written in Java that ran from the command line for time conversion. It was too slow to call out for each data point, so I figured it would be generally useful to port the conversion to Python.

First I dug into the Java source, which it turned out called out to a C library. I believe the C library linked out to a Fortran library. It was a lot more complicated than a simple scale and calendar - there were corrections for special and general relativity. Converting times actually required propagating orbits and estimating some non-closed-form quantities. In the end it was more work than it was worth, so we just did an approximation ("only" good to a few milliseconds - fine for our case).

So I guess item number 100 on this list should be "time is experienced the same way by all observers" and maybe 101 "simultaneity depends on the reference frame" :)


One glorious day we'll switch to outside-of-gravity-well coordinated reference time frame.


I don't know if these are all the same falsehoods, but—related:

Falsehoods programmers believe about time zones - https://news.ycombinator.com/item?id=24870376 - Oct 2020 (16 comments)

Falsehoods programmers believe about time (2017) - https://news.ycombinator.com/item?id=24453712 - Sept 2020 (8 comments)

Falsehoods programmers believe about Unix time - https://news.ycombinator.com/item?id=19922062 - May 2019 (268 comments)

Falsehoods Programmers Believe About Time (2012) - https://news.ycombinator.com/item?id=12675527 - Oct 2016 (154 comments)

Falsehoods programmers believe about time and time zones - https://news.ycombinator.com/item?id=11515125 - April 2016 (80 comments)

Falsehoods Programmers believe about Time - https://news.ycombinator.com/item?id=4128208 - June 2012 (213 comments)



Sure—changed from https://gist.github.com/timvisee/fcda9bbdff88d45cc9061606b4b... now. Thanks!

(For some reason people think we know these things)


Falsehoods hn'ers believe about dang…


1. Dang is subject to the normal rules of time and space.

FALSE: Dang is a cosmic entity of Lovecraftian proportions who exists outside of and beyond our pitiful human conceptions of "time" and "space." Dang is eternal, dang is everywhere, dang is everything.


Obviously nobody believes most of these falsehoods. Presumably the OP means that a lot of programmers create bugs that seem to imply programmers believe these falsehoods.

But this one jumped out at me: "Any 24-hour period will always begin and end in the same day". It would be a lot more plausible if "always" were replaced with "never". Add 24 hours to a time, and it will (almost) always be the next day.
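The two meanings of "24 hours later" can be seen directly with Python's stdlib `zoneinfo` (assuming a tz database is available on the system). Wall-clock arithmetic and elapsed-time arithmetic disagree across a DST transition - a sketch:

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

ny = ZoneInfo("America/New_York")
start = datetime(2022, 11, 5, 12, 0, tzinfo=ny)  # day before the US fall-back

# Wall-clock arithmetic: same time of day, next calendar day.
wall = start + timedelta(hours=24)

# But 25 real hours elapsed, because Nov 6 repeated the 1-2 AM hour.
elapsed = wall.astimezone(timezone.utc) - start.astimezone(timezone.utc)

# Elapsed-time arithmetic: exactly 24 real hours later is 11:00, not 12:00.
absolute = (start.astimezone(timezone.utc) + timedelta(hours=24)).astimezone(ny)

print(wall.isoformat())      # 2022-11-06T12:00:00-05:00
print(elapsed)               # 1 day, 1:00:00 (i.e. 25 hours)
print(absolute.isoformat())  # 2022-11-06T11:00:00-05:00
```

(Note that Python deliberately treats `start + timedelta(hours=24)` as naive wall-clock arithmetic; you have to route through UTC to measure real elapsed time.)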


There are the things you say you believe, and there are the things your actions show you believe. This extends well past simple matters of time in computers.

Some of my favorite things lie in the fractal fringe categories between those, such as, "the things you didn't realize you believed (by your actions) until someone pointed them out to you" and "the things you didn't realize you should believe (in any sense) until someone pointed them out to you, at which point they seemed blindingly obvious and you can't believe you missed it".


Do you know if a list of those exists?


Agreed on all counts. The “falsehoods programmers believe about X” articles are cautionary lists of edge cases to think about, with an unfortunately aggressive naming scheme.


They should really call these lists "Incorrect assumptions programmers make about X" or: "Test cases programmers should test their software against"


Or “boundaries users have to stay within”. I’m aware of a lot of edge cases. When you start building with those in mind, it adds complications. Extra steps or more UI or whatever, to ensure the “right” thing happens, assumptions are obvious, etc.

More often than not I’m pushed back in with “that doesn’t happen” or “don’t spend time on that”. “We’re never open past 9, don’t worry about a day” but then they acquire a system in a different time zone and … boom. Things are broken.


Falsehoods programmers Considered harmful


When I start at a new company, it's always interesting to see how they handle dates and times.

1) Company A stores timestamps in a MySQL database, no timezone, but implicitly on US Pacific time. The system timezones are all Pacific. Many weird bugs around DST transitions.

2) Company B stores timestamps in a database as Unix timestamps (ints). Tons of code converting back and forth between ints. Code was a mess.

3) Company C stores them as Postgres timestamp with timezone, which was always UTC in production. Code was reasonably sane.


> Tons of code converting back and forth between ints. Code was a mess.

Never experienced this myself; it's trivial in all the languages I can think of. Sometimes you have to multiply or divide by 1000 if JavaScript is involved.

I always prefer Unix epoch time. What was the language and stack causing the mess so I know to avoid it?


The code was a mess for other reasons beyond the int/timestamp conversions. In some places it was stored as milliseconds, other places seconds. It was python 2 if it matters.
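A sketch of the kind of normalization shim such a codebase ends up needing (the helper name is hypothetical and the cutoff is a heuristic, not a standard):

```python
# Hypothetical shim for a codebase that mixes epoch seconds and milliseconds.
# Heuristic: epoch seconds stay below 1e11 until roughly the year 5138, so
# anything larger is almost certainly milliseconds (e.g. from JavaScript).
def to_epoch_seconds(ts: float) -> float:
    return ts / 1000.0 if ts > 1e11 else ts

print(to_epoch_seconds(1661126400))     # already seconds: unchanged
print(to_epoch_seconds(1661126400000))  # milliseconds: scaled down
```

Of course, the real fix is to pick one unit at the storage boundary and never let the other one in.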


Time would be much less of a problem if Terrestrial Time were used internally everywhere. That is: if TT (realized as TAI) were used as the fundamental definition of time, rather than UTC; and UTC were treated as just another timezone.

Then, there would be only three problems, which decompose nicely:

1. Trying to keep the system time accurate, and accounting for the possibility it isn't.

2. Having up-to-date time zone information.

3. Converting TT to/from a date/time in some particular format in a particular timezone.

(1) is fundamentally unavoidable. (3) is complicated but well-defined. (2) should be handled by the system. All that's left is calculations on time values, which if they're in TT (i.e. actual time) are very well behaved.
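A minimal sketch of the well-defined core of (3), assuming a hand-copied slice of the published leap-second table (the full table comes from IERS Bulletin C, also shipped as tzdata's leap-seconds.list; TT = TAI + 32.184 s by definition):

```python
from datetime import datetime, timedelta

# A few recent entries of TAI - UTC, in seconds (newest first).
TAI_MINUS_UTC = [
    (datetime(2017, 1, 1), 37),
    (datetime(2015, 7, 1), 36),
    (datetime(2012, 7, 1), 35),
]

def utc_to_tai(utc: datetime) -> datetime:
    for since, seconds in TAI_MINUS_UTC:
        if utc >= since:
            return utc + timedelta(seconds=seconds)
    raise ValueError("no leap-second data for this date")

def utc_to_tt(utc: datetime) -> datetime:
    # TT is defined as TAI + 32.184 s exactly.
    return utc_to_tai(utc) + timedelta(seconds=32.184)

print(utc_to_tai(datetime(2020, 1, 1)))  # 2020-01-01 00:00:37
```

Note this direction (UTC to TAI) needs the table, which is exactly the "up-to-date information" problem of (2) in miniature.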

Ultimately this is the fault of the standards bodies. POSIX defines time in terms of UTC. NTP tries to keep the system clock synced with UTC. Postgres "timestamp with timezone" stores UTC. Zone files state offsets in terms of UTC, and even worse, transition times are stated with reference to a timezone (see tzfile(5) and RFC 8536), which is completely insane.

This could change. Existing standards can't but new standards could be introduced to succeed the old ones and exist side-by-side. Maybe, instead of proposing changing UTC because they find leap seconds inconvenient, an organization like Facebook could actually do something useful and push for them.


If the system time isn't accurate most of the problems would still apply.


The "opportunity for bugs" I didn't realize until I was over 10 years into my professional career:

10 AM

11 AM

12 PM <- AM/PM and calendar cycle is here.

01 PM <- n%12 cycle is here, an hour later.

02 PM

03 PM

I see large software systems for things like airlines and trains sometimes make this mistake, for instance a 1 hour trip on a ticket that says "11:30AM - 12:30AM" which is actually negative 11 hours.

I have a collection of photos somewhere of every time I've caught it. If I could change anything about time that probably nobody would care about, it would be to align those two cycles.

The fact that I catch it every couple months on widely deployed systems I interpret as the signal that basically nobody notices/cares about it as everyone knows a -11 hour train ride isn't how time works.
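A sketch of the conversion done carefully - the bare `% 12` shortcut is exactly where the bug described above creeps in:

```python
def to_12_hour(hour_24: int) -> str:
    """Render a 0-23 hour in 12-hour notation."""
    if not 0 <= hour_24 <= 23:
        raise ValueError(hour_24)
    suffix = "AM" if hour_24 < 12 else "PM"
    # The classic bug is plain `hour_24 % 12`, which maps both noon and
    # midnight to 0. The 12-hour cycle actually runs 12, 1, 2, ..., 11.
    hour_12 = hour_24 % 12 or 12
    return f"{hour_12}:00 {suffix}"

print(to_12_hour(0))   # 12:00 AM -- midnight
print(to_12_hour(12))  # 12:00 PM -- noon, where the AM/PM flip happens
print(to_12_hour(13))  # 1:00 PM  -- the % 12 rollover, an hour later
```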


As a 1995 immigrant, I understand am/pm time for 22 hours of the day, but I'm still not sure about those two 12:* mystery hours.

I know, I can look it up, and I have several times. But the knowledge doesn't stick...


Post meridiem, ante meridiem.

Translation: after midday, before midday.

Also the difference in the cycles comes from analog clocks.

I can explain all of this but it's still confusing and stupid to have to deal with it in 2022



I would argue that 12:30 hours after midpoint is exactly that: there is the midpoint, and then you add 12 hours and 30 minutes. So 12:30 PM would be 00:30 in 24-hour time according to that logic.

As a non-native am/pm user: it's confusing. Even with that rule.


You have to look at it as a separate concept from the hh:mm time:

pm is between midday (inclusive) and midnight (exclusive), and am is the rest, between midnight (inclusive) and midday (exclusive).


Right. When I do remember, that's the rule I use.

The ambiguity of midday and midnight is really the cherry on top of this masterpiece of confusion.

There is probably some part of me too offended by this system to want to remember it...


> Align those two cycles

I'd say that the best solution would be to just rename 12 to 0, which is what it already should have been IMO, regardless of am/pm.

I've noticed some digital watches use 0:00AM instead of 12:00AM, but I've never seen this used for 0:00PM.


I missed a plane flight once because an admin for the company I was working with booked a midnight flight accidentally and I thought it was at noon (I think she thought so as well). I think it was in the 90s sometime.


Not very long ago I showed up to an appointment at 7:00 am and was told it was really at 19:00 — and our computer system uses 24 hour time so obviously someone just screwed up the conversion.

Midnight appointments are interesting since the system doesn't have "no appointment time", so they use 00:00 and 00:01 for a real midnight appointment, and if you're really, really tired…


am/pm users have lost their sanity.


I'm surprised it's not 7 inches past 12


> GMT and UTC are the same timezone.

I got one for OP, too: UTC is not a timezone in the first place [1].

> UTC is not a time zone, but a time standard that is the basis for civil time and time zones worldwide. This means that no country or territory officially uses UTC as a local time.

The difference is subtle, but a standard is not subject to government whims while a timezone is.

1. https://www.timeanddate.com/time/gmt-utc-time.html#:~:text=U....


Still subject to IERS whims, though.


Sure, but governments can and do change timezones more often than programmers can keep up [1]. In 2014, Egypt's new government changed their timezone with only a week's notice [2].

Updating standards almost never happens, unless with a name/identifier change. The whole point of standardized measures is their immutability.

[1] https://www.timeanddate.com/news/time/

[2] https://english.ahram.org.eg/News/100735.aspx


> The system clock will always be set to the correct local time.

I work on a sometimes offline educational product for children. Kids for some reason love to mess with their system clocks.

The hoops we've tried jumping through to get a reasonable time frame of events from iPads that were offline for any period of time is hilarious.

We gave up, and if the time of events is unreasonable, we just shift the entire set waiting to be synced so the first one is sync time. It's not perfect, it's not even really good, but it's gotten the least complaints.
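A minimal sketch of that rebasing approach (all names hypothetical): shift the whole queued batch so the earliest event lands at sync time, preserving the relative spacing the possibly-wild device clock recorded.

```python
from datetime import datetime, timedelta

def rebase_events(events: list[dict], sync_time: datetime) -> list[dict]:
    """Shift a queued batch so its earliest event lands at sync time."""
    if not events:
        return events
    shift = sync_time - min(e["at"] for e in events)
    return [{**e, "at": e["at"] + shift} for e in events]

device_events = [
    {"name": "quiz_started", "at": datetime(1970, 5, 1, 3, 0)},
    {"name": "quiz_finished", "at": datetime(1970, 5, 1, 3, 5)},
]
synced = rebase_events(device_events, datetime(2022, 8, 22, 12, 0))
```

Ordering and gaps survive; the absolute times are admittedly fiction, which is the stated trade-off.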


> Kids for some reason love to mess with their system clocks.

There are plenty of mobile games that will time-gate in-game rewards (e.g. wait x hours to unlock y). Often, messing with the system clock skips these delays. You're welcome.


> Kids for some reason love to mess with their system clocks.

They are cheating at games!


My dad has some insomnia problems and used to play a lot of Candy Crush.

His iPad was like 2 centuries ahead because moving the system time would give him extra lives. He couldn't practically move it back because then he'd have to wait another 2 centuries to get lives again. So his iPad was 2 centuries ahead until he completed Candy Crush.


> A week (or a month) always begins and ends in the same year

Okay, of course a week can begin in one year and end in the next, but how exactly would a month not begin and end in the same year?


Depends on your calendar!

Presidential proclamations are signed and dated in two calendars, "the year of our Lord", and "the year of the Independence of the United States of America":

IN WITNESS WHEREOF, I have hereunto set my hand this twenty-fifth day of October, in the year of our Lord two thousand twenty-one, and of the Independence of the United States of America the two hundred and forty-sixth.

The former uses January 1st as the start of each new year; the latter, July 4th.

In England, Lady Day (March 25th) was the turnover of a new year, for about 6 centuries.


"Falsehoods programmers believe about X" articles inevitably become a discussion about niche anthropology facts.

No programmer has ever cared about Lady Day when writing a program. And if they did, by definition, they would understand its significance.


The point of these "falsehoods" threads is that programmers may not care about these niche anthropology facts, but users may care a lot.

It's not crazy to think that a user may, say, want to create a database of historical records, like when people were born, married or buried. Now I want to find out the age people were when married... suddenly, the existence of Lady Day and the time and place the dates were recorded become very important.


I don't think they mean what you think they mean.

I think they mean that if someone says "remind me in a month" On December 3rd, the reminder would be next year.

I've never met a developer who believed that, but I've met plenty who would forget about the edge-case and write a buggy time-library. As is the case for most of these.


I think they are talking about the End Of Period for a month. Whatever batch or closeout type processing happens during that time period can flow into the next year, despite it being flagged/tagged/marked with the last month of the prior year. So, be careful with using dates! (unless you are a Time Druid)

It could also be a reference to the start of the calendar, but that isn't bound to happen again.... Well, unless it does.

Then, on the other hand (because Time is funky that way) your time may not be the same as my time, since we could be in different time zones.


Here you go: https://youtu.be/-5wpm-gesOY?t=368

I'd highly recommend watching the whole video, by the way. It's really good.


Work in Finance.

My year starts on April 1st.

I do not want my year to start on January 1st, else I will need to work through all holidays for year-end closing.


There is a concept of "week year" in Java so a given month could simultaneously exist in two (week) years.


The biggest falsehood a programmer could believe about time is that they have the ability to roll their own time manager. Just use a battle-hardened library and hope for the best. Dealing with time in code is crazy-making in a way I never expected.


I agree. I'd wager most programmers don't need to actually worry about calendars, but most occasionally need to worry about time. Just use a library and read the docs for what to expect to happen when you do something like getCurrentTime. What is the resolution, does it monotonically increase, etc.

If you actually need to worry about calendars and user input, you have my sympathy.


Handling those cases correctly might very well be a Turing test...


I personally have not had much experience with it.

Would love to hear some horror stories you've encountered. It might help me keep an eye out for it.


Trying to deal with conversion between time zones or to/from daylight saving can have gotchas which libraries will deal with for you.


Is it just me, or does anybody else hate these "Falsehoods programmers believe about..." lists?

Every abstraction at some level is leaky. Even the atom is a leaky abstraction, and maybe even matter itself is a leaky abstraction at some level.

What matters is how well the abstraction matches your use case, and where the leaks are.

Just having a list of how a particular abstraction leaks, without any context, is not super helpful IMO, other than conveying a sense of complexity (and perhaps a sense of looking down on those unwashed masses who believe these "falsehoods").


I think I understand why you feel this way, but I personally find them to be a helpful format. For example, last time I had to figure out how to store addresses in a database I made a point to check out the falsehoods programmers believe about addresses document (as well as other sources).


I can’t stand them. It’s a smug, click-baity and passive-aggressive format, typically without any actual examples.


Here's my tl;dr for "falsehoods programmers believe about [x]" so you never have to read another:

Don't make assumptions. You aren't smart. Keep it simple and don't get clever.


Ugh. I personally hate these lists. Absolutely no value added: nobody is going to remember these; they'll learn them the hard way, and if they don't work directly with the timekeeping infrastructure, probably never. And if they do work on the timekeeping infra, they likely know all of this.

Sure, you might run into some quirk at some point in your career but they will end up being just another weird war story.

I worked directly with NTP. The nitty-gritty details and the leap second smears and all of that, for an absolutely huge network and I find this list patronizing, smug and worthless. It's some sort of intellectual masturbation or just another instance of https://news.ycombinator.com/item?id=32335165.


Python specific:

- `pytz.timezone(timezone_name)` will give you the current offset for that timezone - well, it will at least be some consistent offset - surely, every timezone has the same reference starting point?

What `pytz.timezone(name)` actually does is initialize a timezone object at the point in time of the first entry in the tz database for that zone. For example, New York had an offset of 4:56:02 prior to 1883 November 18, 12:03:58 [1]. That is the "starting point" for America/New_York and US/Eastern (which I think is an alias of New_York before a certain date). Which is why you get

    >>> pytz.timezone('America/New_York')
    <DstTzInfo 'America/New_York' LMT-1 day, 19:04:00 STD> 
Ish. Not sure where those 2 seconds went. But that explains why you see that strange 19:04 (-3:56) offset. The real way to use it is

    >>> pytz.timezone('America/New_York').localize(your_specific_datetime)
Timezones without a reference time are deceptive.

[1] https://en.wikipedia.org/wiki/Tz_database#Example_zone_and_r...
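For what it's worth, the stdlib `zoneinfo` module (Python 3.9+) sidesteps this particular trap: the offset is resolved for the specific datetime, so attaching the zone via `tzinfo=` at construction is safe and no `.localize()` dance is needed:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

# With zoneinfo the offset is looked up for this specific moment, so the
# pre-1883 LMT entry in the tz database never leaks in.
dt = datetime(2022, 8, 22, 9, 0, tzinfo=ZoneInfo("America/New_York"))
print(dt.isoformat())  # 2022-08-22T09:00:00-04:00 (EDT, as expected)
```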


Can probably add another falsehood: that timezone information (in particular DST and UTC offsets) will never change. In reality, timezones are very much political and can change at any time, sometimes with very little notice. The last major change US folks may remember occurred around 2007, when daylight saving time was expanded by several weeks. I know of clocks that are still wrong from that change.

At a previous job, I spent over a year working on a library for timezones built on top of Boost. The handling of historical timezones was a particular pain in the ass.


I think the list would be much more interesting and educational if it provided examples.


Jon Skeet has put together a talk on dates and time zones that walks through some examples:

https://www.youtube.com/watch?v=DhYUcxRWWQI


We're developing a new date/time library from scratch for Hare. So far I think we're doing a pretty good job. The person leading this effort (Byron Torres) has written a blog post about it here:

https://harelang.org/blog/2022-04-17-chronology-in-hare/

To stress test our implementation (and to flex on other languages), we're implementing Martian time in it as well.

https://harelang.org/blog/2022-08-01-martian-time-in-hare/

    // Hare's first commit.
    let hare = mbc::new(chrono::MTC, 0, 0218,10,19, 09,20,53,344357297)!;
    fmt::println(mbc::bsformat(buf, mbc::STELLAR, &hare)!)!;
    // 0218 Perseus 19, Fri 09:20 MTC
My current litmus test goal for it is to get strftime to print out 23:23:60.
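For comparison, a rough Python analogue of that litmus test: `datetime` rejects `second=60` outright, but `time.strftime` will format a `struct_time` with `tm_sec` of 60, since the C `tm_sec` range allows leap seconds:

```python
import time

# struct_time allows tm_sec up to 61 precisely so leap seconds can be
# represented; datetime(..., second=60) would raise ValueError instead.
leap = time.struct_time((2016, 12, 31, 23, 59, 60, 5, 366, 0))
print(time.strftime("%H:%M:%S", leap))  # 23:59:60
```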


Skimming through it, I didn't see an answer to my usual question for such libraries: can it handle approximate dates?

Twice in my career I’ve had to implement code to handle concepts like “August 2022”, or “1pm”, where it was important to track the level of precision offered by the source material for later comparisons. “1pm” is not the same as “1:00:00.000”, nor is “August 2022” the same as “20220801T00:00:00.000Z”.
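One way to sketch this (all names hypothetical, not from any library): store the precision alongside the timestamp, and only compare the fields the source actually specified.

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum

class Precision(Enum):
    YEAR = 1
    MONTH = 2
    DAY = 3
    HOUR = 4
    MINUTE = 5
    SECOND = 6

@dataclass(frozen=True)
class ApproxDateTime:
    """A timestamp plus how much of it the source actually specified."""
    value: datetime
    precision: Precision

    def could_equal(self, other: "ApproxDateTime") -> bool:
        # Compare only up to the coarser of the two precisions.
        coarser = min(self.precision, other.precision, key=lambda p: p.value)
        fields = ("year", "month", "day", "hour", "minute", "second")
        return all(getattr(self.value, f) == getattr(other.value, f)
                   for f in fields[:coarser.value])

aug = ApproxDateTime(datetime(2022, 8, 1), Precision.MONTH)   # "August 2022"
day = ApproxDateTime(datetime(2022, 8, 15), Precision.DAY)
print(aug.could_equal(day))  # True: consistent at month precision
```

The key point is that "August 2022" and "2022-08-15" are neither equal nor unequal; they are merely compatible, and the type has to preserve that.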


I think that would be out of scope for the standard library. We do have an interface which might be helpful for developing such a tool, though:

https://docs.harelang.org/datetime#builder


Too bad that most stdlibs leave it to third parties. Wildcard periods are essential for some applications (even something as simple as a periodic reminder) and are also hard to do correctly. I'd say a period is actually more important than a date, because we mostly work with periods, and our "datetimes" are just integer coordinates of pixels of time, so to speak.


Well, Hare is not a high-level programming language, and the standard library has a specific, finite scope. It's also not generally used in the kinds of situations where this is called for (think C -- there's not really a similar library for C, either). But that's not to say it isn't important -- I definitely think that there should be a library which provides this.


Falsehoods programmers believe about falsehoods:

* Something is a falsehood because it's not true in every context

* Something is a falsehood because it's not true in many contexts

* Something is a falsehood because it's not true in any context but the one the programmer cares about

(and the correct context is often “whatever deals with the company's bullshit problem with the least effort”)


In the U.S., falling back to standard time from daylight saving time is rough. Those days have 25 hours - and the interval from 1:00 - 2:00 repeats (you reach 2:00 and fall back to 1:00 and repeat the interval). Makes power scheduling difficult - which 1:00 - 2:00 interval are you talking about?

Also, the offsets between the timezones change. Eastern Daylight Time falls back from 2:00 to 1:00 Eastern Standard Time, which coincides with 1:00 for Central Daylight Time. So Eastern Time and Central Time are the same for one hour interval. Of course in the Spring you have the opposite problem - Central Time will be 2 hours behind Eastern Time for a one hour interval. So much for your interval time calculations! Also, the U.S. has changed the transition dates for the time change.

I'm so glad to be working on a system where I no longer have to worry about this crap!
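Python's answer to the "which 1:00 - 2:00 interval" question is PEP 495's `fold` attribute - a sketch (assuming a tz database is available):

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

ny = ZoneInfo("America/New_York")

# 1:30 AM happened twice on 2022-11-06; `fold` selects the occurrence.
first = datetime(2022, 11, 6, 1, 30, tzinfo=ny, fold=0)   # still EDT
second = datetime(2022, 11, 6, 1, 30, tzinfo=ny, fold=1)  # already EST

print(first.utcoffset(), second.utcoffset())
# One real hour separates the two identical-looking wall times.
print(second.astimezone(timezone.utc) - first.astimezone(timezone.utc))
```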


One of the interesting time-related applications I worked on was involved with medicines management. If a patient was to receive a medicine around the daylight savings change over we had to make sure it wasn't either doubled-up or missed completely.

I also worked on event planning software with plenty of timezone fun with events having a timezone, the planner a potentially different timezone and it also handled flights for speakers etc who could be coming from any number of other timezones. That was mostly straight forward with decent libraries, but it certainly taught me that you need to think in (and store) timezones and not just offsets.


During the last “fall back” event I experienced the 1 o’clock hour three times since I randomly happened to cross a time zone at exactly 2:00am…or maybe 1:00am?

The worst part was I had an appointment at 02:00, somewhere where they won’t let you in if you show up too early, in the other time zone and hadn’t had to deal with daylight savings time for a good long while having lived in Arizona where they don’t deal with such silliness. Trying to figure out what time to leave to time my arrival was very difficult.


> having lived in Arizona where they don’t deal with such silliness

Oh god, I'm simultaneously delighted at the idea of not having to deal with DST and terrified of writing software that might have to keep track of whether or not the user was in Arizona during a DST transition.


Parts of Indiana have the same issue. Notice I said parts and not all - there are parts of Indiana that observe DST because they border states that observe DST and it's to their advantage to stay in the same time zone. What a mess! Just makes writing software that much more difficult!

That's why in the utility business it's common to use the language of standard time, daylight saving time, and prevailing time. For the Eastern timezone they write it as EST, EDT, EPT respectively. The important point is EST and EDT never vary - they're fixed offsets from UTC (UTC-05:00 and UTC-04:00, respectively). EPT is the squirrelly one - sometimes it matches EST and other times it matches EDT. At least when processing a time interval you know whether you have to take clock movement into account.
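In Python terms, those fixed-offset zones are a one-liner with `datetime.timezone`, and intervals stated in them stay well behaved - a sketch:

```python
from datetime import datetime, timedelta, timezone

# EST and EDT as fixed offsets from UTC: by construction they never move,
# unlike the "prevailing" zone, which flips between them twice a year.
EST = timezone(timedelta(hours=-5), "EST")
EDT = timezone(timedelta(hours=-4), "EDT")

# An interval stated in EST is unambiguous even across the fall-back night.
start = datetime(2022, 11, 6, 1, 0, tzinfo=EST)
end = datetime(2022, 11, 6, 2, 0, tzinfo=EST)
print(end - start)  # 1:00:00 -- no clock movement to account for
```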


Arizona just stays on MST year-round, making it easy unless you have to figure out what time everyone else is on.

And for even more fun the Navajo Nation does do time zone changes.


Of about 96 entries, I count 27 instances of "Always" and 12 instances of "Never".

Reminds me of the old SAT hints: Beware of statements and questions with "always", "never", "must", and "cannot".


We should just define a year to be 360 days long, made up of twelve 30-day months.

Then we can elect some druids or whatever to arbitrarily, at the start of each year, define which dates the seasonal borders will land on. Events which are truly dependent on weather can be defined in terms of "Days after the season starts."

Or we could define a 5.5 ± 0.5 day holiday between the end of one year and the beginning of the next. Those days will be declared to not belong to any year. We will turn off all our computers for those days, and pretend they didn't happen. If you are born within them, you get a special hat or something.


Should we also turn off computers in hospitals and in other important facilities? And then how can we qualify what's important?

Maybe you and I have the privilege to turn off computers for almost a week without a negative effect, but not the majority of the world, everything uses a computer to operate and coordinate.

Think about food production, aviation, shipping, sailing... And much more at both small and big scales.


You're going to take issue with that instead of the Druid part?


I hereby announce my campaign for the role of Time Druid.


Your symbols of office shall be a blinking 12:00 and the date 01-01-1970.

Wait. In fact, if you weren't born on 01-01-1970, you can't be Time Druid. Sorry.


Do you mean 01-01-1970 or 01-01-1970?


ISO 8601, dammit


Don't worry. The Druid will handle that.


I don't mess with druids.


How did you make it that far through the comment before choosing this issue to take seriously?


And why do only people born within them get special hats? Maybe you and I have the privilege of going about hatless, but not the majority of the world.


First they came for the Special Hats, but I did not speak out -- because I was hatless...


Grandpa, tell me a story about the Time Hat Wars

Well sonnie-boy, listen up... it all started on 01-01-1970...


Clearly ISO 8601 didn't win in the time hat wars.


Ah details, the Time Druids will sort that out probably.


Something very similar already exists (in theory): https://en.wikipedia.org/wiki/International_Fixed_Calendar

Kodak used to run on this calendar.


I was going to mention this. 13 months each with 28 days.


Lousy Smarch weather (doh!)


bye bye Halloween...


On the other hand we'll have a whole 13'th month, which will afford us many bonus Fridays the thirteenth.


Depends, each month has exactly 28 days. So you either never get a Friday the 13th, or you get one every four weeks.


I thought the implication was every Friday in the 13th Month is a "Friday of the 13th" which is close enough.


That's what I was going for



Hmm the only one which seems to apply is:

> having one or two days per year which are part of no month is stupid

which I've cleverly circumvented by adding almost a whole week. I believe in this case the stupidity overflows and it becomes a good plan again.


The Icelandic calendar used to be something like that before the adoption of the Gregorian calendar in the 18th century (Iceland actually skipped the Julian calendar entirely, although I believe the old calendar was actually heavily inspired by the Julian).

All years were 12 months of 30 days plus 4 extra days of summer (Sumarauki in Icelandic), making the year exactly 52 weeks. Leap years had one extra week added to sumarauki making it a total of 11 days and leap years exactly 53 weeks. This has the benefits that each month starts on the same day of the week (e.g. the first day of the first summer month (Harpa) is still celebrated in Iceland and always lands on Thursday).

https://time-meddler.co.uk/the-old-icelandic-calendar/


I'm certainly romanticizing the whole thing but given,

> There is no special numbering of the years used in the Icelandic calendar, so the year may be omitted or the current Gregorian year used.

It is hard not to imagine a day in Sumarauki as being sort of magically timeless.


Given that the month after Sumarauki is named Heyannir which literally means busy season during haying, which I’ve heard from farmers is their busiest time of the year, perhaps you are right.


Why not 364 days, 13 28-day months, with one special day (probably New Year's Eve), two during a leap year. This would have the interesting feature that a given date always falls on the same day of the week for the duration of the year.
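That weekday-stability property is easy to sanity-check: since 364 % 7 == 0, the weekday of any counted day-of-year never drifts from one year to the next (a sketch for this hypothetical calendar, with the special day held outside the week):

```python
# Sketch for the hypothetical 13 x 28 calendar: because 364 % 7 == 0,
# the weekday of any (month, day) pair is the same in every year.
def weekday(year_index: int, day_of_year: int) -> int:
    """Weekday index (0-6) for a hypothetical 364-day calendar."""
    return (year_index * 364 + day_of_year) % 7

# Within the 364 counted days, the weekday never drifts year to year.
print(all(weekday(y, 100) == weekday(0, 100) for y in range(50)))  # True
```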


Then we'll have to come up with a new month name, which will cause all sorts of bikeshedding.


This sounds like an excellent first step into imperial units.


The Maya did this. Those last 5 orphaned days of the year were considered very bad luck. You did not want to be born in that period.


This was more or less exactly the French republican calendar, with 10 days weeks also: https://en.m.wikipedia.org/wiki/French_Republican_calendar


Or 11 30-day months and December will be 3X days. No need to shut down.


"Months have either 28, 29, 30, or 31 days."

OK, I'm intrigued - are there months that don't? I assume it must be a country-specific thing?


Adoption of the Gregorian calendar [1] resulted in a number of short months in a bunch of countries.

Different but related, 46 BC, the year of adoption of the Julian calendar, had 15 months [2].

[1]: https://en.wikipedia.org/wiki/Adoption_of_the_Gregorian_cale...

[2]: https://www.uh.edu/engines/epi2364.htm


That's what I thought. This list mixes "falsehoods" that nobody with a bit of common sense - even children - would believe, and pedantry that could be useful to 0.001% of developers. For most programs, Time starts in 1970.

IMO, this list is only good at... wasting time. Except for the multiple statements (that could be one or two) that point out that there's no such thing as "identical clocks" or "the same time on two clocks".


So "most" programs can ignore birth dates of people older than 52? I don't think so.


Do most programs need to know the birth date of the user?


Most of the programs I write do. Well, of people, but not typically the user.


Most programs don't deal with birth dates and people. It's like stating that most "computers" are neither on a desktop nor in a pocket. Technically correct, as counter-intuitive as it is.


I suppose it depends on if you think about a “fact month”, that is the series of days people experience, or the formal calendar months as the things being handled by the system.

The daylight savings time transition doesn’t “shorten” or “lengthen” the day, it merely transforms the clock. The day isn’t 25 hours or 23 hours. This is essentially the same thing as the Julian/Gregorian swap. The day didn’t really change, just the system we use to understand what day it is.

The span of time users experienced was weird, but this wasn’t the unit of time being shorter or longer. It’s an artifact of the transition not anything happening to the time.

A lot of these falsehoods on the list can be side stepped by distinguishing between the dimensions and the facts about representing time.


Clicking through the links, this post appears to be the source: https://news.ycombinator.com/item?id=4128470

As another commenter noted, the exception appears to be September 1752.

That's a fun example but whether it's worth worrying about is another question...


As the replies to that comment point out, 1752 applies only to one specific country. All countries switched at different times (and a few even switched back and forth a few times). Greece, in particular, seems to have switched as late as 1923.


Russia is a big one too, especially for early 20th century history nerds.

They used the Julian calendar until after the Revolution, so when reading accounts of, say, most of WW1, Russian sources have dates that are 13 days off from Western sources. This leads to much confusion when authors don’t make it clear they’ve converted the dates (or not).

Amusingly, one of the first casualties of this was the October Revolution, which in the new (Gregorian) calendar actually occurred on November 7th


I'd expect it in a lunar calendar system? (where 12 lunar months are too short to fill a year, and I think some of them fix this by inserting an extra half-month)

But I wish this list came with examples. If it really means a different calendar system, it's IMHO not as interesting, because it's fairly obvious that different calendar systems follow different rules. Or maybe it means the switch to the Gregorian calendar? You'd at least encounter that with dates for things long ago, although that transition is very "here be dragons" and very local.


Lunar month is a little over 29 days on average. And most lunar calendars do not align with solar year but rather have leap months added. E.g. Hebrew calendar [1].

[1]: https://en.wikipedia.org/wiki/Hebrew_calendar


Ok, you made me look up an example: "The Ethiopian calendar has twelve months of thirty days plus five or six epagomenal days, which comprise a thirteenth month."

https://en.wikipedia.org/wiki/Intercalation_(timekeeping)

https://en.wikipedia.org/wiki/Ethiopian_calendar

I guess it's the choice of whether you fix the offset every year or wait until you accumulate a full month of offset.


I suppose you could have 31 days plus an hour. This depends on how you are tracking time and if that includes free-running counters.


If I am not wrong, this holds true for the switch from the Julian calendar to the Gregorian one. Some months were shortened to synchronise with the new system. And if I am not mistaken, the Jewish and Islamic calendars do not use the same systems we use for months and years.


For an added twist this changeover occurred in different years, at different times in different countries .. and not always with the same number of days difference.

Good Times.


And even more good times. It seems sometimes some countries went back to Julian...


Others have already answered this one, but somewhat relevant: https://www.timeanddate.com/date/february-30.html


September 1752


Or October 1582, Or December 1582,

Or any of a number of different dates depending on where you were located: https://en.m.wikipedia.org/wiki/List_of_adoption_dates_of_th...


Most countries had at least one month that differed at some point from 1580ish to 1800ish, when they switched calendars.

Don't know about more recent examples


https://en.wikipedia.org/wiki/List_of_adoption_dates_of_the_...

Actually looking at that tells me it is a mess... Like Belarus and Lithuania changing back to Julian calendar in 1800... And then again to Gregorian in 1918.


Maybe they're developing software to be used on Mars? 687-day years are mentioned in the Gist comments.


Ah ok cool - thanks for the replies!

So I guess it's not really something that I need to worry about!


The system clock will never be set to a time that is in the distant past or the far future.

Right but if that's the case, what exactly can you do about it?


It really depends what the 'threat model' is.

If you are creating, for instance, an idle game where the user can pay to skip time; that's a problem.

If you are doing cryptographic checks dependent on time, that's a big deal (eg: how do we handle when the client or service goes "wtf, no. That's the wrong time")


This is my issue with a lot of technical aspects of software engineering.

I can read and understand about clock drift, vector clocks etc.

But I sometimes struggle to align that with real world design and architecture.

An IoT device reported an event happened at t=1, but t is not accurate. Ok, so? What exactly can I do about that?


In that case, the best you can hope for is to centralize time on the server. The IoT devices can keep local differential time or contact the server (or a log server or somewhere else) at the time of the event. It's kind of messy, since you are asking to trust a client's data, which is untrustworthy.
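One way to sketch that server-anchoring idea: have the device report only how long ago the event happened (a duration from its monotonic clock), and let the server attach the absolute timestamp on receipt. The function name and the latency handling here are assumptions for illustration, not a standard API:

```python
import time

def record_event(device_elapsed_s: float, server_clock=time.time):
    """Anchor a device-reported event to the server's trusted clock.

    The device sends only how long ago the event happened, measured on
    its own monotonic clock. Since that is a duration, the device's
    absolute wall-clock setting never matters; the server converts it
    to an absolute timestamp when the report arrives.
    """
    received_at = server_clock()
    # Network latency is ignored here; subtract an estimate if it matters.
    return received_at - device_elapsed_s

# An event the device says happened 2.5 seconds ago:
event_ts = record_event(2.5)
```

This still trusts the device's relative timing, but removes any dependence on its wall clock being set correctly.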


> If you are creating, for instance, an idle game where the user can pay to skip time; that's a problem.

This statement is also true in isolation.


Have had plenty of problems with WPA-Enterprise auth refusing to join a wifi network on account of a wildly wrong computer time. Which in turn means that the computer can't get a correct time on account of not being able to ask an NTP server.


Oh I have a great one:

> days don't overlap

If you ever have the pleasure of working with the Jewish religious calendar, it defines day boundaries by sundown and rise of the first star, which are not simultaneous. This leads to a phase during which two days co-exist. And of course it's not continuous over the year and by location.


The proposed moral of this story, and those like it, is to avoid re-implementing general-purpose libraries.

Yes, good. But not all applications need a “platonic ideal” understanding of time. Often when you type just the code you need, the system performs excellently, and the code is minimal.


I think your comment makes sense for pretty much any domain other than timekeeping though, and to claim otherwise is to take nothing away from TFA.


One of the hardest programs you could ever have to write would be historical times taking into account all the time zone and Daylight Savings Time changes over the years. It might be impossible. Just look at Indiana's history with time zones and DST.


Many years ago, a user filed a bug report that the timezone information we used for historical places in Indiana was incorrect. I don't remember exactly what the user was doing, but I think it had something to do with train schedules or something.

I did not fix that bug. (But I did learn that the 'TZ' database has this information, though I think 'America/Indiana/*' had been pruned out of the copy our JVM was using.)
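The tz database does carry that history. Assuming a reasonably current tzdata, Python's zoneinfo will show Indianapolis on year-round EST in 1990 and observing DST again after 2006:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+

tz = ZoneInfo("America/Indiana/Indianapolis")

# Most of Indiana skipped DST for decades, then resumed it in 2006.
summer_1990 = datetime(1990, 7, 1, 12, 0, tzinfo=tz)
summer_2010 = datetime(2010, 7, 1, 12, 0, tzinfo=tz)

assert summer_1990.utcoffset() == timedelta(hours=-5)  # EST, even in July
assert summer_2010.utcoffset() == timedelta(hours=-4)  # EDT
```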


Some of Indiana is on the Central Timezone and some is on the Eastern Timezone. This is definitely an issue!


Also one thing I have noticed is that the system clocks are actually pretty bad. Or I would expect them to be much better. This can be easily noticed when you don't have active time sync on machine. Drift is multiple seconds in not that many days.


I've worked with calendar applications and todo managers, alarms, recurring events and all that. Our timekeeping mechanics are incredibly complex.

24hs before 15:00 is USUALLY 15:00. But in some timezones, it can be (about once a year): 14:00, 16:00 or 14:59:59.

It's currently Monday 11:02 here, but it's likely still Sunday in some parts of the world (or is it already Tuesday somewhere? Not sure which is right). So, right now "today" has no single meaning; it depends on where you are.

Oh, and does your online meeting repeat at 14hs CET every Monday? Then for people in Argentina it will be at 9hs for six months a year, and at 10hs for another six months (due to CET having DST, but not Argentina).


14h CET is 14h CET each and every time. It is just that we automatically reschedule it to 14h CEST when summertime is in use.

Also, it is in use for 7 months, not 6...


I've just learned to do time conversions by converting from the original timezone to UTC, then from UTC to the recipient timezone.

Hasn't failed me yet.

Here's why: https://www.nist.gov/sites/default/files/images/2019/12/23/w...

This looks useful. I haven't checked for full accuracy: https://www.timeanddate.com/time/map/
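For what it's worth, that round trip through UTC is easy to express with Python's zoneinfo. A small sketch with arbitrarily chosen zones, routing 9:00 in Buenos Aires through UTC to Berlin time:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# 9:00 local in Buenos Aires (UTC-3, no DST)
local = datetime(2022, 8, 22, 9, 0,
                 tzinfo=ZoneInfo("America/Argentina/Buenos_Aires"))

# Route through UTC instead of converting zone-to-zone directly
utc = local.astimezone(ZoneInfo("UTC"))
berlin = utc.astimezone(ZoneInfo("Europe/Berlin"))

print(utc.isoformat())     # 2022-08-22T12:00:00+00:00
print(berlin.isoformat())  # 2022-08-22T14:00:00+02:00 (CEST in August)
```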


>Leap years occur every 4 years.

Isn't leap year calculation one of the very first things you do in most programming tutorials/schools? I know that a lot of people don't know this, but most programmers should, r-r-r-right?
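For reference, the full Gregorian rule has two extra clauses beyond "every 4 years":

```python
def is_leap_year(year: int) -> bool:
    # Divisible by 4, except century years, unless also divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

assert is_leap_year(2024)
assert not is_leap_year(1900)  # century year, not divisible by 400
assert is_leap_year(2000)      # divisible by 400
```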


If you handle dates and times yourself on that level, prepare to be in a world of pain.

Handling time and dates correctly has a similar difficulty to writing your own cryptographic primitives: if you don't know exactly what you are doing, you will shoot yourself (and potentially countless others) in the foot at one point or another.


Tbh, half of my life I handled them on a financial platform which couldn't care less about this list or about standards. It just has a 'wall date' like yyyy-mm-dd and a 'wall time' strictly in the 00:00:00-23:59:59 range, without a timezone. Two separate types. Never encountered or even heard of any time-related bug there.

I find this simplified date/time very useful, unless you have to manage “continuous and/or real” time somehow. 99% of applications are okay with it, because they’re facing users with exactly the same mental model. Also, albeit not correct scientifically, it reflects many developers’ mental model as well. This is much better than a model that just doesn’t match, which is a world of pain. Someone adds 86400 but it’s the same day, good luck debugging it.

I don’t think this is a cryptographic-level issue.


My uninformed guess would be that a financial platform mostly just wants a monotonically increasing clock with which to order transactions. So this would I guess avoid quite a bit of the annoying date related stuff. (?)

Anyway if you queried something to get the current date-time and clock-time, you could be farming out the annoying edge cases to some arbitrarily complicated library, right? Which is the right way to do things.


See also the problems in the Excel datetime calculations that are now unfixable because a fix would break many existing spreadsheets.

e.g. https://en.wikipedia.org/wiki/Leap_year_problem


Unimportant sidenote: I don't like the naming of these "falsehoods programmers believe about X" lists. Surely programmers are more likely to be conscious of these not being true than non-programmers.


It isn't "falsehoods all programmers believe..." it's "falsehoods that [some|lots|many|inexperienced] programmers believe..."

Unless you work all day on time-related software, you'll probably never encounter most of these quirks.

Humans don't write in BNF. We expect, rightly or wrongly, for other humans to use Postel's law to interpret what we mean.


But the point is that programmers are less likely to believe these things than ordinary people are.


The point is these are falsehoods whose belief can affect a programmer's work when they get embedded into the systems they create. The point is to correct yourself, not tu quoque your brother.


By putting the intended audience in the title, you grab their attention more.


It's clickbait.

Computerphile managed a much better title[1]: The Problem with Time & Timezones

A "(for programmers)" could be appended if needed.

[1] https://www.youtube.com/watch?v=-5wpm-gesOY


It's not clickbait; the contents are what the title claims them to be.


But they’re more likely to be impacted.


Well, basically it's shorthand for "things that do not apply to every locale or instance of X, but that programmers have implemented as if they apply everywhere, either because they didn't realize the exceptions exist, or because the exceptions are such bizarre outliers relative to the rest of the world that they were deemed an acceptable won't-fix bug". But that's not very pithy.

But of course what happens is the outliers have to deal with these systems that treat them as bad data possessing impossible properties, and then they curse and say: hey, these stupid programmers don't realize that the world is not just like X1 or Y1, but that us rare instances of X2 and Y2 also exist!

And then an expert with sufficient knowledge about all the outliers compiles a list entitled Falsehoods Programmers believe about {X}.


I still encounter forms on the web or input fields in software that prevent me from entering truthful information or that require me to input information that doesn't exist. There are definitely still programmers out there that make assumptions that don't hold in all cases.


No, they are not. Beyond leap years and leap seconds, I know absolutely nothing about time, how to handle all the "daylight savings" and other kind of bullshit. And I actually do not care, at all. Luckily for me, people who do care and know these things have made libraries that can handle it all while expecting only minimal knowledge from me, such as being aware that time zones exist at all.


The reason why you use those libraries is that you know time is difficult and you need to be cautious. You do not blindly believe those statements about time. If pressed to answer whether they are right or wrong, you would say that you are not sure.


And I'm not sure how this contradicts what I wrote.


Arguing about this is not productive, so sorry in advance, but just to explain myself:

me: programmers are less likely to believe this stuff than non-programmers

you: no, they are not

me: programmers are conscious that time is more complex than it seems to be, therefore they avoid holding simple beliefs regarding it


Programmer that have worked with timekeeping: yes.

Average programmer who hasn't done anything datetime-specific: unlikely.


I always enjoyed an entertaining talk by Peter Hall on this subject: https://youtu.be/bJmx0tcVubY?t=3224


> The day of the month always advances contiguously from N to either N+1 or 1, with no discontinuities.

I would've presumed this one - for the Gregorian calendar, at least. What are some counterexamples?


See e.g. the British transition from Julian to Gregorian calendars, in which eleven days of September 1752 were not observed in England:

https://en.m.wikipedia.org/wiki/Calendar_(New_Style)_Act_175...

IIRC, that event is the reason SQLServer's datetime type is only good back to 1753, and thus why you should use datetime2 for dealing with older historical dates:

https://database.guide/datetime-vs-datetime2-in-sql-server-w...
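Note that most language standard libraries won't reproduce the gap either. Python's date type, for instance, uses the proleptic Gregorian calendar, extending the Gregorian rules backwards as if the switch had never happened:

```python
from datetime import date, timedelta

# Python's date is proleptic Gregorian, so it is unaware of the
# 11 days Britain dropped in September 1752:
d = date(1752, 9, 2) + timedelta(days=1)
print(d)  # 1752-09-03; historically, the next British day was the 14th
```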


Well, if by day advancing you (they) mean adding 24 hours to a given datetime, then a 23 hour day would cause the date to advance by two.


> Britain uses GMT.

Which Outlook in particular seems to get direly wrong. I have lost count of the number of times I have been sent an email from someone using Outlook containing a supposed calendar event in summer which declares that it is at a particular time GMT. It's wrong, and if I were to turn up at the time it stated, I would be an hour late.


Britain uses GMT only during winter. In summer it's BST (British Summer Time). The fact that calendar clients can't determine a timezone for a given jurisdiction on a given day is actually something that I hate.

https://en.wikipedia.org/wiki/British_Summer_Time
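With a tz-aware library this is handled for you; Europe/London carries both abbreviations in tzdata, and Python's zoneinfo picks the right one per date:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

london = ZoneInfo("Europe/London")
winter = datetime(2022, 1, 15, 12, 0, tzinfo=london)
summer = datetime(2022, 7, 15, 12, 0, tzinfo=london)

print(winter.tzname())  # GMT
print(summer.tzname())  # BST
```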


I have never had an opportunity to deploy this tool, but here's a program for generating a TZ shapefile from OpenStreetMap data:

https://github.com/evansiroky/timezone-boundary-builder

In principle a calendar program could use its output to perform GIS lookups of locations then use the tzinfo DB to get the correct timezone at the given location and specified time.


Does anyone actually believe this?

Even the most inexperienced developers I’ve worked with are well aware that time, and timezones in particular, are really difficult.

I’ve yet to meet anyone that has suggested using anything other than a battle-hardened standard library for time.


> Does anyone actually believe this?

Does any programmer believe ALL these falsehoods? Not very likely.

Do most programmers believe AT LEAST ONE of these falsehoods? Likely.

> I’ve yet to meet anyone that has suggested using anything other than a battle-hardened standard library for time.

Many stdlibs still don't account for the most edge of corner cases -- or make it very easy to screw up anyway.


I bet you could get me to give the wrong answer to most of these questions if you worded it cleverly in informal conversation, but the answer to anything time and date related in a technical setting is "Ugh, we'll have to look up the stupid edge cases here," right?


Exactly.

And you also need to understand your timing requirements.

If all you care about is recording a date on an invoice, you really don’t need to worry about leap seconds or smearing or whatever.


Even if you use a library, it's easy to bake in an "obvious" assumption into your program.

Such as storing location + date + time, because that should be unambiguous, right? Or confusing "same time tomorrow" with now + 24h. Are you even sure which one you need?


> Are you even sure which one you need?

Any examples of when you need "now + 24h" instead of same time tomorrow?

For example, daylight savings may start the next day, and time will be adjusted... and everything will be done 1 hour earlier than today despite the clock showing the "same time" as today. So what? Other than your body needing to readjust its sleep cycle, no one else seems to actually care.

Meanwhile, when I set up cron to do something daily, I don't care about this at all; so what if on one day the interval is 23 or 25 hours?
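The two readings diverge exactly at DST transitions, and Python's own datetime semantics make the distinction concrete: adding a timedelta to an aware datetime does wall-clock arithmetic ("same time tomorrow"), while routing through UTC gives true elapsed time. A sketch using the 2022 US fall-back date as an arbitrary example:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/New_York")
utc = ZoneInfo("UTC")

# 15:00 the day before the US fall-back transition (2022-11-06)
start = datetime(2022, 11, 5, 15, 0, tzinfo=tz)

# Wall-clock arithmetic: "same time tomorrow"
same_time = start + timedelta(days=1)

# Elapsed-time arithmetic: "now + 24 hours"
plus_24h = (start.astimezone(utc) + timedelta(hours=24)).astimezone(tz)

print(same_time.hour)  # 15
print(plus_24h.hour)   # 14, because the clocks fell back in between
```

Which one a scheduler should use depends on whether users think of the event as "at 15:00" or "after a day has passed"; a cron-style daily job wants the former.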


I once worked for a company, where the gps platform had multiple time related problems. This list pretty much sums it all up.


Missed one: https://en.wikipedia.org/wiki/Leap_second Yes, there are NOT always 60 seconds in a minute...


I'd like to hear about how to write software orbiting a black hole?


As long as you don't change orbits, you should be fine.


"Falsehoods programmers believe about black hole orbits..."


I mean, you're sort of there already? The galactic barycenter is a supermassive black hole called Sagittarius A*.


Very carefully indeed


Just wait till you hear about leap seconds.

At my previous job matching datapoints on timestamp was crucial, so we did a lot of fun things with time. Leap seconds cost us 3 months of testing.



Similar thing for Names:

Falsehoods Programmers Believe About Names https://news.ycombinator.com/item?id=1438472


The Unreasonable Effectiveness of Falsehoods Programmers Believe Considered Harmful

https://news.ycombinator.com/item?id=3735928559


Awesome Falsehood - A curated list of falsehoods programmers believe in

https://github.com/kdeldycke/awesome-falsehood


Is this link correct? I get a 404.


So harmful they had to censor it.


> There are always 24 hours in a day.

Is this referring to leap seconds, or what do they mean?


When the timezone changes for daylight savings time, you either lose or gain 1 hour in that day (or whatever your local equivalent is).


Except for Lord Howe Island in New South Wales, Australia, where local time changes by 30 minutes.


Makes sense, thanks!


23hs or 25hs during DST transitions are the most obvious exception (twice a year in many many countries).


Sometimes 22hrs or 26hrs if traveling by ship.


Daylight savings, or the device is on a plane and travels through timezones.


Or, back when I was doing airborne geophysical surveying across the Fijian archipelago the plane would flit from -179 to +179 longitude a few times every hour with a potential change of a full day if being naive about time zones.

Data acquisition instrumentation needs to use a lapsed epoch (to avoid UTC leap seconds) with a calibration at the start and end of projects to adjust for drift, etc.



