I'm a big fan of decimal time (or French Revolutionary Time as it's sometimes called). I made my own version a few years back: https://kybernetikos.github.io/UIT/
My version also removes timezones. The numeric time is the same wherever you are in the world, but the display changes based on your location so that local solar midday is straight up on the clock face and local solar midnight is straight down on the clock face. Day and night hours are drawn on the face.
I used to have it attach to your google calendar and draw meetings onto the clock face too, but I think in the last 11 years the google calendar api has changed and I haven't updated it.
Obviously once you have metric time, you need a metric week too, so I have new week days: nullday, unday, duoday, triday, quadday, pentday, hexday, heptday, octday and nonday. There's no need for months anymore, I just number the weeks. Of course 28th Quadday is the 285th day in the year, as the first day is 0th Nullday. Fixing the number of weeks in a year to a round number is left as an exercise for the reader.
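If you want to play with the numbering, here's a minimal sketch (my own code, using the day names above) that maps a zero-based day-of-year onto the week number and metric weekday:

```python
# A minimal sketch of the ten-day metric week described above: day 0 of the
# year is "0th nullday" and weeks are simply numbered (ordinal suffixes left naive).
DAY_NAMES = ["nullday", "unday", "duoday", "triday", "quadday",
             "pentday", "hexday", "heptday", "octday", "nonday"]

def metric_weekday(day_of_year: int) -> str:
    """Zero-based day of year -> week number and metric weekday name."""
    week, day = divmod(day_of_year, 10)
    return f"{week}th {DAY_NAMES[day]}"

print(metric_weekday(0))    # 0th nullday
print(metric_weekday(284))  # 28th quadday, i.e. the 285th day of the year
```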
I also use Decimal Time and the French Republican Calendar as my default calendar and time method. The mental math to convert back to "standard" time and date is very easy once you get the hang of it, and I think it's good practice to not take for granted systems we consider "standard" such as time.
For more on the philosophy of this, I highly recommend Jenny Odell's book, Saving Time. Note that she doesn't talk about the revolutionary calendar in it (to my memory) but she touches on a lot of the realities of time and clocks themselves.
I am curious which leap day system you use for the French Republican calendar. Most popular versions of the calendar I see floating around just use the Gregorian leap year system, which I don't like because it yields the wrong results in the year the calendar was in use. Yet, I don't see mental math as a viable option when using the original date-of-equinox method used during the revolution.
It's not an easy thing; frankly, if you have to back-convert to prior times when the calendar was still in use, you just have to swallow the pill
However, if using the calendar in day to day life, the Revised System is much easier.
To quote Wikipedia:
leap years being every year divisible by 4, except years divisible by 100 and not by 400. Years divisible by 4000 would also be ordinary years. This calendar also has the benefit that every year in the third century of the Republican Era (1992–2091) begins on 22 September.
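For reference, that revised rule as quoted comes down to this (a sketch of my reading of it):

```python
# The revised leap rule as quoted above: every 4th year is a leap year,
# except years divisible by 100 but not by 400; years divisible by 4000
# are ordinary years too.
def is_leap(year: int) -> bool:
    if year % 4000 == 0:
        return False
    if year % 400 == 0:
        return True
    if year % 100 == 0:
        return False
    return year % 4 == 0

print([y for y in (1996, 1900, 2000, 2100, 4000) if is_leap(y)])  # [1996, 2000]
```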
That's about the non-timezone version, but it skates over the fact that he doesn't know the answer in the timezone version either. He says "Google tells me..." — well sure, if you're allowed to use Google you could just as easily have asked "what are typical daylight/waking hours in x location?" This is much better anyway since in some countries culturally you siesta in the early afternoon (don't call please) and stay up later.
Anyway there is something of a genuine problem here (although I don't consider it serious) and I'd solve it by drawing the world on the side of the clock and allowing you to rotate the face by selecting a place on the map.
The arguments against removing timezones apply to adding timezones - it just depends on what you're used to or which type of conversion you find easier. The arguments in this oft-circulated article really fall flat.
Simple and useless. You can have any system you want for your personal use, but clocks and calendars are for collaboration. It's nice that you have day and night hours on the clock where you are. Now you will travel a few thousand kilometers. You know your decimal time when you get there, but do you know whether it will be daytime or nighttime? You get there, and it is winter and the dark time is 3x the daytime. Have you got any idea when anything opens and closes?
And why have weeks if you don't have months. How about just numbered days?
In the full version there is an unrolled map of the world down the side with the terminator drawn on which solves all those concerns.
Even if those problems were harder to solve, it's far more valuable to have a single time for your meeting than multiple times (depending on the time zones and daylight saving time of the participants). The pain of organising international meetings was one of the reasons I did this.
It's been explicitly highlighted to me before that the calendar in the French Revolution is called the French Republican Calendar and not the French Revolutionary Calendar, so I'd assume the same is true and it would be French Republican Time
No, because it's a humane rather than scientific system. Day lengths in a scientific time vary, but in this system the hours are defined to be 10ths of a solar day, so they vary as the day varies. Scientific measurement should be done with a different system. I'd suggest something based on the speed of light.
I recommend using as little date/day mechanics as you can get away with. Instead, use monotonic durations as much as you can [1].
If you continue to program with date/day mechanics you'll run into plenty of issues when users cross timezones or just set the time on their phone to a different one.
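Here's a small sketch of what I mean in Python (the sleep call is just a stand-in for whatever you'd actually time):

```python
import time
from datetime import datetime, timezone

# Measure the duration with a monotonic clock: it never jumps when the user
# changes the wall clock, crosses a timezone, or DST kicks in.
start = time.monotonic()
time.sleep(0.1)                      # stand-in for the work being timed
elapsed = time.monotonic() - start

# Wall-clock time is only for display/logging, and is best kept in UTC.
print(f"finished at {datetime.now(timezone.utc).isoformat()} after {elapsed:.3f}s")
```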
My favorite thing is when you connect to cell towers that have DST in one state and not in another, and the cell phone gets the time/date from "the network".
Yes, getting rid of the various savings time is important too.
All the people pointing out the (genuine) problems with decimal time seem to be weirdly tolerant of the problems with our current system, which has much worse problems.
If you imagine the reverse situation, where society uses a decimal time, and some people are arguing we should switch to this system because despite all the things it makes worse and harder, it divides better and the numbers map somewhat (although imperfectly) to the actual time-of-day in various locales and (even less perfectly) to the kinds of things people might be doing at that time, everyone would think they were crazy.
Firstly the time variance we're talking about is small and will only matter when milliseconds matter, secondly I addressed this - for scientific measures and scenarios where milliseconds matter a different time unit should be used.
If you want to prioritise ease of moving between the two systems, I'd suggest a normalised version of the new second. If you want to prioritise science, then something that makes a nice round number when giving the speed of light would be nice. If you want to prioritise ease of adoption, just keep the current SI second, just stop using it to measure times of day.
Either way the symbol for writing it should be different.
Of course, computers and humans must use the same time system. Which is why now that we have computers, we always specify the number of milliseconds in everything. Unfortunately, until someone builds a type of computer that can do the type of math needed to convert between two measurement systems, this problem will remain an insurmountable obstacle.
Which is why my system doesn't address year length. In practice I'd stick with the same leap day concept we currently use, so some years would have 36th quadday as their last day and some would have 36th pentday as their last day.
You'd think we can just interchange the . and : so you should wake up 12:83…, or 2:83… the next day.
But no, it's apparently 2:75 metric time. Why?
Also nice how they IMMEDIATELY ran into a repeating fraction, 1/3, which is exactly the point of the 60-minute hour, 24-hour day, or 360-degree circle: lots of factors.
> Also nice how they IMMEDIATELY ran into a repeating fraction, 1/3, which is exactly the point of the 60-minute hour, 24-hour day, or 360-degree circle: lots of factors.
While I agree in principle, this example is not evidence of it. It's not like we measured the amount of sleep needed and it's precisely 8 hours. A more reasonable thing to say would be: You need to sleep for 3.5 metric hours.
It’s almost certainly a rational number as activity in the body is finite. There will be an exact point at which a certain molecule has interacted with another molecule marking the exact point of perfect sleep. Averaging many rational numbers can’t get you an irrational number.
Also it is invalid to compare infinite series like you do in your parenthetical argument; there are infinitely many irrational numbers and infinitely many rational ones. If you do it, you run into contradictions. (Something which is related to Cantor's paradox)
The statement OP put in parentheses is a well-defined statement in mathematics. It means that the Lebesgue measure of the set of the rational numbers is zero in the space of real numbers.
"If the physical property that time meassuring devices meassure is continuous, it must also contain irrational numbers?" Is that what you are refering to?
Mabye... To me it seams like nothing is truly continuous i nature. But mabye there is such a thing somewere out there somewere.
But irrational numbers require definitions that contain or require recursion. Mabye physical time is built with such a recursive definition?
"There will be an exact point at which a certain molecule has interacted with another molecule" statement is contradictory with the uncertainty associated with time as described by quantum mechanics and molecular interactions due to Brownian motion. Material that is returned when searching for those topics will better answer your further questions about them and those above than I here.
The uncertainty principle is about unavoidable measurement "errors" (measurements that can’t be done). If you are going to measure the time (or anything) you will always get finite-precision values. Finite-precision values are not irrational.
Either one talks about the underlying physics or about the measurements of it. If one talks about the underlying physics, all bets are off; it is unmeasurable by definition. Anything is possible below the measurement threshold (including irrational numbers).
If one talks about the measurements, you will always get finite-precision, rational values.
Are you sure? Isn't the average calculated by dividing by n, an integer?
Edit: I mean, you're right that nearly every number is irrational. But I think averages are going to be some of the tiny fraction of numbers that aren't.
Base 12 being 3-smooth, any number whose only prime factors are 2 and 3 has a reciprocal with a terminating expansion.
Base 60 is 5-smooth, so any number whose only prime factors are 2, 3 and 5 has a terminating reciprocal expansion.
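To make that precise: 1/n terminates in base b exactly when every prime factor of n also divides b. A quick sketch to check it (my code, nothing standard):

```python
from math import gcd

# 1/n has a terminating expansion in base b exactly when every prime
# factor of n also divides b; strip the shared factors and see what's left.
def terminates(n: int, base: int) -> bool:
    while (g := gcd(n, base)) > 1:
        n //= g
    return n == 1

print([n for n in range(2, 13) if terminates(n, 10)])  # [2, 4, 5, 8, 10]
print([n for n in range(2, 13) if terminates(n, 12)])  # [2, 3, 4, 6, 8, 9, 12]
print([n for n in range(2, 13) if terminates(n, 60)])  # [2, 3, 4, 5, 6, 8, 9, 10, 12]
```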
Just as SI has to add in deg min sec for fields like astronomy.
As the practical numbers are quite dense up to 60, they could divide by multiplying by the reciprocal for many more numbers.
Floating point is more complicated than just the base as the radix also matters. C(++) finally got decimal radix support this year in the standards and IBM has had decimal floats for a long time.
FFT and encryption often use mixed radix despite being a binary base too.
It is a far more complicated subject than it appears on the surface.
But there are problems that aren't easily solvable in base 10. The degrees of a circle are an example, and it's why navigation uses the nautical mile, where 1 nautical mile = 1 minute of latitude.
12, 60 and 360 are superior highly composite numbers, and 12 is the smallest 3-smooth and 60 is the smallest 5-smooth.
This also means that with 360 degrees one can divide a circle or semicircle into 12 sections with just a square, a 3-4-5 triangle and an equilateral triangle, where decimal or even radians requires the square root of 2, pi, etc...
I am a fan of universal units of measurement, but had they been base 12 it would have been better IMHO. SI could be more broadly adopted if it had been base 12.
Seximal is actually great for hand counting. You can use fingers the classical way for one digit per hand or base 6² compression for two digits per hand, allowing you to count up to 1296 with two hands.
The whole argument started when somebody above claimed that we probably do not sleep exactly 8 hours, but some weird irrational number. That problem persists even if you measure in radians...
Do the physicist thing and just define all your constants to be one.
"How long is a day? 1 day."
"I slept for .37 days."
This nicely unifies things with the tau manifesto[1] people since 1 day = 1 turn = 1 tau = 2 * pi radians and reinforces the periodic nature of these things.
I don't really care how much sleep I get as long as it's enough. I care greatly when the bus leaves and when my meetings are. Anything that hopes to replace a clock had better be adding precision rather than removing it.
Decimal seconds are slightly shorter than normal seconds, so times that use seconds are more precise. The other fun thing is that decimal time doesn't really need names for the levels, it's just numbers of significant figures so if you only want to give the first digit of the seconds that works too.
It's less precision for a given number of symbols. Fewer usable factors mean you can only reasonably divide up segments of time into larger, less precise chunks.
This is far from the truth. The closest time before midnight you can conveniently represent in the current time system is 23:59:59 which is six symbols for a standard second before midnight.
In decimal time the closest time before midnight you can conveniently represent would be .9:99:99, which is noticeably closer (0.136 of a second closer) to midnight and uses only 5 symbols.
This should not be surprising - our current system is very inefficient in its use of digits.
45 minutes and 00 seconds is very close (less than a quarter of a second difference) to :23:33. But there's no magical or particular reason why 45 minutes is an important time period to make a nice round number. :24 is a very convenient number (so many factors!) that's very close to the same time duration. That's 46.2963 minutes.
Or maybe you are referring to three quarters of the way through the major division. That's actually pretty natural too :75 like a percentage. Now dividing the major division into thirds is a little less convenient but I do that far less than I need to add and subtract times which this system makes much more convenient.
How close do you want to get? For what purpose are you representing 12:15?
.5:10 is 12:14 and 24 seconds. That's pretty close and uses only 2 sig figs - even fewer than the four you needed to represent 12:15.
But also, why would you obsess about these specific times? If we were using the proposed system and someone made the argument to switch, you'd be saying 'How well does your system represent the time .5:1? It's only 2 sig figs, but you need to go down to seconds to accurately represent it - 12:14:24.'
It can get worse too - .5:01 is 3 significant figures, but to represent that time in the current system requires that you go down to tenths of a second with 7 significant figures - 12:01:26.4. Or if I go down to second equivalents, you sometimes need to go down to milliseconds - .5:09:01 is 12:12:58.464
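In case anyone wants to check these numbers themselves, this is the conversion I'm doing (a rough sketch; "decimal time" here is just the elapsed fraction of the day):

```python
# Sketch: convert a decimal time of day (fraction of the day, e.g. 0.501
# for .5:01) into standard h:m:s, and back. Used for the examples above.
def decimal_to_hms(frac_of_day: float) -> tuple[int, int, float]:
    total_seconds = frac_of_day * 86_400
    h, rem = divmod(total_seconds, 3_600)
    m, s = divmod(rem, 60)
    return int(h), int(m), round(s, 3)

def hms_to_decimal(h: int, m: int, s: float) -> float:
    return (h * 3_600 + m * 60 + s) / 86_400

print(decimal_to_hms(0.501))      # (12, 1, 26.4)
print(decimal_to_hms(0.50901))    # (12, 12, 58.464)
print(hms_to_decimal(12, 15, 0))  # 0.5104166...
```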
Now apply that same logic to how we write our numbers. Duodecimal notation makes very clean multiplication tables and makes arithmetic easier to learn.
I'm a twelve maximalist. Let's convert SI to duodecimal prefixes. One kilometer should be 1728 meters (of course that's 1000 in duodecimal). One centimeter should be 1/144 of a meter.
I keep hoping for a chance to welcome some highly composite overlords.
Is 1/3 (1/6) such an important concept? 1/2 (1/4) is natural in metric.
I would think a bigger benefit is that you can drop up and down without any math depending on the precision you need. Or use decimals, no math involved either, since it's the same thing.
Yes, 1/3 and 1/6 are quite important. Do enough making and you will run into those fractions plenty. It just turns out that in practice, as you say, easy unit conversions outweigh the benefit of clean division. Metric's important benefit is that it's a consistent base, not that it's specifically base-10.
And now that we use computers for a lot of measurement, those infinite decimals aren't as big of a deal. (Well, most of the time, anyway. Representation of infinite fractions famously creates some programming problems. But I mean for the user making the measurement.)
Base-10 is easy to learn since it's the same we use for counting, but I think there's an interesting argument to be made that purely from a science/engineering/making perspective it would be better to use a consistent base-12 measurement system.
Now apply the same logic to currency. What is the combined value of thirteen shillings and ten farthings? Simple, it's 317 halfpennies, or £317/480. It's much easier to deal with than a more difficult calculation like $0.25 + $0.37 = $0.62.
I didn't check the details of that metric time system but there would be no problem living a 10-hour day with 100-minute hours and 100-second minutes. The second would be defined with a different number of "cycles of the radiation produced by the transition between two levels of the cesium-133 atom" and that's all. Instead of rounding our lives at "our" quarters of hours or ten-minute, five-minute marks, we would round to the metric quarters of hours, maybe eighths of hours (we could have found a name for that, like for coins) etc, and nobody would notice because that would be what we are born with.
Similarly, people buy 50 cm x 70 cm frames in metric countries and 20" x 30" frames in the USA. Nobody thinks about that except frame factories in China, that have to cut them in two sizes.
Nobody questions the logical consistency of the system, I think.
Presently, we use 24-hour time (AM/PM or otherwise), so the loss in precision is enormous going from 24 to 10.
Think about it.
A tenth of a standard hour is 6 standard minutes.
A tenth of a metric hour is 144 standard minutes, more than two standard hours.
A hundredth of a standard hour is 36 standard seconds, or half a minute. It is in the context of hours a negligible amount of time.
A hundredth of a metric hour is 14.4 standard minutes! Imagine being 0.01 metric hours late for a meeting.
So everywhere you now have to specify time to two decimal places /at least/, three to have sub-standard-minute precision.
Layer on top of this the fact that the new system has significantly fewer distinct prime factors, so you quickly run into repeating fractions. Disaster.
What? A metric hour is 144 standard minutes - or do you imagine that a day of metric hours is 144 × 10 × 10 = 14,400 standard minutes long?
A tenth of a metric hour is 14.4 standard minutes. A hundredth of a metric hour is 1.44 minutes. Being 0.01 metric hours late, or 1 metric minute late, is not too much more than being 1 standard minute late.
And current time is specified to 6 digits for second precision - hours, minutes, and seconds.
Don't we have the same basic criticism going from Fahrenheit to Celsius? You go from a 10 degree swing being big to ridiculously huge.
That is, arbitrary number is arbitrary, at large. Most of the "benefits" of any system won't actually be realized by most people that are using it. Consistency, on the other hand, is hugely important.
> Don't we have the same basic criticism going from Fahrenheit to Celsius?
Sure, and one shouldn’t do that either. Fahrenheit’s range (0 being the freezing point of brine, almost but not quite intolerable to a human, and 100 being roughly body temperature and also almost but not quite intolerable to a human) is far more human than Celsius’s freezing-to-boiling range.
Having never used Fahrenheit in any capacity, the math for it feels inhuman.
Having grown up in metric, the values don't feel 'inhuman', just a scale that I'm familiar with. I know that my body is usually at 37, I know that water boils at 100, I know that I don't like anything below 8°C or above 40°C.
>Presently, we use 24-hour time (AM/PM or otherwise), so the loss in precision is enormous going from 24 to 10.
>Think about it.
No, YOU think about it. When do you use hour-only precision today ?
When you say "I have meeting at 8" that never means "any time period between 8 and 9", that means "it starts at eight". That doesn't change with change of the length of the hour.
You'd still use hours on their own only if you mean "an hour and close to zero minutes around that hour". You'd still add minutes for anything else.
> A hundredth of a metric hour is 14.4 standard minutes! Imagine being 0.01 metric hours late for a meeting.
...no? It's 100 metric seconds, i.e. 86.4 standard seconds, so ~1.44 standard minutes
Imagine being 1.44 minutes late instead of one!!! Such horror!
Instead of changing the definition of the second, it might make sense to separate day-time from scientific time. Decimal hours and minutes would be normal time keeping. If more accuracy were needed, you would switch to centi-minutes for casual use or seconds for scientific use.
One nice feature is that the day-time would be different on other planets. There would be a Mars-day and a Mars-hour. But the second would be the same.
An hour is just another name for a deciturn, or a deciday.
Tau is not a good unit for angles in general, unless you have circles, or you need lengths of circumferences somewhere. (You want to paint your clock and calculate the amount of paint, or something.) Turn is the natural angle unit. And since the earth rotates 1 turn/day, the amount of turns is the same as the amount of days.
Metric time is about factors too, but it prioritizes the factors that simplify the comparison across (literally) many orders of magnitude. This is way less helpful for everyday life because we usually only reason within 2-3 orders of magnitude, but we deal with tons of harmonic subcycles that the 24h clock makes easy.
For a fun, more justified usage of metric time, check out Vernor Vinge's "A Deepness in the Sky" (also IMO one of the best sci-fi novels of all time). A spacefaring humanity that stretches its life across journeys and projects that span centuries, and which has to artificially produce its own daily cycles, gets a lot more value out of metric time.
Came here to mention/contrast Vinge's metric time with my (quick, approximate) understanding of this approach...
The problem with this approach is it attempts to "redefine" hours and minutes... part of what I liked in Vinge's/Deepness' approach was, it ignored all that and just talked about seconds in standard exponential scientific notation... it makes a lot more sense in space though, where there's no need/logic to connecting or synchronize to a specific solar cycle, so they just think/talk in kilo-seconds (about 40 mins) and mega-seconds (about 11.6 days)... or at least those are the two I recall them using enough that I got semi-used to thinking in them while reading the book... I had to convert, but I didn't have to redefine any existing unit or remember/remind myself which meaning of hour/minute they're using, because nothing changed.
I'm a huge fan of both, but you're not the first person I've heard who had problems with the writing style / dialogue / characterization of Fire. In that respect, I'm not sure Deepness will feel different. Vinge has a pretty consistent style and cultural stance, which IMO is an interesting balance of Heinlein-style techno-libertarianism and genuine "humanism" (which extends to dogs, plants, and other thinking creatures).
Structurally, though, it's pretty different. The characters have much more opportunity to develop, and it's purely hard sci-fi where Fire used its setting to veer into fantasy ideas. So my biased advice remains: give it a shot :)
> You'd think we can just interchange the . and : so you should wake up 12:83…, or 2:83… the next day.
> But no, it's apparently 2:75 metric time. Why?
You can, they just messed up the example by unnecessarily rounding off the times. The night from 9:50 till 2:75 metric time only lasts 3.25 metric hours, or 7h48m in standard time; less than the 8 standard hours they started with.
You know what other measuring system is based around highly composable numbers?
It's not metric. It's the imperial system. Halves, thirds, quarters are all neat numbers. It does get a little weird because we only use feet, yards, and miles today, but there is a progression from inch to mile that all make sense to some degree.
Similarly with volumes. You can get from teaspoon to gallon without having to worry about any weird decimals.
The only thing metric really has going for it is uniformity of conversion.
People shit on it, but the imperial system is not actually that horrible.
I disagree, it’s not equal but different. The French famously tried to switch to a decimal system for angles before, but failed in no small part because of the relatively few unique prime factors. Being able to divide evenly by three turns out to be more important than five.
To quote yet another time format: NTP 64-bit timestamp format (rfc8877), which is 32 bits seconds since epoch + 32 bits fixed point second fractions. (Outside of Network Time Protocol, you'll find this baby for instance in ISOBMFF ProducerReferenceTimeBox(prft)).
Here seconds are just 1/(24*60*60) of a day as expected, but the base 2 fixed point part, where a tick "is roughly equal to 233 picoseconds", makes you want to pull your hair out if you just want to accurately express milliseconds. (Similarly for other timescales frequently used in media processing, like 90kHz, 25, 60 or 29.97)
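As a rough illustration (my own sketch, not anything from the RFC): milliseconds get stored as the nearest multiple of 2^-32 s, so most values don't round-trip exactly:

```python
# Rough sketch: store a millisecond offset in the 32-bit binary fraction of
# an NTP timestamp (ticks of 2^-32 s) and read it back. Most decimal
# fractions of a second don't round-trip exactly; power-of-two ones do.
def ms_to_ntp_frac(ms: int) -> int:
    return round(ms / 1000 * 2**32)

def ntp_frac_to_seconds(frac: int) -> float:
    return frac / 2**32

for ms in (1, 333, 500):
    frac = ms_to_ntp_frac(ms)
    print(ms, frac, ntp_frac_to_seconds(frac))
# 500 ms (a power-of-two fraction of a second) survives exactly;
# 1 ms and 333 ms come back as 0.000999999... and 0.333000000...-ish values.
```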
The answer to all this is of course: hand waving — "you don't need that". Your time can be perfectly accurate in itself (ie. an accurate discrete sample of continuous time), even if no accurate conversion exists to some other time system.
It's tongue in cheek or wink wink. No one uses it seriously but there is sometimes an undercurrent of resistance to what's seen as an external cultural standard. But you'll never see it used as a kind of rallying cry.
Yes, a very confusing error.
The metric time is the SI time (24h 60m 60s).
The decimal time is base 10.
The French tried to get it used during the revolution and it did not work. It's the only unit that resisted decimalization, along with a couple of other ones in a handful of countries still using something called « imperial units ».
Another similar thought experiment is binary clocks, which I remember using to get used to reading base 2. [1]
Weekly clocks are also a good way to change perspective on time. [2]
Both are fun to use, especially with other people if you manage to get them to experiment with you. Both have the advantage of avoiding any confusion with SI time.
>with a couple others ones in a handful of countries still using something called « imperial units ».
The only country I know of offhand that uses "imperial units" is the UK.
There's a different, but similar (and sometimes overlapping) system called "US Customary Units" that's used in the US. Imperial pints and gallons are NOT the same as US pints and gallons.
Thanks !
Sorry I overlooked this.
Did a little research and I think the most confusing thing in this is the ton. At least pints, gallons and miles have a different name than the metric unit and are way different from their SI equivalent. The tons are close, within around 10%: one a bit more (the long ton) and one a bit less (the short ton). A perfect way to get the wrong quantity of a thing without noticing it at first.
And if that’s not confusing enough, using « long » and « short » for a mass unit..
The now relatively uncommon UK gallon is the volume occupied by ten lbs of water, in the same way a litre of water weighs a kilogram. Not only are the pints bigger, there are also twenty ounces (also rarely used now) in them, which means a fluid ounce of water weighs, by definition, an ounce in the imperial system.
A second is the half-period (one swing) of a pendulum with a length of one meter. :-)
(But not really; that was originally considered as a way to standardize the meter, IIRC, but the period of a pendulum varies too much even over the area of France to be used as a concrete standard.
But the relationship is remarkably close for a coincidence, like the way a rod is almost exactly five meters.)
The definition of the second was restated in the 2019 SI revision:
> The second is defined by taking the fixed numerical value of the cesium frequency ∆νCs, the unperturbed ground-state hyperfine transition frequency of the cesium-133 atom, to be 9 192 631 770 when expressed in the unit Hz, which is equal to s⁻¹.
The conference where this was made official was quite a beautiful, upbeat demonstration of people from around the world coming together to agree upon a change:
Technically, yes. But measurements of weight, volume, and temperature are also part of the metric system, and those didn’t derive from the meter.
Seconds are also part of the metric system, but one of the few not based on decimal/base-10.
Again, I’m just speculating that the author used “metric” because it often represents decimal/base-10 measurements. Not really arguing whether they were technically correct in doing so.
This isn't totally true. Mass and volume measurements were indeed derived from the meter. A gram is a cubic centimeter of water. A liter is 1/1000 of a cubic meter. Apparently Celsius is derived from Kelvin (really just translated so 0 is the freezing point of water), which is derived using metric units in a formula that is a bit beyond me but available here:
> But measurements of weight, volume, […] are also part of the metric system, and those didn’t derive from the meter.
For volume this is obvious nonsense since the metric system expresses volumes in … cubic meters! And even for weight, a kilogram is “the weight of a liter of water” (that is, a thousandth of a cubic meter of water).
Weight and volume were defined based on the meter. Volume is just expressed in m³ or litres, a litre being 1/1000 of 1 m³. Mass was originally defined such that 1 kg is the mass of one litre of pure water at sea level.
And based on 1,000 (kilo) of something: 1,000 metres (kilometre), 1,000 grams (kilogram). You can have centimetres, but that's not in the spirit of the metric system really.
Maybe we need a cron for an hour or a day? 1,000 crons to an hour sounds better than breaking a day of 1 cron up into millicrons.
> You can have centimetres but that not in the spirit of the metric system really.
"Centi" is an SI prefix just as much as "kilo" is, and has been part of the metric system since 1795 just like "kilo". I don't see how it's "not in the spirit" of the metric system. It also fits in neatly in that 1cm^3 of water is roughly 1 gram (the original provisional definition of gram was 1cm^3 of water at the melting point of ice; the current definition is more precise), and so 10cm^3 of water is roughly 1 liter.
The SI system has prefixes going up and down one power of 10 up to 10^3 and down to 10^-3, and then in steps of 10^3. They're all equally part of the system; some are just more common in some contexts than others (e.g. we use hectograms but rarely hectolitres, and decilitres but rarely decimetres, and centimetres and centilitres but rarely centigrams) depending on what happens to be convenient.
E.g. kilo (10^3), hecto (10^2), deca (10^1), deci (10^-1), centi (10^-2), milli (10^-3), but then mega (10^6) and micro (10^-6) are the next steps.
Metric time is a better name, since it states that this time comes naturally from our common metric system, and something is very very off that we don't use it.
I don't think it is misleading. Although there is a name-collision with the SI time system that some people call metric time [0] already. I don't know how many people, or how official it is. I'll probably keep calling decimal time metric time, and we'll see if there is a real collision/confusion/misleading.
As mentioned elsewhere in the thread, it's called the metric system because the values were derived from the meter. Decimal time has nothing to do with the meter, and "decimal" refers to it being base 10.
Watch face has 4 hands, one for each pair of characters in the hex representation of the current unix time.
Being hex, it kinda makes it easier to understand for me in that the "minute" is 255 seconds. The next chunk of time is 65,025 seconds or about 18 hours. Then comes 16,581,375 seconds, which is almost 192 days.
Actually the minute is 256 seconds (2^8). The next chunk of time is 65536 (2^16). Then comes 16777216 (2^24). The whole clock rolls over at 4294967296 (2^32).
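A tiny sketch of the idea (my own code, not the watch's): take the unix time mod 2^32, show it as 8 hex digits, and each pair is one hand:

```python
import time

# Tiny sketch of the "hex unix time" idea: the current unix time as 8 hex
# digits, split into 4 byte-sized "hands"; each hand rolls over every
# 2^8 = 256 ticks of the hand below it, and the whole clock at 2^32 seconds.
t = int(time.time()) & 0xFFFFFFFF
hands = [(t >> shift) & 0xFF for shift in (24, 16, 8, 0)]
print(":".join(f"{h:02x}" for h in hands))  # e.g. "65:3f:a1:0c"
```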
I was very confused by this too and finally realized the mistake. The chain of logic went like this:
> There are 2.4 standard hours in 1 metric hour
> Therefore there are 240 standard minutes in 1 metric hour
> Then divide by 100 to get 2.4 standard minutes per metric minute
Except there aren't 240 standard minutes in 1 metric hour. There's 2.4 * 60 = 144. That the author of the page couldn't keep the conversions straight does not bode well if we were to switch as a society...
I still don't understand how a metric day matches up with a standard day.
I flunked 10th grade math and later dropped out, my brain is having a stroke just reading those graphs.
I reverted back to seconds, which is how I used to use Unix timestamps when I first started with computers. But one day would have 100000 seconds, compared to 86400 seconds in standard time, so how can they both measure a day?
Strange the page does not describe the "metric second" then, since it is not the same as the SI second. I thought that the second was the unit which was the same as in SI (=metric). But then it is not the same as the SI second, so not metric at all. Very confusing.
That must be it. I kept watching the two clocks to try and figure out if the second was equal but couldn't. This is the key and should be the first thing you read, seconds not being the same length is a huge detail.
I feel it's an error to use the same names in the two systems. The values are different, it would be much less confusing to use names that are clearly different too.
It's not, hence leap second corrections to our current imprecise, earth-based observed solar time, which is based on mean solar days (which are not apparent solar days).
Of course even if it were regular there's that pesky difference in rotation relative to what now??
Sidereal rotation time isn't equal to solar rotation time (mean or apparent).
I think it's more trying to make things which vary fit in a "you shall not vary" square box that is the problem. Technically, this metric/decimal method makes more sense for what we're trying to do, but it's less of a "time" thing than it is a "let's have the same numbers every day so we can agree on when synchronous events need to happen." To _measure_ elapsed time, using a fixed unit such as a second is perfectly fine.
My favorite is traditional Japanese time. It breaks day and night into 6 equal time periods each, and adjusts them as the seasons change. I made a Shaku Dokei (19th century Japanese pillar clock) simulator to play with it.
In a way, China kind of has something like the Swatch gimmick for real. There's just one time zone in the whole country (which is roughly the size of the Continental US). This has benefits (easy to coordinate video conferences in different cities) and drawbacks (the official time is far off from what the sun would indicate in much of China).
It only works because the overwhelming majority of the population and all of the political and economic power lies on the east coast of China in a single time zone. I doubt that the people in Urumqi are happy to have the sun rise at 10 am, and I doubt that anyone cares about their opinions.
I stayed in Urumqi four times (in 1993, 1996, 2006 and 2010), each time for some weeks. It is really confusing that the (traditional) working hours are from 10AM to 2PM and 4PM to 8PM. I found myself every time looking at the clock and subtracting two hours. Similar to when we changed currencies in the Netherlands when the Euro was introduced. I guess it would have taken about half a year to stop doing the reverse time calculations.
Given that the sun rose at 10 am approximately 0 times this year [1], I guess the people in Urumqi were ecstatic. Also note that the sunrise time varied by about 3h over the course of the year, so how many times do you want to change the clocks?
Of course, to your actual point, and everybody else who brings up that same one: you do not need to wake up at 8am every day ... If the sun actually rises at 1400 then feel free to start your day at 1500. I'm not sure why people keep arguing as if they can't figure out a time besides 8am to wake up; look at the world around you, so many people wake up at wildly different times in a timezone.
I get that it's all arbitrary, but if we're going to go for a global time system, just makes sense to me to try and align with already-existing universal standards. Both UTC +0000 and UTC +0100 are pretty alien to me as someone in UTC +0800 so it's not like I have any bias toward Switzerland or the UK either way...
I think the idea is that for most human uses of time we don't specify start or end times to a precision of more than about 5 minutes. Stuff like train timetables you might want to go down to about a minute. So one could argue that we have at least 60 times the resolution we really need for day-to-day use.
If you absolutely need more precision (accurate timestamping) then decimals are available.
Yep tho most ppl use microwaves by pressing the "30s" button (I guess it would be labelled 1/2 or 1/3) n times. Other cooking seldom requires time precision < 1 minute, for finicky precise things you usually watch the process and manage it by eye, rather than relying on absolute time.
"way less precise" ? There are only 1440 minutes in a day, so a beat is 1 minute and 26.4 seconds, precise enough. And then, if you you want more precision, like we use seconds for minutes, you can divide a beat by 100 (@500.12), not less inconvenient than using seconds.
Apart from history and (dis)advantages, nobody seems to address the site's rationale for this (rather impactful) idea:
> would make all the mental math we have to do when adding and subtracting time so much easier—especially when it comes to different timezones
First off: the time zones argument is BS. And the people I know have no problem subtracting or adding (quarter, half or whole) hours to a given time. It's a skill we picked up at primary school, so it really can't be that hard. The people who can't do that probably also will have problems with decimal time. The only thing that takes more mental effort is something like "193.8 minutes after 17:03", but how often does that happen?
The arguments following the rationale are also BS: there's no AM/PM in a 24 hour clock (as mentioned in other comments), and there's no advantage to 3.33 vs 8 hours of sleep.
IMO there are no advantages, and the page doesn't discuss the disadvantages or how to overcome them, so frankly it's irrational. There's no reason to discuss this.
You could call this lots of things, but you shouldn’t call it “metric” because its second is not a metric second. In the SI (metric) system, the second is one of the fundamental units. The world does not need a conflicting definition for the second.
As someone who’s implemented a date and time library, the real pain is in dates, leap seconds and time zone transitions. 86,400 seconds in a day is a relative piece of cake.
The AM/PM thing is a solved problem. Many countries (not the one I live in) already use a 24 hour clock, in which 11pm is 23:00. Because many countries use it, most devices that keep time can be set to 12 or 24 hour clock. That includes almost every clock I own, including the oven in the kitchen, my car, all the HVAC units, and of course phones and computers. An exception is the irrigation system – the old one (designed in the USA) supports 24 hour time, but its replacement (designed in Australia) does not. I don’t think anyone, seeing all my clocks, has ever commented on them being in 24 hour mode. Most people have seen it before.
You are right of course, but keep in mind (and this is just my thought after reading) that the French tried to implement decimal time along with the rest of what we call the metric system in the 18th century. And it was the only part people rejected. So I think the author named everything metric to make the point that if it were part of the system, a metric second would be … long. But again I could be wrong. In any case the page would also work by naming everything decimal-something. Maybe not as catchy.
I don’t know the exact history, but the rest of the metric system is designed with a base unit and decimal derivatives. Assuming we want to keep the day length consistent (I can’t imagine a system being practically useful otherwise), we’d have decidays, centidays, etc. and not have hours, minutes, and seconds in the system at all. A system with days, decidays (2.4 hours), millidays (1.44 minutes), and microdays (0.0864 seconds) doesn’t seem bad to me at all, I’m sure people would come up with a good name for 10 microdays for daily use (0.864 seconds).
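A quick sketch of what that looks like in practice (unit names taken from the comment above, nothing official):

```python
from datetime import datetime

# Sketch of time-of-day as decimal fractions of the day, with the unit
# names from the comment above (deciday = 2.4 h, milliday = 1.44 min,
# microday = 0.0864 s).
def day_fraction(now: datetime) -> float:
    seconds = now.hour * 3600 + now.minute * 60 + now.second
    return seconds / 86_400

f = day_fraction(datetime.now())
print(f"{f:.5f} days = {f * 10:.2f} decidays "
      f"= {f * 1_000:.1f} millidays = {f * 1_000_000:.0f} microdays")
```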
kDay, MDay, etc could work in space but they don't fit well with the length of the year. As long as we live on Earth we cannot escape from our planet taking about 365 days to orbit the Sun. History proved that it's convenient to have the same event (let's say start of spring) falling at the same date every year. Hence all the refactoring of calendars and leap years.
I wonder how we would settle that matter if we'll ever be able to travel fast between planets. Each city had its own time zone before trains required us to sync them because of conflicting railroad timetables. So we ended up with the current timezones. With planets, each one would have its day length and number of days in the year, maybe even inconstant seasons in the case of precession of perihelion or double star systems. I'd say we'd settle on local time and a common space time but who knows.
I never thought about this. And I actually never bothered to read up on the proposed terminology. The only thing I always assumed was that the first draft of all the terms is not 100% what we use today, especially because it comes from France. But those are only assumptions. But yes, I think you are right about the naming convention.
Time was the same for the people that were measuring it with the same tools.
Western sundials started with 12 hours as they worked only during the day [1], and people that were not measuring time eventually measured it with a 24-hour system.
I could not find many sources about Chinese sundials, but from the pictures at [2] you can see that they had 12 hours for the whole day. An hour on the second sundial is divided into 8 parts. The one in the first picture seems to have the same 8 characters as the other one, but each hour is divided into 2 parts, each divided into 4 parts.
I'm not surprised that everybody settled around some small and convenient number. 12 has more factors than 10 and dividing by 2 is more convenient than dividing by 3. I would be surprised to find a 9 or a 15.
There is no conflicting definition. Minutes, seconds, thirds, and fourths etc. are sexagesimal subdivisions, as tenths, hundredths, thousandths, ten thousandths etc. are for decimal.
I know what you mean, and you are correct in that this is why we have minutes and seconds of arc, but the linked page is literally suggesting a different definition.
there are no multiple definitions going on: what happened is an _elision_.
The actual word is "second minute" (as opposed to the "first minute").
Most languages have by now elided "first" from "first minute" resulting in the "minute" as we know it today, and elided "minute" from "second minute" resulting in the "second" as we know it today.
i.e. "second" literally means "the one that comes after the first", but is implied to be about the subdivision of the small unit of time.
Part of the point is the "second" definition based on a subdivision is no longer the SI second. You are right about the elision, but in the context of the time most people are now eliding System International ("SI") from the beginning rather than "minute" at the end.
"Seconds were once derived by dividing astronomical events into smaller parts, with the International System of Units (SI) at one time defining the second as a fraction of the mean solar day and later relating it to the tropical year. This changed in 1967, when the second was redefined as the duration of 9,192,631,770 energy transitions of the cesium atom."
A while "prime minute" generally contains "61" "second minutes" it sometimes can contain 61 "second minutes"; so clearly the "second minute" is the true unit here :-)
While metric time has some appeal, I think it is not ambitious enough.
Base 10 is not good. 10 just does not have enough factors, so we are left to deal with complex fractions. Let's instead use base-12 numerals and keep time unchanged!
In base 12, 1 year is 10 months (or 265 days), 1 day is 20 hours, 1 hour is 50 minutes and a minute is 50 seconds.
Easy enough!
Edit: wait! I just realized this is still not ambitious enough!!!
What we need is to halve seconds in two. So, in base 12: 1 day = 10 hours, 1 hour = 100 minutes, 1 minute = 100 new seconds.
"Ten", "eleven" and "twelve" can stay the same. "Thirteen" to "nineteen" are more problematic since "*teen" refers to "ten". We would like something meaning twelve-one, twelve-two and so on. Then "two-twelve" intead of "twenty" ? Feels heavy. Maybe stick to "twenty" then ?
If you're going to coexist with decimal, you're best off starting over with a distinct set of words.
For instance, scales have 12 notes, with seven common words for the notes (do, re, me, fa, so, la, ti). Add five more similar but distinct phonemes (go, ki, za, we, je) to insert at the position of the half notes skipped, shift la to position 1 since l looks conveniently like 1, and add nul for zero: nul, la, go, ti, do, ki, re, za, mi, fa, we, so, je.
Add in some rules for forming larger numbers (laj, goj, tij..., soj, jel, jel-la, jel-go..., jes-so, la-gross, la-gross-la, ...) and begin learning your addition tables over again (la plus la is go; go plus go is do; do plus do is mi; mi plus mi is doj).
Personally I think decimal time fails to recognize the cyclical nature of days. Decimal time is like switching from 360 degrees to gradians. A far superior system would be to adopt radial time and have 2 pi hours in a day. We could extend this to the whole year and have 2 pi months in a year, finally divorcing the counting of rotations from the counting of revolutions.
> A standard hour is broken into 60 minutes. There are 2.4 standard hours in 1 metric hour.
That makes 1 metric hour equal 144 standard minutes. Since 1 metric hour is 100 metric minutes, that means 1 metric minute is 1.44 standard minutes. But the site says:
> A standard minute has 60 seconds. There are 2.4 standard minutes in 1 metric minute.
Even without doing any calculations those scale comparisons for the hour and minute can’t be the same as shown. Standard is going down by a factor of 60 and ‘metric’ by a factor of 100 so they can’t keep the same ratio.
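A quick sanity check of the factors involved (just the arithmetic spelled out):

```python
DAY = 86_400                      # standard seconds in a day

metric_hour   = DAY / 10          # 8640 s
metric_minute = DAY / 1_000       # 86.4 s
metric_second = DAY / 100_000     # 0.864 s

print(metric_hour / 3_600)        # 2.4  standard hours per metric hour
print(metric_minute / 60)         # 1.44 standard minutes per metric minute (not 2.4)
print(metric_second)              # 0.864 standard seconds per metric second
```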
Time is never going to be easy to decimalise, since days, months, and years are all natural periods and not so easy to change.
Maybe once we have left the Earth and spread out into the solar system and beyond, there will be no reason to keep Earth time, and we'll just use Unix time, and stick to seconds, kiloseconds, and megaseconds. (Hopefully in the next few gigaseconds.)
I was going to say: time is based on degrees of a circle, and 10 doesn’t work great for it. I’m not a fan of non-metric length measurement, but time seems to be a rather sensical usage of it.
The important aspect of the metric system is consistency, not the actual base. The base is 10 because it makes math easier in almost all modern languages
Time is weird because its units are so necessarily arbitrary. We don't control (yet at least) the relationship between the rotation and revolution times of the planet, and both of those values are so very much essential
It's also interesting that you almost never will need to convert between time units. In normal life you will maybe convert minutes to hours and days or months (which aren't even of uniform length) into weeks and years. But scientists or engineers will always work in "metric" units of seconds and astronomers / archaeologists will always work in "metric" units of years
Compare this to mass and length/volume units, where a normal person will frequently need to traverse multiple orders of magnitude even just to bake a cake (grams to kilograms and milliliters to liters) and will have frequent experiences involving much higher orders (meters to kilometers every time they are following directions on their phones, or tons if they are loading a truck or buying a car)
Nothing about time is based on degrees of a circle, other than one specific way to visualise it - which many people don't use anymore and consider antiquated :)
Sundial; our notion of time is inextricably linked to the observed 180 degree arc path of the sun. The modern notion of time descends from that, and fails to stand on its own, despite silly attempts to define SI units via ad hoc correspondences.
What I don't like about this is the redefinition of the second. The length of a second and its subdivisions is so fundamental to so much of everything in our society. Other physical quantities like speed or frequency or power all depend on it. You would have to redefine everything. Kilometers per "metric hour", "metric hertz", "metric watts-hours", you get the idea.
> There is no AM or PM with metric time.
It isn't there in standard time either. It seems to be another uniquely American thing. The only time I'm exposed to AM/PM is when reading an analog clock. Or when talking to an American.
Heck, even in US, there's "military time" that is 24-hour.
That to me is the key point: measures of time are so interwoven with everything else, this would be an absolute nightmare to enact without seven years of fallout and random "bugs"
This is exactly the decimal time system used in France during the brief period of the post-revolutionary established First Republic, 1794 to 1800.
Although the metric system originated at the same time, it's important to note that the French Republican decimal second isn't the same as the one used by international metric system (SI) that we ended up with. So calling this "metric time" is quite misleading.
The decimal second is shorter than the standard one because there are 100,000 decimal seconds in a day vs. 86,400 SI seconds in a day.
(If you're a rich antiques collector, a late 18th century French decimal clock might be a very interesting object. My understanding is that they are rare because their active use was so short and most clocks were repurposed to standard ones.)
> Working with base-10 numbers is so much easier than trying to think in base-60, base-12, and base-24.
In some ways, sure. If you're doing precise mathematical things. Otherwise, if you're doing simple mental math, base-12, -24, and -60 have some advantages.
60 can be evenly divided into 1/2 (halves) or 30 min, 1/3 (thirds) or 20 min, /4 or 15 min, /5 or 12 min, /6 or 10 min, and by 12, 15, 20, 30, and 60.
24 can be divided by 2, 3, 4, 6, 8, 12, and so on for 12. I always assumed this was the reason for the "imperial" unit measures and for time and length. Dividing things into thirds is a common use-case and it's nice to be able to do that evenly.
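Listing the divisors makes the point directly (a trivial sketch):

```python
# Divisors of the bases being compared: 10 splits evenly far fewer ways
# than 12, 24, or 60, which is the point about halves, thirds and quarters.
def divisors(n: int) -> list[int]:
    return [d for d in range(1, n + 1) if n % d == 0]

for n in (10, 12, 24, 60):
    print(n, divisors(n))
# 10 [1, 2, 5, 10]
# 12 [1, 2, 3, 4, 6, 12]
# 24 [1, 2, 3, 4, 6, 8, 12, 24]
# 60 [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]
```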
This was a fascinating read for a totally unexpected reason.
I’ve spent most of my life in countries where the metric system is used for distance, weight, temperature etc.
This year I’ve had to travel a lot to the US for work and found the constant mental conversions a PITA. I kept wondering why people keep holding out against such an obviously easier system.
Then I read in this article that 8 hours of sleep would be 3.33 metric hours, and how you'd go to sleep at 9:50 and wake up at 2:75, and I noticed in real time the absolute recoil I felt reading this. Maybe I'm getting older, but I completely get how familiarity with numbers being represented a certain way is hard to let go of.
8 hours comes from the worker's movement anyway. 8 hours work, 8 hours recreation, 8 hours rest. It's clearly only chosen to make a nice slogan. In reality 7-9 appears to be optimal. If you were using decimal time you'd just say 3-3.5 hours and be done with it. Convenient enough.
Yeah, too much of our society is based upon 24 hour time.
The transition is also unlikely to have many benefits. Unlike most of the other units of measure, the everyday conversions are fuzzy. The only exception I've seen is when payroll bean counters expect minute precision converted to decimal hours, which is a pain! Everything in science and engineering tends to be maintained in seconds, which is decimalized anyway so there is no benefit there.
I don't see "metric time" making any headway, particularly since something like universal time would be much more beneficial yet hasn't gained traction.
I feel like every developer/technically inclined person goes through a phase of trying to fix timekeeping. I know I did. I think it's a good exercise in recognizing that we (largely) make tools and standards that fit existing social constructs, not the other way around, and that there are some things that are just inherently messy.
base-10 time only makes sense because we use a base-10 number system. the Babylonians didn't use a base-10 number system so the current time system was more intuitive. however, the French already tried this a couple hundred years ago. it didn't stick around.
personally, I'd prefer we drop all base-10 conventions (units, numbers, etc) and switch to base-12 and base-60 for everything (dozenal or duodecimal)
I still remember reading about Swatch time years ago in some pc magazine. Back then it felt like a cool idea that surely will be adopted as standard unit of time for the Internet era, making communication across the globe much easier and sadly... reality was quite different.
I fondly remember Swatch Internet Time and .beats. Back then (around 1998), I used to have a watch that displayed Internet Time and I genuinely believed that this was going to the future way of how we keep track of time and synchronise with each other.
I went into a pretty deep dive into dozenal systems a few years ago. I really like the dozenal unit systems that people have come up with and even designed my own. I don't exactly remember what I had but I think I tried keeping all of the metric base units and then multiples of 12 from there.
Some of the dozenal measurement systems try to replace the second or meter, for example, which I don't think would be necessary. And some also try to redefine the clock, but honestly 2 sections of 10d hours is more than dozenal enough. And I like the blended base-60/base-12.
What's really interesting is trying to extend the calendar to base-12. Depending on whether you want to keep the 7-day week or switch to 6-day week or abolish the week altogether[1], you can come up with several different interesting concepts.
[1] Even though it had 7-day weeks, intercalary days outside of the week is what shot down the closest we've got to calendar reform since Gregory. https://en.wikipedia.org/wiki/World_Calendar
I love how this mentions "decimal is as easy as money, assuming US dollars" when the US is basically the only country still using non-decimal units for most things, and money was wildly non-decimal in other places (cough UK cough).
Speaking of different clocks. Here is the year on a clock face. 12pm is summer solstice, 12am is winter solstice. https://clock.mohiohio.com ( I'm in the southern hemisphere so it might be the wrong way around if you aren't. I need to fix that.)
This is cool. However, for me, it's the months that are the weirdest issue. Why do we all get paid the same each month, pay the same rent/mortgage but some months have 4 weekends, some have 5.
We should have 13 months, all 28 days long, exactly 4 weeks. Whenever there's a leap year we get an extra day for free.
> Why do we all get paid the same each month, pay the same rent/mortgage but some months have 4 weekends, some have 5.
Is paying monthly an American thing? In Australia most things seem to revolve around n weeks.
- I get paid on Thu every second week (2 weeks)
- My pre-paid phone auto-recharged is every 28 days (4 weeks).
- House rentals are listed as weekly rates, paid at whatever n weeks you negotiate.
- Mortgages are monthly by default, but most let you opt for n weeks, so most people pick the day after they are paid. Interest is still added monthly though.
The only things I can think of that I'm charged monthly for are online subscriptions, which I suspect got influenced by America.
> We should have 13 months, all 28 days long, exactly 4 weeks. Whenever there's a leap year we get an extra day for free.
Same-length months would be cool... but as above, months don't mean anything practical to me beyond having to remember which one we are in.
> Is paying monthly an American thing? In Australia most things seem to revolve around n weeks.
Like pretty much everything else, it depends. On the income side, being paid biweekly is the norm in my experience, but some jobs pay semimonthly (my current job does and I kind of hate it). Recurring bills tend to be monthly, but I've had some which were weekly or biweekly. So it all kind of varies.
Lol, this was EXACTLY my high school sr. year social studies project in 1980, with roughly the same effort (sadly, I didn't create a working clock). I won first place, with just this. The only difference is I named the metric units 'mints.'
Also, no mention of the need for only 10 big-ass time zones instead of 24 relatively narrow time zones. The continental US might have only 2 time zones instead of 4.
This is already the case, timezones have weird shapes and some of them are huge.
In Europe and China you have timezones spanning over 3 "sun" timezones. In practice that means the sun rising and setting at different times. People are used to having the sun set late or rise early depending on where they live.
There is some evidence our bodies already split the day into 16 parts: consider the 90-minute REM cycle. This suggests it would make more sense to use base 2. 4 bits of depth are required to specify the 90-minute "hour", and 16 bits of depth corresponds to a resolution of about 1.32 seconds. Following the obvious pattern, "minutes" would probably sit at 8 bits of depth, corresponding to 5.625 standard minutes, which is the most jarring difference. Of course you wouldn't have clean divisions into 3, 5, etc., but it would be simple to calculate, notate, and reason about.
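As a rough illustration of the bit-depth arithmetic above (my own sketch, not something proposed in the comment beyond the numbers it quotes):

    # Sketch: a base-2 "clock" that splits the day into 2**16 ticks.
    # The top 4 bits select one of 16 ninety-minute "hours"; the full 16 bits
    # give a resolution of 86400 / 2**16 ~= 1.32 seconds.
    SECONDS_PER_DAY = 24 * 60 * 60

    def binary_time(seconds_since_midnight):
        tick = int(seconds_since_midnight / SECONDS_PER_DAY * 2**16)  # 0..65535
        hour16 = tick >> 12          # top 4 bits: which 90-minute block we're in
        remainder = tick & 0x0FFF    # low 12 bits: position within that block
        return hour16, remainder

    print(binary_time(12 * 3600))     # noon -> (8, 0), halfway through the 16 blocks
    print(SECONDS_PER_DAY / 2**16)    # ~1.318 s per tick
    print(SECONDS_PER_DAY / 2**8)     # an 8-bit "minute" ~= 337.5 s = 5.625 standard minutes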
What I do hate is the need to make a day ten hours. I'd actually prefer we keep a day 24 hours, it makes the transition a lot easier, and all the other mental math still simplifies.
Immediately after, in the rationale about how much easier the math is: "There is no AM or PM with metric time. Just 10 hours in the day. To get a good night sleep (8 standard hours) you'd sleep for 3.33 metric hours."
Author has committed the classic blunder of ignoring the utility of a base that can be divided evenly into halves, thirds, quarters, sixths, and twelfths. Turns out the ancients knew something about head math because they had to.
Measuring time is simple. It is just a linearly increasing number, right?
No. We humans have additional demands.
1. We want a part of the number to be the same for the time of day.
2. We want a part of the number to be the same for the day of year.
This is fundamentally impossible, because:
1. The year does not have an integer number of days, so we use leap years.
2. The day is the duration of one complete rotation of the Earth, but that duration is slowly changing; this means the length of the day won't stay the same, so we use leap seconds.
3. Year and day lose their meaning if we leave the Earth.
4. The time of the day is different for different parts of the Earth (solved by timezones).
We should use an epoch-based time for points in time, i.e. just a number, and convert it to a date-time whenever necessary. The exact form of the date-time is not important, only that we already understand it.
Switching to decimal time does not help: it does not solve the fundamental problem, however we would need to relearn the time of day. It is similar to switching to Dvorak.
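A minimal sketch of that "store a plain number, convert only for display" idea, using Python's standard library (the timestamp and the UTC rendering are just illustrative choices):

    # Sketch: keep points in time as a single epoch number; arithmetic stays trivial,
    # and conversion to a calendar date happens only when a human needs to read it.
    from datetime import datetime, timezone

    event = 1_700_000_000                  # seconds since 1970-01-01 UTC, just a number
    one_day_later = event + 86_400         # adding a day is plain addition

    print(datetime.fromtimestamp(event, tz=timezone.utc))          # 2023-11-14 22:13:20+00:00
    print(datetime.fromtimestamp(one_day_later, tz=timezone.utc))  # 2023-11-15 22:13:20+00:00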
"Metric time" should mean the time it takes for light to go 10e8m, which is ~3 seconds. If that's 1 chrono, then 1 kilochrono is 55 minutes and there are around 25 kilochronos / day. 1 megachrono is 386 days. It would be a very good measurement unit for a solar system based space faring society.
Decimal time doesn’t seem as useful to me as localized time. With GPS coordinates, we don’t need a timezone system, everybody could have noon at solar noon. When setting up meetings, one only needs to include the longitude and everybody could easily join at the correct time. It would be cool if airplane rides showed a real local time while in flight. Might be a cool app.
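A minimal sketch of the longitude idea, assuming mean solar time and ignoring the equation of time (which shifts true solar noon by up to roughly a quarter of an hour over the year):

    # Sketch: local mean solar time from a UTC timestamp and a longitude.
    # Every 15 degrees of longitude east of Greenwich shifts solar time by +1 hour.
    from datetime import datetime, timedelta, timezone

    def mean_solar_time(utc_dt, longitude_deg):
        return utc_dt + timedelta(hours=longitude_deg / 15.0)

    noon_utc = datetime(2024, 6, 1, 12, 0, tzinfo=timezone.utc)
    print(mean_solar_time(noon_utc, 2.35))    # ~2.35 E (Paris): about 12:09 mean solar time
    print(mean_solar_time(noon_utc, -74.0))   # ~74 W (New York): about 07:04 mean solar time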
While we’re at it, the Gregorian calendar could be made celestial as well. Date could be based on the solstice/equinox and moon. Apparently this is called a lunisolar calendar and repeats on the Metonic cycle every 19 years. Though it’s off by a couple hours. That means if I recorded a message with lunisolar date and local time, a historian could probably infer the exact year it was recorded.
There's a parallel universe where the French chose to base their new system around twelve instead of ten. Divide the day into twelve twelfths and each twelfth into twelve parts again. (They'd have to coin prefixes for each.)
I once (around 1983) met someone who had an astrological (astronomical) watch that showed "time" depending on the position of the earth with respect to the zodiac. He believed that the periods when one sign would appear on the horizon (and another disappear) were periods in which you had better not start another activity. I am not sure if it was one made by Jaap Venker. Jaap Venker made 1500 of these watches, which now sell for hundreds of Euro/USD.
I think having a clock that depends on the position of the earth with respect to the center of our galaxy is an interesting idea.
On the likelihood something like this might ever be considered: don't hold your breath. Our continuing use of Babylonian time is probably the most pervasive and enduring network effect known to humanity.
To the author: the font is so thin that it exposes subpixel color artifacts on a 1080p 15" screen. I'd remove all the font-weight statements from CSS or set them to normal, except maybe for the one for H3.
Fun metric time fact is that France conceded to Greenwich becoming the international prime meridian on the condition that the conference on the meridian also expressed a hope for a decimalised system of time (and angles). https://en.wikipedia.org/wiki/International_Meridian_Confere...
In the end they didn't adopt the Greenwich meridian themselves for quite some time afterwards.
A 10-hour dial will certainly create problems for human systems, because time counting becomes relatively coarser that way. A 20-hour dial works better in practice, but can it be called metric time?
Thought of as a 10x10 clock (metric because of the 10-hour dial), it has 100 minutes every hour. That is sort of OK, but the number of hours on the dial drops from 12 to 10, which means counting 4 fewer hours every day, and that plus 100 minutes every hour makes a 10x10 clock a "difficulty" in total.
Does a 20-hour dial, on the other hand, solve this?
The problem is that base 10 is a terrible base for any unit that needs to be divided. It's an arbitrary base, chosen because of our fingers, with no other positives. No reasonable species would have picked base 10 when base 12 is so much better and so close. 10 can't be evenly divided into 3 or 4 parts. 60 minutes in 3 parts is 20 minutes; 100 minutes in 3 parts is 33.3333 minutes.
True. And having 7 days per week is majorly impractical, also for reasons of divisibility. With 7 being prime, it's impossible to do things every other day or every three days without being left with a remainder.
It's a shame the internet has pinned this on bodybuilders just because of the URL. TheJosh has a progress photo on his profile - he must be very deep into an offseason. bodybuilding.com is a general exercise forum and is infamous for the trolling on its misc board.
Just as bad as having 31 days a month, for randomly chosen months.
If any periods of time need to be fixed, it's surely weeks and months. How about 5-day weeks (3 working, 2 not - and yes, therefore 10-day "fort"nights) and scrap months altogether...
Wouldn't that make sense in a context where the useful number 6 counts the days reserved for the affairs of mankind, and the seventh is an extra day reserved for rest or worship? I.e. the background of our 7-day week.
I'm sure that's religious revisionism (just like the Christmas holiday). Like so many time-based things, the 7-day week comes from the cycles of the Moon, and goes back at least as far as the Babylonians.
Right, religious zealots have been ever revising the story of human history, which has been mostly secular peoples that practiced empiricism and appreciated astronomical phenomena as interesting but did not take them as divine.
12 hours / 9 = 80 mins. 12 mins / 9 = 80 secs. That's the point. And yes, Base 12 is superior to Base10 for this specific reason. But only for this use case.
I made something similar a while back where you can set your own start and end time for the "day": https://alexsaveau.dev/10hrday
The point was to be able to divide the day into nicely sized bites for getting stuff done. I chose 10 minutes in an hour for that reason: you can get a small task done in one "minute."
I think it was sometime between 10 and 20 years ago that I was curious to see what a clock showing fractional time in years would look like: https://dmitri.shuralyov.com/projects/shuryear-clock/. I found that looking at it occasionally reminds me how even a year's worth of time can pass fairly quickly.
The first thing that came to mind seeing this title was the time component in the metric tensor [1], which left me wondering why it was on the HN front page :)
I'm really fascinated by such ideas. However, it's just academic: time is so baked into physical devices (watches, clocks, ...) that this will most probably never happen.
I'd rather start by redoing the calendar. It doesn't need to be metric, but the different lengths of the months (and the naming: "Octo-" means 8, yet October is the 10th month) could be reworked.
Yes. Decimal time almost happened in France during the revolution, but converting all the clocks was impossible, so people kept their standard clocks, and with them the old habit...
I'd pay good money for an analog wall clock that has 10 hour days and 100 minute hours! AFAICT, this product does not exist. Closest thing I could find was this blog [1]
I just realized that the standard watch face feels like it follows the sun: rising on the left and setting on the right. But since the sun rises in the east, it should be horizontally reversed. Or you should always look to the south / towards the sun. Or does it only feel right because of my Western habit of writing from left to right?
A 24 hour clock with midnight at the bottom tracks the motion of the sun if you are in the Northern Hemisphere looking South. If you are in the Southern Hemisphere looking North you need a clock that goes anti-clockwise.
Well, the nice thing with decimal time is that it also gives you a better indication of what portion of the day is over: at 6 o'clock you know 60% of the day is gone. With 2pm you can't make that calculation as directly. Not that it's important, but it's one of the nice benefits.
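That fraction-of-the-day reading is essentially the whole conversion; a small sketch, assuming the 10-hour / 100-minute / 100-second split described in the article:

    # Sketch: convert a standard time of day into decimal (10/100/100) time.
    def to_decimal_time(hours, minutes=0, seconds=0):
        fraction = (hours * 3600 + minutes * 60 + seconds) / 86_400  # portion of the day gone
        total = int(round(fraction * 100_000))    # 10 * 100 * 100 decimal seconds per day
        dh, rest = divmod(total, 10_000)
        dm, ds = divmod(rest, 100)
        return dh, dm, ds

    print(to_decimal_time(14, 24))   # 2:24 pm -> (6, 0, 0): decimal 6 o'clock, 60% of the day gone
    print(to_decimal_time(6, 0))     # 6:00 am -> (2, 50, 0): a quarter of the day gone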
But how would daylight savings work? Or perhaps more seriously, time zones...
Anyway given the U.S. is still hanging on to measurements like Fahrenheit and inches, we're not going to see more sensible ways to measure and indicate time until most of humanity is living on a different planet.
The major flaw of metric is that half of 10 is 5 and half of 5 is 2 and 1/2 . A good measuring system should not yield fractions when dividing by 2 until you reach 1. I will fully support the octal system. Long live Octal. Off with the pinky fingers.
Considering it would take 3-4 generations to adopt a new time standard, I'd opt for some kind of "universal time" instead - not tied to one random planet (Earth) rotation speed. Think stardates, but real :)
There is really nothing scientific about time as we mortals use it.
As soon as you bind the definition of time to the rotation of the earth, all bets are off about being "scientific".
A day has x hours.
An hour has y minutes.
A minute has z seconds.
Therefore, you have now defined a second as 1 / (x times y times z) of a day.
What happens when the earth speeds up by a fraction of a second?
Does the definition of a second change?
We can't define time by saying a day has x hours, an hour has y minutes, and a minute has z seconds.
If we define time outside of the rotation of the earth, metric is meaningless. We should be able to adjust. Let's say earth sped up. We should be able to say a day has more seconds now.
My guess is you'd define a second as the following:
A more precise definition of a second is the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom.
Ok, that's good. In practice, though, we aren't very scientific; what will matter is whether a minute has 60 seconds or 61 seconds or 100 seconds. I'm guessing we'll have bigger problems if the Earth's rotation sped up or slowed down considerably anyway.
Lots of people are pointing out inconveniences with this decimal time system which are the same inconveniences I find with other metric systems. You can’t divide by 3, 4, 6, or 8 easily in this case.
I'm confused by this page, the standard clock is in my timezone (CEST, UTC+2). Is the metric clock also set to this standard, i.e does 00:00:00 in that clock occur at 22:00:00 UTC?
Nooo, we should not lose the one place where duodecimal actually has a footing, even jokingly! We need to replace more things, including our counting, with duodecimal.
I can't believe no one has pointed out the one huge fatal flaw in metric time yet.
The resting heart rate of a healthy adult is approximately 60 BPM. So 1 second is 1 heart beat, which makes it extraordinarily intuitive since it's literally pulsing through your body.
Making a metric second take longer by 40% is the opposite of what you want since average resting heartbeat of the population is higher than 60 BPM.
PS: Anyone downvoting is someone who feels offended having significantly higher BPM due to lifestyle. For normal people, the intuitive measure of a second is very useful.
I always had the feeling that I'd love decimal time, but seeing it on a clock, I absolutely love it. Especially that it is called metric time, since it is not just decimal: it is the natural time system for our metric system, which we don't use because of inertia, but that can and should change.
Honestly, the mental math around metric time demonstrated in this post makes me think we should go the opposite way and make everything else base 12 instead of making time base 10. Being divisible by 2, 3, 4 and 6 makes a lot of mental math easier.
I like imagining what life would look like if you changed what is essentially a convention. Set the week to 6 weekdays and a 3-day weekend, have 6-hour workdays, or 80% company work / 20% service to your community. I'm very curious about the societal change a universal basic income could bring.
With metric hours, I'd hope 1-classical-hour meetings wouldn't become 1-metric-hour meetings because that's what we did in the past ;)