Due to my background as both an American and a (chemical) engineer, I am fluent in metric, American, and the domain-specific unit systems used in chemical engineering that are neither American nor metric. I can switch between these as necessary quite fluidly; it is an elementary skill for a chemical engineer.
I would argue that being limited to a single unit system is like being limited to a single language. In principle, no one needs more than one but in practice the differences in expressiveness for different purposes are interesting and useful. These differences in expressiveness are connected to the continued existence of different systems.
America is big enough and is sufficiently independent of trade in its economy (most of its stupendous production is also internally consumed) that the cost of not transitioning to metric is marginal. I understand why metric is a good system but simultaneously understand why the benefit of metric is dubious for the average American. Remember, American units are defined in terms of metric units; it is a preference, an American can precisely convert to metric at any time if they deem it useful. But they don't because it serves little purpose.
Basically, like their language, Americans occupy a big enough economic sphere that they get to define their standards. An enormous number of global standards are American in origin as it is. I don't sweat the lack of metric even though I use it routinely. Once you become familiar with enough unit and arithmetic systems, you quickly learn that they all suck in some context.
It is pointless to turn these things into religions.
>I would argue that being limited to a single unit system is like being limited to a single language. In principle, no one needs more than one but in practice the differences in expressiveness for different purposes are interesting and useful.
After considerable reflection, I have come to the view that in fact the differences are not interesting and not useful. What is useful, though, is consistency and ease of application (which the metric system has) and familiarity (which just depends on the circumstances).
I work in the oil and gas industry, and I had to learn to think in terms of bizarre units like stcf/bbl, or pounds per gallon. In the end, the only driver was that once you digest enough information, you can compare things on a like-for-like basis... if you know the horsepower ratings of 10 car models, it is more immediately useful to be given the horsepower of the car you're considering buying rather than its power output in kW.
But in the end I see no downsides, only upsides, if the entire industry switched to the metric system.
stcf/bbl (standard cubic feet per barrel), by the way, is a customary measure of the amount of gas dissolved in the oil. After 25 years in the industry, I know for example that 1000 stcf/bbl represents a gassy oil that will take some effort to stabilise, and 100 stcf/bbl is a relatively dead crude... so information about new crudes is useful to me in this form. But what a weird unit it is! The information could be much better provided as a dimensionless ratio, e.g. a gas-to-oil ratio of 200 for gassy oil would be intuitive and better suited to use in further calculations, and once you've seen enough data expressed this way, the familiarity issue is taken care of.
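As a minimal sketch of the suggested dimensionless form (assuming the intended metric ratio is standard m³ of gas per m³ of oil; the conversion factors are the standard scf and barrel definitions, and the sample GORs are just the figures quoted in this comment):

```python
# Sketch: converting the customary GOR unit (stcf/bbl) to a dimensionless
# ratio (standard m3 of gas per m3 of oil). Sample values are the figures
# quoted above, not real field data.

SCF_TO_M3 = 0.0283168   # 1 standard cubic foot in cubic metres
BBL_TO_M3 = 0.1589873   # 1 oil barrel in cubic metres

def gor_scf_bbl_to_dimensionless(gor_scf_bbl: float) -> float:
    """Convert gas-to-oil ratio from stcf/bbl to Sm3 of gas per m3 of oil."""
    return gor_scf_bbl * SCF_TO_M3 / BBL_TO_M3

print(round(gor_scf_bbl_to_dimensionless(1000), 1))  # gassy oil: 178.1
print(round(gor_scf_bbl_to_dimensionless(100), 1))   # dead crude: 17.8
```

So the "gassy oil around 200" intuition carries over almost unchanged once the ratio is made dimensionless.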
I invite you to provide counterexamples, where the customary unit has an inherent advantage (familiarity does not count). I can't think of any.
Cooking units are much better suited for the task than metric units.
Typographic units are non-metric and metric units are inconvenient in typography, because millimeters are too big for character sizes and too small for page units and centimeters are both too big and too small at the same time.
French (?) shoe sizes are numbered 38, 39, 40, 41, etc., and the difference between adjacent sizes is 2/3 cm, because that is a practical increment for mass-producing different shoe sizes. It's not metric, obviously.
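That system (the Paris point) is simple enough to sketch; this assumes the rule stated above, that size N corresponds to a last of N × 2/3 cm (real sizing charts add an allowance over bare foot length):

```python
# Sketch of the Paris-point system described above: one size step is 2/3 cm
# of last length, i.e. size N corresponds to a last of N * 2/3 cm.

def last_length_cm(size: float) -> float:
    return size * 2.0 / 3.0

for size in (38, 39, 40, 41):
    print(size, round(last_length_cm(size), 2))  # 38 -> 25.33 ... 41 -> 27.33
```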
Parsecs are not metric. Heck, light years are not metric; and look at how rich the unit is, it tells you a lot about the distance. Try to express the distance to Alpha Centauri in metric units and comprehend it.
> Cooking units are much better suited for the task than metric units.
No they are not. Measuring everything on a scale to the gram yields much better and more consistent results than wondering whether your dish will fail because some ingredient was more tightly packed, or whether your spoons are the same spoons someone else had.
And while a person could argue that ounce vs. gram is moot, volume-based measures are a total nightmare for dry ingredients.
I agree that weighing is the key to stability for professional cooks and bakers, but whether to weigh to the gram depends on what you're weighing, how much of it, and for what purpose (and yes, the gram itself as a unit is irrelevant). This is engineering tolerance and it's the same in every field; even a fine metalworker can be imprecise provided he stays within the defined range.
Yet for part-time cooks, as most of us are, weighing is impractical: we don't have scales, and even if we do, the amounts are usually too small. And stability is not necessarily the goal :) For home cooks the usual task is to scale a recipe up or down, and with cooking units the math is much simpler. Another thing about cooking units is that the number of different units helps to convey that engineering tolerance I've mentioned; the smaller the unit, the tighter the tolerance. When everything is expressed in one unit, you need to specify the tolerance explicitly.
This is a very American view. When I grew up in Germany EVERYONE had a scale for cooking because every recipe gives all larger quantities in grams. I think the only exception is the equivalent of "a pinch". When I moved to the US I didn't have a scale and most recipes I cooked were American, so I ended up using cups etc. I got totally used to it after a while and everything was fine.

However, I love cooking and wanted to get more into modernist cuisine. That pretty much requires a scale. Having used a scale for cooking again, I cannot overstate how incredibly convenient it is. It saves a lot of time and cleaning of measuring devices. You just put your container (pot, bowl, or whatever you will use later for processing) onto the scale; zero the scale; and start pouring in the first ingredient till you get to the right amount. Then you just zero the scale again and do the next ingredient. No need to rinse measuring cups in between. The metric system also has the added benefit that for water-based liquids 1 liter translates to 1 kg, which means you again just pour the liquid into your bowl on the scale.

The scale has pretty much only upsides. The downside that most people in the US don't have a scale is only a downside because culturally US Americans don't use scales for cooking. On the flip side, you could say that measuring cups are inconvenient because Europeans don't have them.
I love to cook myself and I agree that using scales yields a more stable result (provided the ingredients are stable). But to be simpler than volume-based units, the scales have to be pretty sophisticated, like modern electronic scales. Would the process be simpler if the scales could not zero out on a weight and you had to weigh the ingredients separately or do the math mentally? Or if they were mechanical and thus harder to read? Or if they were balance scales with pans and separate iron masses? :) Well, this is too much, perhaps, but I remember using such scales to mix solutions for the photographic process. But not for cooking; that would be completely impractical.
Of course, nowadays the scales are very smart and a pleasure to use. But here's the thing: with smart equipment we don't need to bother about being metric. It's not a problem for smart scales to display the weight in any unit imaginable. I think there must be culinary apps that convert between weight- and volume-based units to suit everyone's tastes, and they're either free or cost less than $5. The proponents of the metric system claim it's simple. Maybe, but it's the simplicity of a typewriter compared with a modern typesetting program. Why would anyone with a smartphone care about this kind of simplicity?
For example, there's the ISO 216 standard for paper sizes: A0, A1, A2, etc. The sizes form an interesting progression: each size is exactly 1/2 of the larger size. But how did they select the first size in the row, the A0? It's pretty interesting: the A0 size is exactly 1 square meter. I bet the designers of the standard thought it would be a feature, because users (e.g. printers) would be able to use this fact to simplify their calculations; e.g. you need to print 1000 A4 sheets, you know A4 is 1/16 of A0 and the paper is 100 g/sq.m., and you go from there. But I don't believe anyone does this kind of math nowadays; everyone has computers, and print jobs have too many variations in paper sizes and densities for this "simple" rule to be of practical use.
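The construction described here can be sketched from its two constraints alone (1 m² area for A0, and a √2:1 aspect ratio so that halving the long side preserves the shape):

```python
# Sketch of the A-series construction: A0 has an area of 1 m^2 and a
# sqrt(2):1 aspect ratio, so its exact dimensions are 2^(-1/4) m by
# 2^(1/4) m, and halving the long side preserves the shape.

a0_w = 2 ** -0.25 * 1000   # ~840.9 mm
a0_h = 2 ** 0.25 * 1000    # ~1189.2 mm

w, h = round(a0_w), round(a0_h)   # 841 x 1189 mm
for n in range(5):
    print(f"A{n}: {w} x {h} mm")
    w, h = h // 2, w   # halve the long side, rounding down as ISO 216 does

# The weight arithmetic from the comment: A4 is 1/16 of A0, so a
# 100 g/m^2 A4 sheet weighs 100 / 16 = 6.25 g.
```

This prints the familiar 210 x 297 mm for A4, recovering the published sizes from nothing but the two design constraints.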
"Measuring everything on the scale to a gram yields much better and consistent results than wondering if your dish will fail..."
It really depends on the recipe. Some cookbooks, for example, recommend that measurements be as accurate as possible for baking. However, I doubt the majority of the world's population uses scales to measure ingredients when they cook. In fact, great cuisines from around the world have developed perfectly well without the use of scales for measuring ingredients. Can you honestly say 250g of chopped carrots is preferable to two medium-sized carrots, chopped?
> Parsecs are not metric. Heck, light years are not metric; and look at how rich the unit is, it tells you a lot about the distance. Try to express the distance to Alpha Centauri in metric units and comprehend it.
The distance to Alpha Centauri is so mind-bogglingly huge that the human mind can not comprehend it. This has to do with the scale of things, and has absolutely nothing to do with the unit.
I can define the distance to Alpha Centauri to be 1 Alce. 1 is a really nice number that you can grasp, but it still helps you no further in understanding that distance.
Can you actually name this number without looking in a dictionary? I cannot. Nine and a half zillions, I guess. So Alpha Centauri is like 44 zillion meters away; very enlightening.
Sure: nine quadrillion, four hundred and sixty trillion, five hundred and twenty-eight billion, four hundred million.
Next would be quintillion, sextillion, septillion, octillion, nonillion, decillion, undecillion, duodecillion, tredecillion...um...quattuordecillion(sp?)
Granted, I had to write it out to convert (or I could have sat and derived the rule--divide by three and subtract one to get the prefix). And I may be an exception, having spent a brief bit of my childhood interested in names for very large numbers. :-)
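The rule mechanizes easily; this is a sketch of the short-scale convention (10^(3k+3) is the k-th "-illion", so for 10^n the prefix index is n/3 - 1):

```python
# Sketch of the short-scale naming rule: 10^(3k+3) is the k-th "-illion"
# (million k=1, billion k=2, ...), so for 10^n the prefix index is n/3 - 1.

ILLIONS = ["thousand", "million", "billion", "trillion", "quadrillion",
           "quintillion", "sextillion", "septillion", "octillion", "nonillion"]

def illion_name(power_of_ten: int) -> str:
    """Short-scale name of 10**power_of_ten (power must be a multiple of 3)."""
    return ILLIONS[power_of_ten // 3 - 1]

print(illion_name(15))  # quadrillion (one light year is ~9.46 * 10^15 m)
print(illion_name(12))  # trillion
```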
But it's pretty easy to know that 10^15 is bigger than 10^10. 10^5 times bigger. Ideally, we'd get sufficiently familiar with scientific notation to immediately understand that the difference between 9 lightyears and 9 000 lightyears is the same as between 10^12 meters and 10^15 meters.
I don't hold any illusions that people do have that fully internalized, of course. I don't.
You probably know Archimedes names for big numbers too :)
What I'm trying to say is that every sufficiently distinct field of human interest has some kind of units natural to that field. The metric system claims to be universal, but it's like one-size-fits-all clothing: nobody looks their best in it. I can measure a backyard in millimeters, but why would I? I don't need that kind of precision there. A yard is a more natural scale because it's about the size of a step, and that's about the right unit for what I do in the backyard. And inside the house I usually need finer resolution, so I use feet instead of yards. If I build something on a smaller scale, I switch to inches; and so on. Every field has its best unit, and the relationship between them is organic and not always the fixed 1:10 as with metric.
You can always invent a name (and "light year" is nothing else, just not based on a "round" number of meters).
For example, since failure rates in safety engineering are usually pretty small and nobody wants to pronounce "something times ten to the minus eight" or similar, the term "fit" was invented to stand for 10^-9.
(Although I admit that "failure in time" is a stupid name for a dimensionless constant)
Similarly you could invent a "galactic length" or however you'd like to call it.
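For concreteness, here is the standard reliability-engineering definition (an addition here, not stated in the comment above): 1 FIT = 1 failure per 10^9 device-hours, which is exactly why the unit absorbs the awkward "times ten to the minus nine":

```python
# Under the standard definition, 1 FIT = 1 failure per 10^9 device-hours.
# The example rate below is made up for illustration.

def fit_to_mtbf_hours(fit: float) -> float:
    """Mean time between failures, in hours, for a failure rate given in FIT."""
    return 1e9 / fit

print(fit_to_mtbf_hours(100))  # 100 FIT -> 1e7 hours MTBF (~1140 years)
```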
Light year is not just a name; a year is a natural length of time and light speed is a natural constant. The whole unit is natural and is based on things that make sense at such distances. It is the same for all natural units; they grew up from usage. (This resulted in things that appear illogical, like having different units with similar names for different applications, but this is only appearance. We measure water differently from oil or precious drugs, and differently when we cook with water than when we build a dam, so it makes perfect sense to have different units for different purposes.)
Meters, however, have a different story; the meter was invented by some French scientist about the time they were changing everything after the revolution, including month names; somehow month names reverted back to normal, but meters stayed. The meter was initially defined as 1/40,000,000 of the length of a meridian. Here only the length of the meridian is natural to some extent, although I fail to see how it is relevant to what is normally measured with meters, and the constant is completely artificial.
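A quick sanity check of that 1/40,000,000 definition against the modern measured value (the figure for the polar circumference is my addition, not from the comment):

```python
# The metre was chosen so the full meridian circle is 40,000,000 m; the
# modern measured polar circumference is about 40,007,863 m.

MERIDIAN_CIRCLE_M = 40_007_863
metre_from_meridian = MERIDIAN_CIRCLE_M / 40_000_000
print(round(metre_from_meridian, 4))  # ~1.0002: the survey was off by ~0.02%
```

So the original survey came remarkably close, but the tie to the meridian is indeed only approximate today.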
All unit systems are functionally equivalent, but each generally has its own advantages. Chemical engineering 101 was learning to be fluidly unit-agnostic. It is such a trivial skill that I do not see why it matters beyond training people to do it.
The nominal value of metric is that it is easy for back-of-the-envelope arithmetic. That is only true for abstract math because humans learn base-10 counting. There are many kinds of engineering systems, like chemical reaction kinetics or computer science, where base-10 counting systems have minimal relevance to back-of-the-envelope computation.
I actually have designed a lot of physics-based representation software. It does not operate on metric models of reality even though it is translatable to metric units and presented that way to the end user for both input and output. In fact, a lot of sophisticated implementations represent all values as infinite intervals over the domain of integers with a strong preference for binary treatment because it is efficient. The presentation unit is almost irrelevant. Same with chemistry, which has its own archaic unit systems.
Metric has a lot of practical advantages and disadvantages, depending on the use case. They are all trivial in real engineering scenarios.
Americans occupy a big enough economic sphere that they get to define their standards
[Disclosure: US citizen here] While this is true, I don't think it's helpful to attach American economic/cultural dominance to the metric debate (at least wrt advocacy), because US and Imperial measurements and standards are almost identical. But that "almost" is an important qualifier. The root system they both share derives from the Middle Ages. Here's a helpful explanation of the differences:
I would argue it is extremely simplistic to compare languages to unit systems side by side.
A language is both a means of communication and a way of thinking, to say the least. Mandarin Chinese is my native language. My English is just OK, but it already gives me a new way of thinking, much more than a different perspective.
Also, there are many words and phrases in one language for which you can't find a matching translation in another, even between languages that are closely related, like English and German.
Different unit systems, on the other hand, can be converted to and from each other without losing anything. There could be some affinity attached to a system one grew up with. It is incomparable to languages, though.