"ISO obstructs adoption of standards by paywalling them" (twitter.com/timsweeneyepic)
1010 points by linksbro on March 8, 2021 | 312 comments



Finally this is getting some attention. Somewhat related: healthcare startups are struggling with this because the standards they have to comply with (for developing medical software) cost up to 280 EUR (for a PDF!). [1]

One common workaround is to go to "the Estonian site", which offers the same English versions of standards for a much lower price [2]. Being a bit cynical, I would say that Estonia prioritises open information much more highly than... other developed countries. I created a price comparison on my website [3].

But the core problem, that standards are not openly available, is still not solved. Why is this not possible? For me, standards are very comparable to the law: a large number of people are supposed to comply with them, and for that, they must be openly available to everyone. Everything else doesn't make sense. Is that unreasonable?

[1] https://www.iso.org/standard/38421.html [2] https://www.evs.ee/en/ [3] https://openregulatory.com/accessing-standards/


> Healthcare startups are struggling with this

Healthcare startups are one place that shouldn't really have a practical problem with this - the cost of standards is a drop in the bucket compared to your overall cost of regulatory/quality implementation, filing, etc. Sure, it stings a bit to hand out a few thousand for PDFs, but it isn't going to affect your business plan. If you don't have a plan to pay for all of this, you are already dead in the water.

I'm all for streamlining the implementation, but a lot of it is just about implementing solid engineering practices. Note: I'm absolutely not saying you should insist on bloated, slow processes; that definitely isn't required.

I also think it would be great if we could fund the ISO a different way and make all the standards freely available. It's just not going to make any real difference to your ability to execute as a healthcare startup.


> Sure, it stings a bit to hand out a few thousand for pdfs but it isn't going to affect your business plan.

How will they even know what their business plan is without knowing what standard they will be providing?

Last time I checked in Australia, compliance with pay-for-access standards was required in many laws. It is outrageous, because it means you don't even know what you are legally required to do without paying some corporation money.

To add humour to the situation, I think that corporation got bought out by the Chinese at some point. Can't swear to that; I forget what is handled by SAI Global vs. Standards Australia.

Anyhow the whole situation is a black stain on the idea of equal access to the law.


> How will they even know what their business plan is

If you head into something like healthcare without any idea of what you need to do, you are begging for trouble. By the time you are buying any standards, you know which ones you need and how they fit in with everything else. Any real business plan will have researched this enough to get a ballpark. This is table stakes stuff.

You can research it all yourself in a few days and get a reasonable idea. If your team is all inexperienced you should probably spend a few consulting hours on guidance.

If you have anything like a real plan for a company, this is peanuts. If you don't, you do not need any of this stuff yet.


There’s a difference between how things are and how things should be. The former doesn’t excuse away the latter.


I think that everything a law sets forth should be publicly accessible. No law or reference in law should be paywalled. There might be a small fee involved, but it should only exist to cover administrative costs of distributing the information.


> but it should only exist to cover administrative costs of distributing the information

and that is what created the current situation.


> There might be a small fee involved, but it should only exist to cover administrative costs of distributing the information.

Laws should be freely (as in free beer) available. Laws should not be tucked away behind a paywall or behind "case law" or behind "international standards" that the layperson can't access without a fee. No fees. That's what taxes are for.


> How will they even know what their business plan is without knowing what standard they will be providing?

Are there no drafts available for most standards? For C++ the draft is often as good as the standard, and anybody can access the drafts. Is there nothing similar in other areas?


They exist and it is possible to get them, but never in a way I could find through a search engine.


Well, if you're a single guy WFH with a cool idea for some medical tech and you want to build a prototype to see if it's worth starting a company, then 200€+ per PDF (per person?) adds up and becomes cost-prohibitive.


You don't need any of this to build a prototype. You need it to productise your cool idea, and then you aren't a single guy WFH.


How can you build a prototype without the information in the standards that tells you the limits to how you design your prototype?


You are going to have many iterations after that first prototype. You're looking for proof of concept here and some early user testing; it's a long way to product yet. Basically you need enough detail to know the right questions to ask.


Spend GP's €200 on some time with a consultant who has already read them.


Just do it all again when you get customers, which you can only get after you’re actually approved for business, for which you need the standards.

I guess maybe you can just ‘hope’ you comply and let the inspection figure it out if they ever show up.


I agree with you, partially.

Yes, the short-term financial implications (cost of standards) are minor compared to cost of employees and consultants (source: I am a consultant). However, non-free standards have a gazillion second-order effects which tend to get overlooked:

- Potential founders (think YC) can't "just quickly browse a standard" to see whether they'd want to build a startup in that area.

- Individuals not affiliated to a company won't purchase them for themselves (too expensive) and therefore are excluded from contributing.

- People can't write blog posts which freely cite passages from a standard --> Less public information on how to actually implement a standard in a company.

- People can't make presentations (slides) with passages from a standard --> A lot of paraphrasing and beating around the bush, making presentations more useless.

Imagine, for a second, the following thought experiment: open source software on GitHub is behind a similar paywall, say, 50 EUR per software package. You could make your argument again: compared to engineering salaries, that's a minor amount. Sure. But the second-order effects are gigantic, essentially killing the open source ecosystem, because individuals (like a random person from a third-world country) are entirely cut off and barred from contributing.

Or, like: would DHH have written Rails if access to Ruby had cost 50 EUR?


I'm just finding the healthcare example among the weakest.

To be a bit US specific (this varies worldwide); the FDA publishes a ton of information on what their expectations are. You absolutely should be reading their guidance documents, filing process, etc. That gets you past the "is this something we should do" phase.

You'll likely get to the point you have a conformance plan for 13485 or similar, at which point you have a real project committed so the small fraction cost thing applies.

The most impactful healthcare standards aren't about "here's how you build a device", they are about "here's how you build an organization capable of building, shipping, and supporting a device".

Once you've got your head around that, you're off and running. On the R&D side of things, there is actually little bureaucracy for its own sake; mostly it's about implementing good engineering practices in a traceable way.

I guess my response to your "healthcare startups are struggling with this" is that I know a lot of healthcare startups, and they struggle with a lot of things, but this typically isn't one of them.

To me, ISO 8601 not being freely available is obviously counterproductive. ISO 13485 or 60601 not being freely available is more of a "well, not optimal, but it doesn't make much difference".


I'm in biotech and it's a very similar situation here. We're a small company, and dropping a few grand on regulatory documents would be a non-issue, a negligible part of our operating budget. Having said that, we would rather just pay for consulting/expertise from someone who has experience with the regs rather than interpret them ourselves.

I remember buying my first ISO document (INCITS/ISO/IEC 14882-2003) for $30 in like 2005 or something, back when I was still writing code. I thought I would become a better C++ programmer, armed with the standard, but turns out it didn't help me at all (reading books/writing code helped more). I did become a better standards-lawyer though! :D


Very much in favour of open standards on principle, but is a few hundred EUR really an obstacle for any kind of serious medical startup?


It is:

- You need to purchase multiple standards (at least 4).

- In theory, you need to purchase a multi-user license if more than one person should be allowed to read the pdf in your company (hint: nobody purchases the multi-user license).

- Every few years, new versions of the standards are released which you have to purchase.

- Sometimes, you purchase standards only to realise that they're not applicable to your company.

- The industry is riddled with shadiness: A German standards web shop offers a "standards flatrate" for a "great price" of e.g. 750 EUR for 10 standards. [1]

- Getting off-topic, but more related shadiness: Your purchased PDFs are watermarked with your company name and full name of purchaser (!) in the footer of each page to prevent sharing.

[1] https://www.beuth.de/de/regelwerke/normen-flatrates-im-ueber...


Even with all of these, the cost of implementing the standard (time for engineering, design, etc.) will likely very quickly dwarf what you are paying for the standard itself. I agree that open would be better, but the fees themselves really do feel nominal compared to overall implementation cost.


The point is that it kills everything that isn't an engineering project with some funding secured. It kills exploratory work by individuals or small companies. It kills education. It kills popularization of standards.

And in case of standards in computing, like ISO 8601 - a lot of them are of interest to open source developers. If they could access them for free, they could make their code compliant. Software companies use a lot of open source, and often whether or not a product follows some standard somewhere is entirely dependent on whether the OSS component it uses follows the standard.
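To make "compliant" concrete: Python's standard library, for example, implements a commonly used subset of ISO 8601 rather than the full standard. A small sketch (which subset is accepted varies by Python version; week dates and ordinal dates only parse on newer versions):

    # ISO 8601 round-trip using Python's stdlib subset of the standard.
    from datetime import datetime, timezone

    d = datetime.fromisoformat("2021-03-08T14:30:00+00:00")
    print(d.isoformat())  # 2021-03-08T14:30:00+00:00

    # Serialize "now" with an explicit UTC offset, truncated to seconds:
    print(datetime.now(timezone.utc).isoformat(timespec="seconds"))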


> - In theory, you need to purchase a multi-user license if more than one person should be allowed to read the pdf in your company (hint: nobody purchases the multi-user license).

How does the law treat this differently from a book? You can buy a book and then give or sell it to whomever you wish without any restrictions... how is this different?


When you try to compare digital downloads to physical media that isn't easy to copy, like books, most of the logic falls apart just like that.

Then there's also another side to digital distribution, especially in entertainment — if you bought a (heavily DRM'ed) video game, or a movie, or a book, or something else, online, it's tied to your account. You can't lend whatever it is you bought to a friend like you absolutely could the same exact thing on a physical medium. You can't resell it either. You also rely on the mercy of the seller to not pull your access to the thing. Yet, even though it lacks this basic trait of a physical medium, sellers treat digital and physical as mostly the same thing.


To clarify, I was comparing physical media against physical media. The standards are sold as physical copies. e.g., https://www.iso.org/standard/65464.html


Oh shoot, sorry, I just realized I got confused. I misinterpreted the original comment (about the PDF) to mean that you're supposed to buy multiple physical copies too, and that nobody does that either. If that's not the case then that would explain it. Though then the question would be—is it not worth just buying the physical copy and passing it around?


Buy a physical copy, scan it, and give the resulting PDF to everyone who needs it, lol


A few thousand euros definitely inhibits an amateur from becoming an expert, which is a terrible thing that greatly reduces the labor pool and possibly kills some startup ideas, but I'm not sure it would stop many startups.


And if it did those startups never stood a chance in the first place


Not every startup has big investors from day one


Of course this is adjacent to the point of the entire thread, but how many medical companies/ideas have emerged from the mythical garage?


<Completely ignores linux> Sometimes people just want to make the world better, why stand in their way?


It's not just one document. And you don't know what will be relevant before you buy them, because you won't know the contents!


ICD code licensing is highway robbery. I would guess most health tech startups also need those.

And those prices look reasonable compared to drug database license costs.

And the real killer is meaningful use certification.

The whole field seems engineered to prevent competition.

So while ISO costs are unjustifiable, they’re a pittance compared to other compliance costs that most in the field will encounter.


ICD-10-CM that you need in the US can be obtained from CMS free of charge. CPT is obnoxious, but you can also generally substitute HCPCS from CMS as well at no cost.

Source: work for a medical billing company, have had to deal with this garbage before.


Not sure about licensing, but one important difference with ICD codes is that you can access them for free:

https://icd.who.int/en


> The whole field seems engineered to prevent competition.

It is. After all it is not in the interests of anyone who is established in the field to make competition easy.


The problem is not just a few hundred EUR for the standard at your core business. It's the tens to hundreds of roughly adjacent standards that, if they were free, you would just have on hand and use/adopt where it makes sense; a price gatekeeper means you have to think about every single standard that might make sense to follow.


The problem is that all these standards apply to large and small devices, and to software as well.

I was once tangentially involved with an app that was basically a weight tracker. It was supposed to be a simple, reliable solution for doctors to prescribe their patients, one that allowed doctors to export data and did not have annoying ads.

Since doctors were supposed to prescribe the app, it had to be certified as a medical device. It was fortunately a self-certification process.

But still the certification process was 10x more expensive than just the app development.

The problem is that not every medical "startup" has multiple full-time employees and millions in funding. Some are just a single doctor with a simple idea, and for them a few thousand dollars here and a few thousand dollars there quickly add up. The standards aren't their only expense.


It's a few hundred for one person, but buying it for your whole company is so expensive they won't even tell you the price: https://www.iso.org/terms-conditions-licence-agreement.html#...


If they charge that much, then perhaps we should choose not to use ISO standards whenever possible.


A better model is certification. Make the rules public, and when you want to sell your product, you must pay a standards agency to put their logo on your product.


You have to do that too. Well, you can never use their (ISO's) logo, but you can use the logos provided by the certifying company to "prove" that you are certified. People don't abuse those logos because the companies in question have a lot of lawyers and will litigate relentlessly.


The fact that the content is copyrighted and can't be publicly discussed is the main issue.


That's not what copyright is at all.

Copyright might restrict reposting parts, but nothing (except perhaps license agreements) restricts public discussion.

You can see this by the fact that Wikipedia has complete details of the ISO date standard (as referenced in the parent tweet Tim replied to).


This means every single thing must be very carefully rewritten. You cannot quote every single sentence of the full document; you would have to mimic Wine and rewrite the entire thing until your lawyers aren't worried about getting sued anymore.

And nobody's doing that.

Most people aren't going to quote much anyway, which means the people who don't know what's referenced can't understand it because they have no context. Which brings us back to square one: you can't meaningfully discuss the content in public.


The parent I was replying to said "and can't be publicly discussed is the main issue".

You absolutely can, trivially, as the huge number of discussions of the ISO date format on StackOverflow shows: https://stackoverflow.com/search?q=ISO+Date

There are lots of issues with locked down standards. But copyright means they can't be publicly discussed is factually incorrect.

(Also fair use means it's fine to quote enough context for a meaningful discussion anyway)


The main problem here is that people who lack access to the document effectively can't participate in the discussions, if there isn't enough public information about the document accessible by other means.

You can discuss the general requirements, and paragraph numbers, etc, but there's a limit to how much somebody who has the document can quote without legal problems. Somebody else who sees the discussion can not know if the discussions contain complete enough information to "reverse engineer" the standard well enough to meet its requirements.

All those discussions about the ISO date standard involve getting information from somebody who has access to the document and who then shared the information in public. Not all documents have that degree of public commentary.

And if you need to ask about sections which haven't been previously described in sufficient detail for your needs, then you're personally relying on individual people who have access to read it and rewrite the information for you. Which is a lot of work and also legally uncertain.

Fair use standards are not all that consistent. And that's a legal defense you can use in court after already having been sued, not before. Enough for context can vary between one sentence or three pages.


> Which brings us back to square one: you can't meaningfully discuss the content in public.

Fair Use exception in the US, and I think that would be covered by Fair Dealing in the UK and similar countries.


That's an affirmative defense in the US though. That means you can still be sued and have to defend yourself in court.


I guess the trick is to pirate a specific localized version of the standard from an ISO member country that doesn't care about IP rights enough to sue you... say... PRC?


Most jurisdictions' copyright law covers translations as derivative work which is still protected.


You only have to do that if you're trying to make a substitute document. That's miles away from just discussing.


In the medical devices field we have standards like ISO 13485 and IEC 62304. They are very broad, high-level standards and require a lot of interpretation. What would REALLY help small companies that are just getting started is a line by line explanation of what you need to do, why, and what possible solutions there are. I'm not sure, but I believe doing so could violate the copyright. Even the fact that I'm not sure prevents me from doing it.


> a line by line explanation of what you need to do, why, and what possible solutions there are.

That's where expensive consultants who have already interpreted the standards come in: they are paid to explain them to you. It's a bit of a "cartel" imho.


Yes, and each one will tell you something different ;) I much prefer open discussion where the best ideas stay afloat and bad advice can be publicly shamed.


Wikipedia does not have complete details of ISO-8601. There are many aspects of the 2019 revision which are not mentioned at all in Wikipedia.


Well, standards are not just for companies; they are of interest to individuals too.

For instance, the C++ specification is registered as an ISO standard, which means you have to either rely on free drafts, or pay for it...


For a startup - probably not.

But getting funding through finance for 'a few hundred EUR' in a largish company can be nearly impossible. It's not a thing that can go into your yearly budget, and it's not attached to any project. You'll have to pay out of your own pocket for that.


When ISO is de facto law in some industries, I would argue that laws should not be paywalled.


Your comment reminds me of a related thing, with building codes in the US. The Supreme Court recently ruled they could not be copyrighted, for the reasons you say: they are laws that need to be available.

These codes are often produced by a single organization, the "International Code Council", a non-profit somewhat analogous to ISO, which I believe sells them to governmental jurisdictions which adopt them as law, sometimes with some customizations or "choose A or B" choices.

One of the parties to the lawsuits involved happens to be a Y Combinator funded company, "UpCodes".

https://archinect.com/news/article/150195411/supreme-court-r...

https://www.constructiondive.com/news/construction-code-purv...

https://techcrunch.com/2020/11/16/a-court-decision-in-favor-...

In the US, if there are any cases where an ISO code is mentioned in law as legally binding in some way, it's possible someone could try to challenge the ability to keep the relevant standard text from being shared freely. It's not exactly the same situation, but this Supreme Court decision provides a possible path anyway.


Same problem in Australia with the AS/NZS building standards. I've been having problems with my whiteset plaster, which is a liquid-applied white plaster surface used on almost every home here in Western Australia.

Mine was done incorrectly, and I had to purchase two different $250 standards to understand how it was done wrong, how it should behave, and how it was tested, in order to file a complaint. It may not surprise you that part of the reason it was applied incorrectly is that not every trade has a copy of said standard.

And then even once you purchase it, it's a "one user" watermarked PDF you're supposed to have only one copy of, with lots of harsh warnings about that, so even those that have it are scared to pass it around.

It's a crazy situation, because this is legislated stuff for building. As a consumer, it's very expensive to inform yourself on these things. If you wanted to inform yourself on all aspects of a build, it would get expensive fast.

It's also difficult for me to publish and discuss this information in the public domain to help other consumers having the same problem, as the limit on how much text I can "copy" appears technically set at 0, even though it's standard to "reference" it. But it's very easy to misinterpret the standard if you don't read things in context.

If the standards are effectively government legislated, they either need to be government funded (this makes total sense to me) or the price needs to be much more of a token amount, say 10 dollars, with much less draconian access. But at that price the government may as well fund it anyway.


It's not so surprising in Australia - protectionism is the norm here, where legally you have to hire an electrician to change a plug on an appliance. Making the standards expensive tends to force the average consumer to pay for the (overpriced) services of a professional.


> These codes are often produced by a single organization, "International Code Council"

Somewhat off-topic, but I've never quite understood the American tendency to call something the "International X" when the US is the only country of any significance involved in it.

(It may be technically true that a handful of small countries have adopted the US building code – such as Bermuda or Western Samoa. But that doesn't change its status as an essentially American code. The US is the only major economy to use it, and non-US entities have very minimal, if any, input on its contents. And a few small countries might have adopted the US building code even if it was called "US" rather than "International".)


I wouldn't call this an American tendency. Generally the word "international" is used for things which are expected or aspire to be truly international.

If anything the American tendency is to restrict interest to the US.


Can you name some examples? In Europe it's so easy and normal to organize a multi-national conference that it'd seem weird to start something that aspired to be international without inviting participation from multiple countries.


I hadn’t heard of the ICC before, but their “About” page claims chapters in 38 countries and their “Find a Chapter” page (https://www.iccsafe.org/membership/chapters/icc-chapters-and...) has links for Canada, Australia, and Mexico. I think it’s probably ok to consider that international in that context.


Australia doesn't use America's building codes. Australia has its own. And their "Australian chapter" is the Australian Institute of Building Surveyors (AIBS) – which, it is worth noting, is not the Australian body which develops Australian building codes – that's the Australian Building Codes Board (ABCB). The AIBS is the professional body for building surveyors, and as such, while its members have some role in enforcing those codes, it doesn't develop them itself – although they are able to provide input to the ABCB's public consultations (same as any other organisation is).

I don't know what the actual point of having the AIBS as a chapter of the ICC is. Probably an excuse for some overseas business trips.


Good to know, but that’s probably moving the goal posts a bit. This looks like a private industry group set up to proactively influence adopted standards (ie: a lobby) and does so internationally.

At least in the US it’s fairly common for lobbies to offer prewritten codes in the hopes that the adopted codes are (at least pretty close) to the ones they want.

You could argue about whether such practices are how private industry and governmental regulations should interact, but it doesn’t seem like the term “international” is particularly problematic in the name. The “World Series” (of baseball) on the other hand...


> This looks like a private industry group set up to proactively influence adopted standards (ie: a lobby) and does so internationally.

Who are you saying is lobbying who here? Are you saying AIBS is lobbying the ICC? Or that the ICC is lobbying the AIBS?

I don't see why the AIBS would engage in lobbying about the contents of US building codes. What difference does it make to building surveyors in Australia what building codes in the US say?


You seem like you have a more detailed picture of the ground truth here, but that part you’ve basically said yourself (and I agree with, under the disclaimer that I found an “About” page and that’s the end of my knowledge):

AIBS appears to be a member of the ICC. The ICC appears to be an international lobby. In that context, AIBS (probably) lobbies the ABCB with some help from the ICC, as do other chapter members their own respective government representatives or regulatory bodies. Or maybe they just enjoy the ICC newsletter emails. I was only pointing out that “international” here is not really a presumptuous “the US is the world” misnomer.


Marketing.


We call our baseball thing the “World Series” because sometimes Canada plays.

Realistically it’s more probable that the group started out with a goal of world-wide adoption of the standards they produced and failed to get traction outside the USA sphere of influence.


"Samoa" (nee "Western Samoa") since 1997.


Delusions of grandeur


If anyone is interested, below is a relevant portion of the court's analysis from the UpCodes case, in the original document linked at the end of the TechCrunch article above. It lays out criteria by which a copyrighted work is considered "the law", giving the public free access to it:

"the principles that guide the Court’s analysis seem relatively clear. The law is in the public domain, and the public must be afforded free access to it. SeePRO, 140 S.Ct. at1507. That a law references a privately-authored, copyrighted work does not necessarily make that work “the law,” such that the public needs free access to the work. CCC, 44 F.3d at 74. However, a privately-authored work may “become the law” upon substantial government adoption in limited circumstances, based on considerations including (1) whether the private author intended or encouraged the work’s adoption into law; (2) whether the work comprehensively governs public conduct, such that it resembles a “law of general applicability”; (3) whether the work expressly regulates a broad area of private endeavor;(4) whether the work provides penalties or sanctions for violation of its contents; and (5) whether the alleged infringer has published and identified the work as part of the law, rather than the copyrighted material underlying the law."


That is a severe overstatement.

The Supreme Court ruled in Georgia v. Public.Resource.Org, which was very much not about building codes. P.R.Org actually does have another ongoing lawsuit that is similar to UpCodes': American Society for Testing and Materials et al. v. Public.Resource.Org.

The UpCodes ruling was at the district court level, and merely cited the ruling from Georgia v. Public.Resource.Org.

Until we get at least appellate level decisions on the copyrightability of enacted codes, I'm unlikely to feel satisfied.


Hm, thanks for correction. I'm definitely not an expert. I just vaguely remembered that it was something that was at one point being legally challenged, so looked it up and found those articles, with headlines including:

"Supreme Court rules that building codes cannot be copyrighted"

"Construction code purveyor calls Supreme Court's ruling that annotated code can't be copyrighted 'monumental'"

Are you saying those headlines were overstating?


If I read the TechCrunch article properly, the UpCodes case was decided at the district level by Judge Victor Marrero. The judge cited the recent victory by PRO Inc. in the Supreme Court:

https://www.supremecourt.gov/opinions/19pdf/18-1150_7m58.pdf


> The Supreme Court recently ruled they could not be copyrighted, for the reasons you say, they are laws that need to be available.

No, it didn't. It ruled that the State of Georgia couldn't claim copyright on officially-annotated legal codes. Your sources note this and speculate about how it might impact a different ongoing dispute about privately-copyrighted building codes owned by the ICC that are often referenced in, and thereby given force of, law. (The TechCrunch one discusses it having an effect on a non-terminal ruling in that case: a motion to dismiss, which was denied, allowing the case to move forward but not resolving it.)


> but is a few hundred EUR really an obstacle for any kind of serious medical startup?

No, it really isn't if they are serious, or even if they plan on being serious in the future.

If you are shipping anything classified as a device, regulatory & QA work will typically involve multiple full-time hires and/or equivalent consulting help by the time you file. Things are a bit leaner in the SaaS-only world, but still significant.


Especially compared to the cost of reading and complying with them.


To be sure, this entire thread is about principles, and while the costs may not be prohibitive, the title is "obstructs," not "prevents."


Startups: millions of dollars a year for AWS bills is just the price of doing business!

Also startups: We can't afford a hundred-dollar PDF uwu.


It's incredible to learn from the responses in this thread how widespread and systemic this problem is, not just in software but seemingly every industry. Thank you for these examples.


Wish Sci-Hub hosted copies of these things as well.


b-ok / z-library has numerous copies of ISO documents I need to be familiar with as a hacker, including 8601 from the subject discussion.


Still, having support for a URI similar to doi, maybe as e.g. iso:8601 (with similar specs for din, en, etc) would be neat.


RFC 5141[0] has a mapping of ISO standards to the URN namespace.

So the latest English-language version of ISO 8601-1 would be encoded as

urn:iso:std:iso:8601:-1:en

[0] https://tools.ietf.org/html/rfc5141
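A rough sketch of that mapping (it covers only the simple "number, part, language" shape from the example above, not RFC 5141's full grammar with editions, amendments, and so on):

    # Build an RFC 5141-style URN for an ISO document (simplified).
    from typing import Optional

    def iso_urn(number: int, part: Optional[int] = None,
                lang: Optional[str] = None) -> str:
        segments = ["urn", "iso", "std", "iso", str(number)]
        if part is not None:
            segments.append(f"-{part}")  # part numbers carry a "-" prefix
        if lang is not None:
            segments.append(lang)
        return ":".join(segments)

    print(iso_urn(8601, part=1, lang="en"))  # urn:iso:std:iso:8601:-1:en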


This is not new. It happened to internetworking too. Well, that is why we call FTAM F...

It was a shock for people doing internet work (which, per 1990s US government standards policy, was to be obsoleted and all moved to ISO/OSI). Same in my job doing EDI. It was just crazy: we wanted to follow the US government standard and the world standard, but couldn't.

"Let us temporarily use this lousy Ethernet and the to-be-obsoleted internet... we can't use SNA, which we all know, and DECnet (ISO?) is too buggy."

At least FTP, email, and telnet seemed to work. The IETF required at least 2 working implementations before a standard became an IETF paper. And you could at least download the paper to learn what it was.

Good thing we were in the 1990s: we never did that migration from the temporary internetworking thing called the internet to the grand great hall of ISO/OSI. Just hope healthcare does not repeat the same mistake. Only an openly available standard can work if you have more than one party.


Try insurance: licensing the old XML standard, the elephant in the room, scales up to something like 100k a year based on company revenue.


Sometimes they literally are the law, for example where paywalled standards are “incorporated by reference” into regulations. There are official workarounds [1] — sometimes — but the situation of a law that isn’t freely available is disturbing.

[1] https://www.nist.gov/standardsgov/accessing-standards-incorp...


Luckily, FHIR exists: it is an open healthcare standard, which gives it an advantage. (Disclosure: I work at Commure, which uses it.)


I fully agree


I know European funding has its issues but is a few hundred Euro that bad?


ISO standards commonly refer to other standards for more details. Then you also need technical reports, which are also paywalled, to get a sense of practical application. It very quickly adds up, and there is no way to "explore" which documents are really applicable to your company and products.

And to top it off most of these licenses under which you buy them only allow for a single digital copy (one person).


As mentioned above, when it's $524 per document, and you don't know how many documents you will need to consult in order to comply with the standards, and in the end you need to buy every relevant document... it's tricky. Standards are like laws. The law shouldn't be paywalled.


For one or two standards? There's just the headache of getting approval to spend the money.

But when it gets to dozens/hundreds, plus requiring vendors to have their own copies, it quickly multiplies into a massive burden. And that's not even getting into the open source issues.

Not to mention the fact that you might not know if you NEED the ISO standard until after you've already bought it.


Any positive price is bad if we are discussing the price of an open standard.


The price of standards makes me angry. They're essentially a form of legislation in many countries - to sell anything in the EU you have to comply with the relevant safety standards. Yet they're kept under lock and key - to read the rules which you need to follow to sell something in your own country, you need to pay a 3rd party hundreds of Euros.

And when you read the standards, they reference other standards. Eventually you have to build a graph of standards to which you must comply, each one costing hundreds of Euros. It's a complete racket.

The worst thing is that the standards themselves tend to be written by 3rd-party organisations with an interest in that domain, so they have a strong incentive to make the standard match whatever they're doing. So not only does a new startup have to spend months reading hugely expensive, dry safety standards, you also have to build something which is essentially a worse version of what the incumbents build.


The worst part is the amount of cross referenced standards. You buy 1, then realize to interpret it correctly, you need to buy a list of 10 standards, then repeat the process with those 10. This encourages piracy and use of old standards.


Yes, it is a big problem and a very good business for some firms. I have 2nd-hand experience with that thanks to my closest family. There is a partial workaround using a public or university library, which at least in Germany (and most likely other European countries) seems to work. They tend to have many standards available, and they can usually order more if you ask politely. At least bigger central libraries or those connected to a big university shouldn't have an issue with that. Of course, even so you may have to buy a standard (licence) because of the licence terms. But at least you know what it contains before you think about buying the licence. It may be that you don't actually need the licence at all, or you can reduce the total number of licences you need.


Agree with all you say. Apparently people sometimes put ISO documents on libgen, potentially saving the expense.


I would love to fight the government on a code violation by essentially arguing that "Either you put the code standard in the public domain, or it is not valid law". But I'm sure I'm not wealthy enough to afford that fight


> The worst thing is that the standards themselves tend to be written by 3rd party organisations with an interest in that domain, so they have a strong incentive to make the standard match with whatever they're doing.

Related anecdote: years ago I worked at a semiconductor company, which was working with a competitor to standardize the sensor/camera component for the OpenMAX [1] multimedia framework (like GStreamer).

We wanted a unified component that output a finished image (because our hardware was capable of that), but the competitor wanted modular components that produced/consumed pieces transferred to memory (because their HW was more limited).

I think the final standard followed theirs, much to our chagrin.

[1] https://en.wikipedia.org/wiki/OpenMAX


I wonder whether it is worth employing a student just to get access to standards/papers via their university. (Back when I was at a technical university, we could access those via our student logins.)

That way you might explore which standards are necessary and buy them if need-be.

edit: Obviously I would advocate for open and free standards.


University libraries sometimes make standards available to everyone with a library card. But at least in the cases I've seen, they have ToS limiting usage to research/non-commercial purposes. Which also makes the "hire a student" plan a bad one; hiring someone and then expecting them to break their uni's rules/the standard's license isn't great.


I don't understand why these PDFs are not in scihub or libgen...


Apparently they're extensively watermarked to identify the purchaser.


Why aren't there mirrors?


> The value of standards is in their adoption.

Tim Sweeney correctly observes this, then continues to talk about "millions of hobbyist programmers". I do not believe that ISO targets, or even has any remote interest, in this market.

ISO is composed of nation-state members who will inevitably mandate ISO standards as part of legal compliance. Various stakeholders actually participate in standardization efforts and thus both already know the standard and are able to push it through. All of these categories (governments, industries in highly regulated sectors, and large stakeholders) have large amounts of capital. The amount of money required to purchase an ISO standard barely even registers on a balance sheet.

Hobbyist programmers arguably make a lot of open source software that builds the foundation for today's and tomorrow's platforms. However, when the big bulldozers from the previous paragraph roll in, hobbyist programmers give way to highly paid employees of these giants; be it by merging a patch or be it by being worked around with a greenfield project or fork.

On the other hand, ISO has an incentive to charge money for its standards because this adds perceived value: if something is freely available, it is easier to dismiss it as a non-serious effort when debating whether it is worth committing personnel to participate in the standards committees; the standards come across as valueless, worthless.

Looking at this vector of interests of the various parties involved, I see little reason for this state of affairs to improve.


Perhaps it would be beneficial to the discussion to add examples of other orgs, e.g. IETF and Unicode. The Unicode spec is fully available [1] and their funding comes from a membership model rather than a pay model [2].

ISO's argument is compelling but we see other standards organizations taking different approaches and more or less still finding success.

[1] http://www.unicode.org/versions/Unicode13.0.0/ [2] https://home.unicode.org/membership/why-join/


Also ECMA (C# and JavaScr... er, ECMAScript) which provides standards at no cost vs ANSI (no notable language specs since C and Pascal) which charges a fee.


Last year I finished the school year early because of the coronavirus lockdown and had too much free time - so I wrote an interpreter for CLR bytecode (https://github.com/Leowbattle/clr_lite). The ECMA-335 standard contained everything I needed to know for that project: documentation of the EXE format, VM instructions, etc.

I learned a lot doing this project, and I would never have been able to do it without free access to the standard. So I think Tim is right to recognise the value open standards provide to hobbyist programmers.
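(As a taste of what the spec documents: the PE/COFF skeleton that ECMA-335 Partition II builds on can be sanity-checked in a few lines. A sketch; "hello.exe" below is a placeholder for any .NET assembly, not a file from the project above.)

    # Check the MZ/PE headers of a .NET assembly, per the PE/COFF layout
    # that ECMA-335 Partition II reuses for CLI files.
    import struct

    with open("hello.exe", "rb") as f:
        data = f.read()

    assert data[:2] == b"MZ"  # MS-DOS header magic
    (pe_offset,) = struct.unpack_from("<I", data, 0x3C)  # e_lfanew field
    assert data[pe_offset:pe_offset + 4] == b"PE\x00\x00"  # PE signature
    print(f"PE signature at offset {pe_offset:#x}")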


> ECMAScript

ECMAScript is nowadays amusing because the ISO standard for it is literally a single page document… that normatively references the ECMA published document.


Apparently 3 pages. AND STILL COSTS 38 CHF ($40 USD)! https://www.iso.org/standard/73002.html


I mean, you can download the PDF version for free here: https://standards.iso.org/ittf/PubliclyAvailableStandards/c0...


ANSI for languages now operates under the aegis of ISO. For the C and C++ languages, ANSI is one of the voting member bodies.


I think "ECMA C#" lags behind C# as implemented in .NET, but still.


If I have a choice or am in a position to influence a decision, I always push for open standards orgs like OASIS: https://www.oasis-open.org


OASIS is good, but their member orgs need to pay membership dues, and do it every year. I think ISO encourages independent experts from the public sector and academia to provide expert feedback without paying for membership.

Though, I think that ISO could be fully funded by the national standards bodies on an annual basis, just like OASIS is funded by companies, and not charge for PDFs.


I'm afraid that you are comparing apples and Walmart.

Unicode is a standard for encoding characters. ISO is an organisation that _creates_ standards for just about anything.

Unicode became a standard as a result of beating other competing standardisations. ISO declares that whatever they came up with is the standard, no competition required. Hence the effectiveness of the business model.


Parent is talking about the Unicode Consortium, not the character encoding that the consortium is responsible for.

https://en.wikipedia.org/wiki/Unicode_Consortium


> Tim Sweeney correctly observes this, then continues to talk about "millions of hobbyist programmers". I do not believe that ISO targets, or even has any remote interest, in this market.

They probably don't. And maybe that made sense decades ago, when it was more true (though this has never been completely true) that serious software came out of big industry with up-front allocation of resources, whereas whatever hobbyist programmers were working on was something else entirely. Today, much serious software is built by informal networks of developers who don't necessarily share an organizational affiliation, even if some of them have institutional sponsors. And a lot of it evolves fluidly, without mandated standards as institutional requirements, so if developers can't get a standard freely to assess it for fitness, they simply won't consider it at all. They'll either ignore it entirely, or approximate the behavior of some other piece of software that implements (perhaps not faithfully) functionality from the standard relevant to the needs of the new project (maybe making alterations to suit the different use case, without reference to the inaccessible standard, which may have a solution for the new project's problem that the project used as a model didn't need).

The ISO/ANSI/etc. model of selling standards documents to fund the standards maintenance organization introduces friction even for "serious software" in that environment, whereas the models used by the IETF, ECMA, W3C, etc., which fund standards work without relying on selling standards docs, do not.

Now, of course, there's an easy workaround; if you want technology to be used, and multiple standards organizations make standards in the field, don't submit it (at least not exclusively) to the paid-access organizations. But you've also got to get, e.g., governments onboard so that they don't adopt paid-access standards into law.


> I do not believe that ISO targets, or even has any remote interest, in this market.

"Hobbyist programmers" in this context includes "pre-revenue startup founders" and "open source project maintainers". Those people need to have access to standards and shutting them out only serves as a barrier to the industries their applications could disrupt.


It also includes people wanting to write standards-compliant code, and even "people who want to contribute to compilers." The closed nature of the standard means secondary sources dominate my search index. A previous rant about how the entire internet fails to warn you that atoi("a"); is undefined behavior: https://news.ycombinator.com/item?id=14861917

I've contributed a few patches to rustc, shipped multiple C++ projects, and even technically submitted a patch to LLVM. I've got a copy of the C++03 standard... but even I haven't rushed out to empty my pocketbook for C++11, C++14, C++17, and C++20, despite using them.


There is a middle ground between hobbyists and large stakeholders that ends up in the worst spot.

As you point out, those are legal requirements, and a small or mid-size company would need to know them before entering a field or taking on legally bound clients. Paying upfront makes it difficult to do pure discovery, to even find out how hard the requirements are to implement.

I saw that at a small company where we could have been interested in applying for a standard, as a nice-to-have to bring in more business. But the cost of paying for the doc and going through it, only to perhaps realize it's not worth it, brought enough friction that it was delayed into oblivion.


Yup. Monetization of things that ought to be free and standardized is one of the biggest unnecessary drags on productivity growth today. That many non-profit bodies contribute to it just makes it more sad. Large companies don’t care about this relatively small cost, and they actually benefit from the (small) moat that monetization creates.


I recently wanted to get the list of countries and provinces/regions (ISO 3166); this codifies things like "US" for United States and "US-TX" for Texas.

It's ridiculously hard to find this data. Sure, there are lists floating out there on the 'net, but many of them are incomplete, outdated, or borked in various ways.

I eventually ended up writing a script to extract it from the Maxmind GeoDB files, which was the best source I could find.

I think I spent half a day on this in total: trying to find a good list, examining the lists I found and discovering they weren't good, writing a script to extract this from another source (some US gov't agency, I forget which) only to discover there were problems with that data as well (it dropped diacritics IIRC), etc.

This is the simplest possible ISO standard, and a 10-minute job ballooned to this. In hindsight it almost would have been better to just pay the €200 or whatever it is, but bugger if I'm going to shell out that kind of money for a fairly small hobby project.
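One free source worth knowing about here: the third-party pycountry package bundles Debian's iso-codes data for ISO 3166-1 and 3166-2. It's unofficial and only as current as that project, so the same completeness caveats apply, but as a sketch:

    # pip install pycountry -- unofficial ISO 3166 data via Debian's
    # iso-codes project, not ISO's paid dataset.
    import pycountry

    print(pycountry.countries.get(alpha_2="US").name)  # United States
    tx = pycountry.subdivisions.get(code="US-TX")
    print(tx.name, tx.type)                            # Texas State
    for sub in pycountry.subdivisions.get(country_code="ID"):
        print(sub.code, sub.name)                      # all ISO 3166-2:ID entries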


Similar story as yours with ISO 18245: the list of merchant category codes (4-digit codes that categorize the world).

I did end up paying for the standard (which doesn't even give you the data in a programmatically accessible format! I had to write a PDF extractor). Thankfully there are public lists by Visa, Mastercard, and Stripe which also contain private ranges, so my published code shares not just the ISO standard but also the matching entries in the other lists.

https://github.com/jleclanche/python-iso18245

https://github.com/jleclanche/python-iso18245/tree/master/is...
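For reference, the package linked above can be used roughly like this (a sketch based on its README; treat the exact attribute names as assumptions if the API has changed since):

    # pip install iso18245 -- merchant category code lookups via the
    # python-iso18245 package above; field names follow its README.
    import iso18245

    mcc = iso18245.get_mcc("5812")  # eating places / restaurants
    print(mcc.iso_description)
    print(mcc.visa_description)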


What’s insane is that’s literally a list of data - uncopyrightable in the US. Did you make your results available?


I thought about it, and even have a ~/code/iso-3166-2 directory with some stuff in it, but the problem is that I'm not entirely sure how accurate/complete this data is either. It's certainly enough for my use case, as it's specifically tied to GeoIP, but it's not inconceivable that there are missing entries (or outdated/wrong ones, for that matter). Perhaps there are regions which don't have an IP address in the GeoIP database (like some of the small Pacific island states)? I don't know...

ISO does have a browsable website[1] and you can get the data from there, which is the best "official" source I know of, but it's not easy to extract the list from there. It's probably better to shell out the €80 (less than the €200 I remembered) to just buy the damn PDF than spend a day writing a script to extract the data from there, but I'm kinda poor at the moment, so... :-/ Also no idea if that buys me updates as well (being outdated is a big problem with many of the lists I found, turns out countries change, who knew).

Wikipedia also has pages for them[2], but again, I'm not sure how accurate/complete/updated this is.

[1]: https://www.iso.org/obp/ui/#search (click "country codes", no permalink).

[2]: e.g. https://en.wikipedia.org/wiki/ISO_3166-2:ID


True, but you missed the underlying concept of this statement.

The trend of innovation is increasingly becoming grassroots-driven.

In part, that's because our fundamental research advancement has stagnated, i.e., the nation states can no longer steadily produce groundbreaking tech and leave it for industry to adopt. As a result, the adoption of standards becomes more relevant to SMBs and individuals.

On the other hand, the Internet has driven the cost of knowledge acquisition down to probably the bare minimum. I.e., these days, anyone with a good understanding of English can learn pretty much the highest level of knowledge almost for free. SMBs and individuals are becoming more and more sophisticated, gradually making them competent enough to be involved in the standardization process.

In short, ISO's practice looks fine from the perspective of 10 years ago, but it's now wrong.


> The trend of innovation is increasingly becoming grassroots-driven

Please qualify this; I call BS. I actually think it's harder and harder to innovate nowadays, and we need large companies to innovate because scale is now the battle for any product that makes an impact.

If its a "trend" then you must have historical data that displays the change and im sure you have a hard definition for "innovation" and "grassroots". This is basically a flippant comment that Im not even sure you feel strongly about but it sounds nice.

> down the knowledge acquisition cost to probably bare minimal

You don't understand how important experience is. Books have been around for centuries; all this knowledge was not much more difficult to get decades ago, but you still have people who can't perform surgery from reading a book or create a rocket ship. Knowledge is about 10% of the solution to any problem.

>individuals are becoming more and more sophisticated

Complicated yes, sophisticated no. Look at music: do you think this is a sophisticated society???


> Tim Sweeney correctly observes this, then continues to talk about "millions of hobbyist programmers". I do not believe that ISO targets, or even has any remote interest, in this market.

I completely disagree. As a hobbyist programmer I want to improve myself so that I can sell my skills around international standards. It's more difficult to do that when standards have to be bought before I can even determine how hard one would be to learn and adapt to.


The parent's point is that ISO doesn't care about you, the hobbyist programmer.

The parent wasn't saying that millions of hobbyist programmers don't care about standards. The point is that millions of hobbyist programmers DO care about standards. But ISO doesn't care about the millions of hobbyist programmers, and therefore paywalls standards.


Actually, hobbyist programmers often create what become competing standards to the ISO-track one, resulting in low adoption of the ISO standard.


I see and I agree with you -- I had misinterpreted the parent post.


But I think the ISO paywall prevents adoption in the market you're describing too. I've worked in enterprise and government facing commercial databases for half a decade, been involved in dozens of discussions about whether some proposed piece of syntax is standards-compliant, and I haven't once encountered a reference to a section or page number of the actual SQL:2016 ISO standard.


Modern SQL standards seem to be primarily written by the PostgreSQL team - new features they roll out and syntax to describe them tend to be widely adopted. Granted, it could be coming out of ISO itself but nobody would be able to tell since their standards aren't openly published.


ISO standards are often discussed at PostgreSQL conferences, and whenever possible they try to follow the standards. Sometimes standards are written after PostgreSQL implements a feature, and in some cases it seems like they are even written to accommodate PostgreSQL (e.g. the SQL/JSON standard seems to take into consideration that a database might have multiple JSON types, just like PostgreSQL).

But whenever the topic of standards comes up, someone immediately mentions the high price of the standards, and I think only some of the developers actually have a copy of the SQL standards.


The only bits people actually know are yyyy-mm-dd and pre-92 implicit joins.


Standards cost money because the process to create them costs money. Sometimes governments, big companies, or organisations like the EU fund standard development; those standards then end up freely accessible.

That said, it really shouldn't be this way.


> Standards cost money because the process to create them costs money.

But the money you pay for the standards doesn't go to the creators; it goes to the organization that makes them available. (I've contributed to ANSI standards.)


It really should be that way, i.e. we agree that standards are a net good for industry and society (and I don't think anyone would argue otherwise), so make them free at the point of use and fund their development through government.

This removes any commercial incentives that may result in regulatory capture - a big issue in some areas - and should (if implemented properly) drive investment in strategically sensible areas.

The profit that most national/international standards bodies make just about covers costs, but none of these bodies are particularly expensive to operate in the grand scheme of things. Some already get partial funding from govts. Full funding is clearly the sensible model.


One copy of a book has value by itself. But a standard needs everyone to follow it, so a business model that shrinks the audience is self-defeating.


The people who make the standards do not get any money from you buying a copy of the standard.


>Various stakeholders actually participate in standardization efforts and thus also both already know the standard and are able to push it through.

And this is exactly the perverse conflict of interest. In those cases, the authors have no interest in actually attracting independent implementations that adopt the specification. The specification is a formality to bolster the credibility of their product or technology. The authors don't want some outside team to read this and actually build something on their own; they want that team to buy their [chip/component/patent license/product/service] that implements this.

It's not a solo endeavor. The coauthors and stakeholders tend to be complicit in this game, often as peers in the oligopoly.

When a spec author actually wants their specification to be independently implemented, you can tell. They don't make it exclusively for purchase in the dusty basement of the bureaucracy superstore. They put out guides, conformance tests, reference implementations, examples, and tooling.

And those materials often do exist for ISO standards... but as expensive, proprietary kits from one of the spec authors, only available to customers who buy their [chip/component/patent license/product/service]. That's part of the business plan. Why write a clear spec and give it away for free when you could be selling one of the only conformant implementations of your Public Standard™?


Ah, the old fallacy of "If they charge money for it, it must be good!"

Meanwhile, Oracle out here laughing their way to the bank.


People adopting the standards adds more value than is perceived. If these standards are being paid for by governments, then of course they should be free. But if they aren't free, not everyone will adopt them, and that lowers their value.


I think the real issue comes from once-removed tools. The people writing a tool that you use may have access to and adhere to some ISO standards, and that information may be quite helpful for debugging while they're building their tool - but it doesn't really help you all that much since you lack access to those same standards.

The C++ standard is pretty different and an interesting example here. Traditionally there were portions of the standard that weren't really accessible, and that, in part, contributed to different compilers not being called out on their differences of opinion; experts couldn't call out those contradictions either, since they didn't have access to the standards to see where they lay.

In the modern world, the C++ standard is what I think ISO should aim for with all their standards: it is widely available and heavily discussed, and that feedback has allowed the standard to grow by leaps and bounds as the language has adopted things once seen as impossible (outside of Qt & Boost), like foreach loops and not-terrible-to-work-with lambdas.


> hobbyist programmers give way to highly paid employees of these giants

They all learned C++ somewhere, and I doubt that most of them had access to the official ISO standard while doing it. (Yes, I'm aware of the final drafts etc)


> They all learned C++ somewhere, and I doubt that most of them had access to the official ISO standard while doing it.

There's a difference between “learning to use” and “implementing” (though having access to the standard is good for both, it's more critical for the latter.)


ISO actually shouldn't have an incentive to charge more money than the support of the standardization process costs.

Also, regulators should (in my opinion) pay for standards to be freely available when they harmonize/adopt them for their country/countries. It is kind of insane that a customer can't access the rules by which products are approved without paying. It is as if laws were hidden behind paywalls.


Agreed. I do know that ISO and at least two national standards bodies I work with basically do charge on a cost-covering basis (which is why pricing is largely tied to tiers of page count, which is a reasonable proxy in most cases to effort/time). But industry/hobbyists shouldn't really cover this cost - or should do through general corporation taxation.

To be legally compliant to install cables in the UK I've got £500 of standards on my desk. We actually pay our national body circa £5k a year for access to various industry standards we need to be able to hold suppliers to account. Govts should just properly fund this stuff.


Just last year the US Supreme Court ruled (in a 5-4 split decision) that states can't put their laws behind a paywall. Before that it wasn't terribly uncommon for that to happen.

Capitalism, baby!


Actually, laws are kind of hidden behind paywalls! For most laws, I wouldn't trust my own judgement on their interpretation and would have to pay a lawyer.


This always struck me as the dagger into the idea of freedom. How can one follow the law if one is not fully informed of the law?

Of course, those with money like it this way. It's a barrier for competition and exercising your rights.

Once you start looking at barriers, such as ISO, you start noticing them everywhere. Real estate, dentistry, doctors, school teachers. You can't even cross state lines as a school teacher, or other professions. People often argue that software developers should be licensed much like engineers. Let's be thankful that's not the case. Imagine the headache of being remote and having to get licensed in multiple states!


Not in France. Everything is publicly accessible on Legifrance.gouv.fr.


There's an important distinction here. Laws (and court decisions) are freely available in many (most?) places. Relevant commentary on how these laws are applied tends to be more costly.

I'm not very familiar with the situation in France, but I can offer a data point from Switzerland: The civil code is (of course) freely available online and in PDF format. A printed copy is available for CHF 15 or so¹, both from the federal press and from other publishers who might throw in an index or a keyword reference at the same price.

However, if you're actually looking to apply any of the contents, you'll want qualified explanation and references to jurisprudence alongside the legal text. Affordable commentaries² on the civil code start at CHF 250 or so, and the industry-standard "Basler Kommentar" on the civil code is sold as two volumes, retailing for CHF 598.- each.

¹ It's a few hundred pages; IIUC the price pretty much reflects the cost of printing, binding and logistics. Key point: Nobody is getting rich off of selling these.

² For the civil code, specifically, you'd be in luck: Some consumer advocacy organizations publish hands-on guidebooks that are significantly cheaper than the usual commentaries. So you might get by on CHF 100 or so. But these tend to not be available for other, less mainstream, laws.


Binding legal precedent is often paywalled to read too.


This isn't a robust way to write good software. It's better to capture the requirements up front and apply those early on in the project's life. Relying on other users to notice defects is likely to result in only some of the software being correct. Especially if it's a date library, the author should be able to know the proper date format at the outset.


Would the IETF, where the specs are publicly accessible but it's vendors and so on sitting in the various working groups, be a better model?


In principle, people participate in the IETF as individuals, though in practice it matters who they are employed by.


It’s not just an issue for hobbyist programmers. That’s also an issue for most startups and small companies. You just can’t buy all the documents you need when you’re designing hardware. That stuff is insane. And it’s not just about the cost, it’s also about the time necessary to buy these documents. Creates way too much friction compared to the open model of the IETF, W3C, Unicode Consortium, etc. The ISO and other organisms using paywalls live in the past and are hurting innovation.


Well, I guess innovation has a price (or maybe a cost). And as long as ISO puts up a paywall, they have determined by market forces what that cost is.

My perspective on this idea is that altruism is dead when it comes to open source. People's work needs to be paid for, whether it's ISO or the guy creating a program using it. We are not to be slaves to the future, and I don't wish that for future devs. Making a profit isn't evil, but wanting others' work for free is selfish. People can always make their own standards, come together across nations, and do the work. But they'd much rather cry and call hardworking people bad.

Tim is trying to save a buck I bet.


I'm fairly confident that Epic Games don't have any issues paying a few thousand dollars for any ISO standard they wish to have access to.


Standards should be myriad and have benchmarks.


What do nation-states and legal compliance have to do with standards like C++ and SQL? I've never heard of a government mandating that compilers be standards compliant, or anything like that.


ISO deals with way more than just C++ and SQL.


I'm guessing that those parts of what ISO does aren't what Tim Sweeney had in mind - he specifically called out C++ in the tweet.


POSIX is mandated by government contracts, and so are many other things. If you're selling into governments you need to be able to sign off that you comply with the appropriate standards (whatever the government wants - usually a standard written by the vendor they want to choose).


One of the things with government regs that I've learnt working in biotech is that access to the document(s) isn't enough. There are various nuances with the interpretation, and only someone well versed and experienced in them can prepare you for an audit. Our company pays top-dollar for these consultants.


I can't speak about the ISO as a whole across engineering fields, but the ISO standardization process has worked out horribly for the C++ community. Not only for the issues Tim Sweeney points out: the entire C++ standardization process is de facto a closed-off and secretive process where participation is limited to those who can physically travel from place to place, and it's painfully obvious that the quality of features in C++ is much lower than what it could have been otherwise.

A common claim made by the ISO C++ committee in response to criticism of the language is that its members are volunteers working on the language in a mostly unpaid capacity, often having to hit tight deadlines to have any shot at getting a feature into the standard, and that's true exactly because of how arcane the ISO standardization process is. It's this pseudo-antagonistic process where maybe one or two individuals are tasked with "championing" a paper in front of their peers, and then everyone is supposed to pretend that there's no politics involved and that the paper gets approved entirely on its technical merit.

C++ would have been much better served by ditching that and doing what Java, Python, and Rust do: take broad community feedback and input. It's hard to point to a beneficial feature in C++ that would not still be there had there been involvement from the broader community of game developers, embedded-device developers, desktop software developers, and the host of people who use the language regularly. But it's clear many clumsy and awkward features would have been eliminated, including the now 50 ways of initializing variables, broken standard library features like std::variant, the now-unusable std::regex, the minefield that is std::random, and the upcoming bloated and error-prone std::ranges. It's no wonder many C++ development teams are skeptical of the utility of the standard library and just roll their own alternatives.
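
To make the initialization point concrete, here's a quick sketch (an illustrative subset, not an exhaustive list):

  #include <vector>
  
  int a = 1;                  // copy initialization
  int b(2);                   // direct initialization
  int c{3};                   // direct list initialization
  int d = {4};                // copy list initialization
  auto e = int{5};            // initialization from a temporary
  std::vector<int> v1(3, 7);  // three elements, all 7
  std::vector<int> v2{3, 7};  // two elements, 3 and 7 -- a classic gotcha
  
  int main() { return int(v1.size() - v2.size()); }  // returns 1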

I hope no other language goes down the road of using ISO to standardize its language.


C++ at least had the benefit of industry and OOP trend behind it. It was rather fortunate in this case.

I've argued elsewhere on HN that Common Lisp died because they were closed and secretive at the exact moment they needed to go the opposite route.

https://www.cs.cmu.edu/Groups/AI/html/faqs/lang/lisp/part4/f...

Around 2004, Lisp was having a bit of a revival of sorts. Lisp was becoming trendy; various blogs and web sites were created. But the only documentation you could find was the HyperSpec, which, as anyone who has had the misfortune of reading it knows, is awful as a reference. It's both too technical for casual software developers and not official enough for language implementers. There were two free, open source Lisps available (CMUCL, CLISP) and both were rather unloved and clunky at best.

Even Linus had a bit of trouble getting his hands on POSIX standards. Imagine Linux dying because it couldn't follow standards that Linus could not acquire.

By the mid-to-late '90s the writing was already on the wall. Perl, Python, PHP, Ruby followed no standard. It became common for the free implementation to be the standard.

Clojure arrived and largely filled the Lisp void. Racket attempted a similar movement, by renaming itself from PLT Scheme to something that removes the emphasis on any particular standard. If you want Lisp today, though, you're probably doing Clojure.


> I've argued elsewhere on HN that Common Lisp died

It's not dead, it just smells funny.

> not official enough for language implementers

The HyperSpec is just a different text rendering -> here as Hypertext. The content of the official Common Lisp standard document is in the HyperSpec. As a language implementor it makes no/zero/null difference reading the HyperSpec or the official standards document in PDF format.

> Which, as anyone that had the misfortune of reading

I like it and have been using a lot.

> There were two free, open source Lisps available (CMUCL, CLISP)

GCL, OpenMCL, ECLS, SBCL, ...


I agree with the thrust of your argument, as I was learning Common Lisp around then, but you make a factual mistake:

> Around 2004... There were two free, open source Lisps available (CMUCL, CLISP) and both were rather unloved and clunky at best.

SBCL and ECL existed and were quite usable.

The documentation was as you say, another story. There was the HyperSpec, copies of CLtL, and a number of out-of-print or hard to find books that described some kind of Lisp. It was hard to be a tyro without a guide.


Honestly, attempts to pin down Lisp's obscurity to a single cause are almost as old as Lisp itself. No, Lisp did not miss out because people could not read the docs; it simply was as far from XML/Java zeitgeist then as from UNIX/C/C++ before.

> Perl, Python, PHP, Ruby followed no standard. It became common for the free implementation to be the standard.

And I mean look how well it worked with transitions to Perl 6 and Python 3.


> Which, as anyone that had the misfortune of reading, is awful as a reference. It's both too technical for casual software developers and not official enough for language implementers.

I don't think this is fair. It is a very useful reference, and it doesn't claim to be a tutorial:

" 1.1.1 Scope and Purpose The specification set forth in this document is designed to promote the portability of Common Lisp programs among a variety of data processing systems. It is a language specification aimed at an audience of implementors and knowledgeable programmers. It is neither a tutorial nor an implementation guide."


C++ was pretty horrible prior to C++98 as well (and likely 33 out of 50 ways to initialize a variable already existed by then). It did improve considerably within the past decade under the auspices of the committee.


std::string has been there since C++98; however, you couldn't use it to create a std::ifstream, as the stream had only a char* constructor.

It took a decade to add a std::string constructor...
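
A minimal sketch of the difference (the file name is made up for illustration):

  #include <fstream>
  #include <string>
  
  int main()
  {
      std::string name = "data.txt";   // hypothetical file name
      // std::ifstream in(name);       // only compiles since C++11
      std::ifstream in(name.c_str());  // the C++98 workaround
  }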


And it will probably take another decade to add a std::string_view constructor.


One of the biggest impediments caused by the C++ standards committee is the tight scope of the standard. Anything outside of the language/library spec is entirely outside the realm of standardization.

So any tooling improvements (dependency management, build process) can't be made part of the language. Fractured solutions harm adoption; languages with good dependency management and good build processes have one solution they push on everyone.


It's clear that C++ is very popular, and a lot of people chose to use it, for better or for worse. I'd like to hear your thoughts referencing specific issues you have w.r.t. the language (not libraries). As it stands, I'll take your comment as a passionate plea :)


I'm not sure if you meant it, but your comment comes off as being in exceptionally bad faith. Of course C++ is a popular language and people put up with the decisions ISO makes; that doesn't mean it doesn't have issues. The distinction between "the language" and "libraries that ship with the language" is not useful, but even if it were, the comment presents initialization as a wart in the core language.


>The distinction between "the language" and "libraries that ship with the language" is not useful, but even if it were, the comment presents initialization as a wart in the core language.

I think it is extremely useful, as the standard library is easily (and often) replaced. We have a difference of opinion, and that is perfectly fine, but please do realize that this doesn't mean my comment is in bad faith.

Every library has a goal, and the standard library's goal is not to serve as an industry-ready plug-in for high-performance code. Writing high-performance code in C++ is an advanced task that necessitates more control, which makes the standard library a poor fit. The (language) + (library) model makes it easy for the end user to pick a library of their choice for their application.

>comment presents initialization a wart in the core language.

Okay, that is one point. When people say "X sucks" or "X is horrible", there is very little a reader can gain from that. If a person's opinion is formed by deep experience with X, then I am interested in knowing the specifics that led to that opinion.


The distinction is useful in the context of a discussion about alternative standard libraries, but in the context of the ISO working group that does the design for both it is not.

I understand that your intention was to get more information, but I wanted to make sure you're aware that responding to someone who is listing out their complaints with what is essentially "a bunch of people use the language productively; can you please give me the information you just went over, otherwise I am going to disregard your comment" can be interpreted as bad faith, because it's a common troll/asymmetric-effort tactic.


Thanks. I'll try to better phrase my comments so as to avoid misinterpretations :)


> it's painfully obvious that the quality of features in C++ are much lower than what they could have been otherwise.

It's not obvious to me at all; in fact I'm more tempted to believe the opposite. C++ has its shortcomings, but when I (say) compare the C++ standard library against third-party libraries, I find the standard library design & implementations to be of much higher quality. They're often far more flexible and handle far more edge cases than open-source libraries do. So, while I would love for the C++ standard to be free, I think this would be more of an argument for not making it so.


Any specific examples come to mind? I usually find Folly [1], Abseil [2], or EASTL [3] beat the standard library on almost every metric you can imagine, including the underappreciated metric of compile time.

And then of course there's boost [4], but people have very mixed opinions about it.

The reason a lot of developers use the standard library in C++ is that dependency management in C++ is such a nightmare that many people writing a library are forced to use the standard library if they want any hope of adoption, even when far superior options exist. It's literally something people writing C++ libraries will advertise ("Dependency-free, header-only library!") because they know that without it a lot of developers won't bother using their library.

Anyways, I would be interested to know what part of the standard library you find is better than third party options.

[1] https://github.com/facebook/folly

[2] https://abseil.io/

[3] https://github.com/electronicarts/EASTL

[4] https://www.boost.org/


Before I give an example, note a couple of things:

- You've picked some of the best C++ libraries as if they're somehow representative of the ocean of C++ code that's out there, whereas I was talking more about the general landscape.

- It's hard to do an apples-to-apples comparison for a library (like Abseil) that tries to avoid replicating what's already in the standard library, so those aren't necessarily the best examples to discuss here.

That said, OK, here's one trivial example. It is a major time sink to dig these up and write self-contained examples for the sake of argument (it took me 1 hour to write this entire comment), so I hope this example can be sufficient to get my point across.

So folly has FBVector and it supports custom allocators, right? OK, so just try making a vector with a custom pointer type:

  #include <vector>
  
  // #include <folly/FBVector.h>
  
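  // A minimal "fancy pointer": a wrapper around T* exposing the pointer
  // and random-access-iterator interface the allocator machinery expects.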
  template<class T>
  struct my_pointer
  {
   typedef my_pointer this_type;
   typedef T element_type, value_type;
   typedef ptrdiff_t difference_type;
   typedef this_type pointer;
   typedef value_type &reference;
   bool operator!=(this_type const &other) const { return !(*this == other); }
   bool operator==(this_type const &other) const { return this->p == other.p; }
   difference_type operator-(this_type const &other) const { return this->p - other.p; }
   explicit operator bool() const { return !!this->p; }
   friend this_type operator +(difference_type n, this_type me) { return me + n; }
   explicit my_pointer(T *p) : p(p) { }
   my_pointer(std::nullptr_t p) : p(p) { }
   my_pointer() : p() { }
   reference operator *() const { return *this->p; }
   template<class U> using rebind = my_pointer<T>;
   this_type &operator++() { ++this->p; return *this; }
   this_type &operator+=(difference_type n) { this->p += n; return *this; }
   this_type &operator--() { --this->p; return *this; }
   this_type &operator-=(difference_type n) { this->p -= n; return *this; }
   this_type operator+(difference_type d) const { return this_type(this->p + d); }
   this_type operator++(int) { this_type copy(*this); ++*this; return copy; }
   this_type operator-(difference_type d) const { return this_type(this->p - d); }
   this_type operator--(int) { this_type copy(*this); --*this; return copy; }
   value_type *operator->() const { return this->p; }
   reference operator[](difference_type d) { return this->p[d]; }
  private:
   T *p;
  };
  
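  // An allocator whose pointer type is my_pointer<T> rather than T*.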
  template<class T>
  struct my_allocator
  {
   std::allocator<T> base;
   typedef T value_type;
   typedef size_t size_type;
   typedef my_pointer<T> pointer;
   template<class U> struct rebind { typedef my_allocator<U> other; };
   pointer allocate(size_type n) { return pointer(this->base.allocate(n)); }
   void deallocate(pointer p, size_type n) { return this->base.deallocate(&*p, n); }
  };
  
  namespace std
  {
   template<class T>
   struct iterator_traits<my_pointer<T> >
   {
    typedef typename my_pointer<T>::pointer pointer;
    typedef typename my_pointer<T>::reference reference;
    typedef typename my_pointer<T>::value_type value_type;
    typedef typename my_pointer<T>::difference_type difference_type;
    typedef std::random_access_iterator_tag iterator_category;
   };
  }
  
  int main()
  {
  #ifdef FOLLY_CPLUSPLUS
   folly::fbvector
  #else
   std::vector
  #endif
    <int, my_allocator<int> > numbers;
   numbers.push_back(1);
   return 0;
  }
This compiles and runs fine with GCC, Clang, and MSVC (on, say, C++17).

But now try to uncomment the #include so that it uses folly and you suddenly get errors like this:

  /usr/include/folly/FBVector.h:148:27: error: no viable conversion from 'folly::fbvector<int, my_allocator<int>>::Impl::pointer' (aka 'my_pointer<int>') to 'int *'
            S_destroy_range(b_, e_);
  /usr/include/folly/FBVector.h:365:34: note: passing argument to parameter 'first' here
    static void S_destroy_range(T* first, T* last) noexcept {
Funny, so S_destroy_range is used internally, and it requires raw pointers. But who says my fancy pointers will even necessarily map 1:1 to a linear (and contiguous!) pointer space?

What we see here is that folly pretends to support custom allocators, but it cuts corners internally. (!) Which is awful not only because of the inflexibility, but because it misleads you, too. If they're introducing unfounded assumptions internally just out of sheer convenience, how am I supposed to trust the implementation? Heck, if I had implemented implicit conversions to raw pointers so that the code compiled, I might not have even discovered there's a latent bug in my program.

In contrast, in my experience, actual standard library implementations pay attention to the details and don't tend to cut corners like third party libraries do.

Now, again, keep in mind this is what we get with some of the best libraries, whereas I was talking about the general landscape, so the situation isn't even remotely this good on average.


I do appreciate the effort you went to for this, but you make a lot of very strong claims that depend on very obscure minutiae, and unfortunately they end up being false under careful scrutiny.

>But who says my fancy pointers will even necessarily map 1:1 to a linear (and contiguous!) pointer space?

The standard as of C++11 does. std::vector<T> provides the member function T* data() which is required to return a pointer to the first element of a contiguous memory region spanning the entire vector:

https://en.cppreference.com/w/cpp/container/vector/data

Note that it specifically returns a T* rather than a my_pointer<T>, or what is referred to in standardese as a "fancy pointer":

https://en.cppreference.com/w/cpp/named_req/Allocator#Fancy_...
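
That is, roughly (a minimal, self-contained sketch):

  #include <vector>
  
  int main()
  {
      std::vector<int> v{1, 2, 3};
      int* p = v.data();  // returns T*, not allocator_traits<A>::pointer,
      return p[2] - 3;    // so element storage must be genuinely contiguous
  }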

>What we see here is folly pretends to support custom allocators, but it cuts corners internally.

No such corners are cut. Folly has full support for custom allocators, including fancy pointers, and in this case it's giving you a compile-time error for what could have been undefined behavior. That's an advantage in my book; I'll take a compile-time error over hard-to-debug memory corruption any day of the week.

Fancy pointers to T are required to be implicitly convertible to T*, and while the small snippet of code you pasted doesn't exercise this requirement on GCC and Clang, MSVC does make use of it, and you can see via the link below that your allocator fails to work:

https://godbolt.org/z/7KrE41

You'll notice the error is that std::vector is trying to convert your fancy pointer into a raw pointer in order to call _Delete_plain_internal:

>C:/data/msvc/14.28.29910/include\vector(683): error C2893: Failed to specialize function template 'void std::_Delete_plain_internal(_Alloc &,_Alloc::value_type *const ) noexcept'

Other parts of the standard that depend on this are node-based containers such as std::list and std::set; std::allocate_shared also makes use of this requirement.

The fact is that writing a standard-conforming allocator that uses fancy pointers is incredibly difficult and error prone; the standard is not clear about the rules, with numerous defect reports related to fancy pointers, and as of today no compiler actually has full support for them:

https://quuxplusone.github.io/draft/fancy-pointers.html

The fact that you could have picked any example to prove your point about the standard library, and that the one you chose took you a very long time to produce, exercises a fringe corner case of a defective language feature, and is one you do not fully understand, is not a particularly convincing argument that the standard library provides high-quality implementations and APIs.

That may seem blunt and harsh, but you're in good company since I myself also don't fully understand it, and anyone who's being honest would also admit that fancy pointers in C++ are an area that quite possibly no one really truly understands.

Before jumping to the conclusion that Facebook got it wrong, they cut corners, and their code is not trustworthy over this, perhaps it might be best to pause a moment; this issue has nothing to do with Facebook and instead is merely a reflection of the sheer complexity of C++ as a language.


> The standard as of C++11 does. std::vector<T> provides the member function T* data()

I actually do believe data() is a defect in the standard. It should be returning 'pointer', not T*. In fact I believe the entire contiguity requirement is a defect, as there's no inherent reason for a vector to require physical contiguity in memory to begin with. Contiguity needs to be with respect to the pointer. Getting to that point is not trivial, though, given so many implementations have assumed raw-pointer-like behavior in the past, so that likely plays a role in why they use raw pointers in places like this one.

That said, you do have a point here, in that the standard requires physical contiguity in memory for std::vector, so this isn't a bug as far as memory corruption goes. In my rush to write up an example, I forgot about this with respect to std::vector, and I just assumed data() would return 'pointer' as would be common sense.

> Fancy pointers to T are required to be implicitly convertible to T*

Do you have a link? I'm failing to see this in [1] or [2]... is it cited elsewhere? Its existence would seem to render the addition of std::pointer_traits<Ptr>::to_address() rather redundant.

> Folly is giving you a compile time error for what would have potentially been undefined behavior.

folly is not giving me that error though... my own code is. By not defining an implicit conversion to a raw pointer. If I had added that, which many would, then I wouldn't have gotten the error.

> You'll notice the error is that std::vector is trying to convert your fancy pointer into a raw pointer in order to call _Delete_plain_internal:

We got different behavior because you're using the debug runtime (/MDd) and I wasn't (/MD). I tried it without and didn't get that. So yes, MSVC also gives an error with the debug runtime.

> the one you chose to produce took you a very long time to exercise

No, this is twisting what happened. It took an hour to write, not to "exercise" the 'corner case'. And it takes a long time to write only because C++ is so extremely verbose and it takes almost 100 lines of code just to implement a simple example around a trivial pointer wrapper. Practically everything in the example was basic boilerplate. And even then I later noticed I still missed other boilerplate like the < and > operators.

> The fact that you could have picked any example to prove your point about the standard library [...] to exercise what is a fringe and corner case

Oh come on. I literally said in my comment "they're often (read: not always) far more flexible and handle far more edge cases than open-source libraries do." You cherry-pick some of the absolute best C++ libraries out there as if they're somehow representative of the landscape, then force me to whip up a counterexample for you on the spot as if I have one lying around at my fingertips for every library. And when I nevertheless try to find something to give you an example of like you asked, you complain that it's... a corner case? Didn't I say that it's an edge case to begin with? And isn't this doubly ironic when you yourself cherrypicked libraries that were very much "edge" cases to begin with in terms of their high quality?

> fancy pointers in C++ are an area that quite possibly no one really truly understands.

This is a weird way to put it. This isn't something where C++ is just too complex for mortals to comprehend; it's something where the standard itself has shortcomings (like the ones you yourself linked to). The standard needs to simultaneously (a) provide some kind of generality and usefulness, while (b) attempt to address past issues in a mostly backwards-compatible way. Which is intrinsically hard because in the past it has made assumptions that probably shouldn't have in hindsight. All of which I'm more than happy to acknowledge; I never claimed the standard is flawless or that somehow easy to improve it when it might break a ton of old code.

What you need to realize about fancy pointers in particular is that part of the very reason they're under-utilized is their poor support, not because they're somehow fundamentally a "fringe" concept for people to want. (Unless you think nonstandard allocators are weird altogether, in which case I'm talking to the wrong person.) The commonly-cited use cases (like shared memory) are far more obscure than some fairly normal things you can do with them. For example, they're invaluable when you're debugging custom allocators; you can put things like bounds-checking into them to make sure the rest of your allocator is correct (in fact I'm pretty sure I was doing exactly this not too long ago). But to be able to do anything with them you need a container that is flexible and careful enough not to treat them interchangeably with raw pointers.

But this is digging too much into fancy pointers and missing what I was trying to say, and it's making me waste hours more on this than I ever intended to. I was saying, in general, with things that require more flexibility than the obvious implementation would imply, I tend to find better support among standard libraries than third-party ones. This statement obviously has nothing to do with Facebook or folly in particular; you just picked 3-4 extremely high-quality libraries and made me fit what I was saying into a mold that's already a cherrypicked outlier, so I tried to come up with something for you on the spot to get my point across. Whether you think it was a good example or not, it's very much missing a point I was making about 3rd-party libraries in general. We can literally even assume those 4 libraries are flawless, and even then it would hardly even matter for a discussion that's about the ISO standard and its impact on the average C++ library.

P.S. There was a typo in my example for 'rebind'; you need to put U instead of T. You'll need to fix that and other uninteresting stuff (like operator<) to address errors with other containers.

[1] https://eel.is/c++draft/allocator.requirements

[2] https://en.cppreference.com/w/cpp/named_req/Allocator


Basically, I asked you to give examples showcasing how the standard library handles flexible edge cases with high quality, and when I pointed out how the example you gave is fundamentally flawed, your counterargument is that, for your own example (and you could have picked anything): the standard is defective and goes against common sense; it takes 100 lines of code and an hour to implement a trivial example showcasing this flexibility; the standard has made assumptions it probably shouldn't have in hindsight; and a host of other reasons that basically showcase that the standard library isn't nearly as flexible or high quality as you made it out to be.

It was your example to give, and it turns out that even a basic example requires all this complexity, exposes all these defects, isn't standard compliant, and isn't portable across compilers.

You are certainly welcome to your opinion, and I doubt either of us is going to convince the other at this point... but I am fairly confident most sensible people would not look at the example you chose and think "Wow, what a flexible and powerful API the standard library provides; very high quality." They will come away thinking that your example is everything wrong with C++: it's convoluted, error prone, and incredibly fragile.


I am a rather seasoned database person at this point, but when I wasn't, when I was just getting into database interactions, I had an academic background in relational algebra and I knew that SQL was the main communication language. For day-to-day work this was fine, but at one point I was tasked with making our application DBMS-neutral, with support for running on top of SQL Server (T-SQL), PostgreSQL, and MySQL. My first thought on approaching this problem was: well, let's take a good look at the granddaddy documentation. So I attempted to find the SQL standard.

It disappoints me that, to this day, the best reference for pure SQL out there is the Postgres docs. Postgres is actually pretty good about calling out non-compliances, so you can get a really good grounding in what code is likely to be cross-platform compatible.

I 100% agree with Tim Sweeney's sentiment. ISO are terrible at their job.


I was curious where the actual SQL standard document is. With a quick search this is the best I can come up with:

Second informal review draft from the accepted answer: http://www.contrib.andrew.cmu.edu/~shadow/sql/sql1992.txt

Downloadable iso standards (C-f sql) http://www.iso.org/PubliclyAvailableStandards

https://stackoverflow.com/questions/1714461/ansi-sql-manual


It's way beyond comical right now. The PDF standard document had been available from evil monopolist Adobe's website for free up until version 1.7. Version 2.0 has been "freed" from Adobe and is now an "open" ISO standard that costs about €180. Guess what I'm reading when I need to work with PDFs...


About 15 years ago I worked at a small ISP, and one of my jobs was handling DMCA takedowns. Normally this was someone using BitTorrent or other P2P software, and I would call them up and tell them to knock it off, or put a note on their account and block them until they called in.

One time it was the IEEE. One of our customers was a college professor, and he was using his personal web hosting space (remember that?) to host his syllabus, including a PDF of one of their standards. I tried to get hold of him multiple times, but they kept contacting us, more aggressively than the RIAA or MPAA ever did. So I just deleted the file.


Aside from Tim making very good points, ISO 8601 is not a good standard: it tries to specify too many formats and ends up being so flexible that full compliance is rare. RFC 3339 is an open standard and is much simpler and more practical.
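
To illustrate the asymmetry, here's a minimal sketch that parses only the narrow RFC 3339-style profile (UTC, extended format, made-up example timestamp); a full ISO 8601 parser would also have to accept the basic format, ordinal dates, week dates, durations, and intervals:

  #include <ctime>
  #include <iomanip>
  #include <iostream>
  #include <sstream>
  
  int main()
  {
      // 2021-03-08T09:30:00Z is valid in both. ISO 8601 alone also allows
      // e.g. 20210308T093000Z (basic), 2021-067 (ordinal), 2021-W10-1 (week).
      std::tm tm = {};
      std::istringstream in("2021-03-08T09:30:00Z");
      in >> std::get_time(&tm, "%Y-%m-%dT%H:%M:%SZ");
      std::cout << (in.fail() ? "parse error" : "ok") << "\n";
  }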


I do think ISO 8601:2004, which is now ISO 8601-1:2019, is a reasonable standard. It never tried to become a computer data type and/or format standard; it is a simple extension of human-readable date and time formats, modulo ambiguity. Even infrequent formats like intervals are not complex and can be easily learned. By comparison, RFC 3339 covers only a very specific use case (internet protocols), and if we had only RFC 3339 but not ISO 8601 we would still be fighting over mdy vs. dmy vs. ymd order for human-readable dates. The only problem with ISO 8601 is, well, that it is not openly available, and Tim is very right about that.

ISO 8601-2:2019 [1] is however a complete mess. It is too complex for human consumption and too ambiguous for computing uses. I argue that it should be shelved as soon as possible.

[1] https://web.archive.org/web/20171020000043/https://www.loc.g... (draft standard)


> ISO 8601-2:2019 [1] is however a complete mess

Wow, you weren't joking.

It introduces things like "?2015-?02-31" (year and month are uncertain, day is known), which may be abbreviated as "2015-02?-31". 'The character '?' (question mark) is used to mean "uncertain". The character '~' (tilde) is used to mean "approximate". The character '%’ (percent) is used to mean “both uncertain and approximate".'

There's also "1950S2" (2 significant digits, so a year between 1900 and 1999, estimated as 1950). You could also write 19XX, though that has a slightly different meaning (leaving the last two digits unspecified).

There are special "month" values to denote other divisions of a year. "2001-21" is spring of 2001, 2001-22 is summer, 2001-33 is the first quarter.

"R/20150104T083000/PM15S00/FREQ=YR;INTR=2;BYMO=1;BYDA=SU;BYHO=8,9;BYMIN=30" is a fifteen minute time interval every Sunday in January at 8:30:00 AM and 9:30:00 AM, every other year


Genuinely curious to hear the supposed use case for this stuff from the original author.


It was created by the Library of Congress for bibliographic purposes, hence approximate and uncertain dates. I agree that they have their uses. However it is also apparent that 8601-2 is trying to amend 8601-1 to fill the gap, in my opinion very badly:

- Sub-year groupings are not very well defined and you may want to avoid them, but if you want the basic approximate and uncertain dates you can't avoid them, because they all belong to level 1. Ideally each extension to 8601-1 should have been defined separately; they do have to be described together to avoid conflicts among them, but any part of the additional standard should have been optional.

- 8601-2 essentially took the whole recurring-time-interval mechanism from iCalendar, and that is a simple textual format. Why not do the same for extended date and time instead of sigils (e.g. `2021-03-09;SIG=P1M`)? Unlike 8601-1, it doesn't have an absolute requirement for usability. I can only guess, but the editors seem to have forgotten to harmonize the different specifications.


Excluding the final monstrosity of an example in the GP post here — I think some of these would actually be useful in genealogical (family tree) research software — given the correct kind of UI.

Approximate dates, and unknown portions of dates are fairly commonplace in genealogy.

(No idea what the original author was thinking though — I wonder if the intended use cases are documented anywhere?)


Refugee children are often assigned birthday Jan 1 with an estimated birth year, for purpose of filling out immigration forms. More: https://www.racgp.org.au/afpbackissues/2008/200810/200810ben...

Though, I doubt many collection systems actually recognize this format for uncertain dates.


You're not wrong, but unfortunately a lot of libraries call it ISO 8601, and not RFC 3339. I think a lot of people probably mean RFC 3339 when they say ISO 8601 (myself included for a long time).


It's not just the ISO. ASME (American Society of Mechanical Engineers), IEC (International Electrotechnical Commission), PIA (Plastics Industry Association), etc. all have standards which they are supposedly trying to promulgate. There are a few standards that I've needed professionally which are freely available: the USB specs and the MIDI specs. https://usb.org/documents https://www.midi.org/specifications

When specifications needed to be printed and shipped, I understand that costs money, but electronic standards should be very low to zero cost to download.

I did find that DIN (Deutsches Institut für Normung, or German Institute for Standardisation) is starting to publish some of its standards for free. https://www.din.de/en/din-and-our-partners/press/press-relea...

In the US, anything that is published by the government is supposed to be free of copyright. https://www.govinfo.gov/about/policies


Keep in mind that the DIN is only publishing the DIN SPECs, not the DIN NORMs freely. That's not nearly as good as it sounds.


And IEEE for important standards like Verilog and VHDL.

And ANSI and 3GPP for all kinds of telecom.


The posted link took me straight to Tim's tweet, so I didn't realize at first that he was replying to @isostandards:

> Hello, unfortunately, the ISO Central Secretariat does not provide free copies of standards. All ISO Publications derive from the work and contributions of ISO and ISO Members that contain intellectual property of demonstrable economic value.

> For this reason, considering the value of standards, their economic and social importance, the costs of their development and maintenance, we and all ISO Members have the interest to protect the value of ISO Publications and National Adoptions, not making them publicly available.

The ISO standard that I have the most experience with is ISO/IEC 9899:1990, aka C90. Part of the reason for that, of course, is that I had a used copy of Herbert Schildt's Annotated ANSI C Standard; it's also a considerably smaller standard than, say, SQL or C++.

I'm of mixed minds as to the value of the C standard. There is certainly value in having a standard. After sufficient study and deliberation, you can usually determine whether an input program or compiler implementation is standards conforming. When I compare the evolution of C and C++ to say, Python and Rust, I have trouble pointing to the specific value that ISO adds.

This isn't really a fair comparison, because the difference between C/C++ and Python/Rust isn't just the process, but the end result. I judge C and C++ not just by ISO's efforts, but by those of Microsoft, IBM, GNU, LLVM, etc. Python and Rust, meanwhile, ship a working reference implementation, and do a pretty good job of it. Rust has improved quite a bit six weeks at a time. C standards, meanwhile, ship closer to every decade. Even the new rapid pace of C++ is every three years.


Aside from obstructing access, ISO produces really low-quality standards. The prose alone leaves much to be desired in terms of clarity and concision. Knowing that, I'm relieved each time I am to deal with an IETF standard instead, which wins on both fronts (quality and access).


> Aside from obstructing access, ISO produces really low-quality standards.

Low-quality standards are themselves an access issue, and I think they are reinforced by the paid-access rule, since it means the people reading the standards are a narrower group, more specialized in dealing with ISO standards and more prone to becoming blind to "that's just the way ISO standards are".

ANSI (also paid access) standards such as the whole ASC X-12 suite widely used in industry and a portion of which is mandated federally for healthcare under HIPAA are a complete nightmare mess, too. (And it doesn't help that not only are they paid-access standards, but they often incorporate by reference literally hundreds of other, largely paid access, standards from other bodies, some of which have become obsolete and are no longer available when the standard referencing them remains mandated.)


ISO doesn't write the text of the standard, that is done by members of the individual Technical Committees and Working Groups.

I am currently a member of TC184/SC4 and have also been a member of TC22/SC31.


I hope you don't mind my asking this, but as a member of technical committees, do you ever get compensated for your time and expertise? Or is it simply a case of being paid by your employer to fulfill this role? Does any of the income from standards access ever make its way back to you, your employer, or other members?


None of the income from the sale of standards gets back to their creators. The model is really no different from how publishers control access to academic journals; you don't get paid directly for writing a standard, or for reviewing and writing ballot comments on ones written by other people.

For the TC22/SC31 work I was just being paid by my employer and they provided the travel costs to meetings.

The scope of TC184/SC4 is much bigger, the core people work on it pretty much full time and get paid by industry consortia to deliver something that will meet their needs. Some parts have been written by postgraduate students who got their PhD out of the work.


ISO is in charge of forming these WGs/TCs and taking their output. When ISO WGs/TCs do a poor job, and ISO approves it, it's ISO's fault.


Anyone who wants to contribute can join a WG. Voting on whether to approve something is done by member nations. The ISO central secretariat does contribute editorial ballot comments but the WG members can reject them during the ballot review.


So what is "ISO", if not the sum of its "Technical Committees and Working Groups"?

If they write the text of the standard, that means ISO writes the text of the standard.


Would you claim that Elsevier is the sum of everyone in a university?


And you really do need the standard.

I implemented the entirety of ISO 8601 years ago (not just the most common 5% that most people are aware of) -- parsing, representation, writing, validation, and arithmetic -- and there's no way I could've guessed some of the stranger cases the standard explicitly supports, and how, without being told what they are.

I hope I never see some of the truncated and reduced precision forms in practice, since perhaps no one but the handful of people with access to the standard will be able to interpret those correctly. :)
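
For the curious, here is roughly what some of those forms look like (my paraphrase of commonly cited examples; the authoritative definitions are, of course, behind the paywall):

  #include <cstdio>
  
  int main()
  {
      // Reduced precision: trailing components omitted.
      std::puts("1985-04");  // April 1985, day unspecified
      std::puts("1985");     // just the year
      // Truncated forms from older editions (e.g. ISO 8601:2000): leading
      // components omitted and implied from context.
      std::puts("--04-12");  // April 12 of an implied year
      std::puts("---12");    // the 12th of an implied month and year
  }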


I actually had an issue like that recently, where our code didn't handle a special case in ISO 8601 correctly. I managed to obtain ;) the standard and could fix our code, but if I hadn't obtained the standard, I'd have had to just close the issue with "WONTFIX".


ISO's argument seems to be that the fees fund the actual standardisation process – trips to conferences, meetings and other "technical work".

I could agree for something like the international definition of physical quantities, where people have to do actual experiments, or travel to see equipment or even transport standard items.

But in software design, often when you tip more resources into a project, you get a worse result. This is true for the C++ standard. Some necessary improvements occur (modules, finally, albeit incomplete with respect to build details), but as long as a large group of people meet to evaluate each other's proposals in a series of desirable resorts, the language will continue to grow in size and complexity whether or not it actually needs to. For example, adding "xvalues" to "lvalues" and "rvalues" may solve some earlier design problem in the standard library, but it makes the language less understandable for users and implementers and the standards text more opaque.

Indeed, the standards language for C++ is so inscrutable one suspects that you need to be a full-time expert to fully understand it. It then becomes a career for the people involved in the standardisation. Now they have an incentive for the language to grow without bound, and the more complex and hard to understand it becomes, the better.


As far as I know the fees don't even pay for most of that stuff. Committee members have to pay to be part of a committee, usually paid by their employer. The time actually spent developing the standard is on the members' employers' payroll as well.


This is definitely how it works with the IEC - I can't imagine the ISO are any different.


Sometimes the standard is even split into different norms. For example, the EV charging cable is defined in IEC 62196-{1..6}, which in turn reference other standards. So you finally need to buy at least 10 PDFs to understand the darn thing...


Private standards are also how you end up with a variety of implementations that are all non-conforming in their own subtle ways. Some because the authors didn't have access and were attempting to do the best they could with documentation from other implementations, third party articles, and reverse engineering. Some because the authors have the standards, but the consumers of their products don't and won't hold the authors accountable since they don't know how it is really supposed to work anyway. Others because they think there is competitive advantage in deviating from the standard.

It's hard enough getting consistent behavior out of multiple implementations of the same public standards.


I would at least start by distributing for free all standards targeted by a legislation.

Because those suckers cross-reference each other like crazy, a standard can have only a few paragraphs of useful content (and pages and pages of legalese and revision management around them).


Speaking as someone who is on an ISO committee, this topic of discussion comes up every couple years.

The bottom line is that there are and always will be costs associated with running an organization, regardless of whether it is for-profit or non-profit. There are overhead costs associated with running a website, conferences, technical review, or collaboration tooling, for example. While some revenue comes from participants and membership fees, not everything gets covered. The organization does a lot for the public good, and it's unacceptable to criticize these passionate engineers who have dedicated their careers to ensuring proper standards globally.


Tell that to the IETF, who publishes thousands of standards for free, standards that made your comment on this forum possible.

Open standards. Open research. Open code. This is how we progress as a society. Rent seeking organizations like ISO can rot in the dirt for all I care.


Sorry, but you're only kidding yourself if you think you can even compare an RFC with an ISO publication. There's rigor, formality, lawyers, technical writers, and many others involved. An RFC is exactly what it sounds like and what it was originally intended to be: a "Request For Comments". There's no conference or central body (at least to the degree that there is for ISO) and it's mostly done in an ad-hoc manner. Nothing against the IETF or RFCs, but you're really comparing apples and oranges. I've been a coauthor on two RFCs and it is nothing like an ISO committee.


Do you mean the same IETF that standardizes technology covered by Cisco patents? These guys are a joke... Heck, there is even a song about it... https://www.openbsd.org/lyrics.html#35


Surely making a standards organization entirely dependent on subsidies by forbidding it from selling literally the results of its work in a free economy isn’t going to backfire in any way…


We have to make money somehow. I don't understand how people don't understand why this is the case. If you increase the cost of other things then organizations and people won't want to be part of the committees. A lot of the revenue comes from organizations purchasing the work.


ISO should be a division of the UN or something like that and get a steady budget to pay its staff. Not a regular private company that "has to make money somehow".

At the very least, ISO could make the cost to get a copy of a standard not some arbitrary 3-digit number of Swiss Francs, but let the committees themselves have a say in this.

In my opinion, for example, it would make perfect sense to make royalty-free specs (i.e. standards that only have Type-1 declarations) available online for free, while charging 4- or 5-digit numbers for patent encumbered specs (standards with Type-2 declarations) for which those who need access to those will have to pay 7- or 8-digit numbers in royalties anyway, so it's still an insignificant extra obstacle for them.

The goal of ISO should be to promote international standardization. Its current business model is instead creating financial obstacles to adopt international standards, as well as creating financial incentives to base implementations on outdated drafts and old versions, which defeats the purpose of having processes to amend, correct and extend standards.

Other standardization organizations like IETF, W3C, ITU are capable of "making money somehow" without having to put their standards behind a paywall.

Standards are a lot like laws, except you in theory voluntarily comply with them (though not always — there are plenty of cases where compliance to ISO standards is obligatory in one way or another). It would be insane if politicians would put the laws they write behind a paywall, for sale to lawyers at 118 Swiss Francs per article, with the argument that "judges need to make money somehow". If you expect people to respect the law, the least you can do is to let them actually see the law, for free. Why would it be any different for international standards?


> ISO should be a division of the UN

Heeellll no!

An international co-op, fine, but please not part of political committees such as the UN!


Yep. This is part of the reason I think a world where ISO is more or less self-sustaining, rather than subsidized by/part of some other entity that would be really tempted to influence it, is probably a better world.

In any case, ISO WGs tend to publish final drafts openly. Purchasing the actual standard document is not something required if you are just experimenting or playing around (and if you’re not, you probably can spare some change to support the organization).


There always have been and will always be international politics at play in industry standards organizations, regardless of whether you make it part of the UN or not.


And will there be more, or less, as part of the UN?


Probably roughly the same.


> it's unacceptable to criticize these passionate engineers

Sorry AN_ISO_Dude, but whatever merit someone/group has doesn't exempt them from valid criticism.

Show us the receipts and how these standards couldn't exist without paywalls, and I (we) would be a lot more sympathetic - as it stands, the argument here is purely qualitative.


I know a whole bunch of people who are on an ISO committee, and not a single one of them is happy that their hard work will be locked behind a paywall.


Most (if not all) people on the committees don't like the paywall. The accountants have to explain every couple of years why it's necessary. Then people seem to forget and we have to do it all over again. It's cyclical.


But other standards organisations do just fine providing free standards, including fairly large and complex standards, so it doesn't seem particularly necessary at all. Hell, some ISO standards are freely available, presumably because standards that don't get implemented due to lack of access are kind of useless.


What would be a comparable organization to the ISO?


The ITU, for example.


Firing the accountants and all the admin staff would be a good start to save money. No need to save these jobs...


This has bugged me for a long time.

My company complies with HITRUST. Many of the HITRUST controls are syntheses of controls found in NIST, ISO, and other organizations.

In some cases, I want to know where a HITRUST control comes from: But since I can't look at ISO without paying, I am blocked from understanding the provenance of some HITRUST controls.

I don't like that.


This has always frustrated me. I sit on a number of IEC working groups as a technical expert (as part of TC 80 for maritime equipment). None of us get paid by the IEC (which is fine), but we all donate either our own or our company's time to attend meetings and produce input papers, etc. Meeting rooms tend to be provided by either national standards bodies (like BSI here in the UK) or companies who are providing technical experts to the meeting. In pre-COVID days the meetings were face-to-face, which meant somebody (not the IEC!) had to stump up for travel and accommodation as well.

Even as members of these working groups writing standards that refer out to other standards, we don't get free access to these and have to pay for them. It's crazy.

I'm really proud of the work I've done over the years helping to get these standards published, but it's really annoying that there's such a barrier to entry for people who want to use them to produce compliant equipment.


As someone who has experienced a project being figuratively knee-capped in the research phase by this money grab, you would think the ISO organization itself would clearly and quite obviously see that its very own actions depress standards adoption. Catch-22, anyone?


I went to write a SQL-to-AST parser; I was absolutely shocked to discover you had to purchase the SQL standard.
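
For what it's worth, the shape of the problem doesn't require the paywalled document at all. Here is a toy sketch of parsing a tiny SELECT subset into a dict-shaped AST (purely illustrative; nothing like the actual grammar in the standard):

    import re

    def parse_select(sql: str) -> dict:
        # Toy SELECT-list/FROM parser returning a dict-shaped AST.
        # Purely illustrative; not the grammar from the paywalled spec.
        m = re.match(r"\s*SELECT\s+(.+?)\s+FROM\s+(\w+)\s*;?\s*$", sql, re.I | re.S)
        if not m:
            raise ValueError("unsupported statement")
        return {
            "type": "select",
            "columns": [c.strip() for c in m.group(1).split(",")],
            "from": m.group(2),
        }

    print(parse_select("SELECT id, name FROM users"))
    # {'type': 'select', 'columns': ['id', 'name'], 'from': 'users'}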


IMHO the fact that it costs money to read the ANSI SQL standard is the biggest reason there's never been any work toward a standardized SQL wire protocol.

(At this point Postgres’s wire protocol is becoming the de-facto wire protocol for smaller players to make themselves compatible with, but that’s still a long ways from being able to talk to every major enterprise DBMS with the same codec-layer library, rather than needing ODBC-style codec plugins.)

You need tons and tons of people to work together on a cross-implementation standardization effort like that (including, hopefully, lots of volunteer FOSS contributors); and all those people would need to be referencing the standard constantly. It’d just be untenable if every one of them needed to pay to see the standard they were coding against.
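
To illustrate the contrast: the Postgres wire protocol is documented for free, so anyone can build the startup handshake from the docs alone. A minimal sketch, based on the public protocol documentation (the localhost connection at the end is a hypothetical usage, assuming a server on port 5432):

    import socket
    import struct

    def startup_message(user: str, database: str) -> bytes:
        # PostgreSQL 3.0 StartupMessage: Int32 length (self-inclusive),
        # Int32 protocol version (3 << 16 = 196608), then NUL-terminated
        # key/value pairs, closed by a final NUL byte.
        body = struct.pack("!i", 196608)
        for key, value in (("user", user), ("database", database)):
            body += key.encode() + b"\x00" + value.encode() + b"\x00"
        body += b"\x00"
        return struct.pack("!i", len(body) + 4) + body

    # Hypothetical use against a local server:
    # s = socket.create_connection(("localhost", 5432))
    # s.sendall(startup_message("postgres", "postgres"))
    # print(s.recv(1))  # b'R' = an authentication request follows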


I don't know for which technical domain he demands open access to ISO standard texts (many ISO standards can, in fact, be freely accessed [1], while others are accessible as pre-final, nearly identical versions), but since ISO is essentially a complicated joint organization of national standardization bodies, I'm guessing that changing treaties, getting member votes, etc. would be difficult at this point. There's also the issue of spec editors selling books containing the spec text with commentary as financial compensation for their work.

[1]: https://standards.iso.org/ittf/PubliclyAvailableStandards/in...


I have had the hardest time getting other developers to understand standards because of this. If the standards are walled off behind some privileged access, how can the public, or users thereof, understand, much less contribute to, said standards? There has to be a better mechanism.


I was not aware people buy ISOs at all. I thought everyone pirated them and only companies paid the actual price, same as with software licenses.


This is the engineering equivalent of scientific journals.


It is a surprise even to see the title. This is the fundamental reason why the IETF beat ISO. If you wanted to implement X.25, FTAM, etc., you were f... sorry, out of luck. I still remember that my university had only an outdated X.25 spec because it couldn't pay for the update. And who on earth would dare to write anything about ASN.1, other than the supervisor of your post-doc?

A long time ago there was a guy going around the world talking about how his book on DECnet, and a new networking standard called "internet", were doing. He showed a bit of how hard it was to get hold of ISO standards.

Those were the days, I thought... but they still have not learned.


> This is the fundamental reason why the IETF beat ISO

Like by making a patent-covered tech a "standard"? https://www.openbsd.org/lyrics.html#35


Is there going to be a Sci-Hub for specs? Spec-Hub? This seems to be the buildup toward that.


Even if you get past the paywall, you'll be presented with incredibly obtuse standards.

I tried to use ISO 1016 for writing a software design doc with some success, but it was like pulling teeth.

First, you needed multiple ISO dictionaries to find out what half the words are referring to, and even then things are ill-defined.

For example, one of the required sections in an ISO SDD is the Context, but nowhere is context defined or described.

The standards just seem like a web of academic garbage with no connection to reality.

Woe to anyone that must implement them as part of their job.


Agree. Usually the parties developing a standard have an incentive to make it difficult for others to implement. They can claim it's open while still limiting competition.

I remember when the first HTTPS spec came out, and everyone, including me, was saying how hard it was to follow. And someone from Netscape came out and said the text was "intentionally terse".


I haven’t seen anyone ask this question yet: do we LOSE anything by removing these fees? Does the incentive structure change in any meaningful way? i.e. do we need these fees in order to incentivize the production of quality standards?

I agree that the fees seem like annoying gatekeepers, and antithetical to the purpose of standards. But if we remove them, where do the economic incentives come from?


I vaguely remember ISO wanting to make their standards freely available a couple of years back, but the BSI (British Standards Institution) blocking the move because it conflicted with their business model.

I can't find a reference to it though and it was only briefly announced during an ISO meeting. Is there someone from ISO who can back this up?


IETF works around ISO's paywall by including necessary info in their free RFCs. For example, TLS uses X.509 certificates which use "ASN.1 object identifier" numbers from a non-free ISO/IEC/ITU document [0] [1]. IETF includes the required and optional numbers in the appendix of their free RFC on X.509 [2].

[0] https://www.iso.org/standard/80321.html

[1] https://www.itu.int/rec/T-REC-X.520

[2] https://tools.ietf.org/html/rfc5280#section-4.1.2.4
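
As a concrete illustration, the OID arcs that RFC 5280 lists are enough to DER-encode the identifiers without ever opening the ITU document. A small sketch (the attribute OIDs below are the well-known X.520 ones from the RFC's ASN.1 module):

    def der_encode_oid(oid: str) -> bytes:
        # DER OBJECT IDENTIFIER: tag 0x06, length, then 40*arc1+arc2,
        # followed by the remaining arcs in base-128 with continuation bits.
        arcs = [int(a) for a in oid.split(".")]
        body = bytes([40 * arcs[0] + arcs[1]])
        for arc in arcs[2:]:
            chunk = [arc & 0x7F]
            arc >>= 7
            while arc:
                chunk.append((arc & 0x7F) | 0x80)
                arc >>= 7
            body += bytes(reversed(chunk))
        return bytes([0x06, len(body)]) + body

    # Well-known X.520 attribute types, as listed in RFC 5280:
    X509_ATTRIBUTES = {
        "commonName": "2.5.4.3",
        "countryName": "2.5.4.6",
        "organizationName": "2.5.4.10",
    }

    assert der_encode_oid(X509_ATTRIBUTES["commonName"]) == b"\x06\x03\x55\x04\x03"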


I was certain that the Ada language spec - an ISO standard - was made freely available. I was mistaken. They only make an old version freely available. [0]

I get the impression the standardisation of Ada has been broadly successful in providing assurance of conformity, [1] but that still doesn't excuse paywalling the document.

[0] https://en.wikipedia.org/wiki/ISO/IEC_8652

[1] https://www.adaic.org/ada-resources/standards/ada-95-documen...


As an editor of ISO/IEC 18181 (JPEG XL), I politely yet strongly disagree with this ISO policy of paywalling standards. It is an adoption obstacle and an incentive to use outdated drafts.

The core of the problem is that we somehow put a private Swiss company in charge of international standardization. Of course that company needs to make money _somehow_ to pay its staff. But that would better be done via government subsidies than via selling specs.

In a way, this reminds me a lot of academic publishing. It's not the publishers who write the papers, yet they get to paywall science.

We need open access standards, just like we need open access scientific journals.


German DIN / Beuth Publishing does the same. Even for students, the only option is to visit a branch of the publisher. I can't comprehend how they are still considered a 'standard'.


That's pretty wild. If the standards aren't open-access how am I supposed to know if someone who claims to be following them is actually doing so?


Can we frame this as an equity or social justice issue, so that we can leverage the social media MaaS (Mob as a Service) infrastructure to effect change?


Yes, which is why so many standards are not ISO.

Some are, to be certain. Especially standards where a lot of money is riding on all the adopters agreeing on compliance, so there's a certain benefit to the "money where your mouth is" aspect of the service ISO provides.

But plenty of highly influential standards are not ISO.


I think that’s one of the reasons why Linux doesn’t have a multimedia stack on par with MS Media Foundation or Apple Core Video: https://en.wikipedia.org/wiki/MPEG-4#MPEG-4_Parts


FFmpeg/libavcodec is already available.


Technically it is; practically, it's less than ideal. Example: https://www.willusher.io/general/2020/11/15/hw-accel-encodin...

That's for a $50 computer with 45 million copies sold, with Linux being the only OS supported by the manufacturer. On other computers it can be much worse.

By comparison, install Windows on a computer, install a GPU driver, and Media Foundation API will use the hardware to decode and encode all supported formats, no recompilation needed.


Access to hardware acceleration has absolutely nothing to do with the standards of the formats themselves. Clearly, since open implementations exist, open source developers know how the formats function. The problem with HW accel is that hardware vendors keep the details of how to access the hardware's functions secret, or otherwise locked behind proprietary code. But the same can be true of completely open standards like AV1.


> Access to hardware acceleration has absolutely nothing to do with the standards of the formats themselves.

I have implemented accelerated video playback myself https://github.com/Const-me/Vrmac/tree/master/VrmacVideo and I disagree. I needed a lot of these standards, not just for containers and audio, but for video decoding too.

> since open implementations exist, open source developers know how the formats function.

These formats are way more complicated than you think they are. Just because open source implementations decode/encode something doesn't mean the implementation is standard-compliant. Mine certainly is not.


Not true. On x86 there is VDPAU/VAAPI for accessing HW encoders and decoders (GPU/CPU), and it works perfectly. FFmpeg and players support it out of the box on Arch Linux. HW encoders produce much worse quality than CPU encoding, but for whoever needs HW encode/decode, there is no problem accessing them on Linux!


That article is from 2021: https://mastransky.wordpress.com/2021/01/10/firefox-were-fin...

Note the wording like “for some Linux users” and “restricted to AMD/Intel graphics cards”. Also check the comments; there are many other issues, with 4k on Intel GPUs, different desktop managers, and so on.

I think if these MPEG4 standards were freely available, Linux kernel developers would come up with something better. For instance, a V4L2 API portable across hardware, with a higher-level wrapper around it.

If you look at MS Media Foundation docs, you’ll see portions of the API surface (like h264-specific media type attributes, or the bit stream, or the APIs for dealing with containers) taken directly from these ISO/IEC specs.


You need something pluggable? It looks like GStreamer is the answer from the FOSS community.


I need something working and reliable. Patching code and re-compiling things is not the best UX.


And it's an industry standard, and best in its class. Sounds like someone who hasn't used Linux since the 00's.


Yep, hence GNU C extensions. Many are good and solve deficiencies in the language. I wish there were more collaboration between implementations before any became widespread in use, but I can appreciate avoiding design by committee.


I've recently implemented an AVIF encoder (built on ISO HEIF), and working with the specs was quite unpleasant. It's a tower of ISO standards built on top of other ISO standards; with each level they get older, broader, and less relevant to the original format they were supposed to serve, so the sum of all these specs is something bloated and overcomplicated.

A key spec in this pile was paywalled, so I reverse-engineered an existing implementation instead. And guess what? The implementation I used as a reference wasn't compliant with the paywalled standard either. So now AVIF-in-the-wild is not compatible with ISO-AVIF. Great job, ISO!
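
For those unfamiliar with the layering: AVIF/HEIF sit on the ISO Base Media File Format, whose basic unit is a box with a 4-byte big-endian size and a 4-byte ASCII type. A minimal sketch of walking the top-level boxes (illustrative; it skips the size == 0 "extends to end of file" case and other corner cases, and the file name in the usage comment is hypothetical):

    import struct

    def iter_boxes(f):
        # ISO BMFF box header: 4-byte big-endian size (including the
        # header itself), 4-byte ASCII type; size == 1 means an 8-byte
        # "largesize" follows. size == 0 is not handled in this sketch.
        while True:
            header = f.read(8)
            if len(header) < 8:
                return
            size, box_type = struct.unpack("!I4s", header)
            header_len = 8
            if size == 1:
                size = struct.unpack("!Q", f.read(8))[0]
                header_len = 16
            yield box_type.decode("ascii", "replace"), size
            f.seek(size - header_len, 1)  # skip the payload

    # with open("image.avif", "rb") as f:        # hypothetical file
    #     for box_type, size in iter_boxes(f):
    #         print(box_type, size)              # e.g. ftyp, meta, mdat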


Is it even legal to require standards compliance in the law? I thought the law has to be open and available to everyone.


If a standard costs a monthly wage in most of the world, it's not a useful standard.


"Everything should be free except the things I make" Twitter is at it again.


Is there an example of something akin to ISO standards but on GitHub?


Having a paid standard devalues the entire concept of 'standards' and instead encourages non-standard approaches, just as having a secret law would promote non-compliance with it.


I believe that the largest group of related ISO standards in terms of number of pages is ISO 10303 [1]. I suspect it could have achieved wider adoption if it were not paywalled.

[1] https://en.wikipedia.org/wiki/ISO_10303
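
For context, ISO 10303-21 ("STEP Part 21") defines a plain-text exchange format whose DATA section is a list of numbered entity instances. A toy sketch of pulling those instances out with a regex (illustrative only; real Part 21 files contain multi-line records and complex entities that this ignores):

    import re

    # Matches simple entity instances in a Part 21 DATA section, e.g.
    # "#1=CARTESIAN_POINT('',(0.,0.,0.));". Multi-line records and
    # complex (multi-type) entities are ignored by this sketch.
    INSTANCE = re.compile(r"#(\d+)\s*=\s*([A-Z0-9_]+)\s*\((.*)\)\s*;")

    def parse_instances(text: str):
        for m in INSTANCE.finditer(text):
            yield int(m.group(1)), m.group(2), m.group(3)

    sample = "#12=CARTESIAN_POINT('',(0.,0.,0.));"
    print(list(parse_instances(sample)))
    # [(12, 'CARTESIAN_POINT', "'',(0.,0.,0.)")]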


Healthcare has long been the same way: HL7 was paywalled ($1000+/yr membership) until 2012. X12 is still in the thousands for membership.


30 years too late.


I am glad that this is getting attention. Open access has been a discussed issue for academic journals for some time already, but the locked-down access to standards has received little attention. In addition to ISO, the IEEE, SAE, and NMEA all have their standards behind paywalls. Even ASN.1 was a paywalled standard for many years, long after it was adopted for the RSA PKCS standards. This is incredibly frustrating.

I remember sending cheques to "Global Engineering Documents" in Englewood, Colorado to get printed copies of various standards back in the 1990s, and hoping that would die with the advent of the Internet and the possibility of cheaply distributing information. In the world of paper it was understandable that some obscure technical document would be expensive. The publishing model was retained even though the reason for the expense was eliminated.

It has been encouraging that people like Carl Malamud have been working at making various aspects of our laws, regulations and standards public but more work is needed.

Some of my tweets over the years on the topic of open access standards:

"I am really thrilled by sudden attention on paywalled standards. Current model hurts standardization. So how about it @SAEIntl, @IEEESA, @NMEA_org, @isostandards, @ITUstandards will you join the 21st century and move to free open access standards? Alternative is your irrelevancy." https://twitter.com/mjduigou/status/1369033695030513670

"It annoys me that the IEEE standard for publishing test results probably won't be adopted by software industry because it is behind a fricking paywall https://ieeexplore.ieee.org/document/8662798" https://twitter.com/mjduigou/status/1273718847497887744

"I don't like that @IEEESA or @SAEIntl put their standards behind paywalls. Whatever revenue this publishing model makes is grossly offset by the impairment to, you know, standardization." https://twitter.com/mjduigou/status/1308889681187221505

"How many bullshit encodings have been created since 1984 because ASN.1 wasn't a freely available standard? Not that it is perfect but SO MUCH PAIN could have been avoided if there had been community adoption. That adoption didn't happen because it was not an open access standard." https://twitter.com/mjduigou/status/1308892326098546691

"Free the codes! https://www.kickstarter.com/projects/publicresource/public-s... Only slightly worse than patent trolls are public standards behind paywalls." https://twitter.com/mjduigou/status/384767046753320962


Let's protest by switching to YYYY-DD-MM.


Recently I was curious as to how the wireless emergency alert systems work. But the actual specifications [1] from ATIS (another standardisation body) that carriers are required to implement are paywalled with ridiculous prices. $145 for a 35 page PDF is way too much, and makes the whole system way less transparent.

[1] e.g. https://www.techstreet.com/standards/atis-0700036-v002?produ...


I see a nice parallel with the concept of OSS. All these paywalled standards are like closed-source software, in the sense that they are not created under terms that ultimately protect the freedoms of the final consumers (those who end up reading the PDF).

The ISO business model is creating something (standards) and then profiting from their distribution.


Seems a bit dramatic.


The Fortnite money printer has made it possible for Sweeney to tilt at as many windmills as he likes.


In a time of vast databases of paywalled scientific papers you're telling me nobody has a collection of ISO standards?


LibGen has a section for standards, but I’ve rarely found it useful.

A better hack is to approach your local standards body. BIS in India publishes the ISO standards mostly-as-is as adopted by Indian Law/Regulations. They don’t make it easily accessible on the Internet, but it’s available if you ask them for it.


If there is, I haven't found it yet. Many national standards I've seen don't seem to even have DOIs assigned.

My tiny stash of climbing relevant EN safety standards only exists because of a scrappy Russian climbing community web server someone uploaded to.


There are torrents of related ones floating around, although they tend to be huge and not very seeded. For computing-related ones, especially popular subjects, you can often find a copy someone has hosted sneakily amongst other unrelated documents if you search hard enough.


Most would likely not even understand them as they are written in the best example of newspeak (from '1984') I have ever seen.


This dude needs to learn about libraries.


What library contains copies of ISO standards?


University libraries


And everybody in the world is a student at some university?


Dear Tim Sweeney,

Get together with Unity and open source your engines so we can drive towards a standard.

What’s actually interesting is content. Why should we developers be locked into walled gardens?

Thanks


There's something beautiful about watching a billionaire (or almost one) tell the ISO Central Secretariat "That's dumb."


What does their bank account have to do with anything...?


If there is anyone on earth who could reinvent the internet, it is Tim Sweeney. The metaverse is coming, and it will exist because of him. Think I'm being hyperbolic? Go listen to his SIGGRAPH 2019 talk.


It will obviously not ‘exist because of him’.

It wasn’t his idea, and thousands of people have worked on the technologies to enable it, and he’s not even the only billionaire with a company who is working on such a thing.

For the most part I see him complaining about people impeding him, rather than just solving the problems with his own resources.


Saving you a click... since I, for one, had no idea who Tim Sweeney was.

"Timothy Dean Sweeney (born 1970)[4] is an American video game programmer, businessman and conservationist, known as the founder and CEO of Epic Games and as the creator of the Unreal Engine, a game development platform."

https://en.wikipedia.org/wiki/Tim_Sweeney_%28game_developer%...

"Tune in to hear Tim Sweeney (founder and CEO, Epic Games) during his SIGGRAPH 2019 Talk, “THRIVE: Foundational Principles & Technologies for the Metaverse.” You’ll hear Sweeney posit the state of the games industry, the rise of creator-centric and social gaming, what the “metaverse” really is, and what it will take to build the medium on a larger scale."

https://blog.siggraph.org/2019/10/siggraph-spotlight-episode...



