The point is that there are a lot of other things that easily become a problem if you do them yourself instead of using known-good implementations.
> We never take a software methodology, school of programming or some random internet dude's "manifesto" at face value. Rules must be broken, when necessary.
In this specific case it seems particularly necessary. I don't think I will take this manifesto at face value.
In some conditions, yes. You need a cluster of points with good reflectivity and coherence properties at microwave frequencies, stable over some time (months to years). Man-made steel and concrete structures, like bridges, houses, dams, etc., usually work very well.
You can't measure their absolute position to the millimeter, but with some interferometry techniques you can measure their movement to the millimeter range, relative to nearby points. Some variation of https://www.sciencedirect.com/science/article/pii/S092427161... was likely used in that work; I've seen it done for many other structures (and I even tried to set up a pipeline for doing that for commercial customers, but in the end we didn't manage to get anybody to fund us).
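For intuition, the underlying relation is the standard one for radar interferometry, not anything specific to that paper: with radar wavelength \lambda and interferometric phase difference \Delta\varphi between two acquisitions, the line-of-sight displacement of a point is (up to sign conventions)

    d_{LOS} = \frac{\lambda}{4\pi} \, \Delta\varphi

With a C-band satellite (\lambda \approx 5.6 cm) a full 2\pi phase cycle corresponds to about 2.8 cm of motion, and since the phase of a stable scatterer can be resolved to a small fraction of a cycle, relative movements come out at millimeter precision.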
You can probably get better measurements with an onsite survey, but using satellite data has the advantage that with a handful of satellites you can map an entire country once every week or two, and after throwing some computing power at it you can theoretically monitor all the bridges and houses at once and get early predictors of possible problems.
These case studies give you a hint of what can be done: https://www.sarproz.com/case-studies/ (I'm not and never have been affiliated with that product, just linking some cool pages).
The DirectX specs are much better than both the OpenGL and Vulkan specs because they also go into implementation details and are written in 'documentation language', not 'spec language'.
If you search for the 'D3D12 spec', what you actually find is that D3D12 doesn't have a specification at all. D3D12's "spec" is a document that only states the differences from D3D11. There's no complete, holistic document that describes D3D12 entirely in its own terms. You have to cross-reference back and forth between the two documents and try to make sense of it.
Many of D3D12's newer features (Enhanced Barriers, which are largely a clone of Vulkan's pipeline barriers) are woefully underspecified, with no real description of their precise semantics. Even finding out whether a function is safe to call from multiple threads simultaneously is quite difficult.
I don't think that going into implementation details is what I would expect from an interface specification. The interface exists precisely to isolate the API consumer from the implementation details.
And while they're much better than nothing, those documents are certainly not a specification. They are individual documents, each covering a part of the API, with very spotty coverage (mostly focusing on new features) and an unclear relationship to one another.
For example, the precise semantics of ResourceBarrier() are nowhere to be found. You can infer something from the extended barrier documentation; something is written on the function's MSDN page (with vague references to concepts like "promoting" and "decaying"); something else is written on other random MSDN pages (which you only discover by browsing around; there are no specific links). But at the end of the day you're left to guess the actual assumptions you can make.
*EDIT* I don't mean to say that the Vulkan or SPIR-V specification is perfect either. One still has a lot of doubts while reading them. But at least there is an attempt at writing a document that specifies the entire contract between the API implementer and the API consumer. Missing points are in general considered bugs and sometimes fixed.
> I don't think that going into implementation details is what I would expect from an interface specification.
I guess that's why Microsoft calls it an "engineering spec", but I prefer that sort of specification over the Vulkan or GL spec, TBH.
> The interface exists precisely to isolate the API consumer from the implementation details.
In theory that's a good thing, but the GL spec, at least, was quite useless in practice because concrete drivers still interpreted the specification differently, or were just plain buggy.
Writing GL code precisely against the spec didn't help with making that GL code run on specific drivers at all, and Khronos only worried about their spec, not about the quality of vendor drivers (while some GPU vendors didn't worry much about the quality of their GL drivers either).
The D3D engineering specs seem to be grounded much more in the real world, and the additional information that goes beyond the interface description is extremely helpful (source access would be better of course).
Your quote also says: "Capitalization matters a tremendous amount". But in at least some positions capitalization is mandated by grammar rules. So, when a noun appears in such a position (e.g., just after a full stop), how does one distinguish between its two forms (capitalized and not capitalized)? There should be a way, if that "matters a tremendous amount".
> - "Mitochondrial Eve" was not the only woman alive at the time, just the only one who still has surviving descendants
The only one who still has surviving descendants /in a purely maternal line/. I guess most of the women of her time have descendants today, but at some point (at many points, in most cases) the line of descent passes through a male.
Yes. If we consider genealogical rather than genetic inheritance, then incredibly you only have to go back 5,000 years or so to find a common ancestor.
To me it's not even that incredible, after some thinking. The number of descendants (and ancestors) of a single person grows exponentially with the number of generations. Assuming that a certain closed population remains constant in size, this means that most of the nominally 2^n descendants of a single person after n generations are actually a much smaller group of people, each counted many times (i.e., reached through different lineages), for n large enough (and it doesn't even need to be that large). So there is some "pressure" for essentially everybody in the closed population to eventually become a descendant of a given person after a number of generations roughly comparable to the logarithm of the population size. Estimating that a generation takes 25 years, 5,000 years means 200 generations, and 1.12^200 ~= 7 billion, so a per-generation growth factor of just 1.12 is already enough to justify the common ancestor age.
The world population is not constant, of course, and it doesn't behave according to the assumptions I made implicitly, but the order of magnitude makes sense. A single person moving from one people to another is enough to unify the two lineages.
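Just to make explicit where the 1.12 above comes from: it is simply the 200th root of the present world population,

    b = (7 \times 10^9)^{1/200} = e^{\ln(7 \times 10^9)/200} \approx e^{0.113} \approx 1.12

i.e., the minimum per-generation growth factor a single lineage needs in order to nominally cover 7 billion people in 200 generations.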
It's not very clear to me why, in the paragraph "Truncation matters", it is claimed that the strlen variant is necessarily better than the strlcpy variant. The strlcpy variant only scans the source and destination strings once in the fast case (no reallocation needed), while the strlen variant needs to scan the source string at least twice. I guess in the common case you have to enlarge the destination a few times, then once it's big enough you don't have to enlarge it anymore and always hit the fast case, so it makes sense to optimize for that.
It might also be that in some programs with different access patterns that doesn't happen, and it makes sense to optimize for the slow case instead, sure. But the author should acknowledge that variability instead of being adamant about what's better, even to the point of calling the solution they don't understand "schizo". In my experience the pattern of optimizing the fast path makes a lot of sense.
BTW, the strlcpy/"schizo" variant could stand some improvement: realloc() already copies the part of the string within the original size of the buffer, so you can start copying at that point. Also, once you know that the destination is big enough to receive a full copy of the source you can use good old strcpy(). Cargo cult and random "linters"/"static checkers" will tell you that you shouldn't, but you know that it's a perfectly fine function to call once you've ensured its prerequisites are satisfied.
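A minimal sketch of that improved variant, just to make the idea concrete (the function name, the capacity-tracking convention, and the error handling are mine; it assumes a BSD-style strlcpy() is available, e.g. from libbsd on Linux):

    #include <stdlib.h>
    #include <string.h>
    #include <bsd/string.h>  /* strlcpy(); on BSD/macOS it's already in <string.h> */

    /* Copy src into *dst, growing the buffer if needed. *cap tracks the
     * current allocation size of *dst. Returns 0 on success, -1 on
     * allocation failure (leaving *dst untouched and owned by the caller). */
    int set_string(char **dst, size_t *cap, const char *src)
    {
        size_t need = strlcpy(*dst, src, *cap) + 1;  /* one scan of src */
        if (need <= *cap)
            return 0;                /* fast path: it fit, we're done */

        char *grown = realloc(*dst, need);
        if (grown == NULL)
            return -1;

        /* realloc() preserved the first *cap bytes, which already hold
         * the initial part of src written by strlcpy() above, so copy
         * only the missing tail (including the terminating NUL). */
        size_t done = *cap ? *cap - 1 : 0;
        memcpy(grown + done, src + done, need - done);

        *dst = grown;
        *cap = need;
        return 0;
    }

The full-strcpy() simplification mentioned above would just replace the memcpy() line with strcpy(grown, src), at the cost of re-copying the prefix that realloc() already moved.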
No doubt he is very curious, but there might be some selection bias here.