No one understands everything here. When it comes to software development, everyone I know says they're only really knowledgeable about what they work on. Most other areas, they'll have at best some passing knowledge of. It's like how doctors become specialists in just one part of the human body. There are maybe 5 people in the world who are qualified as a dermatologist, an endocrinologist, a neurologist, and a cardiologist at the same time. It's the same with engineers.
That's not true. I understand most of the things posted and discussed here, and I'm not some amazing 100X programmer. Having a CS degree definitely helps.
However, HN is definitely skewed toward the obscure and the nostalgic, with a dash of genetics and cosmology. Most programming happens in traditional languages - C/C++/C#/Java/JavaScript/PHP/Python - yet HN would make you believe everyone is writing functional stuff with monads and functors in Lisp or Erlang.
The community is great though, very knowledgeable.
Your impressions 1 and 2 can both be correct, and there can still be an association between artificial sweeteners and type 2 diabetes.
It has been shown for decades that this association exists; what's hard is establishing causation, and its direction. Does consuming artificial sweeteners cause diabetes? Or does having diabetes cause the consumption of artificial sweeteners?
No, "Potential for reverse causality cannot be eliminated" means that instead of artificial sweeteners causing diabetes, the causality may run the other way: diabetes causes the intake of artificial sweeteners.
All these studies show is an association; they cannot prove the direction of causality. For whatever reason, the idea that diabetic and overweight people deliberately seek out zero-sugar sweeteners so that they can enjoy sweetness without making their situation worse just doesn't seem like a plausible explanation to them.
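To see why association alone can't settle this, here's a toy simulation (all the numbers are made up for illustration) in which sweeteners have zero effect on diabetes, yet diabetics preferentially choose them. The observed association looks just as strong as if causality ran the other way:

```python
import random

random.seed(0)

# Hypothetical model: diabetes status is assigned first, and diabetics
# are simply more likely to pick zero-sugar drinks. Sweeteners never
# influence diabetes here, yet an association still appears.
population = []
for _ in range(100_000):
    diabetic = random.random() < 0.10                      # assumed 10% prevalence
    uses_sweetener = random.random() < (0.60 if diabetic else 0.15)
    population.append((diabetic, uses_sweetener))

def prevalence(group):
    """Fraction of the group that is diabetic."""
    return sum(d for d, _ in group) / len(group)

users = [p for p in population if p[1]]
non_users = [p for p in population if not p[1]]

print(f"diabetes prevalence among sweetener users: {prevalence(users):.1%}")
print(f"diabetes prevalence among non-users:       {prevalence(non_users):.1%}")
```

An observational study of this population would find sweetener users are roughly six times as likely to be diabetic, even though the "true" causal arrow points entirely from diabetes to sweetener use.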
Personally, if something can't be definitively proven bad for you after 45+ years of research, I just don't care anymore. People can occasionally drink regular Coke without issue; I wouldn't be worried. Diabetes is rarely caused by merely occasional consumption of sugar.
I've seen a lot of people getting Xiaomi air purifiers. I had no idea they put DRM on the filters, and that the filters cost more. Really glad I went with Winix years ago: cheaper filters that last longer, and no DRM. I don't get why people can't just set a calendar reminder to replace the filter instead of dealing with this stuff.
AMD consumer cards do compute, but do not have a developed or well supported ecosystem (ROCm is a mess).
However, it is not misguided segmentation or a high-level architecture problem. A specialized product will perform its specialty better than a general-purpose product. There's minimal demand for a card that's good at both compute and gaming. These people exist, but they're few compared to those who only care about one or the other.
The benefit of splitting GCN into RDNA and CDNA was immediate. Compare the Radeon VII (GCN 5) with the RX 5700 XT (RDNA 1): they're fairly evenly matched on gaming, trading blows with the Radeon VII slightly ahead on average, while the RX 5700 XT takes a very heavy loss in compute benchmarks. Both are TSMC 7nm, but the RX 5700 XT has fewer shaders (2560 vs 3840), a smaller die (251 vs 311 mm2), and lower power consumption (225 vs 300 W), which shows how much more efficient it is at gaming. The much lower power consumption and noise, along with a price a couple hundred dollars lower, made it a much more compelling card for gamers.
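Plugging in the numbers quoted above, a quick back-of-the-envelope comparison (treating gaming performance as roughly equal per the benchmarks, and board power as the listed TDPs) makes the efficiency gap concrete:

```python
# Spec figures as quoted in the comment above; gaming performance is
# assumed roughly equal between the two cards.
cards = {
    "Radeon VII (GCN 5)":  {"shaders": 3840, "die_mm2": 311, "tdp_w": 300},
    "RX 5700 XT (RDNA 1)": {"shaders": 2560, "die_mm2": 251, "tdp_w": 225},
}

vii = cards["Radeon VII (GCN 5)"]
xt = cards["RX 5700 XT (RDNA 1)"]

# For roughly the same gaming output, RDNA needs fewer shaders,
# less die area, and less power than GCN.
print(f"shader reduction:   {1 - xt['shaders'] / vii['shaders']:.0%}")
print(f"die area reduction: {1 - xt['die_mm2'] / vii['die_mm2']:.0%}")
print(f"power reduction:    {1 - xt['tdp_w'] / vii['tdp_w']:.0%}")
```

In other words, matching GCN's gaming performance with about a third fewer shaders and a quarter less power is a large generational jump, achieved largely by dropping compute-oriented hardware.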
CDNA cards are apparently missing components needed for gaming, like render output units; hence, they have no official support for DirectX, OpenGL, or Vulkan. I've never seen anyone get one of these working for gaming. In exchange, their compute performance is so good that a number of companies are buying these cards over Nvidia's, despite the overwhelming CUDA ecosystem. In 2013, a GCN-based supercomputer made the top 100 - the one and only GCN-based system ever to do so. Compare that to now, where 8 of the 10 most energy-efficient supercomputers use CDNA accelerators, as does the number 1 fastest supercomputer outright.
> These people exist, but are few compared to those who only care about one or the other.
They're few, but they're very important, because 5 years from now, they will be the ones working for the companies that AMD will want to sell their compute cards to.
The reasons you cite may still be more important. I'm not saying AMD made a bad decision. I'm just saying that neglecting the path from hobbyist to professional (or student to professional) is not insignificant.
AMD's compute-oriented cards used to come with DisplayPort outputs, but I haven't seen one of those in a long time. These cards are definitely GPUs in that they can handle graphical workloads, but I don't think anyone is trying to make them work for video games or the like.
I heard a rumor somewhere that Stadia ran on MI25s - not sure if that's true but certainly there have been a lot of them floating around on eBay in the past year.
My choice as well. I mostly like the colors my camera puts out, so I only really need to make a few adjustments here and there. DarkTable doesn't even conform the RAW to the preview JPEG, so I end up wasting a bunch of time just getting the RAW to look normal in the first place. RawTherapee also seemed far more intuitive to figure out.
Yes, before editing. All the other RAW editors I've seen, including the other FOSS editor RawTherapee, will automatically adjust the sliders to match the embedded preview JPEG. DarkTable doesn't do this, so all my RAWs load in looking really strange.
You can set presets; I don't, because the JPEG version doesn't have to be the "right" version of the photo (there actually is no such thing as "right", hence my preference to start from a pure raw file).
The rabbit hole of knife sharpening goes incredibly deep. But if you're just looking to cut food in your own kitchen, the vast majority of it is overkill. Professionals seem to make a big deal out of knives, but many professionals use their knife as much in one day as a home cook does in a year.
Most expensive knives are made of metals that are difficult to work with. They keep their edge longer, but:
1. They're harder to sharpen. The longer a knife can hold its edge, the longer it takes to grind that edge back in. Higher-end knives typically require diamond to sharpen at a reasonable pace, or at all.
2. They chip more easily. Just like the glass used for smartphone screens, the better it resists scratches, the easier it shatters. High-end knives, if they're not thick, need to be handled carefully, or you'll get a surprising number of nicks and chips in the edge. A sign that someone doesn't handle their expensive knives properly is the pointy tip being chipped off.
3. They rust more easily. High-end knives often use steels with higher carbon and/or lower chromium content. I learned this the hard way: I air-dry all my dishes, and even knives in 440C will rust if you don't towel-dry them after washing. Ultra-hard knives made of non-stainless steel need to be kept coated in oil to prevent rust.
If you're just getting into cooking, you don't need much. Mercer Culinary is the go-to brand for culinary school students. Their Millennia line comes sharper, harder, and more durable than any grocery store knife, while still being highly rust resistant.
If you want to use a whetstone, avoid the soft "beginner friendly" stones. They need to soak in water for about half an hour before use, and they wear out very quickly; the only reason they exist is that they provide more feel, or feedback. Instead, I'd recommend a basic Shapton 1000 grit. It's a hard and durable stone that cuts fast, only needs a splash of water, and leaves you a very usable edge. Lower grits are for re-profiling the blade, such as when you've got nicks or chips. Higher grits are for polishing the edge, if you want to shave with it or something.
If using a whetstone seems too difficult and you want an easier but slower option, look into a Spyderco Sharpmaker. I run my knife one pass through this thing before each use, and it keeps the edge consistently sharp. I save my Shapton whetstones for when I'm sharpening my friends' knives.
Learn to slice instead of just ramming your knife straight down into the cutting board. Some things require a chopping motion, of course, but slicing is safer, and your knife's edge will last longer.