There are historical reasons rooted in browsers' per-host connection limits. You would put your images, scripts, etc. each on their own subdomain to increase parallelization of content retrieval; CDNs came later. I feel like I was taught in my support role at a webhost that this was _the_ original reasoning for subdomains, but that may have been someone's opinion.
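The arithmetic behind sharding was simple: if a browser allows N parallel connections per hostname, splitting assets across M subdomains gives roughly M × N parallel downloads. A toy sketch (the limit of 6 is illustrative; older browsers used caps around 2-6 per host, and the subdomain names are made up):

```python
# Rough sketch of why domain sharding helped: browsers capped parallel
# connections per hostname, so spreading assets across subdomains
# multiplied the effective cap. Numbers here are assumptions, not spec values.

PER_HOST_LIMIT = 6  # assumed per-hostname connection cap

def max_parallel_fetches(num_shards: int, per_host_limit: int = PER_HOST_LIMIT) -> int:
    """Parallel downloads possible when assets are split across this many hosts."""
    return num_shards * per_host_limit

print(max_parallel_fetches(1))  # 6  -- everything on one host
print(max_parallel_fetches(4))  # 24 -- e.g. img1..img4 subdomains
```

With HTTP/2 multiplexing many requests over one connection, this trick stopped mattering and can even hurt.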
They seem to reference a paper at "19)"[0] which goes into much further detail. What's odd, though, is that at the point where they cite it, they don't mention anything below 0 °C at all.
Is that because that's the limit of the sensor, or is it an actual reading? Something similar happened with dosimeters at Chernobyl: people didn't realize how bad the levels were until someone pointed out that the sensors were maxed out, because the levels exceeded the range the units were designed to display. I would hope that's not the reason here, but it has happened before.
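The trouble is that a reading pinned at the edge of a sensor's range looks identical to a genuine reading at that value. A toy sketch of the clipping (the names and the 0-50 °C range are hypothetical, purely for illustration):

```python
# Hypothetical sensor that clamps readings to its design range.
# The range limits here are assumptions for illustration only.

SENSOR_MIN_C = 0.0   # lowest value the unit can display
SENSOR_MAX_C = 50.0  # highest value the unit can display

def displayed_reading(true_temp_c: float) -> float:
    """Return what the unit would show for a given true temperature."""
    return max(SENSOR_MIN_C, min(SENSOR_MAX_C, true_temp_c))

# A true -12 C displays as 0 C: the floor of the range, not the real value,
# and indistinguishable from a genuine 0 C reading.
print(displayed_reading(-12.0))  # 0.0
print(displayed_reading(23.5))   # 23.5
```

That's why a long run of values exactly at the range boundary is a red flag worth checking.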
I understand some of their complaints; others seem like typical support questions. That's inevitable; money or not, it's going to happen. People suck at reading.
But I can't look past the AI art used to generate the YouTube thumbnail on the main page. I'm sure some artist wishes they could work on their art full time and still pay their bills, just as you wish you could with your software.
Do you feel they would have paid someone to make the thumbnail if they hadn't AI-generated it, or would we just have gotten a less interesting thumbnail?