From 2020 to 2024, the UN recorded ~17,000 deaths among children in all measured countries combined. The share of those deaths that occurred among older children and teens in developed countries like the US is vanishingly small.
>It seems that he is spending a lot of his time & effort on maintaining his hair (huge kudos for admitting that!). If his body was functioning like an 18-year-old's he should not need to do that. If Bryan really set back his true aging clock he probably would not be battling with ongoing hair loss (as he most likely did not battle hair loss 20 years prior), or at least it would stop (which does not seem to be the case).
Androgenetic hair loss (what Johnson has and what the medications he takes are designed to treat) is caused by genetic sensitivity to DHT and DHT levels in the body, not aging. People who don't have the genetic predisposition to AGA or who have very low DHT levels can often make it to 60+ with no visible hair loss. It's not like they're "not aging," they just don't have the combination of genes and DHT levels needed for significant hair loss to occur.
Biological aging is a fuzzy concept that we define relative to various social constructs and objective markers measured against the general population.
I don't see much issue with its usage in this context.
You could define it as "the degradation of cellular maintenance". Androgenetic hair loss is more like a non-universal signal of hormonal maturity, like pubic hair growth.
I'm not super well versed but my understanding is that that's right, with your actual DHT levels being the third variable. There are other non-hormonal causes of hair loss, like alopecia areata, but they're much less common.
DHT damages the hair follicles that are sensitive to it, causing them to grow back thinner after each growth and shedding cycle (which is typically a few years). As a result most men don't have visible hair loss until years after puberty kicks in.
It's interesting because there are medicines that can block DHT or reduce its effectiveness, but they have other negative side effects. It's almost like if you have this gene and are a healthy man, you should lose your hair - I wonder what the evolutionary mechanism of that is.
None, and in fact it has been proposed that men suffering from AGA, especially if it's early onset, may have something equivalent to polycystic ovarian syndrome in women.
>The occurrence of a genetic background in the etiology of polycystic ovarian syndrome (PCOS) represents the rational basis to postulate the existence of a male PCOS equivalent. Hormonal and metabolic abnormalities have been described in male relatives of women with PCOS. These males also have a higher prevalence of early onset (<35 years) androgenetic alopecia (AGA).
>Hence, this feature has been proposed as a clinical sign of the male PCOS equivalent.
>Clinical evidence has shown that men with early onset AGA have hormonal and metabolic abnormalities. Large cohort studies have clearly shown a higher prevalence of type II diabetes mellitus (DM II) and cardiovascular diseases (CVDs) in elderly men with early onset AGA. In addition, prostate cancer, benign prostate hyperplasia (BPH) and prostatitis have been described.
>These findings support the existence of the male PCOS equivalent, which may represent an endocrine syndrome with a metabolic background, and might predispose to the development of DM II, CVDs, prostate cancer, BPH and prostatitis later in life. Its acknowledgment would be helpful for the prevention of these long-term complications.
The mechanism of baldness is related to the mechanism of growing a beard isn’t it? For me personally I started going bald in my mid-20s, and since then as I lose hair off the top of my head it seems to reappear on my face. So perhaps it’s just a virility/experience signal.
Maybe in the past the fact that someone had survived long enough to go bald or grow a big beard would signal that they had the skill or hardiness to avoid dying young, which might make them more attractive and likely to pass on the gene.
Lawyers aren't limited to isolated cases. Class actions exist for this exact reason and regularly lead to payouts of $100 million or more from banks and other large defendants for harm done to large groups of people.
I don't see cultures mentioned anywhere in the article or the comment you replied to.
If you mean that there are no differences in the genes that determine height across populations, that is incorrect. For example, the average male height in Pygmy populations is 5 feet, about 7 inches shorter than in Cameroon's Bantu populations who live alongside Pygmies. We know there are genetic differences between Pygmy and Bantu populations that are specifically associated with height, and that average Pygmy height increases along with the percentage of Bantu ancestry (https://journals.plos.org/plosgenetics/article/info%3Adoi%2F...). Similarly, differences in height across European populations have been demonstrated to be partly due to differences in the hundreds of genes associated with height (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3480734/).
Fair, Pygmy genetics are minorly distinct. No other group is, and unless you're suggesting the urban Pygmy population exploded to account for these changes, it's irrelevant.
> Pygmy genetics are minorly distinct. No other group is
One group in all of humanity happened to have measurable differences in genes affecting height?
How does that explain the many other distinct "Pygmy" groups entirely separate from those in West Africa, such as those in the Philippines, who have genetic differences in growth hormones that cause them to be shorter (https://doi.org/10.1515/JPEM.2002.15.3.269)? "Pygmy" simply refers to one of these many groups; they are genetically and geographically distinct.
Pygmy groups show how absurd the claim is that there are no genetic differences in height, but the same kind of variation has been shown within European groups on a smaller scale (consistent with the greater genetic diversity in Africa than in Europe). We know that similar differences in genes that influence height exist, to a less extreme degree, across all populations. For example, males in both the Tutsi and Dinka ethnic groups average nearly 6' tall, despite living in areas where malnutrition is/was widespread and alongside other groups that are much shorter on average.
I'm genuinely curious where you got the idea that there are no genetic differences affecting height between ethnic groups/countries. Researchers have identified genetic markers responsible for the majority of variation in human height among Europeans (https://www.biorxiv.org/content/10.1101/588020v1), and shown that those markers vary significantly across European groups just as they do in Africa (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3480734/).
Every single variation in "genetic trends" amongst cultures is negligible and otherwise entirely attributable to environmental factors.
The source of the ideas you're asserting as fact is very dark and extremely thoroughly debunked at this point. Honestly it's kind of shameful to present these 1920s ideas as modern.
Individually yes, genetics contribute to height, but the full range of height genetics exist in every culture.
>Individually yes, genetics contribute to height, but the full range of height genetics exist in every culture.
"The full range of incomes exist in every country, therefore differences in average incomes across countries must be negligible."
You've linked a long list of articles talking about race in America and Brazil, none of which, as far as I can see, claims that there are no differences in genetic height by ethnic group. One is a book from 1944 that rightly rejects racially discriminatory policies and argues "that experience rather than inborn traits determines standards of social conduct." What part of that supports the idea that the 6' Tutsis or Croats are genetically predisposed to the same height as the Taron, who average 4'3"?
The MedlinePlus article you linked explicitly says "in some cases, ethnicity plays a role in adult height."
"Race" categories as described by all of your links that I saw ("black"/"white") are crude aggregations of hundreds of ethnic groups. Tutsis and Pygmies are both "African," but at the opposite extremes of human height. Framing the genetics of height in terms of American racial politics makes no sense.
>The source of the ideas you're asserting as fact is very dark and extremely thoroughly debunked at this point. Honestly it's kind of shameful to present these 1920s ideas as modern.
The "source of the ideas" are the peer-reviewed journals of PLoS Genet and Nature Genetics based on landmark genetic research published in 2012 and 2020. As in hundreds of other peer reviewed studies that have identified genetic markers responsible for height and how those markers vary across populations, including articles you yourself linked without apparently reading.
>Every single variation in "genetic trends" amongst cultures is negligible and otherwise entirely attributable to environmental factors.
Is the vastly higher rate of sickle cell anemia among people of West African descent "negligible and otherwise entirely attributable to environmental factors"? Is it simply Europeans' "environment" that causes them to go bald and get skin cancer at rates dozens of times higher than people of East Asian descent growing up in the same countries?
There are zero credible peer-reviewed journals that publish the idea that height or any other genetic traits are attributable to a single "racial group", as the biological concept of "race" does not exist.
So no, you have not linked any peer-reviewed study saying what you claim.
That also raises the question of whether increased focus on these issues can negatively affect at least a subset of people.
We know that severe mental (and even physical) symptoms can "spread" in social groups (https://en.wikipedia.org/wiki/Mass_psychogenic_illness), particularly among young women who are now experiencing the sharpest rises in mental illness. There is also strong evidence that affective states (e.g. happiness or depression) spread socially, with even next-door neighbors of depressed people being significantly more likely to be depressed than those on the same block but not next-door (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3830455/).
While the TikTok Tourette's phenomenon is clearly an extreme example, it seems possible that something similar could be happening to kids constantly hearing about anxiety and depressive disorders as immutable traits that everyone around them seems to have.
Were cities less automobile-centric in the '70s, when the obesity rate was far lower? According to the CDC, physical activity has actually risen significantly in recent decades (https://www.cdc.gov/physicalactivity/data/index.html#:~:text....). People tend to blame fast food, "HFCS," lack of walking, big pharma, or whatever else, while largely ignoring the simply incredible amount of food Americans now eat.
Exercise is important, but walking can only do so much when the average American eats almost seven Big Macs' worth of calories daily (https://en.wikipedia.org/wiki/List_of_countries_by_food_ener...). You can have a Big Mac meal with fries and evil HFCS soda three times a day and still eat fewer calories than the average American. An extra mile a day of walking burns about 100 calories, or roughly two Oreos' worth.
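To put rough numbers on that, here's a back-of-the-envelope sketch (the ~3,800 kcal/day figure is the approximate US per-capita food supply from the linked list; the Big Mac, meal, Oreo, and walking figures are approximate, commonly cited values, not taken from the linked page):

```python
# Back-of-the-envelope calorie math; all figures are rough approximations.
US_DAILY_KCAL = 3800                  # approx. US per-capita daily food energy supply
BIG_MAC_KCAL = 550                    # one Big Mac
BIG_MAC_MEAL_KCAL = 550 + 320 + 200   # burger + medium fries + medium soda
WALK_KCAL_PER_MILE = 100              # walking one mile
OREO_KCAL = 53                        # one Oreo cookie

print(US_DAILY_KCAL / BIG_MAC_KCAL)      # ~6.9 Big Macs' worth per day
print(3 * BIG_MAC_MEAL_KCAL)             # 3210 kcal: three full meals, still under 3800
print(WALK_KCAL_PER_MILE / OREO_KCAL)    # ~1.9 Oreos offset by an extra daily mile
```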
The lever works the other way. A person who walks for their daily errands or commute will not get overweight, or else they wouldn't be able to keep walking, so they won't eat as much. It's not the calories burned by walking that make them thin; it's the integration of walking into daily life that provides the unconscious cues to eat healthier.
I don’t see your first link supporting the claim “physical activity has actually risen significantly in recent decades”. When I click through that first link, I eventually end up at https://journals.humankinetics.com/downloadpdf/journals/jpah..., which is about self-reported physical activity and thus “subject to recall and social desirability biases.”
It also only covers leisure-time physical activity. So, if a subject buys a robot lawn mower, stops mowing the lawn every week, and starts driving to the gym to do half an hour of moderate exercise once a month, the number measured here goes up.
Ignoring those, it says
“The prevalence of insufficient activity was not significantly different in 2018 compared with 1998 for most subgroups (Table 3), with exceptions of increases among men, adults aged 65 years or older, adults of Hispanic origin, adults with less than a high school education, adults in the Midwest or South Census regions, and adults with obesity, and a decrease among adults with a college degree or higher.”
and
“recent increases in meeting or exceeding the guideline overall are primarily driven by more people reporting sufficient activity to meet the high guideline, not the minimal guideline.”
So, it seems any increase in exercise comes from those already doing it doing more of it.
I agree that much of their use of "holistic" factors is simple discrimination and should be illegal, but considering other non-academic factors (arguably including character) seems reasonable, if not necessary, given the intent of these schools to build future leaders in various fields.
Quantitative measures can only give you so much information about a student. GPA is questionably useful past a point, where it starts to have more to do with grade inflation and gaming the system than differences in hard work or ability.
That leaves the SAT, which is a far more level playing field than things like extracurriculars or "personal statements" that end up reflecting your social class and skill at playing the admissions game more than ability. Yet admitting students based solely on a single standardized test seems to disincentivize working hard at other pursuits that may actually bring more to the classroom than slightly higher test scores.
If there is a magical way to gauge character in a short amount of time then sure; however, I doubt most people would do any better than a coin flip when asked to predict the "character" of someone without knowing them deeply.
> If there is a magical way to gauge character in a short amount of time then sure
This is key. Many universities that claim to do "holistic reviews" don't have the time or resources to actually do holistic reviews. University of Washington is an example. UW gives application reviewers 8 minutes per application. And who are these reviewers? Do they have the knowledge and experience to do a "holistic review" in 8 minutes, including reading essays and personal statements and so on? Nope! They hire grad students, retirees etc. to act as application reviewers [1].
The thing is, getting into a top-level school requires maximizing all of GPA, SATs, and what might be called "extracurricular appeal"; this crowds out a bunch of different pursuits, more so than grinding test-taking ability (which has rapidly decreasing marginal returns) alone would. You might hope that those different pursuits being crowded out would instead feed into increased extracurricular appeal, but in practice there's a very particular subset of things that universities care about when it comes to extracurriculars. Indeed, some of them (like leadership activities in 4-H, ROTC, or Future Farmers of America) actually hurt your chances of admission, which seems insane if you're looking for a variety of impressive individuals who can bring diverse perspectives.
If I were dictator of college admissions across the US, I'd use tests and GPA to coarsely bucket individuals (into basically capable of doing the work or not at each institution), and use a lottery to distribute spots where demand outstrips supply.
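A minimal sketch of what that could look like (purely hypothetical: the field names, thresholds, and seat counts are illustrative, not any real admissions process):

```python
import random

def admit(applicants, min_score, min_gpa, seats, seed=None):
    """Coarse cutoff on test score and GPA, then a pure lottery.

    `applicants` is a list of dicts with hypothetical "score" and "gpa"
    keys; the thresholds stand in for "capable of doing the work at
    this institution".
    """
    rng = random.Random(seed)
    eligible = [a for a in applicants
                if a["score"] >= min_score and a["gpa"] >= min_gpa]
    if len(eligible) <= seats:
        return eligible                 # supply meets demand: admit everyone
    return rng.sample(eligible, seats)  # demand outstrips supply: random draw
```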
The difference between excluding people from payment processing and excluding them from physical private property is that payment processing is clearly an oligopoly for which no viable alternative exists. Treating monopolistic industries that provide essential services differently from competitive ones like drug stores is not a new concept. I would argue the difference should be based on the industry, not the size of a given company.
If 4 companies owned 100% of the property in a region (as Visa/Mastercard/Discover/Amex do in the credit card industry) and worked together to limit the rights of people they disagree with to protest, you'd have a better point. That's clearly not the case. When you kick someone out of a store, they have the ability to protest anywhere else, and those are very real alternatives. I can grant you that that still limits the protester's speech to a small degree, but the ability to easily protest in many other ways mitigates the harm of that limitation. By contrast, people who are locked out of payment platforms have no alternative.
Similarly, if Walgreens and CVS had a duopoly on pharmacies and colluded to exclude people based on political views, that would be a strong argument to regulate them.
Likewise, I'm not "harmed" as a Stripe user when an OnlyFans model or someone whose politics Stripe's executives disagree with receives payments through the same platform. If there is a higher objective economic cost, pass that on through fees and stop trying to force the moral views of a few tech executives on everyone in the country.
There is at least some evidence that levoamphetamine is more effective than dextroamphetamine for "hyperactivity and aggressiveness" associated with ADHD (https://jamanetwork.com/journals/jamapsychiatry/article-abst...), and other studies have found that it causes fewer of certain side effects than dextroamphetamine (though more of others).
A more cynical explanation is that where levoamphetamine and dextroamphetamine alone were not patentable, a new combination of the two (as Adderall is) was patentable, making it more profitable to research.
ADHD is a collection of symptoms that differ enough across individuals that it's almost certain some patients genuinely do respond better to different combinations of drugs.
Interesting research, thank you. Though worth pointing out this study is comparing pure l-amph with pure d-amph. I'm not aware of enantiomerically pure l-amph being used clinically, which is an interesting side note: why not?
Are you aware of studies comparing Adderall specifically to d-amph? This has certainly piqued my curiosity.
My impression was that dextroamphetamine also worked better in many patients than drugs that combine it with levoamphetamine, such as Adderall. Even when Vyvanse was under patent and expensive, it was preferred by many for this reason (Vyvanse is converted into dextroamphetamine in the body). Levoamphetamine acts primarily on the norepinephrine system, while dextroamphetamine acts primarily on dopamine, being 3-4 times as strong as levoamphetamine in that sense (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2670101/). As a result, many patients find levoamphetamine to cause more physical side effects (racing heart, jitteriness, etc.) and to be less effective at promoting mental focus than dextroamphetamine.
Adderall is still 75% dextroamphetamine, but I haven't seen much literature on why it would be preferred over drugs like Vyvanse that include no levoamphetamine.