There were 5882 cases of CJD in the United States between 2007 and 2020 (not specific to CWD, though). When I learned of that figure a few weeks ago, I didn't expect it to be so high: public health messaging had conveyed to me that CJD was an extremely rare occurrence in the US.
I have no doubt that this "feature" is a backroom deal worth millions, because OpenAI is running out of public internet data with which to improve its models. (See this paper from researchers at MIT and a few other institutions, which predicts that high-quality text training data will be 'used up' by 2026: https://arxiv.org/abs/2211.04325 )
Think of all of the email, Google Docs, and other data that Alphabet has that it can use to train and improve its models. OpenAI has limited ways to get non-public text data unless Microsoft is giving them data from Office and Hotmail users.
Just my two cents. And whatever Dropbox is doing with retrieval-augmented generation (RAG) / "new+better search" with the OpenAI APIs: I'm certain it could be done with less latency, and probably at lower cost, if the RAG 'feature' / 'new search' were built in-house at Dropbox.
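For what it's worth, the retrieval half of a RAG pipeline isn't much code. Below is a minimal sketch of what an in-house version could look like, assuming a small open-source embedding model (all-MiniLM-L6-v2 via sentence-transformers) and a toy in-memory corpus; the model choice and corpus are my own illustrative assumptions, not anything Dropbox has described.

```python
# Minimal in-house RAG retrieval sketch: embed documents with a local
# open-source model instead of a hosted embedding API, then rank by cosine
# similarity. Model and corpus below are illustrative assumptions.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # small; runs fine on CPU

docs = [
    "Q3 planning notes for the design team",
    "Invoice 2023-014 from the printing vendor",
    "Meeting transcript: infrastructure cost review",
]
doc_vecs = model.encode(docs, normalize_embeddings=True)  # shape: (n_docs, dim)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    q_vec = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q_vec  # cosine similarity (vectors are unit-normalized)
    top = np.argsort(scores)[::-1][:k]
    return [docs[i] for i in top]

print(retrieve("how much did we spend on infrastructure?"))
```

The retrieved snippets then get stuffed into the prompt of whichever LLM answers the user's question - hosted or self-hosted - which is where the latency and cost trade-offs actually live.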
>The restrictions were supposed to only come into play on Nov. 17, 30 days after the US first announced them. But in a filing on Oct. 24, Nvidia said it was informed that the rules were effective immediately and that they would affect shipments of Nvidia's A100, A800, H100, H800, and L40S products. The 800 series chips were designed specifically for the Chinese market to circumvent the earlier iterations of the export control rules.
I don't think the L40/L40S are allowed to be exported when all the other AD102 (the underlying silicon) variants are banned, including lower-specced ones like the 4090.
> exceeding certain performance thresholds (including but not limited to the A100, A800, H100, H800, L40, L40S, and RTX 4090).
L40 throughput is technically lower, though, since it isn't clocked to the stratosphere like the 4090. I was wondering whether it sits barely at the threshold or something.
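For anyone else wondering how close the line is: the rule keys on a "total processing performance" (TPP) metric which, as I understand it, works out to roughly dense tensor TOPS multiplied by the bit width of the operation, with a headline cutoff of 4800. Here's a back-of-the-envelope sketch using an approximate RTX 4090 spec-sheet figure; both the number and my reading of the rule should be double-checked against the actual text.

```python
# Back-of-the-envelope TPP ("total processing performance") calculation, as I
# understand the export rule: TPP = 2 x MacTOPS x bit length of the operation.
# One multiply-accumulate is two FLOPs, so this reduces to dense tensor TOPS
# times bit width. Spec numbers below are approximate and worth verifying.

def tpp(dense_tflops: float, bit_width: int) -> float:
    """TPP contribution for one operand width."""
    mac_tops = dense_tflops / 2            # 1 MAC = 2 FLOPs
    return 2 * mac_tops * bit_width

THRESHOLD = 4800  # headline TPP cutoff (verify against the rule text)

# RTX 4090: roughly 330 dense FP16 tensor TFLOPS (approximate)
print(f"RTX 4090 TPP ~ {tpp(330, 16):.0f} vs threshold {THRESHOLD}")
# -> ~5280, only modestly above the cutoff, which is why a lower-clocked part
#    on the same silicon (like the L40) ends up close to the line.
```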
This video is incredibly long, dull, and lacking in any technical depth (as new-product keynotes often are), but I watched Zuckerberg's keynote speech - the 45-minute-long one - and the engineering lead's talk, in which he discusses LLMs, because I work in the language-models space and was hoping to hear more news about open-source LLMs and Llama 2, Llama 3, etc.: https://m.facebook.com/MetaforDevelopers/videos/meta-connect... (I wish that video were on a seekable platform with a searchable transcript, like YouTube.)
In any case, the digital avatars of famous people were introduced in Zuckerberg's keynote speech, along with the absurd Ray-Ban + Meta camera sunglasses that now record and transmit audio (great...), as well as the third generation of their AR/VR headset. (It seems they're leaning heavily into _augmented_ reality instead of a virtual world in the 'metaverse'.) Oh, and speaking of the AR/VR headset: Xbox games are coming to it in December 2023. I didn't expect that: Microsoft and Meta joining up on an AR/VR headset.
As someone who works in tech in the United States, but who previously lived in an EU country that is more privacy-minded than a lot of other EU countries and decidedly more privacy-oriented than the United States, I must say that the Ray-Ban + Meta video and audio sunglasses thoroughly creeped me out.
This article touches upon but a few of the reasons why I do not like those devices: https://www-heise-de.translate.goog/hintergrund/Wie-Facebook... Yes, there's an application for hands-on learning with AR/VR goggles, and it makes it easier to connect with one's friends and family on the other side of the world, but I don't want to exist in a society where everyone is wearing a potential surveillance apparatus on their face and where people interact with but a digital simulacrum of the real world.
I'm terribly curious whether anyone has done market (or academic) research into these notions of 'digital avatars' of non-famous people since LLMs have grown in ability. I'd read some of the literature on the perceived helpfulness or utility of question-answering via digital avatars, but that was probably a decade or more ago. Can anyone recommend any recent research in the space? I'd also truly love to read any marketing-focused materials on this flavor of tech product, as I'm just not convinced that there's a real market for digital avatars / digital 'holograms' of real (live or dead) humans.
See my comment elsewhere on this post. Greg Brockman, president and co-founder of OpenAI, said at a round-table discussion in Korea a few weeks ago that they had to start using quantized (smaller, cheaper) models earlier in 2023. I noticed a switch in March 2023, with GPT-4 performance being severely degraded after that for both English-language tasks and code-related tasks (reading and writing code).
Check out this post from a round table dialogue with Greg Brockman from OpenAI. The GPT models that were in existence / in use in early 2023 were not the performance-degraded quantized versions that are in production now: https://www.reddit.com/r/mlscaling/comments/146rgq2/chatgpt_...
Greg Brockman from OpenAI said at a round-table chat a few weeks ago that ChatGPT has been heavily quantized since the end of Q1 / early Q2 2023: https://www.reddit.com/r/mlscaling/comments/146rgq2/chatgpt_... ; I'm still looking for the source document / source quote where I read it, but the big switch from 'not so stupid' to 'pretty damn stupid' occurred with the 1 March 2023 model switch.
That's around the time I noticed `gpt-3.5-turbo` becoming lower quality, whether in the UI (ChatGPT) or programmatically (via `gpt-3.5-turbo` API calls).
The 10-20x lighter-weight versions of the models that they (OpenAI) are running now - the heavily quantized versions - allow them and Microsoft to save on what is far and away their largest expense: cloud expenditure. I suspect/expect that the AMD GPU announcement with OpenAI will come to fruition in the next few years, as all of these LLM companies depend upon large piles of GPU compute to train their models and no one wants to be beholden to NVIDIA or any single GPU manufacturer.
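To illustrate what quantization buys on the cost side - this is generic symmetric int8 weight quantization, a sketch of the general technique and not a claim about how OpenAI actually serves its models - storing weights at lower precision directly cuts memory and memory bandwidth, which dominates serving cost for large models:

```python
# Generic symmetric per-tensor int8 weight quantization: 4x smaller than fp32
# (2x smaller than fp16) at some cost in fidelity. Purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=(4096, 4096)).astype(np.float32)   # stand-in weight matrix

scale = np.abs(w).max() / 127.0                         # symmetric int8 scale
w_int8 = np.round(w / scale).astype(np.int8)            # quantize
w_dequant = w_int8.astype(np.float32) * scale           # dequantize for use

print(f"fp32 size: {w.nbytes / 2**20:.1f} MiB")
print(f"int8 size: {w_int8.nbytes / 2**20:.1f} MiB")
print(f"max abs error: {np.abs(w - w_dequant).max():.4f}")
```

A 10-20x reduction would take more than weight precision alone (smaller or distilled models, batching, caching, etc.), but the same memory-and-bandwidth arithmetic is what drives the savings.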
A while back I was reading up on private equity as an 'industry' after a prior employer was bought up by a PE firm, carved into bits, and each 'chunk' sold off to the highest bidder.
> Macellum’s campaign resulted in the addition of 4 of 13 directors. Also, as a byproduct of the pressure applied during the campaign, the company sold non-core assets equal to approximately 75% of the market capitalization. Macellum’s price objective were met and we exited the position at the beginning of 2021. Subsequently, a fractured board and new CEO were unable to oversee and execute the turnaround plan Macellum outlined during our campaign and results deteriorated.
A larger-than-expected percentage of people diagnosed with CJD are elderly women (mid-70s to mid-80s), which leads me to wonder whether medical equipment used when delivering babies exposed many women. I also wonder whether their children (the ones being delivered) were infected. In deer, the mother-offspring infection route is quite well studied, but I'm not well informed about the human mother-to-child route.
--- Full paper here (not free): https://jamanetwork.com/journals/jamaneurology/article-abstr...
The article in JAMA, above, has been summarized on this page: https://www.neurologyadvisor.com/topics/neurodegenerative-di... I'll paste the summary text below:
The incidence of Creutzfeldt-Jakob disease (CJD) has risen considerably from 2007 to 2020, particularly among older women, according to a research letter published in JAMA Neurology.
The progressive, universally fatal prion disease CJD was stable in the United States (US) from 1979 to 2006. The most common subtype of CJD, sporadic CJD, tends to affect older individuals. As the global population ages, the epidemiology of CJD may be evolving.
To evaluate recent trends in CJD in the US, the researchers sourced data for this cross-sectional study from the Wide-Ranging Online Data for Epidemiologic Research multiple cause of death database. Death certificates between 2007 and 2020 for CJD were assessed for volume and decedent demographics.
The incidence of CJD increased consistently between 2007 and 2020, during which there were 5882 total cases, 51.2% of which occurred among women.
[article continues on the Neurology Advisor page]