I don't know that it's meaningful to throw out numbers like 20 or 30 years anymore. I've seen enough experts make predictions in this field and be wrong that the only prediction I'm comfortable making about their predictions is that they're probably wrong.
Why do you care so much about school shootings specifically? Shouldn't the goal be to reduce overall child mortality? If so, it doesn't make sense to focus on school shootings; they're a rounding error in child mortality. Compare actual death rates between first-world countries to see what I mean: the US lags behind its Western peers, but school shootings, generally considered a decidedly American problem, are nowhere near numerous enough to make an appreciable difference [1].
It's about empathy. Publicly stating that you are concerned about children being shot shows empathy and boosts one's image.
Publicly stating that you are concerned about child obesity is dangerous because it implicitly states that people are raising their children incorrectly and making poor decisions, which is more accusatory than empathetic.
The "think about the children" crowd don't actually want you to think about the children.
Maybe because I'm from one of the countries that doesn't have school shootings, so every time I see that shit on the news I think: I sure am relieved I can send my kid to school without that particular anxiety hanging over me. It's just so senseless. It's less about the raw numbers and more about how bloody horrific such an event is for everyone involved.
The data present in the conscious experience of human beings is something I've been thinking about a lot lately. It's definitely a very important part of the puzzle and it's missing from the training data, and it's not clear that it ever can be included. Leads me to wonder if the only way that AGI could ever happen for real is to have embodied, embedded, emotional agents that go around making mistakes and learning from them like we do?
I've thought about this as well: what effect might a machine whose data was all collected via colocated sensory systems, over a continuous period of time, have on our approach to creating a model for AGI?
I don't believe that would be required to create an AGI, but I do believe this experience would be necessary for an AGI to form concepts of 'self' and 'others' similar to the ones we have.
AGI itself might just be a combination of various specialized models, not tied to any concept of corporeal existence, individual identity, or awareness.
Lawyers know the system is fucked and sometimes even advise clients to give false confessions, since it's hopeless after a certain point and trying to fight it is seen as just delaying the inevitable.