Are there better numbers? It seems like these track existing conventional wisdom about the recent crime burst, no? You're just trying to throw out the last 2 years showing a decline? Is that really a reasonable argument?
Obviously yes, there's an apples/oranges problem with comparing data sets collected in different countries under different law enforcement regimes, etc...
But between e.g. 2022 and 2017 in San Francisco specifically? I don't see the argument.
(Also important to note that while "Larceny" might be plausibly related to police ignoring crime, other things like "Murder" are very much not if you aren't accusing the police of hiding bodies. And violent crime shows the same trend.)
It is much more likely that the police changed a bit in how they report things than that the population at large changed. The real changes get lost in the noise of police reporting changes. Police reporting changes not just via bureaucratic decisions but also via the mood of the police force in general, because it is the people at the bottom who decide what to report, and that is very fickle and can shift quickly for reasons like "we catch thieves but they just get released, so we stopped caring".
Covid likely changed crime rates, yes, but it likely changed police reporting rates much more. That goes for all kinds of events. Saying crime is down since police reporting is down is like saying that kids learn more today since they get better grades today than 10 years ago.
Edit: You get much better numbers by asking people if they have been robbed lately, or asking stores how much gets stolen.
"Are there better numbers?" is not a question that justifies trusting bad numbers. If the best numbers you have are known to be unreliable, using them just because you don't have better ones is not justified.
Let's say the real amount per year for the last 5 years is [100, 110, 120, 130, 140], and you have numbers that show [90, 89, 88, 87, 86]. Those numbers are much closer to reality than [1, 2, 3, 4, 5]. It would still be absolutely wrong to use them to figure out the trend.
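To make the point concrete, here is a minimal sketch of that toy example (both series are hypothetical, not real crime data): a measurement series that sits close to the true values can still have a slope with the opposite sign.

```python
# Toy illustration: a biased measurement series can show the
# opposite trend from the real series it is supposed to track.

def slope(ys):
    """Least-squares slope of ys against years 0, 1, 2, ..."""
    xs = list(range(len(ys)))
    n = len(ys)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

real = [100, 110, 120, 130, 140]   # hypothetical true incidents per year
reported = [90, 89, 88, 87, 86]    # hypothetical reported incidents

print(slope(real))      # 10.0 per year -> rising
print(slope(reported))  # -1.0 per year -> falling
```

Even though every reported value is within about 40% of the true one, fitting a trend to the reported series gives a confidently wrong answer: crime looks like it is falling when it is actually rising.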
If you know your source of data is bad, you must throw it out, even if it's the best one you have. If our data is bad, we just don't know.
We should be careful about the "it's the best we got" phrase in this kind of argument. It usually doesn't add information, but it pushes the conversation as though it does. Presenting numbers is additive, and questioning those numbers' relevance can be additive. But of course "the best we got" might still not be good enough to make a call.
In this argument, the sides are something like crime is down, crime is up, and don't have enough info. Roughly speaking, you're arguing for the first over the second whereas the responder is arguing for not enough info.
Even in situations where you have to make a decision without great data, if you don't feel good about the best info you have, it might be better to use something else, like the wisdom or gut instinct of the team, or what's cheapest, or what you're most able to walk back later. Data tends to make us lazy about digging deeper; that's okay when the data is good, but it's worse than no data when it's not actually relevant.
There are probably better numbers somewhere. Likely several sets' worth. One of the things that makes SF's numbers especially thorny is that SFPD engages in a daily campaign to deter the reporting of crime.
This makes the official numbers known unreliable, but also means there's lots of room to debate how much more reliable any alternative set of numbers might be.