Analysis of 5.9M PredPol crime predictions left on public server (gizmodo.com)
7 points by blottsie on Dec 2, 2021 | 3 comments



> Residents of neighborhoods where PredPol suggested few patrols tended to be Whiter and more middle- to upper-income

Is this bias in predicting crime, though? Or does it simply reflect broader societal structures in which middle- to upper-income neighborhoods report fewer crimes? There are certainly systemic reasons why this is the case, but if the goal of a crime predictor is to predict crimes, and if bias means inaccuracy, then this may not be bias.

If a city has limited patrol resources, is it fair to send them equally to rich and poor neighborhoods?

Obviously, we need to work to remove the underlying causes of higher crime rates that systemically disadvantage certain races, ethnicities, and other groups. I'm not sure low-income populations will ever commit crime at the same rate as high-income populations, as poverty seems to be a big driver of crime. Our resources are likely better spent on eliminating poverty.


I have two thought experiments:

- Imagine the exact same software were used, but by the locality's fire & rescue departments. I don't suspect anyone would complain about trying to put ambulances as close as possible to populations that tend to need medical care. As far as optimizing a locality's resource use by analyzing historical data goes, this is the best they can do.

- Next, imagine a case where the software was just flat out wrong, all the time. Would any departments buy it?

I don't have a real solution. If we're limited to exploring software fixes instead of societal ones, my best shot would be to take the algorithm and flip it on its head, reframing the purpose. Instead of "predicting" crime and putting officers where crime is known to exist, why not put them where crime isn't yet known to exist? If officer-visible crime is committed in equal portion by all populations, this at least has the potential to reduce bias.
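
To make the inversion concrete, here's a minimal sketch. Everything in it is hypothetical (the neighborhood names, scores, and function are illustrative, not anything PredPol actually exposes): take whatever per-area scores the model produces and rank patrol assignments from lowest predicted crime to highest.

    # Hypothetical sketch: invert a predictor's per-neighborhood scores so
    # patrols go to areas with the LEAST predicted crime first. All names
    # and numbers are made up for illustration.
    def flipped_patrol_ranking(predicted_scores: dict[str, float]) -> list[str]:
        # Ascending sort by score: least-predicted-crime areas come first.
        return sorted(predicted_scores, key=predicted_scores.get)

    scores = {"Downtown": 0.9, "Hillside": 0.2, "Riverside": 0.5}
    print(flipped_patrol_ranking(scores))  # ['Hillside', 'Riverside', 'Downtown']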

It would make me much less comfortable having police roaming around looking for crime, as they tend to find—or invent—what they're looking for... but maybe it'd be better to spread that discomfort equally than to limit it to disadvantaged communities.

Finally, I could be easily convinced that the algorithm is more impartial than officers' own choices about where to spend their spare time[1]. Of course, that depends on it being fed accurate crime data, which the article (correctly, IMO) suggests does not exist.

1. e.g., if there's a sudden smattering of crime in a white/affluent neighborhood, I suspect any reasonable algorithm would recommend patrols there more readily than the officers otherwise would.


Looks like they didn't do their research. This has been tested before by an app called SketchFactor, and of course it didn't go well: https://www.newyorker.com/business/currency/what-to-do-when-...



