This story isn’t new, but it’s new to me, and I found it fascinating because it crosses over with two of my personal interests — true crime and algorithm-based problem solving. The use of algorithms, or at least some kind of software, to get a better idea of how to catch serial killers has been growing for some time now. It’s not perfect, of course, and it’s getting better all the time. But it has always seemed to improve very slowly, given that, at the same time, the world was dumping trillions of dollars into AI algorithms that can determine things like how likely you are to buy a used Toyota in the next 30 days, how likely an aberration inside an organ is to become cancerous, or which awful reality show you’re likely to enjoy because you binge-watched three other reality shows.
It’s a bit hamstrung by the fact that US law enforcement is very fragmented and behind the curve in terms of using centralized databases and technology — and, sadly, by the fact that collectively we stand to profit far more from the Toyota algorithms than from solving the murders of the marginalized populations who often fall prey to murderers. But this reporter, Thomas Hargrove (a fellow Mizzou J-School alum, I might add), is hammering through the red tape and may just be the guy who builds the algorithms that make it much harder to get away with being a serial killer in the future. Let’s hope so, anyhow.
https://www.bloomberg.com/news/features/2017-02-08/serial-killers-should-fear-this-algorithm