I guess this is going to be the week to talk about predictive policing. There's an article over at the Santa Cruz Sentinel about the Santa Cruz, CA Police Department and their predictive policing program. There was an interesting bit in there about how they use the predictive policing model in their city.
When officers on a shift are not responding to calls for service, they are asked to check on those areas for potential crimes of auto or home burglaries, said police spokesman and crime analyst Zach Friend.
Officers might normally check those problem areas by intuition, but police said the new system adds some statistical backup.
"Many of the locations we suspected would have a high probability, but this adds a verified math component to it. This enhances the officer's intuition," said Deputy Chief Steve Clark. "We can more efficiently target these locations."
Santa Cruz police typically discuss the predicted hot spots with officers during roll call meetings before each shift, Friend said. Officers receive a map and list of hours and places with corresponding probabilities of crime.
I'm hoping that as these programs are evaluated, the methodology behind the predictive method will start being discussed in the trade journals. I'd love to be able to provide an analytical product like this for my officers, but I don't have a huge budget to buy a commercial software package to do it. I'd probably have to do this with the tools I already have.
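Just to show that a rough version of this isn't out of reach with the tools most of us already have: a crude stand-in for the hot-spot list the article describes is to bin historical incidents into grid cells and hours of the day, then rank the bins by their share of incidents. This sketch is my own assumption about one simple way to do it, not the method the Santa Cruz vendor uses, and the sample incidents and cell size are made up for illustration.

```python
# A minimal hot-spot sketch: bin incidents by grid cell and hour,
# then rank the bins. Hypothetical data; a real analysis would use
# geocoded incidents and a much longer history.
from collections import Counter

# Each incident: (x, y) map coordinates and hour of day (0-23).
# These values are invented for the example.
incidents = [
    (1250.0, 880.0, 22), (1260.0, 875.0, 23), (1255.0, 882.0, 22),
    (400.0, 1500.0, 14), (405.0, 1495.0, 15), (1258.0, 879.0, 21),
]

CELL = 100.0  # grid cell size, same units as the coordinates

def cell_of(x, y):
    """Snap a point to its grid cell."""
    return (int(x // CELL), int(y // CELL))

# Count incidents per (cell, hour) bin.
counts = Counter((cell_of(x, y), hour) for x, y, hour in incidents)

total = sum(counts.values())
# Rank bins by empirical share of incidents -- a rough proxy for the
# "corresponding probabilities" handed out at roll call.
hot_spots = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)

for (cell, hour), n in hot_spots[:3]:
    print(f"cell {cell} at {hour:02d}:00  share={n / total:.2f}")
```

Printed as a list alongside a map, that top-N output is basically the roll-call handout the article describes, minus the statistical modeling a real product would layer on top.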
Even better would be a DOJ-funded application, given away for free, that crunched the numbers and did the analysis. Similar projects have been done in the past, such as CrimeStat III for spatial statistics and crime mapping applications.
Have you given any thought to trying a predictive policing model at your agency?