- A Florida sheriff's office deployed a futuristic algorithm that uses crime data to predict who is likely to commit another crime.
- In a sweeping six-month investigation published this week, The Tampa Bay Times found that the algorithm relies on questionable data and arbitrary decisions, and leads to the serial harassment of people without any evidence of specific crimes.
- Officers reportedly go to the homes of people singled out by the algorithm, charging them with zoning violations and making arrests for any reason they can. Those charges are fed back into the algorithm.
- The new report shines a light on the pitfalls of algorithm-driven policing and casts doubt on high-tech, AI-powered tools meant to fight crime.
A central Florida sheriff built an algorithm meant to predict which people in his jurisdiction were likely to commit a crime in the future.
But according to a six-month investigation published this week by The Tampa Bay Times, the high-tech tool deployed by the Pasco County Sheriff's Office didn't lead to a reduction in violent crime — instead, dozens of families singled out by the algorithm said they were routinely harassed by deputies, even when there was no evidence of a specific crime.
In September of 2019, deputies reportedly showed up at 15-year-old Rio Wotjecki's door because the algorithm had determined that Wotjecki was one of the county's "Top 5" at risk of committing more crimes.
Before that, Wotjecki had been arrested only once, a year earlier, for sneaking into a carport and stealing motorized bicycles. He served time in juvenile detention for that offense and hadn't committed any subsequent crimes — but because of the algorithm, police showed up at Wotjecki's house to question him at least 21 times beginning with that September visit, Wotjecki's mother told The Tampa Bay Times.
Shortly after one visit from deputies in January, Wotjecki began experiencing difficulty breathing and collapsed in his home. His mother told the newspaper that she called an ambulance, and that an emergency room doctor later found that Wotjecki was experiencing the effects of extreme anxiety. But the deputies' visits to his family's home didn't stop.
According to the Tampa Bay Times report — drawing on court records, police body camera footage, the testimony of dozens of people targeted by the sheriff's algorithm, and interviews with former employees of the Pasco Sheriff's Office — the predictive policing tool relied on questionable data sources and arbitrary decisions.
People's past criminal records — including charges that were later dropped — were fed into the algorithm to determine potential future offenders. Former employees of the sheriff's office said deputies were instructed to visit the homes of people the algorithm selected, charge them with zoning violations, and make arrests for any reason they could. Those violations and arrests were then fed back into the algorithm.
The report highlights the pitfalls of algorithm-driven policing, sometimes called predictive policing, which relies on past crime data to predict future offenders. Civil rights groups have called the practice unconstitutional, and law enforcement researchers question its efficacy — in recent years, police departments in major cities including Los Angeles and Richmond, Virginia, discontinued their predictive policing programs due to concerns over their fairness and effectiveness.
In a statement to Business Insider, a spokesperson for the Pasco County Sheriff's Office defended the practice, arguing that other police departments use similar methods, and accused the Times of depicting "basic law enforcement functions" as unnecessary harassment. The Sheriff's Office also published a Facebook post criticizing the Times' report.
"Unfortunately, the media outlet responsible for this piece didn't feel compelled to shed light on facts, and instead chose to weave a salacious, fictitious tale," the PCSO spokesperson said.