
Study: Crime prediction algorithms do no better than a crowd of volunteers


From Maria Temming at Science News:

Computers get a say in these life-changing decisions because their crime forecasts are supposedly less biased and more accurate than human guesswork.

A comparison of the volunteers’ answers with COMPAS’ predictions for the same 1,000 defendants found that both were about 65 percent accurate. “We were like, ‘Holy crap, that’s amazing,’” says study coauthor Hany Farid, a computer scientist at Dartmouth. “You have this commercial software that’s been used for years in courts around the country — how is it that we just asked a bunch of people online and [the results] are the same?”

There’s nothing inherently wrong with an algorithm that only performs as well as its human counterparts. But this finding, reported online January 17 in Science Advances, should be a wake-up call to law enforcement personnel who might have “a disproportionate confidence in these algorithms,” Farid says.

Farid doubts that computers can do much better. He and Dressel built several algorithms, simple and complex, that used two to seven defendant features to predict recidivism. Like COMPAS, all of their algorithms maxed out at about 65 percent (D-level) accuracy. That makes Farid wonder whether trying to predict crime with anything approaching A+ accuracy is an exercise in futility. More.
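The study's striking result, that a classifier using as few as two features matches COMPAS's roughly 65 percent accuracy, is easy to illustrate. The sketch below is purely hypothetical: it generates synthetic defendant records (the real COMPAS data is not reproduced here) and scores a deliberately simple two-feature rule against them, just to show how such an accuracy figure is computed.

```python
import random

random.seed(0)

# Synthetic stand-in for the study's setup: each record has two features
# (age, number of prior convictions) and a recidivism label.
def make_defendant():
    age = random.randint(18, 60)
    priors = random.randint(0, 10)
    # Hypothetical ground truth: more priors and lower age raise the
    # chance of reoffending, plus random noise.
    p = 0.25 + 0.04 * priors - 0.003 * (age - 18)
    return (age, priors, random.random() < p)

defendants = [make_defendant() for _ in range(1000)]

# A deliberately simple two-feature rule, echoing the finding that
# tiny classifiers can match commercial risk-assessment software.
def predict(age, priors):
    return priors >= 3 and age < 40

correct = sum(predict(age, priors) == label
              for age, priors, label in defendants)
accuracy = correct / len(defendants)
print(f"accuracy: {accuracy:.2f}")
```

On this fabricated data the exact number is meaningless; the point is that "65 percent accurate" is simply the fraction of defendants for whom the prediction matched the recorded outcome, a bar a very small rule can clear.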

Maybe computers would be better at predicting crime among sociopathic robots than among humans. 😉

See also: Math prof asks Rob Sheldon: But how do we know that it isn’t a conscious machine?

and

Why human beings cannot design a conscious machine: Basic physics would suggest that even that single neuron has properties that cannot be duplicated by all the world’s supercomputers running Attoflop simulations.

2 Replies to “Study: Crime prediction algorithms do no better than a crowd of volunteers”

  1. polistra says:

    Cops don’t have confidence that the algorithms will work. They have confidence that using the algorithms will cut down on lawsuits and consent decrees by the Federal dysgovernment.

    Human judgment is illegal.

    It’s better to maintain SOME functionality with the algos than to be shut down and jailed by the Department of Misjustice.

  2. mahuna says:

    Since we’re actually talking about Recidivism and not the decision of an “innocent” to commit his first crime, I have to believe that the 65% thing is something more like “chance of a parolee being CONVICTED of a new offense”. Career criminals are basically, well, CRIMINALS. They are antisocial and believe they are smarter and deserve more “stuff” than fools who work 40 hours a week. Or as one sociologist concluded, “Programs to ‘re-introduce’ convicts to society fail because the convicts have never been INTRODUCED to society.”
