9 Comments
Ah_so
Very sensible idea - using data to build models of where and when crime is likely to occur allows the police to put extra resources there. Of course, police always know where some crime is likely to occur (e.g. after dark in a red light district), but AI can spot patterns not immediately obvious to the human eye.
AI can also monitor CCTV to spot situations where trouble is likely to occur, such as a fight or an argument, allowing far more CCTV to be monitored without even needing human intervention.
Toasted Heretic
I foresee problems.
Wakarimasen
Just another version of Facebook et al misusing my data. I can just about live with CCTV as it does catch criminals, but believing that the government won't either misuse or mess this type of thing up is just naïve.
ArtistAtLarge
Very dangerous.
englisc aspyrgend
An inevitable application of big data, and surprisingly accurate in its predictions. Since it is based on machine learning, the biases and assumptions of the programmer shouldn't be a problem, but the programme and the basis of its decisions become opaque, which in turn can become a problem, at least for mere humans!
fxgai
I predict more geriatric robberies of convenience stores and the like, and more public money scandals committed by people with connections.
I didn’t spend taxpayer money writing a computer program to bring this to you.
seadog538
Police nowadays seem to have abandoned the age-old practice of patrolling the streets "keeping an eye" on things---now they sit inside twiddling their thumbs and drinking tea. They might as well use the computers.
katsu78
The dangerous thing about these kinds of algorithms is that once they're tested enough to look like they reliably work, people stop checking them closely. And they don't realize that the algorithms often pick up the biases and preconceived assumptions of the programmer.
Garthgoyle
Even "Minority Report" got it wrong. Ask Tom Cruise.