
Voices in Japan

Have your say

What do you think about police using computer programs to predict where crimes may occur, and who might commit them?

9 Comments

Very sensible idea - using data to build models of where crime is likely to occur allows the police to put extra resources there. Of course, police always know where some crime is likely to occur (e.g. after dark in a red light district), but AI can spot patterns not immediately obvious to the human eye.

AI can also monitor CCTV to spot situations where trouble is likely to occur, such as a fight or argument, allowing far more CCTV to be monitored without even needing human intervention.

-1 ( +0 / -1 )

I foresee problems.

0 ( +0 / -0 )

Just another version of Facebook et al. misusing my data. I can just about live with CCTV, as it does catch criminals, but believing that the government won't either misuse or mess this type of thing up is just naïve.

0 ( +1 / -1 )

Very dangerous.

1 ( +2 / -1 )

An inevitable application of big data. Surprisingly accurate in its predictions. As it is based on machine learning, the biases and assumptions of the programmer shouldn't be a problem, but the programme and the basis of its decisions become opaque, which in turn can become a problem, at least for mere humans!

1 ( +1 / -0 )

I predict more geriatric robberies of convenience stores and the like, and more public money scandals committed by people with connections.

I didn’t spend taxpayer money writing a computer program to bring this to you.

2 ( +2 / -0 )

Police nowadays seem to have abandoned the age-old practice of patrolling the streets, "keeping an eye" on things; now they sit inside twiddling their thumbs and drinking tea. They might as well use the computers.

3 ( +3 / -0 )

The dangerous thing about these kinds of algorithms is that once they're tested enough to look like they reliably work, people stop checking them closely. And they don't realize that the algorithms often pick up the biases and preconceived assumptions of the programmer.

4 ( +5 / -1 )

Even the "Minority Report" had it wrong so. Ask Tom Cruise.

4 ( +4 / -0 )
