
Apple to scan U.S. iPhones for images of child sexual abuse

64 Comments
By BARBARA ORTUTAY and FRANK BAJAK

The requested article has expired and is no longer available. Any related articles and user comments are shown below.

© Copyright 2021 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.

©2021 GPlusMedia Inc.


This can go awry pretty badly.

Having millions of parents with pictures of their own kids in the bath, and infants or very small kids who might not have a shirt on or whatever, is going to ping an alarm at some office in the US where some people can look at them and make decisions?

No way.

38 ( +39 / -1 )

Who owns these photos?

2 ( +7 / -5 )

The thin end of the wedge

18 ( +18 / -0 )

Apple said its messaging app will use on-device machine learning to warn about sensitive content without making private communications readable by the company. The tool Apple calls “neuralMatch” will detect known images of child sexual abuse without decrypting people's messages. If it finds a match, the image will be reviewed by a human who can notify law enforcement if necessary.

So for covert operators who use their phones to document scenes of death and destruction on missions, or for law enforcement themselves, will the software also flag the data on their phones? Or will certain encryption protocols be enabled? How about animal abuse? Will the tool be available on GitHub? I think not.

4 ( +5 / -1 )

So all the creeps have to do is not use Apple products to get away with it.

4 ( +6 / -2 )

Big Brother IS watching. The camel's nose poking into the tent is followed by the whole body.

18 ( +18 / -0 )

I'm all for protecting children, but scanning phones that people own is a step too far.

23 ( +23 / -0 )

Creepy. I sometimes get messages from people I don't know, which I just ignore, but anyone can send a photo/image.

16 ( +16 / -0 )

Land of the Free, huh?

Free to do what we tell you to, more like.

10 ( +10 / -0 )

China is going to love this!

17 ( +17 / -0 )

Big tech spying. This is why you don't want to store your pictures in the cloud. Get a real pocket-size digital camera so everything stays locally on a mini SD card. With all these smart devices, the company has a back-end door in place to access your device whenever they feel like it. No privacy at all.

7 ( +7 / -0 )

Our freedom NOT to be followed, tracked, etc. has already been disappearing fast with facial recognition and the like. Within a decade it's going to be scary; we will all essentially have our entire lives recorded and tracked......

I would hate to be young now; in all likelihood they will NEVER know freedom!

10 ( +10 / -0 )

How about using this money to fight online scammers instead?

7 ( +7 / -0 )

Sounds like a good way for the Russians or whomever to set someone up for prosecution.

4 ( +4 / -0 )

Scary. Maybe go back to a flip-open cell phone.

3 ( +3 / -0 )

Imagine:

Police come to your door. "Ma'am, we have a search warrant to search all PCs, laptops, cell phones, and HDDs. It seems Apple picked up on some photos that look questionable."

"Nothing found here, guys."

"Well guys, wrap it up; just your normal bubble-bath rubber-ducky parenting photos here. False alarm." Meanwhile the rumors are out in your community. You're ruined!

When is enough enough?

This is the most garbage kaka doo doo thing I have ever read. Yeah, I am all for catching predators that prey on children, but this is not the way to do it.

5 ( +6 / -1 )

@Reckless, Hell ya!

0 ( +0 / -0 )

Apple said its messaging app will use on-device machine learning to warn about sensitive content without making private communications readable by the company. The tool Apple calls “neuralMatch” will detect known images of child sexual abuse without decrypting people's messages. If it finds a match, the image will be reviewed by a human who can notify law enforcement if necessary.

I'm not sure that's entirely accurate. The messaging feature and the detection of known CSAM content on iCloud appear to be separate things (see the link to Apple's site below). The messaging feature will warn parents if a child views a picture that is thought to be sexually explicit. The notification to law enforcement happens if you upload "known" images of child sexual exploitation to iCloud. I'm not sure if that makes things better or not, but it would help if we got a clear explanation of what is intended to happen.

https://www.apple.com/child-safety/

0 ( +0 / -0 )

The only people who would object to this are those who have something to hide.

-8 ( +2 / -10 )

This is a tough one. The use case is an excellent one, but the potential for abuse is concerning.

2 ( +2 / -0 )

The only people who would object to this are those who have something to hide.

Slippery slope theory suggests otherwise.

5 ( +5 / -0 )

Might as well get a Huawei phone and let China scan your device through backdoors. Sometimes when I'm talking about buying an item that I've never searched for on my computer or phone, just talked about, a few days later I get ads about that specific item or service on Google, Facebook or IG. Meaning your phone is always listening. And now Apple wants to scan your phone for "child porn" images? Nooooo thanks.

3 ( +3 / -0 )

The Big Brother approach.

FB blocked an image of President Obama that I posted, saying it violated 'community standards'. It was a current photograph related to his 60th birthday.

That illustrates the refined nature of their 'algorithm'.

FB also blocked photographs of the War in Vietnam.

Pedophilia is abhorrent. As is any manner of sexual abuse.

However. Private entities empowered to conduct surveillance of the public at large and act as agents of law enforcement, will eventually violate and abuse such powers on behalf of the state. Mass surveillance should never be allowed under any circumstance.

Totalitarianism will arrive at the hands of techno-fascists, under a liberal guise, wielded by elites. It will initially assert benign intentions.

4 ( +4 / -0 )

Good news, and thank you Apple.

"governments looking to surveil their citizens"

Remember what Mr. Snowden once said: "If you got a smart phone, anyone can monitor every move you make, even when the phone is turned off." So I am not worried about Apple's new scan; it keeps the CREEPS in check.

-6 ( +0 / -6 )

Next Google Advert: "American Pedos Buy Android"

-1 ( +0 / -1 )

I have hated Apple since the mid-1980s, when I was a programmer, and would never own any Apple product!

0 ( +3 / -3 )

While I love the iPhone and iPad devices, I cannot wait for the Google Pixel 6 and 6 Pro. I heard that Google is going all out and making them ultra-premium devices, and they even made a custom chipset to go with them. The new information we are getting is that the camera is the best we have ever seen compared to other top devices.

-2 ( +0 / -2 )

Encryption needs to be secure and trusted. This breaks that trust. It will go wrong and will be exploited by dictatorships. On the plus side, it highlights a serious issue that most users do not know about.

It is a huge wake-up call to anyone that has commercially or politically sensitive material on a system using cloud storage. At the very least, Apple systems should not be used by governments and other entities unless they are happy to share everything with Washington. Apple is a US company and has to comply with national security-based requests. If there is a back door, the USG will use it. Other governments will demand to use it too, especially in China.

It raises the issue of what else is scanned on your device/PC. Is Microsoft scanning the contents of your HDD? Is Google checking your attachments?

This is the ultimate back door for governments to lift information about dissidents, other governments or trade secrets.

If you are handling sensitive information, you may need to do it offline, encrypt it using a proprietary encryption routine, transfer it to an internet connected system, and then e-mail it. At the other end, you need to decrypt it on an offline system. Perhaps we can no longer trust the major OS providers (who are all American).
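A rough sketch of that encrypt-offline-then-transfer step, for illustration only. This uses a one-time pad built from the Python standard library so it is self-contained; in practice you would use a vetted library, and the key must be generated offline, be as long as the message, and never be reused or sent over the same channel as the ciphertext.

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # Generate a random key as long as the message (one-time pad),
    # on the offline machine.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # Run on the offline machine at the receiving end.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"pre-patent design notes"
ciphertext, key = otp_encrypt(message)
# Only the ciphertext moves to the internet-connected system for e-mailing;
# the key travels by a separate channel.
assert otp_decrypt(ciphertext, key) == message
```

The point of the sketch is the workflow, not the cipher: plaintext and key never touch the online machine, so whatever the OS or cloud service scans sees only random-looking bytes.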

Note that increasingly, some software will only work with a live internet connection, so you may never be able to secure your data using it, if it is auto-scanning. This is a huge security risk.

If you are developing next generation technologies, pre-patent, or discussing government policy with your political colleagues, you may not be able to trust online 'software as a service' products as offered by the major vendors. You may need to find a product that works on an offline system and switch to it. Now. Today.

The continued Japanese passion for the fax may spread.

There are other ways to chase perverts which do not strip the internet of secure encryption.

Any governments and industrial entities who do not wish to share everything they do online with Washington, really need to wake up to the implications of this.

1 ( +2 / -1 )

This can go awry pretty badly. 

Having millions of parents with pictures of their own kids in the bath, and infants or very small kids who might not have a shirt on or whatever, is going to ping an alarm at some office in the US where some people can look at them and make decisions?

No way.

I agree. Very slippery slope, and Apple doesn't have the right to spy on anyone, which is a separate issue, because this can spread out to other things. What happens if we get to the point where these phones share information, and to what ends will it be used? Entrapment, extortion, bribery? This could potentially destroy a lot of lives.

0 ( +2 / -2 )

Postscript. The 'next big thing' in tech may be the widespread adoption of secure, sandboxed computing environments, offering a Works/browsing package protected from the host OS.

These exist, usually to run older OSs, but more attention will now have to be paid to how effectively the data within them can be protected from being scanned.

-1 ( +0 / -1 )

Apple has also been scanning user files stored in its iCloud service, which is not as securely encrypted as its messages, for such images.

And this is the very reason I bought my own cloud. That way Google and Apple can keep their services and do as they please, and I can have my own privacy with a private company. I felt like these giants were slowly going to do this and infringe on everyone's civil liberties, and it's finally here.

-2 ( +0 / -2 )

The only people who would object to this are those who have something to hide.

Did your parents ever take photos of you as a baby taking a bath or just running around the house naked? Now that could land you in jail for a very long time, followed by a life spent registering as a sex offender. And Apple will help put these dastardly "law breakers" in the slam and ruin their lives. I hope that makes you very happy.

1 ( +3 / -2 )

Do the hustle, Aug. 7 04:25 pm JST

The only people who would object to this are those who have something to hide.

Please post your usernames and passwords for all your accounts, you have nothing to hide, so just let everyone look. Or, leave your front door unlocked; better yet, remove the door - you have nothing to hide.

The 'nothing to hide' argument is bs. We all hide things all the time. We close curtains, wear clothes, keep our ID in wallets instead of hanging from a lanyard around our necks, the list is pretty much infinite.

How about we genetically create some telepaths to scan everyone's brain every day, to check they're "being good". /s

0 ( +2 / -2 )

GoGoGo, once you connect your Bluetooth device, anybody nearby can detect the device you are connected to.

0 ( +0 / -0 )

I stopped using Google mail and using them as a default search engine, I stopped using Facebook, and it looks like Apple is next on the list. It's not their job. Why not scan for sex traffickers, drug dealers, insider trades, kids skipping school, looters, and people running red lights as well? Better yet, why not just give the Apple Corporation total control over law enforcement?

Rollerball (1975)

0 ( +1 / -1 )

So, unless you have registered child porn images, or your child is in the database for sexually abused children, you have no reason to expect a knock on the door from law enforcement. One doesn't need to be Einstein to comprehend that, although some are already churning out conspiracy theories, because that's what they do in their psychotic state of mind. How it's used by foreign powers is a different question with possible negative answers. Time will reveal more about this terrific initiative by Apple. By the way, I read that Dropbox and some other cloud services already scan for child porn.

1 ( +1 / -0 )

Totally agree: an invasion of privacy, and should such a false positive occur, there is a potential issue where investigations of phone content could lead to social stigma. Apple is not one to be trusted; you can't sue them, you can't complain against them, they just do what they want. It also goes a bit further... what happens with any bootleg music that you may have on your phone? Or any documents, etc.?

Having some third party review your personal media on your iPhone is like inviting them into your bedroom to watch you while you have sex with your wife/girlfriend. Undoubtedly, they'll use their own phone to record whatever they see, and where will that end up? And who do you sue thereafter? Apple?

Also, on that matter, "Big Tech" is in bed with the CCP... so guess who's watching you.

-1 ( +1 / -2 )

Many tech companies are scanning photos. Apple is unlikely to go ahead after the user outcry over spying on them.

Would something like this be just America based servers or a worldwide scan. Different countries have different laws.

1 ( +2 / -1 )

Apple will do what they please; they already mess around with your phone remotely. Ever wonder how some behaviors change without an upgrade? (Sadly a little-known/unpublicized fact... [source: internal])

-1 ( +1 / -2 )

@Mat - that's where we're heading... thought control... look at North Korea and China's CCP...

-1 ( +1 / -2 )

Many tech companies are scanning photos. Apple is unlikely to go ahead after the user outcry over spying on them.

Will they actually go into your phone? Or just into your synced cloud? Google already does this on their servers, not your devices, looking for disturbing stuff like child porn. It's really nothing new (unless they intend on going through your phone, which is puzzling).

1 ( +1 / -0 )

 I bought my own cloud 

How does it work? Did you build your own apps too? You can still get nailed for stuff if you are caught in a certain jurisdiction.

2 ( +2 / -0 )

Apple cannot access its iPhones or iPads. The data is encrypted.

0 ( +1 / -1 )

Slippery slope. I bought my iPhone for my personal use. Apple shouldn’t have the power to control what I send and receive.

If I use it for crimes, that's my choice. A company I paid shouldn't have a say in it.

0 ( +1 / -1 )

Big Brother and the thought police in action. Wonder when it will be possible to scan people's brains for "criminal" thoughts?

-1 ( +0 / -1 )

Soon computer implants will be available for the human brain. Will Apple or the government scan people's brains for improper thoughts or criminal memories?

-1 ( +0 / -1 )

So, unless you have registered child porn images or your child is in the database for sexually abused children you have no reason to expect a knock on the door from law enforcement.

Number one, I don't believe that for a nanosecond. The cops go psycho when they think they have someone. Number two, I have people leaving messages with attachments every single day. I don't open them, but nonetheless they are there. I didn't ask for any of these, and I don't even know the callers, but if they sent an image on somebody's naughty list, I'm the one who could go to jail.

-1 ( +0 / -1 )

Time to sell my iPhone 8! The question is what to replace it with. Will these companies announce scanning of their phones as well? I agree with Desert Tortoise; American police would just break down the door, guns drawn.

-1 ( +0 / -1 )

So, a picture of a puppet shelf at a toy store brings you directly into handcuffs, police interviews, and a search of your whole place. Just for prevention, of course.

-1 ( +0 / -1 )

What people aren't saying is that the tech companies have been doing this for years. U.S. law requires tech companies to flag cases of child sexual abuse to the authorities.

Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics.

That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users.

However, this is the first time it is being done on-device. But you can bet that Google and Facebook will follow suit.

Is this a slippery slope? I hope not. I guess we shall see.

0 ( +0 / -0 )

Well, there goes Apple's iPhone business in America...

-1 ( +0 / -1 )

Electric eyes are everywhere!! Nothing is private anymore!

-1 ( +0 / -1 )

Desert Tortoise

Did your parents ever take photos of you as a baby taking a bath or just running around the house naked? Now that could land you in jail for a very long time followed by a life spent registering as a sex offender.

Ummm. No. Only if those pics are in the CSAM (Child Sexual Abuse Material), a.k.a. child pornography, database maintained by the NCMEC (National Center for Missing and Exploited Children).

-1 ( +0 / -1 )

Ummm. No. Only if those pics are in the CSAM (Child Sexual Abuse Material), a.k.a. child pornography, database maintained by the NCMEC (National Center for Missing and Exploited Children).

I don't begin to trust either Apple or a law enforcement agency to be that discriminating. What I expect is the zealous jihadi-crusader mentality when it comes to anything they can twist into a sex crime, arresting all and sundry. There are documented cases of men urinating outside behind a bush, because there was no restroom available, being charged with and subsequently convicted of indecent exposure and forced to register as sex offenders. What sex? The guy just had to go. My wife and I have both been caught on rare occasions having to take an alfresco pee when there was no other alternative. It happens. To conflate that into a sex crime is ridiculous, but it happens. That's why I don't trust anything these people say. Not for one second. They will do the same to your baby pictures and gleefully ruin your life while thinking they did something good for society. Do not trust these people or their intentions.

0 ( +1 / -1 )

I don't begin to trust either Apple or a law enforcement agency to be that discriminating.

Apple is focused on personal security more than any other platform. They stand up against law enforcement demands for information (see https://www.adn.com/nation-world/2021/04/14/to-unlock-a-terrorists-iphone-the-fbi-turned-to-an-obscure-company-in-australia/) and against information scrapers like Facebook. And unlike Google, their revenue is not based on advertising, so they don't have any reason not to protect your information as best they can; in fact, it's in their best interest to protect it, as they market their ecosystem based on their protection of your information.

1 ( +1 / -0 )

 and in fact it’s in their best interest to protect it, as they market their ecosystem based on their protection of your information.

Never trust a multinational corporation. They never have anyone's interests but their shareholders and top managers at heart. Please don't be fooled by their advertising. Those are all nice sounding lies.

-1 ( +0 / -1 )

Maybe it is time to dump the smartphone and iPad as well. If they can scan it, images could be planted on the phone. It would save me a lot of money going to a prepaid flip phone.

0 ( +1 / -1 )

Apple cannot scan your iPhone. They can scan iCloud, so just stop using iCloud and use another service provider. Even if they scan iCloud, they are only looking for images already registered by the National Center for Missing and Exploited Children.

A nude photo of yourself or your children in the bath will not be matched.

1 ( +1 / -0 )

Never trust a multinational corporation. They never have anyone's interests but their shareholders and top managers at heart.

I don't trust them. I'm a businessman in the IT industry, and I know how it works. It is in Apple's best interest to do their best to protect their users' information, since that is their point of marketing. By providing higher security, they sell more devices at a higher cost, increasing their bottom line and that of their investors and top managers.

I trust economic motivation, not Apple. I also look at past history. Remember, Apple was one of the first companies to implement end-to-end encryption of messages, in iMessage. And Apple stood up to the FBI when the FBI demanded a backdoor into an iPhone; Apple wasn't even willing to build one, and in fact later added protections to prevent the method the FBI and other agencies used to get into that phone from being used on newer iPhones. They have also consistently stood up to American government demands to build backdoors and/or weaken security. And they recently took measures that angered Facebook by preventing Facebook from "fingerprinting" users to identify them even when not logged in.

These are actions Apple has taken in the real world. So they have a financial motivation to protect their users' information, they have no particular motivation not to protect their users' information, and they have a history of protecting their users' information.

But you claim they shouldn't be trusted. Ok, I'm open-minded. Tell me what Apple has done to show that they cannot be trusted, or that they don't actually want to protect user information. You must have something beyond just a general distrust of large corporations, right?

1 ( +1 / -0 )

One thing the non-technophiles in this thread clearly don't understand is that images won't even be scanned in the usual sense. Images will be hashed, meaning an image (or any file) is converted to an alphanumeric code which, unlike encryption, cannot be reversed. In other words, it is impossible to reverse-engineer a hash to determine the image used to create the hash.

The hash of the image will then be compared against a list of known child-porn image hashes. If there is a match, they know the uploaded image from which the hash was created, is the same as a known child-porn image.

Nothing in the image itself will be inspected, as it is impossible to glean any information about the original image from the hash. Hashing algorithms by definition (and by testing) must not allow any information about the original input to be recovered from the hash.

Incidentally, secure logins on websites, cryptocurrencies, NFTs, and all digital signatures work using one-way hashing algorithms.
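As a rough sketch of that matching step, here it is in Python with a cryptographic hash. Keep in mind this is a simplification: by most accounts Apple's system uses a perceptual hash (NeuralHash) that tolerates resizing and recompression, whereas SHA-256 here only matches byte-identical files; the hash list and function names below are made up for illustration.

```python
import hashlib

# Hypothetical list of hashes of known flagged images (illustrative values only).
KNOWN_HASHES = {
    hashlib.sha256(b"flagged-image-bytes").hexdigest(),
}

def matches_known_image(image_bytes: bytes) -> bool:
    # One-way: the digest reveals nothing about the image content,
    # but identical inputs always produce identical digests.
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

# Only an exact byte-for-byte copy of a listed image matches.
print(matches_known_image(b"flagged-image-bytes"))  # True
print(matches_known_image(b"family-photo-bytes"))   # False
```

Note that with a plain cryptographic hash like this, changing even one byte of the image produces a completely different digest, which is exactly why a perceptual hash is needed in practice.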

The only worry here is that, as others have mentioned, government agencies could somehow slip hashes into the list that correspond to anti-government imagery rather than child pornography. The government might then be notified of the ownership of the image. I would hope Apple has come up with some way to mitigate this, though I don't see what it would be. If I were a dissident, I certainly wouldn't be using the cloud to store my images.

1 ( +1 / -0 )

Desert Tortoise

I don't begin to trust either Apple or a law enforcement agency to be that discriminating. What I expect is the zealous jihadi crusader mentality when it comes to anything they can twist into a sex crime and arrest all and sundry. There are documented cases of men urinating outside behind a bush because there was no rest room available being charged with and subsequently convicted of indecent exposure and being forced to register as a sex offender. What sex? The guy just had to go. 

But this is different. It's just a digital hash of an already flagged image that is matched.

0 ( +0 / -0 )

A lot of people on this thread are commenting on the article, which is rather short on details. So some conclusions are wrong. If you want to understand exactly what Apple are doing, here is an excellent explainer: https://www.youtube.com/watch?v=Dq_hvJI76IY&list=TLPQMTAwODIwMjEp1DxVeaeoGA&index=3

0 ( +0 / -0 )

But this is different. It's just a digital hash of an already flagged image that is matched.

I receive unsolicited texts with attachments pretty much daily. I don't open them, fearing malware. What happens if someone maliciously sends a text with one of the flagged images attached? Sounds like the makings of a long legal nightmare, even if I were to eventually be acquitted of wrongdoing.

0 ( +0 / -0 )

What happens if someone maliciously sends a text with one of the flagged images attached?

You probably shouldn’t save it to your camera roll then to the cloud if that happens.

I’m not sure why you would save the image to your camera roll if you found it distasteful though. I’d more likely report the sender to the police myself, or at the very least just delete the message they sent without saving the image.

And it does seem to me that the act of choosing to save that image indicates your intention to keep it, so I do believe there would be justification for prosecuting possession of child pornography in such a case anyway.

1 ( +1 / -0 )
