
App developed to automatically delete children's nude selfies

73 Comments

The requested article has expired and is no longer available. Any related articles and user comments are shown below.

© KYODO



Zero chance of anything going wrong with this one. Zero.

26 ( +33 / -7 )

How did they test it to know if it works? Suspicious.

12 ( +19 / -7 )

It suggests that the problem of child porn is the children themselves...

or their phones...

26 ( +30 / -4 )

I just experienced every possible negative emotion reading this headline...

8 ( +11 / -3 )

Let's see...

What's the rate of false positives? Will the app delete perfectly normal photos and then alert parents over nothing?

Does the app run completely on the phone it's installed on, or does it send any data to a remote server?

How did this 24-year-old wannabe tech bro get hold of the data, i.e. photos of nude children, to train his "AI"?

Even if we assume pure intentions, there are too many ways this thing could fail catastrophically.

12 ( +17 / -5 )

It was born through a collaboration between app developer Smartbooks Inc. based in Tokyo, Fujita Health University in Aichi Prefecture and the prefecture's local Nakamura police station in Nagoya.

Government and industry working together on information technology development in Japan already have a great track record, right?

4 ( +10 / -6 )

Perhaps removing child porno magazines and inappropriate anime from every convenience store in Japan might be an intelligent move first.

8 ( +24 / -16 )

An app that I'll never have to use...

-5 ( +1 / -6 )

I'm just curious where this new app draws the line between children and adults. Is a 17-year-old a child or an adult?

8 ( +10 / -2 )

It suggests that the problem of child porn is the children themselves...

or their phones...

No, it doesn't!

It suggests that the people behind developing the app saw a way to try and reduce the number of images in circulation by removing said images before they leave the device.

There probably isn't one golden solution to the 'problem of child porn', rather lots of smaller protections, barriers, legislation, etc. in many different areas (mobile devices, cameras, comic books, publishing companies, search engines, etc.).

-2 ( +6 / -8 )

Circling the bowl or what!

0 ( +5 / -5 )

The question now for the app's collaborators is to figure out how to encourage children to download the app on their smartphones.

This is a though one: how to convince someone to download an app that constantly monitors the media on their phone and automatically deletes everything it decides should be deleted.

For security reasons alone, this is not something that can be recommended. One hack away and you could have a huge problem, with nude selfies being sent or collected instead of deleted, or other sensitive/important information being arbitrarily deleted.

6 ( +7 / -1 )

I don't really understand the negative comments here. At least there are Japanese people taking a proactive stance, looking to eliminate sexually explicit photos of underage children from the internet and working to stop their exploitation in the process too.

I applaud them for their work. Instead of just sucking air through their teeth, wringing their hands, and calling meetings, they actually created something that can be used to help, not hurt!

This isn't a "silver bullet" that will eliminate child porn here, but it certainly is a step in a helpful direction!

-7 ( +7 / -14 )

A couple of points: persuading teenagers up to 18 to install this, and the fact that app staff/programmers, as well as hackers, would have access to all the videos and photos on children's smartphones.

Do they copy and store them? Investigation purposes as a pretext?

5 ( +7 / -2 )

What about the nude selfies of consenting adults? Will the app erase them, too?

2 ( +5 / -3 )

Perhaps removing child porno magazines and inappropriate anime from every convenience store in Japan might be an intelligent move first.

They're already gone! They disappeared from the magazine racks of all of Japan's convenience stores before the 2020 Olympics to give Tokyo a "family-friendly" face!

Plus, removing all the ecchi Anime will only anger Japan's Otaku further!

-4 ( +3 / -7 )

The question now for the app's collaborators is to figure out how to encourage children to download the app on their smartphones.

How about encouraging parents also?

0 ( +1 / -1 )

'AI' is treated like the application of magic.

So, do tell: this 'AI', how many photos of naked kids were used to train it, where did they get them from, and did they ask permission?

You can imagine how many spoof apps will now appear that require the user to upload an image to 'teach' it what to block.

Good intentions, albeit likely to go awry. A better idea is to educate kids, at school and at home, not to do this. No app required. Parents and teachers have a duty to teach basic life skills to kids. This is a basic life skill. We should not outsource the education and protection of children to apps, and certainly not to 'AI'.

3 ( +5 / -2 )

What about the nude selfies of consenting adults? Will the app erase them, too?

Why don't you try it lol

3 ( +5 / -2 )

A couple of points: persuading teenagers up to 18 to install this, and the fact that app staff/programmers, as well as hackers, would have access to all the videos and photos on children's smartphones.

Do they copy and store them? Investigation purposes as a pretext?

Completely agreed, Rodney. While the premise seems a good one, my thoughts were exactly the same. The app has access to a child's photos, and can censor or delete them? What happens then? Is it really any different if these photos are going to the app programmers/staff rather than to another adult? Could you imagine if this was just a front to get child porn legally?

0 ( +3 / -3 )

I don't really understand the negative comments here.

That's just how some minds work.

-5 ( +6 / -11 )

Can I still send pictures of our toddlers messing about in their birthday suits to family members? Where do you draw the line? IMO a better solution would be to stop giving young kids smartphones, but then I guess people would actually have to start raising their kids properly, so probably not a popular option for many...

4 ( +6 / -2 )

How did this 24-year-old wannabe tech bro get hold of the data, i.e. photos of nude children, to train his "AI"?

I would want the highest level of security checks ever done on this app to make sure the photos aren't just getting forwarded to some giant database. Just recently a Chinese police database was hacked, exposing the information of around 1bn people. Does tech-bro have that level of security expertise? The centralised nature of anything relating to such a sensitive topic as this is worrying.

1 ( +2 / -1 )

Can I still send pictures of our toddlers messing about in their birthday suits to family members?

Probably not if you install it on your device

But why would you install it on your phone?

4 ( +5 / -1 )

I can just see the look on their face when this app deletes all photos on their phone.

1 ( +3 / -2 )

I mean, yeah, it's an idea, but there are already workarounds for pretty much every parental monitoring app out there (Google Family Link is still pretty good).

Also, if you take a picture through an app (Twitter, Line, etc.) you can share it pretty much instantly.

They have commercials, TV spots and even citywide PA announcements about people scamming the elderly. Why not actually shine some light on these dangers instead?

-2 ( +0 / -2 )

Or...

they can cut the crap, make a public national sexual predator and pedophile list, and actually enforce it?

Nah... too logical.

1 ( +10 / -9 )

I'm guessing they developed the app to detect child/human genitalia (both male and female) in photos. Sounds quite perverse. Not going to guess how they tested the app.

2 ( +4 / -2 )

Not going to guess how they tested the app.

I will, on a whole load of CP.....

2 ( +4 / -2 )

I will, on a whole load of CP.....

I guess it's a more creative excuse than "I was drunk" or "I was stressed from work".

4 ( +5 / -1 )

They should add some unrelated, but attractive features to get more people to download it.

0 ( +2 / -2 )

I am not opposed to the idea, but I wonder why it's always the victim side that has to take measures in this country, and why they don't really do anything to try to regulate sexual predators and pedophiles in a stricter way. We should at least be able to share the information of those with a sexual criminal record.

0 ( +2 / -2 )

The app, which uses artificial intelligence, was created as there has been a sharp increase in cases of adults contacting children through social media to have them send their naked pictures.

They should make this a crime if it isn't already: asking children to send naked pictures.

2 ( +4 / -2 )

But what are the children doing on social media? Who allowed them?

0 ( +3 / -3 )

""The question now for the app's collaborators is to figure out how to encourage children to download the app on their smartphones.""

Simple, have the providers install the app. in children smart phone till they reach 18.

-1 ( +0 / -1 )

This is the epitome of victim blaming. By creating this app, they're implying that children and their phones are to blame here, not the perverts manipulating them for nude pictures. What about creating an app or software that catches these adults sending messages asking for nude selfies instead, huh?

-3 ( +5 / -8 )

Again Japan is addressing the effects and not the cause. How about developing an app that tracks these perverted deviants grooming children on the internet? And then put some severe punishments in place for those who get caught. In Australia, using the internet to groom minors carries a 7-10 year jail sentence. And there is a cyber strike force who pretend to be children on the net to catch these deviants. Stop focusing on the victims and do something about the predators.

1 ( +5 / -4 )

Do children take selfies of themselves while nude?

I would say no, unless it has been requested by someone else, in order to be sent.

-3 ( +2 / -5 )

Lots of negativity here.

You don't have to install the app if you don't want to, you know?

-1 ( +3 / -4 )

A lot of people calling out victim-blaming, which is just a bit weird. Not letting your kid out after dark is not victim-blaming, it's parenting. Dangerous things can happen to kids and we aren't blaming them by looking for measures to prevent dangerous things from happening.

App seems like a good idea to me. I'm sure a few of the evil nutcases will find a way around it eventually but I can also see it preventing a lot of misery - a lot!

-1 ( +2 / -3 )

This is a though one

Tough, not though!

Knucklehead!

-6 ( +0 / -6 )

But what are the children doing on social media? Who allowed them?

Lockdowns and mask mandates have had massive psychological effects on children. They are forced to use social media.

-5 ( +0 / -5 )

Hope all the people against this are getting flagged by the ISP, lol.

-3 ( +0 / -3 )

Simple: have the providers install the app on children's smartphones until they reach 18.

Any kid can google how to find and deactivate an app. Without the cooperation of the user it would not be possible to keep it running, and since it would be obvious that what they are doing is wrong/unacceptable/etc., that would be the first thing they would do if they wanted to send nudes.

-1 ( +0 / -1 )

I don't understand.

As a parent, I did not let my kids have a smartphone until they were mature enough not to do things such as sending a photo (or making it accessible) to anyone they had never met.

Same as with the use of a knife, to make it clear: if unsure, you supervise until sure.

There will always be child predators.

Of course, as mentioned already, prevention is needed first, before action against pervs (police watchers, a public list of such persons...).

Education first.

Consider also that it could happen through any device (another smartphone, a computer, etc.), so either the app is embedded everywhere or it is useless, because pervs will just ask to use another way in that case.

-1 ( +0 / -1 )

Sounds kind of sketchy. What, did they go through pics from Interpol's child sex trafficking database? Just don't give your kid a phone. I lived without one for the first 18 years of my life and I was fine.

-1 ( +0 / -1 )

And here is another of the many nonsensical wastes of money by the Japanese government and police that solve absolutely nothing.

A deep-rooted social problem like this won't be reduced by an application; it has to be tackled at the core.

These oyajis at the helm should instead remove all the semi pedo-pornography magazines and manga from the market and push the media to stop idolizing teenage girls dressed up like dolls and made to behave submissively.

But Japan, being a male-centered society, won't take that kind of approach to definitively eradicate the problem.

Again, as usual, nothing will change.

-3 ( +3 / -6 )

And they already have child-friendly phones. They basically can't do anything other than call and send texts to parents, and they don't have a camera. They're tiny and you can see little children walking around with them around their necks.

1 ( +2 / -1 )

Absolutely pointless app. If the child is adept enough to download an app, they are just as adept at deleting or disabling it. It's not the child's phone that needs it, it's the creep's phone that needs it. And as others have pointed out, how many false positives is this thing going to send out, and is there a server somewhere this data is being sent to? So many unanswered questions.

-3 ( +0 / -3 )

Exactly this!

Anshin anzen again, not so anshin, is it?

Helluva con job getting press for this crap.

Poor child rapist criminals, let's not hammer 'em. No, let's go target the kids...

Super creepy app. Posters and a database with names, faces, numbers.

Bag and tag the child rapers. Every last one of 'em.

It suggests that the problem of child porn is the children themselves... or their phones...

-1 ( +2 / -3 )

Even if many feel that it's OK not to give their kids a phone in this day and age, that cat's already sailed.

Article says cases of adults asking kids to send nude pics are rising.

The kids already have phones.

Maybe take them away?

-3 ( +0 / -3 )

So many unanswered questions.

Maybe you should reserve judgement until you know more

-2 ( +0 / -2 )

they can cut the crap, make a public national sexual predator and pedophile list, and actually enforce it?

Great way to address the overall issue of pedophilia, but it likely will not do much about this particular issue of adults asking for nude pics over social media.

-2 ( +0 / -2 )

@ian

Maybe you should reserve judgement until you know more

Maybe the article should provide the info. If you're going to report on something, you might want to provide such important info.

0 ( +1 / -1 )

Yubaru

I don't really understand the negative comments here.

Me neither, but I suspect the complainers are either feeling frustrated about this technology denying them access to child porn in the future, or simply complain about absolutely anything positive coming out of Japan. Some don't understand how this technology works and try to find fault with the author (or with anything, really) just to have something to complain about. I'm really happy this app has been developed and really hope it is expanded in the future, not only to creation (where the child himself takes the photo) but to distribution and storage as well. And yes, means to red-flag and warn the parents (IF THEY CARE, that is).

-1 ( +2 / -3 )

Guess again.

> ebisen, Today 02:10 pm JST

Yubaru

I don't really understand the negative comments here.

Me neither, but I suspect the complainers are either feeling frustrated about this technology denying them access to child porn in the future, or simply complain about absolutely anything positive coming out of Japan.

1 ( +2 / -1 )

Maybe the article should provide the info. If you're going to report on something, you might want to provide such important info.

But they don't know your questions

-1 ( +1 / -2 )

Perhaps removing child porno magazines and inappropriate anime from every convenience store in Japan might be an intelligent move first.

Having an obligatory panty shot in just about every anime is pretty gross, especially when it's so often a schoolgirl.

1 ( +2 / -1 )

Lots of comments from people who apparently don't understand how parental control apps work, or how this particular app might identify the prohibited photos.

-Parental control apps are installed by the parents/guardians, and are password protected to prevent disabling or uninstalling.

-The app would not be installed on all phones, just the phones of specific children chosen by their parents.

-The app does not need to differentiate between naked pics of adults and naked pics of children. It can just delete or sandbox all naked pics, as there shouldn't be any naked pics on a child's phone.

That being said, knowing how horrible Japanese mobile apps are, I can't help but wonder how many inappropriate pics won't get tagged? Or, even more likely, how many innocent pics of kids in swimsuits at the beach/water park, or kids in shorts and midriff-baring tops (or no tops for boys), will get tagged as inappropriate?

0 ( +1 / -1 )

In Japan? Land of failed apps and zero cybersecurity? I give it 3 months of use before a news article says the app was hacked and all the pictures were sent to an anonymous account instead of deleted.

1 ( +2 / -1 )

Another nonsensical, overengineered measure for a simple problem.

Just use the child-lock capabilities built into all modern phones to control which apps they can use.

-2 ( +2 / -4 )

In the United States, there was a program called "To Catch a Predator" where the cops would stage sting operations in which adults/authorities would pretend to be children and lure child predators into homes using social media. They must have caught hundreds, if not thousands of pervs this way. Perhaps there are laws in Japan that prevent catching criminals this way, but I bet they'd catch loads. So many pervs in this country.

3 ( +3 / -0 )

I'm curious how the app decides what counts, and what it will do with detected images in terms of its servers. I mean, Japan has a bizarre fetish for dressing women as little girls in junior high school "sailor" outfits which they then take off in porn. And since 19-year-olds are still children in many cases, will they, too, be barred? Again, how do we know, in a society which sexualizes even 12-year-olds on the covers of mags like Weekly Jump?

-5 ( +1 / -6 )

Despite all of the negative commentary, this app is another step in the right direction. Grooming has not only been a recent issue in America with American corporations enabling and promoting the sexualization of children, but Japan continues to have its share of groomers from Japan’s beginnings to the present day. More needs to be done.

-2 ( +1 / -3 )

Somewhat like preemptively wearing an ankle monitor to reduce the overall crime rate... The potential harm actually outweighs the good. I highly doubt this will be the next big thing.

0 ( +0 / -0 )

Just don't let them have a phone!?

Interesting watching people without kids tell parents how to do it.

1 ( +1 / -0 )

Do children take selfies of themselves while nude?

I would say no, unless it has been requested by someone else, in order to be sent.

Sure, but many, many of them are requesting those images, and teens sending nudes to other teens is extremely common. It's not like they're being blackmailed for the most part - they do it willingly.

2 ( +2 / -0 )

How will a parent be able to get those classic 6-in-a-tub bath photos which are so great to show potential spouses 20-30 yrs later? That's what my parents did: got everyone under 7 yrs old lined up in the tub. Fortunately, I wasn't there.

Ah .... teens ... well, that's completely different.

I was innocently thinking of "nude" 2-year-olds in a bathtub with suds on their heads... or I have one of a neighbor's 4-year-old peeing in the street towards the house of a former friend, with his parents laughing in the background. It was a really funny moment, that's all. Any parent knows that small kids will run around the house and dance nude as a game because it freaks out the parents while they are giggling the entire time. Nothing nefarious, just kids being kids.

0 ( +1 / -1 )

I'm not sure which is worse: the disingenuous suggestion that there's a significant problem with strangers finding out the mobile phone numbers of minors at random, or the subversive and covert manner in which they plan to deploy a spy tool on the communication devices of our children. If this is what it's cracked up to be, then it should be marketed to parents of susceptible children.

I think it's more likely than not that they've discovered, while rifling through people's files using covert hacking tools, that some kids have been experimenting with their phones and their anatomy. And the real fear is that others have these tools as well and may exploit this vulnerability.

0 ( +0 / -0 )
