Japan Today

Deepfake porn crisis batters South Korean schools

13 Comments
By Claire Lee and Hieun Shin

The requested article has expired and is no longer available. Any related articles and user comments are shown below.

© 2024 AFP

© 2024 GPlusMedia Inc.



The recent arrest of Pavel Durov shows that S Korea isn't the only country struggling with Telegram-related issues. The app really needs to clean up its act so that things like this don't happen. Freedom of expression is one thing. Child pornography is another.

11 ( +12 / -1 )

specifically to humiliate female classmates -- or even teachers.

Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn't matter. Just look at the upskirt photo problem in Japan for examples.

Men/boys have always wanted to see women/girls. It is how we are built/created.

Steve : I'm a bloke. You bent over, I looked. Shoot me.

and

Steve : It is the four pillars of the male heterosexual psyche. We like: naked women, stockings, lesbians, and Sean Connery best as James Bond, because that is what being a boy is.

and

Steve : that does not stop me wanting to see several thousand more naked bottoms before I die, because that's what being a bloke is. When man invented fire, he didn't say, "Hey, let's cook." He said, "Great, now we can see naked bottoms in the dark." As soon as Caxton invented the printing press, we were using it to make pictures of, hey, naked bottoms! We have turned the Internet into an enormous international database of naked bottoms. So you see, the story of male achievement through the ages, feeble though it may have been, has been the story of our struggle to get a better look at your bottoms.

None of this should surprise anyone.

-5 ( +2 / -7 )

What is the point of the picture of the young women at the bottom? Further sexualizing an already difficult social issue undermines the point being made.

Not too cool.

12 ( +14 / -2 )

Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn't matter. Just look at the upskirt photo problem in Japan for examples.

There are countless ways for people to see nudes; if this were only about that, there would be no need to deepfake anything. These cases are also about faking nudes of specific people with the purpose of humiliating them. The article clearly describes how the spaces are said to be used to "punish" the victims; rather than keeping the fakes for private enjoyment that nobody could prove, the criminals are sharing them for everybody to see.

The famous quote from Michael Cunningham still applies: "Everything is about sex except sex. Sex is about power."

2 ( +7 / -5 )

What is "sexualizing" about five fully clothed young women?

Hmmm... I suppose you think that only "nudes" meet the definition of "sexualization". It is also obvious that you don't know who these young women are.

They are a rather famous K-pop group called NewJeans, and their "image" is that of the "girl next door".

You really don't understand the meaning and intent of the word "sexualization" in that context if you think it refers only to a nude or semi-nude person.

I hope you learned something here.

0 ( +8 / -8 )

schoolboys steal innocuous selfies from private Instagram accounts and create explicit images to share in the chat rooms

You can see people easily identifiable in random crowd shots at sites like this.

SK is the canary in the coal mine with its hyper-digitization, but this is a worldwide situation. A digital Bill of Rights is needed.

In this late-stage capitalist society, the only way people will be protected is if it is prohibitively expensive to use their personal data, like images and writing, and if people are well rewarded when it is used.

0 ( +3 / -3 )

This is pretty scary, and surely the same problems will crop up everywhere.

But six out of seven alleged perpetrators were teenagers, police say, which complicates prosecutions as South Korean courts rarely issue arrest warrants for minors.

Maybe time to rethink that policy about teenage perps.

3 ( +3 / -0 )

In his statement after release from custody in France, Pavel Durov wrote:

> All of that does not mean Telegram is perfect. Even the fact that authorities could be confused by where to send requests is something that we should improve. But the claims in some media that Telegram is some sort of anarchic paradise are absolutely untrue. We take down millions of harmful posts and channels every day. We publish daily transparency reports. We have direct hotlines with NGOs to process urgent moderation requests faster.

> However, we hear voices saying that it’s not enough. Telegram’s abrupt increase in user count to 950M caused growing pains that made it easier for criminals to abuse our platform. That’s why I made it my personal goal to ensure we significantly improve things in this regard. We’ve already started that process internally, and I will share more details on our progress with you very soon.

> I hope that the events of August will result in making Telegram — and the social networking industry as a whole — safer and stronger.

There should be a way for Korean authorities to contact Telegram to have the content taken down. That would be the logical way to tackle the problem: set up a hotline so that these channels can be taken down as soon as they're discovered. I'm sure the government of Korea would be able to negotiate such a hotline and set up a special police section to monitor Telegram for these obscene channels.

-2 ( +0 / -2 )

What is the point of the picture of the young women at the bottom? Further sexualizing an already difficult social issue undermines the point being made.

The picture is not irrelevant or meant to sexualize the issue. If you read the caption on it, you'll know they're the K-pop group NewJeans, who were victims of deepfake porn and took legal action against it.

5 ( +6 / -1 )

South Korea is not the only country with a porn problem.

1 ( +4 / -3 )

Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn't matter. Just look at the upskirt photo problem in Japan for examples. 

Men/boys have always wanted to see women/girls. It is how we are built/created.

Speak for yourself.

Hope this deepfake stuff doesn't take hold in Japanese schools. Sadly, with J-children being far more naive than most, I would hazard a guess that this type of thing would be more believable.

As soon as Caxton invented the printing press……….

And I have to correct this: William Caxton did not invent the printing press. I thought everyone knew it was Johannes Gutenberg, many years earlier.

1 ( +3 / -2 )

I get that it isn't socially acceptable. That isn't being disputed. Countries are still trying to decide whether AI-generated images are porn or not. Are AI-generated images of child pornography illegal or not? Hopefully, that will be determined to be true. Also, converting a harmless image/video into porn without a written release from the subject should be illegal. There's little need to say any of that. The only people who will disagree are the people creating the images/videos. The laws need to catch up with technology, while not being so restrictive that the freedom to take photos of people and things in public is lost.

The crime happens when the images/videos are transformed, whether they are shared or not.

And I have to correct this: William Caxton did not invent ...

Those were all quotes from a comedy TV show called "Coupling" that was popular in the early 2000s in the UK. It wasn't any statement of fact. Rather it was a verbatim rant from a TV character. Sorry that you didn't catch the reference. It seemed apropos to me.

Good deepfakes have been around for about a decade. It is only in the last three years that commercialization and very easy-to-use websites have let unsophisticated users convert normal images, plus a description, into socially unacceptable images. There are apps for cell phones to upload and transform images. All that is required is a checkbox saying you have permission from the owner of the photo to use it. On many of those sites you don't even need a login for initial trials that produce low-resolution images, and low resolution is fine for phone viewing or websites.

Posting them on the web, even in encrypted groups, has been possible for over 40 years. Remember Usenet? Again, it is just the technology that has changed. Usenet was a key part of the internet for decades. It had great and terrible uses, just like all technology.

Our laws need to be updated to reflect what society demands. That's the point. https://www.washingtonpost.com/technology/interactive/2024/ai-bias-beautiful-women-ugly-images/

1 ( +2 / -1 )
