Deepfake porn crisis batters South Korean schools
By Claire Lee and Hieun Shin, SEOUL
© 2024 AFP
The requested article has expired and is no longer available. Related articles and user comments are shown below.
13 Comments
raincloud
The recent arrest of Pavel Durov shows that S Korea isn't the only country struggling with Telegram-related issues. The app really needs to clean up its act so that things like this don't happen. Freedom of expression is one thing. Child pornography is another.
theFu
Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn't matter. Just look at the upskirt photo problem in Japan for examples.
Men/boys have always wanted to see women/girls. It is how we are built/created.
None of this should surprise anyone.
Yubaru
What is the point of the picture of the young women at the bottom? Further sexualizing an already difficult social issue belittles the point the article is trying to make.
Not too cool.
virusrex
There are countless ways for people to see nudes; if this were only about that, there would be no need for deepfakes. These cases are also about faking nudes of specific people in order to humiliate them. The article clearly describes how the spaces are said to be used to "punish" the victims; rather than keeping the fakes for some unprovable private enjoyment, the criminals are sharing them for everybody to see.
The famous quote from Michael Cunningham still applies: "Everything is about sex except sex. Sex is about power."
Yubaru
Hmmm..... I suppose you think that only "nudes" count as "sexualization". It is also obvious you are oblivious to who these young women are.
They are a rather famous K-pop group called NewJeans, and their "image" is that of the "girl next door".
You really don't understand the meaning and intent of the word "sexualization" in that context if you think it only refers to a nude or semi-nude person.
I hope you learned something here.
dagon
You can see easily identifiable people in random crowd shots at sites like this.
SK is the canary in the coal mine with its hyper-digitization, but this is a worldwide situation. A digital bill of rights is needed.
In this late-stage capitalist society, the only way people will be protected is if it is prohibitively expensive to use their personal data, like images and writing, and if people are well rewarded when it is used.
BB
This is pretty scary, and surely the same problems will crop up everywhere.
Maybe time to rethink that policy about teenage perps.
Mike_Oxlong
In his statement after release from custody in France, Pavel Durov wrote:
There should be a way for Korean authorities to contact Telegram to have the content taken down. That would be the logical way to tackle the problem. Have a hotline set up so these channels can be taken down as soon as they're discovered. I'm sure the government of Korea would be able to negotiate such a hotline and set up a special police section to monitor Telegram for these obscene channels.
Goat
The picture is not irrelevant or meant to sexualize the issue. If you read the caption, you'll know they're the K-pop group NewJeans, who were victims of deepfake porn and took legal action against it.
Legrande
South Korea is not the only country with a porn problem.
aaronagstring
Speak for yourself.
Hope this deepfake stuff doesn't take hold in Japanese schools. Sadly, with J-children being far more naive than most, I would hazard a guess that this type of thing would be more believable.
And I have to correct this: William Caxton did not invent the printing press. I thought everyone knew it was Johannes Gutenberg, many years earlier.
theFu
I get that it isn't socially acceptable. That isn't being disputed. Countries are still trying to decide whether AI-generated images count as porn. Are AI-generated images of child pornography illegal or not? Hopefully, they will be ruled illegal. Also, converting a harmless image/video into porn without a written release from the subject should be illegal. There's little need to say any of that; the only people who will disagree are the people creating the images/videos. The laws need to catch up with technology, while not being so restrictive that the freedom to take photos of people and things in public is lost.
The crime happens when the images/videos are transformed, whether they are shared or not.
Those were all quotes from a comedy TV show called "Coupling" that was popular in the early 2000s in the UK. It wasn't any statement of fact. Rather it was a verbatim rant from a TV character. Sorry that you didn't catch the reference. It seemed apropos to me.
Good deepfakes have been around for about a decade. It is only in the last three years that commercialization and very easy-to-use websites have made it possible for unsophisticated users to convert normal images, plus a text description, into socially unacceptable images. There are cell phone apps to upload and transform images. All that is required is a checkbox saying you have permission from the owner of the photo to use it. You don't even need a login on many of those sites for initial trials that produce low-resolution images, and low resolution is fine for phone viewing or websites.
Posting them online, even in encrypted groups, has been possible for over 40 years. Remember Usenet? Again, it is just the technology that has changed. Usenet was a key part of the internet for many decades. It had great and terrible uses, just like all technology.
Our laws need to be updated to reflect what society demands. That's the point. https://www.washingtonpost.com/technology/interactive/2024/ai-bias-beautiful-women-ugly-images/