Korean Police Announce 7-Month Crack Down on AI Deep Fake Sex Images
Deep fake sex images are now the biggest story in South Korea, and it is a problem that is going to be highly difficult to stop due to the spread of the AI technology that makes this possible:
Police will carry out an intensive crackdown on deepfake sex images as a recent series of such crimes stoked fears that any woman could fall victim, officials said Tuesday.
During the seven-month crackdown set to begin Wednesday, police will aggressively hunt down those who produce and spread such images, especially those of children and teens, the National Police Agency said.
According to the police agency, 297 cases of deepfake sexual exploitation crimes were reported nationwide from January to July. Among 178 people charged, 73.6 percent, or 113 individuals, were known to be teenagers.
Political parties and rights groups also called for stern punishment and an active investigation.
“Fear is growing, with nearly 220,000 members estimated to have participated in these deepfake porn chat rooms on Telegram,” said Son Sol, a co-chair of the minor opposition Progressive Party’s task force on the issue.
You can read more at the link, but the AI deep fakes have also been linked to the ROK Army, with a chatroom that specializes in passing around images of female soldiers as well. It makes me wonder how long it will be before this becomes a problem in the U.S. military too.
Why are these folks so gormless, feckless, and pusillanimous? At that age I was always busy at my job or out with my friends. The online culture is creating weenies.