NEW YORK, April 16 (AP) — Artificial intelligence imaging can be used to create art, try on clothes in virtual fitting rooms or help design advertising campaigns.
But experts worry that the dark side of these easily accessible tools could exacerbate something that primarily hurts women: non-consensual deepfake pornography.
Deepfakes are videos and images that have been digitally created or altered with artificial intelligence or machine learning. Porn created using the technology first began circulating on the internet several years ago, when a Reddit user shared clips that placed the faces of female celebrities on the bodies of porn actors.
Since then, deepfake creators have disseminated similar videos and images targeting online influencers, journalists, and others with public profiles. Thousands of videos exist across numerous websites. Some sites offer users the chance to create their own images — essentially allowing anyone to turn whoever they wish into a sexual fantasy without their consent, or to use the technology to harm a former partner.
Experts say the problem is growing as it becomes easier to create sophisticated and visually convincing deepfakes. And they say it could get worse with the development of generative AI tools that are trained on billions of images from the internet and spit out new content using existing data.
“The reality is that this technology will continue to proliferate, it will continue to evolve, and it will continue to become as easy as pressing a button,” said Adam Dodge, founder of EndTAB, an organization that provides tech-enabled abuse training.
“As long as this happens, people will undoubtedly … continue to misuse the technology to harm others, primarily through online sexual violence, deepfake pornography, and fake nude photos.”
Noelle Martin, of Perth, Australia, has lived that reality. The 28-year-old found deepfake porn of herself 10 years ago when, out of curiosity, she used Google to search for images of herself.
To this day, Martin says she has no idea who created the fake images, or the videos of her engaging in sex that she would later find. She suspects someone took photos posted on her social media pages or elsewhere and doctored them into pornography.
Horrified, Martin contacted different websites over the years in an attempt to get the images taken down. Some did not respond. Others took them down, but she soon found them again.
“You can’t win,” Martin said. “It’s something that’s always there. Like it ruins you forever.”
The more she spoke out, the worse the problem became, she said. Some people even told her that the way she dressed and posted images on social media contributed to the harassment — essentially blaming her, rather than the creators, for the images.
Ultimately, Martin turned her attention to legislation, advocating for a national law in Australia that would fine companies A$555,000 ($370,706) if they fail to comply with takedown notices for such content from online safety regulators.
But governing the internet is nearly impossible when countries make their own laws about content that is sometimes produced around the world. Martin, who is currently a lawyer and legal researcher at the University of Western Australia, said she believed the problem had to be brought under control by some kind of global solution.
Meanwhile, the makers of some AI models say they are already restricting access to explicit images.
OpenAI said it removed explicit content from the data used to train its image generation tool, DALL-E, which limits users' ability to create those types of images. The company also filters requests and says it blocks users from creating AI images of celebrities and prominent politicians. Another model, Midjourney, blocks the use of certain keywords and encourages users to flag problematic images to moderators.
Meanwhile, startup Stability AI rolled out an update in November that removed the ability to create explicit images with its image generator, Stable Diffusion. The changes came after reports that some users were using the technology to create celebrity-inspired nude pictures. (Associated Press)