An AI Image Generator's Exposed Database Reveals What People Actually Used It For

According to Fowler, in addition to CSAM, the database contained AI-generated pornographic images of adults as well as potential "face-swap" images. Among the files, he observed what appeared to be photographs of real people, which were likely used to create "explicit nude or sexual AI-generated images," he says. "So they were taking real pictures of people and swapping their faces on there," he claims of some of the generated images.

When it was live, the GenNomis website allowed explicit AI-generated adult imagery. Many of the images featured on its homepage, as well as in an AI "Models" section, included sexualized images of women. The site also included an "NSFW" gallery and a "Marketplace" where users could share images and sell albums of AI-generated photos. The website's tagline said people could "generate" "unlimited" images and videos; a previous version of the site from 2024 said "uncensored images" could be created.

GenNomis' user policies stated that "explicit violence" and hate speech were prohibited and that only "respectful content" was permitted. "Child pornography and any other illegal activities are strictly prohibited on GenNomis," the community guidelines read, adding that accounts posting prohibited content would be terminated. (Researchers, victim advocates, journalists, and tech companies have largely phased out the phrase "child pornography" in favor of CSAM over the past decade.)

It is unclear to what extent GenNomis used moderation tools or systems to prevent or prohibit the creation of AI-generated CSAM. Last year, some users posted on its "Community" page that they were unable to generate images of people having sex and that their prompts were blocked even for non-sexual "dark humor." Another account posted on the community page that the "NSFW" content should be addressed, as "the federal government may be watching."

"If you can see these images with nothing but a URL, that indicates they are not taking all the steps necessary to block that content," Fowler claims of the database.

Henry Ajder, a deepfake expert and founder of the consultancy Latent Space Advisory, says that even if the company did not allow the creation of harmful and illegal content, the website's branding, which referenced "unlimited" image creation and an "NSFW" section, pointed in that direction.

Ajder says he is surprised that the English-language website was linked to South Korean organizations. Last year, the country was gripped by a nonconsensual deepfake "emergency" that targeted girls, before it took measures to fight the wave of deepfake abuse. Ajder says more pressure needs to be put on every part of the ecosystem that allows nonconsensual images to be generated using AI. "The more we see it, the more it forces lawmakers, tech platforms, web hosting companies, and payment providers to ask questions."

Fowler says the exposed database also contained files that appear to include AI prompts. No user data, such as logins or usernames, were included in the exposed data, the researcher says. Screenshots of the prompts show the use of words such as "tiny" and "girl," as well as references to sexual acts between family members. The prompts also included sexual acts involving celebrities.

“It seems like the technology raced ahead of either guidelines or control,” Fowler says. “From a legal perspective, we all know that explicit images of children are illegal, but that didn’t stop technology from generating those images.”

Generative AI systems have vastly increased how easy it is to create and modify images over the past two years, which has led to an explosion of AI-generated CSAM. Web pages containing AI-generated child sexual abuse material have more than quadrupled since 2023, and the photorealism of this horrifying content has also grown more sophisticated, according to the Internet Watch Foundation (IWF), a UK-based nonprofit that tackles online CSAM.

The IWF has documented how criminals are increasingly creating AI-generated CSAM and developing the methods they use to produce it. "It's too easy for criminals to use AI to generate and distribute sexually explicit content of children at a large scale and at a rapid pace," says the IWF's Derek Ray-Hill.
