Clothoff, a leading app in the controversial realm of deepfake pornography, is reportedly planning a global expansion aimed at solidifying its dominance of the online nudify market. Clothoff has been at the center of legal battles, including a lawsuit filed last August by San Francisco’s city attorney, David Chiu. Despite these challenges, a former employee turned whistleblower has told Der Spiegel that the app’s operators remain unfazed by legal threats and are instead focused on acquiring a network of similar apps to expand their reach.
According to the whistleblower, Clothoff now owns at least ten other nudify services, which collectively draw monthly views ranging from hundreds of thousands to several million. The whistleblower, who was granted anonymity, described a shift in the company’s culture toward a more profit-driven mindset as the app evolved from an “exciting startup” into a major player in the deepfake industry. The app reportedly operates on an annual budget of approximately $3.5 million, with marketing that relies heavily on channels on Telegram and X to reach potential users.
Marketing Tactics and Legal Challenges
Clothoff’s marketing strategy, as outlined in Der Spiegel’s report, includes a “large-scale marketing plan” targeting the German market. The campaign allegedly involves creating fake nude images of celebrities to attract ad clicks, using the provocative tagline “you choose who you want to undress.” Some celebrities named in the plan have denied consenting to this use of their likenesses and have indicated potential legal action if the campaign proceeds.
Despite the legal hurdles, Clothoff’s reach continues to expand. The app has become a popular tool in the United States, where it is implicated in both the San Francisco lawsuit and a separate case involving a New Jersey high schooler. The latter involves a boy who used Clothoff to create and distribute a fake nude image of a classmate, highlighting the app’s troubling use among young people.
Targeting Young Audiences
Clothoff’s marketing budget reportedly focuses on advertising in niche online communities, such as special Telegram channels and Reddit subforums. The app targets males aged 16 to 35, appealing to interests ranging from memes and video games to more controversial topics like right-wing extremism and misogyny. This approach raises significant ethical concerns, as it appears to exploit young and impressionable audiences.
“Chiu was hoping to defend young women increasingly targeted in fake nudes by shutting down Clothoff, along with several other nudify apps targeted in his lawsuit.”
International Expansion and Ethical Implications
Der Spiegel’s investigation into Clothoff’s operators led to Eastern Europe, where a database inadvertently left open online revealed key figures behind the app. The findings suggest a connection to former Soviet Union countries, with internal communications conducted in Russian. A spokesperson for Clothoff, identified as Elias, denied knowing the individuals named and disputed the reported budget figures.
Despite these denials, the app’s expansion plans continue unabated. Clothoff’s alleged strategy includes similar celebrity-focused campaigns in the UK, France, and Spain. The app’s ability to generate explicit content from a single image has reportedly attracted over a million users, further complicating efforts to regulate its use.
Legal and Social Consequences
The rise of apps like Clothoff has prompted legal and social challenges, particularly concerning the protection of minors and non-consensual pornography. The Take It Down Act, recently passed in the US, aims to make it easier to remove AI-generated fake nudes from online platforms. However, experts warn that the law may face legal challenges over censorship concerns, potentially limiting its effectiveness.
“Jane Doe is one of many girls and women who have been and will continue to be exploited, abused, and victimized by non-consensual pornography generated through artificial intelligence.”
Future Outlook and the Path Forward
As Clothoff’s influence grows, the ethical and legal implications of its operations remain a significant concern. The app’s ability to evade legal repercussions and expand its user base highlights the challenges faced by regulators in addressing the misuse of deepfake technology. Moving forward, increased collaboration between legal entities, technology platforms, and advocacy groups will be crucial in developing effective strategies to combat the harmful effects of nudify apps.
The ongoing legal battles and societal debates surrounding Clothoff underscore the urgent need for comprehensive policies that address the complexities of AI-generated content. As the technology continues to evolve, so too must the frameworks designed to protect individuals from exploitation and abuse.