ClothOff
About
As of December 2025, ClothOff continues to operate as a leading AI "nudify" service, primarily through clothoff.net, with apps for Android, iOS, and macOS, plus a Telegram bot. The platform uses neural networks, including GANs and diffusion models, to digitally remove clothing from uploaded photos, generate realistic nude images, or create custom undress videos with adjustable parameters such as body features, poses, face swaps, and effects. It offers free trials and one-time purchases of VIP Coins for premium features (such as queue skipping, multi-uploads, and advanced poses), and emphasizes quick processing.
The site claims robust privacy measures: no data storage, automatic deletion of uploads, technical blocks on processing images of minors (with account bans for attempts), and prohibitions against non-consensual or illegal use. It also states that it donates funds to support victims of AI abuse through a partnership with ASU Label (asulabel.com).
Despite these assertions, ClothOff remains highly controversial for enabling non-consensual deepfake pornography and child sexual abuse material (CSAM). A landmark October 2025 federal lawsuit in New Jersey (Jane Doe v. AI/Robotics Venture Strategy 3 Ltd.), filed on behalf of a minor whose clothed social media photo was used to create and distribute fake nudes, invokes the TAKE IT DOWN Act, seeking image removals, data destruction, bans on AI training, and a potential platform shutdown. Whistleblower reports link the service's operations to former Soviet Union countries, while investigations reveal acquisitions of rival nudify apps amid growing scrutiny.
Critics, including investigative reporters at Der Spiegel, Bellingcat, and Ars Technica, argue that the tool inherently facilitates abuse such as harassment, bullying, and revenge porn, and its millions of users worldwide have intensified calls for global AI regulation. ClothOff denies liability for misuse and continues to operate.