Activity

  • ClothOff posted an update 2 months, 3 weeks ago

    ClothOff is one of the most prominent and controversial AI-powered “nudify” platforms currently available. It uses advanced deep learning models (including GANs and diffusion algorithms) to digitally remove clothing from uploaded photos and videos, generating hyper-realistic nude imagery.
    It offers dedicated apps for Android, iOS, and macOS, along with advanced features such as DeepNude AI image generation, custom undress videos with realistic motion and expressive detail, face swaps (standard, video, and porn-specific variants), multi-uploads, adjustable body parameters (e.g., breast and butt size), sex poses and sets, queue skipping, and an API for automated adult content creation.
    Aggressively marketed as “Your TOP-1 Pocket Porn Studio” and “The New Porn Generator,” the service offers free trials for basic functions; premium access is unlocked through one-time purchases of VIP Coins (no recurring subscriptions) for higher quality, faster processing, and additional options. It promotes purported health benefits of sexual activity and masturbation while claiming robust privacy protections: no data storage, automatic deletion of uploads, no distribution without consent, and technical safeguards that allegedly prevent processing of minors’ images (with automatic account bans for attempts). ClothOff strictly prohibits non-consensual use, illegal activities, and content involving anyone under 18, and states that it partners with Asulabel to donate funds supporting victims of AI abuse.
    Despite these assertions, ClothOff has faced intense ethical condemnation and legal challenges for enabling non-consensual deepfake pornography and child sexual abuse material (CSAM). A major federal lawsuit filed in October 2025 in New Jersey (Jane Doe v. AI/Robotics Venture Strategy 3 Ltd., against the operator registered in the British Virgin Islands and linked to Belarus) alleges that the platform enabled the creation and distribution of hyper-realistic fake nudes of a minor generated from her social media photos; the suit invokes the TAKE IT DOWN Act in seeking mandatory removals, data destruction, bans on AI training with the images, damages (up to $150,000 per image), and a potential shutdown. Supported by Yale Law clinics, the case highlights real-world harms such as bullying, harassment, and emotional distress.
    Investigative reports from Der Spiegel, Bellingcat, Ars Technica, The Guardian, and others document the company’s acquisition of numerous rival nudify services, trace operations to regions in the former Soviet Union (including Belarus), and detail its role in global abuse cases—particularly school incidents involving minors. The platform has been blocked in Italy by the Data Protection Authority for unlawful data processing, faced advertising bans on Meta platforms, restrictions in the UK, and removal of its official Telegram bot, yet it continues to draw millions of monthly users while resisting regulatory efforts. ClothOff denies liability for user misconduct and remains operational amid escalating global demands for stricter controls on non-consensual AI-generated intimate content.