A clinic at Yale Law School has filed a lawsuit demanding the shutdown of an AI application called ClothOff. The app generates deepfake pornographic content without consent, and the clinic alleges it was used to turn a 14-year-old girl's photos into child sexual abuse material. Although the app has been banned from major platforms, it remains accessible online. The lawsuit highlights the legal difficulty of holding such platforms accountable and contrasts ClothOff with general-purpose AI tools such as xAI's Grok, where determining user intent is far more complicated.