Watchdog group Public Citizen demands OpenAI withdraw AI video app Sora over deepfake dangers
Public Citizen has demanded that OpenAI pause Sora 2, citing racist deepfakes, nonconsensual content, and insufficient safeguards that the group says put public safety and democracy at risk.
- The tech industry watchdog group Public Citizen is demanding that OpenAI withdraw its AI video app Sora over deepfake dangers.
- Public Citizen is concerned about Sora's potential threat to democracy, since people cannot trust what they see when the AI generates such realistic videos.
- The group also raises concerns over nonconsensual images and harassment targeting vulnerable populations online.
59 Articles
If User-Generated AI Content Comes to Disney+, the Possibilities Are Exciting and Terrifying
So much of the fear around artificial intelligence is that movie studios are going to use generative AI to replace writers, directors, visual artists, and even actors, pumping out entire films and shows with Sora 2 or other models. Cooler heads have had to prevail and show there are right ways to approach generative AI with an artist focus, insisting that jobs will evolve rather than be replaced and that there's no replacing g…
De-Coding Discrimination: What’s being done to dismantle AI stereotypes
As artificial intelligence (AI) becomes increasingly woven into our daily lives, a troubling truth is emerging: The technology designed to advance humanity is also amplifying its oldest prejudices. From OpenAI’s Sora 2 video generator – which has been used to produce racially mocking portrayals of Black people – to ChatGPT and Google’s Gemini exhibiting bias in speech, the promise of innovation is colliding head-on with the persistence of racism…
Watchdog group asks OpenAI to withdraw app over deepfakes
The tech industry is moving fast and breaking things again — and this time it is humanity's shared reality and control of our likeness before and after death — thanks to artificial intelligence image-generation platforms like OpenAI's Sora 2.
Advocacy group calls on OpenAI to address Sora 2’s deepfake risks
Throughout 2024, OpenAI teased the public release of Sora, its new video generation model, capable of creating lifelike visuals from user prompts. But due to concerns about the tool being used to create realistic disinformation during a critical U.S. election year, the company delayed its release until after the elections. Now, a year later, critics are warning that their fears about Sora's reality distortion powers have come to pa…
Deepfake danger: Watchdog group Public Citizen demands OpenAI withdraw AI video app Sora
Coverage Details
Bias Distribution
- 59% of the sources are Center