Artificial intelligence is entering a new phase with the rise of tools like Meta’s “Vibes” and OpenAI’s Sora 2, which allow anyone to instantly generate short, highly realistic videos. Meta describes Vibes as a TikTok-style feed of fully AI-generated clips. OpenAI’s Sora 2 goes a step further, allowing users to insert “cameos” of themselves or others (with permission) into AI-generated scenes. The feature is currently invitation-only, but OpenAI says wider access is coming.
At first glance, Sora seems like a fun novelty. Type a prompt such as “a man rides a horse which is on another horse” or “figure skater performs a triple axel with a cat on her head,” and within seconds the system produces a convincing video. But history has shown that when a new tool is invented, it rarely takes long before someone finds a way to abuse it. Society is often left scrambling to catch up.
Even in these early stages, Sora has already drawn controversy. Entertainment industry groups have raised concerns about users generating clips that incorporate copyrighted characters or trademarks without permission. OpenAI initially placed the burden of enforcement on rights holders, but after pushback, CEO Sam Altman said the company would give them more granular control. That has not satisfied everyone. The Motion Picture Association has accused OpenAI of allowing infringing content to spread across its platform and onto social media, calling for “immediate and decisive action.”
Copyright disputes are only one part of the issue. The looming wave of low-quality, mass-produced AI content, which critics have started calling “AI slop,” threatens to flood social platforms and video services like YouTube. Meta says all Vibes videos will carry invisible watermarks and “AI Info” labels, and OpenAI says Sora videos include both visible and invisible provenance signals. Experts warn, however, that labels will not solve the problem if the volume of synthetic content becomes overwhelming. Once AI-generated media appears alongside authentic footage in everyday feeds, it becomes harder to tell what can be trusted.
Copyright infringement or intellectual property misuse may not be the worst outcome. One of the most concerning uses of AI video is in misinformation campaigns. AI image generators have already shown that people will believe almost anything if it looks real enough. Viral hoaxes built from still images have fooled the public repeatedly. Full-motion, photorealistic video will only make that vulnerability worse. A single well-timed fake clip could manipulate public perception, influence elections, or even spark real-world panic.
AI video is also likely to be used for harassment and personal defamation. A tool like Sora could make it easy to fabricate footage of ordinary individuals and present it as real. False accusations, revenge plots, and character attacks could be created in minutes. Clearing one’s name might become harder than being framed.
So far, conversations around deepfakes have focused on celebrities, but the consent problem goes beyond famous people. Once high-quality AI video tools are available to anyone with a smartphone, fame will no longer be a requirement for exploitation. A classmate, a neighbor, or an ex-partner could have their likeness copied and placed into content they never agreed to appear in.
Another issue that often gets overlooked involves legal and forensic standards. Courts, news outlets, and police investigators still treat video as credible evidence. If AI-generated footage becomes indistinguishable from real recordings, that trust will begin to erode. The problem will not only be fake videos being believed; genuine ones may start being dismissed. AI video has the power to make lies look like truth and truth look like lies.
The tech industry promises safeguards, labeling, and responsible use. History suggests those measures tend to arrive after the damage has already been done. Sora may be an impressive creative tool, but the real question is not what it can generate. What matters is what happens when people decide to weaponize it.
—By Greg Collier