DIY AI Movie Making Circa July 15, 2025: This is Getting Really Fun! (1/3)
A “Marvel‑Sized” Fanboy Film Will Arrive Before Disney/Universal v. Midjourney Ever Even Reaches Initial Discovery
Are you new to my R&D and analysis on AI, economics, politics, strategy, constitutional restoration in The Second Bill of Rights, medicine, and the future of work? Do you like stirring things up with woke Hollywood? Ha!
If you'd like to follow along with my ongoing work and actual execution on these enterprises — click the “subscribe now” button just below. It's all free!
Generative‑video tech has kept moving at a blistering pace since Douwe Groenevelt debuted his 8-minute-26-second short film “Barney — The End of Lawyers” on June 17, 2025 — one day short of a month ago. Based on my calculations, Douwe generated somewhere between 500 and 600 separate 8-second Veo 3 clips (an 8:26 cut only holds about 63 of them, so most takes never made the final edit) and used ElevenLabs for voice to create what was then the state of the art for that technology. It was awesome to see! (Read my astonishment at Barney here: "Barney—The End of Lawyers" & The Rate at Which AI Is Accelerating Daily.)
In the four weeks since, it looks like we have had two more additive leaps in movie‑making:
Luma Ray 2: Why the visuals matter
Long‑form shots. Ray 2’s Extend‑to‑60 s switch and its loop‑seam tool let you turn a ten‑second base clip into a one‑minute take or an endlessly cycling establishing shot without obvious resets (a rough code sketch of that chained‑extension workflow follows this list).
Key‑framing and spatial moves. You can pin the camera path, then prompt the scene to evolve—crucial for walk‑and‑talks or fight sequences.
1080p ceiling, but physics and motion are already solid. Every second you don’t have to up‑rez or stabilize in post saves hours across the roughly 900 shots a 120‑minute feature film requires.
If Veo stays capped at eight seconds, Ray 2 is the practical workaround for scenes that would otherwise take a dozen separate Veo renders and a painful amount of editing glue.
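For the tinkerers: here is roughly what that chained‑extension workflow looks like when you script it. This is a minimal sketch under loud assumptions: the base URL, endpoint paths, the JSON field names (including extend_from_generation_id), and the VIDEO_API_KEY environment variable are placeholders I invented for illustration, not Luma's documented API.

```python
# Hypothetical sketch: extend a ~10-second base clip into a ~60-second take by
# chaining "extend" requests, each one continuing from the previous clip.
# The endpoint paths, JSON field names, and env var below are placeholders,
# NOT a real vendor API.
import os
import time

import requests

API_BASE = "https://api.example-video-model.com/v1"  # placeholder
HEADERS = {"Authorization": f"Bearer {os.environ['VIDEO_API_KEY']}"}


def generate_clip(prompt: str, extend_from: str | None = None) -> str:
    """Request one clip; optionally continue from the end of an earlier one."""
    payload = {"prompt": prompt}
    if extend_from:
        payload["extend_from_generation_id"] = extend_from  # assumed field name
    resp = requests.post(f"{API_BASE}/generations", json=payload, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()["id"]


def wait_for(generation_id: str) -> str:
    """Poll until the clip finishes rendering, then return its download URL."""
    while True:
        resp = requests.get(f"{API_BASE}/generations/{generation_id}", headers=HEADERS)
        resp.raise_for_status()
        data = resp.json()
        if data["state"] == "completed":
            return data["video_url"]
        if data["state"] == "failed":
            raise RuntimeError(f"generation {generation_id} failed")
        time.sleep(10)


prompt = "Slow dolly through a rain-soaked neon alley, two figures in a walk-and-talk"
clip_id = generate_clip(prompt)
for _ in range(5):  # one base clip plus five extensions is roughly 60 seconds
    wait_for(clip_id)
    clip_id = generate_clip(prompt, extend_from=clip_id)
print("final extended take:", wait_for(clip_id))
```

The point is not the specific calls; it is that a one‑minute take becomes a loop, not a craft.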
HeyGen (or OpenAI Voice Engine): Why the audio matters
Dialogue at scale. A two‑hour script is roughly 20,000 spoken words (about 165 words per minute across 120 minutes; see the sketch after this list). HeyGen clones a voice from a 15‑second sample and spits out hours of synced audio in any of 70‑plus languages.
Automatic lip‑sync. You paste that track onto Ray‑2 or Veo footage, run HeyGen’s lip‑sync pass, and the mouth shapes line up—no frame‑by‑frame hand work.
Character roster. Need ten distinct speakers? Clone ten friends’ voices and keep your cast “employed” for free.
Without something like HeyGen, you’re either going to pay professional voice actors (blowing the micro‑budget) or settle for text on screen (killing immersion).
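To put rough numbers on “dialogue at scale,” and to show how few moving parts the audio side really has, here is a hedged sketch. The word count is plain arithmetic (runtime times speaking pace); the three functions are empty placeholders standing in for whichever vendor's voice‑clone, text‑to‑speech, and lip‑sync calls you end up using. They are not HeyGen's real API.

```python
# Back-of-the-envelope dialogue workload for a two-hour feature, plus the
# three-step audio pipeline described above. The function bodies are
# placeholders; no real vendor endpoints are shown.

RUNTIME_MIN = 120        # feature length in minutes
WORDS_PER_MIN = 165      # typical conversational speaking pace
total_words = RUNTIME_MIN * WORDS_PER_MIN
print(f"~{total_words:,} spoken words to synthesize")  # ~19,800, i.e. the "20,000" above


def clone_voice(sample_path: str) -> str:
    """Step 1: register a voice from a short (~15 s) sample; returns a voice ID."""
    raise NotImplementedError("placeholder for the vendor's voice-clone call")


def synthesize(voice_id: str, script_lines: list[str], language: str = "en") -> str:
    """Step 2: render the script in the cloned voice; returns an audio file path."""
    raise NotImplementedError("placeholder for the vendor's text-to-speech call")


def lip_sync(video_path: str, audio_path: str) -> str:
    """Step 3: re-time mouth shapes in the footage to match the audio track."""
    raise NotImplementedError("placeholder for the vendor's lip-sync pass")
```

A ten‑character roster is just ten calls to clone_voice with ten different samples.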
So, now, anyone with a tiny “production” budget can generate minute‑long 1080p takes. Trim and stitch roughly nine hundred of those shots together (an average shot runs about eight seconds) and you have a 120‑minute feature film ready for your Oscar. The only real “skill” is a spreadsheet that tracks which prompt belongs in which scene, sketched below.
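Since I just claimed the only real skill is a spreadsheet, here is a minimal sketch of that tracker, plus the arithmetic behind the 900‑shot figure. The column names and example prompts are mine, invented purely for illustration.

```python
# Shot-count arithmetic and a bare-bones prompt tracker written out to CSV.
import csv
from dataclasses import dataclass, asdict

FEATURE_SECONDS = 120 * 60   # 7,200 seconds of screen time
AVG_SHOT_SECONDS = 8         # a typical average shot length
print(FEATURE_SECONDS // AVG_SHOT_SECONDS, "shots to generate")  # 900


@dataclass
class Shot:
    scene: int
    shot: int
    prompt: str
    duration_s: int = AVG_SHOT_SECONDS
    status: str = "queued"   # queued -> rendered -> approved


shots = [
    Shot(1, 1, "Wide establishing shot: desert outpost at dawn, twin suns rising"),
    Shot(1, 2, "Medium close-up: hooded figure scans the horizon, wind-blown sand"),
]

# Dump the tracker to a spreadsheet-friendly CSV.
with open("shot_tracker.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(asdict(shots[0]).keys()))
    writer.writeheader()
    writer.writerows(asdict(s) for s in shots)
```

Fill in 900 rows of prompts and the “production” is mostly planned; the models do the rest.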
That is why, I bet, some Marvel, Star Wars or Star Trek super‑fan who is sick of how “woke” all those classics have gone over the past 10-15 years is already cutting a guerrilla, feature‑length epic true to the original canon. Douwe Groenevelt could do it today just by looping his existing workflow—and we all ought to know it. When that fan film drops, it will go viral precisely because it cost less than a single day of union wages on a studio lot. Let’s get the popcorn ready!
Hollywood’s Lawsuit Strategy: Playing Chess Against a Swarm of Drones
Disney and Universal’s June 11, 2025 complaint against Midjourney clocks in at 110 pages, leans on side‑by‑side Elsa and Darth Vader comparisons, and demands statutory damages plus a permanent injunction. A month later Midjourney “answered” by . . . shipping a video model that still generates branded characters. Meanwhile, as of today, Midjourney has until August 6, 2025 to file its formal answer—that’s three weeks from now, and look what has happened in the past four weeks. Litigation moves in years; model releases ship in weeks.
Compare all that to the RIAA’s 2001 kill‑shot on poor old Napster. Back then, a single centralized server let a preliminary injunction shut the lights off within months. Generative AI is the opposite: decentralized, forkable, borderless. Kill one corporate node and the same weights reappear on a home GPU in Shanghai or in a torrent bundle linked on Reddit.
Hollywood, you have no prayer this time!
TV‑Quality? Try “Already.” Movie‑Quality? Here’s a Fun, Speculative Clock . . .
Now (July 16, 2025): 1080p, ten‑second native shots (extendable to roughly a minute with Ray 2); decent lip‑sync; cloned voices; manual assembly. Output already matches lower‑budget reality TV B‑roll.
90 Days: 4‑K minute‑long takes (Veo 3 public tier); multi‑track audio baked in; key‑frame continuity tags. That equals late‑2010s streaming originals.
Six Months: Automatic script import with scene breakdown; cloud orchestration keeps character rigs and LUTs consistent across an hour of footage. At this point, you’re flirting with straight‑to‑video Marvel spinoff quality.
One Year: Local‑GPU “director’s co‑pilot” lets you tweak camera and performance on a paused render. Real‑time regeneration means reshoots are keystrokes, not re‑lighting days. Call it early‑2010s blockbuster‑adjacent—and that’s without a studio’s VFX farm.
The curve is steep enough that betting against it is like betting against phone cameras after the first iPhone: technically possible, financially suicidal. Exciting!
Unions and Studios: Guardrails Inside, Free Chaos Outside
WGA/SAG‑AFTRA Win “Consent + Residuals”—but only for productions that actually sign union contracts. TikTok creators don’t and never will. Silly Hollywood unions!
Studios Need Clean Chains of title for global distribution, so they’ll eventually license their own IP into the very models they’re suing . . . maybe. I think it comes down entirely to how quickly the technology advances I describe above actually reach the market.
Everyone Else will keep remixing. The legal perimeter covers Hollywood payrolls, not living‑room laptops.
Napster vs. Midjourney: Why History Won’t Repeat
2001 Record Biz:
Central server (Napster) + court injunction = instant shutdown.
CD sales still 90 % of revenue—litigation bought labels a decade to pivot to streaming.
Fans had no easy tools to create music.
2025 Hollywood:
Open‑source weights + global GPUs = whack‑a‑mole impossible.
Box‑office <50 % of studio revenue and dropping; streaming already cannibalized margins.
Fans now generate cinema‑grade video and audio in a browser.
This time there is no single valve to close and no physical‑media cash cow to protect while the legacy industry regroups. The legal hammer falls slower than the tech goes exponential, so every quarter of delay widens the power gap by orders of magnitude.
The Brutally Honest Outlook
The “traditional” options for legacy Hollywood include:
License & Monetize—sell official, watermark‑verified assets into public models. But Midjourney and the rest of the current wave know they don’t even need to consider that foolishness. So, that’s never going to happen.
Accelerate—fold AI into their own pipelines fast enough to compete on novelty. Lots of good that will do them after poisoning the fanbases with “woke” garbage for the past 10-15 years. No thanks, Disney and Paramount. The fans will just do it themselves out of love for the canon.
Litigate & Lament—and watch a Barney‑style, feature‑length fan film headline Reddit, YouTube and every con panel by early 2026 at the latest.
Option 3 is apparently their current strategy; it leads to the same place Napster’s plaintiffs eventually landed—embracing streaming, but ten years later and on someone else’s platform. Only this time the audience and the “I know I can do it myself” creator types won’t wait: they’re already binge‑watching home‑brew, Barney‑style Star Wars short films rendered on consumer GPUs.
Bottom line
For a few thousand bucks and a few weekends (presently), a determined fan can deliver a feature‑length, Marvel‑grade bootleg that will light up social feeds—and no federal docket will move fast enough to head it off. Hollywood isn’t facing Napster‑style disruption; it’s staring at a Cambrian explosion of unlimited, decentralized visual storytelling. Fight it, license it, or get run over by it. The timeline is measured in months, not years—and the next Barney will prove it. Gimme my popcorn!
If you found this article actually useful, SUBSCRIBE to my channel for DAILY long-form analysis on AI, economics, politics, strategy, medicine, constitutional restoration in The Second Bill of Rights, and the future of work. Also, please SHARE this piece far and wide with anyone thinking seriously about these issues, and leave a COMMENT down below—especially with the names of people you know who want to make Marvel, Star Wars or Star Trek: TOS fan films faithful to canon ASAP!
READ more about The Second Bill of Rights on its website.
JOIN its mailing list.
SPREAD its message everywhere.