Lights, Camera, Algorithm

Simon Brown | AI in the Wild

From drive-ins to deepfakes

I was seven when I saw Star Wars at the drive-in. It blew me away; not the story, which went over my head, but the spaceships and lightsabres. Those 1977 special effects were rudimentary at best, but they did what mattered: they transported us to another world for an hour or more.

The first Star Wars didn’t even use computer-generated imagery (CGI). Its iconic opening title sequence was done by physically laying out the text on a long board and filming it with a camera slowly tracking along in a darkened room.

CGI had arrived the year before in a movie titled Futureworld, which featured a computer-generated hand and face created by Edwin Catmull and Fred Parke of the University of Utah. A decade later, in 1986, Pixar was founded by Catmull and Steve Jobs, and in 1995 it released the world’s first feature-length movie made entirely with CGI, Toy Story.

Fast forward to today and CGI is everywhere. Open Instagram and you’ll see all those cute, but clearly fake, animal videos. Social media is full of amazing scenes generated by the Chinese app Seedance that look like real big-budget movies featuring recognisable celebrities, yet pricing starts at under $10 a month.

350 AI shots and counting

The House of David, an Amazon Prime TV series, used AI-generated scenes in the sixth episode of season 1, “Giants Awakened”. Creator Jon Erwin said that without the use of AI, the scene “would have been just far outside the budget parameters of the show”.

Season 2 took it even further, using over 350 AI-generated shots, including a sequence that opened with David’s slaying of Goliath: a large-scale desert battle in which the majority of wide-angle shots were generated or augmented by AI.

Make no mistake, the decision has been controversial, but Erwin says no jobs were lost and it really was about getting more bang from the budget.

Meet your new AI co-star

Jobs are going to be an issue here, and the reality is that jobs will be lost. We already have our first fully AI actress, Tilly Norwood, created by Xicoia, the AI division of Particle6 Group. She hasn’t had any starring roles yet, but that’s just a matter of time. Think scripted weather or news TV shows, or an extra in the background. Heck, even a small one-line walk-on role will work as she builds up to larger starring roles.

This brings us to James Earl Jones, the voice of Darth Vader in Star Wars. He passed away in September 2024, and before he died he signed over the rights to his voice to Lucasfilm and a Ukrainian AI-audio company, Respeecher. So now, for a fee and with permission, producers can use the famous Darth Vader voice. Another voice-acting job lost, and the people who lost it didn’t even know it existed.

How long before a famous actor licenses their image for future use? Or a long-dead actor is brought back to life with AI? A Marilyn Monroe movie in 2026 would surely break box office records.

Lights, camera, redundancy

Actors and crew are well aware of this. While the 2023 Hollywood strike was primarily about streaming residuals, it also covered concerns about the use of AI. The strike was resolved, but no real progress was made on AI beyond a promise not to use an actor’s likeness, or voice, without permission. Frankly, the bare minimum.

A large blockbuster movie will use thousands of crew members and actors, and while the big-name star can’t be replaced (well, T&Cs apply here), extras can be replaced by Tilly Norwood clones. Writing teams can be reduced as AI reworks the story, location scouts can be done away with as backgrounds are AI-generated, and even costume design can be handled by AI. And so the list goes on.

The studio arms race

The recent news that Paramount Skydance has won the bidding war for Warner Bros. Discovery has a lot to do with AI and, ultimately, jobs. Paramount Skydance is backed by Larry Ellison, who founded Oracle (his son David runs Paramount Skydance). They’ve paid (overpaid?) a fortune for Warner Bros. Discovery and will need to cut costs to try to recoup some of the $110 billion.

Ellison is also the lead investor in TikTok US and talks about the new media company being a “hybrid media+tech company”. You know that means lots of AI and fewer jobs, a lot fewer jobs.

Disney has invested $1 billion in OpenAI and licensed over 200 characters across Disney, Marvel, Pixar and Star Wars to OpenAI’s Sora video tool for a three-year period. But the Sora hype seems to have already died down.

They’ve also said they’ll enable user-generated AI content on the Disney+ app, some sort of AI-generated social network? Sounds like a horror show.

Disney has also been testing AI itself, recently selecting the startup Animaj for its 2025 Accelerator programme. Animaj demonstrated how AI could cut the production time for a five-minute animated clip from five months to under five weeks. Incoming CEO Josh D’Amaro has said AI will “boost, not replace” human creativity. Maybe, but with Paramount Skydance going all-in on replacing humans, Disney and the others risk being left behind carrying the old-school movie bag, and its costs.

But the real threat to Disney and the other big studios is that AI tools make us all movie-makers. They’ll struggle to rise above the noise unless they’re releasing one of their giant brand movies. Those giant brands such as Lucasfilm, Pixar and Marvel all have great IP. But the barrier to entry for movie-makers has disappeared, replaced by my iPhone and an app.

Streaming, YouTube and the viewing public

The big winners here will be the likes of Amazon Prime and Netflix, who will be able to produce more with less. Consider Squid Game, made on a budget of just over $20 million. Season 2 cost almost triple that, yet season 1 alone generated “impact value” of almost $900 million for Netflix. That’s a massive return on investment, and with more AI usage (including potentially AI-generated actors) Netflix can churn out more content, take more risks and get lucky more often with giant hits.

On the smaller side, AI could produce choose-your-own-ending movies or auto-dub into other languages. The Oscar-nominated movie The Brutalist used Respeecher to fine-tune the spoken letters and vowels of its Hungarian-language dialogue. This small ‘tweak’ was controversial, and while the film did win three Oscars, it did not win Best Picture, for which it had initially been one of the front runners.

Perhaps the biggest winner here will be YouTube. We’re already seeing a lot of AI-generated content being uploaded to YouTube. Pretty much all of it is horrid, and most of it gets very few views. But the tools are improving. Seedance (owned by TikTok owner ByteDance) can produce amazing high-quality video from just a simple text prompt or photo. Talented people the world over (check out the Dor Brothers) will start to make quality, enjoyable AI-generated movies, and YouTube will be the medium of distribution.

The big question is the viewing public. Will we accept AI into our movies and TV shows? Of course we will, but slowly. Way back in 1991, Terminator 2 showcased the T-1000’s liquid-metal effects, a breakthrough in realistic CGI characters, and audiences raved rather than revolted.

Movie and TV producers will step slowly into this AI maze. The use of AI will have to be seamless and used for good reason. But everyone who has loved a superhero movie already loves CGI. AI is just the next evolution, albeit a big one.

Hollywood has always been about special effects. AI now makes that accessible to everybody.

Simon Brown


AI in the Wild

AI in the Wild is a regular column from Simon Brown.

AI is everywhere and only getting better. Record capital raises and valuations, and competing LLMs, are all fun, but meaningless to our everyday lives. This column focuses on how AI is impacting us in the real world.

5th | 15th | 25th every month