Andrew Robinson, Director of Runningbird and Mindcorp, discusses the use of GenAI for pitching drama, from “synopsis to pitch deck, storyboard to sizzle reel.”

I want to talk about a couple of projects where we believe AI has given producers a huge advantage over more traditional methods of pitching, fundraising, characterisation and scene setting.

TV and film are rife with issues surrounding passing off, fabrication and blatant copying of other people’s assets, or even their voice or persona.

We hold ourselves to four principles for anything created for any production purpose. They are, categorically:

No pretending generated content is real, and no passing off.

You have permission or consent from the rights owner or their family to use any assets.

Money earned from or paid for AI work must be equivalent to what the rights holder would have received.

Every asset used must have an identifiable rights owner whose creation is not being ripped off.

We take these points into account every time we create something using AI.

This is why a lot of people feel confident enough to come to us with their AI projects.

But when it comes to an elevator pitch, a synopsis, a deck or similar, material that is not going into production but is created to secure the funding in the first place, the story is very different.

AI has a massive role to play here: it is agile and fast, and it lets a clear understanding of a story unfold with minimal effort and budget.

Here are two examples which illustrate that exact point.

Lucio’s Treasure. Our first thought was: where do we start with this? We had already been experimenting with AI facial recognition, and this gave us the perfect opportunity to see how far we could push the existing AI technology. So, we began to play…

With seven key characters in the series and no talent attached other than Miguel Ángel Silvestre, it was up to us to bring these characters to realisation using the short bios provided.

Using image references and a number of precise prompts, we were able to create a complete character set, each character fitting perfectly within the environments they inhabit in the series. That’s not to say every prompt and character made the cut.

Whilst working on this project, the AI landscape felt like it was changing by the second. It was important that we stayed on the crest of the wave and moved with these AI enhancements, which allowed us to push what we had already created to a whole new level.

And the most difficult element of creating the characters? Holding their features consistent across a range of scenarios.

Next, we decided to test the limits further by finding out how the characters would move within a scene. At first this was extremely difficult, as the AI software interpreted our prompts in completely different ways. This led us to try a number of different AI tools to find out which was best for creating motion. We quickly realised there was no “one-size-fits-all” AI software that could achieve the results we wanted.

As our AI skills continued to grow, we were able to create “scenes” that suited the series’ look and feel. This allowed us to knit multiple AI-generated clips together into a 30-second sizzle (https://www.mindcorp.co.uk/lucios-treasure/) built from our original generated images.

Driving’s Deadly Toll. As the technology advanced and we came to understand the perfect prompts to use, we were asked whether our creativity through AI extended to unscripted content. The answer was: “Absolutely!”

To that end, we quickly set out to demonstrate it with the client’s project and help them secure the funding they needed to get the documentary off the ground.

From the initial written brief, we crafted hand-drawn storyboard sketches that captured our vision, took them into Midjourney to refine the frames and content, and finally animated them into short scene segments.

The key thing was to keep the movement realistic and steady. For example, we would keep the camera static, locked off on the scene, and have elements moving within it, such as a stroller being carried out of an ambulance or people walking into frame.

Using AI generative audio software, we were able to stitch the visual content together with suitable AI-generated sound clips, resulting in a striking and immersive experience that perfectly put across the feel and mood of the proposed documentary film, Driving’s Deadly Toll (https://www.mindcorp.co.uk/drivings-deadly-toll).

And so, you can see that you can remain creative, ethical and kind to the production company’s pocket. AI in its current state is good for some establishing shots and also extreme close-ups. Getting dragons to fly out of magical books, and other unbelievable scenarios, is relatively easy in AI. What is not is the mundane: the locked-off shot without much happening, a person, a face, an expression.

But Lionsgate have seen this coming and have recently ingested their content library so that new content ideas can be generated more readily and quickly. We are picking up clients for the scenarios above, but we are also giving small seminar presentations, just to show executives what is possible in the creative department.

For our own business, we have had to adapt rapidly to the new order. No creative could join our company now unless they have a working knowledge of AI. That’s how quickly it has changed.

If you want or need to create anything using AI, ask creative people, not production, film or tech people. And ask creatives who are not just using stock effects, but who are asking AI to create unique experiences that match your creative vision as a producer.

Speak to us at Mindcorp (https://www.mindcorp.co.uk/mindcorp-contact/).

To view our AI projects, click here (https://runningbird.co.uk/bringing-storyboards-to-life/).

Staff Reporter
