Televisual brought together a roundtable panel comprising some of the UK’s leading post and production technologists and operational management working on high-end television and film, all working with international studios and streamers. The discussion over dinner at the Groucho Club took in the diverse ways companies currently manage the pipeline, remote working, the increasing use of private cloud and the degree to which the public cloud is desirable. Our thanks to ERA and Hammerspace for their support and perceptive contributions. (You can read about ERA and Hammerspace after the discussion.)

At first the discussion covers much that is familiar: there is, as you might expect, a commonality of experience and approach across the businesses taking part (for example, the use of private cloud). However, there are significant points of difference, in part reflecting how each post house operates and its legacy investment, but also some of the operational, financial and cultural uncertainties that run in parallel with the move towards the public cloud, changed workflows and the advent of AI.


Main picture: Dreamland – Fifty Fifty, grade Joe Stabb and audio Gavin Allingham


The Roundtable Panel

  • ADRIAN BULL CEO, Cinelab
  • SEAN BAKER Managing Director, ERA
  • JAINA DESAI Sales Manager, ERA
  • ADAM DOWNEY Director of Post, Sky
  • MALCOLM ELLISON Head of Technology, Picture Shop
  • DAVID KLAFKOWSKI Founder & CEO, Racoon
  • TOM MITCHELL Technical Director, Mission
  • MOLLY PRESLEY Head of Global Marketing, Hammerspace
  • ADAM SHELL Head of Technology, Digital Orchard
  • ROBERT SIDDAL VP of Technology, UK, Streamland
  • JOE STABB Head of DI and Colour, Fifty Fifty
  • RYAN TAYLOR Regional Sales Director, EMEA Hammerspace
  • TOM WOODALL Director of Post Production, Molinare

Wheel of Time – Picture Shop, Supervising Colourist Jodie Davidson

Remote implementation

ADRIAN BULL Remote working has been more about us doing more work in Slough for projects that are shooting internationally, or remotely, than about our people working remotely. The film aspect of our work keeps it very physical – film needs to come to the lab before we can do anything with it.

On the data side, for anything that’s shot in the home counties, it’s still mags that are coming into us at the end of the day and that in many respects are being processed in the same way. But when we have a project that’s shooting in Atlanta or Vancouver, we put a basic kit capability there with a relatively low-skilled data op who can copy the data across and back it up, with us connecting into that, transcoding, doing the QC, sound sync, grading and distributing content.

And in that world, there’s no reason why the operational staff connecting to that couldn’t be anywhere, because the technology around what’s happening with the data flow is being managed.

ADAM SHELL Digital Orchard currently uses the public cloud for three things: dailies reviews, client review and approval of post-production work, and for file transfers.

We’re seeing an increase in demand for cloud sync services, such as SalonSync and Hireworks Connect, for sending editorial files. The demand for these services will only increase as they significantly speed up the process. Occasionally we will be asked to upload original camera files (OCF) to an S3 bucket. So far this has only been for Amazon Original shows.
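As an illustration of what such a hand-off involves, a delivery script typically organises OCF under a predictable bucket prefix before upload. The naming scheme and bucket below are hypothetical, not any studio’s actual delivery spec; the upload itself would normally go through a tool such as the AWS CLI or boto3.

```python
# Sketch: build a predictable S3 object key for original camera files (OCF)
# ahead of upload. Show name, day and roll structure here are illustrative.

def ocf_s3_key(show: str, shoot_day: int, camera_roll: str, filename: str) -> str:
    """Return an S3 key like 'SHOW/DAY_003/A001/clipname.mxf'."""
    return f"{show.upper()}/DAY_{shoot_day:03d}/{camera_roll}/{filename}"

key = ocf_s3_key("myshow", 3, "A001", "A001C002_230101_R1AB.mxf")
print(key)  # MYSHOW/DAY_003/A001/A001C002_230101_R1AB.mxf

# The upload itself would then be a single call, e.g. with boto3:
#   boto3.client("s3").upload_file(local_path, "studio-ocf-bucket", key)
```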

“We have had to pivot to create hybrid solutions in nearly all our 55 offline suites.”


TOM WOODALL We use Teradici to allow offline editors who want to work remotely access to an Avid within our closed network. They have access to their project and media on NEXIS as they would in Soho but with the freedom to continue work globally with access and permissions controlled centrally.

Our online editors can remote in, again via Teradici, to either Flame or Avid. They can have setups at home using our MoliStream platform, which can stream – over NDI or SDI – an HD SDR image. We don’t use any other finishing applications within the public or private cloud given the collaborative nature of the work, which is generally client-attended at Molinare.

We are governed by the demands our clients put on us. We have had to pivot to create hybrid solutions in nearly all our 55 offline suites, while also keeping our and our clients’ data both secure, within a tightly controlled internal system, and accessible when and wherever they want it. The advantage of MoliStream is that we can stream NDI from anywhere, or, if a client is fully remote, from a series of Z4Rs. As a service provider we have to be reactive and agile whilst remaining cost effective.

MALCOLM ELLISON Some of our clients who might visit London will want remote sessions. Most clients still want to be with the operator.

There are projects where LA will hop on and take over on conform and shot dropping. If we’re getting material late at night and there’s a review in the morning, the Americans will jump in. But not much more.

“Remote working is an old expression which we should all move on from; it implies the workforce is not connected.”


Succession, S4 – Cinelab, 35mm processing and scan

DAVID KLAFKOWSKI Access to a global talent pool has been a reality for several years. There are many tools that enable collaboration for these disparate teams, but our industry has a habit of “re-inventing the wheel” from project to project. Tech producers and coordinators pick and choose from the various new (and old) options available, often resulting in a frustrating learning curve, a multitude of new passwords, MFA (multi-factor authentication) levels and disparate non-interactive software components.

All of which leads to lots of wasted time and rather than the sense of enablement, most of those involved are left with an underlying sense of “wouldn’t it be easier if we were all in the same place”, particularly those not able to spend the afternoon on the beach. (This being one of the major upsides of the modern disparate workforce, for those clever enough to have pulled it off.)

Our Tanooki platform links people with their work and their preferred tool sets, providing a single unifying gateway. Leaving plenty of space and time to refine and re-invent the workflow (when required) without the frustrations of access control, data location, hardware allocation and compatibility.

When handling the large volumes of data involved in long-form productions, centrally locating storage and compute (in a cloud-like fashion) makes obvious sense. Why spend precious time and money moving data (or really copying it) from location to location? Get it all somewhere accessible and work from it there.

Applications like LucidLink sort the data, “cloud like” compute (public or private) sorts the GPU and CPU. Using public providers for all these processes can be very liberating for a technology team (in essence, outsourcing all the hardware architecture). In theory, the tech team will no longer be spending time nursing old hardware as they sweat CapEx assets, they can spend their time supporting the production teams, refining workflows, drinking coffee and in the process shifting this team from a cost centre to a billable profit centre. That’s the theory at least.

The bottleneck is and will continue to be transfer and copy time. Superfast and super secure affordable access to the internet (up and down) from any location on the globe is the panacea, enabling true camera to cloud workflows from anywhere and the end of removable storage devices (not happening anytime soon I fear).

Remote working is an old expression which we should all move on from; it implies the workforce is not connected. Work smarter and use technology as the enabler. Bring the team together when they need to be together; the only thing that should change locations – when they want to, or need to – is the team.

“Every editor, director, producer, AP and EP, they all want to work in a slightly different way.”


Hatton – Sky Post Production, grade Mark Mulcaster

ADAM DOWNEY Some clients want to come in, others don’t. The limitations are often on what they need to achieve and the speed they need to achieve it. There’s no ‘one size fits all’.

Every editor, director, producer, AP and EP, they all want to work in a slightly different way. The biggest challenge I have with working remotely, which brings it right back to humans, is communication.

It’s the logic of my daughter sending me a text, all in caps and my response is, “are you shouting at me?” As creative people we learn a lot through body language and how someone associates with what we do.

I don’t care where the data sits. (How it changes the billing model is a completely separate thing.) People talk about networks, then they talk about tools, and on top of that sits a person that’s working that tool. It’s that human factor, that’s still the important bit.

“We can’t change the culture; we need to learn to thrive within it.”


MARK MALTBY Technology continues to evolve at pace, presenting a remarkable level of choice and flexibility to film and television producers. This flexibility can engender a trend towards over-burdening the decision-making process, creating delays well into the closing weeks of post.

Like most post-production facilities, we realised it was fruitless lamenting missed deadlines. We had to learn to adapt to survive in this ‘last-minute’ culture.

The main challenge with such delays is the effect they have on efficiently managing our finite resources. Having our compute and storage already hosted in the private cloud means we can leverage the power of Infrastructure as a Service (IaaS). This allows us to grow our infrastructure quickly and temporarily in response to unexpected – or expected – demands.

We strive to ensure uniformity across all workstations so that transitions between machines are seamless. Jumping onto an IaaS workstation to expedite a task feels as natural to the operator as being on a ‘local’ machine, with no loss in efficiency due to unfamiliar settings or missing plugins.

Our commitment to cloud and IaaS is about shaping the facility for the future. We can’t change the culture; we need to learn to thrive within it so that, even under the tightest deadlines, our clients’ visions can continue to be realised with no detriment to the quality or our team’s work life balance.

JOE STABB We recently switched over to PixStor for our central WIP storage. We’ve also got Ngenea giving us that cloud component. We’re more futureproof and our clients can send us media, or we can collect it, from anywhere in the world.

I can see this workflow developing over time and more camera to cloud workflows are coming around the corner.

Wanting a single source of truth was a key driver for our investment. Making multiple copies of high bandwidth data and project work adds up quickly and it can be a pain to keep track of.

ROB SIDDAL We have bound machines and storage to our domain and use AD (Active Directory) to manage user access and permissions internally. We have a dedicated IO and Pulse team who manage the data across all volumes, projects, and locations. They are responsible for client deliveries using solutions like Pulse, Aspera and MediaShuttle, and for media management including any updates from production, VFX vendors and so on. We have all major live streaming services available for client review; this has been driven by client preference.

All data is secure within our own production network. We manage and support most storage in house, and for anything that requires third-party vendor support we have secure VPN tunnels. Clients and other third-party productions do not have access to any of our storage, due to studio security agreements and requirements. We also have a dedicated security council who review all existing and additional internal/external network access requests.

“We can just spin up infrastructure on demand. That often works well for start-ups.”


SEAN BAKER Although there’s quite a lot of difference in approaches, I think most post and VFX houses are enacting the same processes and have got similar challenges from an infrastructure perspective. It’s storage in a data centre somewhere, or in your own facility. Ultimately, it’s down to what delivers the right level of performance for your need and what makes the most commercial sense from a cost perspective.

The more established post and VFX businesses with more predictable requirements are happy to look at hosted services with a longer-term commitment, so they can define their own requirements in more detail. We can provide hosted infrastructure within our data centre, like we do for The Look, who want to run their own private cloud, but with infrastructure as a service. They can be more flexible, and it frees them up to focus on what they do best.

JAINA DESAI And we provide a more comprehensive service to those who want to outsource and run on an OpEx model. We can just spin up infrastructure on demand. That often works well for start-ups where capital investment is already stretched, especially now that borrowing costs are so high.

“We’ve embraced the power of both private and public cloud.”


Landscapers – The Look, grade Thomas Urbye and online Mark Maltby

Up in the clouds

TOM MITCHELL At Mission, we’ve embraced the power of both private and public cloud in our dailies, DIT, video playback and digital lab services. Our labs can be remotely hosted, and operators can access them from anywhere. We use the cloud to host dailies and live feedback from the set, and our DITs leverage it to back up and share metadata on set. The cloud also provides robust security measures, access to AI tools, and the ability to leverage metadata for media use.

Our software tool, Origami Phoenix, which my team has developed, plays a crucial role in this process. It offers any post and VFX vendor self-managed automated VFX and DI pulls and the flexibility to work in a distributed manner, speeding up project turnover and granting access to high-performance computing and storage resources in the cloud.

“We see our usage of public cloud increasing, especially in the unscripted side of the business.”


ROB SIDDAL We see our usage of public cloud increasing, especially on the unscripted side of the business. We have been working on some POCs and changes in workflows to accommodate a shift to a more hybrid storage approach, with a view to moving more client-facing software solutions to the cloud, such as our own review and approve and archive solution, Fred. Currently our front-end services use public cloud for our automated dailies solution, Pulse.

MALCOLM ELLISON The public cloud comes down to cost. That’s the biggest challenge. It’s still very expensive to move data in and out of the cloud. To burst into it? Yes, absolutely. But it’s still a question of how you’re pulling that data down, of egress fees and API calls.
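The scale of those egress fees is easy to sketch. The rate below is illustrative only – cloud pricing varies by provider, region and volume tier – but it shows why pulling a large project back out of the cloud is the cost that bites.

```python
# Back-of-envelope cloud egress cost. The $0.09/GB rate is an illustrative
# figure, not any provider's quoted price; real pricing is tiered.

RATE_PER_GB = 0.09  # USD per GB, illustrative only

def egress_cost_usd(terabytes: float, rate_per_gb: float = RATE_PER_GB) -> float:
    """Cost of pulling `terabytes` of data back down from the cloud."""
    return terabytes * 1000 * rate_per_gb  # TB -> GB, times per-GB rate

# Pulling a 50 TB finishing project back down once:
print(f"${egress_cost_usd(50):,.2f}")  # $4,500.00
```

At that kind of rate, bursting compute into the cloud can make sense while repeatedly round-tripping the media does not – which is Ellison’s point.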

TOM MITCHELL Contrary to popular belief, I would argue that cost is not the primary downside of the cloud. If you’re merely replicating your existing processes in the cloud, it will be more expensive. For instance, having 150 VFX artists rotoscoping in workstations in the cloud would be costly. But if you leverage the power of AI and automation tools that can do 90% of the work in a few minutes, the cloud can significantly save time and money. Therefore, the key to cost-effectiveness in the cloud lies in reimagining and optimising your workflows to take full advantage of the cloud’s unique capabilities.

Heartstopper S2 – Molinare, grade Lee Clappison

“There is a good deal of upside in the public cloud, especially if you are building from scratch.”


ADAM SHELL Our use of the public cloud will undoubtedly increase over the next five years as the 2030 plan starts to come into effect. I don’t think this will involve us investing much in the public cloud as the studios will build their own islands and we will hook into them as required. I do see us needing to invest heavily in connectivity. A 100Gb/s internet connection would not be overkill if every asset needs to be uploaded into the cloud.
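The arithmetic behind that sizing is straightforward to sketch. The figures below assume the full line rate is achievable, which real links rarely sustain, so treat them as best-case bounds.

```python
# How long does it take to push a day's camera originals up a given link?
# Assumes the link runs at full line rate, which is optimistic in practice.

def upload_hours(data_tb: float, link_gbps: float) -> float:
    """Hours to transfer data_tb terabytes over a link_gbps gigabit/s link."""
    bits = data_tb * 1e12 * 8            # terabytes -> bits
    seconds = bits / (link_gbps * 1e9)   # bits / (gigabits per second)
    return seconds / 3600

# 10 TB of rushes:
print(f"1 Gb/s:   {upload_hours(10, 1):.1f} h")    # ~22.2 h
print(f"100 Gb/s: {upload_hours(10, 100):.2f} h")  # ~0.22 h, about 13 minutes
```

A working day’s rushes that would saturate a 1Gb/s line overnight clears a 100Gb/s line before the first coffee, which is why the bigger pipe stops looking like overkill.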

What is less clear to me is how independent film will be affected by this move. There may be an opportunity to build cloud infrastructure that the independents can use but whether the economics of this will be viable I am not yet sure.

There is a good deal of upside in the public cloud, especially if you are building from scratch. The ability to easily scale is very beneficial especially if you have space constraints. One area that will see huge change is deliverables. Being able to generate deliverables through a publish function and using AI to generate captions, subtitles, translations and versions may take a lot of work away from the traditional post-house.

We may see a move to a more freelance model where post-talent is directly employed by the studio (in much the same way as on set talent is now). How post-houses attract and retain talent when this model becomes the norm poses some interesting challenges and opportunities. Is the post-house of the future more like an agency but with some nice rooms for client-attended sessions and very little hardware? At that point does it need to be in London, or do we see the rise of residential post houses like we did with recording studios in the 80s (Air Studios Montserrat, anyone)?

The Power – Mission, on set dailies and DIT

The single source of truth

MALCOLM ELLISON With everyone delivering in, the single source of truth must be that final post facility that has all the data. Otherwise, someone will create a new thing that you then copy and work on… and then you have a new thing… and they still think they have the new thing… We used to pass a tape around that had the original camera negative recorded on it. That is about as pure as you can get. Now, we build an island and break it down from there.

“There is a better way to achieve a global workflow where you’re unifying your islands into one namespace, one data set and everybody’s working on that data set.”


MOLLY PRESLEY There is a better way to achieve a global workflow where you’re unifying your islands into one namespace, one data set and everybody’s working on that data set.

It can sit on your PixStor or your Nutanix or your S3 or whatever it is. It doesn’t matter because you have a namespace presenting it and remote access to it. You don’t have to worry about unifying VFX projects as everyone is working on the same data set. This is what Hammerspace is doing.

It’s like when you back up your iPhone to iCloud. When you get a new-generation iPhone, you just enter your user ID and all your data’s there. Accessing the same data is as fast on your iPhone as on your Mac or your iPad. They all have very different performance capabilities, but it’s the same data set. And when you change something on one device it presents on all your other devices.

That’s essentially what Hammerspace is doing for content across different storage architectures and whatever operating systems. It doesn’t matter if it’s Windows or Linux or Mac.



ERA has been delivering IT and workflow solutions since 1998.

Today, the company is one of the UK’s leading independent providers of IT workflow solutions for clients in the media, VFX, post-production and broadcast industries, providing Infrastructure as a Service (IaaS) and other managed services to meet the needs of demanding projects.

Many of our top customers have been with us for over ten years; in that time, we have invested in infrastructure that enables us to serve customers with operations across the UK, India and Canada.




Hammerspace delivers a global data environment spanning data centres and AWS, Azure and Google cloud infrastructure.

With origins in Linux, NFS, open standards, flash and deep file system and data management technology leadership, Hammerspace delivers the world’s first and only solution to connect global users with their data and applications, on any existing data centre infrastructure or AWS, Azure and Google services.


James Bennett
