

Top colourists on the art of the grade

Leading movie and high end TV colourists - Adam Inglis, Adam Glasman, Asa Shoul, Jean-Clément Soret and Paul Ensby - reveal the secrets of their craft, and explain the techniques they used to grade movies like The Danish Girl, Peterloo, Baby Driver, Slumdog Millionaire and The Lady in the Van

Adam Inglis
Freelance Colourist
Credits include
Churchill, Sherlock Holmes, Peterloo, V for Vendetta, Dredd, Swallows and Amazons, The Nice Guys, Billy Lynn’s Long Halftime Walk, Blue Planet II, Wolf Hall, Poldark

I feel that a ‘good’ grade is an appropriate grade. By appropriate I mean one that should always serve and be dictated by the story.

I don’t think you should really notice the colourist’s work, even if it is quite extreme, and you certainly shouldn’t be able to work out what tools they used to achieve it. If it seamlessly integrates with the style, themes and other elements of the production, then while you may feel it, you’re not going to notice it as being out of place.

I’m normally off doing other grades while the shoot is going on; the shoot side of things is taken care of by the DIT. I might work with the DoP on tests and LUTs before the shoot but then we only start the grade properly once we get into the suite.

The film itself is the biggest influence on what we do in the grade. I have often said that the film will tell us how it wants to be graded. However, the film does not work in an artistic and cultural vacuum. Other films, paintings, photographs and the natural world all contribute to and inform the visual language that we are working with. I try to see as many of these as I can to enrich my understanding.
I like to work in a series of passes, rather like drafts of a script, coats of paint or sanding a piece of wood. I don’t feel that you can always get it right first time and that it’s best to get the basic structure in place first before building upon that and allowing the grade time to evolve naturally.

I start with the big picture decisions and then work all the way down to smaller details. It gives you freedom to change direction and feel how things flow much like a piece of music. Get the scenes in place first, then the shots and finally individual elements of the shots.
Everyone on a film should be working towards what the director wants at all times. I don’t feel there is such a thing as ‘my’ vision of the grade; the film is the director’s vision. The role of the colourist is to find the best way to get to our destination, much like being a taxi driver – you use your knowledge and experience. There’s no point taking your client to the museum if they want to go to the airport.

I don’t feel that the director and DoP have to be in the room with me at all times. There’s a certain amount of work I can do, and often do faster, on my own. It can be efficient and productive to take a direction from them, have some time alone with the material, and then present something to the director or DoP that they can react to with a fresh set of eyes.
I am a big proponent of HDR, especially for wildlife films where it finally allows us to achieve the ‘window on the world’ feeling that we have been working towards for so long.

I don’t see any difference between the way I approach a TV drama or a feature film, nor even why there should be any difference. Wildlife documentaries however are different – the images should look and feel as real as possible, which is actually a very precise target to hit. In the fiction film world, artifice is part of the storytelling, it’s what separates ‘art’ from ‘reality’.

I love seeing film prints but there’s no question that digital exhibition has made a colourist’s job easier. I am an ACES advocate. It’s not unusual for films to be shot on multiple cameras and have multiple delivery formats, so the more order we can apply to that chaos, the better.

I work on Baselight, Resolve, Nucoda and Lustre, with Baselight being my first language system. You can’t really restrict yourself to just one system as a freelancer.

I think communication is the most important skill. Getting inside the creative minds of the director and DoP is the key to efficiently crafting an effective final product. You need to get to that synthesis of vision. In an ideal grading scenario, a colourist doesn’t need to be asked to do something since they’re already doing it anyway.

Adam Glasman
Senior Digital Intermediate Colourist
Goldcrest Post
Credits include
The Danish Girl, Three Billboards Outside Ebbing, Missouri, Cold Mountain, Anna Karenina, The Best Exotic Marigold Hotel, Skyfall, Casino Royale, World War Z, Rush, Marley, In Bruges, The Death of Stalin, The Boy in the Striped Pyjamas, Macbeth, The World’s End, King of Thieves

I come on board in pre-production, where I am generally involved in camera/lens and hair/makeup tests. At this point I may be given a script and discuss the look with the DoP. I often work with the test material to create look LUTs, allowing the DoP to preview material on set in real time with an approximation of the intended grade. This can be applied to the dailies for editorial, or references can be given to the DIT or on-set colourist on larger productions.

With most projects, I grade previews in advance of the main DI. Although the material is generally of a lower quality and resolution (HD MXF from editorial) than available in the final grade, previews can be a useful part of the look development process. They allow filmmakers to see grade concepts in context for perhaps the first time.

VFX-heavy productions often require grading work during the shoot, so that VFX vendors can preview their work with something close to the final look.
I generally prefer to work with the DoP in attendance. My role as a colourist is to achieve the DoP/director’s vision, so I won’t try to impose my taste. However, I will attempt to guide the grade in a direction suited to the material and narrative. With regular clients, I tend to be more vocal as they obviously trust my judgement.

The most important step in the process is watching an up-to-date cut of the movie before commencing the final grade. Next, in collaboration with the DoP, I create look references for all key scenes in the movie and get this approved by the director. At this point I grade all material chronologically using the references. When complete, I review and adjust with the DoP, making sure everything is consistent and working in context and then show this completed version to the director and producers for final sign off.

Inspiration can come from other films, commercials, photography and art. I am often asked to look at the work of certain photographers and artists to use as grading references. Recent examples were photographer Saul Leiter and the paintings of Francisco Goya.

I’m usually given two to three weeks to grade a feature. This can increase greatly with VFX-heavy tentpole features. As an example, I recently graded Jurassic World: Fallen Kingdom which required over six weeks to complete. This is partly due to the number of delivery formats, which in this case involved DI, 3D, SDR (Rec 709), Dolby domestic HDR, Dolby theatrical HDR, Dolby theatrical 3D, and domestic 3D. Time constraints are generally much greater with broadcast projects.

A broad knowledge and love of cinema is essential for features colourists. A good technical understanding of the process is also a great help. I studied image science and worked in stills before I moved to features post, and my background has proved invaluable. Perhaps the most important skill is the ability and confidence to control the grading room. This really comes from practice and experience.

Asa Shoul
Senior Colourist
Warner Bros. 
De Lane Lea
Credits include
Mission Impossible: Fallout, Isle of Dogs, Annihilation, Ex_Machina, Baby Driver, Kick Ass, Yardie, The Constant Gardener, Layer Cake, United 93, Clash of the Titans, ‘71, Brooklyn, The Crown, Generation Kill, Tin Star, SS-GB and The Alienist

A ‘good grade’ is one that is consistent, doesn’t bump from shot to shot and feels appropriate to the subject matter, genre and medium. A ‘pushed’ grade, where environments and skin tones might look unnatural or heightened, could work with a fantasy or comic-book story, but might jar the viewer if applied to a documentary or more natural-feeling film.

I try to get involved as early as possible. Ideally at script stage, but definitely at camera and lens testing, then costume, hair and makeup. We’ll start to exchange looks via images from different places (these might be paintings, photographs or stills from other films). For The Crown, we did a number of tests just looking at lipstick so that we could nail it and avoid me having to adjust it in the grade for hundreds of shots. Testing costumes to see how they appear with particular looks applied is important, particularly with a pushed grade.

Inspiration depends on the subject matter and the period the film is set in, but also on the mood and emotional content; we might decide to attach broader tones to emphasise character arcs and beats within the film.

The DoP or director might brief me about the project, but I am prepared for this to change. The initial brief from director Matthew Vaughn for Kick Ass was to go for a naturalistic almost documentary look, as the main character did not have super-powers. This eventually changed, and we referenced Dick Tracy and similar comic-book looks, creating a vibrant airbrushed film.

The starting point of any grade is to play with the image and see initially where it wants to sit. It’s important to understand the latitude that the exposure has given you to play with and to find how far the image can be pushed. I’d balance scenes without diving in to adjust face colour or skies and so on.

My first pass is to balance shots together, then a pass to work on the look of the film, then further passes to shape the images, applying grads and shapes to highlight areas of the frame or push them back into shadow.
I can develop a shorthand with some previous clients. I can usually get pretty close to their desired look with an unattended first pass. With new clients it takes time, perhaps a day to understand how they see an image, their preferred levels of contrast and saturation and the terminology they might use to describe colour or an emotion.

I don’t try to drive anything to my preferred look. I like diversity and to explore new looks anyway, but if asked I feel it’s important to be honest if you feel the image is being compromised or the look is working against the narrative.

I use Baselight. It was designed by colourists and colour scientists and continues to be developed by FilmLight in a collaborative and inclusive way.

I think it’s important to be able to start again on a scene if you think a new approach will give a better result. Patience is a key skill for colourists, as is not being precious about what they may have offered.

Jean-Clément Soret
Supervising DI Colourist
Technicolor London
Credits include
Slumdog Millionaire, The Other Boleyn Girl, 28 Days Later, Mandela: Long Walk to Freedom, T2 Trainspotting, You Were Never Really Here, Serenity, Steve Jobs, 127 Hours, The Twilight Saga: Eclipse, Sunshine

On movies you get involved early as you are asked to provide material for the dailies. It’s mostly last minute on commercials, but some projects are all about the grade; in that case you have pre-grade sessions before the big one. Also, the look can evolve during post and is finalised at the end.

A brief from the director and/or DoP usually comes with references of previous work done together, or from other people’s work, as well as movies and photos, but it’s rarely to be taken literally; there is always room for improvisation and input.

My influences come from my own culture and background, or my back catalogue of work, anything I do when I don’t sleep.

When film was around, the starting point in grading was a print from your camera negative at a 25-point RGB print value. This was a true representation of the DoP’s work without any creative intention.
I try to emulate this before we start grading; a flat raw image doesn’t mean anything other than the range of exposure you have in your image.

I tend to present something I believe is working for the project to the director/DoP as I go along. I try a few things, but options narrow naturally very quickly. I don’t like overly forced looks unless there is a good reason.
I respect all opinions in the room but put a good degree of my vision in a grade, that’s what keeps me going after so many years.

A ‘good’ grade is one that suits the cinematography, is well balanced, adds production value, and meets the client’s requests, but with subtlety.

Different types of project demand a different approach. Drama is a race against the clock, commercials are a luxury, movies are in between; the more time you have, the more you can explore, think and give attention to detail.

Some clients have embraced HDR, and it almost becomes a second grade. Others haven’t and the HDR looks almost the same as SDR. There is a bit of skepticism about it. We will see how it develops.

One current trend in grading is that everyone wants their digital to look like film. Print looks have been popular for a while. On the technology side, new kit makes [the job] easier but new technology can also make workflows unnecessarily complicated.

Two pieces of advice for colourists: Always go too far, you can always back down later but at least you will not regret missing an opportunity. Attention to detail is key.

Paul Ensby
Senior Colourist
Company 3 (Deluxe)
Credits include:
The Lady in the Van, Allegiant, Amy, Mary Queen of Scots, Great Expectations, A Quiet Passion, The Man from U.N.C.L.E., The Riot Club, Electric Dreams (TV series), Watership Down (TV series), Origin (TV series), Eric Clapton: Life in 12 Bars

There isn’t a right or wrong grade. I look for things such as, does the grade help enhance the narrative, and technically is it well matched within scenes?

When I get involved varies with the project. Sometimes I am involved in camera testing, right through dailies and finally through final DI. In some cases, I only get involved after the project is edited. I’m not normally hands-on again until the finishing process, unless asked to specifically look at a problematic set up during the shoot.

Defining the look is a collaborative process. [At the beginning] working closely with the DoP and director, we would normally try to form some early reference images of the look or looks. I then prefer to show them something I have worked on, albeit even a basic grade, based on what we have discussed previously. Ultimately, the director/DoP will have the final say but I wouldn’t be doing my job if I didn’t contribute based on previous experiences.

When looking for inspiration, I love the early Technicolor films such as The Red Shoes and Black Narcissus, right through to the Sergio Leone westerns.

I tend to do a basic matching pass initially to gauge the flow of the film, then go through the project adding more secondary corrections, then a third pass and so on. This way it’s easier to know how the film feels on the run, rather than getting bogged down in smaller details too early.

I’ve used DaVinci Resolve since I moved to Company 3 in 2014. Blackmagic has added many features that help, but the excellent tracker for windows has saved me a lot of time recently.

How long you typically get to grade a project varies. Movies tend to take anything from five days to four weeks depending on the show, while for broadcast drama we get a lot less time. Maybe two to three days per episode!

I’ve delivered several HDR DCPs. HDR opens up the limits where SDR can’t reach. With enough care and attention, it can look amazing! Some scenes explode with contrast and detail which is great. Other scenes require a lot of containment as often the image needs a subtler, softer approach.

I am not seeing an overall trend in film grading, but personally I have pulled back a little from everything having to look like 35mm film!

Diplomacy, patience and time management are essential skills for a colourist. An ability to identify how or where to improve an image quickly helps. Plenty of experience dealing with all sorts of formats and a creative eye for detail is important.

Posted 07 December 2018 by Michael Burns

Let's get high: the route to HDR

From the shoot to post to final delivery, Michael Burns discovers the best route to HDR

The long-awaited promise of high dynamic range (HDR) TV series and programmes is now being delivered, appearing on (compatible) screens with greater regularity. But if you’re thinking of going down this route, how should mastering to HDR change the way you approach projects, from planning and production management, and from shooting on set to the post workflow and grading? And why go to all this trouble?

Post planning for HDR
Streaming services, particularly Netflix and Amazon, are throwing a lot of weight behind HDR, and facilities are picking up more and more work on the format. “We’ve been mastering shows in HDR for Netflix, Amazon, Fox, Hulu, Sony, among others,” says Encore MD Morgan Strauss.

The Farm has mastered several high-profile projects in HDR for Amazon and Sky. Peter Collins, head of scripted pipeline at The Farm Group, says the main impacts in terms of budgeting are those of storage and delivery, throughout every stage of the project lifecycle.

Production requirements include shooting at higher resolution, meaning larger data sets from the cameras, while post requires larger storage allocations to store Digital Intermediate files at both a higher resolution and higher bit depth, says Collins. “Then there’s the additional network overhead that comes along with that, while the archival delivery masters to the commissioners requires more storage allocations again.”

Investment needed
Goldcrest CTO Laurent Treherne says being able to deliver 4K HDR content requires substantial investment. “First you need to be able to display HDR content, and a grade 1 HDR monitor is quite expensive,” he says. Goldcrest uses the Sony X300 monitor for all video SDR and HDR deliverables. “You also need to be able to play back and grade uncompressed UHD resolution 16bit files,” he adds. To give an example, an uncompressed 16bit UHD/HDR file is nearly six times heavier than a standard uncompressed 10bit HD file.
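Treherne's comparison can be sanity-checked with back-of-the-envelope arithmetic. A rough Python sketch, assuming tightly packed, uncompressed RGB frames and ignoring container overhead (the 25fps figure is illustrative):

```python
def frame_bytes(width, height, bits_per_channel, channels=3):
    """Size in bytes of one tightly packed, uncompressed RGB frame."""
    return width * height * channels * bits_per_channel // 8

hd_10bit = frame_bytes(1920, 1080, 10)    # standard HD master
uhd_16bit = frame_bytes(3840, 2160, 16)   # UHD/HDR DI file

print(f"HD 10bit frame:  {hd_10bit / 1e6:.1f} MB")   # 7.8 MB
print(f"UHD 16bit frame: {uhd_16bit / 1e6:.1f} MB")  # 49.8 MB
print(f"Ratio: {uhd_16bit / hd_10bit:.1f}x")         # 6.4x

# One hour of picture at 25fps, uncompressed:
print(f"{uhd_16bit * 25 * 3600 / 1e12:.1f} TB/hour")  # 4.5 TB
```

Four times the pixels multiplied by 1.6 times the bit depth gives the 6.4x figure, which accounts both for Treherne’s “six times heavier” and for Collins’s storage and networking concerns.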

Accommodating HDR has been the biggest investment the company has made, says The Look CEO and senior colourist Thomas Urbye. “We combined the need for 4K workflows, monitoring and delivery with HDR support. We have installed a 100Gb network to move 4K 16bit data like HD 10bit data, and the machines are using very fast graphics cards – we can’t afford for these big dramas to be waiting for copying or rendering.”

HDR flavours
One key factor in planning seems to be the flavour of HDR being used. Encore’s Strauss says: “Both of the prominent flavours, HDR10 and Dolby Vision, are very similar in terms of the colour science.” But their workflows can vary. Dolby Vision is a simplified, prescribed workflow using [Dolby] proprietary software and hardware. HDR10 is delivered through off-the-shelf platforms like Resolve, Baselight, and Lustre. “There are a few extra hoops to jump through depending on whether it’s Dolby Vision or HDR10.”
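One practical difference behind those extra hoops: HDR10 carries static metadata for the whole programme (MaxCLL and MaxFALL, defined in CTA-861.3), whereas Dolby Vision adds per-shot dynamic metadata. As a minimal pure-Python sketch, the two HDR10 values can be computed from linear-light pixel values in nits (toy lists here; a real tool would stream decoded frames):

```python
def max_cll_fall(frames):
    """Compute HDR10 static metadata from linear-light frames.

    frames: iterable of frames, each a list of (R, G, B) pixel
    values in nits (cd/m2).

    MaxCLL  = brightest single-pixel value in the programme.
    MaxFALL = highest per-frame average of the per-pixel maxima.
    (Definitions per CTA-861.3.)
    """
    max_cll = 0.0
    max_fall = 0.0
    for frame in frames:
        pixel_maxes = [max(r, g, b) for r, g, b in frame]
        max_cll = max(max_cll, max(pixel_maxes))
        max_fall = max(max_fall, sum(pixel_maxes) / len(pixel_maxes))
    return max_cll, max_fall

# Toy example: two 4-pixel "frames"
frames = [
    [(100, 80, 60), (900, 900, 900), (50, 50, 50), (10, 10, 10)],
    [(400, 300, 200), (400, 400, 400), (400, 350, 300), (400, 400, 380)],
]
cll, fall = max_cll_fall(frames)
print(cll, fall)  # MaxCLL 900 nits, MaxFALL 400 nits
```

Because these two numbers describe the whole programme, a single very bright shot can dominate how an HDR10 display tone-maps everything else, which is one reason the trim-pass workflows described below matter.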

Production budgets are starting to reflect this kind of investment, says Peter Collins, who adds: “The feedback from our clients is that the creative opportunities that the technologies have to offer have been well worth the budgetary increases.”

Shooting and lighting in HDR

Though attention is focussed on HDR as a post process for now, other stages of the production need to take heed of the difference in approach.

Fleabag, the 2016 foray by the BBC into HDR, saw Liz Pearson take on the role of post-production supervisor (working with The Look). “A lot of people think [HDR] is just a new flavour of HD. And it’s not,” she says. “It’s more like you are shooting film, than data.” Departments like camera and make-up need to be briefed beforehand: “It’ll show everything up – every makeup line, every hair, every shadow.”

Monitoring on set
Although it’s an expensive extra, Pearson recommends an HDR monitor on set, and having someone assigned to checking for the differences – the hard-pressed DIT can expect to be even more busy as HDR begins to roll out.
“HDR reveals things that you may not have been aware of in an SDR version,” agrees Molinare’s Chris Rogers. “If HDR isn’t considered on set, then lights and other bits of kit that were lost in ‘blown out’ windows may become obvious in HDR.”

Films at 59 project manager Miles Hall says noise reduction is a key issue as HDR exposes a lot more. “The greater dynamic range also requires more accurate graphics and VFX; they have to be spot on.” Camera choice is important too, particularly in genres like natural history where a wide range of cameras are used. “HDR is pretty unforgiving of certain camera formats,” says Hall.

Maximum latitude
The Farm Group’s head of scripted pipeline, Peter Collins, says files should either be recorded as a raw format, or as a de-Bayered log image using the widest colour gamut possible. This ensures the maximum colour latitude is available in post and for archive.

“There are certain conditions where a combination of shutter speed and motion within the frame can introduce visible judder in high dynamic range images at current common frame rates [24/25/30 and so on],” Collins adds. “So it’s advisable to consider this when shooting motion in scenes with a lot of light, or when panning shots.”

Technicolor head of broadcast Louise Stevenson says her facility will do tests with the camera department to check how the HDR process will impact lighting and composition. “For example, practical lights in the frame can become very bright and distracting.”

She also warns that HDR should be used effectively. “Highly saturated colours may not benefit a period drama, for example,” she adds.

“It is even more important to capture well-exposed images, as a 4000 nits HDR grade can be unforgiving with noise or clipped highlights,” continues Stevenson. “Shooting with high dynamic range cameras such as those from RED, ARRI, Sony or Panasonic will give the best possible results.”

Encore MD Morgan Strauss says production values will shine through with HDR. “Something you may not have noticed in the background due to contrast or gamut limitations may become clearly visible in the larger colour gamut HDR pass. It could be that a translight behind a window is more obviously not an actual cityscape.”

Sensitive shooting
Encore senior colourist Tony D’Amore also stresses how much HDR is sensitive to highlight ratios: “In production, if you establish a strong ratio, you have to make sure to balance that with translights or backdrops, and outside lighting. The outside level has to match, or it will be obvious. Also, be careful not to clip highlights in camera, especially if you plan to shift the colour drastically in post.”

“Ideally all images would be viewed on an HDR monitor on set,” stresses Peter Collins. “However, any unforeseen issues caused by the higher contrast can be somewhat mitigated or completely eliminated in post, while retaining the original creative intent of the scene as shot.”

The post workflow
At Films at 59, most of the HDR experience to date has been with the natural history film-making community in Bristol. Due to the very long lead time on these productions, the HDR requirements have been an added extra rather than being part of the original deliverables.

Project manager Miles Hall says: “As natural history productions tend to be shot in the log formats that we need for HDR anyway, we didn’t need to change our approach too radically.”

HDR or SDR first?
The main issue is whether to take a programme through post in HDR and make the SDR pass at the end, or whether to post in SDR first and to treat the HDR version as separate. After all, the SDR version is the primary deliverable and one that 99% of people will see.

“Either way raises its own issues and we work with production teams to determine the best approach for them,” says Hall.

Films at 59 has delivered Planet Earth 2 and Blue Planet 2 using the HLG standard. This was a relatively straightforward process, says Hall. Films at 59 could adapt its hardware to meet the standard and still maintain its typical workflows, so everyone, including clients, was comfortable with the process. However, the Dolby Vision standard, and the archival requirements of broadcasters like Netflix, have meant a significant change in workflows. “We have had to adopt a more film/DI based workflow that allows us to finish in the kit that supports the Dolby standard. This means having a workflow that manages the colour pipeline between graphics, VFX, online and grading platforms to make sure we maintain the integrity of material through the process.”

The Farm’s Peter Collins agrees that post production workflows have had to adapt to a DI pipeline with HDR. “This involves efficient and flexible colour management to facilitate the additional requirements for grading and final mastering,” he says. “In addition, the size of the data sets necessary for HDR work require more resources from an infrastructure perspective including storage, networking and monitoring.

“Any kind of technical compromise on the picture quality is visible in HDR,” continues Collins. “The grain or noise needs to be managed more closely. A limited bit depth or any sort of compression on the media could be ok in SDR but could become a problem in HDR. So it’s very important to optimise the image pipeline quality throughout the full post workflow in order to guarantee the best quality for the HDR masters.”

HDR10 vs Dolby Vision
Goldcrest typically works with HDR10 and Dolby Vision for both domestic and theatrical deliverables. “The process is slightly different between HDR10 and Dolby Vision domestic deliveries,” says CTO Laurent Treherne. “With HDR10 you are only delivering a 1000 nits HDR version of the content. The SDR version is a separate process with its own grade. For Dolby Vision, you are delivering a single ‘package’ which could include multiple delivery formats – 4000 nits HDR, 1000 nits HDR, SDR, and so on. All those versions are usually done from the 4000 nits HDR grade. You then use the Dolby Vision process to derive and trim the other versions.

“We do not own a Pulsar monitor or a Dolby Vision HDR projector so the main challenge for us was to find a way to do those HDR passes without having to move the media and projects to Dolby in Soho Square,” continues Treherne. “We discussed a few options with Dolby and agreed to look at a remote solution based on a dark fibre link between our two buildings. This setup allows us to remotely control any of our editing and grading platforms from the Dolby building and send the HDR uncompressed 12bit video signal back from Goldcrest to Dolby in real time. Our colourist is sitting at Dolby, but all the hardware, media and project is still located at Goldcrest.”

Trim passes
“The setup is slightly different depending on whether we’re delivering for the Dolby Vision system or HDR10,” says Technicolor’s Louise Stevenson. “When working on a Dolby project we do the HDR 4000 nits grade first and then create the SDR 100 nits trim – and possibly a 1000 or 600 nits trim after that. The trim passes require the use of Dolby’s own CMU analysis system. For HDR10, we typically do the SDR version first and then do a separate 1000 nits pass. If we are doing the HDR version first, then this becomes the primary grade, so we will discuss the setup for this during pre-production.”
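The nit levels quoted in these workflows – 100-nit SDR trims, 1000-nit HDR10 masters, 4000-nit Dolby Vision grades – are points on the PQ curve defined in SMPTE ST 2084, which maps a 0–1 signal value to absolute luminance. A sketch of the transfer function and its inverse:

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """PQ signal (0..1) -> absolute luminance in nits (cd/m2)."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def pq_inverse_eotf(nits):
    """Absolute luminance in nits -> PQ signal (0..1)."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for nits in (100, 600, 1000, 4000, 10000):
    print(f"{nits:>5} nits -> PQ signal {pq_inverse_eotf(nits):.3f}")
```

Notably, SDR reference white (100 nits) already sits at just over half the PQ signal range, which is one reason the SDR trim Stevenson describes is a genuine creative pass rather than a simple scaling of the HDR grade.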

Prepping for the future
“As we are doing a lot of Netflix shows and have to use Dolby Vision it’s a prerequisite that we have to do the HDR first, especially as we aren’t even delivering an SDR deliverable,” says The Look CEO and senior colourist Thomas Urbye. “Having now done this on Netflix’s The Innocents series using Dolby Vision, I would actually always start with the HDR and use the Dolby technology as a fast track to the colour space conversion, even if we still dug in and did some more shapes and tweaks along with it for an SDR file export.”

“So far we have always been asked to start from SDR and then create the HDR master as an additional delivery,” says Treherne. “Saying that, doing the full post from online editing to grading and mastering in HDR wouldn’t be a problem for us. I wouldn’t be surprised to see the process flipping the other way around relatively soon. Most [new] TVs are now HDR compatible, but I think the real trigger for doing the full post in HDR will come from having the onset grade and offline editorial done in HDR. It will then make sense to post in HDR and make the SDR delivery as an additional pass at the end.”

Grading HDR
The BBC’s Planet Earth 2 and Blue Planet 2 were both delivered in HDR formats by Films at 59, so project manager Miles Hall knows well how imagery can benefit from HDR. “You can exploit specular highlights and environments with wide variations in light,” he says. “For example, light coming through a jungle canopy, or the sun reflecting on water or ice floes. HDR in these instances gives you a much closer representation of real life than we are used to.”

The Look CEO and senior colourist Thomas Urbye sees the creative benefits in scenes with dynamic range and colourful landscapes, costumes or interiors. “The images can be absolutely stunning – far more impressive to the eye than Rec 709 SDR,” he says. “We have been showing our clients who have HDR dramas coming up some HDR material, so they understand what will happen in post. They have been blown away by it – the word ‘breath-taking’ is used a lot.”

Time to be creative
Narduzzo Too colourist Vince Narduzzo’s most recent HDR project was Save Me (World Productions) for Sky Atlantic. “It allows you to be more creative,” says Narduzzo. “Less time is taken trying to find detail in the image and more time can be used just enjoying the amazing range you have at your fingertips.”

“The big difference is the latitude and available information within the image,” he adds. “This is fantastic but at the same time care needs to be taken to ensure you retain an image that works.”

Narduzzo also warns about overdoing the grade. “You still want a piece that flows and doesn’t shout out,” he says. “Also, extra detail is great but sometimes not what you want. These are things that can be addressed. I think most programme makers still crave the look of a 35mm frame and this can be achieved. Don’t end up with something eye-popping but crude.”

Senior colourist at Molinare Chris Rogers is currently working on The Widow, a new HDR drama for ITV and Amazon: “The images are simply more striking and are a closer representation of how we perceive the world,” he says. “The additional dynamic range and gamut give us the opportunity to push images in new ways. This doesn’t mean that we will want to grade every scene to the limit, but it’s certainly nice to have the additional scope.

“There are some important considerations when approaching an HDR project,” he adds. “Some of the techniques that colourists have traditionally used to shape the look may only work well in SDR. For example, conventional ‘print emulation’ LUTs are problematic at best, so we now need to consider how to approach a grade that will translate easily and maintain the intent between various display colour spaces.”

Bolder colours
“Certain colours are bolder in the wider colour gamut,” says Encore senior colourist Tony D’Amore. “From a storytelling standpoint, there’s a much wider colour spectrum to work with. Cyan, blue and red will be bolder. If you’re going for a stylised look, those colours tend to be really striking and can evoke emotion. But that vibrancy doesn’t necessarily translate to SDR, which can be a challenge. It’s a reason to preview both versions as you go.”

Most recently D’Amore has graded Marvel’s Jessica Jones season two, Luke Cage season two, Iron Fist season two and Daredevil season three for Netflix in Dolby Vision. Additionally, he is currently mastering Legion season two for FX in HDR10.

Planning the grade
“When grading in HDR, I lay out a plan, starting with a custom wide colour gamut LUT for the specific camera/project,” continues D’Amore. “Next, I set a few looks in HDR then immediately preview them in SDR to ensure that I haven’t lost any potential highlight detail and to see if the look holds up. This approach is especially important with HDR10; with Dolby Vision, I do the HDR and SDR grades simultaneously. This is nice because it saves me a step and is easier on the eyes. Also, Dolby Vision currently allows highlight detail up to 4000 nits – four times that of HDR10. This is a variable that is crucial when balancing between shots. You will be surprised how much colour hides up in the highlights in Dolby Vision HDR.”
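As background to the nit figures D’Amore mentions, here is a sketch of the published PQ transfer function (SMPTE ST 2084) — my illustration of the maths, not his grading method. It shows how much of the signal range those extra highlight nits occupy:

```python
# Sketch of the SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance
# in nits -> normalised signal value in [0, 1].
m1 = 2610 / 16384          # PQ constants as published in ST 2084
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Map display luminance (0-10,000 nits) to a PQ signal value."""
    y = nits / 10000.0     # PQ is defined against a 10,000-nit ceiling
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

for level in (100, 1000, 4000, 10000):
    print(level, round(pq_encode(level), 3))

# 100 nits (a typical SDR peak) sits near half the PQ range; a
# 1000-nit HDR10 master reaches roughly 0.75, and 4000 nits roughly
# 0.90 -- the headroom where highlight detail and colour can "hide"
# in a Dolby Vision grade.
```

In other words, the step from a 1000-nit to a 4000-nit ceiling opens up a substantial band of code values above the HDR10 master, which is where that hidden highlight colour lives.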

D’Amore says the Dolby Vision Content Mapping Unit (CMU), which applies the grading decisions as dynamic metadata, does a good job of previewing the SDR grade. “But it does require some HDR finessing to make sure both versions look perfect,” he says. “This is an important step to ensure that the look holds up in both HDR and SDR before committing to it. The goal is to get as much latitude out of the camera material as possible.”

Technicolor’s Louise Stevenson also stresses the importance of taking time with this SDR ‘trim’ during Dolby Vision grading. “This version will be seen by a large percentage of the audience, so it is essential to take care with this pass to make sure all of the creative team are satisfied with the results,” she says. “It can take time to ‘tune-in’ to the lower contrast look of the SDR version when everyone has been used to the HDR grade, especially if the HDR version has been pushed creatively, so it is important to manage clients’ expectations of what can be achieved in SDR. For HDR10, we typically do the SDR version first and then do a separate 1000 nit pass.”

“It is essential that the post house is geared for this new format [HDR]; it needs great care and attention to ensure a smooth post experience,” says Vince Narduzzo. “I am pleased to be in on the ground floor as it’s an amazing advance; I would say bigger than the transition from SD to HD. This is actually giving better pixels, not just more.”

Posted 17 August 2018 by Michael Burns

Live and Kicking: The Outside Broadcast Report

Major technical changes such as UHD, HDR and IP are driving big changes in the outside broadcast market. Michael Burns reports

The UK’s outside broadcast firms tend to work in two-year cycles – odd years are busy, while even years are manically busy. In 2018, then, the OB sector is as frenetic as an anthill.
The technology suppliers are pushing for higher resolutions as well as HDR, while IP continues to be touted as a panacea for signal transport pressures. From talking to OB companies, however, it appears demand is still lagging behind the curve. Even so, many see 2018 as a watershed year, particularly once universal standards such as SMPTE-2110 are adopted.

On the playing field
The OB sector isn’t in quite as much flux as in the past. “The market remains solid and stable after several years of consolidation,” says Richard Yeowart, MD of Arena Television. “Investment remains key for any company. You can’t sweat your assets forever and when you have a fleet as large as ours, you need to be rolling out several new trucks every year. Membership of the OB club will set you back about £5m per large UHD-IP truck and there is little point in not going UHD and IP if you want to make your money back.”

Mike Ransome, CEO at Presteigne Broadcast Hire, says spending is tight with everyone. “OB remains an expensive business and everyone is doing what they can to trim the cost.”

An interesting direction of travel seems to be towards what’s referred to as simplified production, represented in some OB companies’ kit lists by the Vibox from Simply Live. This ‘OB-in-a-box’ can control up to 12 cameras and offers a video switcher, audio mixer, graphics, slow motion and replay/highlights via a touchscreen control interface.

“[Simplified production] is used widely by European sports broadcasters, especially for the OTT and second-screen market. This is an area where we expect growth for OB operators in the UK,” says Adam Berger, COO of CTV. “Our view is that small to medium-sized work may migrate to simplified production rather than high-end UHD.”

This is perhaps what has really changed – where the OB is transmitting to. “The streaming, OTT part of a broadcast was initially something of an aside, but it’s steadily migrating to becoming the primary way that content is delivered and consumed, especially by younger generations,” says Ransome. “The primary content of a live broadcast is in some ways becoming increasingly incidental.”

Ed Tischler, MD for Gearhouse Broadcast UK, feels the industry is modernising too. “It’s becoming more corporate. There’s a lot more framework in place,” he says.

There has not been a dramatic increase in demand for UHD; rather, there has been slow growth both in the UK and globally. Alan Bright, director of engineering at IMG Studios, says: “There are lots of challenges in terms of storage and transmission formats but maybe the biggest problem is that on 55-inch TVs, it is very hard to tell the difference between HD and UHD.”

 “The demand for UHD has increased but is still a small fraction of the output of the majority of broadcasters,” agrees Quinn Cowper, head of vision – outside broadcasts, at Timeline.

ES Broadcast Hire says it has seen steady year-on-year growth of UHD hire, with around 20% of jobs being UHD broadcasts. Warren Taggart, MD of ES Broadcast Hire, says: “Even when broadcasts aren’t in UHD, we have seen several instances of 4K technology – with its superior imaging capabilities – being used. We expect that trend only to grow in the next 12 months.”

All of Timeline’s OB fleet is now UHD capable. “We anticipate that more and more premium sports and other events will be shot and distributed in 4K UHD and HDR,” says Cowper.

 The demand for 4K UHD is growing steadily, agrees Eamonn Curtin, commercial manager of Telegenic. “More projects are requesting it across all sectors, from sport to light entertainment.” Three of its trucks are UHD/4K capable with a fourth coming soon.

There is a lot of talk about HDR, but producing HDR live comes with its challenges. NEP’s director of technical operations Chris Cannon describes several technical challenges around implementing an HDR workflow, including the grading/racking of cameras, the simultaneous production of multiple colour spaces, and the issue of providing replays when working in the sports sector.

Vicky Holden, MD of Procam Projects, feels the standardisation of what broadcasters require for HDR is the biggest challenge at the moment. “Most productions shoot SDR as well as HDR, and the challenge is to vision engineer for both with just the single iris control,” she says. “Demand [for HDR] is limited for now, but there is a slow increase in requests.”

Jonathan Lyth, technical director of ES Broadcast, reckons that HDR could become a critical part of broadcasting. “From a viewer perspective, it’s probably the most impactful technological development – more so than higher resolution. For that reason HD HDR – as opposed to UHD HDR – could be a great step.”

So HDR is being eased into OB, but costs will have to rise to accommodate it. “Budgets do seem to be larger, especially when Arri Cameras and PL mount lenses are specified,” says VE Live’s technical director Richard La Motte.
“I costed HDR up for one of our clients recently and I would say it was a relatively small uplift,” Ed Tischler recalls. “But it wasn’t huge.”

“Production costs are likely to increase a small amount, but I feel that this will be absorbed by the fact that we all have to invest in future technologies when upgrading trucks, studio, galleries and so on,” says Richard Baker, sales manager at Finepoint.

Arena has clocked up hundreds of IP OBs in the past two years. “The beauty of IP is the ease of upgrading and we have already rolled out several enhancements to retain leadership in the field,” says Richard Yeowart. “We expect the uptake of IP to snowball in 2018.”

NEP UK is building its first two IP trucks and an IP fly-pack system, to be deployed at Wimbledon 2018. Chris Cannon says: “The IP infrastructure enables NEP UK to build far larger broadcast systems into trucks; systems incorporating UHD HDR workflows that would not have been feasible with a baseband SDI architecture.”

Finepoint’s Baker feels IP is not quite ready for the market, due to a lack of standardisation. “Will such a complex technology with many great benefits be overshadowed by 12G, which is easy to implement in existing facilities?”
“We haven’t taken the plunge with IP yet because we are waiting for the technology to settle down and become more reliable,” agrees Telegenic’s Eamonn Curtin. “The workflow of Quad-Link HD-SDI works well for us and all the contracts we have. It makes fault finding easier and keeps our trucks flexible, so we can move between different contracts, giving us the best utilisation for our trucks.”

“Now that the SMPTE-2110 standard has been ratified, we will start to see more manufacturers producing that all-important ‘2110 Interface’,” says Timeline’s Cowper. “We are confident this is going to happen, and we will essentially be able to connect these new devices into our IP router. The devices will become a source and/or destination and be treated the same way that our SAM [now Grass Valley] gateway cards are now used in UHD2.”

This is Timeline’s triple-expanding IP 4K HDR outside broadcast truck, which delivers large-scale complex OBs simultaneously in uncompressed 4K UHD HDR and 4K UHD SDR.

“IP technology removes traditional SDI matrix limits, enabling production teams to fully harness the power of UHD 4K,” says Cowper. “Our UHD2 OB truck is based around the SMPTE 2110 standard, enabling both audio and video to be separately processed in the IP stream.”
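Some back-of-envelope arithmetic (my figures, not Timeline’s) shows why uncompressed UHD strains a traditional SDI matrix and pushes OBs towards quad-link, 12G-SDI or an ST 2110 IP fabric:

```python
# Rough payload arithmetic for one uncompressed UHD camera feed:
# 3840x2160 at 50 fps, 10-bit 4:2:2 (20 bits per pixel on average).
width, height, fps = 3840, 2160, 50
bits_per_pixel = 10 * 2          # 4:2:2 -> full-rate luma + half-rate chroma

active_video_bps = width * height * fps * bits_per_pixel
print(round(active_video_bps / 1e9, 2))  # ~8.29 Gbit/s of active video

# That is far beyond a single ~3 Gbit/s 3G-SDI link. Quad-link splits
# the picture into four 1080p quadrants, each on its own 3G-SDI link:
quadrant_bps = (width // 2) * (height // 2) * fps * bits_per_pixel
print(round(quadrant_bps / 1e9, 2))  # ~2.07 Gbit/s per link, within 3G

# An ST 2110-20 stream carries roughly the active-video figure (plus
# RTP/IP overhead) on an Ethernet fabric, with no fixed matrix size.
```

The last point is the one Cowper is making: once the feed is an IP stream rather than a set of SDI links, the router scales with network capacity instead of matrix crosspoints.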

“IP needs to be more in line with the prices of existing technology, in order to make the transition sensible,” says Gearhouse’s Tischler. “When you make engineering changes like this, production companies won’t necessarily see the advantage, so they’re not going to relate it to an uplift in price. I think maturity of product is also an issue.”

Going Remote
IP, however, is absolutely made for remote production, says Presteigne’s Mike Ransome. “IP connectivity on a camera head means that the majority of your infrastructure can be back at your primary location. The on-site requirement is about 10-20% of a normal crew. The cost savings are self-evident.” That said, there’s one important caveat, he adds. “It’s impossible to do remote production over a public network. If, however, you’ve booked a dedicated data pipe, remote production over IP works really well.”

Arena is also testing the water with remote production projects, says Richard Yeowart. “However, we only really see low profile events being considered for remote production in the foreseeable future. The cost savings aren’t enough to win over high-profile productions.”

“There has been a lot of interest in remote production for lower tier sporting events,” agrees VE Live’s Richard La Motte. “There will be more sport streamed live.”

Remote production is also a major talking point at Timeline. “Ultra-fast fibre connectivity is starting to appear, but there are only a handful of areas that have the infrastructure points to transmit these pictures from remote sites to broadcast centres,” says Quinn Cowper. These include Premier League grounds and city centre hubs. 
“A wildlife OB in a nature reserve in the Welsh mountains will probably not be suitable for a remote production – the high bandwidth IP connections would not be cost effective.”

“As in all TV production, the content needs to be relevant and engaging to gain real traction and generate enough viewers to make the investment [in remote production] worthwhile,” says Richard Baker. “With all of this emerging technology, it’s going to be an interesting and challenging few years to see if the content creators can work quickly to implement the products that manufacturing is delivering.”

Posted 27 April 2018 by Michael Burns

The Art of the DoP

DoPs Barry Ackroyd, Danny Cohen, Greig Fraser and Neville Kidd have between them shot films and shows including The Hurt Locker, Captain Phillips, United 93,  Jason Bourne, Green Zone, Detroit, The King’s Speech, Les Misérables, The Danish Girl, Room, Lion, Zero Dark Thirty,  Snow White and the Huntsman, Rogue One: A Star Wars Story, Doctor Who, Outlander and Sherlock.

The four tell Michael Burns the secrets of their craft, and explain the techniques they used to create the work

Barry Ackroyd

The Hurt Locker, Captain Phillips, United 93, The Wind that Shakes the Barley, Raining Stones, Jason Bourne, Green Zone, Detroit, The

I think you put together your personal style, based on the mood board, the period of the film maybe, and the director’s view of it. 

Once you’ve got the right equipment, once you’re surrounded by professionals on the day, you can achieve a look, a feel, using all your experience and all the talent that’s around you and make the film that you were always intending to make.  Because no-one can really predict it. But that’s definitely with directors that you have a particular relationship with.

I like to think that there is a strong vocabulary or an accent that you hear or see when you look at my films.

My history is in documentary and I’m deeply entrenched in realism, in the intimate relationship between the camera and the subject.

I handhold the camera and I’m usually just on the edge of the scene, which is a very documentary thing.  You intimately link to the subject but you are not necessarily feeling the pain or the joy; you’re just absorbing it all. 
Cinematography is an art form, the most personal and unique way of communicating that we’ve developed.

It’s about communication, it’s about people being moved by that wonderful thing of light-exciting chemicals… that is now light-exciting silicon chips!  But as long as we have a lens in front of the camera the process remains the same.

What has changed is not the cameras so much as visual effects and CGI work –  knowing that the background can be fixed, or that we don’t need to be in that location. The digital grade is just a whole new level of the dynamic. The quality of film-making has improved because of that.

I don’t tend to light exteriors at all. It’s fighting nature. You enhance what you can.

I have one secret weapon when working with available light [for internal scenes] - it’s just a down pipe, a drain pipe.

I got in the habit of taking a Kino Flo tube and a pipe just a little bigger than that. You cut out an 18-inch section of the pipe, spray the inside white, clip the tube inside that and then you have a light, like a black down pipe, a stick, and then you can just move that around.

You can hide the pipe just behind a chair or a table and it’s just low enough to disappear behind that but throw light in the right direction. It might just lie on the floor and output a little under-light on someone’s head behind the chair, or just under the chin. To do that is the most subtle thing. It’s how I would get the best from what seems like a very natural unlit scene.

My method comes from my documentary background. I presume that the environment that has been chosen – usually a location – is the environment, and the look is how it should look. And the performances in it, although they’re by actors, and sometimes non-actors, should be as real as life is. 

So I give [the actors] all the space and we dance around them a little bit. We hide in corners and position two cameras so we get simultaneous action, and no-one is too worried about continuity.  But within that is a classical framing.  It’ll drift with the eye if someone turns their head. 

I can just listen, look and react in the same split-second way that you do in real life. You’re already informed with the script, with knowing what the actors are like, with knowing what the director is after and what the story requires. In a documentary you may never have another chance.  If you miss that moment, you’ve missed it but in film you can do that. I think it would terrify a lot of people.  I find it the most relaxing way to make a film. 

I get terrified if anyone wants me to turn every shot into the most beautiful painting.  I think what I do is much more like sculpture.

Every frame is precious to us and we make it work. An editor comes along and turns it into a masterpiece. It is this great collaboration.  The whole film industry is about collaborating and showing respect for each other. That’s why I love it.

Danny Cohen

The King’s Speech, Les Misérables, The Danish Girl, Room, Victoria and Abdul, Final Portrait, This is England, Creep, John Adams, Longford, This is England’86

I read a script to start with and then we start a conversation.  That’s the beginning. But all directors work completely differently. That’s the one thing that is completely consistent – they are all completely inconsistent and different.  Nothing prepares you for the next project. 

How you prepare for each film changes. I did a film called Final Portrait, directed by Stanley Tucci, about Alberto Giacometti the artist. Weirdly, when we were testing the film there was a really good retrospective of Giacometti at the National Portrait Gallery and there are tons of interviews and photographs on YouTube. It was a matter of going through lots of material about how his studio looked and what could make the film interesting. Sometimes, if it’s an original screenplay, then it might bear no resemblance to reality at all. In a way you’ve got far more freedom, because you can make it all up from scratch. 

I take tons of photos. If you’re discussing something a photograph is something concrete that you can show a director and he can say “that’s interesting, or that’s boring” and you get a sense of their taste. 
The composition leads the eye to where you, as a storyteller, want the audience to be thinking about. Composition is a massive part of the filmmaking process, what you put in the frame and what you leave out of the frame is key.

Aspect ratio is always an interesting one. It’s changing, it’s something that isn’t locked in stone any more. I’ve shot lots of films that are classic wide, but I did something at the end of last year for the BBC and we shot that 2:1 which is a framing I hadn’t shot on before.  Essentially that’s come about because of people watching more Netflix and the big audience at home. It’s just a bigger frame that fits on a TV – there’s less banding top and bottom.

If it’s all going to be hand-held then the equipment needs to be sympathetic to that. You have to have a slimmed down, simple workflow. If it’s all something in a studio on cranes or a dolly then you might go a different way with a much bigger camera because you can and it makes sense. It’s just all part of the process of putting a project together, looking at all the different things you need to do and how they work for the story. It’s all narrative-driven.

Lighting technology is changing. Ten years ago you had to deal with three different sources - tungsten light, HMI and fluorescent lights. LED lights have come along. They consume a lot less power, you can get quite big lights now that run off 13A domestic. The equipment changing has a big impact on how you work because you can just light things with smaller lights potentially. It gives you more tools to make interesting films.

The big killer in film and TV is time - there’s never enough time to do everything you have to do. The more prepared you are on the day, the more you can achieve what you’re trying to do. When you’re filming and you’ve got a big crew and big lighting setup, in a way you can’t leave too much to happy accidents.

Greig Fraser

Lion, Zero Dark Thirty,  Snow White and the Huntsman, Rogue One: A Star Wars Story, Mary Magdalene, Foxcatcher, Let Me In, The Gambler, Bright Star

I’m a really big fan of getting involved very early on, because ‘photography can paint a thousand words’. A good, strong script is very, very important… but I believe that the visuals can augment that script massively.  

One of the beautiful aspects of the journey of discovery with a director is coming up with the same visual language, coming to the same conclusion via our different paths. And that’s really quite satisfying, when you’ve worked at growing an idea together, either through pre-visualisation, through locations, through discussions, through referencing… and then you come to that end product. It’s a small idea, a seedling at the beginning, that you contribute to and you end up having, hopefully, something really good.

You can over-plan in my opinion. You can basically shoot the whole day in your head and then something on the day happens where something changes. The weather comes in or you can’t shoot in that direction because there’s a truck parked there. If you’re 100% fully planned, that will throw you into a spin. I’m not saying you shouldn’t plan or shouldn’t [story]board. I love boarding and I get a lot out of it. It means that we can tick a box, before we’ve even walked on set. Being on set is really expensive time.

We really make sure we have all the tools at our disposal to test. For Mary Magdalene, which I’ve just finished with Garth Davis, we tested 35mm anamorphic lenses because we’d just shot Lion on that. However after we tested all the formats, we decided that 65mm had the most open, the most beautiful, wide scope.

To make a movie that is coherent visually, you’ve got to follow a framework. That framework might be that the lenses are a certain width, the lighting is a certain brightness …there are certain rules that you could follow to make a film feel coherent. For example, on Foxcatcher, one of the rules that I gave my camera operator was to imagine the camera was on valium. If it moves, it doesn’t move in a reactionary way to a sound. It’s a little bit late to an action. A lot of the story is told really slowly, very methodically, quite beautifully in the sense that the pans and moves are slow. Kind of like you’re blissing out.

Camera movement depends on the project. I love hand-held. I loved doing Zero Dark Thirty.  It’s one of my favourite camera styles, but at the same time [if] you shot Foxcatcher like Zero Dark Thirty, it would be a different movie. And vice-versa. You just couldn’t.  But if you were to mix those two together you can really come up with some interesting drama changes. 
I always get to the end of a job and hope I’ve shown the design in the very best way.  That’s the design of the costumes too. The art department consists of hundreds of other people too – model-makers, painters, builders, carpenters… I’ve seen these guys labour for hours and days over things. I have a huge amount of respect for that. If I’ve not shown them in the best light, I haven’t succeeded. It would do them a huge disservice.

Neville Kidd

Doctor Who, Outlander, Sherlock, Childhood’s End, Altered Carbon, A History of Scotland, Lip Service, A History of Celtic Britain

You have a hugely close relationship with the directors.  You are given the prep time to spend with the director, to look at the scripts, to find ways of telling the story, what trick shots you want to bring in, whether you want to bring in drones or aerial shots or how many cranes… it’s working out how many ways to slice a pie.  It’s working out where to spend the money, where not to. 

When you’re doing a lot of VFX work you’ve got to make decisions very early on. You will make a bit of a pre-vis, and the stunts guys will make a pre-vis of the stunts. We’ll make a pre-vis of the VFX work, we’ll combine that and then we’ll get the studio or network approval, and then we’ll film it. 

Use camera movement and framing to keep people’s attention. When you look at the scripts, you look at each scene and work out whose scene it is. Who are you going to focus on?  Whose story are you telling in that moment? Whose emotional journey do you want the viewer to go with? That kind of dictates where the camera is going. 
One of your jobs as DP is to make your world as big as possible. We’re now filming for people with big-screen televisions, whereas several years ago you were filming for people watching on a 32-inch screen. It’s taking that scope and making it bigger so you can show more of your world than you traditionally could before. 

The advances in LED lighting technology mean you can just change colour temperature with the press of a button. It’s made our lives hugely easier. LED takes a lot less power, so power consumption has come down. We use a lot of Sky Panels for street scenes and chase sequences, so we can go from daytime to nighttime without having to change the fixtures. We’ve got far more control than we’ve ever had.

When you’re doing documentary you learn to be adaptable. I think you can take that into the drama world and keep the pace going. In episodic you’ve got to build to get a momentum going to be able to complete the days, and film the day’s page count, on time, on budget. 

Traditionally you did one grade but [with Netflix] you’ve now also got to do an HDR grade. But when you put an HDR look on it, it’s phenomenal. It’s almost like it’s ‘two and a half D’.  The pictures almost start to pop out at you, because HDR televisions are so much brighter and you have so many more extremes in your whites and colours.  It’s absolutely phenomenal to see.

What makes a good DP? There’s a way you see the world and it’s the way you transfer that through the cameras.  I read the script and I shut my eyes and I know I’ve done my job when what I’ve seen in my head is what I can see on the monitor. And if you combine that with collaboration, with the directors and the show runners and producers and production designers, I think that’s a huge skill. You can’t have egos that demand attention. You need to be all able to work together for the greater good. The show is number one, everyone is rooting for it, and nothing is bigger than the show. 

Posted 20 April 2018 by Michael Burns

The IBC preview

This week's kit show, IBC in Amsterdam, is an important place to catch up with the latest developments in acquisition, post production, accessories and new technologies. Michael Burns checks out some of the delights in store

New Cameras

Ikegami (Hall 11, A31) is bringing a host of new models to the show, including demonstrations of new 4K and 8K cameras.

New for IBC is the Unicam UHD 4K-native 3-CMOS 2/3-inch camera, designed for easy integration into 4K studio and 4K field/OB truck systems. Uncompressed RGB 4:4:4 baseband is delivered from the camera head to the control unit. A 2/3-inch B4-mount allows direct docking with conventional lenses plus a focus-assist function. Ikegami BS-98/CCU-980 hybrid 2K/4K rack-mountable optical-fibre transmission links are also making their IBC debut. When used in combination with current Ikegami Unicam HD cameras, the BS-98/CCU-980 can deliver HD and 4K processed UHD signals simultaneously.

Also on show is Ikegami’s SHL-810 portable 8K camera. The company’s fourth generation 8K camera is one-tenth the size and weight of its original 2002 model. It employs a single 33 million-pixel Super 35 CMOS sensor, achieving 4,000TVL horizontal and vertical resolution and regular PL-mount lenses can be used.

ClearView Imaging (Hall 8, E17) is showcasing a new 4K compact broadcast camera which incorporates a high-quality 4K global shutter CMOS sensor and the CIS Clairvu ISP engine. 

Grass Valley (Hall 1, D11) is showing the LDX 86 Universe, an LDX 4K/6X HD switchable camera, together with the K2 Dyno Universe Replay System. The company claims this would enable any camera or replay position to be set up for regular HD, 4K or extreme-speed acquisition/replay. Also on show is the Focus 70 Live Camera. This consists of two different single HD format camera heads, with 1080i50/59.94 and 720p50/59.94 support. The system offers three fully digital Xensium-FT CMOS imagers with global shutter, while standard B4 2/3-inch lens mounts can be used to accommodate HD lenses.

Canon (Hall 11, E50) is promoting a ‘4K glass to glass’ message at the show, with models like the EOS C300 Mark II and the XC10 4K video and 12MP stills camera, as part of the company’s entire 4K range on display for the first time. Though not 4K, one new model from Canon will be the tiny ME20F-SH. The HD camera is rated at a maximum ISO in excess of 4 million (+75dB) and is capable of capturing full colour images in extremely low-light environments. Canon says the ability to install the camera in a semi-permanent location, with remote control operability, aims it at documentary and natural history filmmakers, long-term projects and events filming.

A special camera is also on show from Panasonic (Hall 9, C45), in the shape of the AW-UE70, an Ultra HD PTZ camera with 4K IP streaming and in-camera 4K recording capabilities. It offers features such as HDR, dynamic range stretch (DRS), advanced DNR, and a night mode for monochrome shooting in almost total darkness. Panasonic is also showcasing new 4K models, like the AG-DVX200, a large-sensor 4/3 type handheld 4K camcorder that offers UHD up to 3840x2160/60p recording, the AK-UC3000 4K studio camera, offering a UHD signal output up to 3840x2160/60p, and a multi-purpose 4K box camera, the AK-UB300.

A firmware upgrade for the GY-LS300 4KCAM handheld Super 35 camcorder is the news from JVC (Hall 11, G30). The upgrade adds a ‘JVC Log’ mode that JVC says duplicates a film look, plus new Cinema 4K and Cinema 2K recording modes, a Prime Zoom feature for zoom capabilities when using prime lenses, and a histogram.

Sony (Hall 12, A10) will be showing its full range of cameras to deliver 4K, HDR and HFR. A brand new capability of streaming footage live will also be demonstrated for some of the company’s PXW-series XDCAM camcorders; the new technology will also allow Sony’s CBK-series wireless adapters to unlock live streaming for any camera with an SDI connection.

Red (Hall 11, A77) is showcasing Weapon, the newest member of the 6K Red Dragon family, offering workflow enhancements, cable-free peripherals, integrated mountings, and simultaneous recording of Apple ProRes and Redcode Raw.

As well as its Blackmagic Video Assist for external monitoring and recording, Blackmagic (Hall 7, H20) will be showing a lower-priced model of the Blackmagic Studio Camera, made possible by the optical fibre connection becoming a user-installed option on the HD and UHD models of the camera. The Blackmagic Micro Cinema Camera, Blackmagic URSA Mini and the Blackmagic Micro Studio Camera 4K will be on show, as well as a new 4.6K sensor for the URSA.

Supports, Tripods and Drones

Arri (Hall 11, F21) is bringing new additions to the company’s mechanical Pro Camera Accessories range and its Electronic Control System to the show. For example, designed specifically for Canon’s latest Cinema EOS cameras, there is a new cine plate for film-set environments, a top-mounted support plate that provides room for a handle and accessories, and an adjustable broadcast plate for documentary-style filming. The latter allows quick changes from tripod to shoulder and perfect balance when handheld, says Arri.

Shotoku (Hall 11, F40) is showing a new pneumatic pedestal, with an integrated inflation pump. The TP500 is compact and lightweight and supports camera payloads of up to 55 kg. It is suitable for multi-location use such as OB, studio or event production. Also on show is SmartTrack, a rail-based dolly and elevator column with control system for live, multi-camera, studio production.

Cartoni (Hall 11, E30) is introducing four new fluid heads at IBC. Focus 8, Focus 12, Focus 18 and Focus 22 accommodate an array of camera, lens and accessory packages from 0 to 22kg, and are robust, lightweight and perfectly counterbalanced, according to the company.

Miller Fluid Heads (Hall 11, D30) is debuting the Cineline 70 Tripod System. It comprises a heavy duty HD Mitchell Base 1-Stage Alloy Tripod, with features that include a high-capacity leg-lock system, turn-lock levers with rapid lock/release action, as well as a heavy duty Mitchell Base with a built-in bubble level. The lightweight Cineline 70 Fluid Head is constructed of corrosion resistant alloy and offers advanced precision fluid drag control, as well as counterbalance systems. Additionally, a HD Alloy Ground Spreader, designed for rapid setup and pull-down, easily attaches to the Alloy Tripod and is optimal for use on flat surfaces.

IBC has set aside a new feature area for 2015 – the Drone Zone. As well as Amimon (Hall 11, C75) demonstrating its new Connex zero-latency wireless HD video transmission technology, DJI (Hall 9, C33) should be showing its Phantom 3 Standard, a drone designed specifically for first-time pilots. It records up to 2.7K video at 30fps using a 94-degree distortion-free lens. The DJI Phantom 3 Standard will be compatible with intelligent flight features including Follow Me, Waypoint Navigation and Point of Interest flight planning. These will also be available through a firmware upgrade at the show for the rest of the DJI Phantom 3 series and the DJI Inspire 1.


Lenses

The latest anamorphic lens from Cooke (Hall 11, D10) is being showcased at IBC.

The 65mm Macro Anamorphic/i 2x Prime lens boasts a close-up magnification ratio of 4:1 and a close focus of 5.5 inches from the front of the lens. As well as showing the company’s 5/i, Anamorphic/i, S4/i and miniS4/i lens ranges, Cooke said it would also reveal more details about a forthcoming Anamorphic/i Zoom lens. It will also present a major update to its /i Squared Technology metadata system. The firmware update for the latter provides distortion mapping of the specific lens in use, providing even more detailed lens data to VFX and post-production teams.

P+S Technik lenses are also on show (Hall 11, G35), notably the PS-Zoom 18-35 short zoom lens and the PS-Zoom 35-70 CS anamorphic lens. The company said the spherical PS-Zoom 18-35 is destined for 3D stereo production, or for double-camera setups recording still and motion pictures in parallel with a beam splitter rig. The Sigma 18-35mm/f1.8 optics come in a robust cine-style housing with cam-driven focus mechanics as well as interchangeable mounts and focus rings.

The 35-70mm Cinemascope zoom lens has a 1.5x anamorphic squeeze. The lens is adapted to the widely used 16:9 sensor ratio and offers anamorphic qualities such as barrel distortion, nicely formed flares and shallow depth of field, according to P+S Technik.

Fujifilm (Hall 11, C20) is debuting two Ultra HD lenses for 2/3-inch UHD/4K cameras. The UA22x8BERD is a portable broadcast zoom lens with a 22x zoom, covering a focal length range from 8mm at wide angle to 176mm at telephoto.

The UA80x9BESM uses optical simulation to offer an 80x zoom with advanced optical performance, including high image resolution, contrast and colour reproduction. Covering focal lengths from 9mm at wide angle to 720mm at telephoto, it uses a new optical stabilisation mechanism to reduce image shake caused by vibrations and wind.

Lighting

New lights are on show from Rosco (Hall 11, G21), including the Silk LED lighting system, which the company says was specifically developed for film and video applications that demand extremely colour-accurate, high-quality light. Also on show is the compact LitePad Vector, an on-location soft light unit with a tunable colour temperature range of 3000K-6000K.

Blind Spot Gear (Hall 9, B30a) is showcasing its versatile Scorpion Light, a low-cost portable LED lighting kit for ‘tricky locations’.

Arri (Hall 11, F21) is showcasing SkyPanel, a bright but compact LED soft light that comes in two sizes, in both fully colour-tuneable and remote phosphor versions.

Canara (Hall 11, B62) will be displaying a new range of LED products for broadcast, including a 400W LED Fresnel, a 150W LED RGB Cyclorama and a 125W LED Remote Phosphor Panel.

Celeb 401, 401Q and 201 LED fixtures are on show from Kino Flo (Hall 11, E33) with an updated Kelvin scale from 2700K to 6500K and a built-in LumenRadio wireless link. The company is showing its interview light, the Diva-Lite 415 Universal and soft lights such as the Image 87 and 47, the Imara S100 and S60, and the ParaZip 415 and 215 fixtures with universal power input.

Post production

AJA (Hall 7, F11) will be showing new features in an upcoming release of Shotgun that make it easier for teams to review and share creative projects, while Lenovo (Hall 5, C20) and Promise (Hall 6, C11) will be showcasing new extensions to Maya and 3ds Max.

Blackmagic Design (Hall 7, H20) is showing DaVinci Resolve 12, with new features added just before IBC. These include support for Intel Iris and Iris Pro GPUs, which will improve performance and assist editors and colourists who are working remotely and on-set. DaVinci Resolve 12 is also now able to create optimised media proxies with custom settings for both codec and resolution, for faster editorial performance. A new Smooth Cut transition uses proprietary DaVinci optical flow algorithms to create a seamless transition between different parts of an interview, so you don’t have to cover jump cuts with b-roll. The paid version of the software has also been renamed DaVinci Resolve Studio.

Editors will also get a preview of a new graphics plugin from Vizrt (Hall 7, A20) for Final Cut Pro X. This features the Vizrt meta graphics workflow, which allows editors to add Viz Engine-rendered graphics to the timeline and store them as metadata.

German startup fayteq (Hall 14, M24) is launching fayIN, its first plugin for Adobe After Effects. Using fayteq’s algorithms for automated camera tracking, editors can insert digital content into their footage, says the company, skipping the time-consuming process of manually defining tracking masks keyframe by keyframe. The plugin also features automatic environment illumination and shadow transfer, as well as automatic lens distortion correction.

Fraunhofer IIS (Hall 8, B80) will introduce new features for its digital cinema packaging tool, easyDCP. These include advanced subtitling options for quality assurance of multi-language packages, scaling options for parallel generation of Flat and Scope DCPs from one source, and support for QuickTime audio and 30-bit export for video files. Additionally, easyDCP will support a new generation of object-based audio formats. Fraunhofer is also showcasing how its light-field technology, which captures several camera views of the same scene in a single shot, can be used as a plug-in for Nuke from The Foundry.

Virtual reality
Shotoku is addressing the demand for VR with Free-d2, a VR/AR tracking system that provides highly accurate and constantly referenced (absolute) position tracking. Based on advanced algorithms developed by BBC R&D, it uses simple ceiling markers to determine the exact position and orientation of the studio camera. The small Free-d2 camera attaches to the broadcast camera without interfering with it in any way, and constantly views a lighting grid area where the markers are positioned.

BBC R&D (Hall 8, F14) is also demonstrating its own VR prowess, taking showgoers on an immersive tour of its IP-enabled broadcast environment concepts and object-based audience experiences using the Oculus Rift headset.

Jaunt VR (Hall 8, F18) is showing a camera system, codenamed Neo, which simultaneously records 3D stereoscopic video in all directions. Coupled with 3D sound-field microphones, the company says it captures everything needed to reconstruct a complete visual and auditory experience. Proprietary computational photography algorithms transform the recorded video data using sophisticated geometric calibration, colour adjustment and image processing.

LiveLike (Hall 8, F17) is offering a simple VR service for sports broadcasters, using a 4K camera fitted with an extreme wide-angle lens. The company then blends the 4K footage with a 3D environment.

VR is just as big in audio terms. At IBC you’ll hear 3Dception from Two Big Ears (Hall 8, F20a), a real-time 3D audio and environmental modelling engine that lets you hear sounds above, below, behind or at any point in space around you, over any pair of headphones.

Posted 09 September 2015 by Michael Burns

Televisual Media UK Ltd 23 Golden Square, London, W1F 9JP