If there’s a single topic that’s been impossible to avoid over the last few years, it’s 4K. Just as the industry obsession with stereoscopic 3D was starting to fizzle out, in came a new obsession. It’s out with extra dimensions and in with extra pixels.
And while the fickleness with which an entire industry can instantaneously swap allegiance from one hot new thing to another may ring alarm bells, several years into the big 4K push there are more than several reasons to believe it's here to stay.
4K Ultra HD screens
The variety and price range of 4K Ultra HD TVs (the consumer branding for 4K), and the marketing push behind them, are an indication that TV manufacturers are beginning to back the format in a big way. It's worth noting the resolution of 4K Ultra HD is 3,840 x 2,160, which isn't quite the full 4K resolution of 4,096 x 2,160. Whether this very slight downgrading of resolution makes any discernible difference to the image quality is highly debatable.
55” 4K Ultra HD TVs are already available for as little as £699, admittedly from a Chinese brand you’ve never heard of, but even the likes of Samsung offer a 40” 4K Ultra HD TV for £729 that displays 4K images and up-scales HD to 4K.
OK, so these TVs aren't exactly appealing to a mass audience as yet, with forecasts for the USA indicating only around 450,000 4K Ultra HD TVs will ship there this year. Nevertheless, predictions are that shipments of 4K Ultra HD TVs will begin to gain momentum over the next year or so.
One of the reasons 4K Ultra HD screens haven’t been flying off the shelves is there’s currently very little to watch in 4K and what there is isn’t easy to get hold of. Getting 4K content into the home isn’t going to happen via traditional broadcasting any time soon, so streaming services such as Netflix and Amazon Prime Instant Video are presently the main providers of 4K programming.
Netflix has announced all its originated content, including series two of House of Cards and its beauty series Moving Art, will be shot, posted and (when demanded by the end user) streamed in 4K. It’s also offering (at least in the USA) all 62 episodes of Breaking Bad in 4K, which it’s re-mastered from the original film negatives.
Likewise, Amazon is also commissioning original content in 4K for its Prime Instant Video service. Furthermore, it’s announced a partnership with content providers including Warner Bros., Lionsgate, 20th Century Fox and Discovery to provide additional 4K content.
There's also 4K content on YouTube, although it isn't immediately apparent to the end user: it's selectable via YouTube's easily missed quality settings option on the menu bar of 4K clips.
Meanwhile, Sony has brought out one of the few physical 4K Ultra HD players, although it’s only available in the States. It provides users with the option to download, stream and store films and TV programming from Sony’s Video Unlimited 4K library of around 200 titles.
The box also offers access to 4K content via Netflix. Unfortunately, it doesn’t appear there are any plans to bring either the box or Sony’s Video Unlimited 4K library over to the UK.
Certainly, the limited availability of 4K content, and the fact it’s only viewable on expensive TV screens, means 4K Ultra HD is far from a mainstream proposition at present. Added to this, the bandwidth required to successfully stream 4K is largely prohibitive for much of the UK. Netflix recommends a constant stream of at least 20-25Mbps to be able to seamlessly stream 4K, which is considerably faster than the UK average broadband speed of 17.8Mbps.
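To put those bandwidth figures in perspective, here's a rough back-of-envelope sketch (the Mbps figures are the ones quoted above; the per-hour data totals are my own arithmetic, assuming a constant bitrate for illustration):

```python
# Back-of-envelope: how much data an hour of 4K streaming consumes,
# and how the recommended bitrate compares to the UK average.
# Bitrate figures are from the article; assumes a constant stream.

def gb_per_hour(mbps):
    """Convert a constant bitrate in megabits/s to decimal gigabytes per hour."""
    bits = mbps * 1_000_000 * 3600   # total bits streamed in one hour
    return bits / 8 / 1_000_000_000  # bits -> bytes -> gigabytes

netflix_4k_mbps = 25    # top of Netflix's recommended 20-25Mbps range
uk_average_mbps = 17.8  # UK average broadband speed quoted above

print(f"One hour of 4K at {netflix_4k_mbps}Mbps: {gb_per_hour(netflix_4k_mbps):.2f}GB")
print(f"Shortfall vs the UK average: {netflix_4k_mbps - uk_average_mbps:.1f}Mbps")
```

At 25Mbps that works out at over 11GB per hour of viewing, which also makes clear why monthly data caps, not just headline speeds, matter for 4K streaming.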
Despite appearances, mainstream broadcasters aren’t resting on their laurels when it comes to 4K, but the legacy infrastructure they have to deal with means there’s a lot of development work required before 4K can be broadcast over traditional platforms.
Substantial efforts are in fact being made to find effective ways to utilise and tweak existing infrastructure to be able to broadcast 4K over the air, cable and satellite.
Sky has already tested 4K transmissions over satellite and proved it’s possible, but has shied away from committing to broadcasting anything in 4K so far. The BBC has also successfully experimented with 4K Ultra HD broadcasts during big sporting events, including Wimbledon, the Olympics, the World Cup and the Commonwealth Games.
Meanwhile, the Steering Board of the Digital Video Broadcasting Project (DVB), which is an industry-led consortium made up of over 200 broadcasters (including the BBC, Sky and BT), has introduced DVB-UHDTV, a broadcasting specification for 4K Ultra HD.
This creates a standard by which 4K Ultra HD could be transmitted over the air and on satellite across Europe, based on the same HEVC compression used by the likes of Netflix. It should provide a good starting point to move things forward.
The 4K Ultra HD floodgates may well open up over the next year or so, with reports suggesting there will be 300 Ultra HD TV channels globally within the next 10 years. There are already dedicated 4K Ultra HD channels in South Korea and France, proving it’s technically possible to introduce such services in 2014.
Other 4K broadcasts
The huge resolution of 4K images makes it the ideal format for cinema and for big-screen displays at trade shows, consumer events, shopping outlets, art galleries, IMAX and so on.
Back in 2011, the Showcase Cinema chain updated more than 270 cinema screens in the UK with Sony Digital Cinema 4K projection systems, and the following year Vue upgraded all its UK and Ireland cinemas (a total of 650 screens) with the same technology. “Sony 4K projection enables viewers to sit close to the screen and be completely immersed in an apparently seamless and continuous, incredibly detailed picture,” explains the Showcase website.
The growing popularity of event cinema, where music, theatre and sporting events are transmitted live to cinema screens across the country, is another area where 4K could have a positive impact. There have already been successful experiments with live 4K cinema broadcasts of events in the UK, including the theatrical production of War Horse and a Saracens Rugby match.
The 4K Ultra HD spec
Currently, 4K Ultra HD screens only have to offer 4K Ultra HD resolution to be classed as 4K Ultra HD. This is all well and good but the European Broadcasting Union (EBU) for one believes focusing exclusively on higher resolution and ignoring other aspects (such as frame rate, dynamic range, colour gamut and audio quality) in the technical spec of 4K Ultra HD isn’t going to be enough for consumers to notice a significant leap in quality over HD.
It’s currently lobbying for the introduction of a Phase 2 update to the DVB’s UHDTV spec to incorporate what it describes as these “enhanced parameters for better pixels”.
When it comes to production, similar issues abound to those seen in the move from SD to HD. The amount of data captured when shooting in 4K is immense, requiring a great deal more (and a great deal faster) storage, along with much more time-consuming back-ups and much more powerful, and therefore much more expensive, equipment to play back and monitor 4K content.
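A quick sketch shows why the numbers balloon so fast. The calculation below assumes uncompressed 8-bit RGB frames at 25fps purely for illustration; real cameras record compressed or raw formats at very different (usually much lower) rates:

```python
# Why 4K acquisition storage balloons: uncompressed frame size x frame rate.
# Assumes 8-bit RGB (3 bytes per pixel) at 25fps purely for illustration;
# actual camera codecs and raw formats will differ considerably.

WIDTH, HEIGHT = 3840, 2160   # 4K Ultra HD resolution
BYTES_PER_PIXEL = 3          # assumed 8-bit RGB
FPS = 25                     # assumed UK broadcast frame rate

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
rate_mb_s = frame_bytes * FPS / 1_000_000
tb_per_hour = frame_bytes * FPS * 3600 / 1_000_000_000_000

print(f"Uncompressed frame: {frame_bytes / 1_000_000:.1f}MB")
print(f"Data rate: {rate_mb_s:.0f}MB/s, or {tb_per_hour:.2f}TB per hour")
```

Even before compression enters the picture, that's over 2TB per uncompressed hour, before you factor in multiple takes, multiple cameras and the double back-ups most productions insist on.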
The amount of data you have to take through post production is therefore much greater, which again has cost implications. It’s also quite challenging for post houses to quickly move 4K data around and speedily edit, online and grade 4K content. And very few post houses have so far invested in 4K screens to view the content.
Shooting 4K also brings with it similar adaptations to make-up, set design and costumes that were required when production moved from SD to HD, with the detailed images revealing things that previously went unseen.
However, perhaps surprisingly, 4K cameras don’t always command a huge investment over HD cameras, with the likes of Blackmagic, AJA, Sony and Panasonic all offering very affordable 4K models, although most require an external recorder to store 4K rushes, which adds to the cost and is another piece of kit to have to worry about.
So, until there’s a good reason to shoot in 4K beyond simply future proofing your content, you’re pretty unlikely to do so just yet.
In the meantime, 4K can be derived from 35mm film archives, providing a straightforward means for content owners to speedily collate sizeable libraries of 4K material. To tap into this demand, Blackmagic has brought out a sleek-looking, very affordable scanner (costing around US$29k) specifically designed to make 4K scans of 35mm film negatives.
How much longer?
How long it'll take before 4K Ultra HD is a mainstream technology is anyone's guess, although Televisual's recent Production Technology survey estimated between five and six years. Once all barriers to broadcasting content to a mass market have been demolished, there's good reason to believe consumers will embrace the potential leaps in image quality provided by the increase in resolution and other potential benefits of 4K.
So while a substantial number of 4K Ultra HD TV screens are predicted to be sold in key Asian markets including China and South Korea over the coming year, the same can’t be said for the rest of the world. Shipments of screens in the UK aren’t likely to be in particularly high demand until broadband speeds get faster and traditional broadcasters truly join in the 4K race. Consumers may also need to be convinced the technology is here to stay before committing to any purchase, after witnessing how quickly stereoscopic 3D fell out of favour and how soon HD TVs appear to have been replaced by something bigger and better.
No matter what some post production software makers claim, very few machines can handle full 4K natively in a way that allows the editor full creative freedom and the speed to get on with it. Neel Potgieter, DoP
Carefully planned storage management helps cope with the increased volume of media, and modern grading and editing systems perform much more smoothly with 4K images. 4K ‘robs’ us of some of the gains made by the natural progression of computing power, though. Richard Wilding, general manager/technology, Molinare
High-end computers can handle 4K quite well, but monitoring is another matter. We usually only monitor in the resolution in which we are delivering, so for us HD monitors are still acceptable at the moment. We are watching the ‘professional’ 4K monitors with interest though and will invest when the time is right. Derek Moore, md, Coffee and TV
I do believe 4K/Ultra-HD is the future. Unlike 3D this feels like it is here to stay, although it already feels like it’s just a stepping stone to 8K. Tom Arnold, head of technical operations, Evolutions Bristol
The public at present has a relatively low awareness of all things 4K, however, recent marketing around the World Cup in Brazil has probably had a positive effect. Richard Mills, chief technical officer, ONSIGHT
For consumers to enjoy the benefits of 4K, they will need to invest in new viewing technology. This kind of expenditure is dependent on a healthy economy which also determines the industry’s ability to invest in the technical infrastructure and training required by production and post production for the standard to prosper. Jess Nottage, technical director, Clear Cut Pictures
I see Ultra-HD as about three years away from being mainstream, although I don’t think it will be dominant even then in the home. The cost of screens and possibly scepticism on the part of consumers will slow things down. Jez Lewis, director, Bungalow Town Productions
Most people still watch SD on their HD TV. Do we really need 4K/UHD pixels in our living rooms to enjoy EastEnders? And I’m now convinced it’s not how many pixels you have, it’s the quality of those pixels. Bill Scanlon, producer
As pipelines develop and more powerful graphics cards become available, I can see 4K becoming commonplace in vfx. Pushing the frame rates up to 60fps or 120fps will also become the norm. Kerri Aungle, head of data lab, MPC
The bigger the screen, the greater the need for increased resolution. When TV went from SD to HD, the early screens were small and the difference unremarkable. When screens increased to 42” and 50” the ‘need’ for HD became greater. With a 70” UHD TV, the difference between 4K and HD will be marked. David Klafkowski, joint-md, The Farm Group
The first 4K work we produced ‘end to end’, we shot 16TB of data in two weeks. That was all double backed up. Data rate was a joke and managing that kind of quantity of drives alone is a serious headache. Nick Francis, creative director, Casual Films
In my view there are two major barriers to 4K uptake – too little bang for the buck and too big a push by TV set manufacturers to sell something to profit their bottom line without delivering adequate viewer benefits. Milan Krsljanin, director business development, Arri
When you get to see full 4K resolution, there is no doubt about its huge potential. When HD was first introduced, it wasn’t considered sustainable due to the high production costs it incurred, but it has now become an industry standard in much the same way that 4K will for the right projects. Anthony Geffen, CEO, Atlantic Productions
Every part of the image has to be impeccable. Focus is imperative and noise is even more important. Of course the hair, make-up, set design, costumes, etc need to be correct, but I don’t think anyone scrimped on the time to prepare these things in the past. Duncan Malcolm, director of 2D, Glassworks
As is often the case at these types of shows, one of the more interesting things I heard about at IBC2014 came via a conversation at the beach bar. I was chatting with Sherlock cinematographer Steve Lawes about what he considered to be exciting new technology developments. Rather than anything he'd seen at the show (to be fair, he'd only just arrived so hadn't had a great deal of chance to look around at this point), he showed me a short film on his iPhone.
It was a teaser film for a hopefully forthcoming product called SteadXP, which appears to be very clever indeed. Essentially, it’s a camera stabiliser housed inside a small white plastic box that sits on top of pretty much any camera and does what looks like a phenomenal job of steadying shots whatever the circumstances.
According to the guys who created the device: “Our little box brings professional-grade 3 axis stabilisation to everyone. Compatible with nearly all digital video cameras, it’s dead easy to use, extremely light, has no moving parts and lets you just point and shoot. All the necessary corrections will be done automatically by our post-processing software, to match the exact type of result you expect.”
“With SteadXP you can do almost any of the crazy camera moves you always wanted to do, it will correct your footage and smooth-out any stabilisation issues.”
A crowdfunding campaign to launch the device is planned to kick off shortly. In the meantime, check out the two films below to get an idea of the quality of stabilisation that’s potentially around the corner…
Thanks Steve for pointing me in the direction of SteadXP.
Polish film company Film Cyfrowy has created a very interesting blind comparison test of many of the major large sensor cameras currently being used for film, commercials and TV production.
It has shot and graded identical footage shot by the likes of the Arri Alexa, Red Dragon, Sony F55, Canon C500 and Blackmagic Pocket Cinema Camera and presented it in a film, with a number representing each camera so the viewer has no idea which camera footage they are viewing.
The names of the cameras are then revealed in a separate film. Have a look and see if you can match the following models (and recording formats) to the clips then check your answers in the second film!
Arri Alexa – ArriRaw
Red Dragon – R3D 6K
Red Epic – R3D 4K
Kineraw Mini – CinemaDNG 2K
Sony F55 – MXF 4K
Sony FS700 + Odyssey 7Q – CinemaDNG 4K
Canon C500 + AJA KiPro Quad – ProRes 4444 4K
Canon 1DC – MOV 4K
Canon 5D Mark III (with Magic Lantern) – CinemaDNG
Blackmagic Production Camera – ProRes 4K
Blackmagic Pocket Cinema Camera (once with a PL adapter and again with a Metabones Speed Booster adapter) – CinemaDNG
Lumix GH4 – MOV 4K
Thanks to nofilmschool.com for pointing us in the direction of Film Cyfrowy's camera shootout.
The music industry trade body for composers and songwriters, BASCA, which runs the Ivor Novello awards, has just held a roundtable discussion debating how to survive and thrive in the world of media music composition.
The discussion was with five members of its board of directors, who have each made very successful careers out of writing for many different genres of television as well as commercial music releases.
Making a living, never mind a fortune, in the media music business is no mean feat, as revealed in the following extract from the panel discussion, during which the panel members debate whether it’s still possible to make a good living composing media music.
Thanks to BASCA for pointing me in the direction of the discussion, which will be published in full in the next issue of BASCA’s members magazine 'The Works'.
Mark Ayres is a television composer best known for providing incidental music on the original series of Doctor Who. He worked on Doctor Who during Sylvester McCoy's era as the Seventh Doctor and was hired after he sent the producer a demo video with music he'd written to accompany the episode Remembrance of the Daleks.
Richard Jacques predominantly writes music for games, but also composes for commercials and TV programmes. He has composed for a long list of video games created by Sega (where he was previously an in-house composer), including entries in the Sonic the Hedgehog franchise.
Simon Darlow has a long history in pop music composition. He co-wrote Grace Jones's huge hit Slave to the Rhythm and has penned hits for Toyah Wilcox and The Buggles. He arranged Musical Youth's Pass the Dutchie and has played with many successful acts, including Wham and The Buggles. He's also written over 70 TV themes.
Paul Farrer has composed music for an enviable range of successful television shows, including The Weakest Link, Dancing on Ice, The Krypton Factor, The Jerry Springer Show, Saturday Night Live, Ant & Dec's Push the Button and Tumble. He is also the creator and executive producer of forthcoming ITV game show, 1000 Heartbeats.
David Lowe has written for a plethora of high-profile television series, including the current BBC News theme and themes for The One Show, Panorama, Countryfile and Grand Designs. He also composed for the London 2012 Olympics and had a Top 3 hit in the UK in 1998 with the track 'Would You...?' under the name Touch and Go.
Is it still possible to make a good living composing media music?
<DL> It's still possible to make money but you have to be fairly chameleon-like and accept that things change. Sometimes you'll make a few quid and other times you'll be paid a paltry sum. Every now and then you'll drop lucky with a show and you'll make money. You may have to give up some of your publishing now, which has changed since maybe 10-15 years ago. But if you get given a show that's going around the world then you're being given a guaranteed platform and guaranteed air time and that's going to get you very good publishing royalties.
<PF> I do a lot of media shows. Something I do well out of – The Weakest Link – was good for me. The franchise was great for me as everywhere it went they said it has to stay the same, including the music. It's the same with Millionaire and Deal or No Deal whereas other franchises do change the format quite radically and sometimes that goes wrong [for the composer]. I did The Chase and that went over to America and they took all the music off and changed the show completely. You're in the lap of the Gods really and all you can do is hope.
<DL> You've got to try to work out what your back-end is going to be, and depending on what they offer you in the front-end budget, you can judge whether it's worth doing. If they pay you £500 upfront and you're only likely to get £250 for it in the back-end and you spent four weeks on it, it's not worth doing.
<MA> It's incredibly democratic. You can luck into a successful show and you can make an awful lot of money. If you don't luck out into an incredibly successful show, you'll simply earn a living.
<DL> For the majority of jobbing composers, it evens out to a professional rate, which is probably the same as what a film editor might get or a senior person on the production team.
<PF> The good composers have some dramas, some entertainment, some of this, some of that, for as many broadcasters as they can and for a very long time. You just put them out there and hope, that's really all you're doing.
<DL> And sometimes you can be sitting there with nothing at all, but all the stuff you've done in the past is keeping everything ticking over.
<PF> It's an odd job because how busy you are versus how busy your music is are two completely different things. You can be slogging your guts out day after day on a big project that just falls apart and never sees the light of day. Meanwhile, you have times when you're just sitting around and doing nothing and a show you did ages ago comes out in Japan or something.
<RJ> A lot of composers have had to diversify a bit so work across multiple genres of media in a way that they didn't previously have to. So they might have been just a guy that does comedy shows on TV, but now they do comedy, drama, commercials, library music, games and films. That's partly because the industry has opened itself up a bit and partly because the more your career goes on, the more the doors open.
<PF> My advice for anyone wanting to become a media composer is to start small. It takes decades and that's with having some good breaks along the way. We live in a post-X Factor world where we believe it's a binary thing: you're either nobody or you've got your own jet. It doesn't work like that. You have to want to do it, and to the exclusion of all other things, because of the amount of effort you have to put into demos and the amount of hours you have to spend listening to idiots asking you to tweak hi-hats.
<SD> You just have to, at some point, ask yourself whether you really think you've got it. There's an awful lot of people who just haven't got it, in truth. And if you do really feel you have got it and you're not kidding yourself, you just get to meet the right people over time, you just do.
<RJ> It's also very important to know the business side and networking is very important too. Some people expect the money to start rolling in quite quickly but it doesn't, it takes many years.
<MA> You've got to love it and want to do the job. You also haven't got to want to be famous or desperately want to make money out of it.
<DL> You've got to know the technology too, so as well as being a business person and a creative, you've got to be a sound engineer as well. You've got to know how to add EQ and phasing, all this technical stuff like dB levels – clients expect what you're doing to be produced to the same standards as commercial music.
Every so often a movie comes along that genuinely re-writes the script for filmmaking. Jurassic Park was a prime example, creating photo-real dinosaurs that really looked like they were living and breathing. It was a huge moment for cg, proving what could be done and setting an incredibly high benchmark for creature design and visual effects work.
However, it was originally intended to be very different. The plan from the start had been to do stop-motion dinosaurs, which, looking back at what was achieved with the digital dinosaurs of Jurassic Park, sounds like absolute madness. But back in 1993 no one really knew the full possibilities of cg for creature design and animation, and even those working on the initial cg dinosaur sequences and walk patterns were completely taken aback when the fully rendered cg dinosaur was put into an environment and truly came to life.
It's one of many awe-inspiring moments captured by Academy Originals, which has created a nine-minute mini-doc* about the groundbreaking decision to create digital dinosaurs for Jurassic Park and how this decision completely changed the future of filmmaking. Watch it below.
* Apologies if you've seen this already - it's been around since early June, but I only stumbled across it today, hence the blog!
Here's a link to an interesting little blog that looks at some predictions from 1957 through to the mid-60s about how technology might evolve over the next few decades.
It includes two or three TV-related predictions, including the 'electronic home library' (pictured below), centred on the ability to record TV programmes if you were out when they were on air, the ability to watch 3D TV and other wonderful futuristic developments that might eventually become reality.
There's also the prediction, from 1957, that one day we'd be able to make face-to-face phone calls, including a handy picture (see below) to show what it might look like.
It's interesting to see how the likes of Skype, Sky+ boxes, Smart TVs, etc, which we now take for granted, were envisaged 40-50 years before they became commonplace.
As you might imagine, subtitling live shows and live news channels isn’t without its challenges. To get it right requires a dedicated, professional, skilled approach. To find out how it’s done, what the typical pitfalls of live subtitling are and how to resolve them, check out this informative article from Red Bee Media’s IT Manager, Access Services, Hewson Maxwell.
Translating what’s said on live television into readable text – it sounds simple enough. But take a moment to consider the challenges of providing live subtitling across multiple channels, with breaking news, regional broadcasts and over-running sporting contests, then add to that mix the understandably high standards of timeliness and accuracy expected by viewers, lobby groups and Ofcom alike and it might start to seem more complex.
We, and the industry as a whole, are always looking at ways to address the challenges we face with live subtitling through ongoing investment in innovation and technology.
Volume of work
A huge challenge presented by live subtitling comes from its sheer scale: news channels running 24 hours a day, over 150 hours of live output daily, and a workforce of subtitlers spread across several continents, many of them working from home.
Most truly live subtitling is generated using voice recognition software, most commonly Dragon NaturallySpeaking. There are huge benefits to this approach. When coupled with the re-speaking method, whereby a subtitler repeats everything spoken on screen in a clear and level voice, with punctuation and colour commands, the recognition and output is broadly excellent.
Furthermore, good speech engines are easy to use, so it is relatively easy to recruit and train people to subtitle well.
However, historically, there have been some downsides. The best voice recognition engines avoid releasing text until they receive enough context to be sure of what is being said. This can lead to a significant gap of output on air, followed by an instantaneous release of a large chunk of text. Slowing this glut of text down to a readable speed, as most software does, leads to de-synchronisation of subtitles and video and the need to edit or omit sections to catch up.
Regardless of the amount of context, voice recognition software will always struggle to understand some of the time, particularly with unusual terminology or names – the kind of vocabulary commonplace in live news. Traditional subtitling software uses housestyles to automatically replace common typographical errors ("empire state-building" becomes "Empire State Building") and the program's comprehension errors ("Nicholas so cosy" becomes "Nicolas Sarkozy").
Subtitlers also define macros for alternate forms of sound-alike words, so they can tell viewers whether it is chilly out, or they’re making a chilli, or the sports event is happening in Chile. But macros and housestyles are set up prior to going on air, and are therefore of little use when a new name emerges in a breaking news story.
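Conceptually, a housestyle is just a lookup table of known mis-recognitions applied to the respoken text before it goes to air. Here's a minimal sketch of the idea; the two correction entries are the article's own examples, while the function name and structure are purely illustrative rather than anything from Red Bee's actual software:

```python
# Minimal sketch of the 'housestyle' mechanism described above: a table
# of known recognition errors applied to respoken text before broadcast.
# The entries are the article's examples; everything else is illustrative.

HOUSESTYLES = {
    "empire state-building": "Empire State Building",
    "Nicholas so cosy": "Nicolas Sarkozy",
}

def apply_housestyles(text, rules=HOUSESTYLES):
    """Replace known voice-recognition errors with their corrected forms."""
    for wrong, right in rules.items():
        text = text.replace(wrong, right)
    return text

print(apply_housestyles("French president Nicholas so cosy arrived today"))
```

A production system would of course match case-insensitively and on word boundaries, but the sketch also illustrates the limitation the article describes: because the table is built before going on air, a name that only emerges in a breaking news story has no entry to match against.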
Innovation in subtitling
Over the last three years or so, we’ve been looking at how we can improve the quality of live subtitling. We believe technology and innovation will be key drivers to achieving this goal and as such, we have been investing heavily in building a bespoke platform and software that we believe will help to address some of the challenges listed above. Unique functionality such as the ability to integrate with broadcaster schedules and a re-speaking interface designed to be the fastest on the market are just some of the improvements we’ve been focused on.
Live subtitling will continue to be a complicated, imperfect and expensive process for some time to come, but with our new software, Subito, we believe we’ll be able to deliver greater quality to the audience, and begin to add extra utility and lower costs for broadcasters. We know there will always be more that can be done, and we remain committed to both integrating current technological advancements and driving the next set forward. We will continue to do so until we reach a day when live subtitles are barely distinguishable from prepared ones.
MPC, the lead visual effects studio for X-Men: Days of Future Past, delivering 372 of the vfx shots, has released a number of before and after stills showcasing its work on the film.
MPC created the future sequences in the movie, including the future sentinels, from concept art through to final compositing, the X-Jet, Xavier’s virtual world, future environments and mutant effects.
The sentinel is a 10-foot-long, fully cg "mutant slayer covered with approximately 100,000 independent blades, the movement of which had to be directed artistically rather than driven by simulation," says MPC. "The vast number of objects proved too cumbersome with existing workflows so an entirely new approach was required."
Now for some incredibly complex-sounding detail, possibly to help explain the new workflow. You've been warned: this comes direct from the press release and, to be honest, I don't understand a word of it…
MPC’s R&D team, led by Tony Micilotta, introduced the concept of a follicle that approximated the shape and size of the final blade model. These were combined into per body part follicle-meshes and could be manipulated using standard deformers. This not only provided requisite visualization for animators, but doubled up as primitives from which transforms could be derived using trigonometric methods. These transforms were cached as particles and were subsequently ingested by a bespoke Katana Scene Graph Generator (SGG) that instanced blade models accordingly.
Back to the understandable side of vfx once more: the opening sequence of the film required MPC to create extensive matte painting and environment work as well as generating meshes, textures and particle effects. Also in the opening sequence, the character Sunspot's flames were achieved using Flowline fluid simulation technology to produce multiple layers of volumes and particles. On top of this, the ice body of the character Iceman was created using effects layers of spray, ice crystals and 'dry ice' type effects. MPC's team also created the character Colossus' metallic skin and a digital double of Warpath. The sequence takes place in a bunker deep underground, which involved work from the environment team to provide set extensions and finishing touches.
The climax of the film takes place in and around an ancient monastery in the Himalayan mountains. For this, MPC recreated and extended the monastery set and created lightning effects, large swirling volumes and buffeting winds.
MPC also created a cg X-Jet, Xavier’s virtual world and Wolverine’s claws.
Here are a few before and after images showcasing MPC's visual effects work....