The latest installment of the Chronicles of Narnia – The Chronicles of Narnia: The Voyage of the Dawn Treader – is the most recent blockbuster film to make extensive use of London’s vfx houses, with MPC, Framestore, The Senate VFX, Cinesite, The Mill and Prime Focus all working on the movie. Here’s a breakdown of what each of the companies did.
The principal vfx supplier was MPC, which completed more than 700 vfx shots on the film. The bulk of its work was creating and animating the mouse Reepicheep, the dragon Eustace, the Dawn Treader ship and a sea serpent. As well as character creation, MPC also worked on extensive digital environments.
MPC had already created Reepicheep for a previous Chronicles film, but with the mouse featuring prominently in over 200 hero shots in The Dawn Treader, MPC decided to update the character to “take the spotlight”.
The vfx house’s art department made what it describes as subtle design changes to help create a “wiser and more mature Reepicheep”. And MPC’s animators even went so far as to take fencing classes to “learn the moves and tricks of the trade” to ensure the little mouse’s sword skills were tip-top.
When it came to the dragon Eustace, the main challenge for MPC was conveying emotions through facial expressions alone, as Eustace is unable to speak.
For the giant sea serpent, which appears in a 180-shot ‘dark island’ sequence in which it attacks the Dawn Treader ship, MPC created ocean surfaces and splashing water effects for the serpent’s interaction with the sea. The serpent itself was designed with blubbery skin and “hundreds of articulated feelers” (whatever these are) by MPC’s art department.
MPC’s work on the Dawn Treader ship varied depending on the scene the ship appeared in. For some, a complete cg ship was built while, for other sequences, MPC added surrounding ocean to a land-based physical set of the ship.
The vfx supervisor for MPC was Adam Valdez, cg supervision was by Kevin Hahn and animation supervision was by Gabriele Zuchelli.
Framestore was responsible for the next biggest batch of vfx work on the film, working on around 280 vfx shots. The company’s key role was creating the God-like lion character Aslan – once again, Framestore had already brought the lion to life for the previous installment of the Chronicles series, so its work was focused on further refinement and improvement of the character design and animation.
On top of this, the 170-strong Framestore team also worked on the one-legged, giant-footed dwarf characters, the Dufflepuds, and vfx-heavy sequences including a standing wave and bringing a picture of a seascape to life.
The Framestore team, led by vfx supervisor Jonathan Fawkner (who also attended the shoot) worked on the rigging of Aslan to “bring it in line with Framestore’s centralised rigging tools”.
“We had shots where Aslan was walking and you saw his full body. The old rig didn’t allow the legs to be stretched far enough to make the gait look realistic,” says head of rigging Nico Scapel. “When this issue arose, we were able to iterate a change on the rig and then see a render the next day, which makes a huge difference.”
The animation of Aslan, which was handled by a team of eight at Framestore, had to be done within the constraints of him being a God, so he should show “minimal signs of normal animal behaviour”. So, the team implemented “subliminal signs” that the creature lives, such as breath cycles, blinks, nostril flares, a slight shift in weight, a swish of the tail, and so on.
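Layered idle cues like these are a standard keep-alive technique: low-amplitude cyclic motions (breathing, weight shifts) stacked with occasional discrete events (blinks). A toy sketch of how such layers might be summed per frame – not Framestore’s rig, and all rates and amplitudes here are invented for illustration:

```python
import math
import random

# Toy "keep-alive" idle layer: sums low-amplitude cyclic motions
# (breath, weight shift) and occasionally fires a blink, the kind of
# subliminal life cues described above. Rates/amplitudes are invented.

def idle_pose(t, rng):
    breath = 0.02 * math.sin(2 * math.pi * t / 4.0)   # ~4s breath cycle
    sway   = 0.005 * math.sin(2 * math.pi * t / 9.0)  # slow weight shift
    blink  = 1.0 if rng.random() < 0.01 else 0.0      # rare blink event
    return {"chest_offset": breath, "hip_offset": sway, "eyelid": blink}

rng = random.Random(0)
for frame in range(3):                 # sample a few frames at 24fps
    print(idle_pose(frame / 24.0, rng))
```

An animator would layer this kind of procedural noise under the hand-keyed performance, rather than in place of it.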
For the seascape picture sequence, where a picture of a seascape comes alive and waves start to roll and churn before water engulfs the room, Framestore combined wet and dry sets of the attic room where the picture hangs. The actors did their parts in the dry set, while a duplicate of the same room was lowered into a water tank to create the effect of the water rising as it enters the room.
Next, Framestore augmented the water filmed gushing out of the painting on set, as there was a limit to how much could realistically be pumped into the set. To create the desired effect, Framestore used a “moving painting effect, using Corel Painter’s overpainting technique – we’d take moving footage of water and then, on a still frame, we used the impasto-like brush effect to give us something that looked a lot like the source painting but was based on footage of moving waves,” explains lead compositor Jan Adamczyk.
Finally, the Dufflepuds were created by mixing real actors for the top half with cg for the lower bodies, skin and cloth. Over 250 performances of the actors bouncing around and acting were filmed on blue screen, then scaled down to dwarfish size, tracked, given an animated leg and composited into the scene.
The Senate VFX produced the next largest amount of vfx work, completing 250 vfx shots for an opening sequence of King’s College, Cambridge as well as the creation of the star Liliandil, which takes on a human form.
Cinesite’s main work on The Dawn Treader involved creating the White Witch character. The vfx house’s team, led by vfx supervisor Matt Johnson and 3d supervisor Stephane Paris, scanned principal photography of the actress Tilda Swinton, who plays the White Witch, to generate a 3d model of the actress’s head.
The 3d model was then rigged and animated using Maya, to match Swinton’s live performance. Cinesite then added cg hair and a shroud of upper body mist to enhance the effect of her being a mythical, floating creature.
Cinesite also created green mist tendrils, which take on the form of the greatest fears of the crew of the Dawn Treader. Other Cinesite vfx included creating set extensions of The Goldwater Island sequence, and extending the bejewelled valley.
The Mill’s work focused on the Naiad water nymphs and their movement through the sea and around the Dawn Treader ship. The company spent six months on research and development and concept work to achieve the desired look for the characters.
Prime Focus completed the film's full stereoscopic 3d conversion. It converted 1,500 shots into stereoscopic 3d for the 115-minute movie. The globally focused post facility worked around the clock across three time zones on the conversion process. In total, it delivered 600 shots from London, 550 from Los Angeles and 350 from Mumbai.
There’s a whole load of excellent free stuff available for post production editors and grading artists at present. Grab it while you can.
Free professional editing system - Lightworks
First up is the pro editing system Lightworks, as used to edit the likes of Pulp Fiction and Notting Hill, which has been updated and relaunched by new owner EditShare (which acquired Lightworks last year) as a free, open source editing package.
The new, free Lightworks provides resolution, format and codec independent edits, real-time 2k effects, varispeed, primary and secondary colour correctors, a multi-track audio mixer and voiceover tool and a newly designed user interface.
It also now works with Avid and FCP keyboard shortcuts and has native support for ProRes, Avid DNxHD and AVC-Intra, as well as stereoscopic support for left and right eye files.
Free colour grading - Airgrade
Next up is an interesting colour grading app from Pixel Farm. The company, best known for its vfx and restoration software, has focused its attention on colour correction for the first time with a combined free Mac software and iPhone app grading package called Airgrade.
Airgrade makes it possible to do anything from a one light pass to a complete professional-level grade. The bulk of the work is done by the Mac software, which “emulates professional film and TV grading tools”, while the iPhone app is used to wirelessly remote control the software, by rolling a virtual 3d trackball and rotating a radial wheel.
Airgrade provides lift, gamma and gain controls to control shadow, midtone and highlights, as well as a saturation control to tweak the overall colour intensity.
The grading data created by Airgrade can be saved in the universally recognised ASC CDL format to transfer to a dedicated professional grading system. The graded image is also auto-transferred to the iPhone’s photo album for quick reference.
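For context, the ASC CDL format that Airgrade exports encodes a simple per-channel transfer function: gain (slope), lift (offset) and gamma (power), plus an overall saturation value. A minimal sketch of that maths in Python – the parameter values below are illustrative, not Airgrade defaults:

```python
# Sketch of the ASC CDL transfer function: per channel,
# out = clamp(in * slope + offset) ** power, then an overall
# saturation blend towards luma (Rec. 709 weights).

def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    graded = []
    for value, s, o, p in zip(rgb, slope, offset, power):
        v = value * s + o            # gain, then lift
        v = max(v, 0.0) ** p         # gamma (clamp negatives before the power)
        graded.append(v)
    # Saturation: blend each channel towards the luma value
    luma = 0.2126 * graded[0] + 0.7152 * graded[1] + 0.0722 * graded[2]
    return [luma + saturation * (c - luma) for c in graded]

# Identity settings leave the pixel effectively untouched
print(apply_cdl([0.5, 0.25, 0.75], [1, 1, 1], [0, 0, 0], [1, 1, 1]))
```

Because the format is this simple, a look set on location transfers cleanly to any grading system that reads CDL values.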
The Pixel Farm believes Airgrade will not only be useful for DoPs to establish a basic look on location but also for aspiring colourists to familiarise themselves with grading techniques. Download Airgrade for free
Free digital cinema qc, grading and editorial tool - STORM
Finally, The Foundry has made a beta version of its ‘digital cinema camera production hub’ STORM available as a free download, for unrestricted free use until 1st March 2011 (when you’ll have to pay £250 to continue using it).
STORM provides extensive on-set digital rushes quality checking tools (to check exposure, focus, colour and audio), focusing specifically on Red-acquired rushes, as well as grading and multi-track timeline editorial tools.
It makes it possible for the director, DoP and editor to view high-res takes, make a rough edit and establish a basic look for a production, all while on location.
STORM also includes straightforward metadata tagging and timeline re-conforming to speed up the movement of content through to editorial and post production systems.
Red’s ridiculously titled Ted Schilowitz, who’s apparently the leader of the rebellion, says: “Having seen STORM in detail, I’d describe it as REDCine-X on steroids. It’s well worth the time to investigate its capabilities if you are involved in post production, working with Red footage.”
Download the free beta of STORM
There’s a lot going on in the post and facilities sector at present, with Deluxe in particular shaking things up in a big way with its purchase of Ascent Media Group.
Also in the last month, Warner Brothers confirmed its £100m purchase of Hertfordshire’s Leavesden Studios, making it its UK home.
And, at the other end of the scale, Manchester post facility Hullabaloo Studios, which worked on Fifi and the Flower Tots and Roary the Racing Car, closed down for undisclosed reasons.
In a facilities sector as changeable and erratic as this, I asked a handful of post house mds what they think is in store for them in the coming year.
Cinesite’s md Antony Hunt singles out stereoscopic 3d as an ongoing positive development: “It’s been a tough climate for post houses and it’s those that continue to be cutting edge that thrive. Stereo 3d dominates the film industry and will continue to do so in 2011. We’ve made significant investment in stereoscopic and can deliver complex effects in this growing format, something that will define our work in 2011.”
The Mill’s exec producer Stephen Venning says the build up to the 2012 Olympics is the shining beacon for the coming year. “My personal excitement will be in seeing the effect the London Olympics has on the advertising industry and the creative vfx challenges that holds for us.”
Meanwhile, in Manchester, Andy Sumner, md of the city’s largest post house Sumners, acknowledges the year hasn’t been great for local facilities but believes there’s now every reason to be upbeat: “So farewell 2010, it’s certainly been a bit of a brutal time for post in Manchester – Red Vision and Hullabaloo have gone and everybody has striven for efficiencies simply to survive,” he says.
“So hello 2011, finally MediaCity is here, with seven HD studios, BBC children’s and BBC sport set to tip up in the North West, and, if we are to believe the BBC, this is only the start. If this is the case it has to be good for the whole production economy and what’s good for production has to be good for post.”
Jake Bickerton is Televisual’s features editor
Five of London’s leading vfx facilities have spent much of the last year working on sequences in Harry Potter and the Deathly Hallows Part 1.
As is commonplace in vfx-heavy, big budget movies, work is shared out to a handful of different companies – in this case, a who’s who of the UK’s vfx facilities: Framestore, MPC, Cinesite, Double Negative and Baseblack. Added to this, there were other vfx houses in India, the US and Australia working on the title too.
Here’s a round-up of the UK vfx work on Harry Potter.
Framestore worked on 110 vfx shots, mostly involving two elves – Dobby and Kreacher, which are apparently both house-elves. Framestore’s vfx supervisor Christian Manz says the Framestore team of up to 60 vfx artists spent 16 months of “toil and creation” to complete its sequences on the film.
Framestore’s elves are keyframe animated rather than motion captured, enabling the team to “carefully craft emotive and believable human performances from careful observation of a variety of sources,” says Manz.
The vfx facility concentrated on tweaking the appearance of both Dobby and Kreacher, compared to how they looked in previous Harry Potter films. Some of their more grotesque features were “softened out”, so Dobby’s neck was smoothed out, his arms shortened and his eyes were made less saucer-like. Meanwhile, Kreacher’s nose was shortened and his ears trimmed.
As well as the elves, Framestore was involved in a three-minute animated interlude in the film, which comes in when Hermione begins reading aloud from The Tales of Beedle the Bard.
A 37-strong team assembled from Framestore’s commercials vfx wing, led by sequence supervisor Dale Newton, worked on the animation, inspired by stop-frame silhouette animator Lotte Reiniger, who animated from the 1920s to the 1950s. Working in Maya, Framestore emulated the characters and motion of Reiniger’s hand-cut paper silhouettes.
Cinesite’s work centred on three key elements – Lord Voldemort’s snake-like nose, the ghost of Dumbledore and a Patronus doe.
It replaced Ralph Fiennes’ nose area with Lord Voldemort’s cg snout throughout 46 shots, which involved building a rig with three layers of animation controls to make full use of the 16 tracking markers attached to Fiennes’ head.
For Dumbledore’s ghost, Cinesite took a clean plate of the corridor in which the ghost appears and a green screen plate of Sir Michael Gambon and generated a digi-double of Gambon’s character. The digi-double was match-moved to Gambon and projected back on to his green screen performance to create the desired effect.
The Patronus doe appears in the film in the form of a light expanding into a semi-formed character. To achieve the effect, Cinesite generated a fully-rigged photo-real animated cg doe.
Cinesite also worked on additional effects such as a cg wreath of Christmas roses that are conjured up by Hermione at Harry’s parents’ grave.
MPC completed over 180 shots on the film. A good deal of its work went into the transformation sequence where six members of the Order of the Phoenix take a polyjuice potion and assume Harry’s form to confuse Voldemort.
For this sequence, MPC had to create fully cg versions of the six characters. The actors were motion captured, including a facial motion capture shoot, to provide the level of detail required.
MPC’s concept artists then tried out different ways to best achieve the visual effect of the transformation, blending features, sizes and skin textures from Harry and the other characters. A custom rigging system controlling the blending of data from the facial capture shoot provided the animators with control over the finer details.
Other characters created by MPC include cg thestrals (a cross between a horse and a dragon) and over 100 cg characters for the Death Eaters’ chase scene, including full screen digi-doubles for Harry, Hagrid and the Death Eaters.
It also worked on set extensions, and vfx work including explosions, wand effects, cg water and a bike crash during a wand duel between Harry and Voldemort.
The smallest of the UK facilities to work on the movie, Baseblack, ended up doing the largest number of shots. It worked on 45 sequences and over 300 shots, including every appearance of the Golden Snitch and all work on the final scene of the film, involving a huge lightning storm and Dumbledore’s tomb.
It also created the effects for spells such as Obliviate, the Deluminator and Lumos, the Horcrux Locket’s underwater attack on Harry, a wand shoot out in a café, Hermione’s magic handbag and moving photos in newspapers, as well as major background replacement throughout the film.
Double Negative’s work on the film was mostly focused on set and environment extensions. It extended The Burrow and its surrounds and also extended Xenophilius Lovegood’s home.
In addition to this, Double Negative added an extra dimension to the Death Eaters, introducing a ‘flayed man’ stage between their fluid, flying state and their live-action presence when they land. Double Negative also created the Patronus charm that interrupts a wedding party to inform guests Voldemort has taken over the Ministry of Magic.
Online catch-up TV services, YouTube, LoveFilm streaming and the like all mean more and more TV content is being viewed on laptops rather than on TVs. The small laptop screen and tinny laptop sound make for a less than enticing viewing experience compared to TV, so the Veebeam – a newly released consumer device – aims to put laptop content back on the TV.
The makers of Veebeam sent Televisual one of its £139 Veebeam HD devices to see what we thought. Out of the box, the first impressions are favourable - it’s a fairly small, agreeably designed piece of hardware that sits relatively unobtrusively next to the TV.
The first task to get it up and running is to connect the Veebeam to your TV, which, with the HD version, is via an HDMI cable. An SD version is also available (for £99) that uses a composite a/v cable rather than HDMI.
Next up is installing the Veebeam software. It works on a Mac or PC, and you can install it on as many laptops as you have in your home.
Slotted into the Veebeam is a removable USB antenna that you take out and plug into your laptop. Then you have to sit your laptop in line of sight of the Veebeam, and no more than 10 metres away, and wait a few moments for the two to make a connection.
Once connected, your laptop screen is mirrored on the TV, so you can start full-screen streaming from iPlayer, YouTube or whatever and it’s all shown on your TV. The audio also comes out of your TV speakers, so it’s a very TV-like viewing experience.
The Veebeam software also installs a Veebeam player that, rather than mirroring your laptop screen on TV, ‘sends’ movie files stored on your hard drive to the Veebeam, making it possible to watch them on TV while still being able to use your laptop.
The Veebeam HD player enables high-quality 1080p HD files to be displayed in full-res on your TV, but currently Mac users can only use the player for .mov and .mp4 files. Other commonly used file formats, such as .avi files, aren’t presently supported. Veebeam says this will come in a future version of the Mac software, while the PC player already works with a much broader range of file formats.
Having used the Veebeam fairly extensively for the last few days, overall I’d say it’s a useful device. It’s very straightforward to get up and running and is much less cumbersome than connecting a DisplayPort-to-HDMI adapter and HDMI cable to a MacBook (along with a set of speakers, as the DisplayPort adapter doesn’t carry sound for some reason) every time you want to watch laptop-hosted content on TV.
As long as you keep the laptop in line of sight, the Veebeam link seems to work fine. There’s a delay of a few seconds in whatever you do on your laptop being shown on TV – it’s not instant as it is when using cables – so this takes a bit of getting used to. And you’ll want to point your laptop screen away from you as it’s pretty distracting seeing the content a few seconds ahead out the corner of your eye on the laptop screen.
Not having a good range of movie files supported by the Mac Veebeam player is frustrating, as ‘screencasting’ (i.e. displaying the laptop screen on the TV) ties up your laptop so you can’t use it for anything else.
The Veebeam’s image quality – even when streaming from online catch-up services (assuming you’ve a reasonable broadband speed) – is consistently good, and downloaded 1080p HD files look suitably impressive on the TV screen, not dissimilar to watching a Blu-ray.
The main issue for Veebeam will be how long it will be relevant. Once the likes of YouView are available, Veebeam may well struggle to get much of a look in.
Japanese artist Jitsuro Mase has created an innovative iPhone/iPad gadget that turns the iPhone/iPad into a nifty little handheld 3d cinema.
Dubbed the i3dg, Mase’s device uses mirrors set at 45-degree angles to project mini movies into the space around your iPhone or iPad. You don’t need any special equipment or polarised lenses or anything to view the 3d projections.
The visuals are created in a layout corresponding to the position of the mirrors and Mase has already created dozens of animations that show off the i3dg's capabilities. A series of i3dg films and animations are being shown at the International Film Festival Rotterdam 2011, early next year.
To see clips from some of these animations and find out how the i3dg works, see the video below.
While we’re on the iPhone, here’s another interesting device to maximise its creative potential. The Owle Bubo slots around the iPhone 4 to further enhance the 720p video capturing capabilities of the phone - providing it with a custom 37mm wide-angle lens, a high-quality microphone, and the ability to use any 37mm thread lens.
According to Owle, the Bubo gives the iPhone 4 “better colour saturation, contrast and sharpness, crystal clear sound and hugely reduced hand jitter”. It also makes it possible to use interchangeable 37mm lenses.
The Owle Bubo for iPhone 4 costs £150, or bundled with a Rotolight RL48-A, Rotolight Stand and belt pouch costs £275. For more information, go here
Following the launch of Pro Tools 9 earlier this month, Avid held an audio event for press yesterday to run through and demo the newly improved functionality of the industry-standard audio toolkit.
The overwhelming feeling following the event is that the latest version is quite a leap forward in providing users with options for using Pro Tools with a much wider assortment of control panels and audio interfaces, or even on the move on a laptop with no hardware at all.
One of the key upgrades in Pro Tools 9 is that it is now available as a software-only version. This is the first time it’s been available as a standalone piece of software, enabling full access to Pro Tools for audio prepping, mixing and editing on a laptop.
Another significant improvement is Pro Tools 9 now supports a much broader range of control surfaces (through the Avid EUCON open Ethernet protocol), including (naturally) Avid’s newly acquired Euphonix consoles and controllers, which have been rebranded as Avid Artist Series and Pro Series consoles.
Added to this, Pro Tools 9 also supports a much larger range of audio I/O interfaces, as a result of new Core Audio and ASIO driver support.
At yesterday's event, Avid announced it is no longer going to sell a handful of its formerly separately available Pro Tools add-ons, including the popular Music Production Toolkit. These are now bundled in with Pro Tools 9 as part of its expanded features set.
The functionality added to Pro Tools 9 by the addition of these add-ons includes automatic delay compensation, ending the need to manually compensate for latencies from hardware I/Os and plug-in algorithm processing.
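The principle behind delay compensation is straightforward: every track is padded up to the latency of the slowest signal path, so all tracks reach the mix bus sample-aligned. A rough illustration of that idea – not Pro Tools code, and the track names and latencies below are made up:

```python
# Illustration of automatic delay compensation: each track reports the
# latency (in samples) introduced by its I/O and plug-in chain; the
# mixer delays every other track by the difference, so all tracks
# arrive at the mix bus sample-aligned.

def compensate(tracks):
    """tracks: {name: (samples, latency_in_samples)} -> aligned samples."""
    max_latency = max(latency for _, latency in tracks.values())
    aligned = {}
    for name, (samples, latency) in tracks.items():
        pad = max_latency - latency          # extra delay this track needs
        aligned[name] = [0.0] * pad + samples
    return aligned

tracks = {
    "vocal": ([0.1, 0.2, 0.3], 2),   # heavy plug-in chain: 2 samples late
    "drums": ([0.5, 0.4, 0.3], 0),   # no processing latency
}
out = compensate(tracks)
print(out["drums"])  # padded by 2 samples: [0.0, 0.0, 0.5, 0.4, 0.3]
```

Doing this by hand for every plug-in chain is exactly the chore the automatic feature removes.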
Other improvements to Pro Tools 9 include the ability to do OMF/AAF/MXF file interchanges and MP3 exports. There’s also enhanced accuracy when syncing audio to video in post, through a new built-in timecode ruler, and an updated 7.1 surround sound panner.
Pro Tools 9 is available as a software-only version at around £500 or packaged with different audio interfaces at increasing price points.
Here’s a collection of music and audio post people giving their first impressions on the upgraded system...
Newcastle’s live action and animation production outfit J6 Films has completed a series of 30 extreme slo-mo films of people laughing for an art project commissioned by BBC Radio 3.
Each of the films is a visual portrait of someone from the North East, with the 30 people filmed ranging in age from 2 to 80 years old. They were shown on a series of flatscreen TVs throughout the concourse area of The Sage, Gateshead earlier this month as part of an installation at the Free Thinking Festival.
Image credit: BBC/Dan Prince
The theme of the festival was ‘the pursuit of happiness’ and the films are “a visual metaphor of the pursuit of happiness, contemplated with a magnifying glass,” says film director Chuchie Hill.
“When high speed shooting, at approximately 1000 frames per second, we can capture every little movement of the face muscles and skin not appreciated when normal speed or live,” adds Hill.
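The slow-motion stretch at that frame rate is simple arithmetic: the footage is slowed by the ratio of capture rate to playback rate. (The 25fps playback rate below is an assumption for UK delivery; the films’ actual output rate isn’t stated.)

```python
# Slow-motion stretch factor = capture rate / playback rate.
# The ~1000fps capture rate comes from the article; 25fps playback
# is an assumed standard UK delivery rate.

capture_fps = 1000
playback_fps = 25

factor = capture_fps / playback_fps
print(f"{factor:.0f}x slower")                          # 40x slower
print(f"1s of laughter plays back over {factor:.0f}s")
```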
The films (four of which are below) were produced by J6’s James Baxter.