The music industry trade body for composers and songwriters, BASCA, which runs the Ivor Novello awards, has just held a roundtable discussion debating how to survive and thrive in the world of media music composition.
The discussion was with five members of its board of directors, who have each made very successful careers out of writing for many different genres of television as well as commercial music releases.
Making a living, never mind a fortune, in the media music business is no mean feat, as revealed in the following extract from the panel discussion, during which the panel members debate whether it’s still possible to make a good living composing media music.
Thanks to BASCA for pointing me in the direction of the discussion, which will be published in full in the next issue of BASCA’s members magazine 'The Works'.
Mark Ayres is a television composer best known for providing incidental music on the original series of Doctor Who. He worked on Doctor Who during Sylvester McCoy's era as the Seventh Doctor and was hired after he sent the producer a demo video with music he'd written to accompany the episode Remembrance of the Daleks.
Richard Jacques predominantly writes music for games, but also composes for commercials and TV programmes. He has composed for a long list of video games created by Sega (where he was previously an in-house composer), including entries in the Sonic the Hedgehog franchise.
Simon Darlow has a long history in pop music composition. He co-wrote Grace Jones's huge hit Slave to the Rhythm and has penned hits for Toyah Willcox and The Buggles. He arranged Musical Youth's Pass the Dutchie and has played with many successful acts, including Wham! and The Buggles. He's also written over 70 TV themes.
Paul Farrer has composed music for an enviable range of successful television shows, including The Weakest Link, Dancing on Ice, The Krypton Factor, The Jerry Springer Show, Saturday Night Live, Ant & Dec's Push the Button and Tumble. He's also the creator and executive producer of the forthcoming ITV game show 1000 Heartbeats.
David Lowe has written for a plethora of high-profile television series, composing the current BBC News theme as well as themes for The One Show, Panorama, Countryfile and Grand Designs. He also composed for the London 2012 Olympics and had a Top 3 hit in the UK in 1998 with the track 'Would You...?' under the name Touch and Go.
Is it still possible to make a good living composing media music?
<DL> It's still possible to make money but you have to be fairly chameleon-like and accept that things change. Sometimes you'll make a few quid and other times you'll be paid a paltry sum. Every now and then you'll drop lucky with a show and you'll make money. You may have to give up some of your publishing now, which is a change from maybe 10-15 years ago. But if you get given a show that's going around the world then you're being given a guaranteed platform and guaranteed air time, and that's going to get you very good publishing royalties.
<PF> I do a lot of media shows. Something I did well out of – The Weakest Link – was good for me. The franchise was great for me because everywhere it went they said it had to stay the same, including the music. It's the same with Millionaire and Deal or No Deal, whereas other franchises change the format quite radically and sometimes that goes wrong [for the composer]. I did The Chase and that went over to America and they took all the music off and changed the show completely. You're in the lap of the gods really and all you can do is hope.
<DL> You've got to try to work out what your back-end is going to be, and depending on what they offer you in the front-end budget, you can judge whether it's worth doing. If they pay you £500 upfront and you're only likely to get £250 for it in the back-end and you spent four weeks on it, it's not worth doing.
<MA> It's incredibly democratic. You can luck into a successful show and make an awful lot of money. If you don't luck into an incredibly successful show, you'll simply earn a living.
<DL> For the majority of jobbing composers, it evens out to a professional rate, which is probably the same as what a film editor might get or a senior person on the production team.
<PF> The good composers have some dramas, some entertainment, some of this, some of that, for as many broadcasters as they can and for a very long time. You just put them out there and hope, that's really all you're doing.
<DL> And sometimes you can be sitting there with nothing at all, but all the stuff you've done in the past is keeping everything ticking over.
<PF> It's an odd job because how busy you are versus how busy your music is are two completely different things. You can be slogging your guts out day after day on a big project that just falls apart and never sees the light of day. Meanwhile, you have times when you're just sitting around and doing nothing and a show you did ages ago comes out in Japan or something.
<RJ> A lot of composers have had to diversify a bit so work across multiple genres of media in a way that they didn't previously have to. So they might have been just a guy that does comedy shows on TV, but now they do comedy, drama, commercials, library music, games and films. That's partly because the industry has opened itself up a bit and partly because the more your career goes on, the more the doors open.
<PF> My advice for anyone wanting to become a media composer is to start small. It takes decades, and that's with having some good breaks along the way. We live in a post-X Factor world where we believe it's a binary thing: either you're nobody or you've got your own jet. It doesn't work like that. You have to want to do it, to the exclusion of all other things, because of the amount of effort you have to put into demos and the amount of hours you have to spend listening to idiots asking you to tweak hi-hats.
<SD> You just have to, at some point, ask yourself whether you really think you've got it. There's an awful lot of people who just haven't got it, in truth. And if you do really feel you have got it and you're not kidding yourself, you just get to meet the right people over time, you just do.
<RJ> It's also very important to know the business side and networking is very important too. Some people expect the money to start rolling in quite quickly but it doesn't, it takes many years.
<MA> You've got to love it and want to do the job. You also haven't got to want to be famous or desperately want to make money out of it.
<DL> You've got to know the technology too, so as well as being a business person and a creative, you've got to be a sound engineer as well. You've got to know how to add EQ and phasing, all this technical stuff like dB levels – clients expect the quality of what you're doing to be produced to the same standards as commercial music.
Every so often a movie comes along that genuinely re-writes the script for filmmaking. Jurassic Park was a prime example, creating photo-real dinosaurs that really looked like they were living and breathing. It was a huge moment for cg, proving what could be done and setting an incredibly high benchmark for creature design and visual effects work.
However, it was originally intended to be very different. The plan from the start had been to use stop-motion dinosaurs, which, looking back at what was achieved with the digital dinosaurs of Jurassic Park, sounds like absolute madness. But, back in 1993, no one really knew the full possibilities of cg for creature design and animation, and even those working on the initial cg dinosaur sequences and walk patterns were completely taken aback when the fully rendered cg dinosaur was put into an environment and truly came to life.
It's one of many awe-inspiring moments captured by Academy Originals, which has created a nine-minute mini-doc* about the groundbreaking decision to create digital dinosaurs for Jurassic Park and how this decision completely changed the future of filmmaking. Watch it below.
* Apologies if you've seen this already - it's been around since early June, but I only stumbled across it today, hence the blog!
Here's a link to an interesting little blog that looks at some predictions from 1957 through to the mid-60s about how technology might evolve over the next few decades.
It includes two or three TV-related predictions, including the 'electronic home library' (pictured below), centred on the ability to record TV programmes if you were out when they were on air, the ability to watch 3D TV and other wonderful futuristic developments that might eventually become reality.
There's also the prediction, from 1957, that one day we'd be able to make face-to-face phone calls, including a handy picture (see below) to show what it might look like.
It's interesting to see how the likes of Skype, Sky+ boxes, Smart TVs, etc, which we now take for granted, were envisaged 40-50 years before they became commonplace.
As you might imagine, subtitling live shows and live news channels isn’t without its challenges. To get it right requires a dedicated, professional, skilled approach. To find out how it’s done, what the typical pitfalls of live subtitling are and how to resolve them, check out this informative article from Red Bee Media’s IT Manager, Access Services, Hewson Maxwell.
Translating what’s said on live television into readable text – it sounds simple enough. But take a moment to consider the challenges of providing live subtitling across multiple channels, with breaking news, regional broadcasts and over-running sporting contests, then add to that mix the understandably high standards of timeliness and accuracy expected by viewers, lobby groups and Ofcom alike and it might start to seem more complex.
We, and the industry as a whole, are always looking at ways to address the challenges of live subtitling through ongoing investment in innovation and technology.
Volume of work
A huge challenge presented by live subtitling comes from its sheer scale: 24-hour live output for news channels, over 150 hours of live output daily, and a workforce of subtitlers spread across several continents, many of them working from home.
Most truly live subtitling is generated using voice recognition software, most commonly Dragon NaturallySpeaking. There are huge benefits to this approach. When coupled with the re-speaking method, whereby a subtitler repeats everything spoken on screen in a clear and level voice, with punctuation and colour commands, the recognition and output is broadly excellent.
Furthermore, good speech engines are easy to use, so it is relatively easy to recruit and train people to subtitle well.
However, historically, there have been some downsides. The best voice recognition engines avoid releasing text until they receive enough context to be sure of what is being said. This can lead to a significant gap in output on air, followed by the instantaneous release of a large chunk of text. Slowing this glut of text down to a readable speed, as most software does, leads to de-synchronisation of subtitles and video and the need to edit or omit sections to catch up.
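To make the desynchronisation concrete, here's a minimal sketch (not Red Bee's actual software, and the reading rate is an assumption) of what happens when a burst of recognised text is paced out at a readable speed: the last words of the chunk inevitably appear seconds after the speech engine released them.

```python
# Hypothetical sketch: pacing a burst of recognised text at a readable
# rate, illustrating how subtitles drift behind the video. The rate and
# function names are illustrative assumptions, not a real product API.

READING_RATE_WPS = 3.0  # assumed readable speed: roughly 180 words/minute

def schedule_display(release_time, words, next_free=0.0):
    """Return (display_times, next_free) for a chunk of words released
    all at once by the speech engine. Each word is shown no earlier than
    the previous word plus the minimum readable interval, so a large
    chunk spreads out over several seconds."""
    interval = 1.0 / READING_RATE_WPS
    times = []
    t = max(release_time, next_free)
    for _ in words:
        times.append(t)
        t += interval
    return times, t

# A 12-word chunk released at t=5.0s cannot all be shown at once:
times, next_free = schedule_display(5.0, ["word"] * 12)
lag_of_last_word = times[-1] - 5.0  # the final word lags the release time
```

Running this, the first word appears at 5.0s but the twelfth appears nearly four seconds later, which is exactly the drift that forces subtitlers to edit or omit text to catch up.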
Regardless of the amount of context, voice recognition software will always mishear some of the time, particularly with unusual terminology or names – the kind of vocabulary commonplace in live news. Traditional subtitling software uses housestyles to automatically correct common typographical errors (such as "empire state-building" to "Empire State Building") and the program's comprehension errors (such as "Nicholas so cosy" to "Nicolas Sarkozy").
Subtitlers also define macros for alternate forms of sound-alike words, so they can tell viewers whether it is chilly out, or they’re making a chilli, or the sports event is happening in Chile. But macros and housestyles are set up prior to going on air, and are therefore of little use when a new name emerges in a breaking news story.
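At its simplest, a housestyle can be thought of as a substitution table applied to recognised text before it goes to air. The sketch below is purely illustrative (real subtitling software is far more sophisticated, with context rules, macros and colour commands), using the article's own examples:

```python
# Illustrative sketch only: a housestyle as a naive case-insensitive
# substitution map. Real systems handle context, word boundaries and
# on-air formatting; this just shows the basic find-and-replace idea.

HOUSESTYLE = {
    "empire state-building": "Empire State Building",
    "nicholas so cosy": "Nicolas Sarkozy",
}

def apply_housestyle(text, table=HOUSESTYLE):
    """Replace known mis-recognitions with their corrected forms.
    Matching is lowercase substring search; assumes no correction
    contains one of the error strings (else this would loop)."""
    out = text
    for wrong, right in table.items():
        lowered = out.lower()
        idx = lowered.find(wrong)
        while idx != -1:
            out = out[:idx] + right + out[idx + len(wrong):]
            lowered = out.lower()
            idx = lowered.find(wrong)
    return out

print(apply_housestyle("President nicholas so cosy arrived today"))
# -> President Nicolas Sarkozy arrived today
```

The limitation the article describes falls straight out of this model: the table has to be populated before going on air, so a name that only emerges during a breaking story has no entry and slips through uncorrected.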
Innovation in subtitling
Over the last three years or so, we’ve been looking at how we can improve the quality of live subtitling. We believe technology and innovation will be key drivers to achieving this goal and as such, we have been investing heavily in building a bespoke platform and software that we believe will help to address some of the challenges listed above. Unique functionality such as the ability to integrate with broadcaster schedules and a re-speaking interface designed to be the fastest on the market are just some of the improvements we’ve been focused on.
Live subtitling will continue to be a complicated, imperfect and expensive process for some time to come, but with our new software, Subito, we believe we’ll be able to deliver greater quality to the audience, and begin to add extra utility and lower costs for broadcasters. We know there will always be more that can be done, and we remain committed to both integrating current technological advancements and driving the next set forward. We will continue to do so until we reach a day when live subtitles are barely distinguishable from prepared ones.
MPC, the lead visual effects studio for X-Men: Days of Future Past, delivering 372 of the vfx shots, has released a number of before and after stills showcasing its work on the film.
MPC created the future sequences in the movie, including the future sentinels, from concept art through to final compositing, the X-Jet, Xavier’s virtual world, future environments and mutant effects.
The sentinel is a 10-foot-long, fully cg "mutant slayer covered with approximately 100,000 independent blades, the movement of which had to be directed artistically rather than driven by simulation," says MPC. "The vast number of objects proved too cumbersome with existing workflows so an entirely new approach was required."
Now for some incredibly complex sounding detail possibly to help explain the new workflow.... You've been warned…. this comes direct from the press release, and, to be honest, I don't understand a word of it….
MPC’s R&D team, led by Tony Micilotta, introduced the concept of a follicle that approximated the shape and size of the final blade model. These were combined into per body part follicle-meshes and could be manipulated using standard deformers. This not only provided requisite visualization for animators, but doubled up as primitives from which transforms could be derived using trigonometric methods. These transforms were cached as particles and were subsequently ingested by a bespoke Katana Scene Graph Generator (SGG) that instanced blade models accordingly.
Back to the understandable side of vfx once more: the opening sequence of the film required MPC to create extensive matte painting and environment work as well as generating meshes, textures and particle effects. Also in the opening sequence, the character Sunspot's flames were achieved using Flowline fluid simulation technology to produce multiple layers of volumes and particles. On top of this, the character Iceman's ice body was created using effects layers of spray, ice crystals and 'dry ice' type effects. MPC's team also created the character Colossus' metallic skin and a digital double of Warpath. The sequence takes place in a bunker deep underground, which involved work from the environment team to provide set extensions and finishing touches.
The climax of the film takes place in and around an ancient monastery in the Himalayan mountains. For this, MPC recreated and extended the monastery set and created lightning effects, large swirling volumes and buffeting winds.
MPC also created a cg X-Jet, Xavier’s virtual world and Wolverine’s claws.
Here are a few before and after images showcasing MPC's visual effects work....
If you have seven minutes to spare and would like to become an expert in the different stages involved in creating a top-end animation, you'd be well advised to check out the following film.
It's an interview with Beakus's animation producer Steve Smith and a superb overview of how an animation is created, with loads of step-by-step examples. It was commissioned by Time Out New York and created by Hibrow.tv, and Smith is an excellent narrator, explaining the skills and craft of animation.
The six news channels of Dutch broadcaster RTL have just undergone an extensive refresh, including a new identity created by design consultancy Mark Porter Associates (who created the look for the Guardian newspaper and website) and Dutch design studio Smörgåsbord Studio.
The redesign has taken 18 months, which sounds like a very generous amount of time for simply sprucing up the look. However, the work involved redesigning logos, title sequences and on-screen graphics for news, weather and business, as well as creating the layout and design of the news studio set. And, of course, all designs had to work across multiple platforms.
RTL Nieuws broadcasts 17 bulletins each weekday and six weekend bulletins, and has an audience of up to two million. It also broadcasts business programming under the RTL Z banner, which airs during the day, as well as a weather and traffic service (RTL Weer & Verkeer).
“Everything has changed since we introduced our previous identity in 2007. The world of news, the way we consume news and RTL Nieuws itself," says Caroline Schnellen, marketing and communications manager at RTL Nieuws.
"With all this in mind it became the right time to rethink the brand’s visual identity. We needed to show one brand, one look-and-feel, on TV, web and app. In order to do so we felt this could only truly happen if we would change all platforms at the same time. What followed was an immense project, with many parties involved and close coordination throughout."
To view the redesign and find out how it was thought up and executed, check out the short film below.
Here's an interesting film from US-based production kit shop Fotodiox showcasing its top products for filmmakers that are all available for under $20 (around £12). The list was compiled after consulting Fotodiox's customers and includes an LED light panel, a lens adapter, a power arm, a clapperboard, a gear belt and a 5-in-1 reflector.