Iron Sky

Sci-fi on a shoestring

Thanks to developments in software and ever-increasing processing power, high-quality CGI is no longer the preserve of big-budget blockbusters.

Movies are big business and big business means big bucks. Last year a record three films raked in more than $1bn each in gross revenues: 'Harry Potter and the Deathly Hallows Part 2', 'Transformers: Dark of the Moon' and 'Pirates of the Caribbean: On Stranger Tides'. A closer look at these box office record breakers reveals that they have several things in common: they are all franchises, they all share estimated budgets in excess of $100m ($150m, $195m and $160m respectively) and they are all packed with computer-generated imagery.

While sometimes lazily derided as the last resort of the creatively bankrupt, CGI frequently plays an essential role in creating the believable alternate reality necessary for the enjoyment of a film. It is, in short, one of the constituent elements of the alchemical mix that produces the magic of cinema. Would 'Terminator 2: Judgment Day' have been as memorable without the sinister liquid-alloy shape-shifting of the T-1000, for example? Would the Lord of the Rings trilogy's Gollum have been so lifelike without the use of cutting-edge motion-capture technology? Would the fight scenes of the Matrix movies have been so widely discussed had the innovative bullet-time sequences been left out?

There is, however, a caveat. As shown by the colossal budgets quoted above, CGI work is extremely labour-intensive and often extremely expensive. It is typically the work of huge teams of talented artists and programmers using banks of costly, specially built computers. This, of course, comes at a premium that puts it out of the reach of all but a select few Hollywood big hitters.

In order to compete with the big studios, independent film-makers have traditionally relied on contemporary, real-world settings, dialogue-heavy scripting and straightforward cinematography to tell their stories. As visionary director Jean-Luc Godard once stated, "All you need for a movie is a girl and a gun". But, just as affordable DV cameras have opened up a world of possibilities for the financially constrained film-maker, the progress of Moore's Law has now reached the point where directors are increasingly able to add CGI to their creative arsenal.

"I think if you ask most independent filmmakers, they would say a combination of things have been coming together over the last few years that have made it much more viable for small shops to make high quality films on low budgets," explains Greg Estes, of GPU specialists NVIDIA. "First is the emergence of DSLR cameras such as the Canon 5D, which produce high-resolution images at very low cost, and the emergence of high-resolution file-based cinema cameras such as Red Epic. Combined with that, you have the increase in power and capability of computer workstations with modern GPUs to be able to process these images with a variety of software tools."

GPUs, or graphics processing units, are specialised parallel processors optimised for producing graphics. They include a frame buffer and groups of transistors organised into cores that perform calculations on the data held in the frame buffer. Because they are specialised, they can carry out these functions many times faster than a general-purpose CPU (central processing unit), the main processor in a computer. They also have many more cores: where a CPU might have four, for example, a GPU can have 1,000 or more. GPUs are less efficient than their generalist cousins at everyday tasks such as formatting a disk drive, but they are many times faster at highly parallel work such as the creation and rendering of CG.
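To make that concrete, the short Python sketch below runs the same per-pixel brightness adjustment first on the CPU with NumPy and then on the GPU with the CuPy library. It is an illustrative fragment only, and assumes CuPy and a CUDA-capable graphics card are installed; the point is that the operation touches every pixel independently, which is exactly the kind of work a GPU's hundreds or thousands of cores can share out in parallel.

import numpy as np
import cupy as cp  # GPU array library; assumes a CUDA-capable GPU is present

# A synthetic 4K frame: height x width x RGB channels, 8-bit values
frame = np.random.randint(0, 256, size=(2160, 3840, 3), dtype=np.uint8)

# CPU version: NumPy works through the pixels on the main processor
bright_cpu = np.clip(frame.astype(np.float32) * 1.2, 0, 255).astype(np.uint8)

# GPU version: copy the frame into video memory, then let the GPU's
# many cores apply the identical gain to the pixels in parallel
frame_gpu = cp.asarray(frame)
bright_gpu = cp.clip(frame_gpu.astype(cp.float32) * 1.2, 0, 255).astype(cp.uint8)

# Copy the result back to system memory for saving or display
result = cp.asnumpy(bright_gpu)

On typical consumer hardware the parallel version of an operation like this finishes many times faster than the CPU pass, which is the effect Estes describes at production scale.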

Falling costs

"I think any objective observer would agree the adoption of GPU computing is fundamental to this process and a real key to this transformation," says Estes. "If you look to the work done by independents like Bandito Brothers, who made Act of Valor (which was number one at the box office in the United States when it was released) on a very small budget, they would say they could not have done that film without their NVIDIA GPU-based systems. Across their entire workflow they are using GPU acceleration for editing, film grain management (critical when using DSLR cameras), colour grading, encoding, etc. Every step of the process from the time frames come in off the camera through the entire post-production processes through to the final film uses GPUs for processing."

And with demand for ever-faster graphics processing being driven by the design, animation and gaming industries, each year new GPUs are released offering substantially more bang for the buck than those that came before. Progress is so rapid, in fact, that current consumer GPUs completely outgun those used by professionals on films produced just a few years ago.

"The SGI Reality Engine graphics used by ILM for films like 'Jurassic Park' or 'Terminator 2' had a processor with 0.64Gflops of processing power. Our current GPUs have 3,090Gflops. So using that one metric (and there would be other ways to compare) then we'd be on the order of 4,800 times faster. The Reality Engine boardset that went into the SGI workstation had a list price of about $100,000. NVIDIA professional GPUs with the amount of processing I just referenced have a street price of around $2,000," says Estes.

Accompanying this rapid evolution of graphics processing is a similarly rapid development in 3D modelling technology. As a result, software such as Autodesk's Maya, a powerful animation and modelling package that allows users to create 3D objects, apply textures to their surfaces, animate them and even subject them to the effects of gravity and friction, is now within the reach of the serious hobbyist.

"3D is filtering through a lot," says Rob Hoffman, senior product marketing manager for 3D media and entertainment at Autodesk. "Blockbuster films are not the only realm in which you see high-end visual effects. We are seeing a lot of low-budget indie films, a lot of stuff you would see at Sundance [movie festival] for example, with extensive amounts of 3D in them. The idea that the only ones that can have 3D in their films are the elite of film making, those days are long since gone.

"Also, you are seeing a lot more 3D in television. Keep in mind television series don't typically have large budgets to begin with and they also have very short production timeframes. They are being done by a small number of people on tight budgets. 3D really has become more democratised. It's not a tool that only those in the ivory towers can use. It's really a tool that everybody is using."

Along with this is a corresponding increase in usability. For example, software such as Maya not only contains a library of spheres, cones and other more complicated objects from which complex creations can be constructed, but also a texture library and other built-in functions to aid the creative process. There is, of course, also the facility to save templates for future use, and even the option to buy pre-designed characters or objects from a third party.
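For a flavour of what that looks like in practice, the rough Python sketch below uses Maya's built-in scripting module (maya.cmds) to pull a primitive from that library, give it a simple material and set two keyframes. It is meant only as an illustration of how accessible such scripting has become, and assumes it is run from Maya's own script editor.

import maya.cmds as cmds  # available inside Autodesk Maya's script editor

# Create a polygon sphere from the built-in primitive library
sphere, _ = cmds.polySphere(name='demoSphere', radius=2)

# Create a simple Lambert material, tint it red and assign it to the sphere
shader = cmds.shadingNode('lambert', asShader=True, name='demoRed')
cmds.setAttr(shader + '.color', 1, 0, 0, type='double3')
cmds.select(sphere)
cmds.hyperShade(assign=shader)

# Animate the sphere rising over one second (24 frames)
cmds.setKeyframe(sphere, attribute='translateY', time=1, value=0)
cmds.setKeyframe(sphere, attribute='translateY', time=24, value=5)

A hobbyist can go from a handful of lines like these to a shaded, animated object without touching the underlying mathematics of meshes, shading or interpolation, which is exactly the shift Hoffman describes.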

"3D can be very complex at times. A lot of the people working in CG at the moment are true artists. They want to create the next 'Star Wars' or whatever blockbuster movie you want to talk about but they also want to create beautiful images. The things they don't want to focus on are the technical aspects of the production. They want to focus on making the skin of the character look really realistic so you can see fine lines around the eyes or on the little itty-bitty hairs around the eyebrows. They want to focus on the art itself," says Hoffmann.

Digital DIY

One recent beneficiary of the falling cost of CGI is a curious Finnish production called 'Iron Sky'. Released in June, the film is set in the year 2018 in an alternate universe in which the Nazis, who fled to the dark side of the Moon in 1945, return to attack the Earth. Along with the film's decidedly B-movie premise it carries the resourcefulness and all-hands-to-the-pump DIY ethic displayed by directors of classic low-budget cinema such as Roger Corman or Sam Fuller. The CGI for the film was produced by a group of enthusiastic graphics pros recruited online, who used a bank of 50 computers equipped with standard Intel processors and a relatively modest 16GB of RAM. And as anyone who has seen the film will testify, it looks every bit as polished and realistic as anything produced by the studio system.

There are, however, some directors pushing the independent film-making DIY ethic even further. Former BBC special-effects artist Gareth Edwards made his 2010 debut 'Monsters' for an estimated $500,000, writing, shooting and directing it with a skeleton crew of five. Set six years after extra-terrestrials have landed in the Americas, the film follows a photographer as he chaperones his employer's daughter through an alien-infested quarantined zone in Mexico. After shooting the film on location, Edwards single-handedly added the CGI using a lone laptop running Autodesk's 3ds Max – the same software that was used to produce the CGI effects in James Cameron's 'Avatar', which at $230m is one of the most expensive movies ever made. Perhaps it is time to modify Godard's dictum. What you really need to make a movie is a girl, a gun and a laptop.
