Raccoon with fur generated by neural network

Virtual animal fur rendered more realistically using neural network

Image credit: University of California San Diego

Researchers at the University of California (UC) San Diego have developed a new method for rendering animal fur, which uses a neural network to speed up the process.

While animation technology has come a long way since Snow White, the first fully animated feature film, many obstacles still prevent animated characters from appearing photorealistic.

One such obstacle is the rendering of animal fur, like that seen throughout Disney’s Zootopia. The models used today for animal fur were originally designed for human hair. They do not account for the medulla (the soft, innermost cylinder of each strand of hair), because the medulla in human hair is much smaller than in animal fur; rays of light are simply modelled as bouncing from hair to hair.

This harms the appearance of animal fur in animation, and the approach is also highly computationally expensive.

In order to find a solution, the UC San Diego researchers and their collaborators experimented with subsurface scattering to model how light is bounced between fur fibres. This approach - which is used extensively in computer graphics and computer vision - accounts for how light enters a translucent object, is scattered, interacts with the material and exits at a different point.

We can observe this effect when we cover a small light source with a finger. The light enters the finger, scatters inside the tissue and exits elsewhere, creating a ring of light around the finger.
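The idea can be illustrated with a short Monte Carlo simulation. The sketch below is not the researchers’ model; it is a minimal, assumed example with made-up material parameters, tracing photons that enter a translucent medium, scatter randomly inside it and exit at some distance from where they entered.

    import math
    import random

    def scatter_walk(sigma_t=1.0, albedo=0.9, max_bounces=1000):
        # Trace one photon through a semi-infinite translucent medium (z > 0).
        # The photon enters at the origin heading into the material, takes
        # free-flight steps of exponentially distributed length, scatters in a
        # random direction at each event and may be absorbed. If it crosses
        # back above the surface it exits, usually some distance from the
        # entry point - the hallmark of subsurface scattering.
        pos = [0.0, 0.0, 0.0]
        direction = [0.0, 0.0, 1.0]                      # heading into the medium
        for _ in range(max_bounces):
            step = -math.log(1.0 - random.random()) / sigma_t
            pos = [p + step * d for p, d in zip(pos, direction)]
            if pos[2] <= 0.0:                            # back through the surface: exit
                return math.hypot(pos[0], pos[1])        # distance from entry point
            if random.random() > albedo:                 # absorbed inside the material
                return None
            z = 2.0 * random.random() - 1.0              # new, uniformly random direction
            phi = 2.0 * math.pi * random.random()
            r = math.sqrt(1.0 - z * z)
            direction = [r * math.cos(phi), r * math.sin(phi), z]
        return None

    exits = [d for d in (scatter_walk() for _ in range(10_000)) if d is not None]
    print(f"{len(exits)} of 10,000 photons exited; mean exit offset {sum(exits) / len(exits):.2f}")

Running many such photon walks is exactly what makes brute-force approaches expensive, which is why a faster approximation is attractive.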

The researchers used an artificial neural network - a machine learning program approximately modelled on the structure of a brain - to apply the properties of subsurface scattering to animal fur. After being trained with just one animated scene, the neural network was able to apply subsurface scattering to other scenes.
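The article does not describe the exact network or training setup. Purely as an illustrative sketch (the inputs, outputs, layer sizes and training data below are all assumptions), a small neural network of this kind could be trained to map fibre properties such as medulla size and roughness to the parameters of a subsurface scattering model, then be reused on scenes it was not trained on.

    import torch
    from torch import nn

    # Hypothetical mapping from per-fibre properties (e.g. medulla radius,
    # roughness, pigment) to subsurface scattering parameters. Sizes and data
    # are placeholders, not the published model.
    model = nn.Sequential(
        nn.Linear(3, 64), nn.ReLU(),
        nn.Linear(64, 64), nn.ReLU(),
        nn.Linear(64, 2),              # e.g. scattering and absorption coefficients
    )
    optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    # Stand-in training data: in practice the targets would come from one
    # expensively simulated reference scene.
    fibre_params = torch.rand(1024, 3)
    reference_scattering = torch.rand(1024, 2)

    for _ in range(200):
        optimiser.zero_grad()
        loss = loss_fn(model(fibre_params), reference_scattering)
        loss.backward()
        optimiser.step()

    # Once trained, the cheap network stands in for the expensive simulation
    # when rendering new scenes.
    with torch.no_grad():
        predicted = model(torch.rand(10, 3))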

“Our model generates much more accurate simulations and is 10 times faster than the state of the art [systems],” said Professor Ravi Ramamoorthi, senior author and director of the Center for Visual Computing at UC San Diego.

The system not only applies to animal fur, but also improves the appearance of human hair. It could be used in computer-animated films, video games and computer-generated special effects in live-action films.

According to a recent study by the Institute for Operations Research and the Management Sciences, despite huge technological advances in animation such as these, using the latest technology does not guarantee movie magic in animated films.
