Animation!

We thought we’d write a post on the animation we’re working on to add another layer of data visualisation to the launch. It’s being made in Blender, a free, open-source alternative to the giants of the CGI industry (e.g. 3ds Max, Maya).

It’s presented a lot of challenges – it’s being worked on primarily by Dylan, who’s had some experience with Blender before but has never undertaken a project of this magnitude. The final result will (hopefully!) be a man who reacts to the data from our launch – for instance, getting colder and colder as the launch gets higher and higher. Along the way there have been challenges with character modelling, clothing creation, texturing, rigging and animation. Right now, the next step is to find a way to import the real data into the model so that it changes over time.

Blender Screenshot 1

The handsome, strangely textured, bald sort-of-human you see above is the current working version of the character we’re going to throw into (virtual) space. Those things sticking out of his legs are… bits of his skeleton. Don’t worry – it’s (probably) perfectly healthy! To get to this point, though, we’ve had to learn a lot about character modelling. Since the character has to be able to deform into any pose, the mesh has to be built in such a way that his skin doesn’t crease as he moves. This is accomplished with edge loops – ‘rings’ of quad faces that encircle areas that will deform in a particular way.

Head Mesh

To get an idea of this, you can see in the mesh above the edge loops that encircle his mouth, his eyes, his entire face, and his mouth and nose together. These areas typically deform a lot, and moving them around leads to a nice, clean deformation.

Next came an unsuccessful attempt to texture him via UV unwrapping – a process that produces a hellish-looking flattened… thing that looks a lot like something out of a Doctor Who episode…

Figure UV Map

Moisturise me!!

The aim of this is to paint a detailed texture directly onto the flattened mesh and then re-assemble it into a figure. Sadly, however, we discovered that the colour of a textured figure can’t be changed as dramatically as that of one with plain skin, so instead of fancy skin, our unfortunate guy is all one colour.

The next step in the process is rigging – giving him a skeleton to hold his skin in place and deform it as needed. This is what allows animation to happen – instead of moving 25,374 vertices individually (!!), you only have to move one of his 66 bones. To make the process even easier, another 6 bones are added to make him “animation ready” – that is, allowing the animator to move his foot, say, and have the rest of his leg move into position on its own. In the rig below, as well as inverse kinematics on his legs, we’ve included the ability to roll his feet from heel to toe as you do when you walk, and made each finger curl when its base joint is moved (instead of having to move each joint individually).

Skeleton
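
For anyone curious how the inverse kinematics part works under the hood, here’s a minimal sketch of adding an IK constraint through Blender’s Python API. The object and bone names (“Armature”, “shin.L”, “foot_target.L”) are hypothetical stand-ins, not our rig’s actual names:

```python
import bpy

# A minimal sketch, assuming an armature object named "Armature" with a
# shin bone "shin.L" and a foot controller bone "foot_target.L"
# (all hypothetical names).
rig = bpy.data.objects["Armature"]
shin = rig.pose.bones["shin.L"]

# The IK constraint goes on the last bone of the chain (the shin);
# moving the target bone then pulls the whole chain into position.
ik = shin.constraints.new(type='IK')
ik.target = rig                 # the target bone lives on the same armature
ik.subtarget = "foot_target.L"  # the foot controller the animator grabs
ik.chain_count = 2              # solve over the shin and thigh only
```

Behaviours like the foot roll and finger curl are typically built the same way, out of constraints and drivers on the control bones.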

We’ve worked out how to do animations the normal way – but the next step is to plug in real data. I’m currently working on solving this problem – you can keep an eye on the GitHub repository here!
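
To give a flavour of what that might look like, here’s a rough sketch of the kind of script we have in mind – the CSV layout (“launch_data.csv” with seconds and temperature columns) and the material name “Skin” are made up for illustration:

```python
import bpy
import csv

# A rough sketch of the planned data import, not the finished code.
# Assumes a hypothetical CSV of (seconds, temperature in °C) rows and a
# single material named "Skin" on the figure.
fps = bpy.context.scene.render.fps
mat = bpy.data.materials["Skin"]

with open("launch_data.csv") as f:
    for seconds, temp in csv.reader(f):
        # Map roughly +20 °C down to -60 °C onto a 0..1 "coldness" value.
        cold = max(0.0, min(1.0, (20.0 - float(temp)) / 80.0))
        # The colder it gets, the bluer he gets.
        mat.diffuse_color = (1.0 - cold, 1.0 - cold, 1.0)
        # Each call writes a point onto an F-Curve for that property.
        mat.keyframe_insert(data_path="diffuse_color",
                            frame=float(seconds) * fps)
```

Each keyframe_insert call adds a point to an F-Curve, which Blender then interpolates between frames – which brings us to…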

That’s all for now – we’ll write another blog post on how the code works/will work and the confusing power of the almighty F-Curve soon.

Dylan
