Blender & Unity: Manually Rigging Blender Humanoid Characters for use with Unity Mecanim


I’m definitely no character animator by trade, but there comes a time when you end up with a Unity project that somehow requires it. There are obviously many automatic rigging methods available (Blender does actually have an auto-rigging system called Rigify for biped humanoids), and you could even download rigs made by other people and plonk them into your scene, but I found that so many of the rigs, including the Rigify one, involve so many complicated bones you don’t need that you end up having to sift through them, deleting unwanted bones, renaming bones, and perhaps coming away with the impression that it is impossible to rig up them bones.

Although it may seem terrifying at the beginning (I’m not an animator or rigging specialist!), I found that, surprisingly, it is not that difficult to manually rig up all your bones if what you have is a very simple humanoid character. You just need to be orderly and stick with the admittedly tedious bone-naming process. (Although our character is blobby, we’re sticking with a humanoid rig because we’re going to use it with the Kinect to sync it with the movement of the human user, and the human user is going to return a humanoid set of values that we’ll need to map onto our character…)

According to the Unity Blog’s post on Mecanim Humanoid:

“The skeleton rig must respect a standard hierarchy to be compatible with our Humanoid Rig. The skeleton may have any number of in-between bones between humanoid bones, but it must respect the following pattern:”
Hips – Upper Leg – Lower Leg – Foot – Toes
Hips – Spine – Chest – Neck – Head
Chest – Shoulder – Arm – Forearm – Hand
Hand – Proximal – Intermediate – Distal

This is the list of all the bones you need (I found it useful to copy and paste in these names directly)

head
neck
collarbone.L
collarbone.R
upperArm.L
upperArm.R
lowerArm.L
lowerArm.R
hand.L
hand.R
chest
abdomen
hips
upperLeg.L
upperLeg.R
lowerLeg.L
lowerLeg.R
foot.L
foot.R
toes.L
toes.R

Optional: eye.L and eye.R

For starters: Ensure that your character model is positioned at the origin and that its pivot point is also at the origin (0,0,0). Make sure you reset the scale to 1 just in case (Ctrl+A > Scale). The hip bone is the key bone in all this, so start by creating one big bone running from the bottom of the hips to the top of the chest. Hit Space, start typing “Subdivide Multi” (Armature), and give it 2 cuts so you get 3 bones. These will form the hips, abdomen and chest bones.
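If you’d rather script this setup than click through it, here is a minimal sketch of the same steps using Blender’s Python API (written against a recent 2.8+ bpy; the object name “Character” and the bone heights are assumptions you would adjust to your own model):

```python
import bpy

# Assumed name of the character mesh object.
character = bpy.data.objects["Character"]
bpy.ops.object.select_all(action='DESELECT')
character.select_set(True)
bpy.context.view_layer.objects.active = character

# Equivalent of Ctrl+A > Scale: bake the object scale back to 1.
bpy.ops.object.transform_apply(location=False, rotation=False, scale=True)

# One long bone from the bottom of the hips to the top of the chest.
bpy.ops.object.armature_add(enter_editmode=True, location=(0, 0, 0))
armature = bpy.context.object
spine = armature.data.edit_bones[0]
spine.head = (0, 0, 0.9)   # roughly the bottom of the hips (a guess)
spine.tail = (0, 0, 1.5)   # roughly the top of the chest (a guess)

# "Subdivide Multi" with 2 cuts: one bone becomes hips, abdomen, chest.
bpy.ops.armature.select_all(action='SELECT')
bpy.ops.armature.subdivide(number_cuts=2)
bpy.ops.object.mode_set(mode='OBJECT')
```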

After you’ve done the main spine bones, you can turn on x-axis mirror.
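(For the script-minded, the same toggle is a single property on the armature data; the property name below assumes a recent Blender version, with the armature as the active object.)

```python
import bpy

# Enable X-Axis Mirror on the active armature so Shift-E extrudes both sides.
bpy.context.object.data.use_mirror_x = True
```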

– Select the ball on top of the bottom bone (the hips bone). Make sure Options > Armature Options > X-Axis Mirror is selected, then press Shift-E to extrude mirrored bones. When you’re in mirror mode, every time you create a new bone, a second one is mirrored on the other side of the X axis. Remember that you’ll have to rename BOTH bones later on – and if you are facing your model head-on, remember that the character’s left (.L) is on your right and its right (.R) is on your left, so name them accordingly.

– Arrange the leg bone into position (you may need to uncheck “Connected” so the leg bone can be moved into the right position, away from the hips). Subdivide Multi (1 cut) this leg bone into two bones, forming the upperLeg and lowerLeg.

– Shift-E to extrude two more bones for the foot and toes, and also add in the collarbone, arm and neck+head bones. Do make sure you keep it all in a standing T-pose (as if the character is standing in the shape of the letter T).

– Ensure that all of your bones are renamed correctly as per the list. If there is an .L bone there must always be a matching .R bone. (A quick scripted way to check this is sketched after these steps.)

– Go into Object Mode, select the character first, and then Shift-select the armature. Press Ctrl+P and select Set Parent To – Armature Deform – With Automatic Weights. Your computer might lag for a second before it’s all connected up.
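Here is a rough scripted equivalent of those last two steps: a sanity check that every bone from the list exists and that every .L bone has an .R partner, followed by the same Armature Deform with Automatic Weights parenting. The object names “Character” and “Armature” are assumptions; rename them to match your scene.

```python
import bpy

EXPECTED = {
    "head", "neck", "collarbone.L", "collarbone.R",
    "upperArm.L", "upperArm.R", "lowerArm.L", "lowerArm.R",
    "hand.L", "hand.R", "chest", "abdomen", "hips",
    "upperLeg.L", "upperLeg.R", "lowerLeg.L", "lowerLeg.R",
    "foot.L", "foot.R", "toes.L", "toes.R",
}

character = bpy.data.objects["Character"]   # assumed mesh object name
armature = bpy.data.objects["Armature"]     # assumed armature object name

# Check the bone names against the list, and that every .L has a matching .R.
names = {bone.name for bone in armature.data.bones}
print("Missing bones:   ", sorted(EXPECTED - names))
print("Unexpected bones:", sorted(names - EXPECTED))
for name in sorted(names):
    if name.endswith(".L") and name[:-2] + ".R" not in names:
        print("No .R partner for", name)

# Ctrl+P > Armature Deform > With Automatic Weights:
# mesh selected first, armature selected last so it is the active object.
bpy.ops.object.mode_set(mode='OBJECT')
bpy.ops.object.select_all(action='DESELECT')
character.select_set(True)
armature.select_set(True)
bpy.context.view_layer.objects.active = armature
bpy.ops.object.parent_set(type='ARMATURE_AUTO')
```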

From there, you’re in the home stretch. Export your Blender model in FBX format and then import it into Unity; in Unity, set the rig to Humanoid (instead of Generic) and, at the bottom of that panel, hit Apply.
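The export can be scripted as well; a minimal sketch, assuming a recent Blender FBX exporter (the output path and the options here are just sensible guesses, not the only valid settings):

```python
import bpy

# Export the scene to an FBX next to the .blend file ("//" is Blender's
# relative-path prefix). Leaf bones are skipped since Unity doesn't need them.
bpy.ops.export_scene.fbx(
    filepath="//character_rigged.fbx",
    use_selection=False,
    add_leaf_bones=False,
    apply_unit_scale=True,
)
```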

Let the wild rigging begin!

See also:
Animate Anything with Mecanim

The Sun Cycle (Documentation of New Work)


⚙ / THE SUN CYCLE
By Debbie Ding

In the rear-view mirror appeared Tezcatlipoca, demiurge of the “smoking-mirror.” “All those guide books are of no use,” said Tezcatlipoca. “You must travel at random, like the first Mayans; you risk getting lost in the thickets, but that is the only way to make art.”
– Robert Smithson, “Incidents of Mirror-Travel in the Yucatan” (1969)

In 1973, the land artist Robert Smithson was in a plane surveying the site of his new work Amarillo Ramp in Texas when his pilot collided with a mountain, killing them instantly. It was a tragically prophetic death for a man who once said that the physical and the mind are on a “constant collision course”. For in the process of making his work, he had inadvertently ended the physical possibility of himself making the work. And for those of us who are deeply interested in land, spaces and places, the physical impossibility of being in more than one place has always been a quandary.

For me, the most striking realization from traveling between Mexico and Singapore is the sheer physical distance between them. I marvel at being able to sit in a plane that flies across continents and oceans, according to the map on the inflight screen, and finally lands on the opposite side of the globe… And then to pick up and hold in my hand a seemingly inconsequential rock on the ground in Mexico… These rocks, stones, soil and dirt have been a silent but constant audience to man’s numerous movements and interventions around the globe.

I saw numerous artefacts at the Museo de Antropología e Historia de Toluca and at the Museo Nacional de Antropología in Mexico City, and in a small bookstore here I also saw miniature replicas of some of these ancient sculptures – from the iconic coiled feathered serpent of Quetzalcoatl to the Aztec Sun Stone, which is also frequently labelled simply as the “Mayan Calendar” in small shops. As an artefact, the “Mayan Calendar” is visually arresting with its detailed glyphs and symbols. The calendar round itself is made up of three interlocking cycles of 365 days, 20 day names, and 13 numbers, and the names of the dates are designated according to the alignment of the three cycles. 52 years will pass before the three cycles align in the same way again. The origins of these calculations also came from the way in which the earth aligns with the sun and the other stars and planets. Although we think of time as a fluid abstraction, it is marked by observable changes in the position of physical matter – the very physicality of the land and earth that we are on.


The Actual Sun Stone, which I went to see at the Museo Nacional de Antropología in Mexico City…

What can you do with a Kinect in one day: The $100 Exhibition

TODAY! My goal is to spend one day making something with the Kinect in time for an impromptu exhibition/show organised by Kathleen and Kent this evening. Given the timeline, it will probably just be some experiment or sketch.

EQUIPMENT: I have a Kinect from my project last year, a Nyko Zoom lens which I picked up from Gamers Hub last week (S$40), and I just updated my trusty old MacBook Pro’s OS to 10.6. Sherwin has a projector, and we improvised a temporary setup with the furniture, a big white cloth for a projection screen, and lots of duct tape…

Installing the Nyko Zoom Lens for XBOX 360 Kinect

Since it was a small space, I thought I would try the Nyko Zoom lens to see if it would help for installations in tiny spaces. When installing it over the Kinect, you will hear a loud click when the lens snaps into place. I took before and after screenshots so you can see the difference in view.


Without Nyko Zoom Lens on (as seen via flkinect demo example)


With Nyko Zoom Lens on (as seen via flkinect demo example)


According to the documentation, this is what the Nyko Zoom is supposed to do: let the Kinect work at a shorter range so it can be used in smaller spaces.

I noticed that the corners are not visible with the zoom, but I have always had problems with those areas (especially for blob detection), so maybe I won’t really miss those spots. Another thing is that some people on the internet seem to be complaining that removing the zoom device will scratch the Kinect’s plastic body. That seems a bit unfair, since the product actually comes with a BIG BRIGHT YELLOW INSTRUCTION SHEET warning you to apply the protective sticker to prevent this from happening. I had no issue with clipping and unclipping the zoom lens, to be honest, so the possibility of scratching the Kinect is a non-issue to me.

My verdict is that the zoom lens provides the comfort of knowing you could reduce the range if you really had to, but I find that the data returned is actually noisier, so I wouldn’t use it for an installation that relies on “blob” data.

as3kinect server

Installation is a breeze. Why did I panic or struggle the last time around? I suppose it’s all about pretending that you know what is going on and then slowly figuring it out along the way.

Download the OpenKinect wrapper v0.9c for Mac OSX. Install it, then open up Terminal and run as3-server.


When you see “### Wait client”, it’s ready.
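If you want to double-check that the server is actually listening before firing up the Flash client, a throwaway socket test does the job. The port number below is only a placeholder of mine, not something from the as3kinect documentation; use whatever port the as3-server output or config reports.

```python
import socket

# Placeholder port -- substitute whatever as3-server says it is listening on.
AS3KINECT_PORT = 6001

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.settimeout(2.0)
    try:
        s.connect(("127.0.0.1", AS3KINECT_PORT))
        print("as3-server is accepting connections")
    except OSError as exc:
        print("could not connect:", exc)
```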


Screenshot of the demo package on the as3kinect site.


my housemate tries the blob detection – success!

as3-server Isochronous Transfer Errors


And then, of course, it had to happen. I spoke too soon. On my MacBook Pro (OSX 10.6), it keeps dropping packets – something I never experienced with OpenKinect on Windows 7 (via Parallels). Not sure what to do about this. Playing with the depth settings causes it to completely stop communicating… At this point I didn’t feel confident that this would stand up to the rigours of being used in an installation, so I scrapped it.

tuiokinect

Tried downloading tuiokinect. TUIO is a popular protocol used for blob detection, fiducial detection, and gestures, and it works brilliantly with many things such as Flash, Processing, Java, etc. I wonder why I hadn’t looked at this earlier!
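(TUIO is just OSC over UDP, on port 3333 by default, so you don’t even need Flash to peek at what TUIOkinect is sending. Here is a small sketch using the python-osc package; that library choice is my own assumption, not something the TUIO demos ship with.)

```python
from pythonosc import dispatcher, osc_server

def handle_tuio(address, *args):
    # /tuio/2Dcur carries cursor (blob) messages: "set" with a session id
    # and normalised x/y position, plus "alive" and "fseq" bookkeeping.
    print(address, args)

disp = dispatcher.Dispatcher()
disp.map("/tuio/2Dcur", handle_tuio)   # cursor / blob profile
disp.map("/tuio/2Dobj", handle_tuio)   # fiducial object profile

# TUIO's default transport is OSC over UDP on port 3333.
server = osc_server.BlockingOSCUDPServer(("0.0.0.0", 3333), disp)
print("Listening for TUIO on udp/3333 ...")
server.serve_forever()
```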


You can use udp-flashlc-bridge to send TUIO messages to Flash.


DISCOVERED THAT IT WORKS OUT OF THE BOX WITH OTHER TUIO CLIENT APPLICATIONS! Now, this is a winner. I opened up my \ application from my solo show in 2010, which also used TUIO, and the blob detection worked immediately. The people working on TUIO are doing something right here; it is seriously awesome.


Here, the gestures demo shows what it can do. After tweaking depth, I can use it to move squares around, rotate them, and make them larger or smaller.

I was going to build a Tesla coil simulation and had scraped together something on the Flash side, but it was impossible for me to mash it together with tuiokinect in the time available, so I ended up just projecting this gestures demo, which seemed to entertain people quite a bit, if not at least induce them to exercise for a few minutes by trying to push a square from one end of the screen to the other.

CALIBRATION: I ended up using flkinect in between to adjust the tilt and other motor functions. If the Kinect is accidentally disconnected, TUIOkinect and Flash will crash! This happened when people tripped over the cabling, which had to be run across the room. I also found myself constantly recalibrating TUIOkinect when people approached it from a different angle!

LOGISTICS: The next issue is that if you want to build Kinect things, the projector cables need to be a lot longer, because the computer and Kinect have to sit at the front of the room and the projector needs to be high enough that people’s heads do not get in the way. We were forced into a slightly awkward, or should I say “cosy”, arrangement of equipment because of cable length; in my previous installation at the MBS ArtScience Museum, a few dozen metres of VGA cable connected the computer at the front of the room to the projector, running all the way up the wall to the ceiling of the exhibition hall, across the entire space to the back, and down to a large metal projector rig hanging a good few metres from the ceiling.

PROOF-OF-CONCEPT: I guess the point of this is that a few years ago I wouldn’t have imagined it would be possible for me to even control or put together something like this on my own, but this basic demo can easily be put together by someone with no formal technical background within one afternoon. If I can put this together in one day, then clearly, given sufficient time and motivation, anyone could build something like the interface right out of Minority Report, or something like Microsoft Surface – entirely from scratch, powered by a few smart open-source sockets and libraries. The next question would then be: but why would you want to do that…?


Thanks to Kathleen and Kent for putting together the show, the other artists for bringing their works, and Sherwin for so graciously having us invade his place until the wee hours of the morning (including that remarkably focused artist talk that inexplicably went on until 3am). Photos I took at The $100 Exhibition can be seen here.

flkinect – simple socket server for kinect and as3 flash


flkinect is a Cocoa application by Koji Kimura that allows communication between Flash and the Kinect. I thought I’d check it out, and realised that it makes it incredibly easy to control some of the simpler functions of the Kinect. I tried it today and it worked instantly. INSTANTLY! WITH ONE CLICK! (Being of a perverse nature, I was almost disappointed it wasn’t harder, or that I didn’t have to understand more in order to make it work!)


Not sure why I hadn’t tried this earlier – I suppose I was set on using Windows because I knew that the computer on-site would be a Windows machine, so I never looked back. Another drawback is that I suspect you will still need something like OpenKinect to access the more useful features of the Kinect. But getting the Kinect and Flash to basically just communicate on Mac OSX is clearly an absolute breeze, unlike my experience trying to install it all for Windows…