Once the concept was approved, what was the overall pipeline?
We would use the concept sculpt, combined with the scans, as a base for retopology. As I said before, we used Project All to retain volumes and details whenever new geometry was built to follow a revised concept. From there, modeling involved a lot of back and forth between Maya and ZBrush.
We used SpotLight to lay out reference images on screen while sculpting. We also used it to give the models a quick PolyPaint pass before texturing, which was mainly done using ZAppLink and Photoshop.
Once we had models and textures, we would bake 32-bit displacement maps using Multi Map Exporter and then render the creatures with 3Delight.
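The core idea behind a displacement bake can be sketched in a few lines: each low-resolution vertex is pushed along its normal by a scalar sampled from the map, and a 32-bit float map keeps signed, un-quantized values so detail can move inward as well as outward. This is an illustrative sketch, not the production pipeline; all names are hypothetical.

```python
def displace(vertices, normals, disp_values, scale=1.0):
    """Offset each vertex along its normal by a sampled displacement.

    vertices, normals: lists of (x, y, z) tuples
    disp_values: per-vertex scalars sampled from a 32-bit float map
    (float maps preserve negative values, so detail can push in
    as well as out, without banding).
    """
    out = []
    for (vx, vy, vz), (nx, ny, nz), d in zip(vertices, normals, disp_values):
        out.append((vx + nx * d * scale,
                    vy + ny * d * scale,
                    vz + nz * d * scale))
    return out

# One vertex with a unit normal along +Y, displaced inward by -0.25:
moved = displace([(0.0, 1.0, 0.0)], [(0.0, 1.0, 0.0)], [-0.25])
# moved == [(0.0, 0.75, 0.0)]
```

In production the sampling happens per-micropolygon at render time (here, in 3Delight), but the vertex-level version shows why the map's bit depth matters.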
How did ZBrush benefit you at that point?
We sculpted everything we could using ZBrush.
The main volumes were polymodeled, but everything from the mid-frequency to the high-frequency details was sculpted in ZBrush. All the blend shapes and the animated displacement maps were also done in ZBrush.
I personally sculpted the creatures EdvardThing and Edvard/AdamThing and their blend shapes. I used the Layer system and worked through the shapes across subdivision levels. This let me quickly export all my shapes at the lowest subdivision and then bake every layer into maps that we animated using the rig and the shader. ZBrush was essential.
What sculpting features stood out as you were creating the models and displacement maps?
Definitely Layers and Project All. All the work on the faces and the muscle deformations benefited greatly from being able to sculpt and store the blend shapes using ZBrush brushes.
If I had done all that work in Maya, it would have taken three times as long and been far more frustrating. ZBrush kept the work artistic. I could move vertices around with the Move Topological brush, and also the Nudge brush, while sculpting blend shapes. I could then subdivide those shapes and sculpt the corresponding details, all in the same application. That way, when animating the facial expressions, each blend shape's value always matched the magnitude of its displacement map, because both came from the same sculpt.
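The synchronization described above can be sketched simply: because the low-resolution blend shape and its baked displacement layer come from the same sculpt, a single weight can drive both, and the high-frequency detail tracks the expression exactly. This is a hedged, minimal sketch with one-dimensional data; the function names are illustrative, not ZBrush or Maya API calls.

```python
def blend_shape(base, target, w):
    """Linear blend of low-res vertex positions: base + w * (target - base)."""
    return [b + w * (t - b) for b, t in zip(base, target)]

def blend_displacement(base_disp, layer_disp, w):
    """Blend per-texel displacement with the SAME weight w, so sculpted
    detail stays in lockstep with the animated blend shape."""
    return [b + w * (t - b) for b, t in zip(base_disp, layer_disp)]

# One slider value drives both the shape and its displacement layer:
w = 0.5
verts = blend_shape([0.0, 0.0], [1.0, 2.0], w)
disp = blend_displacement([0.0], [0.2], w)
```

The design point is that the rig and the shader read the same weight, which is only safe because shape and map were exported from a single sculpted layer.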