MetaDraw Technique: Hand-Drawn Appeal in 3D
I developed a new technique between 2D and 3D rendering for my latest short film “The Train, The Forest”. The goal was to keep the rawness of hand-drawn images while maintaining the improvisational advantages of 3D.
This is the trailer for “The Train, The Forest”:
And here are some tests that came out of a recent workshop on this technique I gave at the Academy of Media Arts in Cologne:
Unsatisfying techniques I tried before:
(Anyone who has tried to achieve a hand-drawn look in 3D knows how disappointing it can be.)
- Importing drawings into 3D space: image planes in perspective are unattractive, and the stroke width varies with camera distance
- Texturing 3D objects with drawings: still the problem of perspective distortion and a 3D look
- Any sort of digital cel shading: still looks very artificial, and the charm and rawness of original drawings are lost
- Filters look like… filters
What I was going for:
- The raw look you get in rough sketches
- “Professional amateurism”
- Individual FPS control for each stroke; smooth movement is less charming for my purpose, and I haven’t seen this crazy mix of frame rates before
- An interplay between hand-drawn images and 3D animation (procedural and manual)
- Crazy complex movements and camera animations without losing the appeal of the hand-drawn elements
Technical outline of my solution:
- Drawing one type of line or form in many different lengths on paper
- Writing software that disassembles the big scan into a database of separate files, sorted by length
- Creating and animating 3D curves in Maya
- Projecting the animated curves’ CV coordinates into screen space → the result is an animated 2D path; the information about which points are connected is exported as well
- Importing the data into Nuke
- Writing a Nuke script that measures the distance between two connected points, chooses the hand-drawn image that best fits in between, then positions/rotates that image accordingly
- Adding more control in Nuke: offsets, random FPS control, scaling, grouping, random color shift, extending lines, etc.
I also came up with a second solution I’ll present in a future post.
1 The Drawing
The drawing can be anything; it just needs to contain separate strokes of different lengths.
I scan the image and create an alpha channel in Photoshop.
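The alpha-extraction step can be approximated outside Photoshop as well. A minimal sketch, assuming a float RGB scan of dark ink on white paper (the function name is my own, not part of the actual pipeline):

```python
import numpy as np

def alpha_from_scan(rgb):
    """Derive an alpha channel from a scan of dark ink on white paper:
    invert the luminance so strokes become opaque.

    Hypothetical stand-in for the Photoshop step described above;
    assumes float RGB values in [0, 1].
    """
    luminance = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
    return np.clip(1.0 - luminance, 0.0, 1.0)
```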
2 Disassembling The Drawing With Object Recognition
I’ve written this standalone Python tool with PyQt and PIL, using scipy/numpy for the algorithms.
The alpha channel is converted to a binary array that the object recognition algorithms in scipy can interpret, and an array of bounding rectangle coordinates is returned. In a second step, before exporting, I multiply the alpha of each exported object with the binary array of only that object, to remove all the other lines that would otherwise appear inside its rectangle.
Finally, the images are sorted by width and saved as a PNG sequence, along with a txt info file that stores the data for all elements: path, width, height, and position in the original image.
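The disassembly step above can be sketched with scipy’s connected-component labelling. This is a hypothetical re-creation, not the actual tool — I’m assuming `scipy.ndimage.label` and `find_objects` do roughly what the post describes:

```python
import numpy as np
from scipy import ndimage

def disassemble(alpha, threshold=0.5):
    """Split a scanned alpha channel into per-stroke crops, sorted by width.

    Hypothetical re-creation of the disassembly tool described above;
    the real version also writes PNGs and a txt info file.
    """
    binary = alpha > threshold
    labels, count = ndimage.label(binary)  # connected-component labelling
    crops = []
    for index, box in enumerate(ndimage.find_objects(labels), start=1):
        # Mask out strokes from *other* objects that overlap this bounding box
        mask = labels[box] == index
        crops.append(alpha[box] * mask)
    # Sort by width so a stroke of matching length can be looked up later
    crops.sort(key=lambda crop: crop.shape[1])
    return crops
```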
3 Animating Curves In Maya
Each line segment represents one stroke. One limitation of my method is the absence of curvature, so I set the degree of all curves to linear in order to get a feel for what I’ll end up with.
I developed many scripts for dealing with curves: jittering along a path, fast rebuilding, a better poly-edge-to-curve function, nCloth simulation on curves, and more. I might write about these in another post.
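As an example of the kind of curve helper mentioned above, here is a minimal, hypothetical jitter function on raw point tuples — the real scripts operate on Maya curves via `maya.cmds`, which I omit here:

```python
import random

def jitter_points(points, amount, seed=0):
    """Randomly offset each coordinate of each point by up to +/- amount.

    Hypothetical stand-in for the Maya jitter script mentioned above;
    a fixed seed keeps the jitter reproducible between runs.
    """
    rng = random.Random(seed)
    return [tuple(c + rng.uniform(-amount, amount) for c in p) for p in points]
```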
4 Projecting The Animated 3D Curves To 2D Screen Space Coordinates
This process happens partly in Maya and partly in Nuke.
The exporter in Maya exports three arrays: the names and initial positions of each point (e.g. curve1_cv01, curve1_cv02, etc.), the information about which points are connected (e.g. curve1_cv01 → curve1_cv02), and, if animated, the point positions for the given frame range.
The screen space conversion happens in Nuke with the Reconcile3D node.
The upside of not doing the conversion in Maya is that the camera can still be changed in Nuke at any time.
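The projection itself boils down to a perspective divide. A minimal sketch of the math Reconcile3D performs, assuming an axis-aligned camera looking down −Z — a real setup would use the full camera matrix:

```python
def project_to_screen(point, cam_pos, focal_mm=35.0, aperture_mm=36.0,
                      res=(1920, 1080)):
    """Project a 3D point to pixel coordinates.

    A simplified sketch of what Nuke's Reconcile3D node computes; assumes
    the camera looks down -Z with no rotation. All parameter defaults
    (focal length, film aperture, resolution) are illustrative.
    """
    x, y, z = (point[i] - cam_pos[i] for i in range(3))
    if z >= 0:
        raise ValueError("point is behind the camera")
    # Perspective divide, scaled by focal length / horizontal film aperture
    fov_scale = focal_mm / aperture_mm
    ndc_x = fov_scale * x / -z
    ndc_y = fov_scale * y / -z * (res[0] / res[1])  # keep pixels square
    px = (ndc_x + 0.5) * res[0]
    py = (ndc_y + 0.5) * res[1]
    return px, py
```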
5 / 6 Importing The Data Into Nuke
Three main node groups with custom attributes are created:
- Point info node group: a list of all point names with 2D coordinates. Each point’s attributes are connected to a Reconcile3D node, which reads its keyframe-animated 3D position.
- Image controller: this node group contains the image sequence of the scanned strokes and an attribute that stores the width values (as a string that is later split; I couldn’t find a good way to store arrays in Nuke nodes). It feeds the draw node group with the images.
- Draw node group: this node reads the positions from the point info group, chooses the appropriate line image, and then rotates/scales it according to the settings (set in the draw node group).
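The core of the draw node group can be sketched as: measure the gap between two connected points, pick the stroke image with the closest width, then derive a rotation and a residual scale. All names here are hypothetical; the real logic lives in Nuke expressions:

```python
import math
from bisect import bisect_left

def place_stroke(p1, p2, widths):
    """Pick the hand-drawn stroke whose width best matches the gap p1 -> p2
    and return (image_index, rotation_degrees, scale).

    Sketch of the draw-node logic described above; `widths` is the sorted
    width list stored on the image-controller node.
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    distance = math.hypot(dx, dy)
    angle = math.degrees(math.atan2(dy, dx))
    # Binary search for the closest width in the sorted list
    i = bisect_left(widths, distance)
    candidates = widths[max(0, i - 1):i + 1]
    best = min(candidates, key=lambda w: abs(w - distance))
    index = widths.index(best)
    # Small residual scale so the stroke exactly spans the two points
    return index, angle, distance / best
```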
With more complexity comes more Nuke node aestheticism:
7 Adding More Control
- Random colors: the hacky version is to put a HueShift with random hue values between the image controller and the draw node. The colors of all lines will vary, but their values also change with each frame.
- Grouping of connections: the exporter reads a custom attribute in Maya that says which draw group a curve belongs to. The draw node then gets more than one input, and each group can have its own settings.
- Random FPS per drawn line
- Extending a line beyond its end point
- Jittering: randomly deviating from the best-suited line image
- Fluid vector export: visualizing fluid vectors with squiggly lines can be fun, as can nCloth simulations.
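The random-FPS control amounts to hold/step sampling: each stroke only refreshes every n frames, with n chosen per stroke. A hypothetical sketch of that idea (the real version is implemented as Nuke expressions):

```python
import random

def held_frame(frame, hold):
    """Return the source frame a stroke should display at `frame` when it
    only updates every `hold` frames (step/hold sampling)."""
    return (frame // hold) * hold

def assign_stroke_fps(stroke_ids, choices=(1, 2, 3, 4), seed=0):
    """Give every stroke its own hold length, so lines refresh at mixed
    frame rates. Hypothetical helper mirroring the Nuke control above;
    a fixed seed keeps the assignment stable across renders."""
    rng = random.Random(seed)
    return {sid: rng.choice(choices) for sid in stroke_ids}
```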
More to come. Thanks for reading!