
Online Course: 
2D Character Rigging in Unreal Engine for Game, Animation, AR/VR, and Virtual Production

Key Points:

2D Rigging

Unreal Engine

Control Rig

ARKit

Live Link Face app

Sequencer

PRAXINOS

Motion Capture

Real-Time Tracking

Dynamic Chain

Drawing Replacements

HTC VIVE

Proposal Summary


I respectfully request a grant of $30,000 from the Epic MegaGrant program to develop the workflow, build the materials, and create an online course providing step-by-step instructions for rigging 2D characters in Unreal Engine using Control Rig. In this free video tutorial series, I will explore the tools and techniques that Unreal offers for rigging 3D characters and show how to apply those techniques to 2D characters. To reach a wider audience, interactive and MoCap-based setups will be covered in this course as well. The unique challenges and needs of 2D character setup will also be discussed, along with the approaches necessary to address them.


As an Assistant Professor of Animation and Motion Picture Science at the Rochester Institute of Technology, I have been a team member of the previous Epic MegaGrant-supported project at RIT to develop the Virtual Production curriculum. This year, I have already secured administrative support for this proposal to be considered part of my research and scholarship, and I hope this proposal can earn your support in building the content and training material for this topic, keeping the long-standing collaboration between RIT and Epic Games fresh and alive.

Overview of the Novel Techniques Covered in this Course

 

  • Rigging two-dimensional puppet characters in Unreal.

  • Applying motion tracking workflows to IK controllers rather than to every bone of a character.

  • Using ARKit face tracking to drive drawing substitutions on 2D puppets, rather than the conventional approach of enabling and driving morph targets.

Course Background and Description 


Real-time rigging is becoming more popular as game engines improve their real-time workflows and incorporate more advanced, yet simplified, methods of rigging 3D characters. These workflows also offer dynamic setups that can be easily implemented into 3D character rigs.

However, since this workflow is still young, various techniques are under development, and there are areas of character setup that remain fresh or unexplored. Rigging 2D characters has always been a subject where each application has its own approach to setting up characters and making them functional and user friendly. Still, each of these techniques in the various 2D applications has its own limitations in building the structure for 2D characters. Although various third-party plugins have been, or are being, developed for these applications, they are still not able to fully leverage the 3D techniques that are well established in 3D software.

As an advanced 3D technical artist with extensive experience in 2D puppet animation, I tend to bring 3D approaches into 2D, applying those techniques to 2D characters to leverage the advanced features that 3D applications offer in the more artistic and stylized world of 2D design. Leading several internationally recognized projects as technical director and character rigger has allowed me to develop new techniques, test them in practical settings, and resolve any minor or major issue that could result in unintended movements of characters, assets, or effects. Examples of these techniques include procedural animation workflows, breaking conventional rigs to make them practical for each situation, creating fake dynamic setups for clothes and hair, and smooth secondary actions.


Previous examples of non-keyframe-based animation approaches:

Left: fully procedural puppet animation in AfterEffects (©Storyline Online); Right: procedural animation test in Unreal (see more)

With the techniques under development in real-time engines like Unreal, the introduction of Control Rig, and the speed at which this tool is being improved and adopted, I have been exploring the functionality of Control Rig, and I propose a workflow for rigging 2D characters with it.

The proposed course covers the entire process, from designing, modeling, and texturing the 2D puppet character in 3D space, to building the joint hierarchy in any standard 3D application, to rigging the character entirely in Unreal using Control Rig. Since the character is designed and prepared in 2D, I demonstrate the steps of breaking the design into layers for puppet setup, and I show how these layers can be exported as individual files with an alpha channel. The 3D modeling involves only basic tools to build simple 3D cards, and I will cover how to apply the extracted textures to these cards and leverage their alpha channels to simplify the modeling process. In this course, I will discuss how 3D character setup approaches can be adapted to a 2D character, as well as the limitations of movement in 2D and how they should be addressed in these setups. The IK workflow, including setting up the pole vector in 2D space and its challenges, will also be covered; a short code sketch of this 2D IK solve follows the figure below.


3D character modeled using simple cards; the 2D design is applied to the cards using the alpha channel
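To make the 2D IK discussion concrete, the following minimal C++ sketch shows the analytic two-bone solve restricted to the drawing plane, with a simple bend-direction sign standing in for the 3D pole vector. It illustrates the underlying math only; the names are hypothetical, and this is not Unreal API code.

```cpp
// Analytic two-bone IK restricted to a 2D (X,Z) plane. A bend sign (+1/-1)
// replaces the 3D pole vector once motion is confined to one plane, so the
// knee/elbow cannot break into the third axis.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec2 { double x, z; };

// Solve for the middle joint (e.g., the knee) given the root, the IK target,
// the two bone lengths, and the bend direction.
Vec2 SolveTwoBoneIK2D(Vec2 root, Vec2 target, double len1, double len2, double bendSign)
{
    double dx = target.x - root.x, dz = target.z - root.z;
    double dist = std::sqrt(dx * dx + dz * dz);
    // Clamp so the chain neither overstretches nor fully collapses.
    dist = std::clamp(dist, std::abs(len1 - len2) + 1e-6, len1 + len2 - 1e-6);

    // Law of cosines: angle at the root between the root->target line and bone 1.
    double cosA = (len1 * len1 + dist * dist - len2 * len2) / (2.0 * len1 * dist);
    double angA = std::acos(std::clamp(cosA, -1.0, 1.0));

    double baseAng = std::atan2(dz, dx);         // direction root -> target
    double boneAng = baseAng + bendSign * angA;  // bone 1 direction, bent to one side
    return { root.x + len1 * std::cos(boneAng), root.z + len1 * std::sin(boneAng) };
}

int main()
{
    Vec2 hip{0, 0}, foot{0.4, -1.6};
    Vec2 knee = SolveTwoBoneIK2D(hip, foot, 1.0, 1.0, +1.0); // +1 = bend forward
    std::printf("knee at (%.3f, %.3f)\n", knee.x, knee.z);
}
```

In Control Rig, the solve itself would come from the engine's IK nodes; the point of the sketch is that a single bend sign replaces the pole vector once the limb is locked to the drawing plane.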

 

A chapter will be dedicated to a topic in 2D character setup that does not exist in 3D workflows and needs to be specifically developed and discussed here: the concept of Drawing Replacements, or Drawing Substitutions, which changes the shape of a part of a 2D puppet from one drawing to another. This can be applied to body parts like the eyes, nose, and hands, and it mimics the functionality of morph targets in a 2D setting. The drawing replacement method is well developed in 2D workflows, but no equivalent workflow exists on 3D platforms. The proof of concept included in this proposal shows two examples of implementing drawing replacements on a 2D character, using materials and material instances to switch between the substitutions (the switching logic is sketched in code after the figure).


Left: two drawing replacements for the hand; Right: switching the visibility of the drawings to suggest different hand gestures
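The switching itself reduces to a one-of-N visibility choice. The C++ sketch below models that logic with plain structs standing in for the per-card material instances; in the actual setup, an opacity-style scalar parameter on each card's material instance would play this role. All names are illustrative assumptions.

```cpp
// Drawing-replacement switching: one integer attribute selects which
// drawing's card is visible, mimicking morph-target switching in 2D.
#include <cstdio>
#include <string>
#include <vector>

struct DrawingCard {
    std::string name;    // e.g., "hand_open", "hand_fist"
    float opacity = 0.f; // 0 = hidden, 1 = visible
};

// Show exactly one drawing and hide the rest.
void SetDrawingIndex(std::vector<DrawingCard>& drawings, int index)
{
    for (int i = 0; i < static_cast<int>(drawings.size()); ++i)
        drawings[i].opacity = (i == index) ? 1.f : 0.f;
}

int main()
{
    std::vector<DrawingCard> hand = { {"hand_open"}, {"hand_point"}, {"hand_fist"} };
    SetDrawingIndex(hand, 2); // the animator keys this index in Sequencer
    for (const auto& d : hand)
        std::printf("%-10s opacity=%.0f\n", d.name.c_str(), d.opacity);
}
```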

 

Rigging 2D characters using Control Rig also brings the opportunity to leverage the dynamic setups that Control Rig offers, and it allows the 2D character to interact with other environmental elements, including real-time simulations, particles, and dynamics. In this course, workflows for creating simple dynamic setups will be demonstrated, and to give the animator full control over the behavior of these dynamic setups, custom parameters will be created and made available for use while animating the characters.
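As a rough picture of what such a "fake dynamics" setup computes, the sketch below uses a damped spring to drag a follower point (a hair or cloth tip, say) behind its animated parent, with stiffness and damping standing in for the animator-facing custom parameters. It is an illustration of the idea under these assumptions, not the course's actual Control Rig graph.

```cpp
// A damped spring follower: automatic overlap and follow-through for a
// secondary element, without hand-keyed animation.
#include <cstdio>

struct Spring1D {
    double pos = 0.0, vel = 0.0;
    double stiffness = 40.0; // animator-facing: how fast it catches up
    double damping   = 10.0; // animator-facing: how quickly wobble dies out

    // Step toward the animated target by dt seconds (semi-implicit Euler).
    void Step(double target, double dt)
    {
        double accel = stiffness * (target - pos) - damping * vel;
        vel += accel * dt;
        pos += vel * dt;
    }
};

int main()
{
    Spring1D tip;
    // The parent snaps from 0 to 1; the tip overshoots and settles.
    for (int frame = 0; frame < 24; ++frame) {
        tip.Step(/*target=*/1.0, /*dt=*/1.0 / 24.0);
        std::printf("frame %2d: %.3f\n", frame, tip.pos);
    }
}
```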

Another topic that will be discussed in this course is how characters can be set up for use in both animation and interactive/game workflows. For the animation workflow, Sequencer will be used; in order to animate the characters in Sequencer, all attributes and controllers will be exposed and available to the animator so they can fully animate every feature of the character. This includes the custom attributes for extra functionality, such as the dynamic setups, as well as the parameters that control switching between drawing substitutions.

Another benefit of rigging 2D characters in Unreal Engine is quick and simple access to the power of interactive design in the game engine. To reach a broader range of users of this proposed workflow, I will dedicate one complete chapter to demonstrating the process of using Blueprints to animate our 2D characters interactively with game controllers. I will specifically use HTC VIVE controllers and trackers to track the hands, feet, and center of gravity of the character, and I will show the process of connecting these controllers to the HTC trackers using Live Link. The figure below shows the result of this approach on a stylized 2D character and demonstrates how successful this workflow is at quickly animating a 2D character from the live performance of real actors.


Live performance capture applied to a 2D rig using five HTC trackers to drive two hand IKs, two foot IKs, and the center of gravity
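One detail of this setup worth isolating is the retargeting of a 3D tracker sample onto the rig's 2D plane. The sketch below assumes the character lives in an X/Z drawing plane and simply drops the depth axis before rescaling into rig units; the plane convention, scale value, and names are illustrative assumptions rather than the exact course setup.

```cpp
// Retargeting a 3D tracker position to a 2D rig: discard the off-plane
// component, then rescale from tracker units (meters) into rig units.
#include <cstdio>

struct Vec3 { double x, y, z; }; // tracker world space (meters)
struct Vec2 { double x, z; };    // rig drawing plane (rig units)

Vec2 TrackerToRigPlane(const Vec3& tracker, double unitsPerMeter)
{
    // Drop the depth (Y) axis so the control stays on the drawing plane.
    return { tracker.x * unitsPerMeter, tracker.z * unitsPerMeter };
}

int main()
{
    Vec3 handSample{0.35, 0.02, 1.20};                  // one live frame
    Vec2 ikHand = TrackerToRigPlane(handSample, 50.0);  // assumed 50 units/m
    std::printf("IK hand control -> (%.1f, %.1f)\n", ikHand.x, ikHand.z);
}
```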

To push this setup further, I will also cover the process of using ARKit to read facial expressions with the Live Link Face app on an iPad and connecting them to our rigged character. The novelty of this approach is that ARKit is conventionally used to drive morph targets on a given 3D character; in this setup, I will show how it can instead be used to switch between drawing replacements. One particular example will be switching between different eye blink drawing replacements by reading the eye blink of a real actor captured on the iPad (sketched in code after the figure below).

It is worth noting that this approach is not limited to the iPad and the Live Link Face app for driving drawing replacements or other aspects of facial animation; it can be expanded to, or used with, other products such as the VIVE Full Face Tracker, VIVE Focus 3 Eye Tracker, VIVE Focus 3 Facial Tracker, or any other device that can communicate with Unreal.


Facial expression capture using ARKit and the Live Link Face app applied to facial elements; Left: brow raise drives the translation of the eyebrow; Center: three drawing replacements for the eye; Right: switching between the three drawing replacements using eye blink values received from ARKit
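The blink example boils down to quantizing a continuous 0-to-1 curve into a discrete drawing index. A minimal sketch, assuming an eyeBlink value like the one the Live Link Face app streams and three eye drawings; the thresholds and names are illustrative:

```cpp
// Map a continuous ARKit blink weight to a discrete drawing index,
// instead of blending a morph target.
#include <cstdio>

int BlinkToDrawingIndex(float eyeBlink)
{
    if (eyeBlink < 0.33f) return 0; // eye open
    if (eyeBlink < 0.66f) return 1; // eye half closed
    return 2;                       // eye closed
}

int main()
{
    const float samples[] = {0.05f, 0.40f, 0.92f}; // values received per frame
    for (float v : samples)
        std::printf("eyeBlink %.2f -> drawing %d\n", v, BlinkToDrawingIndex(v));
}
```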

To offer more flexibility, so that different performances can be adapted to different character proportions and default poses, a Sensitivity parameter is introduced to remap the performer's range of movement onto the 2D rig. This allows immediate, interactive adjustment of the motions, so the performer can test the range of movement and quickly adapt to the limitations of the rig. This is crucial because, in most cases, 2D characters are stylized, with proportions and default poses that are not necessarily realistic. Introducing this Sensitivity parameter also allows adaptation to non-human designs, broadening the application of this workflow, and it can help reduce the effect of jitter captured by the trackers. A short sketch of the remapping follows the figure below.


Sensitivity parameter applied to the translation of the IK hands. The range of the motion-captured data can be adapted to the range of movement of the rig; top left to bottom right: Sensitivity = 0, 0.5, 1, and 2
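Functionally, Sensitivity is a gain on the tracker's offset from its calibrated rest position, applied before that offset is added on top of the control's default pose. A minimal sketch, with illustrative names and the same four values as in the figure:

```cpp
// Sensitivity remap: scale the performer's offset from the calibrated rest
// pose before applying it to the IK control's default pose.
#include <cstdio>
#include <initializer_list>

struct Vec2 { double x, z; };

Vec2 ApplySensitivity(Vec2 trackerRest, Vec2 trackerNow,
                      Vec2 controlRest, double sensitivity)
{
    return { controlRest.x + sensitivity * (trackerNow.x - trackerRest.x),
             controlRest.z + sensitivity * (trackerNow.z - trackerRest.z) };
}

int main()
{
    Vec2 trackerRest{0.0, 1.0}, trackerNow{0.3, 1.4}; // performer raised a hand
    Vec2 controlRest{0.0, 0.8};
    for (double s : {0.0, 0.5, 1.0, 2.0}) {           // values from the figure
        Vec2 c = ApplySensitivity(trackerRest, trackerNow, controlRest, s);
        std::printf("sensitivity %.1f -> control (%.2f, %.2f)\n", s, c.x, c.z);
    }
}
```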

Multiple industry experiences have also led me to the conclusion that occasionally it is best if characters are not rigged in a typical A- or T-pose. There are plenty of examples in which a character's body form has an unconventional default pose. This raises a challenge when these non-typical default poses are animated using motion tracking systems like the HTC VIVE. To address this issue, I will demonstrate an easy way to quickly calibrate the placement of the IK controllers to match the performer's default pose. This also gives the actor/actress the freedom to try different default poses and decide on the one that lets them reach the full range of movement for any unconventional character form, shot, or action.

The process of resetting the default pose by recalibrating the controllers' rotation and orientation also allows the artist to use this workflow for non-human characters: a human actor performs the desired actions, and those human-based tracked motions are then applied to the non-human character. This is possible because of the simplicity of the proposed approach and the fact that the default pose for all IK controllers can be reset at any time (a minimal sketch follows the figure below).


Resetting the default pose of trackers to readjust and conform to different character proportions or non-traditional default poses
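The calibration step itself can be as small as recording each tracker's transform at the moment the performer settles into the character's default pose, then applying only deltas from that reference. A sketch of that bookkeeping, with hypothetical names:

```cpp
// Default-pose calibration: record the tracker's rest reference, then drive
// the IK control with deltas from it, so any performer pose can map onto
// any (possibly unconventional) character default pose.
#include <cstdio>

struct Vec2 { double x, z; };

struct CalibratedTracker {
    Vec2 rest{0, 0};
    bool calibrated = false;

    void Calibrate(Vec2 current) { rest = current; calibrated = true; }

    // Offset from the recorded rest pose; zero until calibration happens.
    Vec2 Delta(Vec2 current) const
    {
        if (!calibrated) return {0, 0};
        return { current.x - rest.x, current.z - rest.z };
    }
};

int main()
{
    CalibratedTracker hand;
    Vec2 sample{0.42, 1.31};  // performer holding the character's default pose
    hand.Calibrate(sample);   // e.g., bound to a controller button press
    Vec2 later{0.52, 1.36};
    Vec2 d = hand.Delta(later);
    std::printf("apply delta (%.2f, %.2f) on top of the rig's default pose\n", d.x, d.z);
}
```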

There are plenty of tracking suits that allow full-body motion tracking, but the simplicity and availability of smaller-scale, low-cost motion tracking devices like the HTC VIVE allow this workflow to be used by a broader range of artists. This method also uses IK setups, which usually prevent the undesired jitter that can occur with conventional motion tracking methods where every joint on a character is tracked. The demonstrated workflow relies on only a few tracked points, such as the IK feet and IK hands, and the IK setup drives the rest of the joints in the hierarchy, resulting in smoother and more stable movement of those joints. This is also favored in 2D animation, since realistic motion capture is not necessarily desired, and IK tracking can give smoother, more stylized articulation of the arms, legs, and spine, complementing the stylized nature of 2D design.


Demonstrating the stability and smoothness of the IK rig, the motion capture process, and the dynamic setup in 2D

Rigging 2D characters in 3D applications also comes with some challenges. Joint rotations must be limited to a single axis to avoid breaking the rig into the third axis; this also applies to dynamic setups and requires attention when setting them up, especially for the pole vectors. The IK controllers are also the only controllers animated through their translation values, so their Y axis must be locked so that they cannot be pushed in depth. In addition, to respect non-straight default orientations of the IK hand and IK foot controllers, the concept of buffer groups will be covered, so that rigging artists understand the best practices for avoiding flips and unwanted rotations on these sensitive IK controllers.
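A combined sketch of both safeguards, under simplifying assumptions (additive transform composition, illustrative names, not Unreal API): the control's translation is clamped onto the drawing plane, and a fixed buffer transform absorbs the control's non-zero default pose so the control itself reads zeroed values.

```cpp
// Two safeguards for 2D IK controls: plane locking and a buffer (offset)
// transform that stores the non-straight default pose.
#include <cstdio>

struct Transform2D { double x = 0, y = 0, z = 0, rot = 0; };

// Safeguard 1: discard any incoming depth (Y) translation.
Transform2D LockToPlane(Transform2D t) { t.y = 0; return t; }

// Safeguard 2: world = buffer (fixed default pose) composed with the
// control's local values. Simplified additive composition; a full transform
// compose would also rotate the local translation by the buffer rotation.
Transform2D Compose(const Transform2D& buffer, const Transform2D& local)
{
    return { buffer.x + local.x, 0.0, buffer.z + local.z, buffer.rot + local.rot };
}

int main()
{
    Transform2D buffer{0.3, 0.0, -1.2, 15.0};    // non-straight default foot pose
    Transform2D control = LockToPlane({0.1, 0.5, 0.2, 0.0}); // stray Y discarded
    Transform2D world = Compose(buffer, control);
    std::printf("foot IK world: (%.1f, %.1f) rot %.1f deg\n", world.x, world.z, world.rot);
}
```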

Integration With Other Tools

Although this workflow is fully functional on its own, with the introduction of PRAXINOS and its implementation in Unreal there is a great opportunity for the proposed workflow to be integrated into the fully functional 2D pipeline that PRAXINOS offers. The integration of PRAXINOS in Unreal proves that real-time pipelines like Unreal offer a robust workflow for the world of 2D. PRAXINOS also streamlines the process of building 2.5D environments and allows the charm and artistic look of 2D to be blended with the flexibility and strength not just of a traditional 3D workflow, but also of interactivity and real-time rendering.

However, although PRAXINOS offers the 2D tools needed to make a 2D film or game, its main focus is on frame-by-frame animation workflows, and it lacks the 2D puppeting tools required for rigging and animating 2D puppets in Unreal. 2D puppeting is a major branch of 2D animation, and there is plenty of demand for these skill sets in industry and institutional settings. In addition, as suggested earlier, 2D puppet-based animation allows limitless modification of movements and offers the freedom to build more animation clips as a project progresses. In the frame-by-frame approach, by contrast, animation clips are locked in as they are drawn and are used as sprites and Flipbooks; if the animations need to be modified, the sequences must be redrawn, and there is no freedom to modify the animations on the fly.

Building puppets in Unreal also allows the integration of motion capture and real-time tracking, which is not possible when traditional frame-by-frame animation is used to build Flipbooks for 2D movement. This offers the flexibility of implementing 2D animation in non-linear projects that require live interaction with assets and rigs. Using this pipeline does not replace what PRAXINOS offers, but rather complements it by merging the flexibility of puppet rigs and animation with what PRAXINOS already provides in 2D workflows.

Project Key Players 

Instructor and Principal Investigator: Meghdad Asadilari

Meghdad Asadilari is Assistant Professor of Motion Picture Science in the School of Film and Animation at the Rochester Institute of Technology, Rochester, NY. Previously, he was Professor of Animation at the Savannah College of Art and Design and Visiting Assistant Professor in the 3D Digital Design department at the Rochester Institute of Technology. 

Meghdad is an award-winning animator who leverages his engineering and technical background to produce innovative artworks and introduce novel approaches to animation workflows, and then brings that expertise to the classes he teaches. As an instructor, he has taught a wide variety of technical courses, including Advanced Rigging, Programming, Particles and Dynamics, Real-time Simulations, and VFX and Compositing. As an artist, he has collaborated with companies and studios in the US and abroad as a rigger, tool developer, and VFX artist. Meghdad's industry expertise and success in 2D puppet animation have also allowed him to teach 2D Digital Animation courses, in which he uses AfterEffects and ToonBoom Harmony to demonstrate the process of rigging 2D characters in these two industry-standard applications.

Meghdad's personal short animations have been recognized at various Academy-qualifying film festivals. He has also been invited to museums such as the Walt Disney Family Museum and the Museum für Kunst und Gewerbe Hamburg to exhibit or present his works and give talks about the techniques used in producing his short animations.

As a member of the RIT team that was awarded an Epic MegaGrant in 2020 for the project Rochester Institute of Technology - MAGIC Spell Studios - RIT Previsualization and Virtual Production Curriculum Development with Unreal, Meghdad developed a comprehensive course on real-time particle simulations in Unreal's Niagara system, covering everything from the foundations of setting up particle systems in Unreal to connecting simulations to VIVE controllers to interactively control the behavior of particle setups. Meghdad has also supervised student-produced projects using the Virtual Production workflows developed at RIT, overseeing the real-time compositing workflows as well as the real-time particle simulation setups used in these films.

Meghdad has received an RIT Provost's Learning Innovation Grant to develop a full online course titled Character Setup and Procedural Animation in AfterEffects. The course covers the workflow of preparing 2D characters for puppeting purposes and rigging them in AfterEffects. The course dedicates a full chapter to procedural animation workflows, using basic expressions to create animation for characters and assets while respecting the principles of animation.

Meghdad is the Animation Director and Lead Animator of the web series Storyline Online, executive produced by SAG-AFTRA. So far, he has directed the animation for six episodes (1, 2, 3, 4, 5, 6). Storyline Online was nominated for a 2023 Children's and Family Emmy Award; Meghdad animated 20 percent of the episodes that premiered during the eligibility period. The animation for this web series relies heavily on 2D puppet character and prop animation, and Meghdad applies his 2D procedural animation approaches and techniques to speed up the workflow and produce polished movement for the characters and illustrations of these children's books.

2D Character Design: Vanessa Sweet

Vanessa Sweet is an award-winning multidisciplinary artist who specializes in illustration and animated short films. She has created animation, illustration, and graphic design for Fandango, Hallmark, "General Hospital", the BBC, and more. Using both digital and physical media, often with hand-painted watercolor textures, she creates hybrid pieces in which texture builds mood and feeling as a subtext to her work.

 

Currently, Vanessa teaches 2D animation as an Assistant Professor in the School of Film and Animation at the Rochester Institute of Technology. She is the recipient of several grants supporting her work, including the 2017 Rasmuson Foundation Individual Artist Grant for her film Wild Woman; her most recent short film, Amplifying Feedback Loop, was awarded a Genesee Valley Council of the Arts grant, a Faculty Engagement and Development (FEAD) grant, and a CONNECT grant through RIT.

Vanessa's independent films have screened at over 70 festivals and conferences internationally and have won numerous awards. She divides most of her time between teaching, raising her two girls, and crafting independent 2D animated works centered on themes of social and environmental advocacy through a feminist lens.

Course Chapters and Topics

  • Character Design Considerations for 2D Puppeting

  • Modeling 2D Character for 3D Rigging

    • Using cards with transparency

    • Edge loops and topology considerations for 2D modeling and skinning 

    • Drawing substitutions as individual geos 

    • Leveraging Z-axis for layering 

    • UV unwrapping and layout

  • Skeletal Setup in 3D

    • Hierarchy and placing joints in 3D

    • Joint orientations - more crucial for IK setups

    • Cleanup and preparation for export

  • Rigging in Unreal Using Control Rig

    • Importing character and joints

      • Updating skeletons in case of further modification in Maya

    • Rigging using Control Rig

      • Arrays for faster FK setup 

    • IK setup 

      • Feet

      • Hands 

      • Pole vector in 2D space

      • Preserving world orientation for IK controllers using Buffer Nulls

    • Dynamic setup 

    • Materials, instances, drawing substitutions, visibility parameter 

    • Attributes to control drawing substitutions in blueprint 

  • Sequencer Setup

    • Setting up custom attributes to switch drawing substitutions

  • Interactive Setup

    • Retargeting 3D tracking to 2D space

    • Live Link

      • Project settings to enable reading the controllers

    • Connecting HTC controllers to IK controllers

    • Connecting iPad ARKit for facial setup 

      • Driving drawing substitutions using facial morph targets 

    • Connecting buttons to change facial expressions and drawing substitutions 

    • Creating the setup for calibrating default pose and setting it up in blueprint

    • Setting up the camera in orthographic mode

Course Prerequisites, Target Audience, and Impacts

The target audience of this training series should have basic knowledge of the 2D puppeting workflow. They should also have a basic understanding of joints, hierarchies, and skinning in 3D, although that process will be covered in the first section of the course. No prior modeling skills are necessary for this workflow, as the models are made from simple primitive geometry and the rest of the character is built from 2D textures.

A basic understanding of the Unreal workflow is a plus, but the only Unreal topics this course focuses on are Control Rig and Sequencer. For the integration of trackers, viewers will be introduced to Blueprints and visual scripting; no prior Blueprint experience is necessary.

 

The setup demonstrated in this course targets three groups of audiences:

 

  1. 3D animators who are seeking to apply their skill sets to 2D characters while bringing their expertise into the real-time pipeline, leveraging the tools that Unreal Engine and Control Rig offer to build and animate characters directly in Unreal. Because of the nature of 2D puppet animation, drawing skills are not necessary; animators can focus on the animation and leave the character design to a 2D artist. This can be especially appealing to 3D animators, for whom drawing is not necessarily a core skill.

  2. 2D animators who are interested in exploring new pipelines for building their 2D puppets in new applications. This will also allow them to use the dynamic systems, physics, simulations, and VFX capabilities of Unreal, which are otherwise inaccessible or less powerful in other 2D or 2.5D applications.

  3. Game designers and AR/VR artists who are planning to build 2D games or interactive AR/VR projects and need to rig 2D puppets directly in Unreal to expand their characters' capabilities and range of performance, as opposed to creating Flipbooks from pre-rendered sprites of pre-animated cycles.

Building characters in Unreal also allows both groups of animators to use the interactive tool sets that Unreal offers, along with the real-time motion tracking capabilities readily available in Unreal, to animate their characters either linearly in Sequencer or in a real-time interactive workflow using simple motion tracking systems like the HTC VIVE, other small- or large-scale tracking systems, or game controllers.

The real-time workflow introduced in this course, connecting a live motion capture setup to the rig, could also benefit Virtual Production workflows, opening the possibility of blending the real world with a stylized 2D virtual world, where the 2D puppets belong either to a fully 3D virtual world or to a more stylized 2.5D world built from 2D multiplane setups that create parallax in the virtual world.

Dissemination

The plan for disseminating the results is twofold:

 

  1. The course is a free online series and will be published on the instructor's YouTube channel. This channel has subscribers across various fields, including 3D rigging, 2D character rigging, real-time particle simulation, interactive setups, and materials and rendering. The instructor's previous tutorials have been handpicked by online platforms such as Autodesk Area, LesterBanks.com, and other professional communities, gaining more than 200k direct views in total.

  2. The developed workflow will be presented in workshops or talks at national conferences, such as the University Film and Video Association (UFVA) conference, to share the results and the workflow with instructors and industry peers.

Budget Breakdown

The requested budget breaks down as follows:

  • Summer salary - $26,000: This covers 2.5 months of summer effort to prepare the course material, record the lessons, and edit the recorded footage.

  • Character and other 2D artwork design - $2,000: This covers compensation for a peer faculty member to design an original 2D character and assets containing the elements and features needed to demonstrate all the proposed techniques.

  • 3D Assistantship - $1,000: This covers hiring graduate assistants to model the 2D character in 3D using planes with proper topology, UVs, and textures. 

  • MoCap Assistantship - $1,000: This covers hiring graduate assistants for the final full body motion capture process to demonstrate the functionality of the setup. 
