The Perfect Avatar

3D Scan Turns You Into A Game Character in Just 4 Minutes


From human to gaming avatar in four minutes: this study from the University of Southern California uses 3D scanning to turn you into a game character with ease.

Remember the last time you spent ages in the character generation screen trying to make your character look a bit like you? Behold, here’s a scientific solution: researchers at the University of Southern California created a program that turns you into a realistic, moving avatar in four minutes.

What the Character Generation Tool Does


The potential of a personal avatar is big: imagine a perfectly scanned avatar that’s linked to your gaming account (e.g. Steam, Microsoft Gamertag, Nintendo Mii, etc.) and becomes available in every game that supports the feature. You could even have a statue of it printed out and sent to your home.

The research team, led by Ari Shapiro, head of the Character Animation and Simulation research group at USC, has created a three-step process to turn a person into a moving game avatar: 3D scanning, conversion of the scan into a character, and animation.

Creating a 3D image is relatively easy. But what about your movements? The created avatars are capable of steering, manipulating objects, and lip syncing, as well as nonverbal behaviors such as nodding and hand gesturing. They aren’t necessarily perfect, though, and that’s what Shapiro is working on. He told the Daily Mail that once people “got over the visual, they say ‘well, that doesn’t look like me, that doesn’t act like me’”.

Shapiro remains very positive about the program. The team sees these avatars as a new level of interaction for gamers and simulation users. They do, however, assume that such avatars would not be used in games that include violence, as players would not want to see themselves being injured or dying. It is likely the team did not realize that’s exactly what many gamers will want.

Three Steps To A Realistic Avatar

  1. 3D scanning. The scan can be made using any sort of scanning technology—including Microsoft Kinect or Intel RealSense.
  2. Automatic rigging software. This converts the 3D scan data into a character. It also lets users reshape the character in terms of weight, height, and similar parameters.
  3. SmartBody. Shapiro developed this platform with Andrew Feng at the Character Animation and Simulation (CAS) group at the Institute for Creative Technologies (ICT). This is the program that teaches a frozen avatar how to move, jump, and smile.
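
The three steps above can be sketched as a simple pipeline. Note that every function and type name below is a hypothetical placeholder for illustration only, not the actual API of the USC toolkit or SmartBody:

```python
from dataclasses import dataclass

# Hypothetical stand-ins for the toolkit's intermediate data.

@dataclass
class ScanMesh:
    source: str          # e.g. "Kinect" or "RealSense"
    vertex_count: int

@dataclass
class RiggedCharacter:
    mesh: ScanMesh
    height_scale: float
    weight_scale: float
    skeleton_joints: int

def scan_person(device: str) -> ScanMesh:
    """Step 1: capture a raw 3D mesh with a consumer depth sensor."""
    return ScanMesh(source=device, vertex_count=50_000)

def auto_rig(mesh: ScanMesh, height_scale: float = 1.0,
             weight_scale: float = 1.0) -> RiggedCharacter:
    """Step 2: fit a skeleton to the scan; scales model the reshaping options."""
    return RiggedCharacter(mesh=mesh, height_scale=height_scale,
                           weight_scale=weight_scale, skeleton_joints=22)

def animate(character: RiggedCharacter, behavior: str) -> str:
    """Step 3: drive the rigged character with a behavior (SmartBody's role)."""
    return f"{character.mesh.source} avatar performs: {behavior}"

# Running the full scan -> rig -> animate pipeline:
mesh = scan_person("Kinect")
avatar = auto_rig(mesh, weight_scale=0.9)   # slim the avatar slightly
print(animate(avatar, "nod"))               # prints "Kinect avatar performs: nod"
```

The point of the sketch is the data flow: each stage consumes the previous stage's output, so the whole process can run unattended once the scan is captured.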


Via: Daily Mail