Contents:

Visualizing Avatars

Expressions, Gestures, and Other Details

Visualizing Avatars

With most VR devices currently on the market supporting only head and hand tracking, the head and hands have been the primary elements making up most avatar visualizations. Even though that omits your neck, body, and legs, users can be remarkably expressive with just their hands and head.

This hasn’t, however, stopped designers from implementing full-body avatars. Some incorporate add-on sensors to track the waist and feet, while others have developed methods of estimating where the body, feet, and shoulders are, based on the tracked head and hand positions.
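As a rough illustration of that estimation approach, here is a minimal sketch in TypeScript, assuming a y-up coordinate system with the floor at y = 0. The Vec3 type, the body proportions, and the estimateBody function are all illustrative assumptions, not any particular engine's API.

```typescript
// Minimal sketch: guess torso, shoulder, and foot placement from the
// three tracked points (head and both hands). All constants are
// assumed, eyeballed proportions.

type Vec3 = { x: number; y: number; z: number };

const add = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x + b.x, y: a.y + b.y, z: a.z + b.z });
const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const scale = (a: Vec3, s: number): Vec3 => ({ x: a.x * s, y: a.y * s, z: a.z * s });

interface BodyEstimate {
  chest: Vec3;
  leftShoulder: Vec3;
  rightShoulder: Vec3;
  feet: Vec3;
}

function estimateBody(head: Vec3, leftHand: Vec3, rightHand: Vec3): BodyEstimate {
  // Orient the torso toward the midpoint of the hands, so the body
  // turns with the user's arms rather than snapping to head yaw.
  const handMid = scale(add(leftHand, rightHand), 0.5);
  const forward = sub(handMid, head);
  const yaw = Math.atan2(forward.x, forward.z);

  // Hang the chest a fixed distance below the head.
  const chest: Vec3 = { x: head.x, y: head.y - 0.25, z: head.z };

  // Offset the shoulders sideways, perpendicular to the torso's yaw.
  const halfWidth = 0.18;
  const right: Vec3 = { x: Math.cos(yaw) * halfWidth, y: 0, z: -Math.sin(yaw) * halfWidth };

  return {
    chest,
    leftShoulder: sub(chest, right),
    rightShoulder: add(chest, right),
    // Drop the feet straight down to the floor beneath the head.
    feet: { x: head.x, y: 0, z: head.z },
  };
}
```

Real implementations typically refine guesses like these with inverse kinematics and smoothing, but even a crude version keeps the torso oriented plausibly as the user moves.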

Here are some examples of how avatars are represented in different VR experiences:

Full-body avatar

from Boneworks by Stress Level Zero, recorded by Node

These avatar designs estimate where the user's body parts are based solely on head and hand locations.

Expressions, Gestures, and Other Details

Aside from a user’s movement, there are additional ways to make avatars feel more human. Facial features, even simulated ones such as a mouth that moves or lights that blink while a user is speaking, or eyes that blink at random intervals, are small details that can make an impactful difference during social interactions. These details help facilitate deeper connections between users and create a more engaging social experience.
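As a sketch of how two of these details might be driven, the snippet below randomizes blink timing and smooths microphone amplitude into a mouth-open value. The SimulatedFace class, the timing constants, and the micLevel input are illustrative assumptions, not a specific platform's API.

```typescript
// Minimal sketch of two "liveliness" details: randomized blinking and a
// mouth that opens with voice volume. Timing values are assumed.

interface FaceState {
  eyesOpen: boolean;
  mouthOpen: number; // 0 = closed, 1 = fully open
}

class SimulatedFace {
  state: FaceState = { eyesOpen: true, mouthOpen: 0 };
  private nextBlinkAt = 0;
  private blinkEndsAt = 0;

  // Call once per frame with the current time in seconds and the
  // current microphone amplitude, normalized to [0, 1].
  update(now: number, micLevel: number): FaceState {
    if (this.nextBlinkAt === 0) this.scheduleBlink(now);

    // Blink: close the eyes briefly at randomized intervals so the
    // pattern never looks mechanical.
    if (now >= this.nextBlinkAt) {
      this.state.eyesOpen = false;
      this.blinkEndsAt = now + 0.15; // a blink lasts roughly 150 ms
      this.scheduleBlink(now);
    }
    if (!this.state.eyesOpen && now >= this.blinkEndsAt) {
      this.state.eyesOpen = true;
    }

    // Mouth: low-pass filter the raw mic level so the jaw doesn't jitter.
    this.state.mouthOpen += (micLevel - this.state.mouthOpen) * 0.3;
    return this.state;
  }

  private scheduleBlink(now: number) {
    // Humans blink every few seconds; pick 2-6 s at random.
    this.nextBlinkAt = now + 2 + Math.random() * 4;
  }
}
```

Smoothing the mic level is the key design choice here: mapping raw amplitude directly to the jaw tends to look jittery, while a simple low-pass filter reads as natural speech movement.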

Here are some examples:

Simulated eyes

recorded by Google Developers

Including simple simulated humanistic features, such as eyes that look around and blink, can make an avatar feel far more real and easier to connect with in a social context.
