Quest’s ‘Body Tracking’ API is just a legless estimate

Quest’s ‘Body Tracking’ API is not what it looks or sounds like.

The Body Tracking API was released Thursday as part of the Movement SDK, which also includes the Eye Tracking API and the Face Tracking API for Quest Pro.

The official Oculus Developers Twitter account announced the release with an illustration from the documentation showing the user’s full body position being tracked. The post has been widely shared – leading many to believe that Quest just got body tracking support – but the name of the API and the illustration are misleading.

The Meta Hand Tracking API provides the actual positions of your hands and fingers, tracked by the headset’s outward-facing cameras. The Eye Tracking API and Face Tracking API provide your actual gaze direction and facial muscle movements, tracked by the Quest Pro’s inward-facing cameras. But the “Body Tracking” API only provides an “upper body skeleton simulation” based on your head and hand positions, a Meta spokesperson confirmed to UploadVR. It’s not actual tracking, and it doesn’t include your legs.

A better name for the API would be Body Pose Estimation. The spokesperson described the technology as a combination of inverse kinematics (IK) and machine learning (ML). IK refers to a class of equations for estimating the unknown positions of parts of a skeleton (or robot) based on the known positions. These equations power essentially all full-body avatars in VR apps today. Developers don’t need to implement (or even understand) the math behind IK, as game engines like Unity & Unreal have IK built in, and packages like the popular Final IK offer ready-made solutions for less than $100.
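To make the IK idea concrete, here’s a toy two-bone solve in Python – just the law-of-cosines step at the heart of this class of solvers. It’s an illustrative sketch, not Meta’s or Final IK’s actual code:

```python
import math

def elbow_angle(upper_len, lower_len, reach):
    # Interior elbow angle (in degrees) so that a two-bone arm
    # (shoulder-elbow-wrist) spans `reach` metres.
    # Clamp the reach to what the arm can physically cover:
    reach = max(abs(upper_len - lower_len), min(upper_len + lower_len, reach))
    # The law of cosines gives the elbow bend directly.
    cos_elbow = (upper_len**2 + lower_len**2 - reach**2) / (2 * upper_len * lower_len)
    return math.degrees(math.acos(cos_elbow))

# A 30 cm upper arm and 25 cm forearm, hand 40 cm from the shoulder:
print(round(elbow_angle(0.30, 0.25, 0.40), 1))  # ~93.2 degrees
```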

Unless you’re using body tracking hardware like HTC’s Vive Trackers, though, IK for VR tends to be inaccurate – there are just too many potential solutions for any given combination of head and hand positions. Meta’s pitch here is that its machine learning model can produce a more accurate estimate, for free. The demo video seems to support this claim, though without the lower half of the body – and with support limited to Quest headsets – most developers probably won’t take Meta up on the offer.
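That ambiguity is easy to demonstrate with a little geometry: fix the shoulder and the hand, and the elbow is still free to “swivel” anywhere on a circle around the shoulder-to-hand axis. A quick sketch (coordinates simplified so the hand lies on the +x axis; illustrative only, not Meta’s method):

```python
import math

def elbow_solutions(upper_len, lower_len, hand_dist, samples=4):
    # Shoulder at the origin, hand on the +x axis at hand_dist metres.
    # The elbow must sit on a circle around that axis: compute the
    # circle's centre offset (a) and radius (r) from the bone lengths.
    a = (upper_len**2 - lower_len**2 + hand_dist**2) / (2 * hand_dist)
    r = math.sqrt(max(upper_len**2 - a**2, 0.0))
    # Every swivel angle on the circle is an equally valid IK solution.
    step = 2 * math.pi / samples
    return [(round(a, 3), round(r * math.cos(i * step), 3), round(r * math.sin(i * step), 3))
            for i in range(samples)]

# Four of the infinitely many elbow poses that match one hand position:
for pos in elbow_solutions(0.30, 0.25, 0.40):
    print(pos)
```

Every row is a physically valid arm pose for the identical hand position – picking the one a real arm would use is where the machine learning comes in.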

However, hints dropped at Meta Connect 2022, along with the company’s own research, suggest that legs will be added in the future.

Speaking to developers, Body Tracking Product Manager Vibhor Saxena said:

“New body tracking improvements will be available in the coming years through the same API, so you can be sure that you will continue to get the best body tracking technology from Meta without having to switch to a different interface.

“We are excited to bring these capabilities to you and are working hard to make body tracking that much better in the years to come.”

During the Connect keynote, Mark Zuckerberg announced that Meta Avatars are getting legs, with a demonstration that was also misleading. Legs will arrive in Horizon later this year, and in the SDK for other apps next year. Saxena confirmed that the Body Tracking API uses the same underlying technology that powers Meta Avatars – which suggests that the API will get legs, too.

You may be wondering: if the Body Tracking API is just an estimate based on head and hand positions, how could it ever integrate legs? Last month Meta showed off research on exactly this, leveraging recent advances in machine learning. The system shown wasn’t entirely accurate though, and had 160ms latency – 11+ frames at 72Hz. That’s far too slow, and the output too imperfect, for you to be able to look down and see your legs in the positions you’d expect. Comments from Meta’s CTO suggest the company may instead use technology like this to show legs on other people’s avatars:

“Having legs on your avatar that don’t match your real legs is very frustrating for people. But of course we can put legs on other people, and you can see that, and it doesn’t bother you at all.”

“So we’re working on legs that look natural to a spectator – because they don’t know how your real legs are positioned – but it’s possible that when you look at your legs you’ll still see nothing. This is our current strategy.”

As we noted at the time, the feature Meta eventually ships may not be as good as this research. Machine learning papers often run on powerful PC GPUs at relatively low frame rates, and the paper made no mention of the runtime performance of the described system.
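For a sense of scale, the 11+ frame figure above is simple arithmetic on the paper’s numbers:

```python
# 160 ms of latency expressed in display frames at Quest's 72 Hz refresh rate:
latency_seconds, refresh_hz = 0.160, 72
print(latency_seconds * refresh_hz)  # 11.52 -> the estimate trails by 11+ frames
```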

