A new approach to DIY motion capture revealed at SIGGRAPH 2018

Xsens, Kite & Lightning, IKINEMA and Unreal Engine to showcase how the iPhone X and Xsens can empower democratized, full-performance motion capture at SIGGRAPH’s Real-Time Live!

Enschede, Netherlands, July 2018 – A new approach to DIY, full-performance motion capture will be showcased at this year’s Real-Time Live! at SIGGRAPH 2018.

The session ‘Democratising mocap: real-time full-performance motion capture with an iPhone X, Xsens, IKINEMA, and Unreal Engine’ will take place on Tuesday 14 August, from 6pm to 7.45pm, at the Vancouver Convention Center’s West Building, Ballroom AB.

Cory Strassburger, co-founder of LA-based cinematic VR studio Kite & Lightning, will demonstrate how an iPhone X, paired with Xsens inertial motion capture technology, can capture full-body and facial performance simultaneously, with the final animated character live-streamed, retargeted and cleaned via IKINEMA LiveAction into Epic Games’ Unreal Engine – all in real time.


Cory Strassburger, co-founder, Kite & Lightning, comments: “Thanks to recent technology innovations, we now have the ability to easily generate high-quality full-performance capture data and bring our wild game characters to life – namely the iPhone X’s depth sensor and Apple’s implementation of face tracking, coupled with Xsens and the amazing quality they've achieved with their inertial body capture systems. Stream that live into the Unreal Engine via IKINEMA LiveAction and you've got yourself a very powerful and portable mocap system – one I'm very excited to show off at SIGGRAPH 2018.” 

Taking a ‘Beby’ character from Kite & Lightning’s upcoming ‘Bebylon’ game, Cory will show on stage how this simple DIY setup, built from accessible technology, can drive real-time character capture and animation. He will demonstrate how the new approach to motion capture does not require applying markers or setting up multiple cameras for a mocap volume; it needs only an Xsens MVN system, a DIY mocap helmet with an iPhone X directed at the user’s face, and IKINEMA LiveAction to stream and retarget (transfer) the motion to ‘Beby’ in Unreal Engine. With this setup, users can act out a scene wherever they are.
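As an illustration of the iPhone side of such a rig (not Kite & Lightning’s actual tooling), the sketch below uses Apple’s ARKit face tracking to read per-frame blend-shape weights from the TrueDepth camera and stream them off the device; the host address, port, and JSON wire format are illustrative assumptions only.

import ARKit
import Network

// Minimal sketch: read ARKit blend-shape weights each frame and stream them
// to a desktop host. Host, port, and payload format are assumptions for
// illustration, not part of the showcased pipeline.
final class FaceCaptureStreamer: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private let connection = NWConnection(host: "192.168.1.50", port: 9000, using: .udp)

    func start() {
        // Face tracking requires a TrueDepth camera (e.g. iPhone X).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        connection.start(queue: .main)
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever tracked anchors update; the face anchor carries ~50 blend-shape weights.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // Map ARKit blend-shape locations (jawOpen, browInnerUp, ...) to plain floats.
        let weights = face.blendShapes.reduce(into: [String: Float]()) {
            $0[$1.key.rawValue] = $1.value.floatValue
        }
        if let payload = try? JSONSerialization.data(withJSONObject: weights) {
            connection.send(content: payload, completion: .contentProcessed { _ in })
        }
    }
}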

The Real-Time Live! session will also cover the implications and potential of this process for future creative projects, revealing how new scalable workflows can empower the games, CG and animation industries at the indie level, without the need for huge budgets.

  • WHAT: ‘Democratising mocap: real-time full-performance motion capture with an iPhone X, Xsens, IKINEMA, and Unreal Engine’
  • WHEN: Tuesday 14 August, 6pm-7.45pm
  • WHERE: Vancouver Convention Center’s West Building, Ballroom AB, at SIGGRAPH 2018

About Xsens

Xsens is the leading innovator in 3D motion tracking technology and products. Its sensor fusion technologies enable a seamless interaction between the physical and the digital world in applications such as industrial control and stabilization, health, sports and 3D character animation. Clients and partners include Electronic Arts, NBC Universal, Daimler, Autodesk, ABB, Siemens and various other leading institutes and companies throughout the world.

Xsens is part of mCube, the provider of the world’s smallest and lowest power MEMS motion sensors, key enablers for the Internet of Moving Things.

Xsens has offices in Enschede, Los Angeles and Hong Kong. http://xsens.com

About IKINEMA LiveAction

IKINEMA LiveAction enables users to live-stream, retarget and clean motion capture data captured during an actor’s performance in a motion capture suit. The technology boasts zero lag: there is no perceptible latency between the actor performing a motion and the retargeted result appearing on the virtual avatar. Characters move with authentic human or creature behaviors and interact with props in their virtual environments, all within Unreal Engine 4. Prominent clients include: Activision, Bethesda, Capcom, CBS Digital, Digital Domain, Disney, DreamWorks Animation, Electronic Arts, Epic Games, Fox VFX, Framestore, Rede Globo, GREE, Inc., Technicolor, Tencent NEXT Studio, 20th Century Fox, Ubisoft and many more. ikinema.com

About Kite & Lightning

Kite & Lightning is a cinematic VR company known for creating immersive computer-generated worlds that blend interactive gaming, social experiences and story. As one of the leaders of the new virtual reality movement, Kite & Lightning built its reputation on original, emotionally charged, transformative experiences such as the award-winning VR Mini Opera Senza Peso. Other noteworthy commercial projects include a 3D VR experience for NBC’s “The Voice” featuring Pharrell Williams, Adam Levine, Blake Shelton & Gwen Stefani; and Lionsgate’s first VR narrative, starring Kate Winslet, Mekhi Phifer, and Miles Teller.
kiteandlightning.la



