SIGGRAPH 2015 Emerging Technologies Demonstrate Transformation of Interactive Trends and Techniques

Program to Showcase the Latest Technologies and Computer Graphics Approaches that Push the Boundaries of Skill and Creativity

CHICAGO — (BUSINESS WIRE) — July 8, 2015 — SIGGRAPH 2015, the annual interdisciplinary education experience and conference on the latest computer graphics and interactive techniques, announces the lineup of this year’s Emerging Technologies program, which will showcase the latest in interactive and graphics technologies. The 2015 Emerging Technologies program will feature projects from various industries that demonstrate how evolving technologies and techniques impact the way we live and work. The 42nd annual SIGGRAPH Conference will take place 9-13 August 2015 at the Los Angeles Convention Center in Los Angeles, CA.


The projects in the 2015 program fall under the theme “Work and Play, Technology that Improves Daily Lives.” Each project and installation exhibits how innovation can improve work environments, make everyday tasks easier, or help make leisure time more enjoyable.

“As technology builds upon itself and becomes cheaper and more widespread, it’s important to see the beginnings of how it was developed,” said Kristy Pron, SIGGRAPH 2015 Emerging Technologies Program Chair. “For this year’s conference, we wanted to find technologies that can be applied to daily life, whether tomorrow or a few years from now. It’s exciting to see first-hand a technology whose development you can follow, knowing it will be relevant to you in the near future. We also wanted to uncover practical applications of emerging technology from various industries, such as automotive or assistive technology, and I believe we’ve done that with the wide range of interactive installations that will be showcased.”

SIGGRAPH 2015 Emerging Technologies demonstration highlights include:

  • An Auto-Multiscopic Projector Array for Interactive Digital Humans
    Presented by Linkoping University and the University of Southern California
    With this installation, users interact with life-size 3D digital human subjects displayed via a dense array of 216 video projectors, which generates images with high angular density over a wide field of view. As users move around the display, their eyes transition from one view to the next, making the system ideal for displaying life-size subjects and allowing natural personal interaction with 3D cues such as eye gaze and spatial hand gestures. Automultiscopic 3D displays let a large number of people experience 3D content simultaneously without special glasses or headgear.
  • Ford Immersive Vehicle Environment
    Presented by the Ford Motor Company
    The Ford Immersive Vehicle Environment (FiVE) is a highly realistic immersive virtual reality system that addresses the unique challenges of automotive design, engineering and ergonomics. FiVE enables a collaborative approach that allows Ford’s program teams to see and understand complex engineering issues from any customer’s perspective while considering aesthetic design, fit and finish, manufacturability and maintenance of a vehicle’s systems.
  • Christie Digital Sandbox
    Presented by Christie Digital
    Christie Digital’s latest technology expands the capabilities of today’s automatic projection-calibration systems. This demonstration applies it to calibrate projection-mapped displays automatically and seamlessly on any surface, whether smooth, complex or fully 3D. Christie Digital’s Sandbox performs automatic alignment of a projection display in less than 30 seconds, even after the projector or the surface is moved (a generic calibration sketch follows this list).
  • Semantic Paint: Interactive Segmentation and Learning of 3D Worlds
    Presented by Stanford University, Nankai University, University of Oxford and Microsoft Research
    This installation is a real-time, interactive system for geometric reconstruction and object-class segmentation of 3D worlds. With this system, a user wearing a consumer depth camera and a virtual reality headset can walk into a room, reconstruct the 3D scene, and interactively segment it into object classes. The user physically interacts with the scene in the real world, touching objects and using voice commands to assign appropriate labels to them.
  • MidAir Touch Display
    Presented by Keio University and the University of Tokyo
    The MidAir Touch Display integrates tactile-feedback technology, acoustic energy distribution, planar phased arrays, ultrasonic fields and aerial images formed through an Aerial Imaging Plate to provide visuo-tactile interaction with bare hands. The project lets users see virtually floating objects with the naked eye and touch them with their bare hands for true interaction (an illustrative phased-array focusing sketch follows this list). This presentation is a SIGGRAPH pick from the DC Expo in Japan.
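
As referenced in the Christie Digital item above, here is a minimal sketch of one common approach to automatic projector calibration: projecting Gray-code structured-light patterns and decoding a camera's view of them to recover which projector pixel illuminates each camera pixel. This is a generic Python illustration, not Christie Digital's actual algorithm; the pattern width, bit count, and the synthetic one-to-one camera in the example are assumptions.

# Minimal sketch of a common automatic projector-calibration step:
# Gray-code structured light, used to recover which projector column
# illuminates each camera pixel. Generic illustration only, not
# Christie Digital's algorithm; pattern sizes are assumptions.
import numpy as np

def gray_code_patterns(width, n_bits):
    """Return n_bits binary stripe patterns (n_bits, width) that encode
    each projector column with a Gray code, most significant bit first."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                              # binary -> Gray code
    bits = (gray[None, :] >> np.arange(n_bits)[:, None]) & 1
    return bits[::-1]

def decode_columns(captured_bits):
    """Given thresholded camera images (n_bits, H, W) of the projected
    patterns, recover the projector column seen at every camera pixel."""
    n_bits = captured_bits.shape[0]
    weights = 1 << np.arange(n_bits - 1, -1, -1)
    gray = np.tensordot(weights, captured_bits, axes=1)    # Gray-code value per pixel
    binary = gray.copy()                                   # Gray -> binary conversion
    shift = gray >> 1
    while shift.any():
        binary ^= shift
        shift >>= 1
    return binary

if __name__ == "__main__":
    # Tiny synthetic check: a camera that sees the projector image 1:1.
    width, n_bits = 1024, 10
    patterns = gray_code_patterns(width, n_bits)            # (10, 1024)
    captured = np.repeat(patterns[:, None, :], 4, axis=1)   # fake 4-row camera
    cols = decode_columns(captured)
    assert np.array_equal(cols[0], np.arange(width))

Once each camera pixel is mapped to a projector column (and, with a second rotated pattern set, a row), the projected content can be warped onto the surface, which is what makes re-alignment after the projector or surface moves fast.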
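The mid-air touch item relies on focusing ultrasound from a planar phased array. The sketch below shows only the underlying principle: phase each transducer so its wave arrives at a chosen focal point in step with all the others, concentrating acoustic radiation pressure there. The array size, element pitch and 40 kHz carrier are assumptions for illustration, not the exhibitors' published hardware parameters.

# Illustrative sketch of the focusing principle behind ultrasound mid-air
# haptics: drive each transducer in a planar phased array with a phase
# offset proportional to its distance from the desired focal point, so the
# waves arrive there in phase. Array size, pitch and the 40 kHz carrier
# are assumptions, not the exhibitors' exact hardware parameters.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at about 20 C
FREQUENCY = 40e3         # Hz, a typical carrier for airborne ultrasound arrays
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def transducer_grid(n=16, pitch=0.01):
    """Positions (n*n, 3) of an n x n planar array lying in the z=0 plane."""
    axis = (np.arange(n) - (n - 1) / 2) * pitch
    xx, yy = np.meshgrid(axis, axis)
    return np.column_stack([xx.ravel(), yy.ravel(), np.zeros(n * n)])

def focus_phases(positions, focal_point):
    """Phase (radians) to apply to each transducer so its wave reaches
    focal_point in step with every other transducer's wave."""
    dists = np.linalg.norm(positions - focal_point, axis=1)
    # Advance each element by its propagation phase, wrapped to one cycle.
    return (2 * np.pi * dists / WAVELENGTH) % (2 * np.pi)

if __name__ == "__main__":
    array = transducer_grid()
    phases = focus_phases(array, focal_point=np.array([0.0, 0.0, 0.20]))
    print(phases.round(2)[:8])   # drive-signal phases for the first 8 elements

Modulating such a focal point (or moving it rapidly) is what produces a sensation on the bare hand; the aerial-imaging optics place a floating image at the same location so the user sees what they feel.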

For more information about SIGGRAPH 2015 and the Emerging Technologies program, follow the conference on Facebook, Twitter, Google+, YouTube, Instagram and the ACM SIGGRAPH blog.
