Virtual, Mixed Realities Converge in New Driving Simulator
Cornell Tech News, Fri, 21 Jun 2024
https://tech.cornell.edu/news/virtual-mixed-realities-converge-in-new-driving-simulator/

Portobello, a new driving simulator developed by researchers at Cornell Tech, blends virtual and mixed realities, enabling both drivers and passengers to see virtual objects overlaid in the real world.

This technology opens up new possibilities for researchers to conduct the same user studies both in the lab and on the road – a novel concept the team calls “platform portability.”

The research team, led by Wendy Ju, associate professor at the Jacobs Technion-Cornell Institute at Cornell Tech and the Technion, presented their paper, “Portobello: Extended Driving Simulation from the Lab to the Road,” at the ACM Conference on Human Factors in Computing Systems (CHI) in May. The paper earned honorable mention at the conference.

Co-authors included doctoral students Fanjun Bu, Stacey Li, David Goedicke, and Mark Colley; and Gyanendra Sharma, an industrial adviser from Woven by Toyota.

Portobello is an on-road driving simulation system that enables both drivers and passengers to use extended reality (XR) headsets. The team's motivation for developing Portobello stemmed from its work on XR-OOM, an earlier XR driving simulator system that could merge aspects of the physical and digital worlds but had limitations.

“While we could stage virtual objects in and around the car – such as in-car virtual displays and virtual dashboards – we had problems staging virtual events relative to objects in the real world, such as a virtual pedestrian crossing on a real crosswalk or having a virtual car stop at real stop signs,” Bu said.

This posed a significant obstacle to conducting meaningful studies, particularly for autonomous driving experiments that require precise staging of objects and events in fixed locations within the environment.

Portobello was conceived to overcome these limitations and anchor on-road driving simulations in the physical world. During the design phase, researchers utilize the Portobello system to generate a precise map of the study environment. Within this map, they can strategically position virtual objects based on real-world elements (placing virtual pedestrians near stop signs, for example). The vehicle operates within the same mapped environment, seamlessly blending simulation and reality.
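The paper's actual staging API is not published, but the core idea of the paragraph above, anchoring virtual events to features in a pre-built map of the study environment, can be sketched roughly as follows. All names, types, and coordinates here are hypothetical illustrations, not Portobello's real code:

```python
from dataclasses import dataclass

@dataclass
class MapAnchor:
    """A named real-world feature recorded in the pre-built study map."""
    name: str
    x: float  # meters, in the map's coordinate frame
    y: float

@dataclass
class VirtualObject:
    """A simulated object staged relative to the mapped environment."""
    kind: str
    x: float
    y: float

def stage_near(anchor: MapAnchor, kind: str, dx: float = 0.0, dy: float = 0.0) -> VirtualObject:
    """Place a virtual object at a fixed offset from a mapped real-world feature."""
    return VirtualObject(kind, anchor.x + dx, anchor.y + dy)

# Stage a virtual pedestrian two meters past a mapped stop sign.
stop_sign = MapAnchor("stop_sign_01", x=12.0, y=48.5)
pedestrian = stage_near(stop_sign, "pedestrian", dx=2.0)
print(pedestrian)  # VirtualObject(kind='pedestrian', x=14.0, y=48.5)
```

Because both the design tool and the vehicle operate against the same map, an object staged this way appears at the same physical location in the lab simulation and on the road.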

With the successful integration of Portobello, the team has not only addressed the limitations of XR-OOM but has also introduced platform portability. This innovation enables researchers to conduct identical studies in both controlled laboratory settings and real-world driving scenarios, enhancing the precision and applicability of their findings.

“Participants treat in-lab simulators as visual approximations of real-world scenarios, almost a performative experience,” Bu said. “However, participants treat on-road simulators as functional approximations. [They] felt more stress in on-road simulators and felt their decisions carried more weight.”

Bu said Portobello could facilitate the “twinning of studies” – running the same study across different environments. This, he said, not only makes findings more realistic, but also helps uncover how other factors might affect the results.
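To make the "twinning" idea concrete (a hedged sketch under assumed names, not the team's actual tooling): a twinned study keeps one scripted scenario definition and swaps only the execution platform, so in-lab and on-road runs stay directly comparable.

```python
# Hypothetical sketch of "twinning": the same scripted study definition is
# executed on two platforms, so only the environment differs between runs.
STUDY = {
    "scenario": "pedestrian_crossing",
    "events": [("spawn_pedestrian", {"anchor": "crosswalk_01", "t_sec": 5.0})],
}

def run_study(study: dict, platform: str) -> dict:
    """Execute the study's scripted events on the chosen platform backend."""
    if platform not in ("in_lab_simulator", "on_road_vehicle"):
        raise ValueError(f"unknown platform: {platform}")
    # A real system would dispatch to a rendering or vehicle backend here;
    # this stub just records what would have run.
    return {"platform": platform, "events_executed": len(study["events"])}

# Twin the study: identical definition, two environments.
results = [run_study(STUDY, p) for p in ("in_lab_simulator", "on_road_vehicle")]
```

Any difference between the two result sets can then be attributed to the environment rather than to a change in the study design.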

Said Ju: “We believe that by going beyond running pristine studies and allowing some real-world variability to bleed through, research results will be more applicable to real-world settings.”

Hiroshi Yasuda, a human-machine interaction researcher at Toyota Research Institute (TRI), also contributed to the research, which was supported by TRI and Woven by Toyota.

XR Collaboratory Announces Inaugural Grant Winners
Cornell Tech News, Wed, 17 Jan 2024
https://tech.cornell.edu/news/xr-collaboratory-announces-inaugural-grant-winners/

Director of the XR Collaboratory Harald Haraldsson, PhD student Shuo Feng, and Cornell Tech Assistant Professor Thijs Roumen posing with a custom wire-bent Cornell Tech Twisted T.

By Sarah Marquart

The XR Collaboratory (XRC) at Cornell Tech has selected two winning proposals for its inaugural XR Collaboratory Prototyping Grant, which provides technical guidance and monetary support for developing augmented and virtual reality (AR/VR) applications. The recipients of the new grant are Thijs Roumen, assistant professor of information science at Cornell Tech, who is supporting PhD student Shuo Feng, and François Guimbretière, professor of information science at Cornell University, who is supporting PhD student Mose Sakashita.

The XR Collaboratory at Cornell Tech accelerates activities in augmented and virtual reality through course offerings, projects, and cross-campus collaborations. The team’s primary interests are in 3D user interfaces and interaction design for head-mounted displays. They build high-fidelity prototypes with students through projects and coursework, and collaborate with Cornell faculty on exploratory AR/VR-related research across disciplines such as computer vision, computer graphics, and human-computer interaction.

“Conducting research involving the development of 3D user interfaces for AR/VR can be both challenging and time consuming,” says Harald Haraldsson, Director of the XR Collaboratory. “Our aim with this grant is to provide resources to help with applying best practices to AR/VR prototyping for research purposes. This includes how to develop an inventory of modular, reusable components, being consistent across the research team in the use of design patterns and code style, all with the objective of speeding up the prototyping process and ensuring continuity from project to project.”

Cornell Bowers CIS Professor François Guimbretière and PhD student Mose Sakashita posing with the VRoxy robotic system.

Roumen and Feng’s XR project focuses on wire bending, a crucial aspect of industrial manufacturing in which items like springs, paperclips, and hooks are mass-produced. Recently, there’s been a growing interest in custom wire bending for medical applications, entertainment, soft robotics, and fabrication support. To enhance the design process and meet this need, the team has proposed building an AR interface to design wire-bent structures.

Guimbretière and Sakashita’s ongoing work explores the potential of integrating VR with physical robots to enhance remote collaboration. It’s based on a prior initiative by the pair called the VRoxy system, a mobile robot with a monitor for a head that can be present on a remote user’s behalf. While VRoxy can mirror the user’s non-verbal cues — such as pointing or face tilting — in real-time, the bot can’t physically interact with objects, limiting certain types of collaboration. Guimbretière and Sakashita’s proposal includes addressing these limitations by implementing a 3D user interface that enables the remote VR user to navigate larger environments and manipulate physical objects with robotic arms.
