Speaker Q&A: Mahdi Azmandian on the science and engineering of redirected walking in VR

Mahdi Azmandian is a Ph.D. candidate at the University of Southern California and will be at VRDC 2017 to present his talk The Science and Engineering of Redirected Walking, which will cover what redirected walking is and how it’s deployed in practice. Here, Azmandian gives us some information about himself and his work.

Attend VRDC Fall 2017 to learn about immersive games & entertainment, brand experiences, and innovative use cases across industries.

Tell us about yourself and your work in VR/AR

I’m a Ph.D. candidate at the University of Southern California. Working under the supervision of Dr. Evan Suma Rosenberg at the ICT Mixed Reality Lab, I have focused my dissertation on developing a generalized framework for Redirected Walking in immersive virtual environments.

Throughout my career as a researcher, most of my attention has been spent on what I’d like to refer to as “the engineering of illusions.” A great deal of finesse and craftsmanship goes into executing the perfect magic trick, and Redirected Walking is no exception. Getting the machinery right is what makes the difference between a captivating illusion and a gimmick that just falls flat.

While VR often aims to faithfully replicate reality, it can also be used to trick us into believing things that are not true. Things become really interesting when we start manipulating our senses to solve fundamental problems of virtual reality. This is the key idea behind Redirected Walking.
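For readers unfamiliar with the technique, here is a minimal sketch of the per-frame idea behind Redirected Walking: small, sub-threshold rotations are injected into the virtual world so the user physically curves away from room boundaries while perceiving a straight walk. This is written as illustrative Python rather than the toolkit's actual Unity API, and the gain values and function names below are assumptions for illustration, not published perceptual thresholds.

```python
import math

# Illustrative values only; actual detection thresholds are an active research topic.
ROTATION_GAIN = 1.2            # assumed scale applied to the user's physical head turns
CURVATURE_RADIUS_M = 7.5       # assumed physical radius for curvature-based steering

def rotation_injection(delta_yaw_rad, delta_pos_m, dt_s, steer_sign):
    """Extra world yaw (radians) to add this frame.

    delta_yaw_rad -- how much the user physically turned this frame (radians)
    delta_pos_m   -- (dx, dz) physical head translation this frame (meters)
    dt_s          -- frame duration (seconds)
    steer_sign    -- +1 or -1, the direction that curves the user back toward safe space
    """
    # Rotation gain: when the user turns their head, rotate the world slightly
    # more than they did; below a perceptual threshold the surplus goes unnoticed.
    from_rotation = (ROTATION_GAIN - 1.0) * abs(delta_yaw_rad)

    # Curvature gain: while the user walks, rotate the world slowly so that
    # walking "straight" in VR traces an arc of radius CURVATURE_RADIUS_M
    # in the physical room.
    speed = math.hypot(*delta_pos_m) / dt_s
    from_curvature = (speed / CURVATURE_RADIUS_M) * dt_s

    # A simple controller applies whichever injection is larger this frame.
    return steer_sign * max(from_rotation, from_curvature)
```

In practice, a steering algorithm decides `steer_sign` each frame (for example, steering the user toward the center of the tracked space), and applies the returned offset to the virtual camera rig rather than to the tracked head pose itself.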

Without spoiling it too much, tell us what you’ll be talking about at VRDC

The talk will take an honest look at what Redirected Walking is really about, and what it can and cannot do. As the title suggests, I’ll be talking about both the science of Redirected Walking, meaning how it works and how to improve it, and the engineering aspects, which include how to implement it effectively and how much space you need to deploy it. My goal is to make sure everyone is on the same page with regard to the scope and current state of the art, and, more importantly, can decide whether they want to use it for their own applications and what tools are available to get a head start on development.

What excites you most about VR/AR?

Perhaps the most enticing aspect of VR/AR for me is how it opens the door to experiencing a broad range of emotions through a novel form of expression. Thrilling experiences that would normally be impractical, costly, or even dangerous can be felt in an intimate way, directly through our senses. Pushing matters further, VR/AR can offer entirely surreal worlds with concepts and colors that have no counterpart in the rule-bounded world we live in.

Being on the research side of VR/AR, what I find particularly gratifying is that the outcome of each endeavor manifests in the tangible form of an experience. In this way, sharing your work with the community goes beyond mere education; it provides a very personal way to reach every single person.

What do you think is the biggest challenge to realizing VR/AR’s potential?

I think in the near term the cost of access to VR/AR is the greatest hurdle. The more ubiquitous it becomes, the more people can become familiar with it and appreciate what it has to offer. Broader access also allows more creative minds to explore its capabilities and develop applications that connect with various audiences. Eventually, much like the internet, each of us can find our own way of using VR, and our own set of niche applications that please our palate.

What’s the biggest limitation of tracked spaces in VR?

Despite the many advances in tracking technologies, wide-area tracking is still prohibitively costly. This is perhaps partly because moving up from room-scale tracking isn’t a simple matter of duplicating resources. The high-framerate, low-latency requirements of VR/AR are difficult to meet when more sensors are involved and a greater number of components need to be synchronized.

Beyond just determining accurate tracking information, the transmission of data is also a persistent issue. Until robust wide-range wireless solutions for video transmission become available, we need to cope with the vexing task of managing lengthy cables.

All things considered, I am excited to see how the landscape of tracking will evolve with inside-out tracking solutions hitting the market.

The Redirected Walking Toolkit is open source. Has the collaborative element of open source helped with development of the software?

I don’t have the full picture of how broad the toolkit’s reach is, but we’ve had a number of correspondences with institutions including MIT, Georgia Tech, Unity Research Labs, UCF, UT Dallas, the University of Hamburg, and the University of Hong Kong. The feedback we’ve received has helped greatly with finding and fixing bugs in the toolkit. More importantly, our discussions have led to ongoing plans for collaboration with VR researchers to incorporate their advancements in the field into the toolkit and extend its capabilities even further.

Register for VRDC Fall 2017 to hear more about the Redirected Walking Toolkit from Mahdi, and join other creators of amazing, immersive experiences at the premier industry event.