The era of spatial computing is here. Augmented and virtual reality are more accessible to creators, designers, and end users than ever before, bringing new opportunities for applications and digital experiences that enhance the world as we know it. This is especially true of mobile computing: over 75% of Android users and 94% of iOS users have a device capable of accessing augmented reality easily over the web, without the need to download an app. With advanced spatial awareness capabilities like lidar on the most recent smartphone models, spatial computing represents a paradigm shift similar to the one we saw when the original iPhone changed the way we think about mobile design and development. In this talk we’ll explore how IA and UX practitioners can get started in spatial computing through the lens of a real-world augmented reality case study.
Recently, our team of designers and data engineers took on a challenge: create an augmented reality interface for first responders that leverages IoT sensor data to improve communications and speed critical decisions during a disaster response. Our first step was to identify and organize the universe of data relevant to the disaster scenario, a flood emergency. By collaborating early with our data team to organize and map the information to real-world use cases, we identified an opportunity to develop a predictive model that could help first responders in the field visualize critical data in the context of the physical world, highlighting the areas of greatest risk and need.
To address the need to make decisions and take action quickly, we further leveraged augmented reality to visualize the predictive models created by our data team, showing not only present conditions but also where the areas of most critical need were likely to emerge in the future. This created the opportunity to make proactive decisions faster about things like road closures, business and residential notification, and deployment of personnel, with the potential to operate more efficiently and, most importantly, save lives.
While the project is an early proof of concept, bringing information architecture and cross-disciplinary collaboration into the design process early proved invaluable, helping team members new to the technology become familiar with both the opportunities and the challenges of spatial computing and immersive experiences.
To conclude, we’ll review how to apply these familiar IA and UX practices, combined with new techniques for working in 3D, to a broad range of augmented and virtual reality applications such as education, training, immersive storytelling, and more.
About the speaker
Jordan Higgins is a human-centered engineering lead at MITRE. Jordan specializes in spatial computing and augmented, virtual, and mixed reality experiences, with an emphasis on ensuring accessibility and fostering collaboration with emerging technologies. Active in the UX and immersive communities, he serves as President of UXPA DC and an organizer of DCUX, and teaches design as an adjunct at the George Mason University School of Art.