Redefining spacecraft mission collaboration at NASA with mobile virtual reality
Capstone project for the MHCI+D program at the University of Washington sponsored by NASA Jet Propulsion Lab with Edward Roberts, Eugene Meng, Terrie Chan
Interface Design, Visual Design, Research, Concept Generation
Mar 2017 - Aug 2017 (6 months)
With the help of our sponsors at the NASA Jet Propulsion Lab (JPL) OpsLab, our team set out to design a collaborative tool for scientists and engineers to use in the upcoming Europa Clipper spacecraft mission. Set to launch in the early 2020s, the Clipper spacecraft will perform a series of low-altitude flybys of Jupiter's icy moon Europa to search for the ingredients of life.
To build empathy and a better understanding of the Europa Clipper mission, we performed a series of design research activities with 17 engineers, scientists, domain experts, and NASA mission personnel. We also conducted a competitive analysis of existing immersive applications and tools.
We tested a number of immersive applications for HoloLens, Vive, and Daydream. We also conducted a heuristic evaluation of Google Earth VR (Vive), Galaxy Explorer (HoloLens), and Lunaserv Explorer (a web-based NASA tool) to understand interface conventions for our problem space.
Semi-structured interviews helped our team understand the context in which scientists and engineers collaborate with each other in NASA orbiter missions.
Closed Card Sort
A remote closed card sort allowed our team to assess how NASA scientists and engineers prioritize different types of orbiter data in three different mission-based scenarios: safe mode, nominal mode, and unexpected discovery.
Given the complexity of our problem space, we presented NASA engineers and scientists with a visual diagram derived from our primary and secondary research findings. This gave us ongoing critical feedback on our team's understanding of the collaborative relationship between scientists and engineers.
Now it was time to synthesize our findings into a set of user needs and potential opportunity areas to explore further so that we could home in on our target problem.
To make sense of 17 interviews, 4 card sorting activities, and 5 iterative diagrams, our team created a massive affinity diagram. From this method, we generated over 90 themes from over 1,000 codes of data. These themes were distilled into a series of insights, generative research artifacts, and design principles.
Generative Research Artifacts
Some of the qualitative data we collected did not fit into themes or categories and required a more visual interpretation. We therefore designed 4 generative research artifacts to help us synthesize and share our findings with each other and our sponsors. One of these was an orbital map, which I designed.
After analyzing and synthesizing the data we collected, we distilled our findings into a set of opportunity areas, insights, and design principles. All of these can be viewed in great detail in our research report. We further refined these findings into a set of 4 user needs:
Need for Time Sensitive Tools
Given the ramp-up, ramp-down nature of the flyby missions, tools must accommodate the emotional valence, time pressure, and differences in workload.
Need for New, Yet Proven Tech
In the face of mission uncertainty, there’s conflict between NASA “heritage”, or a tendency to gravitate towards tools and operations with a proven track record, and a need for new tools and processes.
Need for Social Translucence
Despite the geographic separation, mission personnel will go to great lengths to communicate face to face for the purpose of mission success.
Need for a Unified Visualization
There’s a communication chasm between instrument science teams and spacecraft engineers, so much so that middlemen have to relay information back and forth. Additionally, different data sets are meant to be viewed and understood together for validation purposes.
Remotely Located Mission Team
We also learned from our research that over half of Europa Clipper mission personnel will be located in different parts of the country during the mission. We decided that this would be an excellent problem space to design around.
For a more thorough look at our research and synthesis, please download our design research report.
To translate our research findings into meaningful design concepts, we went through a rigorous ideation process to create a breadth of ideas that we narrowed down to one concept using our research findings.
Through multiple rounds of timed rapid sketching, we generated over 90 concepts.
Using our design principles and research findings, we narrowed down our concepts through voting and a series of 2x2 matrices.
Here's a storyboard I drew demonstrating our selected VR concept: a Mobile VR Timeline Simulation. Because it aligned with our principles, user needs, and insights, we decided to move forward with it for testing.
To share our storyboard with our target users at NASA JPL, we created a video slideshow with voice-over audio to simulate how our design could be used during Phase E of the Europa Clipper mission. We used this as a means of testing the design's context of use and the information needs of JPL users.
To make sense of the mobile experience, we created wireframes and a mobile architecture of the design.
We also created a site map for the VR experience as well to help us better define our interface.
Before creating our UI components, we created a flow demonstrating how the mobile experience would transition to a VR experience.
We also designed a system diagram to demonstrate how our concept would be integrated with current NASA operational tools.
Now that we had chosen a final design, it was time to refine our storyboard and sketches into high fidelity concepts and prototypes to share with our stakeholders at NASA.
SpaceTalk is a mobile virtual reality application that allows engineers, scientists, and mission personnel to remotely collaborate inside a virtual simulation of the mission. Think “Skype” or “FaceTime”, but in virtual reality. Collaborators can discuss complex mission information as if they were in orbit, next to the spacecraft.
Here are some high-fidelity screens that I designed for our group to demonstrate different VR features to our stakeholders at NASA. These screens were derived from specific tasks and information needs that JPL users wanted out of our design concept.
For a more detailed breakdown of our design concept, please download our official design specification.
Mission personnel can quickly and easily request a “VR Call” with other collaborators with a specific orbit in mind via their smartphone. Once accepted, they place their smartphone into a VR headset to be transported to a specific point in time in the mission.
Instead of forcing users to find the right data at the right distance, the application features 5 information zones in which collaborators can quickly view mission information at optimal distances.
New, Yet Proven
Although there are many different types of VR, we thought mobile VR fell into this sweet spot: one that utilizes a proven and familiar device that we all love and use every day (smartphones), but does so in an entirely new and immersive way that could simulate a mission environment.
To keep users grounded in an entirely new immersive 3D experience, SpaceTalk makes use of familiar 2D visual systems and interface design conventions as seen on smartphones, laptops, and desktops, but in a way that introduces the user to the z-axis.
Using VoIP, SpaceTalk seeks to mimic face-to-face conversations as much as possible with Mobile VR.
Avatars are used to convey the input collaborators have in the virtual world, in particular their remote position and head orientation.
SpaceTalk not only simulates the spacecraft in its mission environment at different points in time, it augments the simulation with additional meaningful data visualizations. In this particular example, collaborators can see the states of their instruments shown along the orbital path in context, information that would normally have to be accessed using separate tools, as we found in our research.
To present our concept to a general audience, I helped my team design a poster that summed up our project.
To share our concept with our sponsors and stakeholders at NASA Jet Propulsion Lab, we created a concise video prototype to demonstrate how SpaceTalk could be used in a future spacecraft mission.
Perhaps the most exciting aspect of this project is the impact it has created at NASA Jet Propulsion Lab thus far. Ever since we first began to share our research findings with the OpsLab and the Europa Clipper team, we’ve received much praise for our work, so much so that our findings and subsequent prototypes have been shared repeatedly amongst various mission team members.
To quote a JPL data visualization scientist: “You guys have uncovered things that took me 6 months to learn at NASA - and you guys are remote!” Additionally, a lead tool developer at JPL stated, “You have both shown me new insights about [NASA JPL] and affirmed my hunches.”
After giving our official project presentation at NASA JPL in Pasadena in late August, we were excited to see many NASA mission team representatives at our talk expressing interest in the design concepts we shared. A mission planner in particular expressed much interest in our instrument schedule visualization. Additionally, a trajectory engineer for Juno said, “I really wish Juno had VR software working for us.”
From what we understand, using immersive technologies for spacecraft operations has seldom been explored prior to SpaceTalk. Given the exciting conversations and collaborative relationships that SpaceTalk has sparked, I wouldn’t be surprised to see a variation of it used for Europa Clipper and future spacecraft missions.
I’m honored to have worked on such an awesome project with incredible teammates and sponsors. As a result of our hard work, the UW MHCI+D program will have an additional project sponsored by NASA JPL next year. I’m excited to see what the new cohort will come up with.