What Operating Rooms Can Teach Us About 'Calm' Design
Evie Powell is a virtual reality engineer and the lead UX designer for Proprio, a surgical imaging company that fuses human and computer vision to create a powerful system for surgical performance and training. She spoke to Doreen Lorenzo for Designing Women, a series of interviews with brilliant women in the design industry.

Doreen Lorenzo: When did you first realize you were interested in design?

Evie Powell: I spent a lot of time as a kid deeply engaged with my computer. I started out with simple arcade games, the Pac-Mans, the Marios, and all of that. Then in middle school and early high school I played Japanese RPGs that involved developing a character from a very low level. I eventually started playing games that were only released in Japan, and that got me into learning Japanese when I was in college.

When I attended the Game Developers Conference for the first time as an introvert, a business-card trading game shifted me into a different mindset: suddenly, talking to people to collect and trade cards was something I really wanted to do. That's where my research into games really took root and began to direct my career trajectory.

I did my PhD at UNC Charlotte on socially pervasive game experiences, where my research focused on how to create a game that never really turns off. I studied how computing would change, and how I thought design, specifically software-engineering design and software user-experience design, would have to change as computing becomes increasingly ubiquitous and contextually aware. As computers have gotten smaller and more powerful every year, the capabilities they enable and the way we think about computing have fundamentally changed. It's like having a library at your fingertips. And games are no different: we use games and play to learn better. Games structure how we gather information and potentially learn and create new things.
So how do you create a game with no spatial, temporal, or social boundaries? What does that mean for the person who's making the game, and what does that mean for the user? That's what my research was about.

Dr. Powell uses a VR headset to test the placement of a virtual screw into a model of a human spine. [Photo: courtesy Proprio]

DL: What led you down the path from gaming to healthcare?

EP: After my PhD, I ended up working on natural user interfaces and the Kinect technology at Xbox. Ultimately, my research in games always focused on helping people learn, play, and experience things differently by shifting mindsets. In healthcare, anticipating how a surgeon needs to think and designing a suite of tools that empowers them to think and perform optimally is a great next step for me and my research.

DL: Tell us more about what you're working on now at Proprio.

EP: We've built a robotic surgical system, guided by many cameras and sensors, that can focus on a surgical site and create a digitization of whatever surface anatomy you're looking at. In a process called light-field rendering, we capture light data from multiple directions and use it to provide a physically accurate representation of the scene, which the surgeon can interact with in VR from every possible viewpoint. This real-time representation can also be fused with preoperative imaging and other patient data to further enhance a surgeon's vision in the operating room. You could think of this as the next generation of a microscope. Usually a surgeon has to press their head up against the microscope lens in order to see the close-up view of…
This article first appeared on fastcompany.com.