More Than Virtual Fun and Games

Published on March 21, 2018 by Aaron Dubrow

Deena Rennerfeldt, a Ph.D. candidate at MIT, wanted to understand how stem cells interact, so she developed a virtual reality tool that gave her a petri-dish-eye view of how cells coalesce, grow and evolve in real time.

Marvis Cruz and Grace Rodríguez, students from the University of Florida and the University of Puerto Rico, wanted to help doctors and nurses understand how individuals with epilepsy feel after a seizure, a time when they frequently experience powerful visual and auditory distortions. Working with visualization experts at the Texas Advanced Computing Center (TACC), they created a virtual reality tool that approximates those symptoms in virtual space and can be used by health care providers to increase empathy.

Zoltan Nadasdy, a neuroscientist at The University of Texas at Austin, wanted to understand how grid cells in the brain help orient us in the world. To do so, he created a virtual reality tool that tracks the brain activity of individuals with implanted electrodes while they move through virtual space.

When virtual reality (VR) and augmented reality (AR) are extended beyond video games and into the world of research and knowledge representation, they expand our understanding of the world and present new perspectives.

About five years ago, TACC — already a leader in scientific visualization — turned its attention to VR and AR technologies to explore how they can be used to advance research. Pilot projects enabled by the center have shown the potential of immersive visualization to extend knowledge.

From historical recreations to augmented air traffic control, the results are impressive.

Virtual time travel to Jane Austen's world

The first major VR project that TACC partnered on was "What Jane Saw" — an effort led by Janine Barchas, an English professor at UT Austin. The project lets individuals time travel two centuries back to an 1813 exhibition of works by Sir Joshua Reynolds that novelist Jane Austen attended.

Barchas originally developed a three-dimensional model of the three-room gallery with its 141 Reynolds paintings for a website dedicated to recreating this 1813 blockbuster. TACC staff, intrigued by the project, asked if they could collaborate to transform the website's environment into a virtual reality experience.

Initially she resisted. "I associated VR entirely with video games and didn't think of it as a pedagogical tool," she said. However, when they began to ask interesting questions — like how tall Jane Austen was (to set eye level), or what direction the gallery faced (to calculate sun position) — she realized there might be some research and educational value in giving the technology a try.

Over the next few years, Barchas and TACC staff worked to accurately depict the building's geometry, the precise location of each portrait, the perspective of a virtual Jane Austen, and even the play of light and shadow on the gallery walls, inside an interactive VR framework.

Using Unity3D, a game-creation engine, TACC researchers integrated the model with the Oculus Rift, a virtual reality head-mounted display, and more recently with the HTC Vive, to create a fully immersive virtual reality experience that lets users move through the gallery and identify relationships between portraits more naturally.

Students have been enthusiastic and vocal about the value of the VR experience in tandem with the informational website, Barchas said.

"Students noticed things in VR that they didn't realize on the website, such as the scale of the paintings and the spatial relationships between them -- how paintings were in conversation with others across the gallery," Barchas said.

Barchas herself is interested in how emerging technologies can be used to bring the past to life, as well as to unlock the future.

"Using this technology, we're stripping the prejudice — that only STEM is about the future — and giving history a new shine," she said. "We're using a holodeck and current tools to shed light upon the humanities generally, and Jane Austen's environment and history as it looked in 1813 specifically. It's proven to be a very mutually productive experience."

Studying how the brain orients us in space

Zoltan Nadasdy had been studying spatial memory in animal brains when he first had the opportunity to study spatial cognition — sometimes called the "GPS of the mind" — in human subjects.

Doctors from University Medical Center Brackenridge (now UT's Dell Seton Medical Center) in Austin, Texas, were monitoring individuals with epilepsy to localize the sources of their seizures. The patients were wired with electrodes in their brains, but the doctors used only a small fraction of the data, since most of the time the patients' brains showed normal activity.

Nadasdy realized this was a golden opportunity to explore how the human brain processed information. He requested permission to work with these patients to explore a range of phenomena — some related to epilepsy, others related to everyday experiences.

In particular, he wanted to know whether the grid cells first discovered in rats also operate in human brains. However, his subjects could not move freely, so to test their spatial processing, he developed a video game in which individuals moved around a virtual space. The first iteration used a tablet computer, but more recently Nadasdy has been developing a VR version of the game.

"With virtual reality, participants don't need to leave the bed. I can just put an Oculus on them and they play the game," Nadasdy said. "These games are so immersive that they emulate the real navigation experience."

TACC staff helped Nadasdy fine-tune his VR experiments. He tested the VR prototype in the TACC Visualization Laboratory and analyzed the experimental data on TACC's high-performance computers.

His experiments have shown that grid cells in human brains work in subtly different ways from those in rats. In both species, grid cells map the surrounding space and place the individual within it relationally, like a personalized Google Map; in humans, however, they also respond to the scale of the area, differentiating small, indoor spaces from large, outdoor ones. In general, the human brain adapts to the spatial context of the environment better than the rodent brain does.

Nadasdy is currently working to develop a VR scuba-diving experience to determine whether grid cell mapping is two-dimensional or three-dimensional. He envisions a range of VR-based experiments that explore the mind in ways that normal experiments cannot.

"What if I switch your environment from one second to another?" he asked. "You're in the office and then one second later, you're at the Giza pyramids. How does the brain adjust? We could never do this experiment in the real world, but with VR we can easily make all sorts of transformations and test how the brain adjusts to those changes."

Immersing newsreaders with Immerj

"What Jane Saw" and Nadasdy's grid cell research represent individual research collaborations, but TACC is also developing tools to serve a whole community.

In 2015, the center partnered with the UT Austin School of Journalism, the UT3D film program, and the Washington Post on an effort to develop an open-source virtual reality publishing framework for journalists who lack the engineering skills to create and publish their own VR content.

The first tool they created, called Immerj, makes it easy to embed text or 3D models within 360-degree virtual reality video. Creators can add content elements — including in-scene text, images, 3D models, spatial audio, and video-in-video — to their reporting and place the objects so they appear as they would in VR.
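
The article doesn't describe Immerj's internals, but the placement step rests on standard geometry: 360-degree footage is rendered on a sphere around the viewer, so an editor's chosen direction becomes a 3D anchor point just inside that sphere. Here is a minimal sketch of that math in Python (the function name and coordinate convention are illustrative assumptions, not Immerj's API):

```python
import math

def place_overlay(yaw_deg, pitch_deg, distance=0.9):
    """Map an editor's (yaw, pitch) pick to a 3D anchor position.

    Uses a y-up, z-forward convention; distance < 1.0 keeps the overlay
    just inside the unit-radius video sphere so it renders in front of
    the footage.
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    x = distance * math.cos(pitch) * math.sin(yaw)
    y = distance * math.sin(pitch)
    z = distance * math.cos(pitch) * math.cos(yaw)
    return (x, y, z)

# Example: anchor a caption 30 degrees right of center and 10 degrees
# above the horizon.
print(place_overlay(30, 10))
```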

"We spent a lot of time with stakeholders to find out what they would be interested in," said Brian McCann, a member of TACC's Visualization Interfaces & Applications group who helped lead the effort. "The secret is not adding too much, but still allowing journalists to do what they want."

The software was developed by five students from UT Austin under the guidance of experts from TACC's Visualization group and with support from the university and the Knight Foundation. The team presented a prototype of the tool at the Online News Association annual meeting in Miami, at the Texas Tribune Festival in Austin (both 2016), and more recently at the 2017 IEEE-VR Conference in Los Angeles.

"We decided to team up with the Texas Advanced Computing Center and its visualization lab," said R.B. Brenner, a professor of journalism at UT Austin, former editor at the Washington Post and one of the leads on the project. "Now we are working with computer scientists and engineering students. They learn from us; we learn from them. All of a sudden we had this powerful collection of talents. That is what has elevated our project."

Starting in January 2018, journalists at leading media organizations and universities, including Stanford, Syracuse, the Washington Post, NPR, NBC Network-owned stations, Dallas Morning News and the Texas Tribune, will begin beta testing Immerj in their newsrooms.

Said McCann: "With Immerj, we want to democratize these newly available immersive technologies, thus helping storytellers reach their audience with a more impactful sensory experience."

Augmenting scientific inquiry with augmented reality

Scientific visualization brings research data to life, but it still frequently lies flat on a computer screen, making interpretation difficult.

Augmented and virtual reality, on the other hand, provide another dimension through which researchers can experience their data.

In recent years, TACC staff members have begun developing applications for the Microsoft HoloLens that let scientists interact with their computer models in new ways.

At the 2017 International Conference for High Performance Computing, Networking, Storage and Analysis, they unveiled a proof-of-concept demonstration that allowed scientists to see a plasma model, developed by University of Texas physicist Wendell Horton, evolve over time in virtual 3D space.

"The dynamics of plasmas is complicated due to the interaction of the electric and magnetic fields with electrically charged ions and electrons. This leads to complex vortex structures in three dimensions that are rather impossible to understand with two-dimensional projections," Horton said. "Augmented and virtual reality tools give us a new way to really see what is taking place in these evolving complex nonlinear vortex plasma structures."

Scientists can move around the virtual object as the plasma transforms from a coherent cylinder to a complex, undulating tornado of energy. New insights emerge in this more naturalistic context.

To bring the augmented reality visualization to life, experts from TACC converted Horton's plasma datasets into a form that could be ingested into the Unity platform, a leading framework for AR content creation. They then overlaid text and audio, making a sharable AR experience that members of Horton's group, collaborators or other researchers can view and interpret.
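
The conversion itself isn't detailed in the article, but a common pattern is to extract an isosurface from each simulation timestep and export it as a mesh in a format Unity imports natively, such as OBJ. Here is a rough Python sketch of that pattern, with a synthetic Gaussian blob standing in for the plasma density data:

```python
import numpy as np
from skimage import measure  # scikit-image

# Synthetic stand-in for one timestep of a plasma density volume.
x, y, z = np.mgrid[-2:2:40j, -2:2:40j, -2:2:40j]
density = np.exp(-(x**2 + y**2 + z**2))

# Triangulate the surface where the density crosses a chosen threshold.
verts, faces, _normals, _values = measure.marching_cubes(density, level=0.5)

# Write an OBJ mesh that Unity (or Blender) can import directly.
with open("plasma_timestep.obj", "w") as f:
    for vx, vy, vz in verts:
        f.write(f"v {vx} {vy} {vz}\n")
    for a, b, c in faces:
        f.write(f"f {a + 1} {b + 1} {c + 1}\n")  # OBJ indices are 1-based
```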

"The HoloLens project with Dr. Horton served as a successful proof-of-concept for augmented reality scientific visualization," said Greg Foss, one of the lead visualization researchers on the project. "His team is now considering how to best take advantage of the technology."

Horton's fusion model is not the only application that TACC has experimented with. Staff have also created AR representations of physics-based cloud models and developed a tool that lets air traffic controllers perceive planes in the sky even while they are indoors.

VR for data visualization

In addition to collaborating with researchers and developing new tools, TACC is teaching scientists how to design their own custom VR software.

In December 2016, McCann, Kelly Gaither, director of visualization at TACC, and Matthew Vaughn, TACC's director of life sciences, traveled to Cold Spring Harbor Laboratory to lead a 10-day workshop on "Immersive Approaches to Biological Data Visualization."

At the workshop, 10 researchers from around the world learned the principles of information and scientific visualization and then used these principles to develop immersive VR and AR projects. Researchers worked with Unity 3D and Blender, as well as scripting environments such as Bash and Python, to craft data visualizations for the Oculus Rift virtual reality headset and Leap Motion gesture sensor.
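
As a taste of the workshop's approach, the sketch below shows the Blender-plus-Python pattern at its simplest: a script, run inside Blender, that maps a table of data points onto spheres in a 3D scene. It is illustrative only, not workshop code, and the input file and column names are assumptions:

```python
import csv
import bpy  # Blender's embedded Python API; run this inside Blender

with open("/tmp/points.csv") as f:
    for row in csv.DictReader(f):
        # One sphere per data point, scaled by the point's magnitude.
        bpy.ops.mesh.primitive_uv_sphere_add(
            location=(float(row["x"]), float(row["y"]), float(row["z"]))
        )
        s = float(row["size"])
        bpy.context.active_object.scale = (s, s, s)
```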

For Deena Rennerfeldt, who had no experience programming for VR, the process of developing her petri dish VR tool was remarkably easy. More importantly, it helped her gain insights into the behavior of stem cells, and communicate her results, in ways that traditional data analysis or visualization could not.

As stem cells age, they change, which can make them ineffective when they are cultured and reinserted into a patient.

"One of the challenging things with my work is conveying the extent of heterogeneity in a group of cells that are supposed to be identical," Rennerfeldt explained. "Making a VR environment where you're the size of the cells was a cool way to visually represent the extent of heterogeneity that never has been done before."

Her VR tool translated lab data of individual cells into moving, growing, and changing blobs that, she noticed, clump together in fascinating ways.

"Some of the larger cells seemed to have groupings of smaller cells around them," she said. "Having the cells represented as spheres, instead of their two-dimensional form, was helpful for me. I'm now taking away a lot of information, seeing only their dynamics in size, location and generation."

Rennerfeldt plans to adapt the VR experience for groups of cells and will explore this grouping phenomenon in further lab experiments.

Using data from McGill's 3D Human Genome project, Travis Sheppard, a front-end web developer in the Cherry Lab at Stanford University, developed a VR interface to explore the three-dimensional structure of DNA and chromatin, the material that packages DNA and gives it its compressed shape.

"The 3D structures are easier to navigate in virtual reality," he said. "Also, for science communication and education purposes, the VR aspect makes the content appealing to a younger audience."

Another scholar, Maxime Hebrard, a post-doctoral bioinformatics researcher at RIKEN in Japan, developed a tool to search through scientific literature, finding related, relevant studies and annotations with ease.

"This technology opens a lot of new perspectives to the field," he said. "You can use an almost infinite workspace, spread in all directions, and interact with your data in new ways that are be more intuitive."

Rennerfeldt said she felt like she was on the bleeding edge of something that could be huge. "It has an enormous amount of potential for communicating science in a completely different way... It's an exciting new frontier that I want to be a part of."

Sheppard concurred. "The field is moving so quickly. It's really up to us to write the next chapter of the book."


Writing the next chapter, and helping researchers write it themselves, is what TACC intends to do.

TACC staff continue to experiment with emerging tools and technologies and to seek out research collaborations that would be advanced by VR and AR technologies.

"Virtual and augmented reality capabilities will continue to expand how we explore and discover, both in scientific inquiry and in our daily lives," said Paul Navrátil, Director of Visualization at TACC. "We are proud to be pushing the art of the possible here at TACC in conjunction with a broad and diverse set of partners."


Story Highlights

TACC — a leader in scientific visualization — is working to develop virtual reality and augmented reality technologies that can help researchers develop new insights. Efforts include collaborations with literature professors, neuroscientists and atmospheric scientists.

In addition to assisting researchers, TACC has developed tools for journalists and is teaching immersive visualization workshops for biologists.

The effort extends VR and AR beyond video games and entertainment and into the realm of research and knowledge representation, expanding our understanding of the world and presenting new perspectives.


Contact

Faith Singer-Villalobos

Communications Manager
faith@tacc.utexas.edu | 512-232-5771

Aaron Dubrow

Science and Technology Writer
aarondubrow@tacc.utexas.edu

Jorge Salazar

Technical Writer/Editor
jorge@tacc.utexas.edu | 512-475-9411