Experiments in Virtual Reality at W&L’s IQ Center

by Paul Low and David Pfaff

The Integrative and Quantitative (IQ) Center at W&L is a collaborative space where new technologies are made available to the entire campus – often before we know exactly how those technologies will be used. A recent example is virtual reality (VR). At W&L we have been experimenting with the technology for a couple of years, starting with cell phone-based VR systems such as Google Cardboard. In the past year, however, the price and quality of VR hardware have reached a point that has attracted a much wider audience, including retail consumers and small schools like ours, and this year we upgraded to a dedicated VR headset, the HTC Vive. These new headsets provide a compelling and immersive way to visualize and interact with content, but there is very little educational content currently available, especially for higher education. This means that, for the time being, getting the most out of these systems requires either creating original content or adapting existing material to work in VR. Fortunately, when it comes to visualization, many of the workflows we have developed over the past few years for generating and manipulating 3D content (molecular modeling, 3D animation, motion capture, photogrammetry, geographic information systems, 360-degree photography and video, etc.) translate well to VR platforms with a little work and a healthy respect for the current limitations of the hardware. Developing interactive scenes for VR takes a little more work and some specialized skills, but the potential for creating educational tools that facilitate active and blended learning at all levels of education is virtually limitless.
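
To give a sense of what that "little more work" looks like in practice, here is a minimal sketch of the kind of script that makes an object "grabbable" in a Unity-based VR scene, which is a common environment for HTC Vive development. The class name, the "Controller" tag, and the Grab/Release hooks below are illustrative assumptions, not code from our projects; a real Vive setup would wire these methods to the controller's grip button through the headset's Unity plugin.

```csharp
using UnityEngine;

// Minimal sketch of a "grabbable" VR object, assuming a Unity scene where each
// hand controller carries a trigger collider tagged "Controller". All names
// here are hypothetical and for illustration only.
[RequireComponent(typeof(Rigidbody))]
public class GrabbableObject : MonoBehaviour
{
    private Transform hoveringController;   // controller currently touching this object
    private Rigidbody body;

    void Awake()
    {
        body = GetComponent<Rigidbody>();
    }

    // Remember which controller is close enough to grab this object.
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Controller"))
            hoveringController = other.transform;
    }

    void OnTriggerExit(Collider other)
    {
        if (other.transform == hoveringController)
            hoveringController = null;
    }

    // Called by the controller's input script when the grip button is pressed.
    public void Grab()
    {
        if (hoveringController == null) return;
        body.isKinematic = true;                  // let the hand drive the motion
        transform.SetParent(hoveringController);  // object now follows the controller
    }

    // Called when the grip button is released.
    public void Release()
    {
        transform.SetParent(null);
        body.isKinematic = false;                 // hand the object back to physics
    }
}
```

The same pattern (a small script per behavior, driven by controller input) scales up to the interactive biochemistry scenes and "grabbable" brain scans described below.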

Since we are so new to VR, most of our projects can be generously described as “ongoing”; nevertheless, this summer we had our first team of student VR developers – a group of incoming first-year students participating in the month-long W&L Advanced Research Cohort (ARC) program. Their project involved capturing their own motion while performing various exercises (running, yoga, etc.) and then visualizing and analyzing that movement in VR. This fall, we developed our first VR “homework assignment” with Jill Leonard-Pingel for her 100-level “General Geology” class, and most recently we worked with a couple of teams from Gregg Whitworth’s “Molecular Mechanics of Life” class to create interactive scenes that require “player” input to complete complex biochemical reactions in VR. Our ongoing projects include faculty and students in many STEM fields as well as dance, digital humanities, and theater. The video below shows some of our ongoing projects in VR (most demonstrated by Ashley Ooms ’17), in addition to those mentioned above:

  • Examples of interactive structural biology models (catalyzed phosphorylation reaction)
  • Photogrammetry model of the Liberty Hall Ruins (on the W&L campus) and a laser scan model of a Woolly Mammoth downloaded from the Smithsonian, both viewed at 1:1 scale
  • Viewing crystal structures in 3D (from the virtual homework assignment mentioned earlier)
  • Interactive scene developed for a group project in an upper-level biology class (Molecular Mechanics of Life)
  • A 1:1 scale version of downtown Lexington, VA in 1867 created in SketchUp by former VMI French Professor Ed Dooley
  • “Grabbable” MRI scans of the brain from the “Glass Brain” project
  • Motion capture animation from a dance class taught by Jenny Davies