Zach Duer is the Immersive Environment Specialist at the Institute for Creativity, Arts, and Technology (ICAT) at Virginia Tech. He facilitates faculty and student projects in ICAT facilities by developing interfaces and generating content. He is also an educator, artist, musician, and performer.
The meteorological visualization project now has an app to show off the work we did! You can use your phone to see the tornado visualizations, and if you have a Google Cardboard or similar VR headset for a phone, you can see them in VR!
It’s currently Android only, and it requires your phone to have a gyroscope to work.
The ICAT SEAD grant funding for the project BelleIIVR – Subatomic Particle Physics in VR is coming to a close. Over the last year, we successfully created a virtual reality visualization of the Belle II detector and tested it in the Cube with physics students. Now, we’re working to integrate the interactive visualization into the undergraduate nuclear & particle physics curriculum at Virginia Tech. We made a video to describe the project and show off our new footage:
When BelleIIVR is used at Virginia Tech, we have the privilege of using the Cube as a lab for untethered, locomotive VR. However, we would also love for people around the world to be able to learn and get excited about particle physics using this visualization. So, we’ve created versions that can work just with the Oculus and a controller, or even just with mouse and keyboard at a computer. Eventually, we may even release a version for mobile platforms. Leo Piilonen will be taking an Oculus version of the simulation to the Belle II General Collaboration Meeting, for the Belle II team as a whole to see. Besides becoming part of the curriculum at Tech, we’re also excited to apply for additional funding to continue building out the simulation and adding more features.
A few days ago, about 30 high school students interested in VR came to visit Virginia Tech. They toured Doug Bowman’s 3D Interaction Lab in the Perform Studio, the DAAS studio, and the ARC Visionarium. They were a great group – very enthusiastic about the opportunities here at Tech. Here’s a photo album of their visit they were kind enough to share!
Two of the major projects I’ve spent time on this year, VR Physics and Vauquois, were present at ICAT Day. I will be posting more soon showing the outcomes of these projects as we finish wrapping up for the semester, but for now I will just post a couple of photos from ICAT Day.
Professors Chris Williams (Mechanical Engineering, Engineering Education) and Tim Long (Chemistry), along with grad student Joseph Kubalak, are interested in using the Cube to explore the deviation between 3D models and the 3D printed physical manifestations of those models. So far, we have explored two models – a tissue scaffold (normally about thumbnail size) and a dental model. These models were printed and then CT scanned. The scans were converted into meshes and analyzed for the difference between the scanned version of the physical object and the original 3D model. That analysis was applied to the models as color. At that point, they sent me the scanned meshes (with vertex color) and I tossed them into Unity so we could explore them in the Cube with our tetherless VR setup. Here are two quick videos showing what they’ve been looking at:
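The deviation-to-color step can be sketched in a few lines. This is purely illustrative (the actual analysis was done in dedicated mesh-comparison software, not by me): for each scanned vertex, measure the distance to the nearest point on the original model and ramp a color from blue (no deviation) to red (maximum deviation). The function names and the brute-force nearest-point search are assumptions for the sketch.

```python
# Illustrative sketch: per-vertex deviation between a scanned mesh and the
# original model, encoded as vertex color. Blue = on the model, red = at or
# beyond max_dev. Brute-force search; real tools use spatial acceleration.
import math

def nearest_distance(p, reference_points):
    """Distance from point p to the closest point in the reference cloud."""
    return min(math.dist(p, q) for q in reference_points)

def deviation_colors(scanned_vertices, model_vertices, max_dev):
    """Map each scanned vertex's deviation to an RGB triple in [0, 1]."""
    colors = []
    for v in scanned_vertices:
        d = nearest_distance(v, model_vertices)
        t = min(d / max_dev, 1.0)          # normalize and clamp
        colors.append((t, 0.0, 1.0 - t))   # blue -> red ramp
    return colors

# Tiny example: one scanned vertex sits exactly on the model, one is offset.
model = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
scan = [(0.0, 0.0, 0.0), (1.0, 0.5, 0.0)]
print(deviation_colors(scan, model, max_dev=1.0))
```

The resulting per-vertex colors are exactly the kind of data that rides along on the mesh into Unity, where a vertex-color shader displays them.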
Over in the VT library, Michael Stamper was recently hired as the Data Visualization Designer & Consultant for the Arts. It made sense for Michael and me to start a dialogue and figure out how we can help build the collaborative work between the library and ICAT. As a starting place, we decided to create a 3D VR visualization of the Map of Science (http://www.mapofscience.com/, http://sci.cns.iu.edu/ucsdmap/).
This is the first iteration – a direct translation of the ucsdmap data set into virtual space.
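A "direct translation" like this can be as simple as keeping the map's 2D layout coordinates and using one node attribute as height. The field names below are assumptions for illustration, not the real ucsdmap schema:

```python
# Hypothetical sketch: lift a 2D science-map layout into 3D by keeping the
# layout x/y on the ground plane and using a node attribute (here, an
# assumed per-node paper count) as height. Y-up, as in Unity.
def to_3d(nodes, height_divisor=1000.0):
    """nodes: list of dicts with assumed 'x', 'y', and 'papers' keys."""
    return [(n["x"], n["papers"] / height_divisor, n["y"]) for n in nodes]

nodes = [
    {"x": 1.0, "y": 2.0, "papers": 5000},
    {"x": -3.0, "y": 0.5, "papers": 1200},
]
print(to_3d(nodes))  # [(1.0, 5.0, 2.0), (-3.0, 1.2, 0.5)]
```

Later iterations could replace the flat plane with something that takes better advantage of the room-scale space.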
We have to figure out where we want to go from here, but it was an excellent first step!
Although I haven’t posted in a while, a lot of projects are continuing to develop. In fact, there are a number of posts I will be making in the coming week with some project updates and new projects.
For today, I just wanted to show some images of the sports field Lucas Freeman (Creative Technologies MFA student) modeled for us. Although this isn’t an official project yet, I’ve been working with Robin Queen in BEAM and Todd Ogle in TLOS to start prototyping two sports simulations. The first is a soccer simulation for VR, which Robin can use in her biomechanics lab to make users feel like they’re actually on the field, in a real soccer situation. She can then capture data on how they react to investigate various sports injuries. The second is a football simulator, designed for quarterback training. The quarterback might be in the Cube or in the Perform studio, wearing a headset and seeing various game scenarios. They then throw the ball (into a net!) based on those stimuli. That’s the very basic outline of the ideas, and you’ll hopefully continue to see those take shape over the coming months.
We already have the functionality in place for these things, so the next step was to get some modeling done. See the images below. Next week, we will do some mocap to capture animation for the virtual players that will go into the simulation. Cool stuff, great job Lucas!
This Wintermester, Tanner and I co-taught an Honors College course hosted in the ICAT Cube and Perform studios. The goal was for students to learn the ins and outs of these facilities and their systems, and to create projects that combine at least two of the following: motion capture, virtual reality, immersive projection, spatial audio. They got to be the first folks to make use of our VR backpacks, which was really fun to see. It was definitely a trial by fire for all our systems, putting everything to the test, but I’m glad to say that it seems like we built things robustly enough that everything held together.
The class was intense, with 4 hours of class time per day, 5 days per week for two weeks. The students were fantastic to work with. They were very focused and eager, and I think they came away with a strong and novel experience. Their final projects went way beyond our expectations, and we’re very proud of what they accomplished. There were 6 groups and 7 projects.
A virtual reality horror experience in the Oculus HMD with the VR backpack. Walk through a haunted maze and be terrified by the spooky spatialized sounds and terrifying visualizations.
Crowning Achievement (a Pun)
Checkers in outer space, using VR with the Oculus and backpack. Multiplayer included. Pick up checkers pieces with your hand and move them around the board. Spatialized background music is an added bonus.
A game of asteroids, but with 360 immersive projection on the Cyclorama. Hold a wand and point it at the screen to try to destroy the asteroids flying around. The more you destroy, the faster they go. Spatial audio tracks the asteroids as they fly around you, making a satisfying POP when you destroy an asteroid.
An abstract fixed-media art composition, depicting a space of cylinders and mirrors. Projected in stereoscopic 3D on the Cyclorama, with spatial audio. It’s quite an experience to see this metallic, glass world seem to come out of the screen.
Where art, particles, typography, and 360 3D projection meet. The depth of the stereoscopic projection they accomplished here was incredible, and it was really fun to see the first experiments in typography on the Cyclorama. The spatial audio of music and sound clips to accompany the video was also very effective.
Peek A Boo
A short animation of a girl hiding in the Cube, peeking out from behind a curtain. Sounds simple, maybe, until you find out that they accomplished this by using motion capture to record the animation, applying it to a prebuilt model, and then rendering it in 360 3D for the Cyclorama. With the spatial audio call and response with the girl, it puts you in the virtual space.
A recreation of what it’s like to enter Lane Stadium on gameday as a Hokie football player. The spatial audio is the key component here; the students scraped together a series of audio clips they could find to reassemble the aural experience of coming down the tunnel and out onto the field, making great use of the sound system in the Cube. Unfortunately, it doesn’t seem like VT sports has gotten into the 360 video area yet, so the students used a video from another university instead, which looked amazing on the Cyclorama. This would be great material for them to take to VT Sports to try to pitch for the opportunity to take a real recording.
We’ve made a number of tweaks and user interface adjustments to this project, and it’s time to share them.
We now have sprites and colors for EVERY type of particle. It makes it look so much more amazing. Jesse toiled over this during Thanksgiving break. Thanks Jesse!
The user interface is now diegetic, existing in the world of the virtual environment rather than as a heads-up display or through a physical tablet. Right now it just hangs out at a particular spot. But when the user is in VR, being motion-captured in the Cube, they will be holding an Xbox controller which will also be tracked. The virtual menu will take the position of the Xbox controller – so essentially users just need to look at their hands to look at the menu. They then select whatever part of the menu they are looking at with a simple look-and-click interface.
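The look-and-click idea boils down to simple gaze geometry: select the item whose direction from the head is closest to the gaze direction, within some angular tolerance. In the actual Unity setup this would be a raycast against menu colliders; this pure-Python sketch (hypothetical names throughout) just shows the geometry:

```python
# Illustrative sketch of gaze selection: return the item nearest the gaze
# ray, but only if it falls within a small cone around where the user looks.
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def gazed_item(head_pos, gaze_dir, items, max_angle_deg=10.0):
    """Return the name of the item the user is looking at, or None.

    items: list of (name, world_position) pairs.
    """
    gaze = normalize(gaze_dir)
    best, best_angle = None, math.radians(max_angle_deg)
    for name, pos in items:
        to_item = normalize(tuple(p - h for p, h in zip(pos, head_pos)))
        dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(gaze, to_item))))
        angle = math.acos(dot)
        if angle < best_angle:
            best, best_angle = name, angle
    return best

items = [("Play", (0.0, 0.0, 2.0)), ("Model", (2.0, 0.0, 2.0))]
print(gazed_item((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), items))  # Play
```

The "click" half is then just reading a button press on the tracked controller while an item is under the gaze.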
We’ve also added the ability to select particles by looking at them and pressing a button. When you select them, a display pops up over the particle, showing key info. On that display, if you click the Save button, the info for that particle gets saved to a display up on the wall (which is still in the works).
The Model section now has a preset subsection, so you can save and load different configurations of the BelleII model being on and off, along with the relative transparency values.
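A preset like this is just a small serializable record per detector component. The structure and names below are an assumption for illustration, not the actual implementation:

```python
# Hypothetical sketch: a preset stores, per detector component, whether it
# is visible and how transparent it is, round-tripped through JSON.
import json

def preset_to_json(visibility, transparency):
    """visibility: {component: bool}; transparency: {component: 0..1}."""
    return json.dumps({"visibility": visibility, "transparency": transparency})

def preset_from_json(text):
    data = json.loads(text)
    return data["visibility"], data["transparency"]

saved = preset_to_json({"calorimeter": True, "drift_chamber": False},
                       {"calorimeter": 0.4, "drift_chamber": 1.0})
vis, alpha = preset_from_json(saved)
print(vis["drift_chamber"], alpha["calorimeter"])  # False 0.4
```

Loading a preset then just means applying each component's saved visibility and transparency back onto the model.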
There are also two “scoreboards” on the Cube walls now. One shows the current simulation time, so you can glance at it from anywhere in the room. The other allows the user to start and stop a timer with a press of a button on the Xbox controller.
At Leo’s request, we’re also now showing “dead” particle sprites. Instead of a particle sprite disappearing after the data for the sprite has ended, a “shell” takes its place, showing where the particle was when it … passed away? This has caused some new frame rate issues that I’m struggling with. 30k is a lot of individual sprites! They’re not correctly batching right now, which is what I spent most of my day trying to figure out. I learned a few things that increased the frame rate here and there, but it’s still not great yet. It gets worse as the simulation goes on, because more and more sprites are added.
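The reason batching matters so much here is back-of-the-envelope simple: sprites that share a material can be drawn in one call, so the draw-call count should scale with the number of distinct particle-type materials, not with the number of sprites. A toy illustration (hypothetical names, not the Unity internals):

```python
# Illustrative only: if batching works, draw calls = distinct materials.
# If every sprite breaks the batch, draw calls = number of sprites.
def draw_calls(sprites):
    """sprites: list of (particle_id, material_name). One call per material."""
    return len({material for _, material in sprites})

# 30k sprites but only 3 particle-type materials: 3 calls if batched,
# 30,000 if each sprite is drawn individually.
sprites = [(i, ["pion", "kaon", "muon"][i % 3]) for i in range(30000)]
print(draw_calls(sprites))  # 3
```

Anything that gives each sprite a unique material or state silently pushes you toward the 30,000-call case, which matches the worsening frame rate as dead shells accumulate.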
Additionally, the educational side of this project is really starting to kick into gear. Instead of spending most of the meeting time talking about features for the simulation, we’re talking about lesson plan development, which is awesome! That’s exactly where we want to be moving into next semester. We plan on piloting this thing with Society of Physics Students folks immediately following spring break!