In the spring semester, I worked in collaboration with Erika Meitner (poetry), Charles Nichols (music), and Rachel Rugh (dance) to create a multimedia performance in the Cube, What Bends. I used the cyclorama for surround video throughout the piece, and during one section we also used the motion capture system so Rachel could draw video onto the screen with her hands. Charles performed live with electric violin and electronics, using the massively multi-channel sound system in the Cube and also processing and spatially distributing Erika's poetry readings. We performed What Bends five times in total, across March 11 and May 6, 2017. It was a fantastic experience working with them, and we're all pleased with what came out the other side.
Documentation or it didn’t happen:
The Economical and Sustainable Materials Strategic Growth Area may be getting a new building. Andres Salazar Del Pozo, a Master in Architecture candidate, was recruited to create an architectural visualization. He brought his model to me, and I put it into Unity so we can do VR walkthroughs in the Cube! This is still a work in progress as Andres continues to develop the model and we make it look better in Unity, but check out how it's going so far!
With the initial phase of development complete, Belle2VR has garnered interest around the world. We're happy to send the software to anyone who's interested, and we plan to open-source the entire thing soon!
Here are some folks in the High Energy Physics Group at the University of Hawaii Manoa trying it out:
Leo took a laptop and Oculus kit to the Belle II General Collaboration Meeting at the KEK laboratories in Japan and set it up in the lobby:
We hope to continue development and distribute this educational software as widely as possible!
The meteorological visualization project now has an app to show off the work we did! You can use your phone to see the tornado visualizations, and if you have a Google Cardboard or similar VR headset for a phone, you can see them in VR!
It's currently Android-only, and it requires a phone with a gyroscope to work.
Check it out on the Play Store, it’s free!
The ICAT SEAD grant funding for the project BelleIIVR – Subatomic Particle Physics in VR is coming to a close. Over the last year, we successfully created a virtual reality visualization of the Belle II detector and tested it in the Cube with physics students. Now, we're working to integrate the interactive visualization into the undergraduate nuclear & particle physics curriculum at Virginia Tech. We made a video to describe the project and show off our new footage:
When BelleIIVR is used at Virginia Tech, we have the privilege of using the Cube as a lab for untethered, locomotive VR. However, we would also love for people around the world to be able to learn and get excited about particle physics using this visualization. So, we've created versions that work with just an Oculus and a controller, or even with just a mouse and keyboard at a computer. Eventually, we may even release a version for mobile platforms. Leo Piilonen will be taking an Oculus version of the simulation to the Belle II General Collaboration Meeting, for the Belle II team as a whole to see. Besides becoming part of the curriculum at Tech, we're also excited to apply for additional funding to continue building out the simulation and adding more features.
A few days ago, about 30 high school students interested in VR came to visit Virginia Tech. They toured Doug Bowman's 3D Interaction Lab in the Perform Studio, the DAAS studio, and the ARC Visionarium. They were a great group – very enthusiastic about the opportunities here at Tech. Here's a photo album of their visit that they were kind enough to share!
Yesterday was ICAT Day! A lot of amazing projects at the nexus of science, engineering, art, and design from around the university were exhibited. Check out the Flickr photo roll here:
Also, some local news stations were there! Here is their coverage:
Two of the major projects I’ve spent time on this year, VR Physics and Vauquois, were present at ICAT Day. I will be posting more soon showing the outcomes of these projects as we finish wrapping up for the semester, but for now I will just post a couple of photos from ICAT Day.
Professors Chris Williams (Mechanical Engineering, Engineering Education) and Tim Long (Chemistry), along with grad student Joseph Kubalak, are interested in using the Cube to explore the deviation between 3D models and the 3D-printed physical manifestations of those models. So far, we have explored two models – a tissue scaffold (normally about thumbnail size) and a dental model. These models were printed and then CT scanned. The scans were converted into meshes and analyzed for the difference between the scanned version of the physical object and the original 3D model. That analysis was applied to the models as color. At that point, they sent me the scanned meshes (with vertex color) and I tossed them into Unity so we could explore them in the Cube with our tetherless VR setup. Here are two quick videos showing what they've been looking at:
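To give a flavor of the deviation-to-color step, here's a minimal sketch of one way it could work, assuming per-vertex nearest-neighbor distance is used as the deviation metric and a simple blue-to-red ramp as the colormap (the actual analysis pipeline the team used may well differ):

```python
import numpy as np

def deviation_vertex_colors(scan_vertices, model_vertices, max_dev=1.0):
    """For each vertex of the scanned mesh, measure the distance to the
    nearest vertex of the original model, then map that deviation to an
    RGB vertex color: blue = no deviation, red = max_dev or more."""
    # pairwise distances, shape (n_scan, n_model), via broadcasting
    diff = scan_vertices[:, None, :] - model_vertices[None, :, :]
    dist = np.linalg.norm(diff, axis=2).min(axis=1)  # nearest-neighbor distance
    t = np.clip(dist / max_dev, 0.0, 1.0)            # normalize to [0, 1]
    # blue (low deviation) -> red (high deviation)
    return np.stack([t, np.zeros_like(t), 1.0 - t], axis=1)
```

The resulting per-vertex colors can then be baked into the mesh and rendered with a vertex-color shader in Unity, which is what makes the deviation visible at room scale in the Cube.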
Over in the VT library, Michael Stamper was recently hired as the Data Visualization Designer & Consultant for the Arts. It made sense for Michael and me to start a dialogue and figure out how we can help build the collaborative work between the library and ICAT. As a starting place, we decided to create a 3D VR visualization of the Map of Science (http://www.mapofscience.com/, http://sci.cns.iu.edu/ucsdmap/).
This is the first iteration – a direct translation of the ucsdmap data set into virtual space.
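As a rough illustration of what "translating the 2D map into virtual space" can mean, here is a hypothetical sketch (not the actual conversion we used): normalized map coordinates could be wrapped around a cylinder so the map layout surrounds the viewer in VR.

```python
import math

def map_node_to_3d(x, y, radius=10.0):
    """Lift a 2D map node (x, y normalized to [0, 1]) into 3D by wrapping
    the horizontal map axis around a cylinder of the given radius, so the
    whole map surrounds a viewer standing at the origin."""
    angle = x * 2.0 * math.pi          # horizontal map position -> angle
    return (radius * math.cos(angle),  # virtual-space x
            y * radius,                # map y becomes height
            radius * math.sin(angle))  # virtual-space z
```

Each discipline node in the data set would get a position from a function like this, with node size or edge weight left to drive scale and link geometry.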
We have to figure out where we want to go from here, but it was an excellent first step!
Although I haven't posted in a while, a lot of projects are continuing to develop. In fact, there are a number of posts I will be making in the coming week with some project updates and new projects.
For today, I just wanted to show some images of the sports field Lucas Freeman (Creative Technologies MFA student) modeled for us. Although this isn't an official project yet, I've been working with Robin Queen in BEAM and Todd Ogle in TLOS to start prototyping two sports simulations. The first is a soccer simulation for VR, which Robin can use in her biomechanics lab to make users feel like they're actually on the field, in a real soccer situation. She can then capture data on how they react to investigate various sports injuries. The second is a football simulator, designed for quarterback training. The quarterback might be in the Cube or in the Perform Studio wearing a headset and seeing various game scenarios. They then throw the ball (into a net!) in response to those stimuli. That's the very basic outline of the ideas, and you'll hopefully continue to see those take shape over the coming months.
We already have the functionality in place for these things, so the next step was to get some modeling done. See the images below. Next week, we will do some mocap to capture animation for the virtual players that will go into the simulation. Cool stuff, and great job, Lucas!