In the spring semester, I worked in collaboration with Erika Meitner (poetry), Charles Nichols (music), and Rachel Rugh (dance) to create What Bends, a multimedia performance in the Cube. I used the cyclorama for surround video throughout the piece, and during one section we also used the motion capture system so Rachel could draw video onto the screen with her hands. Charles performed live on electric violin and electronics, using the Cube's massively multichannel sound system and also processing and spatially distributing Erika's poetry readings. We performed What Bends five times in total, on March 11 and May 6, 2017. It was a fantastic experience working with them, and we're all pleased with what came out the other side.
The Economical and Sustainable Materials Strategic Growth Area may be getting a new building. Andres Salazar Del Pozo, a Master of Architecture candidate, was recruited to create an architectural visualization. He brought his model to me, and I put it into Unity so we can do VR walkthroughs in the Cube! This is still a work in progress as Andres continues to develop the model and we improve how it looks in Unity, but check out how it's going so far!
Professors Chris Williams (Mechanical Engineering, Engineering Education) and Tim Long (Chemistry), along with grad student Joseph Kubalak, are interested in using the Cube to explore the deviation between 3D models and the 3D printed physical manifestations of those models. So far, we have explored two models – a tissue scaffold (normally about thumbnail size) and a dental model. These models were printed and then CT scanned. The scans were converted into meshes and analyzed for the difference between the scanned version of the physical object and the original 3D model. That analysis was applied to the models as color. At that point, they sent me the scanned meshes (with vertex color) and I tossed them into Unity so we could explore them in the Cube with our tetherless VR setup. Here are two quick videos showing what they've been looking at:
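To give a sense of what "applied to the models as color" means in practice, here's a minimal sketch of that kind of per-vertex deviation coloring. This is my own illustration, not the actual analysis pipeline they used: the function name, the blue-to-red color ramp, and the brute-force nearest-neighbor search are all assumptions.

```python
import numpy as np

def deviation_colors(scanned, reference, max_dev):
    """Color each scanned vertex by its distance to the nearest
    reference-mesh vertex: blue = no deviation, red = max_dev or more.
    (Hypothetical sketch; a real pipeline would measure point-to-surface
    distance and use a spatial index for large meshes.)"""
    # Pairwise distances between scanned and reference vertices.
    dists = np.linalg.norm(scanned[:, None, :] - reference[None, :, :], axis=2)
    nearest = dists.min(axis=1)
    # Normalize deviation into [0, 1].
    t = np.clip(nearest / max_dev, 0.0, 1.0)
    colors = np.zeros((len(scanned), 3))
    colors[:, 0] = t          # red channel grows with deviation
    colors[:, 2] = 1.0 - t    # blue channel fades out
    return colors
```

The resulting per-vertex colors can be stored in the mesh and rendered directly in Unity with a vertex-color shader, which is what makes the deviation readable at a glance in VR.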
Over in the VT library, Michael Stamper was recently hired as the Data Visualization Designer & Consultant for the Arts. It made sense for Michael and me to start a dialogue and figure out how we can help build the collaborative work between the library and ICAT. As a starting place, we decided to create a 3D VR visualization of the Map of Science (http://www.mapofscience.com/, http://sci.cns.iu.edu/ucsdmap/).
This is the first iteration – a direct translation of the ucsdmap data set into virtual space.
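A "direct translation" of a 2D map data set into virtual space can be as simple as a coordinate mapping. The sketch below is purely illustrative of that idea, not our actual code: the field layout, scale factor, and the deterministic depth jitter (used so overlapping disciplines separate in 3D) are all assumptions.

```python
def to_virtual_space(nodes, scale=0.01, z_spread=2.0):
    """Map 2D map-of-science nodes (x, y, weight) to 3D points.
    Hypothetical sketch: z is derived deterministically from the node
    index so nodes that overlap in 2D separate in depth, and the node's
    display size is the square root of its weight."""
    placed = []
    for i, (x, y, weight) in enumerate(nodes):
        # Pseudo-random but repeatable depth in [-z_spread/2, z_spread/2].
        z = ((i * 37) % 100) / 100.0 * z_spread - z_spread / 2
        placed.append((x * scale, y * scale, z, weight ** 0.5))
    return placed
```

Each resulting tuple (x, y, z, size) can then be instantiated as an object in the virtual scene.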
We have to figure out where we want to go from here, but it was an excellent first step!
Although I haven't posted in a while, a lot of projects are continuing to develop. In fact, there are a number of posts I will be making in the coming week with some project updates and new projects.
For today, I just wanted to show some images of the sports field Lucas Freeman (Creative Technologies MFA student) modeled for us. Although this isn't an official project yet, I've been working with Robin Queen in BEAM and Todd Ogle in TLOS to start prototyping two sports simulations. The first is a soccer simulation for VR, which Robin can use in her biomechanics lab to make users feel like they're actually on the field, in a real soccer situation. She can then capture data on how they react to investigate various sports injuries. The second is a football simulator, designed for quarterback training. The quarterback might be in the Cube or in the Perform studio wearing a headset and seeing various game scenarios. They then throw the ball (into a net!) based on those stimuli. That's the very basic outline of the ideas, and you'll hopefully continue to see those take shape over the coming months.
We already have the functionality in place for these things, so the next step was to get some modeling done. See the images below. Next week, we will do some mocap to capture animation for the virtual players that will go into the simulation. Cool stuff, great job Lucas!
Yesterday, we paid our second visit to the Science Museum of Western Virginia's planetarium facility in Roanoke. We (ICAT, Advanced Research Computing, Architecture, Music) are working with museum staff and community volunteers to build a vision for the future use of this amazing space, which is currently underused because of outdated technology.
On this visit, in addition to our minds and some measuring tapes, we brought four projectors (from ARC's old VisCube setup), a computer, and a Max patch which I rigged up the day before to test stitching and blending a single video across all four projectors. It worked great! There are still some issues to be worked out in terms of perfecting the blending and getting 4K video going smoothly, but the four projectors were able to cover almost the entire screen! Here's an idea of what it looked like (an image doesn't really do it justice; the projection is so big that the camera lens can't capture it all). With chickens. Don't ask.
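The blending part of that Max patch comes down to per-pixel weight ramps in the regions where neighboring projectors overlap, so that two overlapping images sum to uniform brightness. Here's a minimal sketch of one such ramp; the function name, the overlap handling, and the gamma value are assumptions, and the actual patch works differently in Max.

```python
def blend_ramp(width, overlap, gamma=2.2):
    """Per-pixel blend weights for one edge of a projector image.
    Pixels inside the overlap band fade from 0 to 1; the ramp is
    raised to 1/gamma so the *displayed light* (which follows the
    display's gamma curve) adds up linearly across the seam.
    (Hypothetical sketch, not the actual Max patch.)"""
    weights = []
    for x in range(width):
        if x < overlap:
            t = (x + 0.5) / overlap            # linear ramp in [0, 1]
            weights.append(t ** (1.0 / gamma)) # pre-compensate for gamma
        else:
            weights.append(1.0)                # full brightness elsewhere
    return weights
```

In practice each projector's image gets a ramp like this on every edge it shares with a neighbor, multiplied into the video before output.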
To install, you will need to go to your phone's Settings menu, then go to Security, then enable Unknown Sources. This allows your phone to install an app from a source other than the Play Store. You can turn this back off as soon as you're done installing this app.
In the near future, I will also build this demo for iOS, and then publish it to both the Play Store and the Apple App Store.
Disclaimer: This material is based upon work supported by the National Science Foundation under Grant No. 1305231. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.