ICAT class tackles transdisciplinarity

(This semester, I’m teaching an Honors course all about ICAT. We spend each class period in a different ICAT studio or on a different ICAT project. This is the second in a series of guest blogs from my students.  Stay tuned for more.  Enjoy!  ~Phyllis)

From guest blogger Pamela Kryschtal:


It’s easy to say that collaboration is important. But why it’s important is a more difficult question. Can disciplines truly and seamlessly coexist? What will the future of this continued collaboration look like? These, among other questions of interdisciplinary work, are the kinds of questions that interest graduate student Kari Zacharias, the guest speaker this past Monday in our Introduction to the Institute for Creativity, Art, and Technology (ICAT) class.


As students in this introductory class, we are beginning to expose ourselves to this notion of collaboration among different disciplines. To prepare for this day, we identified vocabulary words that would come in handy, in particular the differences among multidisciplinary, crossdisciplinary, interdisciplinary, and transdisciplinary work. These concepts differ in how they are implemented and how they are utilized to solve complex problems.


Our day with Kari was informative and interactive. Kari began with a general “get to know you” round table, an important step to get the creative juices flowing, particularly when the topic is collaboration. She then continued by giving us a history of the arts and sciences blending together: everything from the catalyst of “9 Evenings” in the 1960s, to the 1980s, when these artist-engineer hybrids began to name themselves, and back to the 1860s, when the foundation for this world was laid with land-grant universities and the transition from solving university problems to solving real problems.


Kari followed this introduction with an exercise to gain insight into the possible future of ICAT. She began by asking us what we considered to be challenges with ICAT. We came up with a relatively comprehensive list that included challenges such as an improper balance among the disciplines, the inability to communicate, and the job concerns associated with spending too much time outside of your specialty. Kari asked us to pick two of these challenges that stood out to us. As a class, we decided on the disciplinary divide in these settings, i.e., how important decisions get made, and the review process, since art is subjective and technology is objective. We placed each of these challenges on a coordinate plane with the disciplinary divide as the y-axis and the review process as the x-axis. Each end of an axis represented the extremes of that scenario: a hierarchical vs. a radically democratic divide, and a qualitative vs. a quantitative review process. We split into groups, and each group tackled a quadrant of this graph and came up with a scenario for that imaginary setting. For example, what would the world look like if decisions were made in a hierarchy and all decisions were based on qualitative research? Aside from some discrepancies about what qualitative research is, each group came up with applicable and insightful scenarios that could inform how we approach this world of collaboration.


As an engineering student who does not consider herself very creative, I found this entire day particularly interesting. Kari highlighted that this initial collaboration was intended to “save the soul of the engineer.” In her paper, “Land-Grant Hybrids: From Art and Technology to SEAD,” Kari and co-author Matthew Wisnioski point out that these types of collaborations at MIT began “with the intention of ‘humanizing’ MIT’s local and national image through civic art.” I was beginning to think that I was studying to become a pencil-pushing numbers monger in need of saving until I read that “MIT provided the equipment and the experts to realize artists’ interpretations in technological media.” I was put at ease with the realization that while I might benefit from an artist’s perspective, the respect is mutual and we all need each other.


Pamela Kryschtal is a seasoned but enthusiastic engineering student at Virginia Tech. She is driven by a passion for experiences and communication.

Honors class visits the Cube

(This semester, I’m teaching an Honors course all about ICAT. We spend each class period in a different ICAT studio or on a different ICAT project. This is the first in a series of guest blogs from my students.  Stay tuned for more.  Enjoy!  ~Phyllis)

From guest blogger Adham Nabhan:

During our visit to The Cube, the class was able to experience firsthand the incredible technologies that the facility has to offer, both visually and acoustically. We began by delving into the splendor of Pink Floyd’s The Dark Side of the Moon using the 130+ speakers that The Cube has at its disposal. Because of the immersive nature of the space, one could listen to guitar riffs float from their right and clash overhead as they meet hard drum lines coming from their left. Goosebump-inducing.

We were also treated to an incredibly unique experience in which The Cube was able to recreate the sounds that would be heard if the class was on the field in Lane Stadium during a football game. The noise was loud, but not nearly as loud as the battle cry in my soul. A very intense experience to say the least.

After going into greater detail about the programs and platforms that are used to create the immersive acoustic environments that The Cube offers, including the software that allows the user to manipulate the sound in the room with ultimate control, the class was formally introduced to the Cyclorama. This massive ring-like “canvas” gives ICAT’s artists a platform to display their creations like nowhere else on campus. We were shown several examples of the Cyclorama’s capabilities, highlighted by an immersive 360-degree look at elephants performing typical elephant tasks, such as walking and breathing, in their native Ghana.

Finally, the thirst of both our eyes and ears was quenched when the class was shown a 360-degree courtside view of a basketball game at Cassell Coliseum, complete with 360-degree acoustics. Once again, my Hokie blood ran ferociously through my veins. This time, however, not because of the display itself, but because of the incredible work and effort the people at ICAT have put in to create such a unique, creative, and technologically advanced space for the great minds at Virginia Tech to have at their disposal.

Adham Nabhan breathes passion into everything he does. Whether he’s working towards his mechanical engineering degree, running up and down intramural fields, or writing in the third person, his energy and enthusiasm are apparent in all his work.

Wintermester ICAT / University Honors course

This Wintermester, Tanner and I co-taught an Honors College course hosted in the ICAT Cube and Perform studios.  The goal was for students to learn the ins and outs of these facilities and their systems, and to create projects that combine at least two of the following: motion capture, virtual reality, immersive projection, and spatial audio.  They got to be the first folks to make use of our VR backpacks, which was really fun to see.  It was definitely a trial by fire for all our systems, putting everything to the test, but I’m glad to say that it seems like we built things robustly enough that everything held together.

The class was intense, with 4 hours of class time per day, 5 days per week, for two weeks.  The students were fantastic to work with.  They were very focused and eager, and I think they came away with a strong and novel experience.  Their final projects went way beyond our expectations, and we’re very proud of what they accomplished.  There were 6 groups and 7 projects.

Haunted Halls

A virtual reality horror experience in the Oculus HMD with the VR backpack.  Walk through a haunted maze and be terrified by the spooky spatialized sounds and unsettling visualizations.

Wintermester 2017 – Haunted Halls Final Project

Crowning Achievement (a Pun)

Checkers in outer space.  Using VR with the Oculus and backpack.  Multiplayer included.  Pick up checker pieces with your hand and move them around the board.  Spatialized background music is an added bonus.

Wintermester 2017 – Crowning Achievement Final Project


PhoAsteroids

A game of asteroids, but with 360-degree immersive projection on the Cyclorama.  Hold a wand and point it at the screen to try to destroy the asteroids flying around.  The more you destroy, the faster they go.  Spatial audio tracks the asteroids as they fly around you, making a satisfying POP when you destroy one.
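The difficulty ramp here ("the more you destroy, the faster they go") could be as simple as a linear speed-up rule. A minimal sketch, assuming a linear ramp; the function name and the `ramp` constant are mine, not the students' actual code:

```python
def asteroid_speed(base_speed, destroyed_count, ramp=0.1):
    """Return the current asteroid speed, scaled up by how many
    asteroids the player has already destroyed (hypothetical rule)."""
    return base_speed * (1 + ramp * destroyed_count)
```

Whether the real game used a linear, stepped, or exponential curve isn't documented in the post; the point is just that speed is a function of the destruction count.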

Wintermester 2017 – PhoAsteroids Final Project

Michael Rhoades

An abstract fixed-media art composition, depicting a space of cylinders and mirrors.  Projected stereoscopically on the Cyclorama, with spatial audio.  It’s quite an experience to see this metallic, glass world seem to come out of the screen in 3D.

Wintermester 2017 – Michael Rhoades Final Project

Last Words

Where art, particles, typography, and 360 3D projection meet.  The depth of the stereoscopic projection they accomplished here was incredible, and it was really fun to see the first experiments in typography on the Cyclorama.  The spatial audio of music and sound clips to accompany the video was also very effective.

Wintermester 2017 – Last Words Final Project

Peek A Boo

A short animation of a girl hiding in the Cube, peeking out from behind a curtain.  Sounds simple, maybe, until you find out that they accomplished this by using motion capture to record the animation, applying it to a prebuilt model, and then rendering it in 360 3D for the Cyclorama.  The spatial audio call and response with the girl puts you right in the virtual space.

Wintermester 2017 – Peek A Boo Final Project

Lane Stadium

A recreation of what it’s like to enter Lane Stadium on gameday as a Hokie football player.  The spatial audio is the key component here; the students scraped together a series of audio clips they could find to reassemble the aural experience of coming down the tunnel and out onto the field, making great use of the sound system in the Cube.  Unfortunately, it doesn’t seem like VT Sports has gotten into the 360 video area yet, so the students used a video from another university instead, which looked amazing on the Cyclorama.  This would be great material for them to take to VT Sports to try to pitch for the opportunity to take a real recording.

Wintermester 2017 – Lane Stadium Final Project

Phat Nguyen: Life as a Graduate Student by Mariah Flick

Congratulations to Phat for graduating with his MFA in Creative Technologies from the Virginia Tech School of Visual Art!

Mariah Flick created this documentary about Phat and his thesis work. Phat discusses graduate school and developing his final thesis showcase, which was shown on the Cyclorama in the Cube.


Robot Fan Mail

A little while back, Phyllis was forwarded this letter from a pre-school class, originally sent to CHCI Interim Director Andrea Kavanaugh.  It’s quite sweet.  See the original letter and Phyllis’ response below:

Phyllis’ response:


March 18, 2016

The Rosa Sharks Class
Blue Mountain School
470 Christiansburg Pike
Floyd, VA 24091

Dear Ms. Stefi and the Rosa Sharks Class,

My friend Dr. Andrea Kavanaugh gave me your wonderful letter and drawings of robots and asked if I could help answer your questions.  I really like the way you explored what you already knew about robots and then asked questions to find out more about them.  I think you must be very good thinkers.

  1. What do robots eat?

This is a great question.  Robots need energy to work just like you do.  They use energy from batteries, solar panels, or an electric cord plugged into the wall.  They also need instructions so that they know what to do and when to do it.  We call those instructions “code,” and code comes from humans who tell the robot what it can do.

  2. Where do most robots live?

Robots live in lots of places.  Some of them live in houses.  Some live at businesses.  Some even live in outer space.  There are some robots on Mars right now, but I don’t know of any on Jupiter.  Robots live wherever humans need them to do a job.  The robot on Mars took pictures and sent them back to Earth.  People can’t live on Mars, so it was a great way to learn about the planet.  Some people keep a robot in their house to do the vacuuming!   (That’s a job they just didn’t want to do.)  You might even have a robot in your toys.  If you have a toy that does something like move or make a sound when you push a button or squeeze it, you have a robot.  My kids have Zhu-Zhu pets and a toy backhoe that are robots.

  3. Do robots go on vacation? Do they like to play?

Robots can’t do anything that humans don’t teach them to do.  A robot could go on vacation or play, but it would only do that if a human told it to.  Most of the time, humans just want robots to do work.  When the robots are not working, they might rest in a closet or on a shelf.  If you know someone with a smartphone, you can ask the smartphone about robots and vacation.  Siri is kind of like a robot, and she might know a better answer than I do.  Everything that Siri says, though, is something that a human told her to say.

  4. Where do robots come from?

Robots are built by people.  Sometimes robots are built by other robots, but people have to tell the building robots what to do.

  5. Do you need tools like a hammer and nails or a wrench to build one?

Robots are things that move and follow instructions.  You can use hammers and wrenches to make the thing and its moving parts.  You can also use scissors and tape and cardboard.  The shape of a robot can be made out of anything.  A lot of robots have motors that help their parts move.  The instructions part of a robot is usually a small computer.  People use coding tools – usually a bigger computer – to put the instructions into the robot’s computer.

  6. Can you build a robot for us to play with?

I suppose I could, but I think it would be more fun if you built one yourself.  You’d have to decide what kind of job you wanted it to do and how it should look.  I like to use a Makey Makey to make a basic robot that makes sounds.  You can also use a program called Scratch For Kids to learn to code.  Another way to learn about robots is to take apart some old toys that work like robots.

I hope I have been able to answer your questions.  Thank you again for asking them.  I hope you will keep asking questions and learning more and more.  I am sending your class a viewfinder that shows some fun things we do at my work.

All the best,


VR particle physics update

We’ve made a number of tweaks and user interface adjustments to this project, and it’s time to share them.

  • We now have sprites and colors for EVERY type of particle.  It makes it look so much more amazing.  Jesse toiled over this during Thanksgiving break.  Thanks Jesse!
  • The user interface is now diegetic, existing in the world of the virtual environment rather than as a heads-up display or thru a physical tablet.  Right now it just hangs out at a particular spot.  But when the user is in VR, being motion-captured in the Cube, they will be holding an Xbox controller which will also be tracked.  The virtual menu will take the position of the Xbox controller – so essentially users just need to look at their hands to look at the menu.  They then select whatever part of the menu they are looking at with a simple look-and-click interface.
  • We’ve also added the ability to select particles by looking at them and pressing a button.  When you select them, a display pops up over the particle, showing key info.  On that display, if you click the Save button, the info for that particle gets saved to a display up on the wall (which is still in the works).
  • The Model section now has a preset subsection, so you can save and load different configurations of the BelleII model being on and off, along with the relative transparency values.
  • There are also two “scoreboards” on the Cube walls now.  One shows the current simulation time, so you can glance at it from anywhere in the room.  The other allows the user to start and stop a timer with a press of a button on the Xbox controller.
  • At Leo’s request, we’re also now showing “dead” particle sprites.  Instead of a particle sprite disappearing after the data for the sprite has ended, a “shell” takes its place, showing where the particle was when it … passed away?  This has caused some new frame rate issues that I’m struggling with.  30k is a lot of individual sprites!  They’re not correctly batching right now, which is what I spent most of my day trying to figure out.  I learned a few things that increased the frame rate here and there, but it’s still not great yet.  It gets worse as the simulation goes on, because more and more sprites are added.
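The look-and-click selection described above boils down to a cone test against the user's gaze ray: pick the particle whose direction from the headset is closest to where the user is looking, within some angular threshold. A minimal sketch of that math in plain Python; the function name, data layout, and the 5-degree threshold are all assumptions, not the project's actual code:

```python
import math

def gaze_select(head_pos, gaze_dir, particles, max_angle_deg=5.0):
    """Return the id of the particle nearest the gaze ray, or None.

    head_pos: (x, y, z) of the headset; gaze_dir: unit forward vector;
    particles: list of (id, (x, y, z)) pairs. A particle is selectable
    only if it falls within a cone of max_angle_deg around the gaze.
    """
    best_id, best_angle = None, math.radians(max_angle_deg)
    for pid, pos in particles:
        to_p = tuple(p - h for p, h in zip(pos, head_pos))
        dist = math.sqrt(sum(c * c for c in to_p))
        if dist == 0:
            continue
        # Angle between the gaze direction and the direction to the particle.
        cos_a = sum(g * t for g, t in zip(gaze_dir, to_p)) / dist
        angle = math.acos(max(-1.0, min(1.0, cos_a)))
        if angle < best_angle:
            best_id, best_angle = pid, angle
    return best_id
```

The diegetic menu works the same way, just with menu items in place of particles, and the controller's button press confirming whatever the gaze test currently picks.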

Additionally, the educational side of this project is really starting to kick into gear.  Instead of spending most of the meeting time talking about features for the simulation, we’re talking about lesson plan development, which is awesome!  That’s exactly where we want to be moving into next semester.  We plan on piloting this thing with Society of Physics Students folks immediately following spring break!

K-12 Visiting the Cube

Editor’s Note – This post is from Phyllis Newbill, the Outreach and Engagement Coordinator here at ICAT.  Enjoy!


I had the pleasure of hosting school groups today from Mountain Vista Governor’s School and Hardin Reynolds Memorial School in Patrick County.  Thanks to Tanner and Zach, the students saw and heard demos of basketball games, elephants, anechoic chambers, cathedrals, dog skeletons, and fantasy landscapes.  The groups also visited the Experience and Create Studios to see 3D printers, the laser cutter, and a project involving lots of pencils that I haven’t quite figured out yet.  I know that graphite is conductive, though, so I think that’s a clue.

I was glad to see Mountain Vista again.  Their school has done a great job of sending projects and students to the Maker Conference each year in the spring.  Today’s group was of tenth graders, so I’m looking forward to seeing them again later this school year or in future years.

The Hardin Reynolds students happened to tour the Create Studio at the moment that Panagiotis, a CHCI grad student, was finishing a 3D print.  He showed them how the 3D printer works, and also gave them a full demo of his 3D printed olive oil factory parts.

My favorite part of the day was when the students got to see a 3D fantasy landscape in the Cube, and they reached out to see if they could touch the objects they were seeing.

VR upgrades in the Cube

While the blog has been quiet recently, ICAT is far from it.  I haven’t posted recently because I don’t have a new flashy video to show off yet, but we’ve made some excellent internal developments.

Foremost in my mind at the moment is our integration of the Oculus CV1 into the Cube motion capture system.  In the past, we’d used the DK2 as our go-to headset.  It was connected by HDMI and USB to a laptop that someone would have to carry around behind whoever had the headset on.  The DK2 itself was OK, but the resolution wasn’t fantastic.  There was also a decent amount of shakiness to the perspective coming from the mocap system.

Oh, how times have changed.  With the CV1, we can use the IMU’s rotational data for the VR perspective.  Not only does this eliminate the shakiness, but it’s super low-latency.  But in order to enable users to walk freely thru the Cube, we’re still using the position data from the motion capture system, and disregarding position data from the Oculus infrared sensor (it still has to be plugged in or the Oculus app throws a fit, but it doesn’t actually serve a purpose).  The combination of these two works *brilliantly*, resulting in by far the best VR experience we’ve ever had in the Cube.
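In code terms, this hybrid tracking amounts to composing a single camera pose from two sources every frame, while throwing away the headset's own position estimate. A hypothetical sketch; the class and field names are mine, not the actual Cube code:

```python
class HybridTracker:
    """Compose a VR camera pose from two tracking sources each frame:
    position from the mocap system, rotation from the headset IMU."""

    def __init__(self):
        self.pose = None

    def update(self, mocap_pos, imu_rot, oculus_pos):
        """mocap_pos: (x, y, z) from motion capture; imu_rot: quaternion
        (x, y, z, w) from the headset IMU; oculus_pos: the headset's own
        position estimate, deliberately discarded as described above."""
        self.pose = {"position": mocap_pos, "rotation": imu_rot}
        return self.pose
```

The design point is that each source contributes only what it does best: the IMU gives smooth, low-latency rotation, and the mocap system gives drift-free room-scale position.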

But wait, there’s more.  Right now, we have this running off a laptop with a dedicated graphics card, which is a problem.  The laptop’s battery can’t supply enough juice to the GPU, so when the laptop is unplugged it automatically lowers the GPU’s memory clock which dramatically hurts VR performance.  Enter the MSI VR One laptop/backpack/jetpack(!)(??)/new-kind-of-computer-form-factor-that-doesn’t-really-have-a-name-yet.  This thing is designed for tetherless VR.  You put it on like a backpack, you hook all the VR stuff up to it, tie up all the cords, and you’re free to walk around the room with nary a concern of tripping on cords or having someone else hold the laptop (new meme idea: Hold My Laptop… while I do some VR).  Caveat: we haven’t actually tried this thing yet.  It JUST released, and we have one on the way.  So, it’s possible that it won’t live up to expectations, but I’m hopeful.  If this works, we will get several more, and have true, tetherless, social VR in the Cube for the first time ever.

P.S. I’m still doing a lot of work on the physics VR simulation, and it’s looking great.  I’ve built a diegetic interface.  Once we get this backpack, I’ll take some video of a user exploring the simulation, using the interface, and post that here.