Particle Physics Project Update

We’ve been working extremely hard on the particle physics project.  A work-in-progress version of the simulation will be on display at the Science Festival, Saturday, October 8th, 10am-3pm, in the Cube.  Come check it out!

Before I dig into what we’ve been working on, insert obligatory pretty video:

Here’s the main to-do list we’d like to get through before the Science Festival:
1. Me – Get all the particle sprites for the different kinds of particles
2. Me – Color code the path lines in coordination with the particles
3. Me – Get the full Belle II detector model in, albeit combined and completely opaque.
4. Tanner – implement draft of sound (OSC communication of particle data is complete, so he can pick it up and make it sound awesome)
5. Topher, Jesse, Sam – Make a flyer for the audience to pick up on their way in
6. Topher, Jesse, Sam – Make all the images for the particle sprites, so I have them to import (there’s around 120 different kinds of particles!)
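Since the OSC hand-off in item 4 is what lets Tanner pick up the particle data, here’s a rough idea of what that communication looks like under the hood.  This is a minimal sketch in Python, not our actual Unity-side code; the `/particle` address and the energy/position fields are made up for illustration, but the byte layout follows the OSC 1.0 spec (null-terminated strings padded to 4 bytes, big-endian float32 arguments):

```python
import struct

def osc_string(s: str) -> bytes:
    """OSC strings are ASCII, null-terminated, and padded to a 4-byte boundary."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Build one OSC message whose arguments are all float32 (type tag 'f')."""
    type_tags = "," + "f" * len(args)
    payload = b"".join(struct.pack(">f", a) for a in args)
    return osc_string(address) + osc_string(type_tags) + payload

# Hypothetical message for one particle update: energy plus x, y, z position.
msg = osc_message("/particle", 0.511, 1.0, 2.0, 3.0)
```

In practice the resulting bytes just go out over UDP (e.g. `socket.sendto`) to whatever port the sound machine is listening on.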

We’ve had three team meetings now, which are extremely helpful now that we have a full team on board.  The team includes:

Christopher (Topher) Dobson – senior, physics education
Jesse Barber – junior, physics
Samantha (Sam) Spytek – senior, physics education
Tanner Upthegrove – ICAT staff, sound
Zach Duer (me) – ICAT staff, visualization / organization
Dane Webster – faculty, school of visual arts
George Glasson – faculty, school of education
Leo Piilonen – faculty, physics
Nicholas Polys – faculty, computer science
Todd Ogle – master of all things, TLOS

Just in the past few weeks, Sam and Topher have picked up on the game very quickly and set out with Jesse to do a lot of work developing the educational aspect of the simulation.  In the next update I will go into more detail on that.

In the meantime, I’ve been working my butt off optimizing the simulation to keep a decent frame rate.  There were three major bottlenecks.  I’ll detail them here, along with the solutions:

  1. The model, with all its 400,000+ individual objects, slowed the frame rate down to about 10FPS.  It’s a CPU problem – the GPU breezes through it, since it’s only about 2 million polys.  The CPU struggles because of the number of draw calls.  A lot of the objects are duplicates (the same mesh with the same material, but a different transform).  That lets Unity reduce the number of draw calls by batch rendering, but just the process of doing that batching slows it way down.  To maintain 60FPS, the full draw cycle has to happen in 16 milliseconds or less, so if there’s one thing that takes 0.01 milliseconds but happens a few thousand times, it’s a huge problem.  I’m still working on this, but in the meantime, I’m combining each major layer of the Belle II detector into a single object and setting it to an opaque material.  Although this process takes some time, once I’m done the frame rate should be back up to a steady 60FPS.
  2.  The lines rendered to show the paths that the particles will take over the course of the simulation – the “pathlines,” as I’m calling them – take too long to render individually.  For the complex events, we’re talking about something like 4 million lines.  Even batched into one draw call, that’s just too slow to do every frame: rendering these lines alone dropped the framerate to about 3FPS.  So instead, I purchased a package from the Unity Asset Store called FastLineRenderer.  Since all the lines are known when the simulation starts, I can send them all to FastLineRenderer, and it converts them into a series of meshes (something like 64k lines per mesh).  This shoots the framerate back up to almost 60FPS.
  3. I needed to optimize the script controlling particle movement.  Yesterday and today, I completely rewrote the thing to eliminate the need for certain loops that were bottlenecking.  I won’t go into any more detail now, but the resulting code is, in my opinion, much cleaner and easier to read, AND it’s a lot faster.
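To make the 16-millisecond figure in item 1 concrete, here’s the back-of-the-envelope arithmetic (a generic sketch, not profiler output – the 5,000-call count is a made-up stand-in for “a few thousand”):

```python
def max_fps(per_op_ms: float, ops_per_frame: int) -> float:
    """Upper bound on frame rate when every frame pays ops_per_frame * per_op_ms of CPU time."""
    return 1000.0 / (per_op_ms * ops_per_frame)

frame_budget_ms = 1000.0 / 60      # ~16.67 ms per frame to hold 60FPS
stalled_fps = max_fps(0.01, 5000)  # 0.01 ms x 5,000 calls = 50 ms/frame -> ~20FPS
```

So something “cheap” at 0.01 ms per call eats three full frame budgets once it happens a few thousand times per frame – which is why cutting the draw call count matters more than shrinking the polygon count here.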
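The FastLineRenderer trick in item 2 boils down to a simple idea: since every pathline is known at simulation start, pack the millions of segments into a small number of meshes up front, and pay a handful of draw calls per frame instead of millions.  A rough sketch of the batching arithmetic (Python for illustration only; the 64k-per-mesh figure is approximate):

```python
MAX_LINES_PER_MESH = 64_000  # rough per-mesh capacity, as mentioned above

def chunk_lines(lines, max_per_mesh=MAX_LINES_PER_MESH):
    """Split precomputed line segments into mesh-sized batches.

    Each batch becomes a single mesh, so the per-frame cost is a
    handful of draw calls instead of one (or worse) per line.
    """
    return [lines[i:i + max_per_mesh] for i in range(0, len(lines), max_per_mesh)]

# Roughly the scale of a complex event: ~4 million segments.
batches = chunk_lines(range(4_000_000))
```

Four million segments collapse into 63 meshes, which is why the framerate jumps from ~3FPS back to nearly 60FPS.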

Put all that together, and you can see how smooth the above video is.  Not all of the Belle II detector meshes have finished combining, so I just threw in the Electromagnetic Calorimeter for that video.  The framerate never dropped below 50FPS.  I’m pretty proud of that.

Oh, and I put Lucas Freeman’s (mostly complete) Cube model in the project too, so that vastly improves the aesthetic quality.


Cyclorama Update

We’ve made significant progress recently figuring out how to render video for the Cyclorama, getting Unity working with it, and streaming video in.

Rather than expound on what we’ve learned here, I will refer you to the two new documents I posted to the ICAT Drive.  One is on the pipeline for rendering from Maya to the Cyclorama, and the other describes the details of the various types of input for the Cyclorama (e.g. video, image sequences, and streamed video).  If you’re interested in using the Cyclorama, please refer to these documents for how to get your content ready to play!

We’ve also been testing out a lot of content and a LOT of people have been through to look at it.  Yesterday we brought through two classes from Graphic Design (students from the classes of Katie Meaney and Patrick Finley) with the intention that they will spend some class time making projects for the Cyclorama.  They were enthusiastic about the possibilities, and we’re excited to see what they might come up with!

Graphic Design Students Visit Cyclorama 8/31/2016

Particle Physics Simulation Update

The particle physics simulation project is coming along quite nicely!  We have the vast majority of the Belle II model imported into Unity and looking good now.  There’s still plenty to work on, but check out how it looks as of right now:

Dr. Leo Piilonen has been extremely helpful in working to produce FBX files for the Belle II model rather than VRML files, and Jesse Barber, a physics undergraduate, has been working closely with me to get the model looking great.

Some of the things on my immediate to-do list:

  • Finish importing Belle II (passive EKLM)
  • Make another scale slider for scaling the Belle II without also repositioning it relative to the viewer
  • Fix the opacity slider bug that I’m getting sometimes
  • Fix the particle trails so that moving the Belle II doesn’t itself generate trails, and so the trails stay consistent relative to the Belle II model
  • Try to improve the framerate when the entire Belle II is displayed

Stepping into the past through visualization: Exploring America’s Forgotten War

Our history visualization team is on the ground in France.

This project has been made possible with generous support from an ICAT SEAD grant.

Our team from TLOS, the School of Education, the Department of History, the School of Visual Arts, and Mining and Minerals Engineering, together with our partners at Arkemine and the Friends of Vauquois Association, has begun our comprehensive survey of the Butte de Vauquois, near Verdun, France.


Follow us on Twitter at:


The Cube now has another amazing new feature: the Cyclorama!  We finished installing it last week.  The Cyclorama is a massive cylindrical projection screen for immersive experiences.  At roughly 40′ in diameter and 16′ tall, it can accommodate up to 60 people and allows the projection to fill your entire field of vision from anywhere you stand.  Check out this timelapse video of the construction, which took almost 4 days:

A huge thanks goes to the Moss Arts Center production crew, who did all the hard work of actually building the thing!

It is operational and we’re starting to get the hang of how to use it.  We’ve had to readjust the motion capture camera arrangement so that we can capture both inside and outside.  I’ll put up some more posts in the coming days with details about the new motion capture setup, how to use it, etc.

Roanoke Science Museum Planetarium

Yesterday, we paid our second visit to the Science Museum of Western Virginia’s planetarium facility in Roanoke.  We (ICAT, Advanced Research Computing, Architecture, Music) are working with museum staff and community volunteers to build a vision for the future use of this amazing space, which is currently underused because of outdated technology.

On this visit, in addition to our minds and some measuring tapes, we brought four projectors (from ARC’s old VisCube setup), a computer, and a Max patch I rigged up the day before to test stitching and blending a single video across all four projectors.  It worked great!  There are still some issues to work out in perfecting the blending and getting 4K video playing smoothly, but the four projectors were able to cover almost the entire screen!  Here’s an idea of what it looked like (an image doesn’t really do it justice – the projection is so big that the camera lens can’t capture it all).  With chickens.  Don’t ask.
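For the curious, the core of cross-projector edge blending is just an alpha ramp across the overlap region: each projector fades out exactly where its neighbor fades in.  This is a generic sketch of the idea, not the actual Max patch; the gamma term (to roughly compensate for projector response) and its value are assumptions:

```python
def blend_weight(px: int, overlap_px: int, gamma: float = 2.2) -> float:
    """Brightness weight px pixels into the overlap region.

    0.0 at the image edge, ramping up to 1.0 once past the overlap.
    The neighboring projector applies the mirrored ramp, so the
    combined brightness stays even across the seam.
    """
    if px >= overlap_px:
        return 1.0
    return (px / overlap_px) ** gamma
```

Getting the ramps tuned per projector pair is exactly the “perfecting the blending” work mentioned above.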

Better with chicken.

Belle II particle physics simulation update

We have some exciting updates from the particle physics simulation we’ve been working on for the past month!

First, we now have Unity scripts that read the particle data from a CSV file, sort it correctly, and use it to spawn and move particles.  It’s a very important first step – in fact, most of what comes after this point is frills.  (Although there are lots of frills.)  But this is the *meat* of the project – taking particle data and visualizing it.  There’s a (slightly confusing to watch, but proof-of-product) video below:
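The column names below are hypothetical (our actual CSV layout isn’t shown here), but the core of the script logic – read rows, group them by particle, and sort each particle’s samples by time so the spawn/move code can walk them in order – looks something like this, sketched in Python rather than our Unity C#:

```python
import csv
import io
from collections import defaultdict

# Hypothetical layout: one row per particle sample.
SAMPLE = """\
particle_id,t,x,y,z
1,0.0,0.0,0.0,0.0
2,0.0,0.1,0.0,0.0
1,0.5,1.0,0.2,0.0
2,0.3,0.4,0.1,0.0
"""

def load_tracks(f):
    """Group rows by particle and sort each particle's samples by time."""
    tracks = defaultdict(list)
    for row in csv.DictReader(f):
        tracks[int(row["particle_id"])].append(
            (float(row["t"]), float(row["x"]), float(row["y"]), float(row["z"])))
    for samples in tracks.values():
        samples.sort()  # by time, so playback can walk each track in order
    return dict(tracks)

tracks = load_tracks(io.StringIO(SAMPLE))
```

The sorting step matters because the rows in the source file aren’t guaranteed to arrive in time order, and the movement code assumes they do.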

ALSO, Jesse Barber (Physics sophomore) and Kaelum Hasler (high school student worker) have been working with me to get the VRML model of the Belle II organized.  We had to combine a lot of objects in order to get the object count down to something manageable (it was at something like 300k to begin with), but we also had to CUT a lot of the cylinders into four parts, in order to solve a transparency depth-sorting issue in Unity.  They spent almost all of last week working on that, and we now have a model in Unity!  This model is still incomplete because the VRML export that we’re working with doesn’t quite have the full model, but we’re almost there, and we can really see what it’s going to look like now when it’s all done!  See below for another video of that.
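A bit more on why the cylinders had to be cut: per-object transparency sorting draws objects back-to-front by (roughly) the distance from the camera to each object’s center, and the Belle II’s concentric cylinders all share the same center, so they tie and draw in an arbitrary, wrong order.  Quartering each cylinder gives every piece its own distinct center to sort on.  A minimal sketch of that kind of sort (Python for illustration; Unity’s actual sorting criteria differ in detail):

```python
def back_to_front(centers, camera):
    """Draw order for transparent objects: indices sorted farthest-center first."""
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(c, camera))
    return sorted(range(len(centers)), key=lambda i: dist2(centers[i]), reverse=True)

# Three pieces at different depths from a camera at the origin sort cleanly...
order = back_to_front([(1, 0, 0), (3, 0, 0), (2, 0, 0)], (0, 0, 0))
# ...but two concentric shells share a center, so their order is an arbitrary tie:
tie = back_to_front([(0, 0, 0), (0, 0, 0)], (5, 0, 0))
```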

High Performance Wireless VR

We now have a working pipeline for wireless VR that can simultaneously be high-performance (computationally and/or graphically).  Here is a flow chart:


High Performance VR in Cube flow chart

Despite the number of moving parts, it works reliably.

Here are the next steps to improving it:

  1. Replace Garfield with the render cluster in the Andrews Information Systems Building.
  2. Replace NVIDIA Gamestream and the Moonlight app with our in-house encoder / streamer / decoder solution that is currently in the works.
  3. Allow for simulation software besides Unity.
  4. Replace the current VR headset with something nicer (already purchased, not yet tried).

MAC 218 Android Build

One of our projects is to have a complete model of the Moss Arts Center, both for Mirror Worlds and as part of the assets we can share with ICAT affiliates.

A recent alum, Lucas Freeman, was assigned to work with me for an independent study in his last semester at VT.  I had him model room 218 in the MAC, load the model into Unity, and light it appropriately.

Moss Arts Center Room 218, modeled by Lucas Freeman

I then built a little demo of that for Android.  You can download it here:

(Android only)

To install, you will need to go to your phone’s Settings menu, then Security, and enable Unknown Sources.  This allows your phone to install an app from a source other than the Play Store.  You can turn this back off as soon as you’re done installing the app.

In the near future, I will also get this demo built for iOS, and then publish it to both the Play Store and the Apple App Store.

Disclaimer: This material is based upon work supported by the National Science Foundation under Grant No. 1305231.  Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

ICAT Drive

ICAT Drive is now live –

ICAT Drive is a central repository for all things ICAT.  Manuals, assets, useful code, etc.  The above link is open to anyone.  Most of the documents and assets in that folder are open to the public.  There are a few things, like the passwords document, that require special permissions which you can acquire by contacting ICAT.

We would really like the ICAT Drive to be an important resource for all projects that use the ICAT facilities.  The guides, APIs, and assets within should allow both faculty and students to jump start their projects without having to start from square one.

So far, I have published a series of manuals on how to use the motion capture systems in the Cube and the Perform Studio, and how to interface those systems with Unity.  Those come with the necessary assets I created to enable the interface.  There is also a guide on importing point clouds into Unity (along with another unitypackage with scripts to enable this utility).  And tying it all together is a guide, with an example project, that shows how to combine all the others.  The guides are organized modularly, making it easy to refer to and edit any given operation.

There are also assets.  Right now there’s a model of the Cube and some neutral character models for use in any project.  This collection will be greatly expanded over the next few months.