VT Virtual Weather

The meteorological visualization project now has an app to show off the work we did!  You can use your phone to see the tornado visualizations, and if you have a Google Cardboard or similar VR headset for a phone, you can see them in VR!

It’s currently Android only, and it requires your phone to have a gyroscope to work.

Check it out on the Play Store; it’s free!


Wind Field

(Video above: screen capture of the Goshen wind field simulation, from the severe weather visualization project, May 25, 2016)

First post on our new blogs!

Recently, I’ve been spending a lot of time working on visualizing severe weather, working closely with Trevor White, a grad student in Virginia Tech’s meteorology department, as well as the project’s PI, Bill Carstensen, from the Geography department.  The Weather Channel was just here last week, taping some spots on the El Reno tornado of 2013 and Hurricane Charley from 2004.  I will post media of those once they reach broadcast.

The video at the top of this post shows a wind field – a simulation of wind inside a tornado!  The colors are coded as follows:

Bright red is going up fast
Bright blue is going down fast
Bright white is going horizontally fast
Dim red is going up slowly
Dim blue is going down slowly
Dim gray is going horizontally slowly
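To make that color scheme concrete, here’s a rough sketch of how a velocity-to-color mapping like this could work.  This is my own illustration, not the project’s actual code, and the `v_max` normalization constant is made up:

```python
import math

def wind_color(vx, vy, vz, v_max=50.0):
    """Map a wind velocity (m/s) to an RGB tuple in [0, 1].

    Vertical motion picks the hue (red = up, blue = down), mostly
    horizontal motion reads as white/gray, and overall speed sets
    the brightness.
    """
    horizontal_speed = math.hypot(vx, vy)
    speed = math.sqrt(vx * vx + vy * vy + vz * vz)
    brightness = min(speed / v_max, 1.0)   # dim = slow, bright = fast
    if abs(vz) > horizontal_speed:         # motion is mostly vertical
        if vz > 0:
            return (brightness, 0.0, 0.0)  # going up: red
        return (0.0, 0.0, brightness)      # going down: blue
    return (brightness, brightness, brightness)  # horizontal: white/gray
```

So a strong updraft comes out bright red, a weak downdraft comes out dim blue, and so on.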

Most of my time in the severe weather visualization has been devoted to the wind field particle simulation.  I’m not a meteorologist so I can’t explain this 100% effectively, but here’s the basic idea:

We have some data sets that show wind velocities (that is, how fast the wind is going and in what direction) inside a tornado.  This data is derived from a dual-doppler setup, where two doppler radars are positioned at 90 degrees to a storm, allowing them to use triangulation to get the aforementioned wind velocities.  After Trevor has analyzed and cleaned the data, he sends it to me as a 3D grid of voxels (that is, a 3D grid of evenly spaced points), each of which has a wind velocity.
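For a sense of what that data looks like in code, here’s a hypothetical sketch of the voxel grid: an array with one 3-component velocity per grid point, plus a helper that maps a world position to its voxel.  The origin, spacing, and extents here are invented for illustration; the real values come from Trevor’s dual-doppler analysis:

```python
import numpy as np

# Hypothetical grid parameters (placeholders, not the real data set).
GRID_ORIGIN = np.zeros(3)   # world-space corner of the volume
VOXEL_SIZE = 100.0          # meters between evenly spaced grid points

# field[i, j, k] is the (vx, vy, vz) wind velocity at that voxel.
field = np.zeros((50, 50, 20, 3))

def voxel_index(position, origin=GRID_ORIGIN, size=VOXEL_SIZE):
    """World-space position -> integer (i, j, k) voxel indices."""
    return tuple(np.floor((np.asarray(position) - origin) / size).astype(int))
```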

I bring that data into Unity, where I use it to create a particle simulation.  I generate particles throughout the volume (the area where we have wind information), and then I figure out which voxel they are in.  I use that voxel’s wind velocity data to move the particle.  Then the next frame, I find the new position of the particle, figure out what voxel it’s now in, and use that wind velocity data to move it again.
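That per-frame update is essentially Euler integration against the gridded field.  A minimal vectorized sketch of the idea (my own simplification with made-up names; the real simulation runs in Unity and CUDA):

```python
import numpy as np

def step_particles(positions, field, origin, voxel_size, dt):
    """Advance N particles one frame through a gridded wind field.

    positions : (N, 3) array of particle world positions
    field     : (nx, ny, nz, 3) array of wind velocities (m/s) per voxel
    """
    # 1. Figure out which voxel each particle is in.
    idx = np.floor((positions - origin) / voxel_size).astype(int)
    # 2. Clamp indices so edge particles stay inside the volume.
    for axis in range(3):
        idx[:, axis] = np.clip(idx[:, axis], 0, field.shape[axis] - 1)
    # 3. Look up each particle's voxel velocity and move it.
    velocities = field[idx[:, 0], idx[:, 1], idx[:, 2]]
    return positions + velocities * dt
```

Each frame you feed the returned positions back in, so a particle keeps picking up whatever velocity its current voxel holds.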

Essentially, this gives us a highly accurate 3D visualization of how the wind is flowing at a single moment of the storm.  It looks sweet, and I’ve been building a lot of features to aid the visualization, like motion trails that show where a particle has been, arrows that show the wind direction in each voxel, etc.
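The motion-trail feature, for example, boils down to keeping a fixed-length history of each particle’s positions.  A toy sketch of that idea (hypothetical names; the real trails are rendered in Unity):

```python
from collections import deque

class MotionTrail:
    """Fixed-length history of where a particle has been."""

    def __init__(self, max_points=30):
        # deque(maxlen=...) silently drops the oldest point when full.
        self.points = deque(maxlen=max_points)

    def record(self, position):
        self.points.append(tuple(position))

    def segments(self):
        """Consecutive point pairs, ready to draw as line segments."""
        pts = list(self.points)
        return list(zip(pts, pts[1:]))
```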

Because the simulation requires a lot of calculation, it originally slowed Unity to a crawl when calculating 60,000 or more particles.  But I’ve moved the calculations over to a console application which does all the calculations in CUDA, allowing me to use the parallelization of the graphics card to easily push out the necessary calculations.  I then transfer that data from my console application to Unity using UDP.  Right now, UDP is actually the biggest bottleneck, because I have to transfer the position and color information of each particle AND each point of the motion trail to Unity 60 times per second.  That’s a lot of data.  Hopefully Unity will move out of the stone age soon and upgrade to a more recent .NET framework (it’s stuck on .NET 2.0/3.5 right now because of the way it uses Mono), at which time I can use the Cudafy library inside Unity itself, or more likely in a plugin I can write.
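A back-of-the-envelope calculation shows why.  Assuming three 4-byte floats for position plus 4 bytes of color per point, and 10 trail points per particle (both numbers are my guesses, not the project’s actual wire format), the raw rate is already in the hundreds of megabytes per second:

```python
PARTICLES = 60_000
TRAIL_POINTS_PER_PARTICLE = 10   # assumed trail length
FRAMES_PER_SECOND = 60
BYTES_PER_POINT = 3 * 4 + 4      # xyz as 32-bit floats + 4 bytes of color

points_per_frame = PARTICLES * (1 + TRAIL_POINTS_PER_PARTICLE)
bytes_per_second = points_per_frame * BYTES_PER_POINT * FRAMES_PER_SECOND
print(f"{bytes_per_second / 1e6:.1f} MB/s")  # 633.6 MB/s under these assumptions
```

Even if the real per-point layout differs, the order of magnitude makes it clear why serializing everything over UDP every frame hurts.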