This report presents my time learning about and implementing an interactable VR fluid simulation in Unity. I was originally inspired by the grid-based sandbox Powder, and after researching it I discovered a substantial amount of literature on fluid simulations. With these resources, as well as help from a few industry members I reached out to, I implemented 2D, 3D, and VR particle-based fluid simulations in Unity that run in real time, allowing for real-time user interaction. I did not end up having the time to implement realistic screen-space fluid rendering, so that is left as future work for this project.
All sources will be listed in the resources section.
All of these videos were taken in Unity's play mode at various steps during the implementation of the project. More implementation details can be found below. The VR recordings took place on a class-provided Meta Quest 2.
As I stated above, I was really inspired by Powder and the cool simulated environments you could create in it. Setting up some environment and letting the underlying physics run its course was something I found very satisfying. Powder is only in two dimensions, and I felt like adding a third would only add to its immersion - and making it in VR would add even more while also being in the vein of the course. I was especially motivated since there were very few particle sims in VR. However, the simulation in Powder includes many different particles and properties that would be very hard to implement in the little time I had, so I chose to focus only on simulating water - still a difficult task, but one with at least a lot of existing literature. I had some experience with Unity, and I knew Unity had great VR support, so I opted to implement this project with it.
I used the same math as the related work above, as I did not have time to implement my own unique fluid solver, as cool as that would be. I only had access to Müller's and Li's implementations from the work listed above, but my implementation was still unique in the following ways: I rendered the particles using Unity's GPU instancing, I made my simulation work in VR for real-time VR interactions, and I solved the entire simulation in parallel with shared compute buffers and compute shaders. My implementation being in VR, as well as my need for parallel solvers, also meant the algorithms I used needed to be updated slightly from the ones in the related work, but they are overall the same.
The following are my major contributions:
There are two typical ways fluids are simulated in 3D: Eulerian grids and Lagrangian particles. Eulerian methods keep track of variables on a fixed grid, while Lagrangian methods store fluid variables on individual particles. Zhu and Bridson [2005] discuss a method that combines the strengths of grid-based and particle-based simulation: the Fluid-Implicit Particle (FLIP) method. In the FLIP method, particles store their position and velocity and exist in an underlying grid whose cells keep track of their incoming/outgoing velocities, pressures, etc. At a glance, one step of the algorithm I implemented is as follows:
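To make the step concrete, here is a heavily simplified, single-threaded sketch of one FLIP step, reduced to 1D Python for readability. My actual implementation runs these stages in parallel compute shaders on the GPU, and every name below is illustrative rather than taken from my real code.

```python
def flip_step(positions, velocities, dt, gravity=-9.81,
              cell_size=1.0, num_cells=20, flip_ratio=0.9):
    """One illustrative FLIP step in 1D (not my actual shader code)."""
    # 1. Transfer particle velocities to the grid (weighted average).
    grid_vel = [0.0] * num_cells
    grid_weight = [0.0] * num_cells
    for p, v in zip(positions, velocities):
        i = min(int(p / cell_size), num_cells - 1)
        grid_vel[i] += v
        grid_weight[i] += 1.0
    for i in range(num_cells):
        if grid_weight[i] > 0:
            grid_vel[i] /= grid_weight[i]
    old_grid = list(grid_vel)  # saved so particles can read the grid *delta*

    # 2. Apply body forces (gravity) on the grid.
    for i in range(num_cells):
        if grid_weight[i] > 0:
            grid_vel[i] += gravity * dt

    # 3. (Omitted here) Solve for incompressibility: iteratively push
    #    velocity out of over-full cells until the divergence is ~0.

    # 4. Transfer grid velocities back to particles, blending the
    #    absolute grid value (PIC) with the grid change (FLIP).
    new_velocities = []
    for p, v in zip(positions, velocities):
        i = min(int(p / cell_size), num_cells - 1)
        pic = grid_vel[i]
        flip = v + (grid_vel[i] - old_grid[i])
        new_velocities.append(flip_ratio * flip + (1 - flip_ratio) * pic)

    # 5. Advect particles and clamp them to the domain walls.
    new_positions = []
    for p, v in zip(positions, new_velocities):
        new_positions.append(max(0.0, min(p + v * dt, num_cells * cell_size)))
    return new_positions, new_velocities
```

The incompressibility solve in step 3 is the expensive part and the main thing the compute shaders parallelize; the rest is just scatter/gather between particles and cells.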
We will discuss the more specific implementation details of this project in parts (a few of the simulation steps are presented as videos at the top of the website):
This wasn't the most quantitative project out there. When I start a simulation, I instantiate all my particles in a square block or something similar. So for most of my testing, I ran a "Dam Break" simulation, where I instantiate all of the particles in a large block on one side of the sim and let them crash into the other side. Here's an example of a Dam Break with 175k particles on a (20,10,10) grid running at about 45 fps:
And adding the alpha blending doesn't seem to cost any performance:
Here's an example of the sim running 625k particles at around 12.5 fps:
I think this looks amazing, but it just isn't possible to have real time interaction with a sim this large.
Running the simulation in VR was a lot slower. I think this is due to the stereo shader: it has to render twice the particles, once for each eye, though I felt like it should have been faster than it was. I could run a sim of size (10,5,5) with 100k particles at around 30-40 fps.
The sim above is still very responsive and fast to interact with; the freezing of the "hand" is due to the controller's low battery at the time of recording. If I try to render a sim of similar size to the non-VR one, it runs at about 15 fps, which is just not workable in VR:
And while it doesn't show up here, on the headset there is terrible blending and warping of the images, which I imagine is due to the very low fps.
In non-VR mode, I also tested other starting configurations. The different starting positions had no impact on performance, but they look really cool, so I'll add them here. Like a Double Dam Break:
A falling situation - not sure what to call this, lol:
This one actually highlights a bug in my implementation. Notice how the water tends towards the z = 0 axis as it falls. I'm not sure why, but my simulation likes to pull things towards z = 0 ever so slightly. It's hard to tell in other examples, and it really pained me for a while when I tried to fix it, but for time constraints I left it in since it still looks cool.
There were other issues too. I constantly had to tweak the resting pressure of the simulation or else it would just break. For example, if I left the pressure too low, the following would occur:
Or if it was too high:
Or if I did not include any damping when I imbued cell velocities to particles:
This is what most of my simulations looked like for a while - it took me far too long to figure out that I needed to add damping.
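As I understand it, the damping amounts to blending a fraction of the raw grid velocity (the PIC transfer) into the FLIP update, since a pure FLIP transfer keeps all of the noisy per-particle velocity. A small illustrative sketch, with made-up names rather than my actual shader code:

```python
def damped_velocity_transfer(particle_vel, grid_vel_new, grid_vel_old,
                             flip_ratio=0.9):
    """Blend PIC and FLIP transfers; (1 - flip_ratio) acts as damping.

    flip_ratio=1.0 is pure FLIP (energetic but noisy, like the broken
    videos above); lower values damp the sim toward smooth, viscous PIC.
    """
    pic = grid_vel_new                                   # absolute grid value
    flip = particle_vel + (grid_vel_new - grid_vel_old)  # grid delta
    return flip_ratio * flip + (1.0 - flip_ratio) * pic
```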
And the most nefarious bug was this line of separation that exists near the y = 0 plane:
This actually occurs on the x = 0 and z = 0 planes as well, but it only appears here because gravity forces the particles onto the y = 0 one. I think this is due to the fact that a cell's neighbors are the right, upper, or "deeper" cells, so when solving for incompressibility, the bottom cells don't have a neighbor below to push them up? So they just want to lie on the bottom... I don't know. Other FLIP solvers manually separate particles, but implementing this separation in parallel is a bit above my skill level and time budget, so I left it!
There is also one elusive bug where the entire sim will bounce up like it's jumping for joy. I couldn't record it, but I think it's due to the fact that my delta time for each simulation step is based off of the delta time in Unity, and something occasionally causes a large spike between frames. Maybe my computer is processing something else at that exact time, which causes the delta time to spike, which causes everything to jump? Unsure, but it rarely bothered me.
Finally, I attempted to make screen-space fluid rendering work, but it was hard due to Unity's weirdly confusing rendering pipeline. I had little experience with shaders, and implementing this turned out to be a bit too much in too little time. I don't have any decent results - the furthest I got before tapping out was implementing a Gaussian blur on a render texture of the particle normals:
My setup was very scuffed - I had one camera rendering the simulation, with normals attached to the spheres, into a render texture, and then ANOTHER camera pointing at that render texture and applying the blur. I would have kept going by implementing a bilateral blur, but I needed the depth texture of the image, and for some reason I couldn't grab it from a render texture in Unity. I'm sure there's a single line of code I was missing, but for the life of me I couldn't find it. So for the sake of my sanity and hairline, I chose not to continue.
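For what it's worth, the Gaussian blur itself is the easy part. What the two blur passes compute is just a separable convolution: blur every row with a 1D kernel, then every column. A pure-Python sketch over a grayscale image (the shader version does the same math per pixel):

```python
import math

def gaussian_kernel(radius, sigma):
    """1D Gaussian weights, normalized to sum to 1."""
    weights = [math.exp(-(i * i) / (2.0 * sigma * sigma))
               for i in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def blur_1d(row, kernel):
    """Convolve one row with the kernel, clamping at the edges."""
    radius = len(kernel) // 2
    out = []
    for x in range(len(row)):
        acc = 0.0
        for k, w in enumerate(kernel):
            xi = min(max(x + k - radius, 0), len(row) - 1)
            acc += row[xi] * w
        out.append(acc)
    return out

def separable_blur(image, radius=2, sigma=1.0):
    """Blur rows then columns - what the two render passes compute."""
    kernel = gaussian_kernel(radius, sigma)
    rows = [blur_1d(row, kernel) for row in image]
    cols = [blur_1d(col, kernel) for col in map(list, zip(*rows))]
    return list(map(list, zip(*cols)))  # transpose back
```

The bilateral blur I wanted next has the same structure but additionally weights each sample by how close its depth is to the center pixel's depth, which is why I needed that depth texture.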
I really did not do anything groundbreaking with this project. I implemented someone else's math in Unity's defined rendering pipeline, so I don't feel I have any ground to comment on future work for the field of fluid simulations. HOWEVER, I do have quite a bit of future work for this sim in particular. I need to address the bugs I listed above before I would be comfortable calling this project done. I left them in for the sake of time for this report, but all of them take away from the visual pleasure of my simulation. I also want to add some more complicated environments. Right now the fluid simulation takes place in a simple fish tank, but it would be cool to see this water navigate a maze or some other object. More methods of interaction in VR would also make the simulation a lot more novel. One feature I am adding now, but that won't make it into this report, is the ability to rotate the tank: when the user picks up and rotates the tank, gravity will effectively change and the particles will fall in a new direction. And of course, I really want to add screen-space fluid rendering to make the simulation look more like actual water - this is my biggest todo.
I had a lot of fun with this project. The purpose of this project wasn't to invent anything new, but to learn about fluid simulations and implement one in Unity, and I feel like I accomplished that. In the future, I really want to work more with physics simulations and VFX in general, since they're just so darn cool to look at. I think these effects would be especially valuable in the VR space - the medium of VR/AR makes experiencing them a lot cooler.
I could NOT have done this project without the help of some industry members:
My sources, in no particular order. I know that for a real report you need to cite them throughout - but I got really lazy, sorry :P.