Virtual Reality Systems

CSE 493V | Winter 2025

Overview

Modern virtual reality systems draw on the latest advances in optical fabrication, embedded computing, motion tracking, and real-time rendering. In this hands-on course, students will develop similar cross-disciplinary knowledge to build a head-mounted display. This project spans hardware (optics, displays, electronics, and microcontrollers) and software (JavaScript, WebGL, and GLSL). Each assignment builds toward this larger goal. For example, in one assignment, students will learn to use an inertial measurement unit (IMU) to track the orientation of the headset. In another, students will apply real-time computer graphics to correct lens distortions. Lectures will complement these engineering projects, diving into the history of AR/VR and relevant topics in computer graphics, signal processing, and human perception. Guest speakers from leading AR/VR companies will also participate.

For a summary of the 2020 edition of CSE 493V, including interviews with the students, please read "New Virtual Reality Systems course turns students into makers", as published by the Allen School News.

Acknowledgments

This course is based on Stanford EE 267. We thank Gordon Wetzstein for sharing his materials and supporting the development of CSE 493V. We also thank John Akers, Brian Curless, Steve Seitz, Ira Kemelmacher-Shlizerman, and Adriana Schulz for their support.

Requirements

This course is designed for senior undergraduates and early MS/PhD students. No prior experience with hardware is required. Students are expected to have completed Linear Algebra (MATH 308) and Systems Programming (CSE 333). Familiarity with JavaScript, Computer Vision (CSE 455), and Graphics (CSE 457) will be helpful, but not necessary. Registration is limited to 50 students.

Teaching Staff
Douglas Lanman
Affiliate Instructor, University of Washington, CSE
Senior Director, Display Systems Research, Reality Labs Research

Douglas is the Senior Director of Display Systems Research at Reality Labs Research, where he leads investigations into advanced display and imaging technologies. His prior research has focused on head-mounted displays, glasses-free 3D displays, light field cameras, and active illumination for 3D reconstruction and interaction. He received a B.S. in Applied Physics with Honors from Caltech in 2002 and M.S. and Ph.D. degrees in Electrical Engineering from Brown University in 2006 and 2010, respectively. He was a Senior Research Scientist at Nvidia Research from 2012 to 2014, a Postdoctoral Associate at the MIT Media Lab from 2010 to 2012, and an Assistant Research Staff Member at MIT Lincoln Laboratory from 2002 to 2005. His recent work focuses on passing the visual Turing test with AR/VR displays.

Evan Zhao
BS/MS Student, University of Washington, CSE

Evan is an undergraduate at the University of Washington majoring in Computer Science. He is passionate about computer graphics and the potential of combining graphics programming with fabrication techniques such as 3D printing and machine embroidery. As a member of the UW Reality Lab, he has learned to design interactive, efficient, and accessible virtual reality applications; he also wants to bring those virtual models into the physical world as touchable objects. Since discovering the potential of computational fabrication, he has set out to help pioneer the field and make design accessible to everyone.

Shaan Chattrath
BS Student, University of Washington, CSE

Shaan is a Computer Science student at the University of Washington. He is passionate about exploring the possibilities of virtual reality and computer vision. Through his coursework and internships, he has gained experience in graphics, AR/VR application development, and computer vision research. Most recently, Shaan worked with engineers and artists at SIGGRAPH 2024 to help run the VR Theater, and he was an intern at Amazon last summer.

Headset Development Kit
Assembly of the Winter 2025 CSE 493V development kit proceeds in the following steps.

Step 1a: Begin assembling the display by connecting the driver board.
Step 1b: Test the display.
Step 2a: Place the display into the enclosure and align the center of the screen.
Step 2b: Tune the distortion correction parameters (see the shader sketch after this list).
Step 2c: Optimize the headset fit until a clear image is visible.
Step 2d: Continue tuning the distortion correction until lines appear straight.
Step 2e: Disable the virtual background to test augmented reality viewing.
Step 3a: Assemble the IMU components.
Step 3b: Tune the IMU filtering.
Step 3c: Place the IMU components into the enclosure.
Step 4a: Bundle the cables into a sleeve.
Step 4b: Secure the cable sleeve to the side of the headset.
Step 5a: Test the finished headset by watching a 3D movie.
Step 5b: Ensure that IMU-based tracking stabilizes the virtual screen position.
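For steps 2b and 2d, distortion correction is typically implemented as a radial "barrel" pre-distortion applied in a fragment shader, so that the pincushion distortion of the lens cancels it and straight lines appear straight. The following is a minimal sketch of such a shader, written as a GLSL string in JavaScript; the uniform names and the two-coefficient polynomial model are illustrative assumptions, not the exact code developed in the homeworks.

    // Hypothetical lens-distortion pass: pre-distorts the rendered eye buffer
    // with a radial polynomial so the headset lens cancels the distortion.
    // The coefficients uK1 and uK2 stand in for the parameters tuned in steps 2b/2d.
    const distortionFragmentShader = `
      precision mediump float;

      uniform sampler2D uScene;   // undistorted eye-buffer texture
      uniform vec2 uCenter;       // lens center in texture coordinates
      uniform float uK1, uK2;     // radial distortion coefficients (assumed model)

      varying vec2 vTexCoord;

      void main() {
        vec2 d = vTexCoord - uCenter;   // offset from the lens center
        float r2 = dot(d, d);           // squared radius
        float s = 1.0 + uK1 * r2 + uK2 * r2 * r2;
        vec2 uv = uCenter + s * d;      // pre-distorted lookup coordinate

        // Render black outside the valid image region.
        if (any(lessThan(uv, vec2(0.0))) || any(greaterThan(uv, vec2(1.0)))) {
          gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
        } else {
          gl_FragColor = texture2D(uScene, uv);
        }
      }
    `;

In practice, each eye view is first rendered to a texture, and this pass redraws that texture through the distortion model while the coefficients are adjusted until lines appear straight.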

Students will be provided a kit to build their own head-mounted display, including an LCD, an HDMI driver board, an inertial measurement unit (IMU), lenses, an enclosure, and all cabling. Kits must be returned at the end of the course. All software will be developed through the homework assignments. The kit has been updated for Winter 2025, replacing the VR enclosure with an HRBOX2 AR headset. Component details are listed below.

Component         Model                                       Vendor
HMD Enclosure     Shenzhen Haori AR Headset (HRBOX2)          Alibaba
Display Panel     Waveshare 5.5″ 2560×1440 LCD                Waveshare
Microcontroller   Teensy 4.0                                  PJRC
IMU               InvenSense MPU-9250                         HiLetgo
Breadboards       Elegoo Mini Breadboard Kit                  Elegoo
Jumper Wires      Edgelec 30cm Jumper Wires (Male to Male)    Edgelec
HDMI Cable        StarTech 6′ High Speed HDMI Cable           StarTech
USB Cables        Anker 6′ Micro USB Cable (2-Pack)           Anker
Velcro            Strenco 2″ Adhesive Hook and Loop Tape      Strenco
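As a rough illustration of what the MPU-9250 is used for (step 3b above, and Homework 5), the sketch below fuses gyroscope and accelerometer readings with a complementary filter to estimate tilt. The sample format, axis conventions, and blending constant are assumptions for illustration; the actual assignment develops full 3-DoF orientation tracking with quaternions.

    // Hypothetical complementary filter (not the course code): blends the
    // gyroscope's fast-but-drifting integration with the accelerometer's
    // noisy-but-drift-free gravity direction. Angles are in radians.
    const ALPHA = 0.98; // gyro weight; 1 - ALPHA is the accelerometer weight

    let pitch = 0.0; // rotation about the assumed y-axis
    let roll = 0.0;  // rotation about the assumed x-axis

    // sample: { gx, gy, gz } in rad/s and { ax, ay, az } in m/s^2; dt in seconds
    function updateOrientation(sample, dt) {
      // Integrate angular rates (accurate short-term, drifts long-term).
      const gyroPitch = pitch + sample.gy * dt;
      const gyroRoll = roll + sample.gx * dt;

      // Recover tilt from the gravity vector (stable long-term, noisy short-term).
      const accelPitch = Math.atan2(-sample.ax, Math.sqrt(sample.ay ** 2 + sample.az ** 2));
      const accelRoll = Math.atan2(sample.ay, sample.az);

      // Blend the two estimates; heading (yaw) would also need the magnetometer.
      pitch = ALPHA * gyroPitch + (1 - ALPHA) * accelPitch;
      roll = ALPHA * gyroRoll + (1 - ALPHA) * accelRoll;
      return { pitch, roll };
    }

Tuning the filter (step 3b) amounts to adjusting the blending constant until the virtual screen stays stable without lagging behind head motion.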
Final Projects

VR Tennis Training by Iris Zhao, Jaylyn Zhang, and Yiqing Wang
Fluid Simulation in VR by James Try and Yafqa Khan
Human Pose Estimation using Ultra-Wideband Radar by Krish Jain and Jason Miller
Peltier-driven Thermal Haptics for VR by Dawson Harris and Hitender Oswal
Real-Time Emotion Recognition in AR by Jenny Peng and Brian Guo
Accessible AR/VR Control Based on Head Orientation by Nikolas McNamee
AR Object Detection and Labeling by Connor Reinholdtsen and Andy Stanciu
VR Video Chat using 3D Gaussian Splatting by Jason Zhang
Grocery Store Simulator by Kelly Zhen and Michelle Arquiza
Interactive Annotations in XR by Arushi Jeloka, Kriti Sharma, and Krishna Panchap
3D Physics Interactions with Hand Tracking by Aidan Gall and Sam Gan
3D Gaussian Splatting in VR by Derek Zhu, Hoang Nguyen, and Rich Chen
StoryboardXR by Kenneth Yang, Dalton Brockett, and Marley Byers
SoundDrift: Lidar-based Spatial Audio by Sai Sunku and Harshitha Rebala
Mono6D: Transforming 360° Video into Immersive VR Experiences by Boyang (Boe) Zhou
Physics-based Sound Simulation by Jiexiao Xu and Nicholas Batchelder
AR Cooking Assistant by Paul Han and Alvin Le
Sentient Sandbox: Modify Worlds with Language Models by Joshua Jung, Michael Li, and Eric Bae
Cooked by Brian Yao and Lawrence Tan
AR Math Helper by Valentina Zhang
Real-Time Environmental Lighting in AR by Ayush Agrawal
AR Navigation HUD by Brian Yao and Lawrence Tan

Final project reports and posters, grouped by topic:

Haptics
  • Peltier-driven Thermal Haptics for VR (Report and Poster)

AR Tools and Technologies
  • StoryboardXR (Report and Poster)
  • Real-Time Environmental Lighting in AR (Report and Poster)
  • AR Math Helper (Report and Poster)
  • AR Navigation HUD (Report and Poster)

Hand and Body Tracking
  • Human Pose Estimation using Ultra-Wideband Radar (Report and Poster)
  • 3D Physics Interactions with Hand Tracking (Report and Poster)

Audio
  • Physics-based Sound Simulation (Report and Poster)
  • SoundDrift: A Lidar-based Spatial Audio Experience (Website and Poster)

Assistive Technologies
  • Real-Time Emotion Recognition in AR (Report and Poster)
  • Accessible AR/VR Control Based on Head Orientation (Report)

3D Reconstruction
  • Mono6D: Transforming 360° Video into Immersive VR Experiences (Website and Poster)
  • VR Video Chat using 3D Gaussian Splatting (Report and Poster)
  • 3D Gaussian Splatting in VR (Report and Poster)

Generative AI and Assistants
  • Sentient Sandbox: Modify Worlds with Language Models (Report and Poster)
  • AR Cooking Assistant (Report and Poster)
  • Interactive Annotations in XR (Report and Poster)
  • AR Object Detection and Labeling (Report and Poster)

Games, Education, and Training Applications
  • Cooked: Co-Located Collaborative Culinary Challenge (Website and Poster)
  • VR Tennis Training (Report and Poster)
  • Fluid Simulation in VR (Report and Poster)
  • Grocery Store Simulator (Report and Poster)
Schedule
Lectures are on Wednesdays and Fridays from 4:30pm to 5:50pm in CSE2 G10.

  • Wednesday, January 8: No Class
  • Friday, January 10: Introduction to VR/AR Systems (Slides and Video; Sutherland [1968])
  • Wednesday, January 15: Optical Architectures for Head-Mounted Displays (Slides and Video; Kore [2018])
  • Friday, January 17: The Graphics Pipeline and OpenGL, Part I: Overview and Transformations (Slides, Video, and Notes; Marschner, Ch. 6 & 7)
  • Wednesday, January 22: The Graphics Pipeline and OpenGL, Part II: Lighting and Shading (Slides and Video; Marschner, Ch. 10 & 11)
  • Friday, January 24: The Graphics Pipeline and OpenGL, Part III: OpenGL Shading Language (GLSL) (Slides and Video)
  • Wednesday, January 29: The Graphics Pipeline and OpenGL, Part IV: Stereo Rendering (Slides and Video)
  • Friday, January 31: The Human Visual System (Slides and Video; LaValle, Ch. 5 & 6)
  • Wednesday, February 5: Inertial Measurement Units, Part I: Overview and Sensors (Slides, Video 1/2, and Notes; LaValle, Ch. 9.1 & 9.2)
  • Friday, February 7: Inertial Measurement Units, Part II: Filtering and Sensor Fusion (Slides and Video)
  • Wednesday, February 12: Positional Tracking, Part I: Overview and Sensors (Slides, Video, and Notes)
  • Friday, February 14: Positional Tracking, Part II: Filtering and Calibration (Slides and Video)
  • Wednesday, February 19: Advanced Topics, Part I: Engines and Emerging Technologies (Slides and Video)
  • Friday, February 21: Advanced Topics, Part II: Immersive Video (Slides and Video)
  • Wednesday, February 26: Advanced Topics, Part III: Mixed Reality Passthrough (Slides and Video)
  • Friday, February 28: Advanced Topics, Part IV: Glasses-free 3D Displays (Slides and Video)
  • Wednesday, March 5: Final Project Working Session
  • Friday, March 7: Industry Field Trip: Valve
  • Wednesday, March 12: Final Project Working Session
  • Friday, March 14: Industry Field Trip: Meta, Reality Labs Research
  • Friday, March 21: Final Project Demo Session (Open to the Public), 2:30pm to 4:20pm in Allen Center Atrium
Assignments

Students will complete five homeworks and a final project. Each homework is accompanied by a lab (a tutorial video). Labs must be completed before starting the homeworks. We encourage formatting written portions of homework solutions using the CSE 493V LaTeX template. Students must submit a one-page final project proposal and a final report. Final reports may take the form of a website or a conference manuscript.


  • Thursday, January 23: Homework 1, Transformations in WebGL (Lab 1 and Video; Assignment and Code; Solutions)
  • Thursday, January 30: Homework 2, Lighting and Shading with GLSL (Lab 2 and Video; Assignment and Code; Solutions)
  • Monday, February 10: Homework 3, Stereoscopic Rendering and Anaglyphs (Lab 3 and Video; Assignment and Code; Solutions)
  • Thursday, February 20: Homework 4, Build Your Own HMD (Lab 4 and Video; Assignment and Code; Python headset model; Solutions)
  • Friday, February 21: Final Project Proposal (Directions and Template; Example)
  • Thursday, February 27: Homework 5, Orientation Tracking with IMUs (Lab 5 and Video; Assignment and Code; Solutions)
  • Wednesday, March 19: Final Project Report (Template)
Grading and Collaboration

The grading breakdown is as follows: homeworks (70%) and final project (30%).

Assignments are due by midnight on the due date. Each student is granted a pool of four late days; up to two may be applied to any single homework, and longer extensions require instructor permission. Beyond these allowances, late assignments are marked down at a rate of 25% per day: an assignment is worth 75% of full credit for the first 24 hours after the deadline, 50% for the next 24 hours, 25% for the next 24 hours, and nothing thereafter. Exceptions will only be given with prior instructor approval.

While the headset development kits will be shared, students are expected to individually write their homework solutions. Students may collaborate to discuss concepts for the homeworks, but are expected to be able to explain their solutions for the purposes of grading by the instructor and TAs. Final project groups can be as large as three students, subject to instructor approval.

Textbooks and Resources

Lectures are supplemented by course notes, journal articles, and textbook chapters. The following textbooks, which are freely available to University of Washington students via the links below, will be used for CSE 493V.

All software will be developed using JavaScript, WebGL, and GLSL. Students should review the following tutorials and online resources to prepare for the labs, homeworks, and final projects.

Office Hours and Contacts

We encourage students to post their questions to Ed Discussion. The teaching staff can also be contacted directly at . The instructor and TAs will hold weekly office hours at the following times.

  • Douglas Lanman (Wednesdays, 5:50pm to 6:30pm, CSE2 G10)
  • Evan Zhao (Mondays, 2:00pm to 3:00pm, CSE2 153; Wednesdays, 2:30pm to 3:30pm, CSE2 153)
  • Shaan Chattrath (Tuesdays and Thursdays, 4:30pm to 5:30pm, Allen Center 218)