# VR Emulator Demo

A VR emulator that maps keyboard and mouse input to headset and controller tracking, enabling development without VR hardware.
The VR emulator demo lets you develop and test VR experiences on a standard PC
without any headset or controllers. It provides a software emulation layer that
maps keyboard and mouse input to VR tracking data, renders a stereo or mono
preview, and exposes the same `TrackingSnapshot` interface that real OpenXR
hardware would produce.
## What It Demonstrates
- **VR emulation** via `aether-vr-emulator`, which simulates a Quest 2 headset at its native resolution and refresh rate.
- **Stereo rendering** with per-eye viewports. The emulator splits the window into left and right halves, each with its own projection and eye offset.
- **Controller visualization** showing emulated controller positions as colored markers and aim beams projected into the scene.
- **Headset presets** (`HeadsetPreset::Quest2`) that configure field of view, IPD, resolution, and refresh rate to match real hardware specifications.
- **Headless mode** for automated testing: the emulator can run without a window, making it suitable for CI pipelines.
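The per-eye viewport split mentioned above can be sketched in a few lines. This is an illustrative standalone function, not the emulator's actual code: it divides a window into left- and right-eye rectangles of equal width.

```rust
/// Split a window of `width` x `height` pixels into left and right eye
/// viewports, each returned as (x, y, width, height).
/// Illustrative sketch only; the real StereoDisplay may differ.
fn split_viewports(width: u32, height: u32) -> ((u32, u32, u32, u32), (u32, u32, u32, u32)) {
    let half = width / 2;
    let left = (0, 0, half, height);       // left eye: left half of the window
    let right = (half, 0, half, height);   // right eye: right half of the window
    (left, right)
}
```

Each viewport then gets its own projection matrix and eye offset, which is what produces the stereo disparity.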
## How to Run
```sh
cargo run -p vr-emulator-demo
```
The emulator opens a window showing the stereo view. No VR headset or runtime is required.
## Controls
The emulator maps standard desktop input to VR tracking data:
| Input | VR Equivalent |
|---|---|
| Mouse movement | Head rotation (yaw and pitch) |
| W / A / S / D | Head position (forward / left / backward / right) |
| Space / Shift | Head position (up / down) |
| Left mouse button | Left controller trigger |
| Right mouse button | Right controller trigger |
| Q / E | Left / right controller grip |
| ESC | Quit |
Controller positions follow the head with fixed offsets, simulating hands at a natural resting position relative to the headset.
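The mapping in the table above can be sketched as plain math. The sketch below is illustrative, not the emulator's actual internals; the sensitivity and movement-speed constants are assumptions.

```rust
/// Illustrative head pose driven by desktop input (not the real emulator code).
pub struct HeadPose {
    pub yaw: f32,           // radians, driven by horizontal mouse movement
    pub pitch: f32,         // radians, driven by vertical mouse movement, clamped
    pub position: [f32; 3], // metres, driven by W/A/S/D, Space, Shift
}

/// Convert a mouse delta (pixels) into yaw/pitch, clamping pitch so the
/// view cannot flip over the poles.
pub fn apply_mouse_look(pose: &mut HeadPose, dx: f32, dy: f32) {
    const SENSITIVITY: f32 = 0.002; // radians per pixel (assumed value)
    pose.yaw -= dx * SENSITIVITY;
    pose.pitch = (pose.pitch - dy * SENSITIVITY)
        .clamp(-std::f32::consts::FRAC_PI_2, std::f32::consts::FRAC_PI_2);
}

/// W/S move along the yawed view direction, A/D strafe, Space/Shift move
/// vertically. Inputs are -1.0, 0.0, or 1.0 per axis.
pub fn apply_movement(pose: &mut HeadPose, forward: f32, strafe: f32, up: f32, dt: f32) {
    const SPEED: f32 = 2.0; // metres per second (assumed value)
    let (sin_yaw, cos_yaw) = pose.yaw.sin_cos();
    pose.position[0] += (strafe * cos_yaw - forward * sin_yaw) * SPEED * dt;
    pose.position[1] += up * SPEED * dt;
    pose.position[2] -= (forward * cos_yaw + strafe * sin_yaw) * SPEED * dt;
}
```

Movement is applied in the yawed frame so that "forward" always means the direction the emulated head is facing.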
## How the Emulator Works
The `VrEmulator` struct wraps the full emulation pipeline:

- **Configuration** -- `HeadsetPreset::Quest2` sets the emulated hardware profile (resolution, FOV, IPD, refresh rate).
- **Input polling** -- each call to `emulator.update(dt)` reads keyboard and mouse state and computes a `TrackingSnapshot` with head pose, left controller, and right controller transforms.
- **Framebuffer** -- `create_framebuffer()` allocates a pixel buffer matching the emulated resolution. The application clears it and renders into it.
- **Stereo display** -- `emulator.display()` returns a `StereoDisplay` that defines left and right eye viewports; `left_eye_view()` and `right_eye_view()` provide per-eye projection parameters.
- **Present** -- `present_with_overlay()` composites a debug overlay (showing tracking data) onto the framebuffer and displays it in the window.
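The pipeline above can be sketched with stub types standing in for the `aether-vr-emulator` API. Everything here (field names, the 63 mm IPD, the constructor) is an assumption for illustration; only the method names `update` and `display` come from the description above.

```rust
// Stub types mimicking the shape of the emulator API described above.
// These are NOT the real aether-vr-emulator types.
struct TrackingSnapshot { head_yaw: f32 }
struct EyeView { offset_x: f32 } // horizontal eye offset from head centre, metres

struct StereoDisplay { ipd: f32 }
impl StereoDisplay {
    fn left_eye_view(&self) -> EyeView { EyeView { offset_x: -self.ipd / 2.0 } }
    fn right_eye_view(&self) -> EyeView { EyeView { offset_x: self.ipd / 2.0 } }
}

struct VrEmulator { yaw: f32, ipd: f32 }
impl VrEmulator {
    fn new_quest2() -> Self {
        VrEmulator { yaw: 0.0, ipd: 0.063 } // 63 mm IPD is an assumed default
    }
    fn update(&mut self, _dt: f32) -> TrackingSnapshot {
        // Real emulator: poll keyboard/mouse and recompute all poses here.
        TrackingSnapshot { head_yaw: self.yaw }
    }
    fn display(&self) -> StereoDisplay { StereoDisplay { ipd: self.ipd } }
}

/// One iteration of the frame loop: poll input, then fetch per-eye views.
fn run_one_frame(emulator: &mut VrEmulator, dt: f32) -> (EyeView, EyeView) {
    let _snapshot = emulator.update(dt); // head + controller transforms
    let display = emulator.display();
    (display.left_eye_view(), display.right_eye_view())
}
```

The application would render the scene once per eye view into the framebuffer, then call the present step to show the composited result.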
## Subsystems Showcased
| Crate | Role in this demo |
|---|---|
| `aether-vr-emulator` | `VrEmulator`, `HeadsetPreset`, `StereoDisplay`, `EmulatorFrameBuffer` |
| `aether-input` | `TrackingSnapshot`, providing the same interface as OpenXR tracking |
## Why Use the Emulator
Developing VR applications typically requires wearing a headset and running an OpenXR runtime. The emulator removes this requirement for the majority of development tasks:
- Rapid iteration -- make code changes and test immediately on your desktop.
- CI/CD integration -- headless mode allows automated rendering tests without GPU or VR hardware.
- Accessibility -- contributors without VR hardware can still work on VR-related code.
The emulator produces the same `TrackingSnapshot` type as the real OpenXR
integration in `aether-input`, so code written against the emulator works
unchanged when switching to real hardware.
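This hardware-agnostic pattern can be illustrated with a small sketch. The field names below are assumptions for illustration, not the actual `TrackingSnapshot` definition; the point is that application code depends only on the snapshot type, never on where it came from.

```rust
// Hypothetical shape of a tracking snapshot (NOT the real aether-input type).
struct Pose { position: [f32; 3] }
struct TrackingSnapshot {
    head: Pose,
    left_hand: Pose,
    right_hand: Pose,
}

/// Gameplay code like this works identically whether the snapshot came from
/// the emulator or from real OpenXR hardware.
fn hands_apart(snapshot: &TrackingSnapshot) -> f32 {
    let l = snapshot.left_hand.position;
    let r = snapshot.right_hand.position;
    ((l[0] - r[0]).powi(2) + (l[1] - r[1]).powi(2) + (l[2] - r[2]).powi(2)).sqrt()
}
```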
## Scene Contents
The demo scene (`scene.rs`) renders a simple environment to validate the
stereo pipeline:
- A ground grid for spatial orientation.
- Four floating cubes and three spheres at fixed positions.
- Controller markers drawn at the emulated grip positions.
- Aim beams projected from the right controller.
## Source Location
All source files live under `examples/vr-emulator-demo/src/`:

- `main.rs` -- entry point, emulator setup, frame loop
- `scene.rs` -- per-eye scene rendering (grid, objects, controller markers)