VR Emulator Demo

VR emulator that maps keyboard and mouse input to headset and controller tracking for development without VR hardware.

The VR emulator demo lets you develop and test VR experiences on a standard PC without any headset or controllers. It provides a software emulation layer that maps keyboard and mouse input to VR tracking data, renders a stereo or mono preview, and exposes the same TrackingSnapshot interface that real OpenXR hardware would produce.

What It Demonstrates

  • VR emulation via aether-vr-emulator, which simulates a Quest 2 headset at its native resolution and refresh rate.
  • Stereo rendering with per-eye viewports. The emulator splits the window into left and right halves, each with its own projection and eye offset.
  • Controller visualization showing emulated controller positions as colored markers and aim beams projected into the scene.
  • Headset presets (HeadsetPreset::Quest2) that configure field of view, IPD, resolution, and refresh rate to match real hardware specifications.
  • Headless mode for automated testing -- the emulator can run without a window, making it suitable for CI pipelines.
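The headset-preset idea above can be sketched as a plain struct carrying published Quest 2 figures. This is illustrative only: the real HeadsetPreset in aether-vr-emulator may expose different fields, and the FOV value is approximate since Meta does not publish an exact number.

```rust
// Illustrative hardware profile, not the crate's real HeadsetPreset type.
#[derive(Debug, Clone, Copy)]
struct HeadsetProfile {
    per_eye_width: u32,
    per_eye_height: u32,
    refresh_hz: f32,
    ipd_m: f32,   // interpupillary distance in metres
    fov_deg: f32, // approximate horizontal field of view
}

fn quest2() -> HeadsetProfile {
    HeadsetProfile {
        per_eye_width: 1832,
        per_eye_height: 1920,
        refresh_hz: 72.0, // the hardware also supports 90 and 120 Hz modes
        ipd_m: 0.063,     // middle of the three physical IPD stops
        fov_deg: 97.0,    // approximate; no exact published figure
    }
}

fn main() {
    let p = quest2();
    println!("{}x{} per eye @ {} Hz", p.per_eye_width, p.per_eye_height, p.refresh_hz);
}
```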

How to Run

cargo run -p vr-emulator-demo

The emulator opens a window showing the stereo view. No VR headset or runtime is required.

Controls

The emulator maps standard desktop input to VR tracking data:

Input                VR Equivalent
Mouse movement       Head rotation (yaw and pitch)
W / A / S / D        Head position (forward / left / backward / right)
Space / Shift        Head position (up / down)
Left mouse button    Left controller trigger
Right mouse button   Right controller trigger
Q / E                Left / right controller grip
ESC                  Quit

Controller positions follow the head with fixed offsets, simulating hands at a natural resting position relative to the headset.
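The mapping in the table, plus the fixed controller offsets, can be sketched in plain Rust. Every name here (HeadPose, InputState, the sensitivity and offset constants, the -Z-forward convention) is an assumption for illustration, not the emulator's actual API.

```rust
// Hypothetical sketch of the desktop-to-VR input mapping described above.
const MOUSE_SENSITIVITY: f32 = 0.002; // radians per pixel (assumed)
const MOVE_SPEED: f32 = 1.5;          // metres per second (assumed)

#[derive(Default, Clone, Copy, Debug)]
struct HeadPose { yaw: f32, pitch: f32, pos: [f32; 3] }

struct InputState {
    mouse_dx: f32, mouse_dy: f32,
    w: bool, a: bool, s: bool, d: bool,
    space: bool, shift: bool,
}

fn update_head(head: &mut HeadPose, input: &InputState, dt: f32) {
    // Mouse movement -> head rotation (yaw and pitch).
    head.yaw -= input.mouse_dx * MOUSE_SENSITIVITY;
    head.pitch = (head.pitch - input.mouse_dy * MOUSE_SENSITIVITY).clamp(-1.5, 1.5);

    // WASD -> movement relative to where the head faces (-Z forward assumed).
    let (sin_y, cos_y) = head.yaw.sin_cos();
    let forward = [-sin_y, 0.0, -cos_y];
    let right = [cos_y, 0.0, -sin_y];
    let axis = |pos: bool, neg: bool| (pos as i32 - neg as i32) as f32;
    let fwd = axis(input.w, input.s);
    let strafe = axis(input.d, input.a);
    for i in 0..3 {
        head.pos[i] += (forward[i] * fwd + right[i] * strafe) * MOVE_SPEED * dt;
    }
    // Space / Shift -> vertical movement.
    head.pos[1] += axis(input.space, input.shift) * MOVE_SPEED * dt;
}

/// Controllers follow the head with fixed offsets (resting-hands pose).
fn controller_positions(head: &HeadPose) -> ([f32; 3], [f32; 3]) {
    let (sin_y, cos_y) = head.yaw.sin_cos();
    let right = [cos_y, 0.0, -sin_y];
    let hand = |side: f32| [
        head.pos[0] + right[0] * side * 0.2 - sin_y * 0.3, // 20 cm out, 30 cm ahead
        head.pos[1] - 0.3,                                 // 30 cm below the head
        head.pos[2] + right[2] * side * 0.2 - cos_y * 0.3,
    ];
    (hand(-1.0), hand(1.0)) // (left, right)
}

fn main() {
    let mut head = HeadPose::default();
    let input = InputState {
        mouse_dx: 0.0, mouse_dy: 0.0,
        w: true, a: false, s: false, d: false, space: false, shift: false,
    };
    update_head(&mut head, &input, 1.0);
    let (l, r) = controller_positions(&head);
    println!("head {:?}, left {:?}, right {:?}", head.pos, l, r);
}
```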

How the Emulator Works

The VrEmulator struct wraps the full emulation pipeline:

  1. Configuration -- HeadsetPreset::Quest2 sets the emulated hardware profile (resolution, FOV, IPD, refresh rate).
  2. Input polling -- each call to emulator.update(dt) reads keyboard and mouse state and computes a TrackingSnapshot with head pose, left controller, and right controller transforms.
  3. Framebuffer -- create_framebuffer() allocates a pixel buffer matching the emulated resolution. The application clears it and renders into it.
  4. Stereo display -- emulator.display() returns a StereoDisplay that defines left and right eye viewports. left_eye_view() and right_eye_view() provide per-eye projection parameters.
  5. Present -- present_with_overlay() composites a debug overlay (showing tracking data) onto the framebuffer and displays it in the window.
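The five steps above correspond to a frame loop shaped roughly like the following mock. All types here are simplified stand-ins for the aether-vr-emulator API, defined inline so the sketch is self-contained; the real methods and signatures may differ.

```rust
// Minimal mock of the five-step pipeline; stand-in types, not the real API.
struct Framebuffer { width: usize, height: usize, pixels: Vec<u32> }

impl Framebuffer {
    fn new(width: usize, height: usize) -> Self {
        Self { width, height, pixels: vec![0; width * height] }
    }
    fn clear(&mut self, color: u32) { self.pixels.fill(color); }
}

/// Per-eye viewport: left eye gets the left half of the window,
/// the right eye gets the right half.
struct Viewport { x: usize, y: usize, w: usize, h: usize }

struct StereoDisplay { width: usize, height: usize }

impl StereoDisplay {
    fn left_eye(&self) -> Viewport {
        Viewport { x: 0, y: 0, w: self.width / 2, h: self.height }
    }
    fn right_eye(&self) -> Viewport {
        Viewport { x: self.width / 2, y: 0, w: self.width / 2, h: self.height }
    }
}

fn main() {
    // 1. Configuration: a Quest 2-like profile would set the real size;
    //    a small mock resolution keeps this sketch cheap.
    let (w, h) = (256, 128);
    let mut fb = Framebuffer::new(w, h);
    let display = StereoDisplay { width: w, height: h };

    // A few iterations stand in for the real frame loop.
    for _frame in 0..3 {
        // 2. Input polling would produce a TrackingSnapshot here.
        // 3. Clear and render into the framebuffer.
        fb.clear(0xFF20_2020);
        // 4. Render the scene once per eye, into that eye's viewport.
        for vp in [display.left_eye(), display.right_eye()] {
            let _ = (vp.x, vp.y, vp.w, vp.h); // a real renderer draws here
        }
        // 5. present_with_overlay() would composite the debug overlay and
        //    show the result in the window (omitted in this mock).
    }
    println!("left eye width = {}", display.left_eye().w);
}
```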

Subsystems Showcased

Crate                 Role in this demo
aether-vr-emulator    VrEmulator, HeadsetPreset, StereoDisplay, EmulatorFrameBuffer
aether-input          TrackingSnapshot providing the same interface as OpenXR tracking

Why Use the Emulator

Developing VR applications typically requires wearing a headset and running an OpenXR runtime. The emulator removes this requirement for most development tasks:

  • Rapid iteration -- make code changes and test immediately on your desktop.
  • CI/CD integration -- headless mode allows automated rendering tests without GPU or VR hardware.
  • Accessibility -- contributors without VR hardware can still work on VR-related code.

The emulator produces the same TrackingSnapshot type as the real OpenXR integration in aether-input, so code written against the emulator works unchanged when switching to real hardware.
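The portability claim above can be sketched with a source abstraction: application code takes a snapshot and never cares which backend produced it. The struct and trait here are illustrative stand-ins; the real TrackingSnapshot lives in aether-input and may look different.

```rust
// Illustrative sketch of source-independent tracking code, not the real API.
#[derive(Clone, Copy, Debug, PartialEq)]
struct TrackingSnapshot {
    head_pos: [f32; 3],
    left_trigger: f32,
    right_trigger: f32,
}

/// Anything that can produce a snapshot: the emulator on desktop,
/// the OpenXR integration on real hardware.
trait TrackingSource {
    fn poll(&mut self, dt: f32) -> TrackingSnapshot;
}

/// Emulated source: synthesizes tracking from desktop input (fixed here).
struct EmulatedSource;

impl TrackingSource for EmulatedSource {
    fn poll(&mut self, _dt: f32) -> TrackingSnapshot {
        TrackingSnapshot {
            head_pos: [0.0, 1.6, 0.0], // standing eye height
            left_trigger: 0.0,
            right_trigger: 1.0,        // right mouse button held
        }
    }
}

/// Application logic written once against the snapshot type.
fn is_grabbing(snap: &TrackingSnapshot) -> bool {
    snap.right_trigger > 0.5
}

fn main() {
    let mut source = EmulatedSource; // swap for an OpenXR source on hardware
    let snap = source.poll(1.0 / 72.0);
    println!("grabbing: {}", is_grabbing(&snap));
}
```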

Scene Contents

The demo scene (scene.rs) renders a simple environment to validate the stereo pipeline:

  • A ground grid for spatial orientation.
  • Four floating cubes and three spheres at fixed positions.
  • Controller markers drawn at the emulated grip positions.
  • Aim beams projected from the right controller.
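The per-eye rendering that this scene validates boils down to offsetting the camera by half the IPD along the head's right vector before drawing each eye. A minimal sketch of that offset, using plain arrays rather than engine types:

```rust
// Sketch of the per-eye camera offset behind stereo rendering:
// each eye's position is the head position shifted half the IPD
// along the head's right vector.
fn eye_positions(head: [f32; 3], right: [f32; 3], ipd: f32) -> ([f32; 3], [f32; 3]) {
    let half = ipd / 2.0;
    let shift = |s: f32| [
        head[0] + right[0] * s,
        head[1] + right[1] * s,
        head[2] + right[2] * s,
    ];
    (shift(-half), shift(half)) // (left eye, right eye)
}

fn main() {
    // Head at standing height, facing -Z, 63 mm IPD.
    let (l, r) = eye_positions([0.0, 1.6, 0.0], [1.0, 0.0, 0.0], 0.063);
    println!("left {:?}, right {:?}", l, r);
}
```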

Source Location

All source files live under examples/vr-emulator-demo/src/:

  • main.rs -- entry point, emulator setup, frame loop
  • scene.rs -- per-eye scene rendering (grid, objects, controller markers)