Input and VR Interaction

Learn how aether-input provides VR controller tracking, desktop fallback, action mapping, gesture detection, locomotion, and haptic feedback through OpenXR.

Overview

The aether-input crate handles all input processing for the Aether engine, from raw hardware signals to high-level action events. It provides a unified abstraction over VR controllers (via OpenXR), desktop keyboard/mouse, and gamepads. The pipeline transforms raw inputs through dead zones and sensitivity curves, maps them to named actions, detects gestures, and computes locomotion vectors.

Input Pipeline

Raw input flows through a series of processing stages:

Raw Input -> Dead Zone -> Sensitivity Curve -> Action Mapping -> Gesture Detection -> Action Events

use aether_input::{InputPipeline, ActionMap, DeadZoneConfig, SensitivityCurve};

let pipeline = InputPipeline::new(action_map, DeadZoneConfig::default(), SensitivityCurve::Quadratic);
let events = pipeline.process(&raw_state, timestamp_ms);

Dead Zones and Sensitivity

Dead zone processing eliminates thumbstick drift by zeroing small inputs:

use aether_input::deadzone::{DeadZoneConfig, DeadZoneShape, apply_dead_zone};

let config = DeadZoneConfig {
    inner_radius: 0.15,  // Below 15% -> zero
    outer_radius: 0.95,  // Above 95% -> max
    shape: DeadZoneShape::Circular,
};
let (x, y) = apply_dead_zone(raw_x, raw_y, &config);
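The circular dead zone logic can be sketched as a standalone function (a minimal illustration of the idea; `apply_dead_zone`'s exact behavior may differ):

```rust
// Sketch of circular dead zone filtering: inputs below the inner radius
// are zeroed, inputs above the outer radius are clamped to full
// deflection, and the band in between is rescaled onto [0, 1].
fn circular_dead_zone(x: f32, y: f32, inner: f32, outer: f32) -> (f32, f32) {
    let mag = (x * x + y * y).sqrt();
    if mag < inner {
        return (0.0, 0.0); // inside the dead zone: treat as centered
    }
    // Rescale the magnitude so the usable band maps smoothly onto [0, 1].
    let scaled = ((mag - inner) / (outer - inner)).min(1.0);
    (x / mag * scaled, y / mag * scaled)
}

fn main() {
    // Drift below the inner radius is zeroed out entirely.
    assert_eq!(circular_dead_zone(0.1, 0.0, 0.15, 0.95), (0.0, 0.0));
    // Full deflection still reaches 1.0 after rescaling.
    let (x, _) = circular_dead_zone(1.0, 0.0, 0.15, 0.95);
    assert!((x - 1.0).abs() < 1e-6);
}
```

Rescaling (rather than simply zeroing) avoids a visible jump in response right at the inner radius.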

Sensitivity curves reshape the response after dead zone filtering:

use aether_input::deadzone::{SensitivityCurve, apply_sensitivity};

apply_sensitivity(0.5, &SensitivityCurve::Linear);     // 0.5
apply_sensitivity(0.5, &SensitivityCurve::Quadratic);  // 0.25 (more precision at low input)
apply_sensitivity(0.5, &SensitivityCurve::Cubic);      // 0.125
// Also: SCurve, Custom(Vec<(f32, f32)>)
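The power curves above can be sketched directly; the key detail is that the sign of the input must survive the exponent (this is an illustrative sketch, not the crate's implementation):

```rust
// Sketch of power-based sensitivity shaping: raise the magnitude to a
// power while preserving the input's sign, so small deflections map to
// even smaller outputs and give finer low-speed control.
fn quadratic(x: f32) -> f32 {
    // Even powers would lose the sign, so restore it explicitly.
    x.abs().powi(2) * x.signum()
}

fn cubic(x: f32) -> f32 {
    // Odd powers preserve the sign on their own.
    x.powi(3)
}

fn main() {
    assert!((quadratic(0.5) - 0.25).abs() < 1e-6);
    assert!((quadratic(-0.5) + 0.25).abs() < 1e-6); // sign preserved
    assert!((cubic(0.5) - 0.125).abs() < 1e-6);
}
```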

Action Mapping

Named actions decouple physical inputs from game logic, supporting multiple bindings per action:

use aether_input::mapping::{ActionMap, InputSource, InputGesture, Hand, VrButton, Axis};
use aether_input::desktop::KeyCode;

let mut map = ActionMap::new();
map.bind("jump", InputSource::KeyboardKey(KeyCode::Space), InputGesture::Press);
map.bind("jump", InputSource::VrButton { hand: Hand::Right, button: VrButton::A }, InputGesture::Press);
map.bind("move_forward", InputSource::VrThumbstick { hand: Hand::Left, axis: Axis::Y }, InputGesture::Press);

Gesture Detection

The gesture system uses state machines to detect complex input patterns:

  • Press/Release -- Fires on key down or up
  • Hold -- Fires after a button is held for a minimum duration; early release cancels
  • DoubleTap -- Fires on second press within a configurable interval
  • Combo -- Requires multiple buttons pressed simultaneously

use aether_input::graph::GestureDetector;

let mut detector = GestureDetector::new();
detector.register("charge", InputGesture::Hold { min_duration_ms: 500 });
detector.register("dash", InputGesture::DoubleTap { max_interval_ms: 300 });
let events = detector.update(source, is_pressed, now_ms);
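The Hold detector's state machine can be sketched as follows (a minimal standalone version for illustration; the crate's `GestureDetector` tracks many gestures at once):

```rust
// Sketch of a Hold gesture state machine: record when the button went
// down, fire once the configured duration elapses, and cancel silently
// if the button is released early.
struct HoldDetector {
    min_duration_ms: u64,
    pressed_at: Option<u64>,
    fired: bool,
}

impl HoldDetector {
    fn new(min_duration_ms: u64) -> Self {
        Self { min_duration_ms, pressed_at: None, fired: false }
    }

    /// Returns true on the update where the hold threshold is crossed.
    fn update(&mut self, is_pressed: bool, now_ms: u64) -> bool {
        match (is_pressed, self.pressed_at) {
            (true, None) => {
                // Press starts the timer.
                self.pressed_at = Some(now_ms);
                self.fired = false;
                false
            }
            (true, Some(start)) if !self.fired && now_ms - start >= self.min_duration_ms => {
                self.fired = true; // fire exactly once per hold
                true
            }
            (false, _) => {
                // Early release cancels; nothing fires.
                self.pressed_at = None;
                false
            }
            _ => false,
        }
    }
}

fn main() {
    let mut hold = HoldDetector::new(500);
    assert!(!hold.update(true, 0));    // press starts the timer
    assert!(!hold.update(true, 300));  // not held long enough yet
    assert!(hold.update(true, 600));   // threshold crossed: fires
    assert!(!hold.update(true, 700));  // fires only once per hold
}
```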

Locomotion

Four movement modes common in VR are supported as pure functions:

use aether_input::movement::*;

// Smooth: continuous analog movement
let delta = compute_smooth_move(direction, speed, acceleration, max_speed, dt);

// Teleport: point-and-commit with validation
match compute_teleport(origin, target, max_distance) {
    TeleportResult::Valid { position } => { /* move */ }
    TeleportResult::OutOfRange => { /* invalid */ }
    TeleportResult::Blocked => { /* obstructed */ }
}

// Snap turn: discrete rotation in fixed increments (yaw wraps at 360°)
let yaw = compute_snap_turn(current_yaw, 30.0);

// Smooth turn: continuous rotation
let yaw = compute_smooth_turn(current_yaw, turn_speed, dt);
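The snap-turn wrap is a one-liner worth spelling out (an illustrative sketch of the arithmetic, not the crate's code):

```rust
// Sketch of snap-turn wrapping: add the step and wrap the result back
// into [0, 360). rem_euclid handles negative steps correctly, where a
// plain % would return a negative remainder.
fn snap_turn(current_yaw: f32, step_deg: f32) -> f32 {
    (current_yaw + step_deg).rem_euclid(360.0)
}

fn main() {
    assert_eq!(snap_turn(350.0, 30.0), 20.0);  // wraps past 360
    assert_eq!(snap_turn(0.0, -30.0), 330.0);  // negative step wraps too
}
```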

OpenXR Integration

Session Lifecycle

The session follows the OpenXR state machine: Idle -> Ready -> Synchronized -> Visible -> Focused -> Stopping. The LossPending and Exiting states handle runtime disconnection and application shutdown.

use aether_input::openxr_session::{SessionManager, SessionState};

let mut session = SessionManager::new();
session.request_begin()?;
match session.state() {
    SessionState::Focused => { /* Full input + rendering */ }
    SessionState::Visible => { /* Rendering, no input focus */ }
    _ => {}
}

Tracking

Per-frame snapshots provide HMD and controller poses with confidence levels:

use aether_input::openxr_tracking::TrackingSnapshot;

let snap: TrackingSnapshot = tracking.poll();
let head = snap.hmd_pose; // position + rotation
let right = &snap.right_controller;
if right.trigger_value > 0.8 { /* trigger pulled */ }

// Optional 26-joint hand tracking
if let Some(joints) = snap.left_hand_joints {
    let tip = joints.joints[HandJoint::IndexTip as usize];
}

Haptic Feedback

Vibration effects are submitted per hand through the dispatcher:

use aether_input::openxr_haptics::{HapticDispatcher, HapticEffect, HapticWave};

let mut haptics = HapticDispatcher::new();
haptics.submit(HapticEffect {
    hand: Hand::Right, wave: HapticWave::Pulse,
    amplitude: 0.7, duration_ms: 100,
});

The dispatcher enforces cooldowns and clamps amplitude to [0.0, 1.0].
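The clamping and cooldown behavior can be sketched like this (the cooldown value and per-hand state shape are illustrative assumptions, not the crate's actual internals):

```rust
// Sketch of haptic submission guarding: reject effects that arrive
// within the cooldown window and clamp amplitude into [0.0, 1.0].
// COOLDOWN_MS is an assumed illustrative value.
const COOLDOWN_MS: u64 = 50;

struct HandState {
    last_submit_ms: Option<u64>,
}

/// Returns the clamped amplitude if accepted, or None if still cooling down.
fn try_submit(state: &mut HandState, amplitude: f32, now_ms: u64) -> Option<f32> {
    if let Some(last) = state.last_submit_ms {
        if now_ms - last < COOLDOWN_MS {
            return None; // within cooldown: drop the effect
        }
    }
    state.last_submit_ms = Some(now_ms);
    Some(amplitude.clamp(0.0, 1.0)) // enforce [0.0, 1.0]
}

fn main() {
    let mut right = HandState { last_submit_ms: None };
    assert_eq!(try_submit(&mut right, 1.5, 0), Some(1.0)); // clamped down
    assert_eq!(try_submit(&mut right, 0.5, 10), None);     // within cooldown
    assert_eq!(try_submit(&mut right, 0.5, 100), Some(0.5));
}
```

Cooldowns prevent rapid-fire gameplay events from saturating the controller motor with overlapping pulses.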

Swapchain and Reference Spaces

Swapchain management handles image acquire/release for rendering integration. Three reference spaces are supported: Local (seated, origin at initial position), Stage (room-scale, origin at floor), and View (locked to head, for UI overlays).
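The choice of reference space follows directly from the use case; a sketch of that decision, assuming a `ReferenceSpace` enum like the one listed in the reference table below (the helper and its string keys are hypothetical):

```rust
// Sketch of selecting a reference space per use case. The enum mirrors
// the three spaces described above; space_for and its keys are
// illustrative, not part of the crate's API.
#[derive(Debug, PartialEq)]
enum ReferenceSpace {
    Local, // seated: origin at the initial head position
    Stage, // room-scale: origin on the floor
    View,  // head-locked: follows the HMD, useful for UI overlays
}

fn space_for(use_case: &str) -> ReferenceSpace {
    match use_case {
        "seated" => ReferenceSpace::Local,
        "room_scale" => ReferenceSpace::Stage,
        _ => ReferenceSpace::View,
    }
}

fn main() {
    assert_eq!(space_for("room_scale"), ReferenceSpace::Stage);
    assert_eq!(space_for("hud"), ReferenceSpace::View);
}
```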

Desktop Fallback

For development without VR hardware, DesktopAdapter maps keyboard/mouse to the same RuntimeAdapter interface:

use aether_input::desktop::{DesktopAdapter, KeyCode};

let mut adapter = DesktopAdapter::new();
adapter.update_key(KeyCode::W, true);
adapter.update_mouse(dx, dy);
let frame = adapter.poll(); // Same InputFrame format as OpenXR

Key Types Reference

Type              Module            Purpose
----              ------            -------
InputPipeline     processing        Full input processing chain
ActionMap         mapping           Input-to-action bindings
GestureDetector   graph             Hold, double-tap, combo detection
DeadZoneConfig    deadzone          Thumbstick dead zone parameters
SensitivityCurve  deadzone          Response curve shaping
DesktopAdapter    desktop           Keyboard/mouse adapter for testing
SessionManager    openxr_session    OpenXR session lifecycle
TrackingSnapshot  openxr_tracking   Per-frame HMD and controller data
HapticDispatcher  openxr_haptics    Vibration command submission
SwapchainManager  openxr_swapchain  VR swapchain image lifecycle
ReferenceSpace    openxr_session    Local, Stage, or View origin