ISO/IEC 23090-31 — MPEG-I Haptics Coding

Adding Touch to Immersive Media

MPEG-I Haptics Coding defines the standardized representation, compression and delivery of haptic signals—the sensations of touch, force and motion that enrich immersive experiences. Published in January 2025 as ISO/IEC 23090-31, it unifies the encoding of vibrotactile and kinesthetic data, enabling interoperable, high-fidelity tactile feedback across XR, gaming, teleoperation and accessibility applications.

Architecture & Core Concepts

Dual input model

The standard supports both descriptive (parametric) inputs, i.e. high-level tactile effects described in JSON, and PCM (signal-based) inputs, i.e. time-sampled vibration or force waveforms. Either type can be serialized in two formats: the human-readable HJIF (Haptic JSON Interchange Format) and the binary, packetized MIHS (MPEG-I Haptic Stream).
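
For orientation, the sketch below builds an HJIF-like document in Python. The field names are simplified stand-ins chosen for illustration, not the normative ISO/IEC 23090-31 schema, so treat it as the shape of the format rather than a valid file.

```python
import json

# Illustrative HJIF-like document (field names are simplified stand-ins,
# not the normative schema): one parametric band and one PCM-derived
# wavelet band under a single perception.
hjif_doc = {
    "version": "1.0",
    "perceptions": [
        {
            "modality": "vibrotactile",
            "channels": [
                {
                    "bodyPart": "palm",
                    "bands": [
                        {
                            "type": "VectorialWave",  # descriptive input
                            "effects": [
                                {"keyframes": [
                                    {"time_ms": 0,   "amplitude": 0.0, "frequency_hz": 80},
                                    {"time_ms": 250, "amplitude": 1.0, "frequency_hz": 120},
                                ]}
                            ],
                        },
                        {"type": "Wavelet", "blockLength": 512},  # PCM-derived input
                    ],
                }
            ],
        }
    ],
}

print(json.dumps(hjif_doc, indent=2))
```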

Hierarchical data structure

Haptic content is organized into five hierarchical layers: Experience → Perception → Channel → Band → Effect. Metadata at each layer describes modalities (vibration, force, temperature, …), device mappings, and body locations, allowing adaptive playback on diverse hardware.
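
One way to picture the hierarchy is as nested data types. The Python dataclasses below mirror the five layers; the field names and types are illustrative assumptions, not the normative metadata model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Effect:            # lowest layer: one keyframed or wavelet-coded effect
    position_ms: int
    payload: dict

@dataclass
class Band:              # frequency band with its coding type (Curve, Wavelet, ...)
    band_type: str
    lower_hz: float
    upper_hz: float
    effects: List[Effect] = field(default_factory=list)

@dataclass
class Channel:           # maps to a body location / actuator
    body_part: str
    bands: List[Band] = field(default_factory=list)

@dataclass
class Perception:        # one modality: vibration, force, temperature, ...
    modality: str
    channels: List[Channel] = field(default_factory=list)

@dataclass
class Experience:        # top layer: the whole haptic experience
    description: str
    perceptions: List[Perception] = field(default_factory=list)
```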

Flexibility and scalability

The architecture supports very low bit-rate (≤ 8 kbps) synthetic effects as well as high-fidelity wavelet-encoded signals (≥ 64 kbps). Developers can mix parametric and wavelet bands within one stream to optimize quality and latency for each modality and device.
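
As a rough illustration of that scalability, the hypothetical helper below picks a band mix for a target bitrate using the thresholds quoted above; the policy itself is an assumption for demonstration, not normative encoder behavior.

```python
def choose_band_mix(target_kbps: float) -> list[str]:
    """Pick a plausible band configuration for a bitrate budget.

    Thresholds mirror the ranges quoted above (parametric-only at very low
    rates, wavelet bands as the budget grows); the policy is illustrative.
    """
    if target_kbps <= 8:
        return ["Transient", "Curve"]              # synthetic, very low rate
    if target_kbps < 64:
        return ["Transient", "Curve", "Wavelet"]   # hybrid: parametric + wavelet
    return ["Wavelet"]                             # high-fidelity, wavelet-dominant

for rate in (4, 16, 64):
    print(rate, "kbps ->", choose_band_mix(rate))
```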

Reference software model

A public reference implementation is available on GitHub. It encodes, decodes and renders HJIF/MIHS streams, illustrating both parametric and wavelet pipelines and serving as a baseline for conformance tests.

Encoding Modes & Performance

Vectorial (parametric) coding

Encodes signals as keyframed amplitude and frequency functions, interpolated between keyframes (Transient, Curve, and VectorialWave band types). Ideal for low-frequency or synthetic effects on mobile devices.
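
The sketch below renders a waveform from (time, amplitude, frequency) keyframes with linear interpolation, roughly what a vectorial band describes. The normative interpolation types differ in detail, so treat this as an approximation.

```python
import numpy as np

def render_vectorial(keyframes, fs=8000):
    """Synthesize a vibration waveform from (time_s, amplitude, frequency_hz)
    keyframes using linear interpolation, in the spirit of a VectorialWave
    band. The exact normative interpolation rules differ; this is a sketch."""
    times = np.array([k[0] for k in keyframes])
    amps  = np.array([k[1] for k in keyframes])
    freqs = np.array([k[2] for k in keyframes])

    t = np.arange(0, times[-1], 1 / fs)
    amp  = np.interp(t, times, amps)           # piecewise-linear amplitude envelope
    freq = np.interp(t, times, freqs)          # piecewise-linear frequency contour
    phase = 2 * np.pi * np.cumsum(freq) / fs   # integrate frequency -> phase
    return amp * np.sin(phase)

signal = render_vectorial([(0.0, 0.0, 80), (0.25, 1.0, 120), (0.5, 0.2, 60)])
```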

Wavelet coding

Decomposes PCM signals into frequency bands with a wavelet transform, then quantizes the coefficients and entropy-codes them using SPIHT, providing high-accuracy reconstruction of recorded signals.
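
The snippet below sketches the analysis and quantization stage with the PyWavelets package, assuming a db4 wavelet and a uniform step size; the standard's actual filter bank and the SPIHT entropy-coding stage are not reproduced.

```python
import numpy as np
import pywt  # PyWavelets; stand-in for the normative filter bank

def wavelet_quantize(block: np.ndarray, levels: int = 4, step: float = 0.02):
    """Decompose one PCM block into wavelet subbands and uniformly quantize
    the coefficients. In the real codec, the quantized coefficients would
    then be entropy-coded with SPIHT."""
    coeffs = pywt.wavedec(block, "db4", level=levels)      # subband analysis
    return [np.round(c / step).astype(np.int32) for c in coeffs]

def wavelet_dequantize(quantized, step: float = 0.02) -> np.ndarray:
    coeffs = [q.astype(np.float64) * step for q in quantized]
    return pywt.waverec(coeffs, "db4")                     # synthesis

block = np.sin(2 * np.pi * 200 * np.arange(512) / 8000)   # 512-sample PCM block
rec = wavelet_dequantize(wavelet_quantize(block))
print("max reconstruction error:", np.max(np.abs(rec[:512] - block)))
```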

Hybrid approach

Combines vectorial low-frequency bands with wavelet high-frequency bands (C2VWR mode). This two-layer scheme balances complexity and quality for interactive XR content.
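
A minimal sketch of the band split, assuming a Butterworth crossover; the cutoff frequency and filter choice here are illustrative, not the normative band structure.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def split_bands(pcm: np.ndarray, fs: float, crossover_hz: float = 72.5):
    """Split a signal into a low band (for vectorial coding) and a high band
    (for wavelet coding). The crossover value and Butterworth filters are
    illustrative assumptions."""
    b_lo, a_lo = butter(4, crossover_hz, btype="low", fs=fs)
    b_hi, a_hi = butter(4, crossover_hz, btype="high", fs=fs)
    return filtfilt(b_lo, a_lo, pcm), filtfilt(b_hi, a_hi, pcm)

fs = 8000
t = np.arange(0, 0.5, 1 / fs)
pcm = np.sin(2 * np.pi * 40 * t) + 0.5 * np.sin(2 * np.pi * 250 * t)
low_band, high_band = split_bands(pcm, fs)   # vectorial vs. wavelet layers
```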

Evaluations

Objective and subjective tests (MUSHRA, 35 participants across 3 labs) show that C2VWR achieves excellent perceptual quality above 8 kbps and perceptually lossless reconstruction at ≥ 64 kbps.

Conformance & Reference Framework

Validation tools

ISO/IEC 23090-33 defines schema and semantic verifiers for HJIF, bitstream verifiers for MIHS, compatibility checks for binary conversion, and decoder comparison tests for bit-exact verification.
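
A schema check can be approximated with the jsonschema package, as below. The schema fragment is a toy stand-in for the normative HJIF schema shipped with ISO/IEC 23090-33, and the real verifiers add semantic checks on top of it.

```python
import json
from jsonschema import validate, ValidationError  # pip install jsonschema

# Toy schema fragment standing in for the normative HJIF schema.
HJIF_SCHEMA = {
    "type": "object",
    "required": ["version", "perceptions"],
    "properties": {
        "version": {"type": "string"},
        "perceptions": {"type": "array", "minItems": 1},
    },
}

def verify_hjif(path: str) -> bool:
    """Return True if the file parses as JSON and passes the schema check."""
    with open(path, encoding="utf-8") as f:
        doc = json.load(f)
    try:
        validate(doc, HJIF_SCHEMA)
        return True
    except ValidationError as err:
        print("HJIF schema violation:", err.message)
        return False
```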

Datasets

Two official datasets support testing: a reference dataset (vibrotactile and kinesthetic signals) for performance evaluation, and a conformance dataset with valid and invalid files to verify error detection capability.

Profiles & levels

Simple Parametric Profile targets phones and controllers (Transient, Curve, and Vectorial bands). Main Profile adds Wavelet bands for simulators and advanced devices that require higher fidelity.
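
A decoder might derive the minimal profile from the band types present in a stream, along the lines of this hypothetical helper (band-type names follow the text above):

```python
PARAMETRIC_BANDS = {"Transient", "Curve", "VectorialWave"}

def minimal_profile(band_types: set[str]) -> str:
    """Return the lowest profile able to decode the given band types,
    following the split described above (names are illustrative)."""
    if band_types <= PARAMETRIC_BANDS:
        return "Simple Parametric"
    return "Main"   # any Wavelet band requires the Main Profile

print(minimal_profile({"Transient", "Curve"}))   # Simple Parametric
print(minimal_profile({"Curve", "Wavelet"}))     # Main
```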

Typical bitrates

2–4 kbps → perceptible distortion | 8–16 kbps → good quality | 32–64 kbps → perceptually lossless.

Transport & Integration

ISO/IEC 23090-32 — Carriage of Haptics Data

Defines storage of MPEG Haptics bitstreams in the ISO Base Media File Format (ISOBMFF) and their delivery via DASH, enabling synchronized playback with audio and video streams in immersive media pipelines.
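
To make the DASH side concrete, the sketch below emits a minimal MPD fragment with a haptics AdaptationSet using Python's standard ElementTree. The mimeType and codecs strings are placeholders; the registered values come from ISO/IEC 23090-32 and the media-type registrations, so verify them against those documents.

```python
import xml.etree.ElementTree as ET

# Minimal MPD fragment with a haptics AdaptationSet. The mimeType and
# codecs values below are illustrative placeholders, not confirmed
# registered strings.
mpd = ET.Element("MPD", xmlns="urn:mpeg:dash:schema:mpd:2011", type="static")
period = ET.SubElement(mpd, "Period")
aset = ET.SubElement(period, "AdaptationSet", mimeType="haptics/hmpg")
ET.SubElement(aset, "Representation", id="hapt-1",
              codecs="mihs", bandwidth="16000")  # 16 kbps haptics representation

print(ET.tostring(mpd, encoding="unicode"))
```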

IETF RFC 9695 — RTP payload format

Introduces haptics as a top-level Internet media type and defines real-time transmission of MIHS units via RTP, enabling interactive tactile communication and telepresence use cases.
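
The fixed RTP header itself comes from RFC 3550; a minimal packetizer might look like the sketch below. The dynamic payload type 96 is a placeholder negotiated via SDP, and the MIHS-specific payload structure defined by RFC 9695 is not reproduced here.

```python
import struct

def rtp_packet(mihs_unit: bytes, seq: int, timestamp: int,
               ssrc: int, payload_type: int = 96, marker: bool = False) -> bytes:
    """Wrap one MIHS unit in a fixed 12-byte RTP header (RFC 3550).
    PT 96 is a dynamic-range placeholder; RFC 9695's payload structure
    is not reproduced in this sketch."""
    v_p_x_cc = 2 << 6                        # version 2, no padding/extension/CSRC
    m_pt = (int(marker) << 7) | (payload_type & 0x7F)
    header = struct.pack("!BBHII", v_p_x_cc, m_pt, seq & 0xFFFF,
                         timestamp & 0xFFFFFFFF, ssrc & 0xFFFFFFFF)
    return header + mihs_unit

pkt = rtp_packet(b"fake-mihs-unit", seq=1, timestamp=48000, ssrc=0x1234ABCD)
```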

Example Devices & Evaluation Setups

Reference devices

Tests used a custom vibrotactile handheld device and the 3D Systems Geomagic Touch (kinesthetic 3 DoF force feedback) as reference platforms for codec validation.

Commercial examples

Actronika Skinetic vest (full-body vibration), WeArt TouchDiver Pro glove (multi-modal thermal + force feedback), and XGlove (high-resolution texture output) demonstrate compatibility with the MPEG format.

Applications & Future Work

Integration domains

XR and Metaverse platforms for multisensory immersion; gaming controllers and vests; teleoperation and robotics; medical training and rehabilitation; accessibility devices for enhanced feedback.

Phase 2 development

MPEG is extending the framework to cover interactive haptics, avatars and object-based touch interactions, forming the next phase of the Haptics standard for advanced XR applications.

FAQ

What is the goal of MPEG-I Haptics?

To enable interoperable transmission and rendering of tactile sensations—vibration, force, temperature—across devices and platforms, complementing visual and auditory media.

Is real-time streaming supported?

Yes. Through the MIHS format and RTP transport, the standard supports real-time interactive scenarios such as remote collaboration and teleoperation.

Which devices can play MPEG Haptics?

Any device implementing the standard interface—gloves, vests, gamepads, or robotics actuators—can decode or adapt tactile data using the metadata-driven mapping layer.

What’s next?

Phase 2 will integrate full interaction models (haptic objects, avatars, 3D feedback loops) within the MPEG-I suite for upcoming XR and metaverse standards.

© ISO/IEC JTC 1/SC 29 — MPEG Coding of 3D Graphics and Haptics (WG 7). This page summarizes Haptics Coding for information purposes. For normative details, consult the official specifications.