ISO/IEC 23090-31 — MPEG-I Haptics Coding

Adding Touch to Immersive Media

MPEG-I Haptics Coding defines the standardized representation, compression and delivery of haptic signals—the sensations of touch, force and motion that enrich immersive experiences. Published in January 2025 as ISO/IEC 23090-31, it unifies the encoding of vibrotactile and kinesthetic data, enabling interoperable, high-fidelity tactile feedback across XR, gaming, teleoperation and accessibility applications.


Architecture & Core Concepts

Dual input model

The standard supports both descriptive (parametric) inputs, i.e. high-level tactile effects described in JSON, and PCM (signal-based) inputs, i.e. time-sampled vibration or force waveforms. Both can be represented in two formats: the human-readable HJIF (Haptic JSON Interchange Format) and the binary packetized MIHS (MPEG-I Haptic Stream). The MIHS binary stream can be stored in an HMPG (Haptic MPEG) file.
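As an illustration, the sketch below builds an HJIF-style JSON description in Python. The field names follow the spirit of the format but are not the normative ISO/IEC 23090-31 schema.

```python
import json

# Illustrative sketch only: field names approximate the HJIF idea of a
# JSON-described experience; they are NOT the normative HJIF schema.
experience = {
    "version": "2023",
    "description": "Heartbeat effect for a handheld device",
    "perceptions": [{
        "modality": "Vibrotactile",
        "channels": [{
            "bodyPartTarget": ["Palm"],
            "bands": [{
                "bandType": "Transient",
                "effects": [{"position": 0, "keyframes": [
                    {"relativePosition": 0,  "amplitudeModulation": 0.8},
                    {"relativePosition": 50, "amplitudeModulation": 0.0},
                ]}],
            }],
        }],
    }],
}

print(json.dumps(experience, indent=2))
```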


Hierarchical data structure

Haptic content is organized in five layers: Experience → Perception → Channel → Band → Effect. Metadata at each layer describes modalities (vibration, force, temperature, ...), device mappings and body locations, allowing playback to adapt to diverse hardware, different user body areas and user preferences. A multi-channel approach makes it possible to address setups with multiple devices and multiple actuators.
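A minimal sketch of this hierarchy as Python dataclasses; the attribute names are illustrative, not the normative field names of the standard.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Effect:
    position_ms: int            # start time of the effect
    keyframes: List[tuple]      # e.g. (time_ms, amplitude) pairs

@dataclass
class Band:
    band_type: str              # e.g. "Transient", "Curve", "Wavelet"
    effects: List[Effect] = field(default_factory=list)

@dataclass
class Channel:
    body_location: str          # e.g. "Palm", "Chest"
    bands: List[Band] = field(default_factory=list)

@dataclass
class Perception:
    modality: str               # e.g. "Vibrotactile", "Force", "Temperature"
    channels: List[Channel] = field(default_factory=list)

@dataclass
class Experience:
    description: str
    perceptions: List[Perception] = field(default_factory=list)
```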


Flexibility and scalability

The architecture supports very low bit-rate (≤ 8 kbps) synthetic effects as well as high-fidelity wavelet-encoded signals (≥ 64 kbps) through one or more scalable layers. Developers can mix parametric and wavelet bands within one stream to optimize quality and latency for each modality, distribution network and device. This makes it possible to address a wide range of applications, from communication on mobile networks to live sport streaming, high-end movies and XR experiences.
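As a rough illustration, the sketch below picks a layer stack from a target bitrate, using the indicative figures quoted on this page; the thresholds and layer names are assumptions, not normative values.

```python
# Hedged sketch: thresholds are drawn from the indicative figures on this
# page (<= 8 kbps parametric, >= 64 kbps wavelet); they are NOT normative.

def select_layers(target_kbps: float) -> list[str]:
    """Return an illustrative stack of scalable layers for one channel."""
    layers = ["parametric-base"]               # synthetic effects, very low bitrate
    if target_kbps >= 16:
        layers.append("wavelet-enhancement")   # adds recorded-signal fidelity
    if target_kbps >= 64:
        layers.append("wavelet-refinement")    # near perceptually lossless
    return layers

for kbps in (8, 16, 64):
    print(kbps, "kbps ->", select_layers(kbps))
```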


Reference software model

A public reference implementation is available on GitHub. It encodes, decodes and renders HJIF/MIHS streams, illustrating both the parametric and wavelet pipelines and serving as a baseline for conformance tests. Inputs are PCM signals (.wav) or parametric descriptions of effects (.hjif, .ahap).
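For context, here is a generic way to prepare a PCM input in Python using only the standard library; this is not the reference software's own API, just an illustration of the .wav input stage.

```python
import wave

# Generic sketch of loading a PCM input (.wav) as normalized samples;
# assumes 16-bit mono PCM for brevity.

def load_pcm(path: str) -> tuple[list[float], int]:
    with wave.open(path, "rb") as wf:
        rate = wf.getframerate()
        raw = wf.readframes(wf.getnframes())
    samples = [int.from_bytes(raw[i:i + 2], "little", signed=True) / 32768.0
               for i in range(0, len(raw), 2)]
    return samples, rate
```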

Encoding Modes & Performance

Parametric coding

Encodes transient or continuous signals as keyframed amplitude and frequency functions, or as vectorial control points with linear or non-linear interpolation. Ideal for low-frequency or synthetic effects on mobile devices.
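A minimal sketch of how such a parametric band could be rendered, assuming a simple (time, value) keyframe layout rather than the normative effect syntax:

```python
import math

def interp(keyframes, t):
    """Linear interpolation of (time_s, value) keyframes at time t."""
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    return keyframes[-1][1]

def render(amp_keys, freq_keys, duration_s, rate=8000):
    """Synthesize a vibrotactile waveform from keyframed amplitude/frequency."""
    out, phase = [], 0.0
    for n in range(int(duration_s * rate)):
        t = n / rate
        phase += 2 * math.pi * interp(freq_keys, t) / rate  # integrate frequency
        out.append(interp(amp_keys, t) * math.sin(phase))
    return out

# Fade out while sweeping from 60 Hz to 120 Hz over half a second.
signal = render(amp_keys=[(0.0, 1.0), (0.5, 0.0)],
                freq_keys=[(0.0, 60.0), (0.5, 120.0)],
                duration_s=0.5)
```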


Wavelet coding

Decomposes PCM signals into frequency bands using quantized wavelet coefficients and entropy coding (SPIHT), providing high-accuracy reconstruction for recorded signals.
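A toy version of this pipeline using the PyWavelets package, with plain uniform quantization standing in for the SPIHT entropy stage; the wavelet choice and quantization step here are assumptions, not normative values.

```python
import numpy as np
import pywt  # pip install PyWavelets

def encode(signal: np.ndarray, step: float = 0.02):
    """Wavelet decomposition plus coarse uniform quantization (SPIHT omitted)."""
    coeffs = pywt.wavedec(signal, "db4", level=4)   # multi-band decomposition
    return [np.round(c / step).astype(int) for c in coeffs]

def decode(q_coeffs, step: float = 0.02) -> np.ndarray:
    coeffs = [c.astype(float) * step for c in q_coeffs]
    return pywt.waverec(coeffs, "db4")

t = np.linspace(0, 1, 8000, endpoint=False)
pcm = 0.5 * np.sin(2 * np.pi * 250 * t)             # synthetic 250 Hz buzz
recon = decode(encode(pcm))
print("max error:", float(np.max(np.abs(recon[:pcm.size] - pcm))))
```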


Hybrid approach

Combines parametric low-frequency bands with wavelet high-frequency bands. This two-layer (or multi-layer) scheme balances complexity and quality for interactive XR content.
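A sketch of the band split behind the hybrid scheme: the low band would go to the parametric coder, the high band to the wavelet coder. The 72 Hz crossover is an assumption for illustration, not the normative split.

```python
import numpy as np

def split_bands(signal: np.ndarray, rate: int, crossover_hz: float = 72.0):
    """FFT-based crossover; returns (low_band, high_band) time-domain signals."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / rate)
    low, high = spectrum.copy(), spectrum.copy()
    low[freqs > crossover_hz] = 0.0    # keep only the parametric-friendly band
    high[freqs <= crossover_hz] = 0.0  # keep only the wavelet band
    return np.fft.irfft(low, signal.size), np.fft.irfft(high, signal.size)

pcm = np.random.randn(8000)
low_band, high_band = split_bands(pcm, rate=8000)
```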

Evaluations

Objective and subjective tests (MUSHRA, 35 participants across 3 labs) show excellent perceptual quality above 8 kbps and perceptually lossless reconstruction at ≥ 64 kbps on the test set used.

Conformance & Reference Framework

Validation tools

ISO/IEC 23090-33 defines schema and semantic verifiers for HJIF, bitstream verifiers for MIHS, compatibility checks for binary conversion, and decoder comparison tests for bit-exact verification.
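In the same spirit, here is a toy HJIF schema check using the jsonschema package; the schema below validates only two fields and is not the official schema shipped with the conformance tools.

```python
import json
from jsonschema import validate, ValidationError  # pip install jsonschema

# Toy stand-in for the HJIF schema verifier; NOT the official schema.
TOY_SCHEMA = {
    "type": "object",
    "required": ["version", "perceptions"],
    "properties": {
        "version": {"type": "string"},
        "perceptions": {"type": "array"},
    },
}

def verify_hjif(path: str) -> bool:
    with open(path) as f:
        doc = json.load(f)
    try:
        validate(instance=doc, schema=TOY_SCHEMA)
        return True
    except ValidationError as err:
        print("invalid:", err.message)
        return False
```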

Datasets

Two official datasets support testing: a reference dataset (vibrotactile and kinesthetic signals) for performance evaluation, and a conformance dataset with valid and invalid files to verify error detection capability.

Profiles & levels

The Simple Parametric Profile targets phones and controllers; it supports only parametric coding, with a limited number of channels, bands and modalities. The Main Profile adds wavelet encoding and an unlimited number of bands, channels and modalities, targeting simulators and advanced devices with higher fidelity.
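A hedged sketch of a profile gate; the limits below are placeholders, not the normative constraint values of the two profiles.

```python
# Placeholder limits for illustration only; the real constraint values are
# defined normatively in the standard.
SIMPLE_PARAMETRIC = {"wavelet": False, "max_channels": 4}
MAIN = {"wavelet": True, "max_channels": None}   # None = unlimited

def fits_profile(stream: dict, profile: dict) -> bool:
    """Check whether a stream's parameters fall within a profile's limits."""
    if stream["uses_wavelet"] and not profile["wavelet"]:
        return False
    limit = profile["max_channels"]
    return limit is None or stream["channels"] <= limit

stream = {"uses_wavelet": True, "channels": 8}
print(fits_profile(stream, SIMPLE_PARAMETRIC), fits_profile(stream, MAIN))
```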

Typical bitrates per channel

2–4 kbps → perceptible distortion | 8–16 kbps → good quality | 32–64 kbps → perceptually lossless.

Transport & Integration

ISO/IEC 23090-32 — Carriage of Haptics Data

Defines storage and delivery of MPEG Haptics bitstreams in the ISO Base Media File Format (ISOBMFF), with support for DASH delivery. Enables synchronized playback with audio and video streams in media and immersive pipelines.

IETF RFC 9695 — RTP payload format

Introduces haptics as a top-level Internet media type and defines real-time transmission of MIHS units via RTP, enabling interactive tactile communication and telepresence use cases.
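For illustration, here is a generic RTP packetizer following RFC 3550 framing; the payload type and timestamp values below are placeholders, since the actual values come from SDP negotiation per RFC 9695.

```python
import struct

def rtp_packet(mihs_unit: bytes, seq: int, ts: int,
               ssrc: int = 0x1234, pt: int = 96, marker: bool = False) -> bytes:
    """Wrap one MIHS unit in a 12-byte RTP header (RFC 3550 layout)."""
    header = struct.pack("!BBHII",
                         2 << 6,                   # version 2, no padding/ext/CSRC
                         (int(marker) << 7) | pt,  # marker bit + payload type
                         seq & 0xFFFF,
                         ts & 0xFFFFFFFF,
                         ssrc)
    return header + mihs_unit

pkt = rtp_packet(b"\x00\x01fake-MIHS-unit", seq=1, ts=0)  # placeholder payload
```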

Example Devices & Evaluation Setups

Reference devices

Tests used a custom vibrotactile handheld device and the 3D Systems Geomagic Touch (kinesthetic 3 DoF force feedback) as reference platforms for codec validation and user tests.

Commercial examples

The Actronika Skinetic vest (full-body vibration), Razer Freyja cushion, Wolverine controllers and Kraken headset, the Sony PS5 controller, the WeArt TouchDiver Pro glove (multi-modal thermal + force feedback) and the XGlove (high-resolution texture output) demonstrate compatibility with the MPEG format, as do all smartphones through adaptation to the iOS and Android APIs.

Applications & Future Work

Integration domains

Mobile communication; live and streamed sports, TV and movie delivery with sensory feedback; XR and metaverse platforms for multisensory immersion; gaming platforms with controllers, seats and vests for physical interaction; automotive; teleoperation and robotics; medical training and rehabilitation; accessibility devices for enhanced feedback.

Phase 2 development

MPEG is extending the framework to cover spatial interactive haptics, avatars and object-based touch interactions, forming the next phase of the Haptics standard for advanced XR applications. Support for haptics in CMAF is also under consideration.


FAQ

What is the goal of MPEG-I Haptics?

To enable interoperable transmission and rendering of tactile sensations—vibration, force, temperature—across devices and platforms, complementing visual and auditory media.

Is real-time streaming supported?

Yes. Through the MIHS format and either DASH or RTP transport, the standard supports streaming and live distribution as well as real-time interactive scenarios such as remote collaboration and teleoperation.

Which devices can play MPEG Haptics?

Any device implementing the standard interface—smartphones, gloves, vests, gamepads, seats, headphones, or robotics actuators—can decode or adapt tactile data using the metadata-driven mapping layer.
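As a sketch, the metadata-driven mapping step could look like the following, with illustrative (non-normative) actuator and channel descriptors.

```python
# Hypothetical device capability table; names are illustrative only.
DEVICE_ACTUATORS = [
    {"location": "Palm",  "modality": "Vibrotactile"},
    {"location": "Chest", "modality": "Vibrotactile"},
]

def map_channel(channel_meta: dict):
    """Return the first actuator matching the channel's metadata, else None."""
    for actuator in DEVICE_ACTUATORS:
        if (actuator["location"] == channel_meta["body_location"]
                and actuator["modality"] == channel_meta["modality"]):
            return actuator
    return None  # a renderer might fall back to a nearby body location instead

print(map_channel({"body_location": "Palm", "modality": "Vibrotactile"}))
```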

What’s next?

Phase 2 will integrate full interaction models (haptic objects & scenes, avatars, 3D feedback loops, propagation models) within the MPEG-I suite for upcoming XR and metaverse standards.

© ISO/IEC JTC 1/SC 29 — MPEG Coding of 3D Graphics and Haptics (WG 7). This page summarizes Haptics Coding for information purposes. For normative details, consult the official specifications.