VR Training with Electrotactile Haptics: Beyond “Training in VR” to “Training Like Reality” (Part 1)
- wavecompany
- Sep 9
- 3 min read
Hello everyone, this is Wave Company. 😊
Today isn’t a product launch or sales notice; it’s a research share grounded in peer-reviewed work.
We’d like to share what Wave Company has been researching, what we’ve achieved so far, and how we keep moving forward step by step.
Some of this may feel a bit technical or unfamiliar, but we hope it gives you a sense of the effort and research that ultimately shape our products. 😉

Why this research?
Problem awareness
VR already delivers strong immersion through sight and sound. But when the skin doesn’t feel anything, it’s hard to reach that “this feels real” moment.
Research goal
We set out to verify using data whether providing a tactile sensation precisely when an on-screen event occurs can shift users’ perception from “I’m training in VR” to “I’m training in a way that feels like the real world.”
Method overview
We built an electrotactile haptics device and synchronized it with Meta Quest + Unity content. We compared sessions with and without electrical stimulation, and evaluated outcomes using both subjective and objective measures.

What did we build, and how?
System at a glance
When an event happens in VR (e.g., a collision, or gripping and holding a lever), the event is instantly translated into a tactile signal. Two things are crucial:
Timing alignment: the moment something happens on screen should be the moment a sensation reaches the skin.
Logical mapping: the character of the event should match the “feel” of the tactile sensation.
Implementing VR training content
We framed the content as “basic tool-use training.”
For short, sharp collisions/impacts, we designed a skin sensation that feels like a quick “tap!”
For gripping and holding a lever, we used gentler or periodic sensations to recreate a “pressing and holding” feel.
For fine assembly/meshing moments, we avoided excessive intensity and used subtler sensations to reduce any mismatch.
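The three cases above boil down to a mapping from event type to stimulus character. Here is a minimal sketch of such a mapping in Python; the class, event names, and all parameter values are illustrative placeholders, not the values or API actually used in the study.

```python
from dataclasses import dataclass

@dataclass
class Stimulus:
    """One electrotactile stimulus (hypothetical parameter model)."""
    frequency_hz: float   # how many times per second the stimulus repeats
    intensity: float      # normalized strength, 0.0 to 1.0
    duration_ms: float    # how long a single stimulus lasts

# Illustrative event-to-sensation mapping, mirroring the post:
# sharp collision -> short, strong "tap"; holding a lever -> gentle,
# periodic stimulation; fine assembly -> subtle, low-intensity cues.
EVENT_MAP = {
    "collision":     Stimulus(frequency_hz=200, intensity=0.80, duration_ms=40),
    "lever_hold":    Stimulus(frequency_hz=30,  intensity=0.30, duration_ms=500),
    "fine_assembly": Stimulus(frequency_hz=80,  intensity=0.15, duration_ms=60),
}

def stimulus_for(event: str) -> Stimulus:
    """Look up the tactile pattern for a VR event type."""
    return EVENT_MAP[event]
```

The point of a table like this is that designers can reason about the “feel” of each event in one place, rather than scattering magic numbers through scene scripts.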

Stimulus design: waveform library & tuning
We prepared various waveform combinations and adjusted frequency, intensity, and duration according to scene context. Treating it as a waveform library, we repeatedly measured and tuned which waveforms produced the most convincing “feel” for each situation.
A quick glossary:
Waveform: the “shape” of the sensation delivered to the skin
Frequency: how many times per second the stimulus repeats
Intensity: how strong the skin feels the stimulus
Duration: how long a single stimulus lasts
Throughout the study, we iteratively tuned these four elements so that what happens on screen and what the skin feels don’t drift apart.
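To make the four elements concrete, here is a toy renderer that turns frequency, intensity, and duration into a burst of amplitude samples. It is a simple sine-burst sketch for intuition only; real electrotactile drive waveforms and the actual waveform library are more involved, and the sample rate here is an arbitrary choice.

```python
import math

def render_waveform(frequency_hz: float, intensity: float,
                    duration_ms: float, sample_rate: int = 1000) -> list[float]:
    """Render one stimulus as a list of amplitude samples.

    A sine burst: `frequency_hz` sets the repetition rate, `intensity`
    scales the amplitude, and `duration_ms` sets how many samples exist.
    """
    n_samples = int(sample_rate * duration_ms / 1000)
    return [
        intensity * math.sin(2 * math.pi * frequency_hz * i / sample_rate)
        for i in range(n_samples)
    ]
```

Tuning then becomes a loop: render a candidate, deliver it, record how convincing it felt for the scene, and adjust one element at a time.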

How did we measure the waveforms? Leveraging ElecSil R&D
Wave Company has long researched ElecSil, our conductive-silicone electrode. The test setups and measurement know-how we accumulated there let us measure and validate a wide range of waveforms reliably and safely.

Synchronization and latency management
Any timing mismatch among vision, audio, and touch immediately breaks immersion. So we checked frame stability from the scene design stage and aligned the start of sound effects with the onset of tactile stimuli. The goal was simple: when something happens on screen, sound and skin sensation should arrive together.
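One common way to keep onsets together is to measure each output path’s latency and delay the faster one by the difference. The sketch below illustrates that idea; the latency numbers and the sleep-based scheduling are simplifications for clarity, not the actual pipeline.

```python
import time

def fire_event(play_sound, fire_haptic,
               audio_latency_ms: float = 25.0,
               haptic_latency_ms: float = 10.0) -> None:
    """Trigger audio and haptics so both *arrive* at the user together.

    If the haptic path is faster than the audio path, delay the haptic
    trigger by the difference so the skin sensation and the sound effect
    start at the same perceived moment. Latency values are placeholders.
    """
    delay_ms = max(0.0, audio_latency_ms - haptic_latency_ms)
    play_sound()                 # audio pipeline starts first (slower path)
    time.sleep(delay_ms / 1000)  # hold back the faster haptic path
    fire_haptic()
```

In a real engine this would be event-queue scheduling rather than a blocking sleep, but the compensation logic is the same.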
Automatic intensity scaling
We linked stimulus intensity to scene variables like speed or acceleration. A harder hit feels stronger; a light touch feels gentler. By letting changes in user motion translate directly into tactile changes, we increased the sense of coupling between action and feedback.
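Linking intensity to motion can be as simple as a clamped linear map from contact speed to stimulus strength. This sketch shows the shape of such a mapping; the speed thresholds and intensity range are illustrative, not measured values from the study.

```python
def scale_intensity(speed_mps: float,
                    v_min: float = 0.1, v_max: float = 3.0,
                    i_min: float = 0.1, i_max: float = 1.0) -> float:
    """Map contact speed (m/s) to stimulus intensity (linear, clamped).

    Speeds at or below v_min produce the gentlest stimulus; speeds at or
    above v_max produce the strongest. All thresholds are placeholders.
    """
    s = max(v_min, min(v_max, speed_mps))
    return i_min + (i_max - i_min) * (s - v_min) / (v_max - v_min)
```

With this in place, a harder hit automatically feels stronger and a light touch feels gentler, without any per-scene hand tuning.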
Consistent fit and electrode placement
Stable stimulation requires electrodes to sit in the same place each time. We established reference positions and secured placement so repeated donning/doffing wouldn’t shift them, preserving both data quality and perceived consistency. We also kept cables and hardware from intruding on the experience to protect immersion.
These topics are a bit more technical than usual. But if you take away one thought, let it be this:
“They invest this kind of research and effort so they can deliver better products to users.”
Today we focused on how we connected the device and how we generated tactile sensations: an implementation-first look.
In Part 2, we’ll summarize performance validation and results.

