VR Training with Electrotactile Haptics: From "Training in VR" to "Training as if It Were Real" (Part 1)
- wavecompany
- 7 days ago
- 3-minute read
Hello everyone, this is Wave Company. 😊
Today, instead of a product launch or sales announcement, we'd like to share research based on a peer-reviewed paper.
We want to show you what Wave Company has been researching, what we have achieved so far, and how we are moving forward step by step.
Some of it may be a bit technical or unfamiliar, but we hope it gives you a feel for the effort and research that ultimately go into our products. 😉

Why we did this research
The problem
VR already delivers strong immersion through sight and sound. But when the skin feels nothing, it is hard to reach that "this feels real" moment.
Research goal
We wanted to verify with data whether delivering precisely timed tactile sensations at on-screen events can shift a user's perception from "I am training in VR" to "I am training as if it were real."
Method overview
We built an electrotactile haptics device and synchronized it with Meta Quest + Unity content. We compared sessions with and without electrical stimulation, evaluating the results with both subjective and objective measures.

What did we build, and how?
System at a glance
When an event happens in VR (e.g., a collision, or gripping and holding a lever), the event is instantly translated into a tactile signal. Two things are crucial:
Timing alignment The moment something happens on screen should be the moment a sensation reaches the skin.
Logical mapping The character of the event should make sense with the “feel” of the tactile sensation.
Implementing VR training content
We framed the content as “basic tool-use training.”
For short, sharp collisions/impacts, we designed a skin sensation that feels like a quick “tap!”
For gripping and holding a lever, we used gentler or periodic sensations to recreate a “pressing and holding” feel.
For fine assembly/meshing moments, we avoided excessive intensity and used subtler sensations to reduce any mismatch.
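As a minimal sketch of the event-to-sensation design above (the actual system runs inside Unity; every name and parameter value here is our own illustration, not the study's implementation):

```python
# Illustrative mapping from VR training events to tactile patterns.
# Values are hypothetical; the real patterns were tuned experimentally.

EVENT_PATTERNS = {
    # short, sharp collision -> one brief, strong "tap!"
    "collision":  {"waveform": "pulse", "frequency_hz": 200, "intensity": 0.9, "duration_ms": 40},
    # gripping and holding a lever -> gentler, periodic sensation
    "lever_hold": {"waveform": "sine",  "frequency_hz": 60,  "intensity": 0.4, "duration_ms": 500},
    # fine assembly/meshing -> subtle, low-intensity cue
    "assembly":   {"waveform": "sine",  "frequency_hz": 120, "intensity": 0.2, "duration_ms": 80},
}

def pattern_for(event_type: str) -> dict:
    """Return the tactile pattern for a VR event, or a faint, safe default."""
    return EVENT_PATTERNS.get(
        event_type,
        {"waveform": "pulse", "frequency_hz": 100, "intensity": 0.1, "duration_ms": 20},
    )
```

The point of such a table is the logical mapping: a collision should never feel like a hold, and an unknown event should fall back to something barely perceptible rather than something startling.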

Stimulus design: waveform library & tuning
We prepared various waveform combinations and adjusted frequency, intensity, and duration according to scene context. Treating it as a waveform library, we repeatedly measured and tuned which waveforms produced the most convincing “feel” for each situation.
A quick glossary:
Waveform: the “shape” of the sensation delivered to the skin
Frequency: how many times per second the stimulus repeats
Intensity: how strong the skin feels the stimulus
Duration: how long a single stimulus lasts
Throughout the study, we iteratively tuned these four elements so that what happens on screen and what the skin feels don’t drift apart.
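Those four elements can be sketched as a single record type, which makes one "tuning iteration" a simple copy-with-changes (a hypothetical structure for illustration, not our actual codebase):

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Waveform:
    shape: str           # the "shape" of the sensation delivered to the skin
    frequency_hz: float  # how many times per second the stimulus repeats
    intensity: float     # perceived strength, normalized to 0..1
    duration_ms: float   # how long a single stimulus lasts

def tuned(base: Waveform, **overrides) -> Waveform:
    """One tuning iteration: a library entry copied with adjusted parameters."""
    return replace(base, **overrides)

# A library entry and a softer variant tried during tuning:
tap = Waveform(shape="pulse", frequency_hz=200.0, intensity=0.8, duration_ms=40.0)
softer_tap = tuned(tap, intensity=0.5)
```

Keeping entries immutable means every variant tried during tuning is a distinct, comparable record rather than a mutation of the original.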

How did we measure the waveforms? Leveraging ElecSil R&D
Wave Company has long researched ElecSil, our conductive-silicone electrode. The test setups and measurement know-how we accumulated there let us measure and validate a wide range of waveforms reliably and safely.

Synchronization and latency management
Any timing mismatch among vision, audio, and touch immediately breaks immersion. So we checked frame stability from the scene design stage and aligned the start of sound effects with the onset of tactile stimuli. The goal was simple: when something happens on screen, sound and skin sensation should arrive together.
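The scheduling idea can be illustrated like this (the latency figures are assumptions for the sketch, not measured values from the study): start whichever path is slower first, so both sensations land at the same instant.

```python
# Hypothetical latency compensation: offset the trigger times of the audio
# and haptic paths so their outputs arrive at the skin/ear simultaneously.

AUDIO_LATENCY_MS = 20.0   # assumed output latency of the sound path
HAPTIC_LATENCY_MS = 35.0  # assumed end-to-end latency of the tactile path

def trigger_times(event_time_ms: float) -> tuple[float, float]:
    """Return (audio_start, haptic_start) so both arrive at the same moment."""
    arrival = event_time_ms + max(AUDIO_LATENCY_MS, HAPTIC_LATENCY_MS)
    return arrival - AUDIO_LATENCY_MS, arrival - HAPTIC_LATENCY_MS
```

With these assumed numbers, the haptic path is slower, so it is triggered immediately at the event and the sound effect is delayed by the difference.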
Automatic intensity scaling
We linked stimulus intensity to scene variables like speed or acceleration. A harder hit feels stronger; a light touch feels gentler. By letting changes in user motion translate directly into tactile changes, we increased the sense of coupling between action and feedback.
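A minimal sketch of that coupling, assuming a simple linear mapping with a clamp (the gain and baseline are illustrative placeholders):

```python
def scaled_intensity(speed_m_s: float, base: float = 0.2,
                     gain: float = 0.15, max_intensity: float = 1.0) -> float:
    """Map motion speed to stimulus intensity: harder hits feel stronger,
    light touches feel gentler, and intensity never exceeds a safe ceiling."""
    return min(max_intensity, base + gain * max(0.0, speed_m_s))
```

The clamp matters for an electrotactile device: however fast the motion, intensity must stay within the calibrated comfortable range.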
Consistent fit and electrode placement
Stable stimulation requires electrodes to sit in the same place each time. We established reference positions and secured placement so repeated donning/doffing wouldn’t shift them, preserving both data quality and perceived consistency. We also kept cables and hardware from intruding on the experience to protect immersion.
These topics are a bit more technical than usual. But if you take away one thought, let it be this:
“They invest this kind of research and effort so they can deliver better products to users.”
Today we focused on how we connected the device and how we generated tactile sensations: an implementation-first look.
In Part 2, we’ll summarize performance validation and results.
