JumpMod - Haptic Backpack that Modifies Users’ Perceived Jump
June 2022 - September 2022
Published: CHI '23; Also showcased at SIGGRAPH '23 and demoed at Argonne National Lab
Authors: Romain Nith, Jacob Serfaty, Sam Shatzkin, Alan Shen, Pedro Lopes
Affiliation: UChicago Human-Computer Integration Lab
Video Showcase
Abstract
Vertical force-feedback is extremely rare in mainstream interactive experiences. This happens because existing haptic devices capable of sufficiently strong forces that would modify a user's jump require grounding (e.g., motion platforms or pulleys) or cumbersome actuators (e.g., large propellers attached or held by the user). To enable interactive experiences to feature jump-based haptics without sacrificing wearability, we propose JumpMod, an untethered backpack that modifies one's sense of jumping. JumpMod achieves this by moving a weight up/down along the user's back, which modifies perceived jump momentum—creating accelerated & decelerated jump sensations. In our second study, we empirically found that our device can render five effects: jump higher, land harder/softer, pulled higher/lower. Based on these, we designed four jumping experiences for VR & sports. Finally, in our third study, we found that participants preferred wearing our device in an interactive context, such as one of our jump-based VR applications.
What did I contribute?
Wrote and refactored all of the backpack's hardware-control code
Created an OSC <-> BLE communication protocol in Python so the JumpMod backpack can communicate with Unity demos
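The bridge's core job is translating Unity's OSC messages into compact BLE writes. As a minimal sketch of that translation layer, the framing below (command byte, argument count, float32 arguments) and the address-to-command mapping are illustrative assumptions, not the actual firmware protocol:

```python
import struct

# Hypothetical mapping from OSC addresses (sent by Unity) to 1-byte
# command IDs understood by the backpack's BLE firmware.
OSC_TO_CMD = {
    "/jumpmod/move": 0x01,   # move the weight to a target position
    "/jumpmod/speed": 0x02,  # set the motor speed
    "/jumpmod/stop": 0x03,   # halt the motor
}

def osc_to_ble(address: str, *args: float) -> bytes:
    """Encode an OSC-style message as a compact BLE write payload.

    Illustrative layout: 1 command byte, 1 argument-count byte, then
    each argument as a little-endian float32.
    """
    payload = struct.pack("<BB", OSC_TO_CMD[address], len(args))
    for a in args:
        payload += struct.pack("<f", float(a))
    return payload
```

In a real bridge, a payload like `osc_to_ble("/jumpmod/move", 0.8)` would then be written to a GATT characteristic by the BLE side of the process.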
Refactored the old VR demo application to fix critical bugs and add support for the new communication protocol
Programmed two jump prediction algorithms, one based on IMU acceleration data and the other based on VR HMD position data
IMU algorithm: Uses semi-implicit Euler integration to estimate vertical position, periodically re-zeroing the estimate to counteract accumulated drift. Usable everywhere, but not always accurate.
VR HMD position algorithm: Classifies the user's motion into predefined jump phases directly from headset position data. Usable only with VR headsets, but near-perfect accuracy.
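One way to picture the HMD-based classifier is a small state machine over headset height and vertical velocity. The phase names and threshold values below are illustrative, not the project's tuned parameters:

```python
# Illustrative thresholds; real tuning values are not given here.
TAKEOFF_VEL = 0.5  # m/s of upward HMD velocity that counts as a takeoff
LAND_EPS = 0.03    # m above standing height that still counts as landed

def classify_phase(height, velocity, baseline, phase):
    """Advance a GROUNDED -> ASCENDING -> DESCENDING -> GROUNDED state
    machine from one HMD sample. `baseline` is the standing HMD height."""
    if phase == "GROUNDED" and velocity > TAKEOFF_VEL:
        return "ASCENDING"
    if phase == "ASCENDING" and velocity <= 0.0:
        return "DESCENDING"   # apex reached: upward velocity is gone
    if phase == "DESCENDING" and height <= baseline + LAND_EPS:
        return "GROUNDED"     # back near standing height: landed
    return phase
```

Because the headset's tracked position is essentially ground truth, transitions like these can fire haptic effects at takeoff, apex, and landing with much higher reliability than the integrated IMU estimate.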
Wrote a wrapper API for the position tracking algorithms that can automatically queue and trigger haptic effects
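The queue-and-trigger idea can be sketched as a small wrapper that holds pending effects and fires the next one when the jump tracker reports a matching phase event. Effect names, phase names, and the `send_fn` callback are hypothetical stand-ins for the actual API:

```python
from collections import deque

class HapticQueue:
    """Queue haptic effects and trigger each one automatically when the
    jump tracker reports its associated phase event (illustrative)."""

    def __init__(self, send_fn):
        self._send = send_fn     # e.g. a function that writes a BLE command
        self._pending = deque()  # FIFO of (trigger_phase, effect_name)

    def queue(self, trigger_phase, effect_name):
        self._pending.append((trigger_phase, effect_name))

    def on_phase(self, phase):
        # Fire the next queued effect if its trigger matches this event.
        if self._pending and self._pending[0][0] == phase:
            _, effect = self._pending.popleft()
            self._send(effect)
```

A demo can then queue, say, a "jump higher" effect for takeoff and a "land harder" effect for landing, and the wrapper dispatches them as the tracked jump progresses.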
Designed a VR demo game in the Unity game engine, JumpMod Escape Room, that showcases the backpack's haptic effects through platforming puzzles that require substantial real-world jumps
Conducted a 12-person user study to evaluate the impact and enjoyment of the haptic effects in the JumpMod Escape Room game