The project explored the design and evaluation of midair gestural techniques for translation tasks in large-display interaction. Translation—moving objects to a predefined target while preserving size and rotation—is a core interaction for large public displays, smart spaces, and augmented reality.
The study aimed to identify efficient, accurate, and user-preferred gestural techniques by systematically comparing four gestures: fist, palm, pinch, and sideways.
The aim was to evaluate the performance, usability, and user preferences of different midair gestural techniques for object translation at short and long distances, and to inform the design of intuitive, low-fatigue gesture-based interfaces for large-display and touchless environments.
Interaction designer, developer, UX researcher
Development: C++, Intel RealSense SDK
Data analysis: SPSS
UX researcher/developer
Apr–Aug 2016 (5 months)
Reviewed 64 prior studies to identify existing translation gestures and their usability challenges.
Identified user needs: accuracy, physical comfort, low fatigue, and natural movement patterns.
Observed that although swipe gestures are popular, they often cause fatigue over long-distance translations, so swipe was excluded from the comparison.
Selected four promising gesture techniques for detailed testing: fist, palm, pinch, sideways.
Designed gestures around two mechanics: clutch-and-release (fist, palm, pinch) and autoscrolling (sideways); both mechanics are sketched in code further below.
Piloted designs with five participants to refine gesture parameters and ensure reliable detection.
Developed a custom experimental software prototype that let users move 2D objects across a large display using the midair gestures.
Integrated visual and audio feedback to guide users through clutching, translating, and releasing actions.
Implemented precise tracking of hand positions and gesture states using the Intel® RealSense™ SDK.
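For illustration, here is a minimal C++ tracking loop in the style of the 2016 Intel RealSense SDK (RSSDK) hand-tracking samples. It is a sketch, not the prototype's actual code: the moveObject/onClutch callbacks are hypothetical, and exact API names should be checked against the SDK version used.

    #include "pxcsensemanager.h"
    #include "pxchandmodule.h"
    #include "pxchandconfiguration.h"

    int main() {
        // Create the RSSDK pipeline and enable the hand-tracking module.
        PXCSenseManager *sm = PXCSenseManager::CreateInstance();
        sm->EnableHand();

        PXCHandModule *handModule = sm->QueryHand();
        PXCHandData *handData = handModule->CreateOutput();

        // Enable the built-in recognizers for the clutch gestures.
        PXCHandConfiguration *cfg = handModule->CreateActiveConfiguration();
        cfg->EnableGesture(L"fist");
        cfg->EnableGesture(L"full_pinch");
        cfg->ApplyChanges();
        cfg->Release();

        sm->Init();
        while (sm->AcquireFrame(true) >= PXC_STATUS_NO_ERROR) {
            handData->Update();

            // Track the hand's mass center to drive the on-screen object.
            PXCHandData::IHand *hand = 0;
            if (handData->QueryHandData(PXCHandData::ACCESS_ORDER_BY_TIME, 0, hand)
                    == PXC_STATUS_NO_ERROR) {
                PXCPoint3DF32 p = hand->QueryMassCenterWorld();
                // moveObject(p.x, p.y);   // hypothetical application callback
            }

            // Poll for fired gestures to switch between clutch/release states.
            PXCHandData::GestureData g;
            if (handData->IsGestureFired(L"fist", g)) {
                // onClutch();             // hypothetical application callback
            }

            sm->ReleaseFrame();
        }
        sm->Release();
        return 0;
    }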
Fist, Palm, Pinch: Clutch to grab → Move to target → Release.
Sideways: Move hand to the screen edge to initiate translation → Hold position → Remove hand to stop.
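To make the two mechanics concrete, here is a simplified, library-agnostic sketch of the interaction logic. State names, the edge-zone width, and the scroll speed are illustrative assumptions, not values from the study.

    enum class Mode { Idle, Clutched };          // clutch-and-release states

    struct State {
        Mode  mode = Mode::Idle;
        float objX = 0.5f, objY = 0.5f;          // object position, normalized 0..1
        float grabDX = 0.0f, grabDY = 0.0f;      // hand-to-object offset at clutch time
    };

    // Clutch-and-release (fist, palm, pinch): the object follows the hand
    // only while the clutch gesture is held; releasing drops it in place.
    void updateClutch(State &s, float handX, float handY, bool clutchHeld) {
        if (s.mode == Mode::Idle && clutchHeld) {
            s.mode = Mode::Clutched;                      // grab: remember offset
            s.grabDX = s.objX - handX;
            s.grabDY = s.objY - handY;
        } else if (s.mode == Mode::Clutched && clutchHeld) {
            s.objX = handX + s.grabDX;                    // translate with the hand
            s.objY = handY + s.grabDY;
        } else if (s.mode == Mode::Clutched && !clutchHeld) {
            s.mode = Mode::Idle;                          // release
        }
    }

    // Sideways (autoscrolling): holding the hand near a screen edge moves the
    // object toward that edge; removing the hand stops the translation.
    void updateSideways(State &s, float handX, bool handPresent, float dt) {
        const float edgeZone = 0.1f;   // illustrative edge-zone width
        const float speed    = 0.5f;   // illustrative speed, screen-widths per second
        if (!handPresent) return;      // hand removed: motion stops
        if      (handX < edgeZone)        s.objX -= speed * dt;
        else if (handX > 1.0f - edgeZone) s.objX += speed * dt;
    }

Storing the hand-to-object offset at clutch time keeps the object from snapping to the hand position the moment it is grabbed.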
Participants:
30 adults (19–45 years old), most with little or no prior experience with gesture interfaces.
Methods:
Participants translated objects over short and long distances using each gesture. Collected data on:
Movement time;
Error rate;
Target re-entries;
Subjective ratings (usability, comfort, satisfaction).
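As an illustration of how the objective measures can be derived from logged trials, here is a short sketch; the log format and circular-target assumption are mine, not the study's actual analysis code (statistics were run in SPSS).

    #include <vector>

    struct Sample { double t; float x, y; };   // timestamped object position
    struct Target { float x, y, r; };          // circular target region (assumed)

    bool inTarget(const Sample &s, const Target &tg) {
        float dx = s.x - tg.x, dy = s.y - tg.y;
        return dx * dx + dy * dy <= tg.r * tg.r;
    }

    // Movement time: clutch-to-release duration of one trial.
    double movementTime(const std::vector<Sample> &trial) {
        return trial.empty() ? 0.0 : trial.back().t - trial.front().t;
    }

    // Target re-entries: times the object enters the target after leaving it.
    int targetReentries(const std::vector<Sample> &trial, const Target &tg) {
        int entries = 0;
        bool inside = false;
        for (const Sample &s : trial) {
            bool nowInside = inTarget(s, tg);
            if (nowInside && !inside) ++entries;
            inside = nowInside;
        }
        return entries > 0 ? entries - 1 : 0;  // the first entry is not a re-entry
    }

    // Error rate: fraction of trials released outside the target.
    double errorRate(const std::vector<std::vector<Sample>> &trials,
                     const Target &tg) {
        if (trials.empty()) return 0.0;
        int errors = 0;
        for (const auto &tr : trials)
            if (tr.empty() || !inTarget(tr.back(), tg)) ++errors;
        return static_cast<double>(errors) / trials.size();
    }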
Quantitative Results:
Palm gesture achieved the fastest and most accurate results overall.
Fist gesture performed well, especially for short distances.
Pinch had the slowest times and highest error rates, with users reporting recognition difficulties and fatigue.
Sideways was efficient at long distances but received lower subjective ratings.
Qualitative Insights:
Palm and fist were rated as pleasant, accurate, and easy to learn.
Users appreciated the natural feel of the fist gesture.
Pinch gesture caused fatigue and recognition issues.
Sideways required more body movement than preferred.
Participants highlighted the need for gestures that minimize unintended activations (the classic "Midas touch" problem, observed with the palm gesture).
Palm and fist gestures recommended for future midair translation interfaces.
The study provided actionable guidelines for designing usable and efficient midair gestural interactions in public display and immersive environments.
Insights contribute to improving gesture-based UX for diverse applications, from gaming to smart spaces.
Remizova, V., Gizatdinova, Y., & Surakka, V. (2022). Midair Gestural Techniques for Translation Tasks in Large-Display Interaction. Advances in Human-Computer Interaction, vol. 2022, Article 9362916, 13 pages. https://doi.org/10.1155/2022/9362916