Yang-Keun Ahn, Kwang-Soon Choi, Young-Choong Park



Spatial Gesture Interaction System



This study proposes a method for controlling information output on a table-type display through hand movements in the air, using a single depth camera and without any physical contact with the device. First, the hardware configuration of the AirTouchTable system, which enables control of a table-type display through spatial interaction, is described. To control the system, a user detection zone, a gesture recognition zone, and an object recognition zone are defined. When a user enters the system control area, the user can operate the system by steering the mouse coordinates with his or her hand and through gesture-based commands. The user can also place an object on a predetermined area to obtain information about it. To evaluate the proposed system, the recognition rates for snap gestures and for the presence of objects were measured.
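The abstract describes depth-based control zones and hand-driven cursor control but includes no code. The sketch below is purely illustrative: it classifies a depth reading into hypothetical zones and smooths the tracked hand position with a simple low-pass filter. The zone boundaries and the smoothing factor `alpha` are assumptions for illustration, not values from the paper.

```python
# Illustrative sketch only: zone thresholds (in mm) and alpha are
# assumed values, not taken from the AirTouchTable paper.

def classify_zone(depth_mm):
    """Map a depth-camera distance (mm) to a control zone."""
    if depth_mm < 600:
        return "object"   # object recognition zone (near the table surface)
    elif depth_mm < 1200:
        return "gesture"  # gesture recognition zone (hand in the air)
    elif depth_mm < 2500:
        return "user"     # user detection zone (user approaching the table)
    return "outside"      # beyond the system control area

def low_pass(prev, new, alpha=0.3):
    """Exponentially smooth the tracked hand coordinate to reduce jitter."""
    return tuple(alpha * n + (1 - alpha) * p for p, n in zip(prev, new))

# Example: smooth a short stream of raw hand positions into cursor positions.
cursor = (0.0, 0.0)
for raw in [(100.0, 40.0), (102.0, 44.0), (180.0, 50.0)]:
    cursor = low_pass(cursor, raw)
```

In a real pipeline, `classify_zone` would gate which handler (user tracking, gesture recognition, or object lookup) processes each frame, and the smoothed coordinate would drive the on-screen cursor.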


Hand Tracking, Hand Detection, Depth Camera, Screen Control System, Gesture Recognition



Cite this paper

Yang-Keun Ahn, Kwang-Soon Choi, Young-Choong Park. (2018) Spatial Gesture Interaction System. International Journal of Circuits and Electronics, 3, 29-34


Copyright © 2018 Author(s) retain the copyright of this article.
This article is published under the terms of the Creative Commons Attribution License 4.0