
TouchAnalyzer: A System for Analyzing User's Touch Behavior on a Smartphone

Journal: International Journal of Computer Science and Mobile Computing - IJCSMC (Vol.7, No. 1)

Publication Date:

Authors :

Page : 25-38

Keywords : touch operations; smartphone operation context recognition; machine learning;

Source : Download | Find it from : Google Scholar

Abstract

Many research efforts have been made on human context recognition, especially activity recognition using sensors such as the accelerometers embedded in smartphones. However, few studies have been conducted on recognizing a user's context while operating a smartphone (smartphone operating context, or so-context for short). Examples of so-contexts include smoking or eating while operating a smartphone, and playing a game while on a train. Estimating so-contexts will enable new services such as notification timing optimization and user interface optimization. In this paper, aiming to provide a tool toward so-context estimation, we propose a system that monitors, recognizes, and outputs a user's touch operations on Android phones. To realize so-context estimation, the system needs to satisfy three requirements: (1) it should work on any Android device, (2) it should run in the background of any application, and (3) it should identify touch operations in a high-level format and output the identified operations with detailed information, such as finger position and movement, for so-context recognition. To meet these requirements, we developed the proposed system as an Android application that analyzes the raw data output by the OS, including a time series of points on the screen, and recognizes 7 representative high-level touch operations such as swipe and rotate, together with information on the number of fingers used, the pressure level, and the track between the start point and the end point. We evaluated our system and confirmed that it achieved recognition accuracies of 100% for swipe (single- or double-finger) and other single-finger touch operations, and 98% for two-finger touch operations (e.g., pinch and rotate). Moreover, to show the applicability of our system, we attempted to recognize the phone holding style as a so-context from touch operations. As a result, we confirmed that the system achieved an F-measure of 96.5% for classification among 8 different holding styles.
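To illustrate the idea described in the abstract, the following is a minimal, hypothetical Java sketch of turning a raw time series of screen points into a high-level touch operation. The class name, `Point` fields, and the movement threshold are all illustrative assumptions, not the authors' implementation; a real system on Android would consume `MotionEvent` data and would also analyze inter-finger distance and angle to separate pinch from rotate.

```java
import java.util.List;

// Hypothetical sketch: classify one completed touch trace into a
// high-level operation (tap / swipe / two-finger), loosely following the
// paper's idea of analyzing a time series of on-screen points.
// All names and thresholds are illustrative assumptions.
public class TouchClassifier {

    // One sampled point of one finger: finger id plus (x, y) in pixels.
    public record Point(int fingerId, float x, float y) {}

    // Assumed cutoff (in pixels) between a stationary tap and a swipe.
    static final float MOVE_THRESHOLD = 30f;

    public static String classify(List<Point> trace) {
        long fingers = trace.stream().map(Point::fingerId).distinct().count();
        if (fingers >= 2) {
            // Pinch vs. rotate would additionally need distance/angle
            // analysis between the two fingers over time.
            return "two-finger";
        }
        Point start = trace.get(0);
        Point end = trace.get(trace.size() - 1);
        double dist = Math.hypot(end.x() - start.x(), end.y() - start.y());
        return dist < MOVE_THRESHOLD ? "tap" : "swipe";
    }
}
```

In this sketch, the finger count is recovered by counting distinct finger ids in the trace, mirroring the paper's output of "number of fingers used"; the start-to-end displacement plays the role of the reported "track between the start point and the end point".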

Last modified: 2018-01-17 15:58:50