Zhao, Guangrong, Shen, Yiran, Li, Feng, Liu, Lei, Cui, Lizhen and Wen, Hongkai (2025) Ui-Ear: on-face gesture recognition through on-ear vibration sensing. IEEE Transactions on Mobile Computing, 24 (3). pp. 1482-1495. doi:10.1109/TMC.2024.3480216. ISSN 1536-1233.
PDF: WRAP-Ui-Ear-on-face-gesture-recognition-through-on-ear vibration-sensing-2024.pdf - Accepted Version (7MB)
Abstract
With their convenient design and rich functionality, wireless earbuds are rapidly penetrating daily life and replacing traditional wired earphones. The sensing capabilities of wireless earbuds have attracted great interest from researchers exploring them as a new interface for human-computer interaction. However, due to their extremely compact size, interaction on the body of the earbuds is limited and inconvenient. In this paper, we propose Ui-Ear, a new on-face gesture recognition system that enriches the interaction repertoire of wireless earbuds. Ui-Ear exploits the sensing capability of Inertial Measurement Units (IMUs) to extend interaction to the skin of the face near the ears. The accelerometer and gyroscope in the IMU perceive dynamic vibration signals induced by touching and moving on the face, which affords rich maneuverability. Since IMUs are provided on most budget and high-end wireless earbuds, we believe that Ui-Ear has great potential for pervasive adoption. To demonstrate the feasibility of the system, we define seven different on-face gestures and design an end-to-end learning approach based on Convolutional Neural Networks (CNNs) to classify them. To further improve the generalization capability of the system, an adversarial learning mechanism is incorporated into the offline training process to suppress user-specific features while enhancing gesture-related features. We recruited 20 participants and collected a real-world dataset in a common office environment to evaluate recognition accuracy. Extensive evaluations show that the average recognition accuracy of Ui-Ear exceeds 95% and 82.3% in the user-dependent and user-independent tasks, respectively. Moreover, we show that the pre-trained model (learned from the user-independent task) can be fine-tuned with only a few training samples from the target user to achieve relatively high recognition accuracy (up to 95%).
Finally, we implement the personalization and recognition components of Ui-Ear on an off-the-shelf Android smartphone to evaluate its system overhead. The results demonstrate that Ui-Ear achieves real-time response while incurring only trivial energy consumption on smartphones.
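The adversarial learning mechanism mentioned in the abstract — suppressing user-specific features while keeping gesture-related ones — is commonly realized with a gradient-reversal layer between a shared feature encoder and a user-discriminator head. The sketch below illustrates that general technique only; it is not the authors' implementation, and the network shape, 6-channel IMU input (3-axis accelerometer + 3-axis gyroscope), window length, and layer sizes are assumptions. The 7 gesture classes and 20 users come from the paper.

```python
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; negates (and scales) gradients on backward,
    so the encoder is trained to *confuse* the user discriminator."""

    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_out):
        return -ctx.lam * grad_out, None


class AdversarialGestureNet(nn.Module):
    """Hypothetical CNN: shared encoder, gesture head, adversarial user head."""

    def __init__(self, n_gestures=7, n_users=20):
        super().__init__()
        self.encoder = nn.Sequential(
            # 6 IMU channels (assumed): 3-axis accel + 3-axis gyro
            nn.Conv1d(6, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )
        self.gesture_head = nn.Linear(32, n_gestures)
        self.user_head = nn.Linear(32, n_users)  # adversary

    def forward(self, x, lam=1.0):
        z = self.encoder(x)
        # Gesture loss flows normally; user loss reaches the encoder reversed,
        # pushing z toward user-invariant, gesture-discriminative features.
        return self.gesture_head(z), self.user_head(GradReverse.apply(z, lam))


x = torch.randn(4, 6, 128)  # batch of 4 windows, 128 samples each (assumed)
model = AdversarialGestureNet()
g_logits, u_logits = model(x)
print(g_logits.shape, u_logits.shape)  # torch.Size([4, 7]) torch.Size([4, 20])
```

During training, the total loss would be the gesture cross-entropy plus the user cross-entropy; the reversal layer makes minimizing the latter *maximize* user confusion in the shared features, which is one standard way to obtain the user-independent behavior the abstract reports.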
| Item Type: | Journal Article |
|---|---|
| Subjects: | T Technology > TK Electrical engineering. Electronics. Nuclear engineering |
| Divisions: | Faculty of Science, Engineering and Medicine > Science > Computer Science |
| Library of Congress Subject Headings (LCSH): | Mobile communication systems, Bluetooth technology, Image processing -- Digital techniques, Human-computer interaction, Vibration, Acoustical engineering, Wireless communication systems, Wireless earphones |
| Journal or Publication Title: | IEEE Transactions on Mobile Computing |
| Publisher: | IEEE |
| ISSN: | 1536-1233 |
| Official Date: | March 2025 |
| Dates: | Accepted: 9 October 2024; Available: 14 October 2024; Published: March 2025 |
| Volume: | 24 |
| Number: | 3 |
| Page Range: | pp. 1482-1495 |
| DOI: | 10.1109/TMC.2024.3480216 |
| Status: | Peer Reviewed |
| Publication Status: | Published |
| Re-use Statement: | © 2024 Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. |
| Access rights to Published version: | Restricted or Subscription Access |
| Date of first compliant deposit: | 10 October 2024 |
| Date of first compliant Open Access: | 21 October 2024 |
| RIOXX Funder/Project Grant: | National Natural Science Foundation of China: 2022HWYQ040, ZR2024ZD12, 61972230, 62072278 |
| Persistent URL: | https://wrap.warwick.ac.uk/188087/ |