E²GO : Free Your Hands for Smartphone Interaction

Shaoming Yan 1,  Yuanliang Ju1, Rong Quan 1,   Huawei Tu 2,  Dong Liang 1  

1 College of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics, Nanjing, China
2 Department of Computer Science and Information Technology, La Trobe University, Melbourne, Australia

Visualization Results

Demonstration of E²GO's Eight Actions

    The actions from left to right are: (a) Gazing at the “Bottom” area to swipe the page up, (b) Gazing at the “Top” area to swipe the page down, (c) Staring at the “Left” area to switch to the left page, (d) Staring at the “Right” area to switch to the right page, (e) Swiftly shifting gaze from “Bottom” to “Top” to switch to the next short video, (f) Swiftly shifting gaze from “Top” to “Bottom” to switch to the previous short video, (g) Closing the right eye while keeping the left eye open to click the target, (h) Closing the left eye while keeping the right eye open to return.
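The eight actions above form four pairs, each mapped to a smartphone command. A minimal sketch of such a dispatch table is shown below; all key and command names are illustrative placeholders, not identifiers from the E²GO implementation.

```python
# Hypothetical mapping from a detected eye action to a smartphone command,
# mirroring the eight E²GO actions listed in the caption above.
ACTION_MAP = {
    "gaze_bottom":         "swipe_page_up",        # (a)
    "gaze_top":            "swipe_page_down",      # (b)
    "stare_left":          "switch_left_page",     # (c)
    "stare_right":         "switch_right_page",    # (d)
    "shift_bottom_to_top": "next_short_video",     # (e)
    "shift_top_to_bottom": "previous_short_video", # (f)
    "wink_right_eye":      "click_target",         # (g)
    "wink_left_eye":       "go_back",              # (h)
}

def dispatch(detected_action: str) -> str:
    """Return the command for a detected action, or a no-op if unrecognized."""
    return ACTION_MAP.get(detected_action, "no_op")
```

Keeping the mapping in a single table makes the action set easy to audit and extend.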


Demonstration of Event Data.

    With event camera technology, dynamic changes in the user's eye area can be captured.


Demonstration of User Testing.

    The first video shows the subjects demonstrating all of the actions, and the second shows excerpts from the testing process.


Abstract


Current eye-gaze interaction technologies for smartphones are considered inflexible, inaccurate, and power-hungry. These methods typically rely on hand involvement and accomplish partial interactions. In this paper, we propose a novel eye-gaze-based smartphone interaction method named Event-driven Eye-Gaze Operation (E²GO), which can realize comprehensive interaction using only eyes and gazes to cover various interaction types. Before the interaction, an anti-jitter gaze estimation method was exploited to stabilize human eye fixation and predict accurate and stable human gaze locations on smartphone screens to further explore refined time-dependent eye-gaze interactions. We also integrated an event-triggering mechanism in E²GO to significantly decrease its power consumption to deploy on smartphones. We have implemented the prototype of E²GO on different brands of smartphones and conducted a comprehensive user study to validate its efficacy, demonstrating E²GO's superior smartphone control capabilities across various scenarios.

Method

    Our Proposed Anti-Jitter Strategy (AJS-based) Gaze Estimation. Case 1: the predicted gaze position is snapped back to the previous frame's position; Case 2: the predicted gaze position is kept unchanged.
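The two cases above can be sketched as a simple distance test: if the new prediction stays within a small radius of the previous fixation, it is treated as jitter and snapped back (Case 1); otherwise it is accepted as a genuine gaze shift (Case 2). This is a minimal sketch under assumed parameters; the jitter radius value and function names are hypothetical, not taken from the E²GO implementation.

```python
import math

def ajs_update(prev, pred, jitter_radius=50.0):
    """Anti-Jitter Strategy sketch (hypothetical jitter_radius, in pixels).

    prev, pred: (x, y) gaze positions on the screen for the previous
    and current frames.
    """
    distance = math.hypot(pred[0] - prev[0], pred[1] - prev[1])
    if distance < jitter_radius:
        return prev  # Case 1: small deviation -> treat as jitter, keep prev
    return pred      # Case 2: large deviation -> genuine gaze movement
```

Suppressing small frame-to-frame deviations is what makes dwell-based actions (e.g. staring at a screen edge) stable enough to trigger reliably.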

    Our Proposed Motion Event Detector (MED). T2: when the event percent exceeds the threshold, Gaze Estimation is invoked; T1: when the event percent is below the threshold, the input image is recaptured.
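The gating logic above can be sketched as follows: compute the fraction of pixels that fired an event in the current frame, and only invoke the (expensive) gaze estimator when that fraction crosses a threshold. The threshold value and function name below are illustrative assumptions, not parameters from the paper.

```python
def med_gate(event_frame, threshold=0.05):
    """Motion Event Detector sketch (hypothetical threshold).

    event_frame: 2D grid where non-zero entries mark pixels that fired
    an event. Returns True when gaze estimation should run (branch T2),
    False when the frame should simply be recaptured (branch T1).
    """
    total = len(event_frame) * len(event_frame[0])
    fired = sum(1 for row in event_frame for px in row if px != 0)
    event_percent = fired / total
    return event_percent >= threshold
```

Skipping gaze estimation on motionless frames is what lets the event-triggering mechanism cut power consumption on-device.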

    The first and second rows show the actions and user tests corresponding to E²GO: (a) Gazing at the “Bottom” area to swipe the page up, (b) Gazing at the “Top” area to swipe the page down, (c) Staring at the “Left” area to switch to the left page, (d) Staring at the “Right” area to switch to the right page, (e) Swiftly shifting gaze from “Bottom” to “Top” to switch to the next short video, (f) Swiftly shifting gaze from “Top” to “Bottom” to switch to the previous short video, (g) Closing the right eye while keeping the left eye open to click the target, (h) Closing the left eye while keeping the right eye open to return. The eye blink icons in (g) and (h) indicate that executing these two actions requires eye blinks. (i)-(p) show the test results of actions (a)-(h), respectively.

Contributions

  1. We propose an eye-gaze smartphone interaction method, E²GO, that enables users to perform complex interaction tasks with four pairs of interactive actions.

  2. We introduced an Anti-Jitter Strategy (AJS) for E²GO to ensure accurate and stable gaze positions on a screen.

  3. We introduced a Motion Event Detector (MED) for E²GO to decrease energy consumption significantly.

  4. The user study results demonstrated that our proposed E²GO can realize accurate eye-gaze-based smartphone control in various situations.

Results

Success Rate

    The test results of E²GO in different postures. The bar chart represents the average success rate of the users, and the black error bars represent the standard deviation across user tests.


Task Timing

    Time performance of E²GO across different postures. The bar chart represents the average task times, and the black error bars represent the standard deviation across tests.


MED Studies

    Experiment on MED. A comparison of battery life (usage time) for E²GO with and without MED enabled.