Implicit gaze based annotations to support second language learning

Ayano Okoso, Kai Kunze, Koichi Kise

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

13 Citations (Scopus)

Abstract

This paper explores whether implicit gaze-based annotations can support the reading comprehension tasks of second language learners. We show how to use eye tracking to add implicit annotations to the text the user reads, and we start by annotating physical features (reading speed, re-reading, number of fixation areas) to documents using eye tracking. We show initial results of an ongoing experiment. So far, we recorded the eye gaze of 2 students for 2 documents. We gather initial feedback by presenting the annotated documents to two English teachers. Overall, we believe implicit annotations can be a useful feedback mechanism for second language learners.

Original language: English
Title of host publication: UbiComp 2014 - Adjunct Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing
Publisher: Association for Computing Machinery, Inc
Pages: 143-146
Number of pages: 4
ISBN (Electronic): 9781450330473
DOIs
Publication status: Published - 2014
Externally published: Yes
Event: 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing, UbiComp 2014 - Seattle, United States
Duration: 2014 Sept 13 - 2014 Sept 17

Publication series

Name: UbiComp 2014 - Adjunct Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing

Other

Other: 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing, UbiComp 2014
Country/Territory: United States
City: Seattle
Period: 14/9/13 - 14/9/17

Keywords

  • Eye movements
  • Language expertise
  • Mobile eye tracker

ASJC Scopus subject areas

  • Software
