Used together with:
eye (32)
base (18)
use (15)
interact (11)
interfac (11)

Stem gaze$ (all stems)
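The `$` in the stem pattern appears to mark end-of-word anchoring: a title word matches when it ends in the stem, so "gaze" and "eye-gaze" qualify while "gazebo" does not. A minimal sketch of such matching, assuming a naive lowercase tokenizer (an illustration only, not BibSLEIGH's actual indexer):

```python
import re

def matches_stem(title: str, stem: str) -> bool:
    """Return True if any word in the title ends with the given stem."""
    words = re.findall(r"[a-z]+", title.lower())
    # End-anchored ("gaze$"): the word must END with the stem,
    # so "eye-gaze" matches but "gazebo" does not.
    return any(w.endswith(stem) for w in words)

print(matches_stem("Fast gaze typing with an adjustable dwell time", "gaze"))  # True
print(matches_stem("A gazebo in the park", "gaze"))  # False
```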

94 papers:

CHI-2015-AndristMT #motivation
Look Like Me: Matching Robot Personality via Gaze to Increase Motivation (SA, BM, AT), pp. 3603–3612.
CHI-2015-ChengSSYD #comprehension
Gaze-Based Annotations for Reading Comprehension (SC, ZS, LS, KY, AKD), pp. 1569–1572.
CHI-2015-KawaguchiKS #case study #image
Study on Gaze Direction Perception of Face Image Displayed on Rotatable Flat Display (IK, HK, YS), pp. 1729–1737.
CHI-2015-TurnerABG #multi
Gaze+RST: Integrating Gaze and Multitouch for Remote Rotate-Scale-Translate Tasks (JT, JA, AB, HG), pp. 4179–4188.
CHI-2015-VidalBBG #interactive #social
The Royal Corgi: Exploring Social Gaze Interaction for Immersive Gameplay (MV, RB, AB, HG), pp. 115–124.
CSCW-2015-ChengSMFHD #eye tracking #online #social
Social Eye Tracking: Gaze Recall with Online Crowds (SC, ZS, XM, JLF, SEH, AKD), pp. 454–463.
Caregiver’s Eye Gaze and Field of View Presumption Method During Bathing Care in Elderly Facility (AY, TA, HCBJ, NK, AG, TO), pp. 524–532.
DUXU-UI-2015-ChngC #3d #case study #interactive #multi
User Study on 3D Multitouch Interaction (3DMi) and Gaze on Surface Computing (EC, NC), pp. 425–433.
HCI-IT-2015-FukumotoTE #detection #using
Improvement of Accuracy in Remote Gaze Detection for User Wearing Eyeglasses Using Relative Position Between Centers of Pupil and Corneal Sphere (KF, TT, YE), pp. 13–23.
HCI-IT-2015-MeleMR #communication #predict #type system #user interface
Beyond Direct Gaze Typing: A Predictive Graphic User Interface for Writing and Communicating by Gaze (MLM, DM, CER), pp. 66–77.
HCI-IT-2015-MurataMF #modelling #performance
Effects of Target Shape and Display Location on Pointing Performance by Eye-Gaze Input System — Modeling of Pointing Time by Extended Fitts’ Law (AM, MM, DF), pp. 94–106.
HCI-IT-2015-MurataMT #comparison
Optimal Scroll Method for Eye-Gaze Input System — Comparison of R-E and R-S Compatibility (AM, MM, YT), pp. 86–93.
HIMI-IKC-2015-OgusuOKY #generative #music
Enhancing Abstract Imaginations of Viewers of Abstract Paintings by a Gaze Based Music Generation System (TO, JO, JK, SY), pp. 457–464.
ICPC-2014-WaltersSSK #developer #traceability
Capturing software traceability links from developers’ eye gazes (BW, TS, BS, HHK), pp. 201–204.
CHI-2014-KangasARIMR #feedback #gesture #mobile
Gaze gestures and haptic feedback in mobile devices (JK, DA, JR, PI, PM, RR), pp. 435–438.
Depth perception with gaze-contingent depth of field (MM, SC, MAN, DV), pp. 217–226.
CHI-2014-PanS #multi
A gaze-preserving situated multiview telepresence system (YP, AS), pp. 2173–2176.
CHI-2014-SchwarzMLHM #gesture #interface
Combining body pose, gaze, and gesture to determine intention to interact in vision-based interfaces (JS, CCM, TL, SEH, JM), pp. 3443–3452.
Smart photo selection: interpret gaze as personal interest (TCW, AS, SS), pp. 2065–2074.
Caregiver’s Gaze and Field of View Presumption Method During Bath Care in the Elderly Facility (AY, NK, AG, TO, TA, SY, HBJ), pp. 78–87.
HCI-AIMT-2014-MohammedSS #predict
Gaze Location Prediction with Depth Features as Auxiliary Information (RAAM, LS, OGS), pp. 281–292.
Appearance-Based Gaze Tracking with Free Head Movement (CCL, YTC, KWC, SCC, SWS, YPH), pp. 1869–1873.
ICPR-2014-LaiSTH #3d #using
3-D Gaze Tracking Using Pupil Contour Features (CCL, SWS, HRT, YPH), pp. 1162–1166.
ICPR-2014-SchneiderSS #estimation #independence
Manifold Alignment for Person Independent Appearance-Based Gaze Estimation (TS, BS, RS), pp. 1167–1172.
ICPR-2014-XiongHL #3d #estimation
Gaze Estimation Based on 3D Face Structure and Pupil Centers (CX, LH, CL), pp. 1156–1161.
Still looking: investigating seamless gaze-supported selection, positioning, and manipulation of distant targets (SS, RD), pp. 285–294.
CHI-2013-ZhangBG #interactive #interface #named
SideWays: a gaze interface for spontaneous interaction with situated displays (YZ, AB, HG), pp. 851–860.
CSCW-2013-HarperM #artificial reality #social
The mocking gaze: the social organization of kinect use (RHRH, HMM), pp. 167–180.
HCI-IMT-2013-EbisawaF #detection #using #video
Head-Free, Remote Gaze Detection System Based on Pupil-Corneal Reflection Method with Using Two Video Cameras — One-Point and Nonlinear Calibrations (YE, KF), pp. 205–214.
HCI-IMT-2013-MurataHMH #case study #interface #using
Study on Character Input Methods Using Eye-gaze Input Interface (AM, KH, MM, TH), pp. 320–329.
HCI-IMT-2013-MurataHO #estimation #interface
Proposal of Estimation Method of Stable Fixation Points for Eye-gaze Input Interface (AM, TH, KO), pp. 330–339.
HCI-IMT-2013-MurataUH #case study
Study on Cursor Shape Suitable for Eye-gaze Input System (AM, RU, TH), pp. 312–319.
HIMI-HSM-2013-YamamotoNEN #development #mobile #tablet
Development of a Mobile Tablet PC with Gaze-Tracking Function (MY, HN, KE, TN), pp. 421–429.
CHI-2012-AndristPMG #design #effectiveness
Designing effective gaze mechanisms for virtual agents (SA, TP, BM, MG), pp. 705–714.
CHI-2012-BullingAS #security #using #visual notation
Increasing the security of gaze-based cued-recall graphical passwords using saliency masks (AB, FA, AS), pp. 3011–3020.
CHI-2012-HuangWB #web
User see, user point: gaze and cursor alignment in web search (JH, RW, GB), pp. 1341–1350.
CHI-2012-KimBGCV #3d #estimation #named
TeleHuman: effects of 3d perspective on gaze and pose estimation with a life-size cylindrical telepresence pod (KK, JB, AG, JC, RV), pp. 2531–2540.
Look & touch: gaze-supported target acquisition (SS, RD), pp. 2981–2990.
CHI-2012-VitakIDEG #learning
Gaze-augmented think-aloud as an aid to learning (SAV, JEI, ATD, SE, AKG), pp. 2991–3000.
CSCW-2012-NussliJ #interactive #programming #quality
Effects of sharing text selections on gaze cross-recurrence and interaction quality in a pair programming task (MAN, PJ), pp. 1125–1134.
ICPR-2012-LuSOS #image #synthesis
Head pose-free appearance-based gaze sensing via eye image synthesis (FL, YS, TO, YS), pp. 1008–1011.
ICPR-2012-MaedaNI #image #modelling
Surface matching by curvature distribution images generated via gaze modeling (MM, TN, KI), pp. 2194–2197.
CSCW-2011-BednarikSP #bidirectional #collaboration
Bidirectional gaze in remote computer mediated collaboration: setup and initial results from pair-programming (RB, AS, SP), pp. 597–600.
HCI-ITE-2011-AbeOO #analysis #detection #image
Eye-gaze Detection by Image Analysis under Natural Light (KA, SO, MO), pp. 176–184.
HCI-ITE-2011-BarbuceanuADR #analysis #artificial reality #interactive #user interface
Attentive User Interface for Interaction within Virtual Reality Environments Based on Gaze Analysis (FB, CA, MD, ZR), pp. 204–213.
HCI-ITE-2011-ParkAK #interactive #interface #mobile
Gaze-Directed Hands-Free Interface for Mobile Interaction (GsP, JgA, GJK), pp. 304–313.
CHI-2010-ForgetCB #visual notation
Shoulder-surfing resistance with eye-gaze entry in cued-recall graphical passwords (AF, SC, RB), pp. 1107–1110.
CHI-2010-KernMS #named #visual notation
Gazemarks: gaze-based visual placeholders to ease attention switching (DK, PM, AS), pp. 2093–2102.
CHI-2010-VertanenM #performance #speech #using
Speech dasher: fast writing using speech and gaze (KV, DJCM), pp. 595–598.
Gaze-Motivated Compression of Illumination and View Dependent Textures (JF, MH, MJC), pp. 862–865.
ICPR-2010-ValentiLSDG #estimation #visual notation
Visual Gaze Estimation by Joint Head and Eye Information (RV, AL, NS, CD, TG), pp. 3870–3873.
ICPR-2010-YonetaniKHM #estimation
Gaze Probing: Event-Based Estimation of Objects Being Focused On (RY, HK, TH, TM), pp. 101–104.
Arrow tag: a direction-key-based technique for rapidly selecting hyperlinks while gazing at a screen (AM, HI, MA), pp. 1025–1028.
CHI-2009-MajarantaAS #performance #type system
Fast gaze typing with an adjustable dwell time (PM, UKA, OS), pp. 357–360.
Disambiguating ninja cursors with eye gaze (KJR, OS), pp. 1411–1414.
CSCW-2008-SteptoeWMGRSRS #analysis #collaboration
Eye-tracking for avatar eye-gaze and interactional analysis in immersive collaborative virtual environments (WS, RW, AM, EG, JR, PMS, DJR, AS), pp. 197–200.
ICML-2008-PuolamakiAK #learning #query
Learning to learn implicit queries from gaze patterns (KP, AA, SK), pp. 760–767.
ICPR-2008-ChenJ #3d #estimation #information retrieval
3D gaze estimation with a single camera without IR illumination (JC, QJ), pp. 1–4.
Gaze tracking by Binocular Vision and LBP features (HL, CW, YWC), pp. 1–4.
ICPR-2008-XuKB #adaptation #mobile
Information-based gaze control adaptation to scene context for mobile robots (TX, KK, MB), pp. 1–4.
SIGIR-2008-BuscherDE #feedback #query #using
Query expansion using gaze-based feedback on the subdocument level (GB, AD, LvE), pp. 387–394.
CHI-2007-KumarPW #named #using
EyePoint: practical pointing and selection using gaze and keyboard (MK, AP, TW), pp. 421–430.
CHI-2007-SchrammelGST #exclamation #quote #using
“Look!”: using the gaze direction of embodied agents (JS, AG, RS, MT), pp. 1187–1190.
HCI-MIE-2007-KomogortsevK #design #interface
Kalman Filtering in the Design of Eye-Gaze-Guided Computer Interfaces (OK, JIK), pp. 679–689.
HCI-MIE-2007-LeePWP #artificial reality #robust
Robust Gaze Tracking Method for Stereoscopic Virtual Reality Systems (ECL, KRP, MCW, JP), pp. 700–709.
CHI-2006-SantellaADSC #automation #interactive
Gaze-based interaction for semi-automatic photo cropping (AS, MA, DD, DS, MFC), pp. 771–780.
ICPR-v1-2006-ZhuJB #estimation
Nonlinear Eye Gaze Mapping Function Estimation via Support Vector Regression (ZZ, QJ, KPB), pp. 1132–1135.
CHI-2005-OuOYF #collaboration
Effects of task properties, partner actions, and message content on eye gaze patterns in a collaborative task (JO, LMO, JY, SRF), pp. 231–240.
Conversing with the user based on eye-gaze patterns (PQ, SZ), pp. 221–230.
Extracting a Gaze Region with the History of View Directions (NU, AS, MK), pp. 957–960.
CHI-2003-GarauSVBSS #communication #quality
The impact of avatar realism and eye gaze control on perceived quality of communication in a shared immersive virtual environment (MG, MS, VV, AB, AS, MAS), pp. 529–536.
CHI-2003-VertegaalWSC #named #using #video
GAZE-2: conveying eye contact in group video conferencing using eye-controlled camera direction (RV, IW, CS, CC), pp. 521–528.
CHI-2002-FukayamaOMSH #embedded #interface
Messages embedded in gaze of interface agents — impression management with agent’s gaze (AF, TO, NM, MS, NH), pp. 41–48.
CSCW-2002-VertegaalD #question
Explaining effects of eye gaze on mediated group conversations: amount or synchronization? (RV, YD), pp. 41–48.
ICPR-v3-2002-SugimotoNM #detection #visual notation
Detecting a Gazing Region by Visual Direction and Stereo Cameras (AS, AN, TM), pp. 278–282.
ICPR-v4-2002-MorimotoAF #detection
Detecting Eye Position and Gaze from a Single Camera and 2 Light Sources (CHM, AA, MF), pp. 314–317.
CHI-2001-GarauSBS #communication #using
The impact of eye gaze on communication using humanoid avatars (MG, MS, SB, MAS), pp. 309–316.
Eye gaze patterns in conversations: there is more to conversational agents than meets the eyes (RV, RS, GCvdV, AN), pp. 301–308.
CHI-2000-SalvucciA #interface
Intelligent gaze-added interfaces (DDS, JRA), pp. 273–280.
CHI-2000-SibertJ #evaluation #interactive
Evaluation of eye gaze interaction (LES, RJKJ), pp. 281–288.
CHI-2000-TaylorR #communication #consistency #semantics #using
Gaze communication using semantically consistent spaces (MJT, SMR), pp. 400–407.
ICPR-v4-2000-MurakamiTK #interactive
An Interactive Facial Caricaturing System Based on the Gaze Direction of Gallery (KM, MT, HK), pp. 4710–4713.
A Calibration-Free Gaze Tracking Technique (SWS, YTW, JL), pp. 4201–4204.
CHI-1999-Vertegaal #collaboration #communication #multi
The GAZE Groupware System: Mediating Joint Attention in Multiparty Communication and Collaboration (RV), pp. 294–301.
Manual and Gaze Input Cascaded (MAGIC) Pointing (SZ, CM, SI), pp. 246–253.
Variation of sitting posture in work with VDU’s. The effect of downward gaze (KIF, IL), pp. 56–60.
HCI-EI-1999-SuzukiA #difference #process
Difference in the ANS activity between gaze angles while seated (KS, DRA), pp. 69–73.
ICDAR-1997-EglinE #development #documentation #grid #segmentation #visual notation
Logarithmic Spiral Grid and Gaze Control for the Development of Strategies of Visual Segmentation on a Document (VE, HE), pp. 689–692.
An Active Gazing-Line System for Improving Sign-Language Conversation (SL, KI, SI), pp. 283–286.
ICPR-1996-HansenS #estimation #using
Active depth estimation with gaze and vergence control using Gabor filters (MH, GS), pp. 287–291.
Gaze Direction and Ocular Surface Area in VDT Work (MS, SS, ST, TS, SS), pp. 750–755.
HCI-SHI-1993-MoriiKT #animation #detection #evaluation #eye tracking #realtime #using
Evaluation of a Gaze Using Real-Time CG Eye-Animation Combined with Eye Movement Detector (KM, FK, NT), pp. 1103–1108.
HCI-SHI-1993-SchryverG #3d #interface
Eye-Gaze and Intent: Application in 3D Interface Control (JCS, JG), pp. 573–578.
CHI-1990-StarkerB #self
A gaze-responsive self-disclosing display (IS, RAB), pp. 3–10.

Bibliography of Software Language Engineering in Generated Hypertext (BibSLEIGH) is created and maintained by Dr. Vadim Zaytsev.
Hosted as a part of SLEBOK on GitHub.