BibSLEIGH
Stem impair$ (all stems)

Used together with:
visual (44)
user (27)
peopl (17)
motor (13)
use (13)

77 papers:

CHI-2015-AhmedHCCK #behaviour #people #privacy #visual notation
Privacy Concerns and Behaviors of People with Visual Impairments (TA, RH, KC, DJC, AK), pp. 3523–3532.
CHI-2015-FlatlaATKS #identification #named #people
ColourID: Improving Colour Identification for People with Impaired Colour Vision (DRF, ARA, RDT, DLK, CS), pp. 3543–3552.
CHI-2015-KimY #3d #towards #visual notation
Toward 3D-Printed Movable Tactile Pictures for Children with Visual Impairments (JK, TY), pp. 2815–2824.
CHI-2015-MaluF #artificial reality #personalisation #smarttech
Personalized, Wearable Control of a Head-mounted Display for Users with Upper Body Motor Impairments (MM, LF), pp. 221–230.
CHI-2015-WaddingtonLGHH #design #game studies #people #video
Participatory Design of Therapeutic Video Games for Young People with Neurological Vision Impairment (JW, CL, KMG, KH, TLH), pp. 3533–3542.
DUXU-IXD-2015-CaiLLH #case study #experience #research #speech #user interface
User Experience Research on the Rehabilitation System of Speech-Impaired Children — A Case Study on Speech Training Product (WC, JL, QL, TH), pp. 562–574.
ICEIS-v3-2015-NetoMKBRG #recognition #smarttech #visual notation
A Wearable Face Recognition System Built into a Smartwatch and the Visually Impaired User (LdSBN, VRMLM, FLK, MCCB, AdRR, SKG), pp. 5–12.
CASE-2014-PuTC #design #development #smarttech
Design and development of the wearable hand exoskeleton system for rehabilitation of hand impaired patients (SWP, SYT, JYC), pp. 996–1001.
CHI-2014-WuA #network #online #social #visual notation
Visually impaired users on an online social network (SW, LAA), pp. 3133–3142.
CHI-2014-YeMOF #mobile #people #smarttech #visual notation
Current and future mobile and wearable device use by people with visual impairments (HY, MM, UO, LF), pp. 3123–3132.
DUXU-DP-2014-ShafiqCIFAAI #design #effectiveness #interactive #personalisation #visual notation
Skill Specific Spoken Dialogues Based Personalized ATM Design to Maximize Effective Interaction for Visually Impaired Persona (MS, JGC, MI, MF, MA, IA, AI), pp. 446–457.
DUXU-ELAS-2014-MedeirosJG #learning #memory management #named #student
Logograms: Memory Aids for Learning, and an Example with Hearing-Impaired Students (LM, MBJ, LVG), pp. 207–216.
HIMI-AS-2014-KinoeN #design #quality #visual notation
Qualitative Study for the Design of Assistive Technologies for Improving Quality of Life of Visually Impaired (YK, AN), pp. 602–613.
ICEIS-v3-2014-PerezAA #adaptation
Mintzatek, Text-to-Speech Conversion Tool Adapted to Users with Motor Impairments (JEP, MA, JA), pp. 112–119.
RE-2014-AlkhaniferL #design #elicitation #requirements #towards #visual notation
Towards a situation awareness design to improve visually impaired orientation in unfamiliar buildings: Requirements elicitation study (AA, SL), pp. 23–32.
SAC-2014-PedrosaP #using
Text entry using a foot for severely motor-impaired individuals (DdCP, MdGCP), pp. 957–963.
CASE-2013-QianY #named #navigation #performance #visual notation
NCC-RANSAC: A fast plane extraction method for navigating a smart cane for the visually impaired (XQ, CY), pp. 261–267.
ITiCSE-2013-CapovillaH #education #spreadsheet #student
Teaching spreadsheets to visually-impaired students in an environment similar to a mainstream class (DC, PH), pp. 99–104.
CHI-2013-AnthonyKF #people
Analyzing user-generated youtube videos to understand touchscreen use by people with motor impairments (LA, YK, LF), pp. 1223–1232.
CHI-2013-HaradaSAKTA #experience #people #visual notation
Accessible photo album: enhancing the photo sharing experience for people with visual impairment (SH, DS, DWA, SK, HT, CA), pp. 2127–2136.
CHI-2013-SaliviaH #named
PointAssist: assisting individuals with motor impairments (GS, JPH), pp. 1213–1222.
HCI-AMTE-2013-OhKHJ #design #perspective #process
User Centered Inclusive Design Process: A “Situationally-Induced Impairments and Disabilities” Perspective (HJO, HCK, HH, YGJ), pp. 103–108.
HCI-AS-2013-ItoYTUITH #evaluation
Evaluation of an Information Delivery System for Hearing Impairments at a School for Deaf (AI, TY, KT, KU, TI, HT, YH), pp. 398–407.
HCI-IMT-2013-CarusoCLMRSSC #architecture #named #people #physics
My-World-in-My-Tablet: An Architecture for People with Physical Impairment (MC, FC, FL, MM, AR, FS, LS, TC), pp. 637–647.
HCI-IMT-2013-LeeBN #visual notation
Use of Reference Frame in Haptic Virtual Environments: Implications for Users with Visual Impairments (JYL, SB, CSN), pp. 610–617.
HCI-IMT-2013-LiuBCN #behaviour #visual notation
Behavioral Characteristics of Users with Visual Impairment in Haptically Enhanced Virtual Environments (SL, SB, HC, CSN), pp. 618–625.
HCI-UC-2013-LinHW #learning #using #visual notation
Establishing a Cognitive Map of Public Place for Blind and Visual Impaired by Using IVEO Hands-On Learning System (QWL, SLH, JLW), pp. 193–198.
HIMI-HSM-2013-HirayamaKK #speech #user interface #visual notation
A Dialog Based Speech User Interface of a Makeup Support System for Visually Impaired Persons (MJH, NK, YK), pp. 261–268.
HIMI-HSM-2013-ItohKMYYO #behaviour #comparison
Comparison of Cognitively Impaired, Healthy Non-Professional and Healthy Professional Driver Behavior on a Small and Low-Fidelity Driving Simulator (MI, MK, KM, KY, SY, MO), pp. 490–496.
CHI-2012-FallahABF #navigation #using #visual notation
The user as a sensor: navigating users with visual impairments in indoor spaces using tactile landmarks (NF, IA, KEB, EF), pp. 425–432.
CHI-2012-GoelFW #mobile #named #using
WalkType: using accelerometer data to accomodate situational impairments in mobile touch screen text entry (MG, LF, JOW), pp. 2687–2696.
CHI-2012-GuyT #named #navigation #visual notation
CrossingGuard: exploring information content in navigation aids for visually impaired pedestrians (RTG, KNT), pp. 405–414.
CHI-2012-YataniBT #feedback #named #people #representation #using #visual notation
SpaceSense: representing geographical information to visually impaired people using spatial tactile feedback (KY, NB, KNT), pp. 415–424.
ICPR-2012-SimoesSW #detection #image #using
Using local texture maps of brain MR images to detect Mild Cognitive Impairment (RS, CHS, AMvCvW), pp. 153–156.
DUXU-v1-2011-Hasan #multi #online #visual notation #word
Multi-language Online Word Processor for Learners and the Visually Impaired (SIH), pp. 256–260.
HCI-MIIE-2011-CostaFVCBH #detection #navigation #people #visual notation
Landmarks Detection to Assist the Navigation of Visually Impaired People (PC, HF, VV, PC, JB, LJH), pp. 293–300.
HCI-UA-2011-LiJN #learning #user interface #visual notation
Haptically Enhanced User Interface to Support Science Learning of Visually Impaired (YL, SLJ, CSN), pp. 68–76.
HCI-UA-2011-VellaVR #people
Investigating Drag and Drop Techniques for Older People with Cognitive Impairment (FV, NV, PR), pp. 530–538.
HIMI-v2-2011-IkegamiIIO #development #game studies #visual notation
Development of a Tracking Sound Game for Exercise Support of Visually Impaired (YI, KI, HI, MO), pp. 31–35.
ICEIS-v4-2011-MartinsP #implementation #visual notation
A Method Proposal for Implementing Accessibility in Desktop Applications for Visually Impaired Users (LCGM, BEP), pp. 287–290.
KEOD-2010-KehagiasT #ontology
An Ontology for Mobility Impaired user Needs and Services (DK, DT), pp. 408–411.
CHI-2009-Hornof #design
Designing with children with severe motor impairments (AJH), pp. 2177–2180.
SAC-2009-NetoFP #named #people #using
MATRACA: a tool to provide support for people with impaired vision when using the computer for simple tasks (FGdON, JMF, RRGP), pp. 158–159.
ITiCSE-2008-NorteL #game studies #people
A sudoku game for people with motor impairments (SN, FGL), p. 319.
CHI-2008-GajosWW #interface #performance
Improving the performance of motor-impaired users with automatically-generated, ability-based interfaces (KZG, JOW, DSW), pp. 1257–1266.
CHI-2008-PlimmerCBB #collaboration #multimodal #people
Multimodal collaborative handwriting training for visually-impaired people (BP, AC, SAB, RB), pp. 393–402.
CHI-2008-WuBRBM #distributed #memory management #product line
Collaborating to remember: a distributed cognition account of families coping with memory impairments (MW, JPB, BR, RB, MM), pp. 825–834.
ICPR-2008-TanakaG #people #smarttech
Text-tracking wearable camera system for visually-impaired people (MT, HG), pp. 1–4.
CSEET-2007-DistanteH #challenge #education #lessons learnt #programming #re-engineering #student
Challenges and Lessons Learned in Teaching Software Engineering and Programming to Hearing-Impaired Students (DD, SH), pp. 344–354.
ITiCSE-2007-ArmstrongM #education
Remote and local delivery of cisco education for the vision-impaired (HA, IM), pp. 78–81.
CHI-2007-KuberYM #feedback #internet #towards #visual notation
Towards developing assistive haptic feedback for visually impaired internet users (RK, WY, GM), pp. 1525–1534.
CHI-2007-TanYM #adaptation #approach #people #visual notation #web
An adaptive & adaptable approach to enhance web graphics accessibility for visually impaired people (CCT, WY, GM), pp. 1539–1542.
CHI-2006-WobbrockM #people
Trackball text entry for people with motor impairments (JOW, BAM), pp. 479–488.
CHI-2005-HornofC #named
EyeDraw: enabling children with severe motor impairments to draw with their eyes (AJH, AC), pp. 161–170.
CHI-2005-TeeMFMMPF #visual notation
A visual recipe book for persons with language impairments (KT, KM, LF, EM, JM, BP, SF), pp. 501–510.
CHI-2004-JackoBKMEES #feedback #multimodal #visual notation
Isolating the effects of visual impairment: exploring the effect of AMD on the utility of multimodal feedback (JAJ, LB, TK, KPM, PJE, VKE, FS), pp. 311–318.
CSCW-2004-WinbergB #design #interface #towards #visual notation
Assembling the senses: towards the design of cooperative interfaces for visually impaired users (FW, JB), pp. 332–341.
ICEIS-v5-2004-FernandesPC #visual notation
Accessibility and Visually Impaired Users (ARF, JRP, JCC), pp. 75–80.
ICPR-v1-2004-PnevmatikakisP #comparison
Comparison of Eigenface-Based Feature Vectors under Different Impairments (AP, LP), pp. 296–299.
ICPR-v2-2004-EzakiBS #detection #image #towards #visual notation
Text Detection from Natural Scene Images: Towards a System for Visually Impaired Persons (NE, MB, LS), pp. 683–686.
CHI-2003-HwangKLC #multi
Multiple haptic targets for motion-impaired computer users (FH, SK, PL, PJC), pp. 41–48.
CHI-2003-JackoSSBEEKMZ #feedback #multimodal #performance #question #visual notation #what
Older adults and visual impairment: what do exposure times and accuracy tell us about performance gains associated with multimodal feedback? (JAJ, IUS, FS, LB, PJE, VKE, TK, KPM, BSZ), pp. 33–40.
ICPR-v3-2002-KawaiT #3d #interface #using #visual notation
A Support System for Visually Impaired Persons to Understand Three-dimensional Visual Information Using Acoustic Interface (YK, FT), pp. 974–977.
HT-2001-HarperGS #prototype #tool support #visual notation
Prototype mobility tools for visually impaired surfers (SH, CAG, RS), pp. 33–34.
HT-2000-GobleHS #visual notation #web
The travails of visually impaired web travellers (CAG, SH, RS), pp. 1–10.
HCI-CCAD-1999-AttreeRB #artificial reality #assessment #memory management
Virtual reality in assessment and rehabilitation of impaired memory following brain damage (EAA, FDR, BMB), pp. 1100–1104.
HCI-CCAD-1999-MeleES #case study #user interface
The impact of mobility on user interfaces for subjects who are motor impaired: case study in a wireless LAN environment (IVM, KE, AS), pp. 1018–1022.
HCI-CCAD-1999-PetrucciRAP #visual notation #web
An audio browser for increasing access to world wide web sites for blind and visually impaired computer users (LP, PR, AA, TP), pp. 995–998.
HCI-CCAD-1999-Seki #using #visual notation
Systematic auditory training of obstacle sense for the visually impaired by using acoustical VR system (YS), pp. 999–1003.
HCI-CCAD-1999-ZajicekPR #evaluation #interface #visual notation
Evaluation of a world wide web scanning interface for blind and visually impaired users (MZ, CP, CR), pp. 980–984.
HCI-EI-1999-ChenC #case study #interface #visual notation
A study on tactile interfaces as an aid to home electronic appliance operation for the visually impaired (WZC, WKC), pp. 740–744.
HCI-CC-1997-ItohSY #using #visual notation
A Drawing Support System for the Visually Impaired Computer User Using a Virtual Sound Screen (KI, MS, YY), pp. 401–404.
HCI-CC-1997-Jones
Computer Assistive Remote Control for the Motor-Impaired User (PEJ), pp. 393–396.
HCI-CC-1997-SavidisVS #library
Embedding Scanning Techniques Accessible to Motor-Impaired Users in the WINDOWS Object Library (AS, GV, CS), pp. 429–432.
CHI-1996-MereuK #3d #interface #visual notation
Audio Enhanced 3D Interfaces for Visually Impaired Users (SWM, RK), pp. 72–78.
HCI-ACS-1993-HjelmquistJ #visual notation
Work Possibilities for Visually Impaired: The role of Information Technology (EH, BJ), pp. 498–503.
INTERCHI-1993-AlmTEN #people
Computer aided conversation for severely physically impaired non-speaking people (NA, JT, LE, AFN), pp. 236–241.

Bibliography of Software Language Engineering in Generated Hypertext (BibSLEIGH) is created and maintained by Dr. Vadim Zaytsev.
Hosted as a part of SLEBOK on GitHub.