Please use this identifier to cite or link to this item: http://hdl.handle.net/1893/35275
Appears in Collections: Computing Science and Mathematics Conference Papers and Proceedings
Author(s): Swingler, Kevin
Grigson, Chris
Contact Email: kevin.swingler@stir.ac.uk
Title: A Haptic Interface for Guiding People with Visual Impairment using Three Dimensional Computer Vision
Editor(s): Back, Thomas
van Stein, Bas
Wagner, Christian
Garibaldi, Jonathan
Lam, H K
Cottrell, Marie
Doctor, Faiyaz
Filipe, Joaquim
Warwick, Kevin
Kacprzyk, Janusz
Citation: Swingler K & Grigson C (2022) A Haptic Interface for Guiding People with Visual Impairment using Three Dimensional Computer Vision. In: Back T, van Stein B, Wagner C, Garibaldi J, Lam HK, Cottrell M, Doctor F, Filipe J, Warwick K & Kacprzyk J (eds.) Proceedings of the 14th International Joint Conference on Computational Intelligence (IJCCI 2022) - NCTA. 14th International Conference on Neural Computation Theory and Applications, Valletta, Malta, 24.10.2022-26.10.2022. Setubal, Portugal: SCITEPRESS - Science and Technology Publications, pp. 315-322. https://doi.org/10.5220/0011307800003332
Issue Date: 2022
Date Deposited: 23-Mar-2023
Conference Name: 14th International Conference on Neural Computation Theory and Applications
Conference Dates: 2022-10-24 - 2022-10-26
Conference Location: Valletta, Malta
Abstract: Computer vision technology has the potential to provide life-changing assistance to blind or visually impaired (BVI) people. This paper presents a technique for locating objects in three dimensions and guiding a person’s hand to the object. Computer vision algorithms are used to locate both objects of interest and the user’s hand. Their relative locations are used to calculate the movement required to bring the hand closer to the object. The required direction is signalled to the user via a haptic wrist band, which consists of four haptic motors worn at the four compass points on the wrist. Guidance works in both two and three dimensions, making use of both colour and depth map inputs from a camera. User testing found that people were able to follow the haptic instructions and move their hand to locations on vertical or horizontal surfaces. This work is part of the Artificial Intelligence Sight Loss Assistant (AISLA) project.
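
As a rough illustration of the guidance loop the abstract describes (locate hand and object, compute their relative offset, cue a direction through one of four compass-point wrist motors), the Python sketch below maps a 2D hand-to-object offset onto a single motor cue. This is not taken from the paper: the function name, the dead-zone threshold, the one-motor-at-a-time policy, and the image coordinate convention are all assumptions made for illustration only.

    import math

    # Hypothetical sketch of the guidance step described in the abstract:
    # given hand and object positions reported by a vision pipeline, choose
    # which of the four compass-point wrist motors (N/E/S/W) should vibrate.
    # Assumed convention: image x grows rightward, y grows downward.

    def pick_motor(hand_xy, object_xy, deadzone_px=10.0):
        """Return 'N', 'E', 'S', 'W', or None when the hand is on target."""
        dx = object_xy[0] - hand_xy[0]
        dy = object_xy[1] - hand_xy[1]
        if math.hypot(dx, dy) < deadzone_px:
            return None  # hand close enough to the object: stop cueing
        # Cue along the axis with the larger remaining error, one motor at a time.
        if abs(dx) >= abs(dy):
            return 'E' if dx > 0 else 'W'
        return 'N' if dy < 0 else 'S'  # smaller y means "up" in image coordinates

    if __name__ == "__main__":
        # Hand at (100, 200); object up and to the right at (220, 120).
        print(pick_motor((100, 200), (220, 120)))  # prints 'E' (|dx|=120 > |dy|=80)

The same idea extends to the paper's three-dimensional case by adding a depth (z) error taken from the camera's depth map; how depth is cued to the user is not specified here and would depend on the device.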
Status: VoR - Version of Record
Rights: Licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International http://creativecommons.org/licenses/by-nc-nd/4.0/
Licence URL(s): http://creativecommons.org/licenses/by-nc-nd/4.0/

Files in This Item:
File: 113078.pdf
Description: Fulltext - Published Version
Size: 3.46 MB
Format: Adobe PDF


