Development of an Eye- and Gaze-Tracking Mechanism in an Active and Assisted Living Ecosystem
This paper details the development of an open-source eye- and gaze-tracking mechanism
designed for open, scalable, and decentralized Active and Assisted Living (AAL) ecosystems
built on Wireless Sensor and Actuator Networks (WSANs). The mechanism is deliberately
conceived as an additional service-feature in an ongoing implementation of an extended
intelligent built-environment framework, one motivated and informed both by Information and
Communication Technologies (ICTs) and by emerging Architecture, Engineering, and
Construction (AEC) considerations. It is nevertheless designed as a compatible and
subsumable service-feature for existing AAL frameworks of the kind characterized above. The eye- and
gaze-tracking mechanism enables the user (1) to engage (e.g., open, shut, slide, turn on/off)
a variety of actuable objects and systems deployed within an intelligent built-environment
via sight-enabled identification, selection, and confirmation; and (2) to extract
and display personal identity information from recognized familiar faces in the user's view.
The first feature is intended principally (although not exclusively) for users with limited
mobility, to support their independent control of remotely actuable mechanisms within the
built-environment. The second feature is intended to compensate for the loss of memory and/or
visual acuity associated principally (although not exclusively) with the natural aging
process. As with previously developed service-features, the present mechanism is intended to
increase the quality of life of its user(s) in an affordable, intuitive, and highly
intelligent manner.
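The abstract leaves the implementation of the first feature open, but sight-enabled selection and confirmation is commonly realized as dwell-based targeting: the tracker's gaze coordinates are matched against the regions of registered actuable objects, and a sustained fixation confirms the command. The Python sketch below illustrates that logic only under stated assumptions; the `ActuableObject` regions, object names, and the 1.5 s dwell threshold are illustrative, not taken from the paper.

```python
from dataclasses import dataclass
from typing import Iterable, Optional, Tuple

@dataclass
class ActuableObject:
    """A remotely actuable object registered in the WSAN, occupying a
    rectangular region in the user's calibrated field of view (hypothetical)."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

DWELL_SECONDS = 1.5  # assumed confirmation threshold; would be tuned per user

def select_by_dwell(
    gaze_samples: Iterable[Tuple[float, float, float]],
    objects: list,
) -> Optional[ActuableObject]:
    """Return the object fixated continuously for DWELL_SECONDS, else None.

    gaze_samples yields (timestamp_s, x, y) tuples from the eye tracker,
    in the same normalized coordinate frame as the object regions.
    """
    current, since = None, None
    for t, x, y in gaze_samples:
        hit = next((o for o in objects if o.contains(x, y)), None)
        if hit is not current:          # gaze moved to a new target (or off-target)
            current, since = hit, t
        elif hit is not None and t - since >= DWELL_SECONDS:
            return hit                  # selection confirmed by sustained fixation
    return None

if __name__ == "__main__":
    lamp = ActuableObject("desk_lamp", 0.1, 0.1, 0.3, 0.3)
    door = ActuableObject("sliding_door", 0.6, 0.2, 0.9, 0.8)
    # Synthetic gaze stream: ~2 s of fixation on the lamp at 10 Hz.
    samples = [(i * 0.1, 0.2, 0.2) for i in range(21)]
    target = select_by_dwell(samples, [lamp, door])
    if target is not None:
        print(f"actuate: toggle {target.name}")  # e.g., publish a command to the WSAN
```

Dwell-based confirmation is one common way to mitigate the "Midas touch" problem (unintended actuation from casual glances); the paper's actual confirmation step may well differ.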
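The second feature pairs the user's scene camera with face recognition against a gallery of enrolled familiar faces, displaying the matched identity to the user. The abstract does not name a recognition pipeline, so the sketch below uses the open-source face_recognition library purely as a stand-in; the gallery labels and image paths (alice.jpg, bob.jpg, scene.jpg) are likewise hypothetical.

```python
import face_recognition  # open-source, dlib-based; a stand-in for the paper's pipeline

# Hypothetical gallery of familiar faces, e.g. enrolled relatives and caregivers.
GALLERY = {
    "Alice (daughter)": "alice.jpg",
    "Bob (neighbor)": "bob.jpg",
}
known_names = list(GALLERY)
known_encodings = [
    face_recognition.face_encodings(face_recognition.load_image_file(path))[0]
    for path in GALLERY.values()
]

def identify_faces(frame_path: str, tolerance: float = 0.6) -> list:
    """Return labels for all familiar faces found in one scene-camera frame."""
    frame = face_recognition.load_image_file(frame_path)
    labels = []
    for encoding in face_recognition.face_encodings(frame):
        matches = face_recognition.compare_faces(
            known_encodings, encoding, tolerance=tolerance)
        labels.extend(n for n, m in zip(known_names, matches) if m)
    return labels

# e.g. overlay the result on the user's display:
print(identify_faces("scene.jpg"))  # -> ['Alice (daughter)']
```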
Authors:
- Alexander Liu Cheng
- Néstor Llorca Vega
- Galoget Latorre