The eGlasses project is focused on the development of an open platform in the form of multisensory electronic eyeglasses, and on the integration and design of new intelligent interaction methods using the eGlasses platform. This is initial work on long-term research and technological innovation in perceptual and super-perceptual (e.g., heart rate, temperature) computing. It is an emerging technology that also focuses on the creation of mobile perceptual media.

Perceptual media refers to multimedia devices with added perceptual user interface capabilities. These devices integrate human-like perceptual awareness of the environment with the ability to respond appropriately. This can be realized through automatic perception of an object’s properties and the delivery of information about the object’s status as a result of reasoning operations. For example, using the eGlasses it will be possible to control a device recognized in the field of view through an interactive menu associated with the identified device. Other examples include presenting a recognized person’s name, recognizing people with abnormal physiological parameters (e.g., alerts at airports), protection against possible head injury, etc.
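The recognize-then-interact flow described above can be sketched as a simple lookup from a recognized object to its associated menu of actions. This is a minimal illustrative sketch only; the names (`DEVICE_MENUS`, `menu_for`, the device labels) are assumptions for the example, not part of any eGlasses API.

```python
# Hypothetical sketch of the perceptual-media flow: an object recognized
# in the field of view is mapped to an interactive menu of actions.
# All identifiers here are illustrative assumptions.

DEVICE_MENUS = {
    "thermostat": ["show temperature", "set temperature", "switch off"],
    "tv": ["volume up", "volume down", "change channel"],
}

def menu_for(recognized_label):
    """Return the interactive menu associated with a recognized device;
    fall back to a generic action when the device is unknown."""
    return DEVICE_MENUS.get(recognized_label, ["identify object"])
```

In a real system the recognized label would come from the glasses' object-recognition pipeline, and selecting a menu entry would trigger the corresponding command to the device.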

The platform will use already available user-interaction methods and new methods developed within this project (e.g., a haptic interface), while enabling further extensions to introduce and test next-generation user-interaction algorithms. Furthermore, the goal of this project is to propose and evaluate new, intelligent user interactions dedicated to healthcare professionals and to people with disabilities or at risk of exclusion, and to create and evaluate behavioural models of mobile users.


The main scientific and technological objectives of the project are to design and evaluate:

  • eye gaze-tracking hardware and algorithms for a user who is mobile in a noisy real-world environment,
  • algorithms for perceptual media and for super-perceptual computing,
  • methods for locating objects and guiding vision towards the identified objects,
  • methods of interactions with users and objects (menu of activities for the identified person or object),
  • a haptic interface in the form of a peripheral proximity radar,
  • algorithms for recognising the user’s own gestures and the gestures of an observed person,
  • algorithms for empirical studies of a user’s behaviour,
  • algorithms for reference applications.
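One way to picture the peripheral proximity radar listed above is as a mapping from directional distance readings to vibration intensities, so that closer obstacles produce stronger haptic feedback. The function and sensor layout below are purely hypothetical assumptions for illustration, not the project's actual design.

```python
def haptic_intensity(distance_m, max_range_m=2.0):
    """Map a proximity reading (metres) to a vibration intensity in [0, 1].
    Closer obstacles give stronger feedback; readings beyond the sensing
    range give 0. The linear mapping and 2 m range are assumptions."""
    if distance_m >= max_range_m:
        return 0.0
    return round(1.0 - distance_m / max_range_m, 2)

# One reading per direction (e.g., from proximity sensors on the frame).
readings = {"left": 0.5, "front": 1.9, "right": 2.5}
intensities = {d: haptic_intensity(r) for d, r in readings.items()}
```

A real implementation would also need smoothing over time and per-actuator calibration, but the core idea is this distance-to-intensity mapping per direction.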

The result of the project will be an open platform in the form of multisensory electronic multimedia glasses, together with a set of new methods and algorithms for intelligent user interaction, especially in the context of perceptual media.

Project duration: 01.01.2014 - 31.12.2016.