Last Updated on Tuesday, 5 April, 2022 at 11:22 am by Andre Camilleri
Kenneth Bone is the Project Manager of Wildeye and managing director of Seasus, (www.seasus.com), a multi-award winning Malta-based technology service provider. He is a serial entrepreneur in the digital technology field, having spearheaded research and innovation projects in areas such as digital payments, travel behaviour analysis, digital forensics and AI.
The presence of electronic devices, ranging from mobile phones to computers and in-car infotainment systems is now pervasive in our lives and we are in near constant interaction with these devices. The most common method for this interaction involves the use of one’s hands, either via a touch screen or via devices such as a keyboard or a mouse. In recent years, the use of speech or voice commands to devices has also gained popularity, as speech recognition technology became more accurate. But what if one is unable to use any of these methods to interact with a device, such as in the case of injury or disability?
In such situations, eye-gaze tracking offers an alternative way of accessing a computer: a pointer controlled with one's eyes takes the place of the mouse. Eye movements have long been recognised as an alternative channel for communicating with, or controlling, a machine such as a computer.
Project Wildeye is a collaboration between the Department of Systems and Control Engineering at the University of Malta and Seasus Ltd, funded by Fusion, the R&I Programme administered by MCST, that developed a passive eye-gaze tracking platform aimed at providing an alternative communication channel, primarily for persons with physical disabilities. The application targets users with motor and/or communication impairments, permitting them to perform everyday activities such as operating a computer.
The achievements of the project were discussed during a public engagement event held recently at the Esplora Planetarium Building, with the attendance of Owen Bonnici, Minister for Equality, Research and Innovation and Dr Melchior Cini, deputy director, R&I Programmes Unit, who delivered speeches during this event.
Most commercially available eye-gaze tracking products are of the active type, relying on dedicated additional hardware that directs infra-red illumination at the face or the eyes themselves. Wildeye, on the other hand, developed a passive eye-gaze tracking technology, whereby the eye-gaze is tracked using only a standard web camera of the kind commonly built into computers. This makes the technology more accessible by eliminating the cost and setup of dedicated hardware devices.
In its Action Control Mode, the Wildeye application displays a set of on-screen buttons representing the most common actions used within the operating system and other applications. The user simply looks at the icon of interest and "clicks" on it either with a long blink or by lingering their gaze on that specific icon. This can be thought of as the equivalent of a remote control for computer applications.
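The dwell-to-click behaviour described above can be sketched as a small piece of selection logic. This is a hypothetical illustration, not code from the Wildeye platform: the class name, threshold value and gaze-sample interface are all assumptions made for the example.

```python
# Illustrative dwell-click logic: a "click" fires only after the gaze has
# rested on the same target for a sustained period. Threshold and API are
# assumptions, not taken from the actual Wildeye implementation.

DWELL_THRESHOLD = 1.5  # seconds of sustained gaze needed to trigger a click


class DwellSelector:
    def __init__(self, threshold=DWELL_THRESHOLD):
        self.threshold = threshold
        self.current_target = None  # icon the gaze currently rests on
        self.dwell_start = None     # timestamp when that gaze began

    def update(self, target, timestamp):
        """Feed one gaze sample; return the target if a dwell-click fires."""
        if target != self.current_target:
            # Gaze moved to a different icon (or off-screen): restart timer.
            self.current_target = target
            self.dwell_start = timestamp
            return None
        if target is not None and timestamp - self.dwell_start >= self.threshold:
            # Dwell long enough: fire once, then restart the timer so the
            # user must keep looking to trigger a second click.
            self.dwell_start = timestamp
            return target
        return None
```

In practice, a gaze estimator would feed `update()` a stream of (target, timestamp) samples; the long-blink trigger mentioned in the article would be a separate detector alongside this one.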
The Short Message Mode of the Wildeye application is geared towards users with both motor and communication impairments. In this scenario, the application displays a set of message buttons which the user can quickly focus on and trigger. A button either displays a single message when "hit" (such as Yes or No) or serves as a category header which opens a further subset of buttons (such as I Am, which leads to additional message options). One can think of this as the equivalent of a hand sign language.
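The two kinds of button described above, direct messages and category headers, form a simple tree. The sketch below is illustrative only; the specific messages and the data structure are assumptions, not the actual Wildeye layout.

```python
# Illustrative message-button hierarchy: a leaf emits a message directly,
# while a category entry (a nested dict) opens a sub-menu of further
# buttons. The messages shown are assumed examples.

MESSAGE_TREE = {
    "Yes": "Yes",
    "No": "No",
    "I Am": {                      # category header: opens a sub-menu
        "Hungry": "I am hungry",
        "Tired": "I am tired",
        "Cold": "I am cold",
    },
}


def select(tree, label):
    """Resolve one button press: return ('message', text) for a leaf,
    or ('menu', subtree) for a category header."""
    node = tree[label]
    if isinstance(node, dict):
        return ("menu", node)
    return ("message", node)
```

Triggering "I Am" returns the sub-menu, and a second selection within it (e.g. "Hungry") yields the full message "I am hungry".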
The project will now move into the next phase, Wildeye+, which will primarily look into the use of state-of-the-art deep learning techniques and neural networks to enhance the capabilities of the existing algorithms for eye and head gaze estimation. Such enhancement will enable the algorithms to autonomously adapt to diverse usage environments such as sub-optimal lighting conditions, facial features, hardware and other parameters that can affect accuracy.
Another area of interest for the project is the possibility of integrating with Internet of Things (IoT) devices, such as wi-fi enabled switches, which would enable users to control appliances, lights, door locks and so forth using only their eyes. The application would also provide open-source APIs allowing third-party software and hardware providers to integrate directly with the Wildeye platform, so that their applications could be controlled via eye-gaze based input.
In addition, the project will look into broader applications, such as user behaviour analysis and usage on mobile devices.
For further information visit www.facebook.com/wildeyemlt