I applaud projects like this, but it really needs hardware specialized for eye tracking. Unfortunately, every hardware project I've encountered is focused on research, and the price reflects that: often thousands of dollars, way out of reach for those who are disabled. Tobii (https://www.tobii.com/) is by far the most price-accessible, at around $250. Unfortunately, their licensing is hostile toward those who need an accessible product. They won't even allow you to create a Python wrapper to make it easy to integrate with accessibility projects (even if you're not storing gaze data). So there needs to be accessible hardware with accessible licensing for those with disabilities.
>They won't even allow you to create a Python wrapper(...)
What if you just.... ignore what they want? Publish under a pseudonym that appears to be from a country noncompliant with US intellectual "property" nonsense.
Usually the disabled get the hardware through insurance or grants - in Texas, the state provides grants for assistive communication devices out of a fund sourced from a phone bill tax. My dad's Tobii i16 came out to ~$15k. Outside the USA, I'm sure the cost of these things is ... unreasonably prohibitive.
I'm actually kinda excited to find PyGaze, we might play with it and see if I can tweak a few things to improve his experience.
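For anyone curious, pulling raw gaze samples out of PyGaze looks roughly like this. This is an untested sketch based on its documented Display/EyeTracker classes; which tracker backend actually gets used depends on your hardware and PyGaze's configuration:

    # Minimal PyGaze sketch: calibrate, then poll gaze samples for a few seconds.
    import time
    from pygaze.display import Display
    from pygaze.eyetracker import EyeTracker

    disp = Display()              # opens the experiment display
    tracker = EyeTracker(disp)    # backend chosen via PyGaze's settings
    tracker.calibrate()           # run the tracker's calibration routine

    tracker.start_recording()
    t0 = time.time()
    while time.time() - t0 < 5:
        x, y = tracker.sample()   # newest gaze position in screen pixels
        print(x, y)
        time.sleep(0.05)
    tracker.stop_recording()

    tracker.close()
    disp.close()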
As I am living an almost mouse-free life, automatically setting the input focus to the window (or input field) that I am currently looking at would be a very nice comfort feature.
Last time I researched this, I could not find a solution that works on Linux and also has widescreen support.
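In principle, once a tracker hands you screen coordinates, a crude version of this could be hacked together on X11 by warping the pointer to the gaze point and relying on the window manager's focus-follows-mouse setting. A rough, untested sketch (get_gaze() is a hypothetical stand-in for whatever your tracker's API exposes):

    # "Focus follows gaze" via xdotool: move the pointer to the gaze point on
    # large gaze shifts and let focus-follows-mouse do the rest.
    import subprocess
    import time

    def get_gaze():
        # Hypothetical: replace with your tracker's sample() call,
        # returning (x, y) in screen pixels.
        raise NotImplementedError

    def follow_gaze(poll_hz=10, min_move_px=80):
        last = (0, 0)
        while True:
            x, y = get_gaze()
            # Only warp the pointer on big shifts, so small fixational
            # eye movements don't make the cursor jitter.
            if abs(x - last[0]) > min_move_px or abs(y - last[1]) > min_move_px:
                subprocess.run(["xdotool", "mousemove", str(int(x)), str(int(y))])
                last = (x, y)
            time.sleep(1.0 / poll_hz)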
I am mostly mouse-free too, so I feel inclined to answer.
Many pieces of software can be used with the keyboard only. This is necessary for accessibility, but it's also just what good old hotkeys do.
It doesn't work for everything, but it already gets you very far.
On top of that, there is some software made to cater specifically to keyboard-only use. Many tiling window managers work well with just a keyboard (I personally am partial to i3[0]), and even for web browsers there are extensions like tridactyl[1] that make it easier to go without a mouse. That doesn't cover everything: it's possible to do 3D work without a mouse, but I find it easier to have access to one - specifically a trackpad.
This can of course be turned on its head: you can use something like dasher[2] to do your typing, and - to get this back on topic - dasher can be used together with eye tracking to live a "keyboard-free life".
Eye trackers are an incredibly common tool for research. They're obviously useful for studying eye movements themselves. This may sound niche, but eye movements are tightly integrated with perception and attention. Eye tracking can thus tell you about the features or cues that help (say) hit a baseball, read some text, or find a tumor. Experts in a field often make different patterns of eye movements than novices, so there are some nascent efforts at using eye tracking to help training (e.g., "no, keep your eye on the ball").
On the application side, they have potential as an interface device too: you can make eye movements incredibly quickly (80-100ms, under some conditions) and accurately. This persists even in some neurodegenerative conditions.
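The standard trick for turning that speed into an input method is dwell selection: if the gaze stays inside a small region for long enough, treat it as a click. A toy sketch (the thresholds are made up, and get_gaze() stands in for whatever your tracker provides):

    # Dwell selection: return a point once the gaze has lingered near it.
    import math
    import time

    def dwell_select(get_gaze, radius_px=60, dwell_s=0.6):
        anchor = get_gaze()
        start = time.monotonic()
        while True:
            x, y = get_gaze()
            if math.hypot(x - anchor[0], y - anchor[1]) > radius_px:
                anchor, start = (x, y), time.monotonic()  # gaze moved: reset the timer
            elif time.monotonic() - start >= dwell_s:
                return anchor                             # held long enough: "click" here
            time.sleep(0.01)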
Foveated rendering is another neat idea. The fovea, the "high resolution" part of your retina, is surprisingly small: it captures a region of the world about the size of your thumbnail at arm's length, centered on the point of gaze, in color and detail. Outside that region, the visual input is increasingly blurry and colorless. It doesn't seem that way because your brain assembles percepts by making many eye movements (2-4 per second) and stitching the input together. One popular idea is to exploit this when rendering images (e.g., for VR): spend most of your cycles making the point of gaze pretty and provide just enough elsewhere to complete the illusion. Doing this obviously requires knowing exactly where a person is looking, and thus an eye tracker.
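As a toy illustration of the principle only (real foveated renderers work inside the GPU pipeline with variable-rate shading, not as a post-process blend like this), you could mix a sharp and a cheap version of a frame based on distance from the gaze point:

    # Toy foveated blend: full detail near the gaze point, cheap image elsewhere.
    import numpy as np

    def foveate(sharp, cheap, gaze_xy, fovea_px=150, falloff_px=300):
        """sharp, cheap: HxWx3 float arrays; gaze_xy: (x, y) in pixels."""
        h, w = sharp.shape[:2]
        ys, xs = np.mgrid[0:h, 0:w]
        dist = np.hypot(xs - gaze_xy[0], ys - gaze_xy[1])
        # Weight is 1 inside the "fovea" and ramps down to 0 over the falloff region.
        weight = np.clip(1.0 - (dist - fovea_px) / falloff_px, 0.0, 1.0)
        return weight[..., None] * sharp + (1.0 - weight[..., None]) * cheap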
Actually, for foveated rendering, knowing exactly where the person is looking is not good enough; you need to know where the person will be looking by the time the frame is rendered.
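The simplest form of that compensation would be to extrapolate the last couple of gaze samples forward by the expected render latency; real systems do fancier saccade landing-point prediction, but as a sketch:

    # Crude latency compensation: linear extrapolation from the last two samples.
    def predict_gaze(prev, curr, dt_sample_s, latency_s):
        """prev, curr: (x, y) gaze samples taken dt_sample_s apart."""
        vx = (curr[0] - prev[0]) / dt_sample_s
        vy = (curr[1] - prev[1]) / dt_sample_s
        return (curr[0] + vx * latency_s, curr[1] + vy * latency_s)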
I'm also very hyped about foveated rendering for ultra-wide monitors.
I was involved in some great research looking at the difference between how artists and non-artists look at images. Eye trackers were heavily used.
Outcome: artists are more likely to track away from obvious points of interest. So in the case of a portrait, the artist might be looking at a brick by the subject's feet, whilst everyone else's gaze would be fixed on the subject's eyes.
Looks like they are made for research and the price reflects that as well ($5,500). I guess they can, for example, be used to track test buyers to evaluate store layouts and similar applications.