
I once worked on a piece of lab equipment where I needed to fit a plate into a robotic dispenser and then click the button to "start program." I didn't want to keep switching my hands between the dispenser and the mouse, so I wired up a piano pedal to the left mouse button, and it improved the ergonomics significantly.



I've programmed AutoHotKey on my work PC to map pressing both shift keys at the same time to a double click of the left mouse button at the pointer's current location.
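
The gist is just binding the two-Shift combination to a double click at the current pointer position; a minimal AutoHotkey sketch (not my exact script) looks roughly like this:

    ; Minimal sketch: press both Shift keys together to double-click
    ; at the current pointer position. The ~ keeps normal Shift behaviour.
    ~LShift & RShift::Click, 2
    ~RShift & LShift::Click, 2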


Replying to myself to add:

So what I really need then is mouse cursor movement by eye tracking + both-shift-keys for left double click.

Why isn't eye tracking for mouse movement a thing yet?


It is, but the first-party software for it is awful. The videos you linked below are a pretty primitive version of what's currently available.

My own project: https://talonvoice.com (only requires a single eye tracker, has realtime precise movement similar to the eye+head projects below, and integrates with a very fast and powerful voice control system).

https://21clicks.net/share/-L5eU8SeIbUDcgsQa3_l - click Replay. This is completely hands-free using Talon's eye tracking and a noise recognizer (hiss to click). I consider what I have right now to be an unpolished tech demo, which will get significantly better as I cross many accuracy/speed TODOs off my list.

These projects require both an eye tracker and a head tracker (but are still fast and precise):

- https://github.com/trishume/PolyMouse

- https://github.com/trishume/FusionMouse

- http://precisiongazemouse.com/

Miscellaneous, not quite as well integrated:

- http://kinesicmouse.xcessity.at/

- http://iris.xcessity.at/


This looks interesting; I'll have a look later.

Thanks for reminding me of Dragon by Nuance. I first saw Dragon being used by a quadriplegic guy I used to work for. This was ~10 years ago; he had a bit of a speech impediment too, and it worked reasonably well.

I occasionally think about Dragon and wonder whether it might be useful in my 2D CAD and laser cutter workflow. It actually seems reasonably priced, so maybe I'll give it a whirl.


When I say "fast and powerful", I mean what I've built which can use Dragon as a backend, not Dragon itself. Their user-facing command system is very limited (single trigger phrase, bad support for passing arguments to commands, must pause after talking). I have continuous recursive command recognition (you can say hundreds of commands in a minute, my record for perfect accuracy is ~280) and a command system that is leagues beyond Dragon's (and it's free on top of Dragon).

I plan to build/support professional workflows like CAD. I'd love to hear about your needs. I strongly believe eye + voice can greatly surpass keyboard/mouse in many environments.


I've sent an email to the address listed in your profile. Thanks.


I think everyone with two monitors has lamented the lack of a 'focus-follows-eyes' mode. The problem is that you are often reading from window A while typing into window B, so you'd have to be able to switch in and out of FFE. And I don't know about you, but I would never remember to switch. Pretty sure that would defeat the purpose.

I'd love to see someone play with the UI though.


Ah, yeah, that's a very good point.

Perhaps wink-left and wink-right could be used as modifiers to prevent FFE, or whatever customisation the user prefers.


I worked on a project a few years ago to build a system that used eye-tracking for context-sensitive voice commands. Imagine being able to say "zoom in" and have the computer zoom in at the place you're looking. Unfortunately the eye-tracking was never accurate enough for it to provide a good experience.


Because your eyes are constantly moving. https://en.m.wikipedia.org/wiki/Saccade


So I did some clicking, and it turns out there are laptops, monitors, and a standalone peripheral available that appear to do eye tracking quite well.

https://tobiigaming.com/products/

https://www.youtube.com/watch?v=SYwd9Lt1ve4

https://www.youtube.com/watch?v=6mBlgcnqttg

Might have to get one of these...


I did something like this at work. In our internal ordering process we have to copy/paste over 10 data fields, so I made a box with an Arduino Micro and some buttons on it to handle the copy/paste for all the fields at once.
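
Conceptually the Micro just acts as a USB keyboard. A rough sketch of the idea (assumed wiring and key mapping, not the actual firmware from my box):

    // Rough sketch, not the actual firmware: the Micro enumerates as a USB
    // keyboard and each button replays one keyboard shortcut, so the repeated
    // copy -> switch window -> paste loop can be driven from the box.
    // Assumed wiring: pushbuttons between pins 2/3/4 and GND (internal pull-ups).
    #include <Keyboard.h>

    const int COPY_PIN  = 2;  // sends Ctrl+C
    const int PASTE_PIN = 3;  // sends Ctrl+V
    const int SWAP_PIN  = 4;  // sends Alt+Tab

    void sendCombo(uint8_t modifier, uint8_t key) {
      Keyboard.press(modifier);
      Keyboard.press(key);
      Keyboard.releaseAll();
    }

    void setup() {
      pinMode(COPY_PIN,  INPUT_PULLUP);
      pinMode(PASTE_PIN, INPUT_PULLUP);
      pinMode(SWAP_PIN,  INPUT_PULLUP);
      Keyboard.begin();
    }

    void loop() {
      if (digitalRead(COPY_PIN)  == LOW) { sendCombo(KEY_LEFT_CTRL, 'c');     delay(300); }
      if (digitalRead(PASTE_PIN) == LOW) { sendCombo(KEY_LEFT_CTRL, 'v');     delay(300); }
      if (digitalRead(SWAP_PIN)  == LOW) { sendCombo(KEY_LEFT_ALT,  KEY_TAB); delay(300); }
    }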



