Customizing the Assistive Context-Aware Toolkit (ACAT)
Hello to all the amazing open-source developers out there!
I've never written a line of code in my life. I'm here after visiting a friend with MS; sadly, she has been unable to express herself for almost two years. I can't even begin to imagine her isolation and frustration: her mind is still 100% there, while her body and vocal cords have failed her.
I came across ACAT quite by chance. It's an incredible achievement by the developers and an invaluable contribution to people living with these disabilities. Hats off to you all!

I've played around with the software and picked up a few 'bugs' for which I hope to find a solution. I also want to replace the F12 option with a finger-mounted IR sensor (similar to the cheek sensor used by Stephen Hawking), or to explore using a mouse click as the F12 trigger; in this specific case a webcam is not an option because of her limited facial movement. I would also like to introduce the software to others in our city with similar illnesses who are no longer able to communicate.
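To make the sensor idea concrete: my understanding is that a small microcontroller board with native USB support (an Arduino Leonardo or Micro, for example) can present itself to Windows as an ordinary keyboard and simply "type" F12 whenever the IR sensor fires, so ACAT itself would not need to change at all. Below is a rough sketch of what I mean, using the standard Arduino Keyboard library; the board choice, the pin number, and the sensor's active-low behaviour are my assumptions, not something I have built or tested:

```cpp
#include <Keyboard.h>   // standard Arduino library; requires a board with native USB (e.g. Leonardo/Micro)

const int SENSOR_PIN = 2;    // digital output of the IR proximity sensor (assumed wiring)
bool wasTriggered = false;   // previous reading, so F12 is sent only once per trigger

void setup() {
  pinMode(SENSOR_PIN, INPUT_PULLUP);
  Keyboard.begin();          // start acting as a USB keyboard
}

void loop() {
  // Many hobby IR proximity modules pull their output LOW when a finger is close (assumption).
  bool triggered = (digitalRead(SENSOR_PIN) == LOW);

  // On the rising edge of a trigger, tap F12 exactly as a real keyboard would.
  if (triggered && !wasTriggered) {
    Keyboard.press(KEY_F12);
    delay(50);
    Keyboard.release(KEY_F12);
  }

  wasTriggered = triggered;
  delay(10);                 // crude debounce / polling interval
}
```

The mouse-click idea could presumably work the same way in reverse: a small helper program on the PC that listens for a click and injects an F12 keypress. Is either approach sensible, or is there a cleaner way to hook a custom switch into ACAT's own input handling?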
Any help from developers on the ACAT project would be greatly appreciated.