Expression: A dyadic conversation aid using Google Glass for people who are blind or visually impaired
Expression assists people with visual impairments in perceiving social signals during a natural dyadic conversation. It performs the following tasks:
- Detect and track faces
- Detect facial and behavioral expressions (smile, yawn, sleepiness, etc.)
- Detect head movements (look up/down, look left/right, tilt left/right)
- Provide audio feedback to the user
Details can be found in this article https://web.archive.org/web/20190430120824id_/https://eudl.eu/pdf/10.4108/icst.mobicase.2014.257780
Built with the following tools and libraries:
- Google Glass
- Android Studio
- C++
- Visual Studio
- CLM-Z face tracker: https://github.com/TadasBaltrusaitis/CLM-framework
- OpenCV
To build and run:
- Build the Expression app from the "app" folder using Android Studio
- Install the app on the Google Glass
- Download CLM-Z and build the server code from the "server_code" folder using Visual Studio
- Run the Expression server ("mainExpression.cpp")
- Start the app on the Google Glass