
Expression: A dyadic conversation aid using Google Glass for people who are blind or visually impaired

Background

Expression assists people with visual impairments in perceiving social signals during a natural dyadic conversation. It performs the following tasks:

  1. Detect and track faces
  2. Detect facial and behavioral expressions (smile, yawn, sleepiness, etc.)
  3. Detect head movements (look up/down, look left/right, tilt left/right); see the sketch below
  4. Provide audio feedback to the user
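
The repository does not spell out the exact rules it uses, but as a rough illustration, head movements like those in task 3 can be derived from the head-pose angles (pitch, yaw, roll) that a tracker such as CLM-Z reports. The thresholds and function names in this sketch are assumptions for illustration, not the project's actual values.

    // Illustrative only: classify a coarse head movement from head-pose
    // angles (in degrees), e.g. as estimated by the CLM-Z tracker.
    // The 15-degree dead zone is an assumed value, not Expression's.
    #include <iostream>
    #include <string>

    std::string classifyHeadPose(double pitchDeg, double yawDeg, double rollDeg) {
        const double kThreshold = 15.0;  // assumed neutral-pose dead zone
        if (pitchDeg >  kThreshold) return "look down";
        if (pitchDeg < -kThreshold) return "look up";
        if (yawDeg   >  kThreshold) return "look right";
        if (yawDeg   < -kThreshold) return "look left";
        if (rollDeg  >  kThreshold) return "tilt right";
        if (rollDeg  < -kThreshold) return "tilt left";
        return "neutral";
    }

    int main() {
        // Example: a strong downward pitch reads as "look down".
        std::cout << classifyHeadPose(25.0, 2.0, -3.0) << std::endl;
        return 0;
    }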

System architecture

Details can be found in this article: https://web.archive.org/web/20190430120824id_/https://eudl.eu/pdf/10.4108/icst.mobicase.2014.257780
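
The requirements and build steps below suggest a client-server split: the Glass app captures video and speaks feedback, while a desktop server runs the CLM-Z tracker. The following sketch shows what a server-side per-frame loop might look like under that assumption; cv::VideoCapture stands in for frames streamed from the Glass, and analyzeFrame() is a placeholder, not the project's actual API.

    // Minimal sketch of a server-side per-frame loop, assuming the
    // client-server split described above. All names are illustrative.
    #include <opencv2/opencv.hpp>
    #include <iostream>
    #include <string>

    // Placeholder: the real server uses the CLM-Z / CLM-framework API here
    // to fit facial landmarks, estimate head pose, and derive a cue.
    std::string analyzeFrame(const cv::Mat& frame) {
        (void)frame;
        return "neutral";  // e.g. "smile", "yawn", "look left", ...
    }

    int main() {
        cv::VideoCapture cap(0);  // stand-in for frames streamed from Google Glass
        if (!cap.isOpened()) return 1;

        cv::Mat frame;
        while (cap.read(frame)) {
            std::string cue = analyzeFrame(frame);
            // In Expression, a cue like this would be returned to the Glass
            // app, which announces it to the user as audio feedback.
            std::cout << cue << std::endl;
        }
        return 0;
    }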

Requirements

  1. Google Glass
  2. Android Studio
  3. C++ compiler
  4. Visual Studio
  5. CLM-Z face tracker: https://github.com/TadasBaltrusaitis/CLM-framework
  6. OpenCV

How to use

  1. Build the Expression app from the "app" folder using Android Studio
  2. Install the app on the Google Glass
  3. Download CLM-Z and build the server code from the "server_code" folder using Visual Studio
  4. Run the built server ("mainExpression.cpp")
  5. Start the app on the Google Glass
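
Note that the server (step 4) should already be running before the Glass app is started (step 5), presumably so the app can connect to it when it launches.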
