SmartGlove Project

Project Overview

The SmartGlove is a wearable device that recognizes hand gestures and converts them into audio commands, and that also controls smart home automation systems. The project aims to develop a responsive, user-friendly solution that lets individuals with speech impairments communicate through hand gestures while providing intuitive gesture-based control over connected smart devices. Using flex sensors and an ESP32 microcontroller, the SmartGlove captures finger movements and translates them into specific audio outputs and control signals for home automation. The development process will be documented from initial design and prototyping through final implementation, giving a detailed overview of each stage.

[Image: SmartGloveHome]

Proposed Features

  • Gesture Recognition: Detect and interpret hand gestures using flex sensors (see the firmware sketch after this list).
  • Audio Output: Convert gestures into real-time audio feedback to assist with communication for speech-impaired individuals.
  • Smart Home Integration: Control smart home devices such as lights, fans, or media systems using recognized hand gestures.
  • Machine Learning: Incorporate a TensorFlow Lite machine learning model to enhance gesture recognition accuracy.
  • User-Friendly Interface: Enable easy customization of gestures and commands, making the SmartGlove adaptable to different user needs.
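
The loop below is a minimal sketch of how the glove firmware could turn raw flex-sensor readings into actions. It assumes a hypothetical wiring (sensors as voltage dividers on ESP32 ADC pins 34 and 35) and placeholder thresholds; the actual pin choices, calibration values, and gesture set would come out of the prototyping phase, and the rule-based mapping shown here is only a stand-in for the planned TensorFlow Lite model.

```cpp
// Minimal sketch. Assumptions: flex sensors wired as voltage dividers to
// ESP32 ADC pins 34 and 35; threshold values are placeholders to be
// calibrated per user. Demonstrates the basic gesture -> action flow only.
const int FLEX_INDEX_PIN  = 34;   // index-finger flex sensor (ADC1)
const int FLEX_MIDDLE_PIN = 35;   // middle-finger flex sensor (ADC1)
const int BEND_THRESHOLD  = 2200; // raw 12-bit ADC reading treated as "bent"

void setup() {
  Serial.begin(115200);
  analogReadResolution(12);       // ESP32 12-bit readings (0-4095)
}

void loop() {
  int indexRaw  = analogRead(FLEX_INDEX_PIN);
  int middleRaw = analogRead(FLEX_MIDDLE_PIN);

  bool indexBent  = indexRaw  > BEND_THRESHOLD;
  bool middleBent = middleRaw > BEND_THRESHOLD;

  // Simple rule-based gesture mapping; the ML model would replace this.
  if (indexBent && !middleBent) {
    Serial.println("Gesture: POINT -> play audio phrase 1 / toggle light");
  } else if (indexBent && middleBent) {
    Serial.println("Gesture: FIST -> play audio phrase 2 / toggle fan");
  }

  delay(100);                     // ~10 Hz sampling for the prototype
}
```

In the full firmware, the Serial prints would be replaced by calls to the audio output module and the smart home command sender.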

Proposed Technologies and Tools

  • Programming Languages: Python, C/C++
  • Libraries: TensorFlow Lite, OpenCV (for potential future expansions)
  • Hardware:
    • Flex Sensors: To detect the bending of fingers.
    • ESP32: As the main microcontroller for handling sensor input and wireless communication (see the Wi-Fi sketch after this list).
    • Audio Output Module: For converting gestures into audio commands.
  • Development Environment: VSCode, Arduino IDE, PlatformIO
  • Machine Learning Framework: TensorFlow Lite for gesture recognition.
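
As a sketch of the smart home integration path, the snippet below sends a command from the ESP32 over Wi-Fi using the standard ESP32 Arduino WiFi and HTTPClient libraries. The SSID, password, and hub URL are placeholders: the actual protocol (HTTP, MQTT, or a vendor API) has not been decided yet, so this only illustrates the general "gesture recognized, command sent" flow.

```cpp
// Smart-home side sketch. Assumptions: the home hub exposes a plain HTTP
// endpoint, and the URL, SSID, and password below are placeholders.
#include <WiFi.h>
#include <HTTPClient.h>

const char* WIFI_SSID = "your-ssid";          // placeholder credentials
const char* WIFI_PASS = "your-password";
const char* HUB_URL   = "http://192.168.1.50/api/light/toggle"; // hypothetical hub endpoint

void setup() {
  Serial.begin(115200);
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) {
    delay(500);                                // wait for the connection
  }
  Serial.println("WiFi connected");
}

void sendGestureCommand() {
  HTTPClient http;
  http.begin(HUB_URL);                         // target the hub endpoint
  int code = http.GET();                       // fire the toggle request
  Serial.printf("Hub responded with HTTP %d\n", code);
  http.end();
}

void loop() {
  // In the full firmware this would be called when a gesture is recognized.
  sendGestureCommand();
  delay(10000);
}
```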