- Group Name: 2b || !2b
- Team Members:
- Paniz Ojaghi
- Zhifan Li
- Ahraz Yousuf
The Sign Language Translator glove project, developed as part of the SE 101 course, aimed to create a device that translates sign language gestures into text and speech. We chose this idea because it balanced complexity with feasibility while letting us explore gesture-based translation.
Our research drew on Arduino guides covering sign language glove translators and flex sensor functionality. Flex sensors measure bending or deflection through a change in resistance; the guide "In-depth: Interfacing Flex Sensor with Arduino" from Last Minute Engineers explained their characteristics and how to use them with Arduino microcontrollers. Another key resource, a project on Arduino Project Hub, supplied the materials list, schematics, and code segments we adapted for our build.
The project involved both hardware and software components. Setting up the breadboard, connecting wires, and integrating components such as the flex sensors, Bluetooth module, and accelerometer were crucial. The Arduino Nano R3 microcontroller served as the central control, managing the flex sensors and power distribution. The software read each flex sensor, converted the readings to bend angles, and printed the corresponding letter whenever the angles matched predefined thresholds.
Each team member played a vital role in the project. Paniz led the group, contributed to the coding, and coordinated team activities. Zhifan handled the breadboard setup and wiring and contributed to software development. Ahraz provided research and materials and assisted in debugging, while Tristan contributed materials, aided in soldering, and worked on the report documentation.
The glove evolved from merely reading raw sensor values to printing letters and phrases, demonstrating steady improvement. While time constraints and a faulty flex sensor limited the range of letters it could translate, the system achieved its goal of translating common phrases such as "I love you."
Design trade-offs included adjusting the resistor configuration to get a usable current and voltage swing through the flex sensors. Displaying output on the Arduino IDE serial monitor instead of an LED screen was a pragmatic simplification. Limited resources also meant forgoing additional features such as an independent LED/OLED screen and a smaller breadboard.
Potential enhancements involve refining flex sensor accuracy for complete letter translation, improving aesthetics for a polished appearance, implementing an internal power supply, and exploring dual-glove systems for two-handed sign language. Language settings, voice modulation, and expanded phrase translation could further enhance functionality.
- satyamker80. "A Glove That Translates Sign Language into Text and Speech." Arduino Project Hub.
- Last Minute Engineers. "In-depth: Interfacing Flex Sensor with Arduino." Last Minute Engineers.