

International Journal of Scientific Engineering and Technology (ISSN: 2277-1581), Volume 2, Issue 5, pp. 336-341, 1 May 2013

Deaf-Mute Communication Interpreter


Anbarasi Rajamohan, Hemavathy R., Dhanalakshmi M.

Department of B.M.E., Sri Sivasubramaniya Nadar College of Engineering, Chennai, Tamil Nadu, India.

[email protected], [email protected], [email protected]

Abstract - Communication between a deaf-mute person and a normal person has always been a challenging task. The project aims to facilitate such people by means of a glove-based deaf-mute communication interpreter system. The glove is internally equipped with five flex sensors, tactile sensors and an accelerometer. For each specific gesture, the flex sensors produce a proportional change in resistance and the accelerometer measures the orientation of the hand. The processing of these hand gestures is done in an Arduino. The glove includes two modes of operation – a training mode to benefit every user and an operational mode. The concatenation of letters to form words is also done in the Arduino. In addition, the system includes a text-to-speech conversion (TTS) block which translates the matched gestures, i.e. text, to voice output.

Keywords—Deaf-Mute, communication, gesture, flex sensor, tactile sensors, accelerometer, TTS.

I. INTRODUCTION

Many millions of people in the world are deaf and mute. How often do we come across these people communicating with the hearing world? Communication between a deaf person and a hearing person poses a more serious problem than communication between a blind person and a sighted person, and it leaves them very little room for interaction, communication being a fundamental aspect of human life. Blind people can talk freely by means of ordinary spoken language, whereas the deaf and mute have their own manual-visual language known as sign language. Sign language is a non-verbal form of communication found amongst deaf communities around the world. These languages do not have a common origin and are hence difficult to interpret. A deaf-mute communication interpreter is a device that translates hand gestures to auditory speech.

A gesture in a sign language is a particular movement of the hands with a specific shape made out of them. Facial expressions also count toward the gesture at the same time. A posture, on the other hand, is a static shape of the hand used to indicate a sign.

Gesture recognition is classified into two main categories, vision based and sensor based [5][6]. The disadvantages of vision-based techniques include complex algorithms for data processing. Further challenges in image and video processing include varying lighting conditions, backgrounds, field-of-view constraints and occlusion. The sensor-based technique offers greater mobility.

The main aim of this paper is to present a system that can efficiently translate American Sign Language [1] gestures to both text and auditory voice. The interpreter makes use of a glove-based technique comprising flex sensors [7], tactile sensors [2] and an accelerometer. For each hand gesture made, the sensors produce a signal corresponding to the hand sign [3][9], and the controller matches the gesture with pre-stored inputs.

The device not only translates alphabets but can also form words from the gestures made. A training mode is offered in the device so that it fits every user and accuracy is increased. The device can also be made to translate larger gestures that require single hand movement.

II. MATERIALS AND METHODS

Figure 1. Block diagram of the system

Figure 1 shows the entire block diagram of the deaf-mute communication interpreter device. The controller used in the device is an Arduino. Five flex sensors are used to measure the degree of bending of the fingers. The flex sensors are interfaced with the controller through voltage divider circuits. The accelerometer is directly interfaced to the digital ports as it includes a signal conditioning circuit. Three tactile sensors are used to improve the accuracy of the letters M, N and T. The device contains two more tactile sensors for the training mode and for word formation; these are interfaced with the digital ports of the controller to feed in digital data. The Arduino processes the data for each particular gesture made.

The controller has two modes of operation – training mode and operational mode. In training mode the gestures are made by the user and the corresponding voltage levels are stored in EEPROM. In operational mode the data is compared with the predefined values and the matched gestures are sent to the text-to-speech conversion module. The module consists of a TTS block and a SpeakJet. The output is processed and heard via a speaker.
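The paper does not list the firmware, so the following Arduino-style sketch is only a minimal illustration of how the two modes could be organised; the pin assignments, tolerance window and EEPROM layout are assumptions, not the authors' implementation.

```cpp
// Minimal sketch of the training/operational flow (all pins and constants assumed).
#include <EEPROM.h>

const int FLEX_PINS[5] = {A0, A1, A2, A3, A4};  // assumed analog wiring of the five flex sensors
const int MODE_PIN     = 2;                     // assumed tactile switch: LOW selects training mode
const int NUM_GESTURES = 26;                    // letters A..Z
const int TOLERANCE    = 15;                    // assumed matching window in ADC counts

// Assumed EEPROM layout: gesture g occupies 5 bytes, each reading scaled from 0..1023 to 0..255.
void storeGesture(int g, const int reading[5]) {
  for (int i = 0; i < 5; i++) EEPROM.write(g * 5 + i, reading[i] / 4);
}

bool matchGesture(int g, const int reading[5]) {
  for (int i = 0; i < 5; i++) {
    int stored = EEPROM.read(g * 5 + i) * 4;
    if (abs(stored - reading[i]) > TOLERANCE) return false;
  }
  return true;
}

void setup() {
  pinMode(MODE_PIN, INPUT_PULLUP);
  Serial.begin(9600);
}

void loop() {
  int reading[5];
  for (int i = 0; i < 5; i++) reading[i] = analogRead(FLEX_PINS[i]);

  static int trainSlot = 0;
  if (digitalRead(MODE_PIN) == LOW) {
    // Training mode: record the current voltage levels for the next letter.
    storeGesture(trainSlot % NUM_GESTURES, reading);
    trainSlot++;
    delay(1000);                 // give the user time to form the next gesture
  } else {
    // Operational mode: compare the live reading with the stored levels.
    for (int g = 0; g < NUM_GESTURES; g++) {
      if (matchGesture(g, reading)) {
        Serial.print((char)('A' + g));   // the matched letter then goes to the TTS stage
        break;
      }
    }
    delay(100);
  }
}
```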
A. ARDUINO WITH BUILT-IN ATMEGA328

Arduino is an open-source platform based on a simple microcontroller board. The controller used in the device is the Arduino Duemilanove with an on-board ATmega328. The ATmega328 has 32 KB of on-chip flash memory for storing code, of which 2 KB is used by the bootloader. It also includes 2 KB of SRAM and 1 KB of EEPROM. The developed program is stored in the flash memory of the controller. The Arduino software also includes a serial monitor which allows data to be sent to or from the Arduino board.
B. FLEX SENSOR

Flex sensors are resistive carbon elements. When bent, the sensor produces a resistance output correlated with the bend radius [9]. The variation in resistance is approximately 10 to 30 kΩ: an unflexed sensor has a resistance of 10 kΩ and, when bent to 90°, the resistance increases to about 30 kΩ [3]. The sensor is about ¼ inch wide and 4-1/2 inches long.

Figure 2. Voltage divider circuit (flex sensor output fed to an analog port of the Arduino)

Vout = Vin [ R1 / (R1 + R2) ]
The sensor is incorporated in the device using a voltage divider network. A voltage divider is used to determine the output voltage across two resistances connected in series, i.e. it is basically a resistance-to-voltage converter. The fixed resistor and the flex sensor form a voltage divider which divides the input voltage by a ratio determined by the variable and fixed resistances.
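As a hedged illustration of this conversion, the snippet below turns a raw ADC count into a voltage and a flex resistance, assuming a 10-bit ADC, a 5 V reference, a 10 kΩ fixed resistor, and that the output is taken across the flex sensor (so that, as in Tables 1 and 2, the output voltage rises as the flex resistance rises); none of these values are taken from the authors' code.

```cpp
// Converting one flex-sensor ADC reading into voltage and resistance.
// Assumed divider: Vout = Vin * Rflex / (Rfixed + Rflex).
const float VIN     = 5.0;      // ADC reference voltage (assumed)
const float R_FIXED = 10000.0;  // fixed divider resistor in ohms (assumed)

float countsToVolts(int counts) {
  return counts * VIN / 1023.0;            // 10-bit ADC: 0..1023 counts
}

float voltsToFlexOhms(float vout) {
  return R_FIXED * vout / (VIN - vout);    // divider equation solved for Rflex
}
```

With these example values, a digital reading of 748 counts corresponds to roughly 3.66 V, matching the voltage column of Table 1; the resistance returned depends on the fixed resistor actually fitted.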
C. TACTILE SENSORS

A tactile switch, also known as a momentary button or push-to-make switch, is commonly used for inputs and controller resets. These switches create a temporary electrical connection when pressed. One pin is supplied with +5 V and the other pin is grounded. The switch is connected to a digital pin of the Arduino; the output is pulled to ground while the switch is pressed and is high otherwise.
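A minimal sketch of reading such an active-low switch is shown below; the pin number and the use of the ATmega328's internal pull-up (rather than an external resistor) are assumptions for illustration.

```cpp
// Reading one tactile switch as an active-low digital input (assumed pin).
const int TACTILE_PIN = 7;            // assumed digital pin

void setup() {
  pinMode(TACTILE_PIN, INPUT_PULLUP); // line stays HIGH until the button pulls it LOW
}

bool tactilePressed() {
  return digitalRead(TACTILE_PIN) == LOW;
}

void loop() {
  if (tactilePressed()) {
    // e.g. disambiguate M/N/T, or mark the end of a word
    delay(50);                        // simple debounce
  }
}
```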
D. ACCELEROMETER

Accelerometers are used for tilt sensing. They measure both static and dynamic acceleration. The sensor has a g-select input which switches the accelerometer between ±1.5 g and ±6 g measurement ranges. It has a signal conditioning unit with a 1-pole low-pass filter, temperature compensation, self-test, and 0 g-detect, which detects linear free fall.
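The paper does not give the accelerometer wiring, so the sketch below is only a guess at a typical configuration: the g-select and sleep lines driven from digital outputs and the three axis outputs sampled as raw counts; the pin numbers and the low-range setting are assumptions.

```cpp
// Configuring the accelerometer range and reading the three axes (assumed wiring).
const int GSEL_PIN  = 8;                      // g-select: LOW assumed to mean +/-1.5 g
const int SLEEP_PIN = 9;                      // sleep/enable line (assumed)
const int X_PIN = A0, Y_PIN = A1, Z_PIN = A2; // assumed axis outputs

void setup() {
  pinMode(GSEL_PIN, OUTPUT);
  pinMode(SLEEP_PIN, OUTPUT);
  digitalWrite(GSEL_PIN, LOW);    // stay in the more sensitive range for tilt sensing
  digitalWrite(SLEEP_PIN, HIGH);  // keep the sensor awake
  Serial.begin(9600);
}

void loop() {
  // Raw counts of the same kind as the averages reported in Tables 3 to 5.
  Serial.print(analogRead(X_PIN)); Serial.print(' ');
  Serial.print(analogRead(Y_PIN)); Serial.print(' ');
  Serial.println(analogRead(Z_PIN));
  delay(100);
}
```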


E. TEXT TO SPEECH CONVERSION

The translator contains a text-to-speech module that allows the device to speak the gesture once it has been recognised. The encoder (TTS256) and synthesiser (SpeakJet) pair provides the text-to-speech capability without loading the microcontroller. The text output of the ATmega328 is converted to sound by this TTS stage. The TTS256 is a 28-pin DIP, 8-bit microprocessor programmed with 600 letter-to-sound rules. The built-in algorithm allows real-time translation of English ASCII characters to allophone addresses. It is used together with the SpeakJet to perform the text-to-speech conversion.

Figure 3. Block diagram of the speech synthesizer

The principle behind the TTS256 is that it accepts serial ASCII character data and translates it into syllabic sounds. The SpeakJet generates an audio signal for the allophones using five sine-synthesis generators. A ready signal is sent from the SpeakJet to the TTS256 to indicate its ready state. The output from the SpeakJet is amplified using an LM386 audio amplifier, and the voice output is delivered through a speaker.
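Since the TTS256 accepts plain ASCII over a serial line, handing a recognised word to it can be as simple as the sketch below; the SoftwareSerial pins, baud rate and line terminator are assumptions and should be checked against the module's documentation.

```cpp
// Handing recognised text to the TTS256 over a software serial port (assumed pins/baud).
#include <SoftwareSerial.h>

SoftwareSerial tts(10, 11);    // RX, TX - assumed wiring to the TTS256

void setup() {
  tts.begin(9600);             // assumed baud rate
}

void speakText(const char *text) {
  tts.print(text);             // ASCII text in
  tts.print('\r');             // assumed end-of-line marker to trigger speech
}

void loop() {
  speakText("HELLO");          // the TTS256 turns this into allophone codes for the SpeakJet
  delay(5000);
}
```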

III. RESULTS AND DISCUSSION

The evaluation of the deaf-mute communication interpreter was carried out with ten beginners for the letters 'A', 'B', 'C', 'D', 'F', 'I', 'L', 'O', 'M', 'N', 'T', 'S' and 'W'. Word formation from letters was also performed using an end signal.

The hand glove is mounted with five flex sensors, an accelerometer and tactile sensors. Table 1 shows the output voltage across a voltage divider network with a constant resistance of 22 kΩ, the digital value and the corresponding resistance for different bending angles of the 2.5″ flex sensors mounted on the thumb and pinky fingers.

TABLE 1: RESISTANCE AND CORRELATED BENDING – FLEX 2.5″

ANGLE (DEGREES)   VOLTAGE (V)   DIGITAL VALUE   RESISTANCE (OHMS)
0                 3.655         748             27200
10                3.725         764             29215.686
20                3.797         777             31585.366
30                3.866         791             34094.828
45                3.968         812             38483.412
60                4.071         833             43842.105
75                4.174         854             50532.544
90                4.276         875             59121.622
The value of resistance increases with the degree of bending, as shown in Figure 4, and the output voltage of the voltage divider network also increases with the resistance, as shown in Figure 5.

Figure 4: Plot of degree vs. resistance of flex 2.5″

Figure 5: Plot of voltage vs. resistance of flex 2.5″

Table 2 shows the output voltage across a voltage divider network with a constant resistance of 10 kΩ, the digital value and the corresponding resistance for different bending angles of the 4.5″ flex sensors mounted on the index, middle and ring fingers.

TABLE 2: RESISTANCE AND CORRELATED BENDING – FLEX 4.5″

ANGLE (DEGREES)   VOLTAGE (V)   DIGITAL VALUE   RESISTANCE (OHMS)
0                 1.720         350             11536.585
10                1.842         377             12832.172
20                1.955         400             14124.795
30                2.072         424             15572.618
45                2.250         462             18000
60                2.434         497             20686.277
75                2.624         534             24296.296
90                2.776         568             27460.432

The value of resistance again increases with the degree of bending, as shown in Figure 6, and the output voltage of the voltage divider network also increases with the resistance, as shown in Figure 7.

Figure 6: Plot of degree vs. resistance of flex 4.5″

Figure 7: Plot of voltage vs. resistance of flex 4.5″


Figures 8 to 15 show the plots of the gestures 'A', 'B', 'C', 'D', 'F', 'I', 'L', 'O', 'S' and 'W' obtained using the flex sensors, with time along the X-axis and the digital value along the Y-axis. The red line indicates the signal from the thumb, green the signal from the index finger, blue the signal from the middle finger and black the signal from the pinky finger. The plot is observed to be unique for each gesture; the plots were generated using MATLAB.
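One simple way to capture such traces is to stream time-stamped readings from the glove over the serial port and save them to a file for plotting; the sketch below is only an illustration, and the pin order matching the thumb-to-pinky colour coding is an assumption.

```cpp
// Streaming time-stamped flex readings for offline plotting (e.g. in MATLAB).
const int FLEX_PINS[5] = {A0, A1, A2, A3, A4};  // assumed order: thumb .. pinky

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.print(millis());                       // time for the X-axis
  for (int i = 0; i < 5; i++) {
    Serial.print(',');
    Serial.print(analogRead(FLEX_PINS[i]));     // digital value for the Y-axis
  }
  Serial.println();
  delay(20);                                    // roughly 50 samples per second
}
```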

Figure 8. Plot of Gesture "A"

Figure 9. Plot of Gesture "C"

Figure 10. Plot of Gesture "D"

Figure 11. Plot of Gesture "F"

Figure 12. Plot of Gesture "I"

Figure 13. Plot of Gesture "O"


Figure 14. Plot of Gesture "S"

Figure 15. Plot of Gesture "W"

Tables 3 to 5 show the average values taken over ten persons from the accelerometer mounted on the wrist for 27 gestures.

TABLE 3

      A    B    C    D    E    F    G    H    I
X    230  243  238  246  241  242  251  262  232
Y    314  308  304  307  308  300  355  364  300
Z    134  128  129  124  126  127  119  123  134

TABLE 4

      J    K    L    M    N    O    P    Q    R
X    232  231  239  229  227  237  259  257  236
Y    334  327  315  302  302  306  317  318  307
Z    134  136  127  130  135  129  118  115  130

TABLE 5

      S    T    U    V    W    X    Y    Z
X    234  227  242  240  243  243  246  244
Y    303  300  300  304  301  308  307  306
Z    130  135  127  127  127  125  125  126
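Tables 3 to 5 give one average (X, Y, Z) triple per letter, so a straightforward way to use them is to pick the stored triple closest to the current reading. The sketch below illustrates that nearest-neighbour idea only; it is not the authors' code, the reference table is abbreviated to four letters, and in the real device the result would be combined with the flex-sensor match.

```cpp
// Classifying wrist orientation against stored average (X, Y, Z) triples (Tables 3-5).
struct AccelRef { char letter; int x, y, z; };

// A few entries copied from Table 3 for illustration; the device would store all of them.
const AccelRef REFS[] = {
  {'A', 230, 314, 134},
  {'B', 243, 308, 128},
  {'C', 238, 304, 129},
  {'D', 246, 307, 124},
};
const int NUM_REFS = sizeof(REFS) / sizeof(REFS[0]);

char nearestLetter(int x, int y, int z) {
  long best = 2147483647L;                   // start with the largest possible distance
  char letter = '?';
  for (int i = 0; i < NUM_REFS; i++) {
    long dx = x - REFS[i].x;
    long dy = y - REFS[i].y;
    long dz = z - REFS[i].z;
    long d2 = dx * dx + dy * dy + dz * dz;   // squared Euclidean distance
    if (d2 < best) { best = d2; letter = REFS[i].letter; }
  }
  return letter;
}
```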

The letters are concatenated to form words. The output from the serial monitor of the Arduino is depicted in Figures 16 and 17.

Figure 16: Output of serial monitor

Figure 17: Output of serial monitor
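Concatenation with an end signal can be expressed on the Arduino as a small character buffer that is flushed when the end-signal tactile switch is pressed; the sketch below is only an illustration, and the buffer size, pin number and the hypothetical speakText() hand-off are assumptions.

```cpp
// Concatenating recognised letters into a word until the end-signal switch is pressed.
const int END_PIN = 6;            // assumed tactile switch marking the end of a word
char wordBuf[16];                 // assumed maximum word length
int  wordLen = 0;

void setup() {
  pinMode(END_PIN, INPUT_PULLUP);
  Serial.begin(9600);
}

void addLetter(char c) {
  if (wordLen < (int)sizeof(wordBuf) - 1) wordBuf[wordLen++] = c;
}

void loop() {
  // ... gesture recognition produces a letter and calls addLetter(letter) ...
  if (digitalRead(END_PIN) == LOW && wordLen > 0) {
    wordBuf[wordLen] = '\0';
    Serial.println(wordBuf);      // the word as it appears on the serial monitor
    // speakText(wordBuf);        // hypothetical hand-off to the TTS stage
    wordLen = 0;
    delay(300);                   // crude debounce of the end-signal switch
  }
}
```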


The device is also designed to record the gesture values automatically for each user through the training mode prior to use of the device. The LCD output of the training mode and the digitised voltage levels of the flex sensors are shown in Figures 18 to 20.

Figure 18: Training Mode "Gesture A"

Figure 19: Training Mode "Gesture C"

Figure 20: Training Mode "Gesture L"

The overall gesture recognition for the letters and words showed an accuracy of about 90 percent.

IV. CONCLUSION
The project proposes a translational device for deaf-mute people using glove technology. The proposed technique has enabled the placement of five flex sensors, five tactile sensors and an accelerometer on a glove. The results demonstrate that the sensor-glove design with tactile sensors helps to reduce the ambiguity among gestures and shows improved accuracy. Further, the device will be an apt tool for the deaf-mute community to learn gestures and words easily. The project can be enhanced to include two or more accelerometers to capture the orientation of the hand movement once the gesture is made. This will expand the capability to translate larger gestures.
10.41081/CST.PERVAS/VEHEALTH 2009

V. REFERENCES

i. Kunal Kadam, Rucha Ganu, Ankita Bhosekar and S. D. Joshi, American Sign Language Interpreter, IEEE Fourth International Conference on Technology for Education, 2012.

ii. Netchanok Tanyawiwat and Surapa Thiemjarus, Design of an Assistive Communication Glove using Combined Sensory Channels, Ninth International Conference on Wearable and Implantable Body Sensor Networks, 2012.

iii. Nazrul H. Adnan, Khairunizam Wan, Shahriman A. B., M. Hazwan Ali, M. Nasir Ayob and Azri A. Aziz, Development of Low Cost GloveMAP Based on Fingertip Bending Tracking Techniques for Virtual Interaction, International Journal of Mechanical & Mechatronics Engineering IJMME-IJENS, Vol. 12, No. 04.

iv. Syed Faiz Ahmed, Syed Muhammad Baber Ali, Sh. Saqib and Munawwar Qureshi, Electronic Speaking Glove for Speechless Patients: A Tongue to a Dumb, Proceedings of the 2010 IEEE Conference on Sustainable Utilization and Development in Engineering and Technology, Universiti Tunku Abdul Rahman, Faculty of Engineering, Kuala Lumpur, Malaysia, 20-21 November 2010.

v. Kirsten Ellis and Jan Carlo Barca, Exploring Sensor Gloves for Teaching Children Sign Language, Faculty of Information Technology, Monash University, Clayton Campus, VIC 3800, Australia, 20 June 2012.

vi. Ajinkya Raut, Vineeta Singh, Vikrant Rajput and Ruchika Mahale, Hand Sign Interpreter, The International Journal of Engineering and Science (IJES), Volume 1, Issue 2, pp. 19-25, 2012.

vii. Tan Tian Swee, Sh-Hussain Salleh, A. K. Ariff, Chee-Ming Ting, Siew Kean Seng and Leong Seng Huat, Malay Sign Language Gesture Recognition System, International Conference on Intelligent and Advanced Systems, 2007.

viii. Luis M. Borges, Norberto Barroca, Fernando J. Velez and Antonio S. Lebres, Smart-Clothing Wireless Flex Sensor Belt Network for Foetal Health Monitoring, 2009.

ix. Supawadee Saengsri, Vit Niennattrakul and Chotirat Ann Ratanamahatana, TFRS: Thai Finger-Spelling Sign Language Recognition System, ©2012 IEEE.

x. V-ris Jaijongrak, Sarit Chantasuban and Surapa Thiemjarus, Towards a BSN-based Gesture Interface for Intelligent Home Applications, ICROS-SICE International Joint Conference 2009, Fukuoka International Congress Center, Japan, 18-21 August 2009.
