Jambhulkar. R. Kunal, International Journal of Advance Research, Ideas and Innovations in Technology.
© 2017, IJARIIT All Rights Reserved Page | 18
ISSN: 2454-132X
Impact factor: 4.295
(Volume3, Issue1)
Available online at: www.ijariit.com
MEMS Sensor Based Approach for Gesture Recognition to
Control Media in Computer
Kunal R. Jambhulkar
Department of Embedded System & Computing,
Nagpur Institute of Technology, India
krjambhulkar13@gmail.com
Abstract: Gesture recognition is the process of identifying and interpreting meaningful movements of the arms, hands, face, or head. It is one of the most important aspects of the human-computer interface, and research in the field continues because of its potential for natural user interfaces. Gesture recognition remains an active area of research for engineers and scientists, and industry keeps working on implementations that are trouble-free, natural, and easy to handle. This paper proposes a method that uses motion sensors to interpret hand motion and map it to various operations in a virtual interface. Micro-Electro-Mechanical Systems (MEMS) accelerometers capture the dynamic hand gesture; the sensor readings are collected by a microcontroller and transmitted wirelessly to a computer, where the data are processed with various algorithms.
Keywords: Gesture Recognition, MEMS, Teensy, Accelerometer.
I. INTRODUCTION
Gesture recognition is one of the important tools in the field of human-computer interaction (HCI). As technology advances, researchers continue to look for intelligent, easy, and more natural ways for humans to interact with computers. Several techniques have been introduced for human-computer interaction, such as face detection, speech recognition, and motion gestures. Hand gestures provide an intuitive and natural form of communication for humans. A gesture recognition system interprets human gestures as commands, which can then be used to control devices.
There are two main techniques in gesture recognition.
A. Glove based Approach
A glove-based gesture recognition system consists of sensors mounted on a glove for motion capture, a microcontroller for information processing, and a power supply. The wearable glove senses the orientation of the user's hand along with its motion. The user is required to wear additional equipment, which can feel inconvenient and can interfere with the interaction itself; because of this complicated arrangement, such systems can be unsuitable for spontaneous interaction.
B. Vision-based Approach
Vision-based techniques use visual input, such as a camera, to capture the gesture or expression used by the gesture recognition system. Vision-based techniques avoid the drawbacks of glove-based techniques, but they have problems of their own. Portability is an issue for most vision-based systems, which require fixed placement of the video cameras. Video processing also has several problems: it is highly dependent on lighting conditions, camera settings, and the environment.
In our proposed system we follow the glove-based gesture recognition approach because of its high accuracy and fast reaction speed. To minimize the wearable part of the system we use the Teensy USB microcontroller development board. It is a small board with an onboard microcontroller and analog-to-digital converter, and it uses very little power (3.3 V to 5 V supply), which makes it well suited to our system. We also use six MEMS accelerometers to obtain complete orientation data for the hand, and one flex sensor for clicking operations. A 2.4 GHz wireless transceiver (nRF24L01+) makes the system wireless. The receiver module is attached to the computer, where the sensor data are processed further with various algorithms and the related tasks are performed.
II. LITERATURE SURVEY
This section describes some of the related work already done in the field of gesture recognition. Yikai Fang, Kongqiao Wang, Jian Cheng and Hanqing Lu [1] proposed a real-time hand gesture recognition system. In this system, a particular gesture is required to trigger hand detection, which is followed by tracking. The hand is then segmented using motion and color cues, and
the method is applied to gesture-based image browsing. It locates hands without a separate segmentation mechanism, and the classifier is learned from a small set of image samples, so generalization is limited.
S. Zhou, Z. Dong, W. J. Li, and C. P. Kwong [2] proposed a system in which a Micro Inertial Measurement Unit (MIMU) based on Micro-Electro-Mechanical Systems (MEMS) sensors captures the motion information produced when a user writes alphabets in the air. The MIMU captures the three-dimensional accelerations and angular velocities of the hand during writing. A self-organizing map (SOM) is used as the character recognition method, training on and classifying data transformed by the discrete cosine transform (DCT). However, SOM has some limitations that affect its classification results, such as sensitivity to the data input sequence and the signal level.
Siddharth Swarup Rautaray and Anupam Agrawal [3] proposed computer vision-based gesture recognition techniques and developed an inexpensive vision-based device to control the VLC media player with hand gestures. The application consists of a computational module that applies Principal Component Analysis to the hand gesture information, finds the feature vectors of the gesture, and saves them in memory. Gesture identification is done with the K-Nearest Neighbour algorithm. The application is not very robust in the recognition phase; robustness could be increased by applying algorithms that reduce noise and motion blur. To control VLC, the application uses VLC's global keyboard shortcuts, generating the keyboard events for those shortcuts with the keybd_event() function, which is not an elegant way of controlling an application.
Ruize Xu, Shengli Zhou and Wen J. Li [4] proposed three different gesture recognition models capable of recognizing seven hand gestures, i.e., up, down, left, right, tick, circle, and cross, based on the gesture information provided by MEMS triaxial accelerometers. The accelerations of a hand in motion in three perpendicular planes are detected by three accelerometers and transmitted to a PC over a Bluetooth transceiver. An automatic gesture segmentation algorithm is developed to identify individual gestures in a sequence. The segmentation algorithm has some limitations, such as inaccurate detection of the terminal points of gestures, which limits the accuracy of the system.
Chetana S. Ingulkar and A. N. Gaikwad [5] proposed a real-time Human-Computer Interaction (HCI) system based on a hand data glove and a K-NN classifier for gesture recognition. The gestures classified are categorized as clicking, rotating, dragging, pointing, and the idle position. Two types of sensors are used, a flex sensor and an accelerometer, which are completely different from each other and therefore make it complex to calculate and recognize a gesture reliably.
Meenaakumari M. and M. Muthulakshmi [6] proposed a portable gesture recognition system developed around a trajectory recognition algorithm. The portable device consists of a 3-axis accelerometer, a microcontroller, and a Zigbee wireless transceiver module. Users can use this portable system to write digits and make hand gestures at normal speed. The limitation of the proposed trajectory recognition algorithm is that it can only identify a letter or a number drawn with a single stroke.
R. Suriya and V. Vijayachamundeshwari [7] discussed work done in the area of hand gesture recognition using various methods such as the Hidden Markov Model, simple mouse control, and MEMS accelerometers. They described hand detection methods that locate the hand in the pre-processed image, which is the main step in gesture recognition.
O. Sidek and M. A. Hadi [8] proposed a wireless Bluetooth hand gesture recognition system using six 3-axis accelerometers embedded in a glove, with a database of samples stored on a computer. The system can identify any sampled data saved in the database while keeping the device portable and mobile through wireless Bluetooth technology. The paper analyzes gesture data such as static data and dynamic data and discusses average recognition rates. However, the system only recognizes gestures from the existing set in the database and is not dedicated to a particular application.
III. PROPOSED WORK
This paper proposes a wireless hand gesture recognition system using six 3-axis accelerometers and a flex sensor embedded in a glove, together with a media player application on the computer that can be controlled by making gestures with the glove. The system can recognize any sampled data saved in the database while giving the user maximum portability and mobility through wireless technology.
The system has six 3-axis accelerometers, one on each finger and one on the back of the palm, integrated into the glove to detect hand positions and motions. One flex sensor is also used to provide a clicking operation for the media player. All the accelerometers are connected to a microcontroller, and the raw readings are mapped and arranged in an array before being transferred serially to a wireless module. Data acquired by the computer from the wireless module are saved in a database called the gesture library by means of a graphical user interface (GUI). The GUI is created to ease the collection of sample data and can also be used to recognize gestures from the glove. The recognition system returns the recognized gesture with the highest probability score. According to the gesture pattern detected by the recognition system, the appropriate commands are issued to the media player to control its various functions.
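To make the data flow concrete, the sketch below shows one possible layout for the array of raw readings mentioned above, assuming each channel is kept as a 16-bit ADC count; the structure name and field layout are illustrative and not fixed by the design.

```cpp
#include <stdint.h>

// One sample frame from the glove: 6 accelerometers x 3 axes plus 1 flex
// sensor, each stored as a raw 16-bit ADC count (19 values, 38 bytes).
struct GlovePacket {
  uint16_t accel[6][3];  // [sensor][axis]: X, Y, Z for each of the 6 ADXL335s
  uint16_t flex;         // flex sensor reading used for the click gesture
  uint16_t sequence;     // frame counter, useful for spotting dropped frames
};
```

Since an nRF24L01+ payload is limited to 32 bytes, a frame of this size would have to be split across two radio packets, or the samples scaled down to 8 bits, before transmission.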
The components selected for the proposed system are as follows.
A. Accelerometer
Fig. 1: ADXL335 accelerometer. Fig. 2: ADXL335 circuit diagram.
Accelerometers are sensors that sense acceleration. To obtain analog input values we selected the ADXL335 analog 3-axis accelerometer. The ADXL335 is a small, thin, low-power, 3-axis accelerometer with signal-conditioned voltage outputs. The sensor measures acceleration over a range of ±3 g. It can measure static acceleration in tilt-sensing applications as well as dynamic acceleration arising from motion, shock, or vibration. It draws extremely little power (about 320 µA), which suits our battery-powered application.
Because it is a 3-axis accelerometer it senses acceleration in the three dimensions X, Y, and Z at the same time, which gives a better basis for detecting hand movement.
The user can select the bandwidth of the accelerometer using the CX, CY, and CZ capacitors at the X, Y, and Z output pins. Bandwidths can be selected to suit the application, with a range of 0.5 Hz to 1600 Hz for the X and Y axes and 0.5 Hz to 550 Hz for the Z axis.
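The sketch below shows how a raw ADC count from one ADXL335 axis could be converted to acceleration in g. The zero-g level (roughly half the supply voltage) and the sensitivity (roughly 330 mV/g at a 3.3 V supply) are typical datasheet values assumed here, not measured figures from our prototype.

```cpp
#include <stdint.h>

// Convert a raw ADC count from one ADXL335 axis into acceleration in g.
// Assumptions (nominal datasheet values):
//   - 3.3 V supply, ratiometric output
//   - zero-g output at roughly Vs/2 = 1.65 V
//   - sensitivity of roughly 330 mV/g at a 3.3 V supply
//   - 16-bit ADC referenced to 3.3 V (Teensy analogReadResolution(16))
float countsToG(uint16_t counts) {
  const float vref        = 3.3f;            // ADC reference voltage (V)
  const float zeroG       = vref / 2.0f;     // nominal 0 g output (V)
  const float sensitivity = 0.330f;          // nominal volts per g
  float volts = (counts / 65535.0f) * vref;  // ADC counts -> volts
  return (volts - zeroG) / sensitivity;      // volts -> g
}
```

For the bandwidth-setting capacitors, the ADXL335 datasheet gives approximately f(-3 dB) = 1 / (2π · 32 kΩ · C), so a 0.1 µF capacitor on an output pin yields a bandwidth of about 50 Hz.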
B. Teensy
Fig. 3: Teensy 3.2 pin diagram.
The Teensy board is a complete USB-based microcontroller development system in a very small package, capable of implementing many types of projects. It uses a 32-bit ARM Cortex-M4 core (ARMv7-M architecture), has on-chip 16-bit analog-to-digital converters (ADCs), and operates from 3.3 V to 5 V.
The Teensy comes pre-flashed with a boot loader and is programmed through the Teensy Loader application over the onboard USB connection, so no external programmer is needed. It can be programmed in C using any editor and toolchain, or by installing Teensyduino, an add-on for the Arduino IDE, and writing Arduino sketches for the Teensy.
The main features of the Teensy that make it suitable for our system are:
1) Dimensions: the board measures 1.4 x 0.7 in (about 35 x 18 mm), small enough to be mounted on the glove and handled easily.
2) Power: it runs from 3.3 V to 5 V, so it suits a battery-based system. It also has an on-chip voltage regulator that provides a regulated 3.3 V at up to 100 mA, which is sufficient to power all the sensors and the wireless module of our system.
3) Microcontroller: it has a 32-bit ARM Cortex-M4 core (ARMv7-M architecture), which is efficient and dependable.
4) ADC: it has two on-chip 16-bit analog-to-digital converters (ADCs), which our system needs to convert the analog sensor values to digital.
5) Analog inputs: it has 21 analog inputs with 16-bit resolution, the most important requirement of our system, which must connect 6 accelerometers with 3 channels each plus 1 flex sensor, 19 channels in total (see the acquisition sketch after this list).
6) Programming: it is easy to program over the onboard USB connection using any C compiler or the Arduino-based Teensyduino IDE.
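The following is a minimal Teensyduino-style sketch of the acquisition loop that reads all 19 channels each cycle. The pin assignments (A0 through A18), the averaging setting, and the 100 Hz sample rate are placeholders that depend on how the glove is actually wired; they are assumptions, not fixed design values.

```cpp
#include <Arduino.h>

// Hypothetical wiring: 6 accelerometers x 3 axes on A0..A17, flex sensor on A18.
const uint8_t ACCEL_PINS[6][3] = {
  {A0, A1, A2},   {A3, A4, A5},   {A6, A7, A8},
  {A9, A10, A11}, {A12, A13, A14},{A15, A16, A17}
};
const uint8_t FLEX_PIN = A18;

uint16_t frame[19];                 // 18 accelerometer channels then the flex channel

void setup() {
  analogReadResolution(16);         // use the Teensy 3.2's 16-bit ADC mode
  analogReadAveraging(4);           // light hardware averaging to reduce noise
}

void loop() {
  size_t i = 0;
  for (uint8_t s = 0; s < 6; s++)       // each accelerometer
    for (uint8_t a = 0; a < 3; a++)     // X, Y, Z axes
      frame[i++] = analogRead(ACCEL_PINS[s][a]);
  frame[i] = analogRead(FLEX_PIN);

  // frame[] is now ready to be handed to the wireless transmitter.
  delay(10);                            // roughly 100 frames per second
}
```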
C. Flex Sensor
Fig. 4: A flex sensor. Fig. 5: Basic flex sensor circuit.
Flex sensors are passive resistive devices that can be used to detect bending in one direction. They were popularized by the Nintendo Power Glove, where they served as a gaming interface. The change in resistance depends on how much the sensor is bent: as the bend angle increases, the resistance increases. A flat, unflexed sensor has a resistance of 10 kΩ; as the sensor is bent the resistance gradually changes, and when flexed all the way it rises to 20 kΩ.
In our system the flex sensor is connected in series with a 10 kΩ resistor to form a voltage divider. The voltage at the midpoint of the divider is fed to an analog input of the Teensy, which detects changes in bending from the received voltage and converts it to digital form with its built-in ADC.
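A small sketch of how the divider voltage could be turned back into a resistance and a click decision is shown below. The orientation of the divider (fixed resistor on the ground side), the 3.3 V supply, and the 15 kΩ click threshold (roughly halfway between the flat and fully bent values quoted above) are assumptions for illustration.

```cpp
#include <stdint.h>

const float VREF  = 3.3f;      // divider supply and ADC reference (V), assumed
const float R_FIX = 10000.0f;  // fixed series resistor (ohms)

// With the flex sensor on the supply side and the fixed resistor to ground:
//   Vout = VREF * R_FIX / (R_flex + R_FIX)
//   =>  R_flex = R_FIX * (VREF - Vout) / Vout
float flexResistance(uint16_t counts) {
  float vout = (counts / 65535.0f) * VREF;   // 16-bit ADC count -> volts
  if (vout < 0.01f) vout = 0.01f;            // guard against divide-by-zero
  return R_FIX * (VREF - vout) / vout;
}

bool isClick(uint16_t counts) {
  return flexResistance(counts) > 15000.0f;  // assumed click threshold (ohms)
}
```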
D. Wireless Module
Fig. 6: An nRF24L01+ wireless module.
The nRF24L01+ is an ultra-low-power wireless transceiver module operating in the 2.4 GHz band. The module is configured over a Serial Peripheral Interface (SPI), in which data travel in both directions at once on separate lines. The nRF24L01+ has internal FIFOs that ensure smooth data flow and supports six data pipes for multi-receiver operation. The module has a built-in state machine that controls the transitions between its operating modes, and the configuration registers are accessible in all operating modes. It uses GFSK modulation, and the user can configure parameters such as the frequency channel, output power, and transmission rate. The nRF24L01+ supports data rates of 250 kbps, 1 Mbps, and 2 Mbps.
Two wireless modules are used in our system: one acts as the transmitter and sits on the glove itself, and the second acts as the receiver and is connected to the computer. The sensor data are transferred from the sensor glove to the computer over this wireless link.
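As an illustration of the glove-side transmitter, the sketch below uses the open-source RF24 Arduino library, which is one possible driver choice rather than a fixed part of the design; the CE/CSN pins and the pipe address are placeholders. Because one nRF24L01+ payload holds at most 32 bytes, the 38-byte frame from the acquisition sketch is sent as two packets.

```cpp
#include <Arduino.h>
#include <SPI.h>
#include <RF24.h>

RF24 radio(9, 10);                     // CE, CSN pins (hypothetical wiring)
const uint8_t PIPE[6] = "GLOVE";       // 5-byte pipe address (illustrative)

extern uint16_t frame[19];             // filled by the acquisition loop above

void radioSetup() {
  radio.begin();
  radio.setDataRate(RF24_250KBPS);     // lowest rate for best range
  radio.setPALevel(RF24_PA_LOW);
  radio.openWritingPipe(PIPE);
  radio.stopListening();               // this node only transmits
}

void sendFrame() {
  // 19 x 16-bit values = 38 bytes; one payload is limited to 32 bytes,
  // so the frame is split: channels 0-15, then channels 16-18.
  radio.write(&frame[0], 32);
  radio.write(&frame[16], 6);
}
```

On the receiving side the same library can be used with openReadingPipe() and startListening(), with the two packets reassembled into one frame before being passed to the recognition software.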
E. Computer
A computer collects the sensor data through the wireless receiver module attached to it. The actual processing of the data is implemented here with algorithms built around three main components: a quantizer, a model, and a classifier. Because the accelerometers continuously produce a stream of data, a quantizer is needed to cluster the gesture data into groups; a common k-means algorithm can be used here. The model part of the processing is a hidden Markov model, since it is widely used in gesture recognition and delivers reliable results for motion patterns. The remaining component is a classic Bayes classifier. In addition to these main components, two filters can be used to pre-process the sensor data, an "idle state" filter and a "directorial equivalence" filter; both reduce and simplify the incoming sensor data.
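A minimal sketch of this pre-processing stage is given below: an idle-state filter that drops samples showing no significant motion, followed by vector quantization against a k-means codebook whose output symbols would feed the hidden Markov model. The idle threshold, the codebook size, and the plain C++ setting are assumptions chosen for illustration.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };                 // one accelerometer sample in g

// Idle-state filter: keep a sample only if it differs enough from gravity alone
// (magnitude far from 1 g) or from the previously kept sample.
bool isActive(const Vec3& v, const Vec3& prev, float thresh = 0.2f) {
  float mag   = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
  float dx = v.x - prev.x, dy = v.y - prev.y, dz = v.z - prev.z;
  float delta = std::sqrt(dx * dx + dy * dy + dz * dz);
  return std::fabs(mag - 1.0f) > thresh || delta > thresh;
}

// Quantizer: map each kept sample to the index of the nearest codebook vector.
// The resulting symbol sequence is what the HMM and Bayes classifier consume.
std::size_t quantize(const Vec3& v, const std::vector<Vec3>& codebook) {
  std::size_t best = 0;
  float bestDist = 1e30f;
  for (std::size_t k = 0; k < codebook.size(); ++k) {
    float dx = v.x - codebook[k].x, dy = v.y - codebook[k].y, dz = v.z - codebook[k].z;
    float d = dx * dx + dy * dy + dz * dz;
    if (d < bestDist) { bestDist = d; best = k; }
  }
  return best;
}
```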
CONCLUSIONS
This paper proposed a MEMS sensor-based gesture recognition system that is wireless and modular in nature. Since the hardware used is small and efficient, the system is easy to handle. The applicability of the proposed system is demonstrated by controlling a media player application created for the purpose, whose functions the user can control with hand gestures. The system uses algorithms such as k-means, the hidden Markov model, and the Bayes classifier to process the sensor information efficiently and to detect gestures accurately enough to perform the task assigned to each gesture.
REFERENCES
[1] Yikai Fang, Kongqiao Wang, Jian Cheng and Hanqing Lu, "A Real-Time Hand Gesture Recognition Method", International Conference on Multimedia and Expo, IEEE, July 2007.
[2] S. Zhou, Z. Dong, W. J. Li, and C. P. Kwong, "Hand-Written Character Recognition Using MEMS Motion Sensing Technology", Proc. IEEE/ASME International Conference on Advanced Intelligent Mechatronics, 2008.
[3] Siddharth Swarup Rautaray and Anupam Agrawal, "A Vision based Hand Gesture Interface for Controlling VLC Media Player", International Journal of Computer Applications, Volume 10, 2010.
[4] Ruize Xu, Shengli Zhou and Wen J. Li, "MEMS Accelerometer Based Nonspecific-User Hand Gesture Recognition", IEEE, Volume 12, 2012.
[5] Chetana S. Ingulkar and A. N. Gaikwad, "Hand Data Glove: A Wearable Real Time Device for Human Computer Interaction", International Journal of Science and Engineering, Volume 1, Number 2, 2013.
[6] Meenaakumari M. and M. Muthulakshmi, "MEMS Accelerometer Based Hand Gesture Recognition", IJARCET, Volume 2, No. 5, May 2013.
[7] R. Suriya and V. Vijayachamundeshwari, "A Survey on Hand Gesture Recognition for Simple Mouse Control", ICICES 2014.
[8] O. Sidek and M. A. Hadi, "Wireless Gesture Recognition System using MEMS Accelerometer", International Symposium on Technology Management and Emerging Technologies (ISTMET) 2014.