Integrated Intelligent Research (IIR) International Journal of Data Mining Techniques and Applications
Volume: 03 Issue: 01 June 2014, Page No. 16- 20
ISSN: 2278-2419
Analysis of Influences of Memory on Cognitive Load Using Neural Network Back Propagation Algorithm
A. Naveen (1), M.S. Josephine (2)
(1) Research Scholar, Department of Computer Applications, Dr. M.G.R. Educational and Research Institute University, Chennai
(2) Professor, Department of Computer Applications, Dr. M.G.R. Educational and Research Institute University, Chennai
E-mail: josejbr@yahoo.com, naveenking@yahoo.co.in
Abstract - Educational data mining is used to evaluate the learner's performance and the learning environment. The learning process involves, and is influenced by, several components, among which memory plays a vital role. Long-term, short-term, working, instant, responsive, processing, recollecting, reference, instruction and action memory are all involved in learning. The factors influencing these memories are identified by constructing and analysing a neural network trained with the back propagation algorithm. The observed data are represented in a cubical dataset format for the mining approach. The mining process is carried out using a neural network based back propagation model to determine the cognitive load that influences the different learning challenges. The learners' difficulties are identified through the experimental results.
I. INTRODUCTION
Data mining techniques are applicable to every domain according to the needs of the application. The data mining process mainly comprises preprocessing, analysis and pattern generation. This work applies data mining to integrate the learning system of learners with their cognitive behaviour using a predictive approach, since data mining can estimate unknown values from available ones. Specifically, the predictive approach is used to identify the relationship between learners' performance and their cognitive load using a neural network optimized with the back propagation algorithm.
II. REVIEW OF LITERATURE
The review of literature covers the basic concepts of data mining and its applications, and highlights data mining tools and techniques. The educational data mining process and its applications are reviewed and presented in Table 1. The learners' difficulties identified in prior work, and the attempts to resolve them, are summarized there.
Table 1: Research works related to the use of data mining in the context of education.

S.No. | Year | Author(s)                                           | Work
1     | 2000 | Ma, Y., Liu, B., Wong, C. K., Yu, P. S., Lee, S. M. | Presented a real-life application of data mining to find weak students.
2     | 2001 | Luan, J.                                            | Introduced a powerful decision support tool, data mining, in the context of knowledge management.
3     | 2002 | Luan, J.                                            | Discussed the potential applications of data mining in higher education and explained how data mining saves resources while maximizing efficiency in academics.
4     | 2005 | Delavari et al.                                     | Proposed a model for the application of data mining in higher education.
5     | 2006 | Shyamala, K. & Rajagopalan, S. P.                   | Developed a model to find similar patterns from the data gathered and to make predictions about students' performance.
6     | 2006 | Sargenti et al.                                     | Explored the development of a model which allows for diffusion of knowledge within a small business university.
III. ROLE OF THE MEMORY MODEL IN THE LEARNING PROCESS
In the learning process, the memory model plays a vital role in observation, reorganization, understanding and learning. Memory is classified as working memory, short-term memory, long-term memory and sensory memory. Working memory is where thinking gets done; it is dual coded, with a buffer for the storage of verbal/text elements. Experiences enter working memory through sensory memory; once an experience is in working memory, a person can consciously hold it there and think about it in context. Short-term memory acts in parallel with long-term memory. According to this analysis, the learning process involves the following memory processes: long-term memory, short-term memory, working (calculation) memory, instant memory, responsive memory, processing (search content) memory, recollecting memory, reference memory, instruction memory and action memory. These memory performances are considered as process units, and the neural network model is designed accordingly. Real-time learner performance is observed using the NASA workload scaling process.
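The NASA workload scale mentioned above is the NASA Task Load Index (TLX), which combines ratings on six subscales into one overall workload score via a weighted average. A minimal sketch of that computation follows; all ratings and pairwise-comparison weights below are illustrative values, not observations from this study.

```python
# Overall NASA-TLX workload = weighted average of the six subscale ratings,
# where each weight counts how often that subscale was chosen in the 15
# pairwise comparisons. All numbers here are illustrative, not study data.

def nasa_tlx_workload(ratings, weights):
    """Weighted-average workload: sum(rating * weight) / sum(weights)."""
    if set(ratings) != set(weights):
        raise ValueError("ratings and weights must cover the same subscales")
    return sum(ratings[s] * weights[s] for s in ratings) / sum(weights.values())

ratings = {   # 0-100 rating per subscale (illustrative)
    "mental": 70, "physical": 20, "temporal": 55,
    "performance": 40, "effort": 65, "frustration": 30,
}
weights = {   # pairwise-comparison tallies, summing to 15 (illustrative)
    "mental": 5, "physical": 1, "temporal": 2,
    "performance": 3, "effort": 3, "frustration": 1,
}
print(nasa_tlx_workload(ratings, weights))  # -> 55.0
```

The six subscales are the same mental, physical, temporal, performance, effort and frustration loads that later serve as the hidden-layer neurons of the network.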
IV. NEURAL NETWORK - BACK PROPAGATION MODEL
Neural networks are analytic techniques modeled after the (hypothesized) processes of learning in the cognitive system and the neurological functions of the brain. After executing a process of so-called learning from existing data, they are capable of predicting new observations (on specific variables) from other observations (on the same or other variables). Neural networks are the data mining technique used here to determine and optimize the influencing factors.

A. Neural Network
Artificial neural networks simulate, in a simplified way, the biological structure of the neural networks in brains, in order to endow computers with some of the considerable abilities of biological neural networks: learning from examples, pattern classification, generalization and prediction.
B. Back Propagation Algorithm
In the adopted back propagation neural network model, the observed data are mapped and the model is constructed. The algorithm is expressed below as a step-by-step executable procedure.
Step 1: Normalize the inputs and outputs with respect to their maximum values. For each training pair, assume that in normalized form there are ℓ inputs {I}_I (ℓ x 1) and n outputs {O}_O (n x 1).

Step 2: Assume that the number of neurons m in the hidden layer lies between 1 < m < 10, since ten memory attributes are considered for this network construction.

Step 3: Let [V] (ℓ x m) represent the weights of the synapses connecting input and hidden neurons, and [W] (m x n) the weights of the synapses connecting hidden and output neurons. Initialize the weights to small random values, usually from -1 to +1:
    [V]^0 = [random weights], [W]^0 = [random weights], [ΔV]^0 = [ΔW]^0 = [0].
For general problems, λ can be assumed to be 1 and the threshold value 0.

Step 4: For training, present one set of inputs and outputs. Present the pattern {I}_I to the input layer; with a linear activation function, the output of the input layer is
    {O}_I = {I}_I  (ℓ x 1).

Step 5: Compute the inputs to the hidden layer by multiplying by the corresponding synapse weights:
    {I}_H = [V]^T {O}_I,  where (m x 1) = (m x ℓ)(ℓ x 1).

Step 6: Evaluate the output of the hidden-layer units using the sigmoid function:
    O_Hi = 1 / (1 + e^(-I_Hi)),  i = 1, ..., m.

Step 7: Compute the inputs to the output layer by multiplying by the corresponding synapse weights:
    {I}_O = [W]^T {O}_H,  where (n x 1) = (n x m)(m x 1).

Step 8: Evaluate the output of the output-layer units using the sigmoid function:
    O_Oj = 1 / (1 + e^(-I_Oj)),  j = 1, ..., n.
Note: this is the network output.

Step 9: Calculate the error for the p-th training set from the difference between the network output and the desired output {T}:
    E_p = sqrt( Σ_j (T_j - O_Oj)^2 ) / n.

Step 10: Find the term {d} (n x 1) as
    d_k = (T_k - O_Ok) O_Ok (1 - O_Ok).

Step 11: Find the matrix [Y] as
    [Y] = {O}_H <d>,  where (m x n) = (m x 1)(1 x n).

Step 12: Find
    [ΔW]^(t+1) = α [ΔW]^t + η [Y]  (all m x n),
where α is the momentum factor and η the learning rate.

Step 13: Back-propagate the error to the hidden layer:
    {e} = [W] {d},  where (m x 1) = (m x n)(n x 1),
    d*_i = e_i O_Hi (1 - O_Hi)  (m x 1),
and find the matrix [X] as
    [X] = {O}_I <d*> = {I}_I <d*>,  where (ℓ x m) = (ℓ x 1)(1 x m).

Step 14: Find
    [ΔV]^(t+1) = α [ΔV]^t + η [X]  (all ℓ x m).

Step 15: Update the weights:
    [V]^(t+1) = [V]^t + [ΔV]^(t+1),
    [W]^(t+1) = [W]^t + [ΔW]^(t+1).

Step 16: Find the error rate over all nset training patterns:
    error rate = Σ_p E_p / nset.

Step 17: Repeat Steps 4 to 16 until the change in the error rate is less than the tolerance value.
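The steps above can be sketched with NumPy for a single training pair. The layer sizes match the paper's model (ten memory inputs, six cognitive-load hidden neurons, two outputs); the learning rate η and momentum α are assumed values for illustration, since the paper does not report them.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Layer sizes from the paper's model: 10 memory inputs, 6 cognitive-load
# hidden neurons, 2 outputs (cognitive load, learner performance).
l, m, n = 10, 6, 2
V = rng.uniform(-1.0, 1.0, (l, m))   # [V]: input -> hidden synapse weights
W = rng.uniform(-1.0, 1.0, (m, n))   # [W]: hidden -> output synapse weights
dV = np.zeros_like(V)                # [dV]^0 = [dW]^0 = [0]     (Step 3)
dW = np.zeros_like(W)
alpha, eta = 0.5, 0.3                # momentum, learning rate (assumed)

def train_pair(I, T):
    """One pass of Steps 4-15 for a single normalized training pair."""
    global V, W, dV, dW
    O_I = I                               # Step 4: linear input layer
    O_H = sigmoid(V.T @ O_I)              # Steps 5-6: hidden-layer output
    O_O = sigmoid(W.T @ O_H)              # Steps 7-8: network output
    d = (T - O_O) * O_O * (1.0 - O_O)     # Step 10
    dW = alpha * dW + eta * np.outer(O_H, d)        # Steps 11-12
    d_star = (W @ d) * O_H * (1.0 - O_H)            # Step 13
    dV = alpha * dV + eta * np.outer(O_I, d_star)   # Steps 13-14
    V, W = V + dV, W + dW                 # Step 15
    return np.sqrt(np.sum((T - O_O) ** 2)) / n      # Step 9: error E_p

# Step 1: normalize the reported exercise scores by the maximum mark
# (assumed to be 10) and the expected outputs 81 and 95 by 100.
I = np.array([5, 7, 10, 7, 10, 6, 8, 10, 10, 8]) / 10.0
T = np.array([0.81, 0.95])
errors = [train_pair(I, T) for _ in range(200)]
print(f"E_p: first={errors[0]:.4f}, last={errors[-1]:.4f}")
```

Repeating the pair drives the per-pattern error E_p toward zero, mirroring the iterative convergence described in the result analysis.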
The back propagation algorithm above is thus adapted to the educational learning process.
V. BACK PROPAGATION NEURAL NETWORK MODEL TO IDENTIFY INFLUENCES OF MEMORY ON COGNITIVE LOAD
A multilayer perceptron (MLP) architecture consists of more than two layers; an MLP can have any number of layers, units per layer, network inputs and network outputs, as in the model of Figure 1. The network there has three layers: the first is called the input layer, the last the output layer, and the layer in between the hidden layer. The example network of Figure 1 has three network inputs, one network output and one hidden layer. This model is the most popular supervised learning architecture because of its weight error-correction rules; it is considered a generalization of the delta rule to nonlinear activation functions and multilayer networks. In a back propagation neural network, the learning algorithm has two phases: first, a training input pattern is presented to the network input layer and propagated forward to the output; then the resulting error is propagated backward to adjust the weights.

Figure 1: Neural network model
VI. RESULT ANALYSIS
All externally observed attributes are taken as inputs. I1 to I10 represent the input variables, each holding the score of an exercise carried out for a different memory learning process: long-term, short-term, working, instant, responsive, processing, recollecting, reference, instruction and action memory. The cognitive loads are treated as the process neurons of the hidden layer: H1 to H6 represent the mental, physical, temporal, performance, effort and frustration loads.
The neural network process is built from the observed values. The student exercise scores for the different memories are 5, 7, 10, 7, 10, 6, 8, 10, 10 and 8. The process weight values are presented as matrices.
Table 2: Initially assigned input-to-hidden (I -> H) layer weights
0.1159 0.7452 0.7667 0.3303 0.5616 0.0683
0.6439 0.4698 0.0191 0.2963 0.8009 0.6156
0.2143 0.5475 0.2102 0.3169 0.1742 0.4353
0.9549 0.5177 0.1248 0.0150 0.7018 0.2846
0.4923 0.1257 0.9708 0.0489 0.7574 0.9536
0.0792 0.3202 0.8221 0.3629 0.9702 0.0913
0.1030 0.4678 0.0615 0.0294 0.1558 0.3777
0.6844 0.4638 0.3015 0.5345 0.3129 0.2956
0.8698 0.8769 0.1957 0.9274 0.4688 0.0399
0.1469 0.6540 0.3309 0.5981 0.0100 0.3490
Table 3: Initially assigned hidden-to-output (H -> O) layer weights
0.9532 0.4718
0.0451 0.0943
0.1912 0.4462
0.5433 0.1154
0.6347 0.8885
0.8214 0.0804
Starting from these initial values, the output obtained at each level is recalculated and fed back as input, reducing the error level. The expected outputs were 81 for learning cognitive load and 95 for learning performance, while the initially assigned neurons produced 96 for cognitive load and 89 for learner performance, giving an error of -62.048. The iterative process was then run until the final values were obtained at zero error level. The final weight values are presented below.
Table 4: Final input-to-hidden (I -> H) layer weights
0.2318 1.4905 1.5335 0.6605 1.1232 0.1366
1.2879 0.9397 0.0382 0.5926 1.6019 1.2313
0.4286 1.0950 0.4204 0.6338 0.3484 0.8705
1.9099 1.0355 0.2495 0.0300 1.4036 0.5692
0.9846 0.2514 1.9416 0.0979 1.5147 1.9072
0.1583 0.6403 1.6442 0.7259 1.9405 0.1825
0.2060 0.9356 0.1229 0.0587 0.3116 0.7554
1.3687 0.9276 0.6030 1.0690 0.6258 0.5912
1.7397 1.7538 0.3913 1.8548 0.9375 0.0799
0.2938 1.3080 0.6618 1.1962 0.0200 0.6980
Table 5: Final hidden-to-output (H -> O) layer weights
0.9532 0.4718
0.0451 0.0943
0.1912 0.4462
0.5433 0.1154
0.6347 0.8885
0.8214 0.0804
In the trained network model, mental effort has the greatest influence, at a maximum level of 95.32 percent, while physical load has the least, at 4.51 percent. The advantage of this model is the smaller number of iterations and better performance compared with the standard back propagation model. To evaluate the algorithm, MATLAB code was designed and executed. The learning performance is inclined toward the performance factor of the cognitive load, as expected. The entire model is presented according to the load, along with the learning performance under the controlled weights. The load differs from one learning process to another: the mental, physical, temporal, effort and frustration loads are low while performance is high.
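As a consistency check, the forward pass of Section IV-B can be replayed through the final weights of Tables 4 and 5 using the reported exercise scores. The sigmoid activation and the normalization by a maximum mark of 10 are assumptions carried over from the algorithm description.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Exercise scores for the ten memory types, normalized by the maximum
# mark (assumed to be 10), as in Step 1 of the algorithm.
scores = np.array([5, 7, 10, 7, 10, 6, 8, 10, 10, 8]) / 10.0

# Final input -> hidden weights from Table 4 (10 inputs x 6 load neurons).
V = np.array([
    [0.2318, 1.4905, 1.5335, 0.6605, 1.1232, 0.1366],
    [1.2879, 0.9397, 0.0382, 0.5926, 1.6019, 1.2313],
    [0.4286, 1.0950, 0.4204, 0.6338, 0.3484, 0.8705],
    [1.9099, 1.0355, 0.2495, 0.0300, 1.4036, 0.5692],
    [0.9846, 0.2514, 1.9416, 0.0979, 1.5147, 1.9072],
    [0.1583, 0.6403, 1.6442, 0.7259, 1.9405, 0.1825],
    [0.2060, 0.9356, 0.1229, 0.0587, 0.3116, 0.7554],
    [1.3687, 0.9276, 0.6030, 1.0690, 0.6258, 0.5912],
    [1.7397, 1.7538, 0.3913, 1.8548, 0.9375, 0.0799],
    [0.2938, 1.3080, 0.6618, 1.1962, 0.0200, 0.6980],
])
# Final hidden -> output weights from Table 5 (6 load neurons x 2 outputs).
W = np.array([
    [0.9532, 0.4718],
    [0.0451, 0.0943],
    [0.1912, 0.4462],
    [0.5433, 0.1154],
    [0.6347, 0.8885],
    [0.8214, 0.0804],
])

hidden = sigmoid(V.T @ scores)   # activations of the six load neurons
output = sigmoid(W.T @ hidden)   # cognitive load and learner performance
print(hidden.round(3), output.round(3))
```

The six hidden activations correspond to the mental, physical, temporal, performance, effort and frustration loads, and the two outputs to the cognitive load and learner performance values discussed above.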
VII. CONCLUSION
The back propagation model could be designed and evaluated with multiple learning domains. A knowledge repository could be created for different learning objectives with available and adoptable technologies. As a continuation of this research, the six cognitive load factors may be extended with the environmental and socio-economic levels of learners.
AUTHOR PROFILE
A. Naveen is a research scholar in the Department of Computer Applications, Dr. MGR University, Chennai. He received his Master's degree in Computer Science from St. Joseph's College, Bharathidasan University, Trichy, in 2012. His areas of interest include data mining, cognitive computing and neural networks.

M.S. Josephine works in the Department of Computer Applications, Dr. MGR University, Chennai. She received her Master's degree (MCA) from St. Joseph's College, Bharathidasan University, her M.Phil. in Computer Science from Periyar University, and her Doctorate in Computer Applications from Mother Teresa University. Her research interests include software engineering and expert systems.
More Related Content

What's hot (20)

PPTX
Unsupervised learning
amalalhait
 
PDF
Knowledge distillation deeplab
Frozen Paradise
 
PDF
Adaptive modified backpropagation algorithm based on differential errors
IJCSEA Journal
 
PPTX
Introduction to Neural networks (under graduate course) Lecture 9 of 9
Randa Elanwar
 
PDF
Investigations on Hybrid Learning in ANFIS
IJERA Editor
 
PDF
X trepan an extended trepan for
ijaia
 
PDF
Improving Performance of Back propagation Learning Algorithm
ijsrd.com
 
PPT
Supervised Learning
butest
 
PDF
Black-box modeling of nonlinear system using evolutionary neural NARX model
IJECEIAES
 
PPTX
Introduction to Neural networks (under graduate course) Lecture 8 of 9
Randa Elanwar
 
PPTX
Artificial Neural Network
Dessy Amirudin
 
PDF
Electricity consumption-prediction-model-using neuro-fuzzy-system
Cemal Ardil
 
PDF
Life-long / Incremental Learning (DLAI D6L1 2017 UPC Deep Learning for Artifi...
Universitat Politècnica de Catalunya
 
PDF
Evaluation of a hybrid method for constructing multiple SVM kernels
infopapers
 
PPTX
Introduction to Neural networks (under graduate course) Lecture 7 of 9
Randa Elanwar
 
PPTX
03 Single layer Perception Classifier
Tamer Ahmed Farrag, PhD
 
PDF
Multilayer Backpropagation Neural Networks for Implementation of Logic Gates
IJCSES Journal
 
PDF
Introduction to Applied Machine Learning
SheilaJimenezMorejon
 
PDF
Ffnn
guestd60a613
 
PDF
A survey research summary on neural networks
eSAT Publishing House
 
Unsupervised learning
amalalhait
 
Knowledge distillation deeplab
Frozen Paradise
 
Adaptive modified backpropagation algorithm based on differential errors
IJCSEA Journal
 
Introduction to Neural networks (under graduate course) Lecture 9 of 9
Randa Elanwar
 
Investigations on Hybrid Learning in ANFIS
IJERA Editor
 
X trepan an extended trepan for
ijaia
 
Improving Performance of Back propagation Learning Algorithm
ijsrd.com
 
Supervised Learning
butest
 
Black-box modeling of nonlinear system using evolutionary neural NARX model
IJECEIAES
 
Introduction to Neural networks (under graduate course) Lecture 8 of 9
Randa Elanwar
 
Artificial Neural Network
Dessy Amirudin
 
Electricity consumption-prediction-model-using neuro-fuzzy-system
Cemal Ardil
 
Life-long / Incremental Learning (DLAI D6L1 2017 UPC Deep Learning for Artifi...
Universitat Politècnica de Catalunya
 
Evaluation of a hybrid method for constructing multiple SVM kernels
infopapers
 
Introduction to Neural networks (under graduate course) Lecture 7 of 9
Randa Elanwar
 
03 Single layer Perception Classifier
Tamer Ahmed Farrag, PhD
 
Multilayer Backpropagation Neural Networks for Implementation of Logic Gates
IJCSES Journal
 
Introduction to Applied Machine Learning
SheilaJimenezMorejon
 
A survey research summary on neural networks
eSAT Publishing House
 

Similar to Analysis of Influences of memory on Cognitive load Using Neural Network Back Propagation Algorithm (20)

PDF
PREDICTING STUDENT ACADEMIC PERFORMANCE IN BLENDED LEARNING USING ARTIFICIAL ...
gerogepatton
 
PDF
PREDICTING STUDENT ACADEMIC PERFORMANCE IN BLENDED LEARNING USING ARTIFICIAL ...
ijaia
 
PPTX
Neural Networks
SurajKumar579888
 
PDF
B42010712
IJERA Editor
 
PDF
A Study On Deep Learning
Abdelrahman Hosny
 
PDF
Deep Learning Survey
Anthony Parziale
 
PDF
Implementation Of Back-Propagation Neural Network For Isolated Bangla Speech ...
ijistjournal
 
PDF
Implementation Of Back-Propagation Neural Network For Isolated Bangla Speech ...
ijistjournal
 
PDF
Implementation Of Back-Propagation Neural Network For Isolated Bangla Speech ...
ijistjournal
 
PDF
tsopze2011
Eric DJIKY D.
 
PDF
Artificial Neural Networks: Applications In Management
IOSR Journals
 
PDF
X-TREPAN: A MULTI CLASS REGRESSION AND ADAPTED EXTRACTION OF COMPREHENSIBLE D...
cscpconf
 
PDF
Levenberg marquardt-algorithm-for-karachi-stock-exchange-share-rates-forecast...
Cemal Ardil
 
PDF
Survey on Artificial Neural Network Learning Technique Algorithms
IRJET Journal
 
PDF
F017533540
IOSR Journals
 
PDF
A Time Series ANN Approach for Weather Forecasting
ijctcm
 
PDF
Nature Inspired Reasoning Applied in Semantic Web
guestecf0af
 
PPT
Machine Learning and Artificial Neural Networks.ppt
Anshika865276
 
PDF
CCS355 Neural Networks & Deep Learning Unit 1 PDF notes with Question bank .pdf
Guru Nanak Technical Institutions
 
PPT
this is a Ai topic neural network ML_Lecture_4.ppt
ry54321288
 
PREDICTING STUDENT ACADEMIC PERFORMANCE IN BLENDED LEARNING USING ARTIFICIAL ...
gerogepatton
 
PREDICTING STUDENT ACADEMIC PERFORMANCE IN BLENDED LEARNING USING ARTIFICIAL ...
ijaia
 
Neural Networks
SurajKumar579888
 
B42010712
IJERA Editor
 
A Study On Deep Learning
Abdelrahman Hosny
 
Deep Learning Survey
Anthony Parziale
 
Implementation Of Back-Propagation Neural Network For Isolated Bangla Speech ...
ijistjournal
 
Implementation Of Back-Propagation Neural Network For Isolated Bangla Speech ...
ijistjournal
 
Implementation Of Back-Propagation Neural Network For Isolated Bangla Speech ...
ijistjournal
 
tsopze2011
Eric DJIKY D.
 
Artificial Neural Networks: Applications In Management
IOSR Journals
 
X-TREPAN: A MULTI CLASS REGRESSION AND ADAPTED EXTRACTION OF COMPREHENSIBLE D...
cscpconf
 
Levenberg marquardt-algorithm-for-karachi-stock-exchange-share-rates-forecast...
Cemal Ardil
 
Survey on Artificial Neural Network Learning Technique Algorithms
IRJET Journal
 
F017533540
IOSR Journals
 
A Time Series ANN Approach for Weather Forecasting
ijctcm
 
Nature Inspired Reasoning Applied in Semantic Web
guestecf0af
 
Machine Learning and Artificial Neural Networks.ppt
Anshika865276
 
CCS355 Neural Networks & Deep Learning Unit 1 PDF notes with Question bank .pdf
Guru Nanak Technical Institutions
 
this is a Ai topic neural network ML_Lecture_4.ppt
ry54321288
 
Ad

More from ijdmtaiir (20)

PDF
A review on data mining techniques for Digital Mammographic Analysis
ijdmtaiir
 
PDF
Comparison on PCA ICA and LDA in Face Recognition
ijdmtaiir
 
PDF
A Novel Approach to Mathematical Concepts in Data Mining
ijdmtaiir
 
PDF
Analysis of Classification Algorithm in Data Mining
ijdmtaiir
 
PDF
Performance Analysis of Selected Classifiers in User Profiling
ijdmtaiir
 
PDF
Analysis of Sales and Distribution of an IT Industry Using Data Mining Techni...
ijdmtaiir
 
PDF
An Analysis of Data Mining Applications for Fraud Detection in Securities Market
ijdmtaiir
 
PDF
An Ill-identified Classification to Predict Cardiac Disease Using Data Cluste...
ijdmtaiir
 
PDF
Scaling Down Dimensions and Feature Extraction in Document Repository Classif...
ijdmtaiir
 
PDF
Music Promotes Gross National Happiness Using Neutrosophic fuzzyCognitive Map...
ijdmtaiir
 
PDF
A Study on Youth Violence and Aggression using DEMATEL with FCM Methods
ijdmtaiir
 
PDF
Certain Investigation on Dynamic Clustering in Dynamic Datamining
ijdmtaiir
 
PDF
Analyzing the Role of a Family in Constructing Gender Roles Using Combined Ov...
ijdmtaiir
 
PDF
An Interval Based Fuzzy Multiple Expert System to Analyze the Impacts of Clim...
ijdmtaiir
 
PDF
An Approach for the Detection of Vascular Abnormalities in Diabetic Retinopathy
ijdmtaiir
 
PDF
Improve the Performance of Clustering Using Combination of Multiple Clusterin...
ijdmtaiir
 
PDF
The Study of Symptoms of Tuberculosis Using Induced Fuzzy Coginitive Maps (IF...
ijdmtaiir
 
PDF
A Study on Finding the Key Motive of Happiness Using Fuzzy Cognitive Maps (FCMs)
ijdmtaiir
 
PDF
Study of sustainable development using Fuzzy Cognitive Relational Maps (FCM)
ijdmtaiir
 
PDF
A Study of Personality Influence in Building Work Life Balance Using Fuzzy Re...
ijdmtaiir
 
A review on data mining techniques for Digital Mammographic Analysis
ijdmtaiir
 
Comparison on PCA ICA and LDA in Face Recognition
ijdmtaiir
 
A Novel Approach to Mathematical Concepts in Data Mining
ijdmtaiir
 
Analysis of Classification Algorithm in Data Mining
ijdmtaiir
 
Performance Analysis of Selected Classifiers in User Profiling
ijdmtaiir
 
Analysis of Sales and Distribution of an IT Industry Using Data Mining Techni...
ijdmtaiir
 
An Analysis of Data Mining Applications for Fraud Detection in Securities Market
ijdmtaiir
 
An Ill-identified Classification to Predict Cardiac Disease Using Data Cluste...
ijdmtaiir
 
Scaling Down Dimensions and Feature Extraction in Document Repository Classif...
ijdmtaiir
 
Music Promotes Gross National Happiness Using Neutrosophic fuzzyCognitive Map...
ijdmtaiir
 
A Study on Youth Violence and Aggression using DEMATEL with FCM Methods
ijdmtaiir
 
Certain Investigation on Dynamic Clustering in Dynamic Datamining
ijdmtaiir
 
Analyzing the Role of a Family in Constructing Gender Roles Using Combined Ov...
ijdmtaiir
 
An Interval Based Fuzzy Multiple Expert System to Analyze the Impacts of Clim...
ijdmtaiir
 
An Approach for the Detection of Vascular Abnormalities in Diabetic Retinopathy
ijdmtaiir
 
Improve the Performance of Clustering Using Combination of Multiple Clusterin...
ijdmtaiir
 
The Study of Symptoms of Tuberculosis Using Induced Fuzzy Coginitive Maps (IF...
ijdmtaiir
 
A Study on Finding the Key Motive of Happiness Using Fuzzy Cognitive Maps (FCMs)
ijdmtaiir
 
Study of sustainable development using Fuzzy Cognitive Relational Maps (FCM)
ijdmtaiir
 
A Study of Personality Influence in Building Work Life Balance Using Fuzzy Re...
ijdmtaiir
 
Ad

Recently uploaded (20)

PPTX
Introduction to Neural Networks and Perceptron Learning Algorithm.pptx
Kayalvizhi A
 
PPTX
MPMC_Module-2 xxxxxxxxxxxxxxxxxxxxx.pptx
ShivanshVaidya5
 
PPTX
Benefits_^0_Challigi😙🏡💐8fenges[1].pptx
akghostmaker
 
PPTX
Heart Bleed Bug - A case study (Course: Cryptography and Network Security)
Adri Jovin
 
PPTX
drones for disaster prevention response.pptx
NawrasShatnawi1
 
PDF
Introduction to Productivity and Quality
মোঃ ফুরকান উদ্দিন জুয়েল
 
PPTX
Thermal runway and thermal stability.pptx
godow93766
 
PDF
Additional Information in midterm CPE024 (1).pdf
abolisojoy
 
PDF
PRIZ Academy - Change Flow Thinking Master Change with Confidence.pdf
PRIZ Guru
 
PDF
Set Relation Function Practice session 24.05.2025.pdf
DrStephenStrange4
 
PPTX
Innowell Capability B0425 - Commercial Buildings.pptx
regobertroza
 
PDF
Unified_Cloud_Comm_Presentation anil singh ppt
anilsingh298751
 
PDF
POWER PLANT ENGINEERING (R17A0326).pdf..
haneefachosa123
 
PDF
Statistical Data Analysis Using SPSS Software
shrikrishna kesharwani
 
PPTX
UNIT DAA PPT cover all topics 2021 regulation
archu26
 
PPTX
Green Building & Energy Conservation ppt
Sagar Sarangi
 
PDF
Ethics and Trustworthy AI in Healthcare – Governing Sensitive Data, Profiling...
AlqualsaDIResearchGr
 
PPTX
ISO/IEC JTC 1/WG 9 (MAR) Convenor Report
Kurata Takeshi
 
PDF
UNIT-4-FEEDBACK AMPLIFIERS AND OSCILLATORS (1).pdf
Sridhar191373
 
PPT
Oxygen Co2 Transport in the Lungs(Exchange og gases)
SUNDERLINSHIBUD
 
Introduction to Neural Networks and Perceptron Learning Algorithm.pptx
Kayalvizhi A
 
MPMC_Module-2 xxxxxxxxxxxxxxxxxxxxx.pptx
ShivanshVaidya5
 
Benefits_^0_Challigi😙🏡💐8fenges[1].pptx
akghostmaker
 
Heart Bleed Bug - A case study (Course: Cryptography and Network Security)
Adri Jovin
 
drones for disaster prevention response.pptx
NawrasShatnawi1
 
Introduction to Productivity and Quality
মোঃ ফুরকান উদ্দিন জুয়েল
 
Thermal runway and thermal stability.pptx
godow93766
 
Additional Information in midterm CPE024 (1).pdf
abolisojoy
 
PRIZ Academy - Change Flow Thinking Master Change with Confidence.pdf
PRIZ Guru
 
Set Relation Function Practice session 24.05.2025.pdf
DrStephenStrange4
 
Innowell Capability B0425 - Commercial Buildings.pptx
regobertroza
 
Unified_Cloud_Comm_Presentation anil singh ppt
anilsingh298751
 
POWER PLANT ENGINEERING (R17A0326).pdf..
haneefachosa123
 
Statistical Data Analysis Using SPSS Software
shrikrishna kesharwani
 
UNIT DAA PPT cover all topics 2021 regulation
archu26
 
Green Building & Energy Conservation ppt
Sagar Sarangi
 
Ethics and Trustworthy AI in Healthcare – Governing Sensitive Data, Profiling...
AlqualsaDIResearchGr
 
ISO/IEC JTC 1/WG 9 (MAR) Convenor Report
Kurata Takeshi
 
UNIT-4-FEEDBACK AMPLIFIERS AND OSCILLATORS (1).pdf
Sridhar191373
 
Oxygen Co2 Transport in the Lungs(Exchange og gases)
SUNDERLINSHIBUD
 

Analysis of Influences of memory on Cognitive load Using Neural Network Back Propagation Algorithm

  • 1. Integrated Intelligent Research (IIR) International Journal of Data Mining Techniques and Applications Volume: 03 Issue: 01 June 2014, Page No. 16- 20 ISSN: 2278-2419 16 Analysis of Influences of memory on Cognitive load Using Neural Network Back Propagation Algorithm A.Naveen1 , M.S.Josephine2 1 Research Scholar, Department of Computer Applications, Dr. M.G.R. Educational and Research Institute University, Chennai 2 Professor, Department of Computer Applications, Dr. M.G.R. Educational and Research Institute University, Chennai E-mail:[email protected], [email protected] Abstract-Educational mining used to evaluate the leaner's performance and the learning environment. The learning process are involved and influenced by different components. The memory is playing vital role in the process of learning. The long term, short term, working, instant, responsive, process, recollect, reference, instruction and action memory are involved in the process of learning. The influencing factors on these memories are identified through the construction analysis of Neural Network Back Propagation Algorithm. The observed set of data represented using cubical dataset format for the mining approach. The mining process is carried out using neural network based back propagation network model to decide the influencing cognitive load for the different learning challenges. The learners’ difficulties are identified through the experimental results. I.INTRODUCTION The data mining techniques are applicable to all domains according to the need of applications.The data mining technology phases are majorly having preprocess analysis and pattern generation. The data mining technology is attempted to integrate the learning system oflearners and their cognitive behavior using predict approach. The Data mining technology is supported to determine the unknown values from available values. 
In this work,the predictive approach is used to identify the relationship of learner’s performance and their cognitive load using Neural network approach and its optimized based on back propagation algorithm. II.REVIEW OF LITERATURE The review of literature r covers basic concept of the data mining and the applications. The data mining tool and its techniques are highlighted. The Educational Data Mining process and its applicationsreviewed and presented in table1. The Learner difficulties are identified and attempted to resolve. Table 1: Describes Various Research Work Done Related To The Use Of Data Mining In The Context Of Education. S. No. Year Author Work 1 2000 Ma, Y., Liu, B., Wong, C. K., Yu, P. S., Lee, S. M Presented a real life application of data mining to find weak students 2 2001 Luan J. Introduced a powerful decision support tool, data mining, in the context of knowledge management 3 2002 Luan J. Discussed the potential applications of data mining in higher education & explained how data mining saves resources while maximizing efficiency in academics. 4 2005 Delavari et al Proposed a model for the application of data mining in higher education. 5 2006 Shyamala, K. &Rajagopalan, S. P. Developed a model to find similar patterns from the data gathered and to make predication about students’ performance. 6 2006 Sargenti et al Explored the development of a model which allows for diffusion of knowledge within a small business university.
  • 2. Integrated Intelligent Research (IIR) International Journal of Data Mining Techniques and Applications Volume: 03 Issue: 01 June 2014, Page No. 16- 20 ISSN: 2278-2419 17 III. ROLE OF MEMORY MODEL IN LEARNING PROCESS In the learning process, memory model is playing vital role from the observation, reorganization, understating and learning. In learning process memory classified as working memory , short term memory, long term memory and sensor memory . Working memory is where thinking gets done. The working memory is dual coded with a buffer for storage of verbal/text elements.sensory memory that those experiences get introduced into working memory. Once an experience is in working memory, the person can then consciously hold it in memory and think about it in context. The short-term memory acts in parallel with the long-term memory.According to the analysis of memory , the learning process is involved in the following memory process such as Long Term Memory, Short Term Memory, Working (Calculation) Memory , Instant Memory ,Responsive Memory ,Processing (Search Content) Memory ,Recollecting Memory ,Reference Memory ,Instruction Memory , Action Memory These memory performances are consider as a process unit and the neural network model designed . The real time learners performances are observed using NASA workload scaling process. IV.NEURAL NETWORK –BACK PROPAGATION MODEL Neural Networks are analytic techniques modeled after the (hypothesized) processes of learning’s in the cognitive system and the neurological functions of the brain and capable of predicting new observations (on specific variables) from other observations (on the same or other variables) after executing a process of so-called learning from existing data. Neural Networks is one of the Data Mining techniques to determine and optimize the factors. A. 
Neural Network

Artificial neural networks simulate the biological structure of neural networks in the brain in a simplified way, to endow computers with the considerable abilities of biological neural networks: learning from examples, pattern classification, generalization and prediction.

B. Back Propagation Algorithm

Following the adopted back propagation neural network model, the observed data is mapped and the model is constructed. The adopted algorithm is expressed as the step-by-step procedure below.

Step 1: Normalize the inputs and outputs with respect to their maximum values. For each training pair, assume that in normalized form there are ℓ inputs given by { I }I (ℓ x 1) and n outputs given by { O }o (n x 1).

Step 2: Assume that the number of neurons m in the hidden layer lies in the range 1 < m < 10, because ten memory attributes are considered for this network construction.

Step 3: Let [ V ] represent the weights of the synapses connecting input neurons and hidden neurons, and [ W ] the weights of the synapses connecting hidden neurons and output neurons. Initialize the weights to small random values, usually from -1 to +1:
[ V ]0 = [ random weights ]
[ W ]0 = [ random weights ]
[ ∆V ]0 = [ ∆W ]0 = [ 0 ]
For general problems, λ can be assumed as 1 and the threshold value as 0.

Step 4: For training, present one set of inputs and outputs. Present the pattern { I }I to the input layer; using a linear activation function, the output of the input layer is evaluated as
{ O }I = { I }I (ℓ x 1)

Step 5: Compute the inputs to the hidden layer by multiplying by the corresponding synapse weights:
{ I }H = [ V ]T { O }I ((m x 1) = (m x ℓ)(ℓ x 1))

Step 6: Let the hidden layer units evaluate their output using the sigmoidal function:
{ O }Hi = 1 / (1 + e^-(IHi)) (m x 1)

Step 7: Compute the inputs to the output layer by multiplying by the corresponding synapse weights:
{ I }o = [ W ]T { O }H ((n x 1) = (n x m)(m x 1))

Step 8: Let the output layer units evaluate the output using the sigmoid function:
{ O }oj = 1 / (1 + e^-(Ioj))
Note: this output is the network output.

Step 9: Calculate the error Ep for the j-th training set as the squared difference between the network output and the desired output.

Step 10: Find the term { d } as
{ d } = ( Tk - Ook ) Ook ( 1 - Ook ) (n x 1)

Step 11: Find the [ Y ] matrix as
[ Y ] = { O }H { d }T ((m x n) = (m x 1)(1 x n))

Step 12: Find
[ ∆W ]t+1 = α [ ∆W ]t + η [ Y ] (m x n)

Step 13: Find
{ e } = [ W ] { d } ((m x 1) = (m x n)(n x 1))
{ d* }i = ei (OHi)(1 - OHi) (m x 1)
and find the [ X ] matrix as
[ X ] = { O }I { d* }T = { I }I { d* }T ((ℓ x m) = (ℓ x 1)(1 x m))

Step 14: Find
[ ∆V ]t+1 = α [ ∆V ]t + η [ X ] (ℓ x m)

Step 15: Find
[ V ]t+1 = [ V ]t + [ ∆V ]t+1
[ W ]t+1 = [ W ]t + [ ∆W ]t+1

Step 16: Find the error rate as
error rate = Σ Ep / nset

Step 17: Repeat steps 4 to 16 until the change in the error rate is less than the tolerance value.

The back propagation algorithm is thus adapted for the education process.

V. BACK PROPAGATION NEURAL NETWORK MODEL TO IDENTIFY INFLUENCES OF MEMORY ON COGNITIVE LOAD

A multilayer perceptron (MLP) architecture consists of more than two layers; an MLP can have any number of layers, units per layer, network inputs and network outputs, as in the Figure 1 model. This network has three layers: the first layer is the input layer, the last layer is the output layer, and the layers in between are hidden layers. The network in Figure 1 has three network inputs, one network output and one hidden layer. This model is the most popular supervised learning architecture because of its weight error-correction rule.
It is considered a generalization of the delta rule for nonlinear activation functions and multilayer networks. In a back propagation neural network, the learning algorithm has two phases: first, a training input pattern is presented to the network input layer and propagated forward to produce an output; second, the error between the network output and the desired output is propagated backward and the weights are adjusted.

Figure 1: Neural network model

VI. RESULT ANALYSIS

All externally observed attributes are considered as inputs. I1 to I10 represent the input variables, which give the score of each exercise carried out for the different memory learning processes: long term, short term, working, instant, responsive, process, recollect, reference, instruction and action memory. The cognitive loads are treated as the process neurons of the hidden layer: H1 to H6 represent the mental, physical, temporal, performance, effort and frustration cognitive loads.
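The two-phase learning described above, together with Steps 1-17 of the algorithm, can be sketched in Python for a single training pair. This is a minimal illustration, not the authors' MATLAB implementation; the learning rate eta, momentum alpha, tolerance, iteration limit and random seed are assumed values.

```python
import numpy as np

def sigmoid(x):
    # Sigmoidal activation used in Steps 6 and 8
    return 1.0 / (1.0 + np.exp(-x))

def train_backprop(inputs, targets, m=6, eta=0.5, alpha=0.7,
                   tol=1e-4, max_iter=20000, seed=0):
    rng = np.random.default_rng(seed)
    # Step 1: normalize inputs and outputs by their maximum values
    I = np.asarray(inputs, dtype=float) / max(inputs)    # l x 1
    T = np.asarray(targets, dtype=float) / max(targets)  # n x 1
    l, n = I.size, T.size
    # Step 3: random weights in [-1, 1], zero initial weight changes
    V = rng.uniform(-1.0, 1.0, (l, m))   # input -> hidden synapses
    W = rng.uniform(-1.0, 1.0, (m, n))   # hidden -> output synapses
    dV, dW = np.zeros_like(V), np.zeros_like(W)
    Ep = np.inf
    for _ in range(max_iter):
        # Steps 4-8: forward pass (linear input layer, sigmoid elsewhere)
        O_I = I
        O_H = sigmoid(V.T @ O_I)         # hidden-layer output, m x 1
        O_o = sigmoid(W.T @ O_H)         # network output, n x 1
        # Step 9: squared error for this training pair
        Ep = float(np.sum((T - O_o) ** 2))
        if Ep < tol:                     # Step 17: stop at convergence
            break
        # Step 10: output-layer error term {d}
        d = (T - O_o) * O_o * (1.0 - O_o)
        # Steps 11-12: hidden -> output weight change with momentum
        dW = alpha * dW + eta * np.outer(O_H, d)
        # Step 13: error term {d*} propagated back to the hidden layer
        d_star = (W @ d) * O_H * (1.0 - O_H)
        # Step 14: input -> hidden weight change with momentum
        dV = alpha * dV + eta * np.outer(O_I, d_star)
        # Step 15: update the synapse weights
        V, W = V + dV, W + dW
    return V, W, Ep
```

For example, `train_backprop([5, 7, 10, 7, 10, 6, 8, 10, 10, 8], [81, 95])` trains a 10-6-2 network on the memory scores and the two outputs discussed in Section VI.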
According to the observed values, the neural network process is carried out. The student exercise scores for the different memories are 5, 7, 10, 7, 10, 6, 8, 10, 10 and 8. The process weight values are presented as matrices.

Table 2: Initial assignment I -> H (initially assigned input-to-hidden-layer weights; one row per input neuron I1-I10, one column per hidden neuron H1-H6)

0.1159 0.7452 0.7667 0.3303 0.5616 0.0683
0.6439 0.4698 0.0191 0.2963 0.8009 0.6156
0.2143 0.5475 0.2102 0.3169 0.1742 0.4353
0.9549 0.5177 0.1248 0.0150 0.7018 0.2846
0.4923 0.1257 0.9708 0.0489 0.7574 0.9536
0.0792 0.3202 0.8221 0.3629 0.9702 0.0913
0.1030 0.4678 0.0615 0.0294 0.1558 0.3777
0.6844 0.4638 0.3015 0.5345 0.3129 0.2956
0.8698 0.8769 0.1957 0.9274 0.4688 0.0399
0.1469 0.6540 0.3309 0.5981 0.0100 0.3490

Table 3: Initial assignment H -> O (initially assigned hidden-layer-to-output weights; one row per hidden neuron H1-H6)

0.9532 0.4718
0.0451 0.0943
0.1912 0.4462
0.5433 0.1154
0.6347 0.8885
0.8214 0.0804

From these initial values, the output obtained at each level is recalculated, assigned as the input for the next iteration, and the error level is reduced. Initially the output was estimated at 81 for the learning cognitive load and 95 for the learning performance. The initially assigned weights produced 96 for the cognitive load and 89 for the learner performance, giving an error of -62.048. The iterative process was then run until the final values were obtained at zero error.
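As a sketch of how the forward pass (Steps 4-8) runs over these observed values, the scores can be normalized and multiplied through the Table 2 and Table 3 weights. The 10 x 6 and 6 x 2 row layouts and the sigmoid activation are assumptions carried over from the algorithm section.

```python
import numpy as np

sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

scores = np.array([5, 7, 10, 7, 10, 6, 8, 10, 10, 8], dtype=float)
I = scores / scores.max()             # Step 1: normalized inputs, 10 x 1

V = np.array([                        # Table 2: input -> hidden, 10 x 6
    [0.1159, 0.7452, 0.7667, 0.3303, 0.5616, 0.0683],
    [0.6439, 0.4698, 0.0191, 0.2963, 0.8009, 0.6156],
    [0.2143, 0.5475, 0.2102, 0.3169, 0.1742, 0.4353],
    [0.9549, 0.5177, 0.1248, 0.0150, 0.7018, 0.2846],
    [0.4923, 0.1257, 0.9708, 0.0489, 0.7574, 0.9536],
    [0.0792, 0.3202, 0.8221, 0.3629, 0.9702, 0.0913],
    [0.1030, 0.4678, 0.0615, 0.0294, 0.1558, 0.3777],
    [0.6844, 0.4638, 0.3015, 0.5345, 0.3129, 0.2956],
    [0.8698, 0.8769, 0.1957, 0.9274, 0.4688, 0.0399],
    [0.1469, 0.6540, 0.3309, 0.5981, 0.0100, 0.3490],
])
W = np.array([                        # Table 3: hidden -> output, 6 x 2
    [0.9532, 0.4718],
    [0.0451, 0.0943],
    [0.1912, 0.4462],
    [0.5433, 0.1154],
    [0.6347, 0.8885],
    [0.8214, 0.0804],
])

O_H = sigmoid(V.T @ I)                # Steps 5-6: hidden outputs, 6 x 1
O_o = sigmoid(W.T @ O_H)              # Steps 7-8: network outputs, 2 x 1
print(O_o)                            # cognitive load, learner performance
```

These initial outputs are then compared against the desired values and the error drives the iterative weight updates described above.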
The final weight values are presented below.

Table 4: Obtained weights I -> H (final input-to-hidden-layer weights; one row per input neuron I1-I10)

0.2318 1.4905 1.5335 0.6605 1.1232 0.1366
1.2879 0.9397 0.0382 0.5926 1.6019 1.2313
0.4286 1.0950 0.4204 0.6338 0.3484 0.8705
1.9099 1.0355 0.2495 0.0300 1.4036 0.5692
0.9846 0.2514 1.9416 0.0979 1.5147 1.9072
0.1583 0.6403 1.6442 0.7259 1.9405 0.1825
0.2060 0.9356 0.1229 0.0587 0.3116 0.7554
1.3687 0.9276 0.6030 1.0690 0.6258 0.5912
1.7397 1.7538 0.3913 1.8548 0.9375 0.0799
0.2938 1.3080 0.6618 1.1962 0.0200 0.6980

Table 5: Obtained weights H -> O (final hidden-layer-to-output weights; one row per hidden neuron H1-H6)

0.9532 0.4718
0.0451 0.0943
0.1912 0.4462
0.5433 0.1154
0.6347 0.8885
0.8214 0.0804

In the resulting network model, mental effort has the highest influence, at 95.32 percent, and physical load the least, at 4.51 percent. The advantage of this model is a smaller number of iterations and better performance compared with the standard back propagation model. To evaluate the algorithm, MATLAB code was designed and executed. The learning performance is inclined toward the performance factor of the cognitive load, giving the expected result. The entire model is presented according to the load along with the learning performance under the controlled weights. The load differs from one learner to another according to the different learning processes: the mental, physical, temporal, effort and frustration loads are low while performance is high.

VII. CONCLUSION

The back propagation model can be designed and evaluated with multiple learning domains. A knowledge repository could be created for different learning objectives with available and adaptable technologies. As a continuation of this research work, the six cognitive load factors may be extended with the environmental and socio-economic levels of learners.

VIII. BIBLIOGRAPHY

1. Newell, F., Carmichael, A., Gregor, P. and Alm, N. (2002). "Information technology for cognitive support". In The Human-Computer Interaction Handbook, 2, pp. 464-481.
2.
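The quoted percentages can be read directly from the first column of the final hidden-to-output weights in Table 5. Treating each weight toward the first output as a percentage influence of its cognitive-load factor is an interpretation of the text above, not code from the paper.

```python
# Hypothetical reading of Table 5: one weight per cognitive-load factor
# (hidden neurons H1-H6), scaled to a percentage influence.
loads = ["mental", "physical", "temporal", "performance", "effort", "frustration"]
first_column = [0.9532, 0.0451, 0.1912, 0.5433, 0.6347, 0.8214]

influence = {load: w * 100.0 for load, w in zip(loads, first_column)}
strongest = max(influence, key=influence.get)
weakest = min(influence, key=influence.get)
print(strongest, round(influence[strongest], 2))   # mental 95.32
print(weakest, round(influence[weakest], 2))       # physical 4.51
```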
Adefowoju, B. S. and Osofisan, A. O. (2004). "Cocoa Production Forecasting Using Artificial Neural Networks". International Centre for Mathematics and Computer Science, Nigeria. ICMCS, 117-136.
3. Baddeley, A. (1998). Human Memory. Boston: Allyn & Bacon.
4. Baddeley, A. D. (1992). Working Memory. Science, 255-256.
5. Bose, N. K. and Liang, P. (1996). Neural Networks Fundamentals with Graphs, Algorithms and Applications. McGraw-Hill: New York, NY.
6. Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.) (2000). How People Learn: Brain, Mind, Experience, and School: Expanded Edition. Washington, DC: National Academy Press.
7. Fougnie, D., & Marois, R. (2011). What limits working memory capacity? Evidence for attention to visual cortex. Journal of Cognitive Neuroscience, 23(9), 2593-2604.

AUTHOR PROFILE

A. Naveen is a research scholar in the Department of Computer Applications, Dr. M.G.R. Educational and Research Institute University, Chennai. He received his Master's degree in Computer Science from St. Joseph's College, Bharathidasan University, Trichy, in 2012. His areas of interest include data mining, cognitive computing and neural networks.

M.S. Josephine works in the Department of Computer Applications, Dr. M.G.R. Educational and Research Institute University, Chennai. She received her Master's degree (MCA) from St. Joseph's College, Bharathidasan University, her M.Phil. (Computer Science) from Periyar University, and her Doctorate in Computer Applications from Mother Teresa University. Her research interests include software engineering and expert systems.