PC603IT
Machine Learning
Instructor: Dr. A Srinivas Reddy
IF DATA HAD MASS, THE EARTH
WOULD BE A BLACK HOLE
• Computers capture and store terabytes of
data every day.
– Data from shops, banks, hospitals, scientific
laboratories, and many more.
– Data from MRI scans, DNA sequencing, social networks, sales transactions, etc.
• Challenge: How to use this data to make human life easier, grow businesses, etc.
• Machine learning helps solve these problems.
Why not humans?
• Complexity of data: Volume, Variety, and Velocity; high-dimensional data.
• Humans cannot handle such complex data:
– cannot understand it,
– cannot process it, and
– cannot analyze it.
Traditional Programming: the computer takes Data and a Program, and produces Output.
Machine Learning: the computer takes Data and Output (examples), and produces a Program (a model).
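The contrast can be sketched in code. In traditional programming a human writes the rule; in machine learning the "program" (here just a threshold) is derived from data and known outputs. The spam rule and the toy numbers below are illustrative assumptions, not from the slides.

```python
# Traditional programming: a human writes the rule by hand.
def spam_rule(message):
    return "win money" in message.lower()

# Machine learning: the "program" is learned from data + outputs.
# Toy sketch: learn the threshold separating two labelled groups of numbers.
def learn_threshold(values, labels):
    """Return the midpoint between the two classes' means."""
    pos = [v for v, y in zip(values, labels) if y == 1]
    neg = [v for v, y in zip(values, labels) if y == 0]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

threshold = learn_threshold([1.0, 2.0, 8.0, 9.0], [0, 0, 1, 1])

def predict(v):
    return 1 if v > threshold else 0

print(threshold)   # 5.0
print(predict(7))  # 1
```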
Magic?
No, more like gardening
• Seeds = Algorithms
• Nutrients = Data
• Gardener = You
• Plants = Programs
LEARNING
• Humans learn from experience.
– remembering, adapting, and generalizing,
reasoning, and logical deduction.
• Machines learn from data.
• How?
– They modify or adapt their actions to fit the data.
– Example: a human vs. computer game. The computer starts by learning from humans and eventually plays better than a human after a certain number of trials.
• Data mining: Extraction of
useful information (in the
form of patterns) from
massive datasets.
• Machine learning:
Classification, clustering,
outlier detection, etc.
• Machine learning: Includes training over data and
testing on data.
• Training: Learning from data
• Testing: Measuring the performance of the model
• ML Complexity:
– Training and testing complexity should be low
– so that models can be deployed on edge devices.
Types of Learning
• Supervised (inductive) learning
– Training data includes desired outputs.
• Unsupervised learning
– Training data does not include desired outputs.
• Reinforcement learning
– Rewards from sequence of actions.
• Evolutionary learning
– Biological evolution can be seen as a learning
process.
Supervised Machine learning
• Like human learning from past experiences.
• A computer does not have “experiences”.
• A computer system learns from data, which
represent some “past experiences” of an
application domain.
• Our focus: learn a target function that can be used
to predict the values of a discrete class attribute,
e.g., approved or not approved, high-risk or low-risk.
• Supervised learning task: regression, classification,
or inductive learning.
What do we mean by learning?
• Given
– a data set D,
– a task T, and
– a performance measure M,
a computer system is said to learn from D
to perform the task T if after learning the
system’s performance on T improves as
measured by M.
• In other words, the learned model helps
the system to perform T better as
compared to no learning.
An example: loan application data (approved or not)
An example
• Data: Loan application data
• Task: Predict whether a loan should be
approved or not.
• Performance measure: accuracy.
• No learning: classify all future applications (test data) into the majority class (i.e., Yes): Accuracy = 9/15 = 60%.
• We can do better than 60% with learning.
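The no-learning baseline can be computed directly, using the slide's counts of 9 "Yes" and 6 "No" labels:

```python
from collections import Counter

# The 15 training labels from the loan example: 9 "Yes", 6 "No".
labels = ["Yes"] * 9 + ["No"] * 6

# "No learning" baseline: always predict the majority class.
majority = Counter(labels).most_common(1)[0][0]
baseline_accuracy = labels.count(majority) / len(labels)
print(majority, baseline_accuracy)  # Yes 0.6
```

Any learned model is only useful if it beats this 60% baseline on the test data.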
Regression
• Regression problem: Fit a mathematical function
describing a curve, so that the curve passes as
close as possible to all the datapoints.
• It is generally a problem of function approximation or interpolation: working out the value between values that we already know.
• Given a dataset <xi, ti> of N points,
• what function t = f(x) fits all the points?
• Candidate forms: t = ax^3 + bx^2 + cx + d,
• t = 3 sin(5x)
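A minimal sketch of this: sample points from the slide's example curve t = 3 sin(5x) and fit the cubic form by least squares with NumPy. The sample range and number of points are illustrative assumptions.

```python
import numpy as np

# Sample N = 20 points from the slide's example curve t = 3 sin(5x).
x = np.linspace(0, 1, 20)
t = 3 * np.sin(5 * x)

# Fit the cubic t ≈ ax^3 + bx^2 + cx + d by least squares.
a, b, c, d = np.polyfit(x, t, deg=3)
t_hat = np.polyval([a, b, c, d], x)

# The fitted curve should pass close to the data points.
print(np.max(np.abs(t - t_hat)))
```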
Regression
Classification: Data and the goal
• Data: A set of data records (also called examples, instances, or cases) described by
– k attributes: A1, A2, …, Ak.
– a class: each example is labelled with a pre-defined class.
• Goal: To learn a classification model from the data that can be used to predict the classes of new (future, or test) cases/instances.
An example: the learning task
• Learn a classification model from the data
• Use the model to classify future loan applications
into
– Yes (approved) and
– No (not approved)
• What is the class for the following case/instance?
Supervised vs. unsupervised
Learning
• Supervised learning: classification is seen
as supervised learning from examples.
– Supervision: The data (observations, measurements, etc.) are labeled with pre-defined classes, as if a “teacher” gives the classes (supervision).
– Test data are classified into these classes too.
• Unsupervised learning (clustering)
– Class labels of the data are unknown
– Given a set of data, the task is to establish the
existence of classes or clusters in the data
Supervised learning process: two steps
• Learning (training): Learn a model using the training data.
• Testing: Test the model using unseen test data to assess the model accuracy:
Accuracy = (Number of correct classifications) / (Total number of test cases)
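The accuracy measure can be written directly as a small helper; the example predictions and labels below are made up for illustration:

```python
def accuracy(predictions, truths):
    """Accuracy = number of correct classifications / total number of test cases."""
    correct = sum(p == y for p, y in zip(predictions, truths))
    return correct / len(truths)

# Two of three test cases classified correctly.
print(accuracy(["Yes", "No", "Yes"], ["Yes", "Yes", "Yes"]))  # 0.666...
```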
• How do we classify these coins?
• Select the list of features.
• What happens if there are too many features?
– the number of datapoints required increases.
Fundamental assumption of learning
Assumption: The distribution of training
examples is identical to the distribution of
test examples (including future unseen
examples).
• In practice, this assumption is often violated to a certain degree.
• Strong violations will clearly result in poor
classification accuracy.
• To achieve good accuracy on the test data,
training examples must be sufficiently
representative of the test data.
THE MACHINE LEARNING
PROCESS
• A process by which machine learning algorithms are selected, applied, and evaluated for a given problem:
• Data Collection and Preparation
• Feature Selection
• Algorithm Choice
• Parameter and Model Selection
• Training
• Evaluation
SOME TERMINOLOGY
• Inputs: vector x = (x1, x2, x3, …, xm) of m features.
• Weights wij: the weighted connections between nodes i and j, collected in the weight matrix W = [wij].
• Outputs yj, j = 1 to n (the number of outputs); the model is y(x, W).
• Targets: vector t with components tj, j = 1 to n.
• Activation Function: for neural networks, g(·) is a mathematical function that describes the firing of the neuron as a response to the weighted inputs, such as a threshold function.
• Error E: the difference between the outputs y and the targets t.
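A minimal sketch tying the terminology together for a single neuron with a threshold activation. The input values, weights, and threshold below are illustrative assumptions, not from the slides.

```python
def g(h, theta=0.0):
    """Threshold activation: fire (1) if the weighted sum h exceeds theta."""
    return 1 if h > theta else 0

def neuron_output(x, w):
    """y = g(sum_i w_i * x_i) for one neuron with inputs x and weights w."""
    h = sum(wi * xi for wi, xi in zip(w, x))
    return g(h)

x = [1.0, 0.5, -1.0]   # input vector
w = [0.8, 0.4, 0.3]    # weights
t = 1                  # target
y = neuron_output(x, w)
error = t - y          # error E: difference between target and output
print(y, error)  # 1 0
```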
Weight Space
• Equation of a plane: w1x1 + w2x2 + w3x3 + d = 0
• Axes: mutually orthogonal
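As a sketch, points in a 3D input space can be separated by the sign of the plane equation; the weights and test points below are illustrative assumptions.

```python
# Decision plane w1*x1 + w2*x2 + w3*x3 + d = 0 in 3D input space.
w = [1.0, -2.0, 0.5]   # illustrative weights
d = -0.25

def side(x):
    """Sign of the plane equation at point x: > 0 on one side, < 0 on the other."""
    return sum(wi * xi for wi, xi in zip(w, x)) + d

print(side([1.0, 0.0, 0.0]))   # 0.75  -> one side of the plane
print(side([0.0, 1.0, 0.0]))   # -2.25 -> the other side
```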