UNIT-3
The Greedy Method: Introduction, Huffman Trees and codes, Minimum
Coin Change problem, Knapsack problem, Job sequencing with deadlines,
Minimum Cost Spanning Trees, Single Source Shortest paths.
Q) Define the following terms.
i. Feasible solution ii. Objective function iii. Optimal solution
Feasible Solution: Any subset of the inputs that satisfies the given constraints is called a feasible solution.
Objective Function: The function that is to be maximized or minimized over the feasible solutions is called the objective function.
Optimal Solution: Any feasible solution that maximizes or minimizes the given objective function is called an optimal solution.
Q) Describe Greedy technique with an example.
Greedy method constructs a solution to an optimization problem piece
by piece through a sequence of choices that are:
 feasible, i.e., it satisfies the given constraints.
 locally optimal, i.e., it is the best local choice among all feasible choices available at that step.
 irrevocable, i.e., once made, it cannot be changed on subsequent steps of the algorithm.
For some problems, it yields a globally optimal solution for every
instance.
The following is the general greedy control abstraction for the subset paradigm.
Algorithm Greedy(a, n)
// a[1:n] contains the n inputs
{
    solution := ∅;  // initialize to the empty set
    for i := 1 to n do
    {
        x := select(a);
        if Feasible(solution, x) then
            solution := union(solution, x);
    }
    return solution;
}
Eg. Minimum Coin Change:
Given unlimited amounts of coins of denominations d1 > … > dm ,
give change for amount n with the least number of coins.
here, d1 = 25c, d2 =10c, d3 = 5c, d4 = 1c and n = 48c
Greedy approach: At each step we take a maximum denomination
coin which is less than or equal to remaining amount required.
Step 1: 48 – 25 = 23
Step 2: 23 – 10 = 13
Step 3: 13 – 10 = 03
Step 4: 03 – 01 = 02
Step 5: 02 – 01 = 01
Step 6: 01 – 01 = 00
Solution: <1, 2, 0, 3>, i.e., d1 – 1 coin, d2 – 2 coins, d3 – 0 coins and d4 – 3 coins.
The greedy solution is optimal for any amount with a "normal" set of denominations (for arbitrary denomination sets it may be suboptimal).
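The greedy steps above can be sketched in Python (a minimal illustration; the function name and structure are ours, not part of the original notes):

```python
def greedy_change(amount, denominations):
    """Greedy coin change: repeatedly take as many coins as possible of the
    largest denomination that does not exceed the remaining amount."""
    counts = []
    for d in sorted(denominations, reverse=True):
        counts.append(amount // d)   # coins of this denomination
        amount %= d                  # remaining amount to change
    return counts

# Example from the text: denominations 25c, 10c, 5c, 1c and n = 48c.
print(greedy_change(48, [25, 10, 5, 1]))   # -> [1, 2, 0, 3]
```

Each step of the loop mirrors one subtraction sequence in the worked example: 48 – 25, then two 10s, then three 1s.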
Q) Explain Huffman tree and Huffman code with suitable example.
A Huffman tree is a binary tree whose edges are labeled with 0s and 1s; it yields a prefix-free code for the characters assigned to its leaves.
Huffman coding, or prefix coding, is a lossless data compression algorithm. The idea is to assign variable-length codes to the input characters; the lengths of the assigned codes are based on the frequencies of the corresponding characters.
Algorithm to build Huffman tree:
// Input is an array of unique characters along with their frequency of
occurrences and output is Huffman Tree.
1. Create a leaf node for each unique character and build a min heap of all
leaf nodes.
2. Extract the two nodes with the minimum frequency from the min heap.
3. Create a new internal node with a frequency equal to the sum of the two nodes' frequencies. Make the first extracted node its left child and the other extracted node its right child. Add this node to the min heap.
4. Repeat steps 2 and 3 until the heap contains only one node. The remaining node is the root, and the tree is complete.
Time complexity: O(n log n), where n is the number of unique characters. If there are n leaves, extractMin() is called 2(n – 1) times, and each call takes O(log n) time as it calls minHeapify(). So the overall complexity is O(n log n).
Eg.
character:  A     B     C     D     _
frequency:  0.35  0.1   0.2   0.2   0.15
Without Huffman coding, a fixed-length encoding would assign the code words 001, 010, 011, 100 and 101, i.e., on average we need 3 bits to represent a character.
Steps 1–5 (the tree-construction diagrams are omitted in this text version; they follow from repeatedly merging the two smallest-frequency nodes):
Step 1: build a min heap of the five leaves.
Step 2: merge B (0.1) and _ (0.15) into a node of frequency 0.25.
Step 3: merge C (0.2) and D (0.2) into a node of frequency 0.4.
Step 4: merge the 0.25 node and A (0.35) into a node of frequency 0.6.
Step 5: merge the 0.4 and 0.6 nodes into the root of frequency 1.0.
Therefore, the codewords we get after using Huffman coding are
character:  A     B     C     D     _
frequency:  0.35  0.1   0.2   0.2   0.15
codeword:   11    100   00    01    101
Average bits per character using Huffman coding
= 2*0.35 + 3*0.1 + 2*0.2 + 2*0.2 + 3*0.15
= 2.25
Therefore, compression ratio: (3 - 2.25)/3*100% = 25%
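The heap-based construction above can be sketched in Python (a minimal illustration that tracks only code lengths, which is all the average-bits calculation needs; the function name is ours):

```python
import heapq

def huffman_code_lengths(freqs):
    """Build a Huffman tree bottom-up with a min heap; return each
    character's code length (its depth in the final tree)."""
    # Heap entries: (frequency, tie-breaker id, {char: depth so far}).
    heap = [(f, i, {c: 0}) for i, (c, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)   # two smallest frequencies
        f2, _, d2 = heapq.heappop(heap)
        # Merging pushes every character in both subtrees one level deeper.
        merged = {c: d + 1 for c, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

freqs = {'A': 0.35, 'B': 0.1, 'C': 0.2, 'D': 0.2, '_': 0.15}
lengths = huffman_code_lengths(freqs)
avg = sum(freqs[c] * lengths[c] for c in freqs)
print(lengths, round(avg, 2))   # code lengths A:2 B:3 C:2 D:2 _:3; average 2.25
```

The resulting lengths match the codeword table above (11, 100, 00, 01, 101), and the weighted average reproduces the 2.25 bits-per-character figure.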
Q) Briefly explain about knapsack problem with an example.
Knapsack Problem
Given a set of items, each with a weight and a value, determine a subset of
items to include in a collection so that the total weight is less than or equal
to a given limit and the total value is as large as possible.
Fractional Knapsack
In this case, items can be broken into smaller pieces, hence we can select
fractions of items.
According to the problem statement,
 There are n items in the store
 Weight of the ith item is wi > 0
 Profit of the ith item is pi > 0, and
 Capacity of the knapsack is W
In this version of the knapsack problem, items can be broken into smaller pieces, so the thief may take only a fraction xi of the ith item, where
0 ≤ xi ≤ 1
The ith item contributes weight xi·wi to the total weight in the knapsack and profit xi·pi to the total profit.
Hence, the objective of this algorithm is to
Maximize ∑(i=1 to n) pi·xi
subject to the constraint
∑(i=1 to n) wi·xi ≤ W
It is clear that an optimal solution must fill the knapsack exactly; otherwise we could add a fraction of one of the remaining items and increase the overall profit.
Thus, an optimal solution satisfies
∑(i=1 to n) wi·xi = W
Algorithm GreedyKnapsack(m, n)
// p[1:n] and w[1:n] contain the profits and weights respectively
// all n objects are ordered so that p[i]/w[i] ≥ p[i+1]/w[i+1]
// m is the knapsack size and x[1:n] is the solution vector
{
    for i := 1 to n do
        x[i] := 0.0;
    u := m;
    for i := 1 to n do
    {
        if (w[i] > u) then break;
        x[i] := 1.0;
        u := u - w[i];
    }
    if (i ≤ n) then
        x[i] := u / w[i];
}
Analysis
If the provided items are already sorted into decreasing order of pi/wi, then the for loop takes O(n) time; therefore, the total time including the sort is O(n log n).
Eg. Let us consider that the capacity of the knapsack is W = 60 and the provided items are as shown in the following table:
Item:    A    B    C    D
Profit:  280  100  120  120
Weight:  40   10   20   24
Step 1: find the p/w ratio for each item.
Item:         A    B    C    D
Profit:       280  100  120  120
Weight:       40   10   20   24
Ratio pi/wi:  7    10   6    5
Step 2: The provided items are not yet sorted by pi/wi. After sorting, they are as shown in the following table.
Item:         B    A    C    D
Profit:       100  280  120  120
Weight:       10   40   20   24
Ratio pi/wi:  10   7    6    5
Step3:
We choose 1st item B as weight of B is less than the capacity of the
knapsack.
Now knapsack contains weight = 60 – 10 = 50
Step4:
item A is chosen, as the available capacity of the knapsack is greater than
the weight of A.
Now knapsack contains weight = 50 – 40 = 10
Step 5: C is chosen as the next item. However, the whole item cannot be chosen, as the remaining capacity of the knapsack (10) is less than the weight of C (20).
Hence, a fraction of C, namely 10/20 = 1/2, is chosen.
Now the capacity of the knapsack equals the weight of the selected items, so no more items can be selected.
The total weight of the selected items is 10 + 40 + 20 * (10/20) = 60
And the total profit is 100 + 280 + 120 * (10/20) = 380 + 60 = 440
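The greedy procedure and the worked example above can be sketched in Python (a minimal illustration; the function name is ours):

```python
def greedy_knapsack(capacity, items):
    """Fractional knapsack: take items in decreasing profit/weight order,
    splitting the last item if it does not fit whole."""
    items = sorted(items, key=lambda pw: pw[0] / pw[1], reverse=True)
    total_profit, remaining = 0.0, capacity
    for profit, weight in items:
        if weight <= remaining:              # whole item fits
            total_profit += profit
            remaining -= weight
        else:                                # take a fraction and stop
            total_profit += profit * remaining / weight
            break
    return total_profit

# Example from the text: W = 60, items given as (profit, weight) pairs.
items = [(280, 40), (100, 10), (120, 20), (120, 24)]
print(greedy_knapsack(60, items))   # -> 440.0
```

The sort corresponds to Step 2 above, and the loop reproduces Steps 3–5: B and A in full, then half of C.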
Q) Explain job sequencing with deadlines in detail with an example.
We are given a set of n jobs. Associated with job i is an integer deadline di ≥
0 and a profit pi>0. For any job i, the profit pi is earned iff the job is
completed by its deadline.
To complete a job one has to process the job on a machine for one unit of
time. Only one machine is available for processing jobs.
A feasible solution for this problem is a subset J of jobs such that each job
in this subset can be completed by its deadline.
The value of a feasible solution J is the sum of the profits of the jobs in J, i.e., ∑(i∈J) pi.
An optimal solution is a feasible solution with maximum value.
Eg. (The example table and the algorithms referred to below appear as figures in the original slides and are omitted in this text version.)
The example uses the exhaustive technique, in which we check all feasible one- and two-job possibilities; the optimal is the 3rd sequence, i.e., the sequence 4, 1.
A high-level description of job sequencing is followed by JS, its concrete implementation.
The algorithm JS assumes that the jobs are already sorted so that p1 ≥ p2 ≥ … ≥ pn. Further, it assumes that n ≥ 1 and that the deadline d[i] of each job i is at least 1.
For the algorithm JS there are two parameters in terms of which its time complexity can be measured:
1. the number of jobs, n
2. the number of jobs included in the solution J, which is s.
The while loop in the algorithm is iterated at most k times, and each iteration takes O(1) time.
The body of the if conditional requires O(k – r) time to insert a job i.
Hence the total time for each iteration of the for loop is O(k). This loop is iterated n – 1 times.
If s is the final value of k, that is, the number of jobs in the final solution, then the total time needed by the algorithm is O(sn). Since s ≤ n, in the worst case the time complexity is O(n²).
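Since JS itself appears only as a figure in the original slides, here is a minimal Python sketch of the greedy method described above: consider jobs in decreasing profit order and place each in the latest free time slot not after its deadline. The job instance used below is hypothetical, not the slides' example:

```python
def job_sequencing(profits, deadlines):
    """Greedy job sequencing with deadlines (unit processing times):
    schedule each job, in decreasing profit order, in the latest free
    slot t with 1 <= t <= its deadline; skip it if no slot is free."""
    n = len(profits)
    order = sorted(range(n), key=lambda i: profits[i], reverse=True)
    max_d = max(deadlines)
    slot = [None] * (max_d + 1)         # slot[t] = job scheduled at time t
    for i in order:
        t = min(deadlines[i], max_d)
        while t >= 1 and slot[t] is not None:   # scan backwards for a free slot
            t -= 1
        if t >= 1:
            slot[t] = i
    chosen = [i for i in slot[1:] if i is not None]
    return chosen, sum(profits[i] for i in chosen)

# Hypothetical instance: 4 jobs, profits and deadlines as below.
print(job_sequencing([100, 10, 15, 27], [2, 1, 2, 1]))   # -> ([3, 0], 127)
```

The backward slot scan is the O(k – r) insertion step in the analysis above, giving the same O(sn) behaviour.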
Q) What is minimum spanning tree?
i) Explain Prim’s algorithm with an example.
ii) Explain Kruskal’s algorithm with an example.
A spanning tree of an undirected connected graph is a connected acyclic subgraph (i.e., a tree) that contains all the vertices of the graph.
If such a graph has weights assigned to its edges, a minimum spanning tree is a spanning tree of the smallest weight, where the weight of a tree is defined as the sum of the weights of all its edges.
The minimum spanning tree problem is the problem of finding a minimum spanning tree for a given weighted connected graph.
Eg. (The figure is omitted in this text version.) In the figure, (a) is the given graph and (b), (c) are two different spanning trees; (c) is the minimum spanning tree, as it has a lower cost than (b).
i. Prim’s algorithm:
 Start with a tree T1 consisting of one (any) vertex and "grow" the tree one vertex at a time to produce the MST through a series of expanding subtrees T1, T2, …, Tn.
 On each iteration, construct Ti+1 from Ti by adding the vertex not in Ti that is closest to those already in Ti (this is the "greedy" step!).
 Stop when all vertices are included.
 Needs a priority queue for locating the closest fringe (not-yet-visited) vertex.
 Efficiency:
i. O(n²) for the weight-matrix representation of the graph and an array implementation of the priority queue
ii. O(m log n) for the adjacency-list representation of a graph with n vertices and m edges and a min-heap implementation of the priority queue
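The steps above can be sketched in Python with a min heap (a minimal illustration on a small hypothetical graph, since the slides' example diagrams are not reproduced here):

```python
import heapq

def prim_mst(graph, start):
    """Prim's algorithm: grow the tree from `start`, always adding the
    closest fringe vertex via a min heap of candidate edges.
    graph: {vertex: [(weight, neighbour), ...]}, undirected."""
    visited = {start}
    heap = list(graph[start])
    heapq.heapify(heap)
    total = 0
    while heap:
        w, u = heapq.heappop(heap)      # cheapest edge to a fringe vertex
        if u in visited:
            continue                     # stale entry: u joined the tree already
        visited.add(u)
        total += w
        for edge in graph[u]:
            if edge[1] not in visited:
                heapq.heappush(heap, edge)
    return total

# Hypothetical 4-vertex weighted graph.
g = {'a': [(1, 'b'), (4, 'c')],
     'b': [(1, 'a'), (2, 'c'), (6, 'd')],
     'c': [(4, 'a'), (2, 'b'), (3, 'd')],
     'd': [(6, 'b'), (3, 'c')]}
print(prim_mst(g, 'a'))   # -> 6  (edges a-b, b-c, c-d)
```

With the heap holding candidate edges, this is the O(m log n) adjacency-list variant from the efficiency notes above.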
Eg. 1 and Eg. 2: (the worked-example diagrams appear in the original slides and are omitted in this text version.)
ii. Kruskal’s algorithm:
 Sort the edges in nondecreasing order of lengths
 “Grow” tree one edge at a time to produce MST through a series
of expanding forests F1, F2, …, Fn-1
 On each iteration, add the next edge on the sorted list unless
this would create a cycle. (If it would, skip the edge.)
 Algorithm looks easier than Prim’s but is harder to implement
(checking for cycles!)
 Cycle checking: a cycle is created iff added edge connects vertices in
the same connected component
 Runs in O(m log m) time, with m = |E|. The time is mostly spent on
sorting.
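The steps above can be sketched in Python with a simple union-find for the cycle check (a minimal illustration on a small hypothetical graph; names are ours):

```python
def kruskal_mst(n, edges):
    """Kruskal's algorithm: scan edges in nondecreasing weight order and
    add an edge unless it would create a cycle, i.e. unless both endpoints
    are already in the same connected component (union-find check)."""
    parent = list(range(n))

    def find(x):                        # root of x's component, with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total = 0
    for w, u, v in sorted(edges):       # nondecreasing order of weights
        ru, rv = find(u), find(v)
        if ru != rv:                    # different components: no cycle
            parent[ru] = rv             # union the two components
            total += w
    return total

# Hypothetical graph on vertices 0..3 as a weighted edge list (weight, u, v).
edges = [(1, 0, 1), (4, 0, 2), (2, 1, 2), (6, 1, 3), (3, 2, 3)]
print(kruskal_mst(4, edges))   # -> 6
```

The `sorted` call is where the O(m log m) time goes, matching the note above that sorting dominates.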
Q) Explain in detail the single source shortest path problem.
Single Source Shortest Paths Problem: Given a weighted connected
(directed) graph G, find shortest paths from source vertex s to each of the
other vertices.
Dijkstra’s algorithm: Similar to Prim’s MST algorithm, with a different way
of computing numerical labels: Among vertices not already in the tree, it
finds vertex u with the smallest sum
dv + w(v,u)
where
v is a vertex for which the shortest path has already been found on preceding iterations (such vertices form a tree rooted at s),
dv is the length of the shortest path from source s to v, and
w(v,u) is the length (weight) of the edge from v to u.
 Doesn’t work for graphs with negative weights
 Applicable to both undirected and directed graphs
 Efficiency
o O(|V|²) for graphs represented by a weight matrix and an array implementation of the priority queue
o O(|E| log |V|) for graphs represented by adjacency lists and a min-heap implementation of the priority queue
Eg. 2 (the diagrams for Eg. 1 appear in the original slides and are omitted here):
The shortest paths and their lengths are:
From a to b: a – b of length 3
From a to d: a – b – d of length 5
From a to c: a – b – c of length 7
From a to e: a – b – d – e of length 9
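Dijkstra's algorithm can be sketched in Python with a min heap (a minimal illustration; the edge weights below are inferred from the path lengths listed above, since the original diagram is not reproduced):

```python
import heapq

def dijkstra(graph, source):
    """Dijkstra's algorithm: repeatedly settle the unvisited vertex u with
    the smallest dv + w(v, u), using a min heap of tentative distances.
    graph: {vertex: [(weight, neighbour), ...]}."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, v = heapq.heappop(heap)
        if d > dist.get(v, float('inf')):
            continue                     # stale heap entry
        for w, u in graph.get(v, []):
            if d + w < dist.get(u, float('inf')):
                dist[u] = d + w          # relax edge (v, u)
                heapq.heappush(heap, (dist[u], u))
    return dist

# Edges inferred from the example: a-b = 3, b-c = 4, b-d = 2, d-e = 4.
g = {'a': [(3, 'b')],
     'b': [(3, 'a'), (4, 'c'), (2, 'd')],
     'c': [(4, 'b')],
     'd': [(2, 'b'), (4, 'e')],
     'e': [(4, 'd')]}
print(dijkstra(g, 'a'))   # distances: b 3, d 5, c 7, e 9, as in the example
```

The heap-based relaxation gives the O(|E| log |V|) bound quoted above; note this sketch, like Dijkstra's algorithm itself, assumes non-negative edge weights.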