Front End vs Back End of a Compiler
 The phases of a compiler are collected into a front end and a back end.
 The front end consists of those phases that depend primarily on the source program. These normally include lexical and syntax analysis, semantic analysis, and the generation of intermediate code.
Front End vs Back End of a Compiler (cont’d)
 A certain amount of code optimization can be done by the front end as well.
Front End vs Back End of a Compiler (cont’d)
 The back end includes the code optimization phase and the final code generation phase, along with the necessary error handling and symbol table operations.
Front End vs Back End of a Compiler (cont’d)
 The front end analyzes the source program and produces intermediate code, while the back end synthesizes the target program from the intermediate code.
 A naive approach might simply run the phases serially.
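Below is a minimal, self-contained Python sketch of this split (not from the slides; the functions and the one-statement toy language are invented for illustration): the front end lexes, parses, and checks a single assignment and emits three-address intermediate code, and the back end turns that intermediate code into pseudo target instructions.

import re

# Toy front end: lex + parse a single "name = NUMBER op NUMBER" statement and
# emit three-address code as the intermediate representation.
def front_end(source):
    tokens = re.findall(r"[A-Za-z_]\w*|\d+|[+*/=\-]", source)   # lexical analysis
    name, eq, lhs, op, rhs = tokens                             # (trivial) syntax analysis
    assert eq == "=", "expected an assignment"                  # (trivial) semantic check
    return [("binop", op, lhs, rhs, "t1"),                      # intermediate code
            ("store", "t1", name)]

# Toy back end: translate each intermediate instruction into a pseudo
# target-machine instruction (no real optimization here).
def back_end(ir):
    out = []
    for instr in ir:
        if instr[0] == "binop":
            _, op, a, b, dst = instr
            out.append(f"{dst} := {a} {op} {b}")
        else:                                # ("store", src, dst)
            _, src, dst = instr
            out.append(f"MOV {dst}, {src}")
    return "\n".join(out)

print(back_end(front_end("x = 2 + 3")))      # t1 := 2 + 3, then MOV x, t1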
Front End vs Back End of a Compiler (cont’d)
 It is also tempting to compile several different languages into the same intermediate language and use a common back end for the different front ends, thereby obtaining several compilers for one machine.
 However, because of subtle differences in the viewpoints of different languages, there has been only limited success in this direction.
Passes
 In an implementation of a compiler, portions of one or more phases are combined into a module called a pass.
Passes (cont’d)
 Several phases of a compiler are usually implemented in a single pass consisting of reading an input file and writing an output file.
 It is common for several phases to be grouped into one pass and for the activity of these phases to be interleaved during the pass.
Passes (cont’d)
 For example, lexical analysis, syntax analysis, semantic analysis, and intermediate code generation might be grouped into one pass.
 If so, the token stream after lexical analysis may be translated directly into intermediate code.
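A small illustration of such interleaving (hypothetical, not from the slides): the lexical analyzer is a Python generator, so the translator pulls one token at a time and emits intermediate code as it goes, instead of reading a complete token file produced by an earlier pass.

import re

def tokens(source):
    # Lexical analysis as a generator: tokens are produced on demand.
    for match in re.finditer(r"\d+|[+\-]", source):
        yield match.group()

def translate(source):
    # Syntax analysis and intermediate code generation interleaved with lexing.
    ir, stream = [], tokens(source)
    left = next(stream)
    for op in stream:
        right = next(stream)
        temp = f"t{len(ir) + 1}"
        ir.append(("binop", op, left, right, temp))
        left = temp
    return ir

print(translate("1+2-3"))
# [('binop', '+', '1', '2', 't1'), ('binop', '-', 't1', '3', 't2')]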
Passes (cont’d)
 A pass reads the source program or the output of the previous pass, makes the transformations specified by its phases, and writes its output into an intermediate file, which may then be read by a subsequent pass.
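For contrast, a sketch of a multi-pass driver under the same caveat (the file names and the stand-in passes are made up): each pass reads the previous pass's file, applies its transformation, and writes a new intermediate file for the next pass.

def run_passes(source_path, passes):
    current = source_path
    for i, pass_fn in enumerate(passes, start=1):
        with open(current) as f:
            text = f.read()
        current = f"{source_path}.pass{i}"          # intermediate file for this pass
        with open(current, "w") as f:
            f.write(pass_fn(text))
    return current                                  # final output file

# Usage with two stand-in passes (a real compiler's passes would lex, parse, ...).
with open("demo.src", "w") as f:
    f.write("a = 1 + 2")
final = run_passes("demo.src", [str.upper, lambda t: t.replace(" ", "")])
print(open(final).read())                           # A=1+2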
Multi-Pass Compiler
 A multi-pass compiler can be made to use less space than a single-pass compiler, since the space occupied by the compiler program for one pass can be reused by the following pass.
Multi-Pass Compiler (cont’d)
 A multi-pass compiler is of course slower than a single-pass compiler, because each pass reads and writes an intermediate file.
Multi-Pass Compiler (cont’d)
 Thus, compilers running on computers with small memory would normally use several passes, while on a computer with a large random-access memory, a compiler with fewer passes would be possible.
Reducing the Number of Passes
It is desirable to have relatively few passes, since it takes time to read and write intermediate files.
On the other hand, if we group several phases into one pass, we may be forced to keep the entire program in memory.
Reducing the Number of Passes (cont’d)
This is because one phase may need information in a different order than a previous phase produces it.
Reducing the Number of Passes (cont’d)
The internal form of the program may be considerably larger than either the source program or the target program, so this space requirement is not a trivial matter.
Reducing the Number of Passes (cont’d)
For some phases, grouping into one pass presents few problems. For example, the interface between the lexical and syntax analyzers can often be limited to a single token.
Reducing the Number of Passes (cont’d)
On the other hand, it is often very hard to perform code generation until the intermediate representation has been completely generated.
Reducing the Number of Passes (cont’d)
For example, languages like PL/I and Algol 68 permit variables to be used before they are declared. We cannot generate the target code for a construct if we do not know the types of the variables involved in that construct.
Reducing the Number of Passes (cont’d)
 Similarly, most languages allow gotos that jump forward in the code.
 We cannot determine the target address of such a jump until we have seen the intervening source code and generated target code for it.
Reducing the Number of Passes (cont’d)
 In some cases, it is possible to leave a blank slot for missing information and fill in the slot when the information becomes available.
 In particular, intermediate and target code generation can often be merged into one pass using a technique called “backpatching”.
Reducing the Number of Passes (cont’d)
We can combine the actions of the passes as follows. On encountering an assembly statement that is a forward reference, say
GOTO target,
Reducing the Number of Passes (cont’d)
we generate a skeletal instruction with the machine operation code for GOTO and blanks for the address. All instructions with blanks for the address of target are kept in a list associated with the symbol table entry for target.
Reducing the Number of Passes (cont’d)
The blanks are filled in when we finally encounter an instruction such as
target : MOV R1, bar
Reducing the Number of Passes (cont’d)
and determine the value of target; it is the address of the current instruction. We then backpatch by going down the list for target of all the instructions that need its address, substituting the address of target for the blanks in the address fields of those instructions.
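A compact sketch of this mechanism in Python (illustrative only; a real assembler works on binary encodings, and here labels are written on their own line rather than as "target : MOV R1, bar"): forward GOTOs get a blank address and are queued on a per-label list, and that list is drained, filling in the blanks, once the label's address becomes known.

def assemble(lines):
    code, labels, pending = [], {}, {}
    for line in lines:
        if line.endswith(":"):                       # label definition
            label = line[:-1]
            labels[label] = len(code)                # address = index of next instruction
            for slot in pending.pop(label, []):      # backpatch every waiting GOTO
                code[slot] = ("GOTO", labels[label])
        elif line.startswith("GOTO"):
            target = line.split()[1]
            if target in labels:                     # backward reference: address known
                code.append(("GOTO", labels[target]))
            else:                                    # forward reference: leave a blank
                pending.setdefault(target, []).append(len(code))
                code.append(("GOTO", None))
        else:
            code.append((line, None))                # any other instruction
    return code

print(assemble(["GOTO target", "MOV R1, bar", "target:", "MOV R2, bar"]))
# [('GOTO', 2), ('MOV R1, bar', None), ('MOV R2, bar', None)]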
Reducing the Number of Passes (cont’d)
This approach is easy to implement if the instructions can be kept in memory until all target addresses can be determined.
Reducing the Number of Passes (cont’d)
This approach is a reasonable one for an assembler that can keep all its output in memory, since the intermediate and final representations of code for an assembler are roughly the same.
Reducing the Number of Passes (cont’d)
 Since those representations are surely of approximately the same length, backpatching over the length of the entire assembly program is not infeasible.
 However, in a compiler with a space-consuming intermediate code, we may need to be careful about the distance over which backpatching occurs.
Compiler Construction Tools
 A number of tools have been developed specifically to help construct compilers. These tools, variously called compiler-compilers, compiler-generators, or translator-writing systems, produce a compiler from some form of specification of a source language and a target machine language.
Compiler Construction Tools (cont’d)
 Largely, they are oriented around a particular model of languages, and they are most suitable for generating compilers of languages similar to that model.
Compiler Construction Tools (cont’d)
 For example, it is tempting to assume that lexical analyzers for all languages are essentially the same, except for the particular keywords and signs recognized.
Compiler Construction Tools (cont’d)
 Many compiler-compilers do in fact produce fixed lexical analysis routines for use in the generated compiler.
Compiler Construction Tools (cont’d)
 These routines differ only in the list of keywords recognized, and this list is all that needs to be supplied by the user. The approach is valid, but may be unworkable if it is required to recognize nonstandard tokens, such as identifiers that may include certain characters other than letters and digits.
Compiler Construction Tools (cont’d)
 Some general tools have been created for the automatic design of specific compiler components.
 These tools use specialized languages for specifying and implementing the component, and many use algorithms that are quite sophisticated.
Compiler Construction Tools (cont’d)
 The most successful tools are those that hide the details of the generation algorithm and produce components that can be easily integrated into the remainder of a compiler.
Compiler Construction Tools (cont’d)
The following is a list of some useful compiler construction tools:
1) Parser Generators
2) Scanner Generators
3) Syntax-Directed Translation Engines
4) Automatic Code Generators
5) Data Flow Engines
1) Parser Generators
 These produce syntax analyzers, normally from input that is based on a context-free grammar. In early compilers, syntax analysis consumed not only a large fraction of the running time of a compiler but also a large fraction of the intellectual effort of writing a compiler.
Parser Generators (cont’d)
 This phase is now considered one of the easiest to implement.
 Many parser generators utilize powerful parsing algorithms that are too complex to be carried out by hand.
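As a rough illustration (hypothetical; real parser generators usually emit table-driven LR or LL parsers rather than code like this), here is the kind of syntax analyzer one might obtain for the tiny context-free grammar expr -> term (('+' | '-') term)*, term -> NUMBER:

import re

def parse_expr(source):
    toks = re.findall(r"\d+|[+\-]", source)
    pos = 0

    def term():
        nonlocal pos
        tok = toks[pos]
        if not tok.isdigit():
            raise SyntaxError(f"expected a number, got {tok!r}")
        pos += 1
        return ("num", tok)

    node = term()
    while pos < len(toks):            # expr -> term (('+' | '-') term)*
        op = toks[pos]
        pos += 1
        node = (op, node, term())     # build the parse tree left to right
    return node

print(parse_expr("1+2-3"))
# ('-', ('+', ('num', '1'), ('num', '2')), ('num', '3'))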
2) Scanner Generators
 These automatically generate lexical analyzers, normally from a specification based on regular expressions.
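A minimal flavor of this (the token names and patterns below are invented; real tools such as Lex compile the specification into a finite automaton): the lexical analyzer is driven entirely by a table of regular expressions.

import re

SPEC = [                              # the "specification" supplied by the user
    ("NUMBER", r"\d+"),
    ("ID",     r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{name}>{pattern})" for name, pattern in SPEC))

def scan(source):
    for m in MASTER.finditer(source):
        if m.lastgroup != "SKIP":     # discard whitespace
            yield (m.lastgroup, m.group())

print(list(scan("rate = old + 60")))
# [('ID', 'rate'), ('OP', '='), ('ID', 'old'), ('OP', '+'), ('NUMBER', '60')]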
3) Syntax-Directed Translation Engines
 These produce collections of routines that walk the parse tree, generating intermediate code.
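For instance (illustrative only; the tree format matches the small parser sketch above), a routine that walks an expression parse tree and emits three-address intermediate code:

import itertools

temps = itertools.count(1)

def gen(node, code):
    if node[0] == "num":                  # leaf: a literal is its own address
        return node[1]
    op, left, right = node                # interior node: translate the children first
    a = gen(left, code)
    b = gen(right, code)
    result = f"t{next(temps)}"
    code.append(f"{result} = {a} {op} {b}")
    return result

code = []
gen(("+", ("num", "1"), ("*", ("num", "2"), ("num", "3"))), code)
print("\n".join(code))
# t1 = 2 * 3
# t2 = 1 + t1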
4) Automatic Code Generators
 Such a tool takes a collection of rules that define the translation of each operation of the intermediate language into the machine language for the target machine.
Automatic Code Generators (cont’d)
 The rules must include sufficient detail that we can handle the different possible access methods for data; e.g., variables may be in registers, in a fixed (static) location in memory, or allocated a position on a stack.
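A toy rule table in this spirit (every mnemonic and template below is made up; a real tool's rules would cover many more operations and addressing modes): each rule pairs an intermediate operation with an operand access method and gives a target-instruction template.

RULES = {
    ("load", "register"): "MOV {dst}, {src}",
    ("load", "static"):   "MOV {dst}, [{src}]",        # fixed location in memory
    ("load", "stack"):    "MOV {dst}, [SP+{src}]",     # position on a stack
    ("add",  "register"): "ADD {dst}, {src}",
}

def emit(op, access, **operands):
    return RULES[(op, access)].format(**operands)

print(emit("load", "static", dst="R1", src="rate"))    # MOV R1, [rate]
print(emit("load", "stack",  dst="R2", src="8"))       # MOV R2, [SP+8]
print(emit("add",  "register", dst="R1", src="R2"))    # ADD R1, R2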
5) Data Flow Engines
 Much of the information needed to perform good code optimization involves “data-flow analysis”, the gathering of information about how values are transmitted from one part of the program to each other part.
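A first taste of what such an engine computes (a deliberately simplified liveness analysis over straight-line three-address code; real data-flow engines iterate over a control-flow graph): for each instruction, determine which variables are still needed afterwards.

def liveness(instrs):
    # instrs: list of (defined_variable, set_of_used_variables), in program order.
    live, result = set(), []
    for dst, uses in reversed(instrs):     # information flows backwards
        result.append((dst, set(live)))    # variables live after this instruction
        live.discard(dst)                  # the definition kills dst
        live.update(uses)                  # the uses make the operands live
    return list(reversed(result))

code = [("t1", {"b", "c"}),   # t1 = b + c
        ("t2", {"t1", "d"}),  # t2 = t1 * d
        ("a",  {"t2"})]       # a  = t2
for dst, live_after in liveness(code):
    print(f"after defining {dst}: live = {sorted(live_after)}")
# after defining t1: live = ['d', 't1']
# after defining t2: live = ['t2']
# after defining a: live = []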
The Phases of a Compiler
[Diagram: the phases of a compiler]
Source Program → Lexical Analyzer → Syntax Analyzer → Semantic Analyzer → Intermediate Code Generator → Code Optimizer → Code Generator → Target Program
The Symbol Table Manager and the Error Handler interact with all six phases.
 A compiler operates in phases, each of which transforms the source program from one representation to another.
 In practice, some of the phases may be grouped together.
 The first three phases form the bulk of the analysis portion of a compiler.
 Two other activities, symbol table management and error handling, are shown interacting with the six phases of lexical analysis, syntax analysis, semantic analysis, intermediate code generation, code optimization, and code generation.
 Informally, we shall also call the symbol table manager and the error handler phases.
THANKS
