Analysis and Synthesis Phases of a Compiler

There are two main phases in the compiler.

1. Analysis - Front end of a compiler

2. Synthesis - Back end of a compiler

In this tutorial, we will learn the roles of the analysis and synthesis phases of a compiler.

Analysis phase of the compiler

The analysis phase reads the source program, splits it into tokens, and constructs an intermediate representation of the source program.

It also detects and reports the syntax and semantic errors in the source program.

It collects information about the source program and builds the symbol table, which is used throughout the compilation process.

This is also called the front end of the compiler.

Synthesis phase of the compiler

It takes the output of the analysis phase (the intermediate representation and the symbol table) and produces the target machine-level code.

This is also called the back end of the compiler.

Compiler Phases

A compiler can broadly be divided into two phases based on the way it compiles a program.

Analysis Phase

Known as the front end of the compiler, the analysis phase reads the source program, divides it into its core parts, and checks for lexical, grammatical, and syntactic errors. The analysis phase generates an intermediate representation of the source program and a symbol table, which are fed to the synthesis phase as input.

[Figure: analysis and synthesis phases of a compiler]

Synthesis Phase

Known as the back end of the compiler, the synthesis phase generates the target program from the intermediate representation and the symbol table.

A compiler can have many phases and passes.

Pass: A pass is one complete traversal of the compiler over the entire program.

Phase: A phase of a compiler is a distinguishable stage that takes input from the previous stage, processes it, and yields output that serves as input to the next stage. A pass can contain more than one phase.

Compiler Architecture

Compilers perform translation. Every non-trivial translation requires analysis and synthesis:

[Figure: translation as analysis followed by synthesis]

Both analysis and synthesis are made up of internal phases.

Compiler Components

These are the main functional components of a production compiler that aims to generate assembly or machine language (if you are just targeting a high-level language like C, or a virtual machine, you might not need so many phases):

[Figure: the phases of a production compiler]

You might also identify an error-recovery subsystem and a symbol-table manager.

Lexical Analysis (Scanning)

An example in C:
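
For reference, a C function consistent with the token stream shown next is:

    unsigned gcd(unsigned int x, unsigned y) {
        while (x > 0) {
            unsigned temp = x;
            x = y % x;
            y = temp;
        }
        return y;
    }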

gets tokenized into:

    unsigned ID(gcd) ( unsigned int ID(x) , unsigned ID(y) ) { while ( ID(x) > INTLIT(0) ) { unsigned ID(temp) = ID(x) ; ID(x) = ID(y) % ID(x) ; ID(y) = ID(temp) ; } return ID(y) ; }
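
Internally, each token is usually represented as a small record carrying its kind plus, for identifiers and literals, the lexeme or value. A minimal sketch in C (the type and field names here are illustrative, not from any particular scanner):

    enum TokenKind { TOK_ID, TOK_INTLIT, TOK_KEYWORD, TOK_PUNCT };

    struct Token {
        enum TokenKind kind;  /* which class of token this is             */
        const char *lexeme;   /* e.g. "gcd" or "x" for identifiers        */
        long value;           /* e.g. 0 for INTLIT(0); unused otherwise   */
    };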

Scanners are concerned with issues such as:

  • Case sensitivity (or insensitivity)
  • Whether or not blanks are significant
  • Whether or not newlines are significant
  • Whether comments can be nested

Errors that might occur during scanning, called lexical errors, include:

  • Encountering characters that are not in the language’s alphabet
  • Too many characters in a word or line (yes, such languages do exist!)
  • An unclosed character or string literal
  • An end of file within a comment

Syntax Analysis (Parsing)

The parser turns the token sequence into an abstract syntax tree . For the example above, we get this tree:

[Figure: abstract syntax tree for the gcd example]

The tree can also be stored as a string.

Technically, each node in the AST is stored as an object with named fields, many of whose values are themselves nodes in the tree. Note that at this stage in the compilation, the tree is definitely just a tree. There are no cycles.

[Figure: another view of the gcd abstract syntax tree]
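
As a rough illustration of a node "with named fields", the node built for the expression y % x might look something like this in C (the type and field names are made up for the example):

    enum NodeKind { NODE_BINARY, NODE_IDENTIFIER, NODE_INT_LITERAL };

    struct Node {
        enum NodeKind kind;   /* which variant of node this is              */
        const char *op;       /* "%" for the node built from y % x          */
        struct Node *left;    /* identifier node for y                      */
        struct Node *right;   /* identifier node for x                      */
        const char *name;     /* set when kind == NODE_IDENTIFIER           */
        long value;           /* set when kind == NODE_INT_LITERAL          */
    };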

When constructing a parser, one needs to be concerned with grammar complexity (such as whether the grammar is LL or LR), and whether there are any disambiguation rules that might have to be hacked in. Some languages actually require a bit of semantic analysis to parse.

Errors that might occur during parsing, called syntax errors, include things like the following, in C:

  • j = 4 * (6 − x;

Combining Scanning and Parsing

It is not necessary to actually separate scanning (lexical analysis / tokenization) from parsing (syntax analysis / tree generation). Systems based on PEGs, like Ohm, are actually scannerless: they perform parsing in a predictive fashion, with lexical and syntactic rules mixed together. (However, systems like Ohm will need a pre-parsing phase to handle indents and dedents.)

When using a scannerless system, the language designer and compiler writer still think in terms of tokens and phrases, but do not have to worry about complex rules like the so-called Maximal Munch principle. Lookahead captures any kind of tokenization scheme you need. Besides, the predictive nature of scannerless parsing means we never have to decide whether a * is a unary pointer-dereference operator token, a binary multiplication operator token, or just a star token. We always have context when parsing predictively.

Semantic Analysis

During semantic analysis we check legality rules and, while doing so, tie up the pieces of the syntax tree (by resolving identifier references, inserting cast operations for implicit coercions, etc.) to form a semantic graph.

Continuing the example above:

[Figure: semantic graph for the gcd example]

Obviously, the set of legality rules is different for each language. Examples of legality rules you might see in a Java-like language include:

  • Multiple declarations of a variable within a scope
  • Referencing a variable before its declaration
  • Referencing an identifier that has no declaration
  • Violating access (public, private, protected, ...) rules
  • Too many arguments in a method call
  • Not enough arguments in a method call
  • Type mismatches (there are tons of these)
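
As a rough sketch of how rules like "multiple declarations within a scope" and "referencing an identifier that has no declaration" can be enforced, the semantic analyzer typically consults a per-scope symbol table when it visits a declaration or a use. The structures and function names below are invented for illustration:

    #include <stdio.h>
    #include <string.h>

    struct Symbol {                /* one declared name                       */
        const char *name;
        struct Symbol *next;
    };

    struct Scope {                 /* one lexical scope, linked to its parent */
        struct Symbol *symbols;
        struct Scope *parent;
    };

    /* Rule: no multiple declarations of a name within one scope. */
    int declare(struct Scope *scope, struct Symbol *sym) {
        for (struct Symbol *s = scope->symbols; s; s = s->next)
            if (strcmp(s->name, sym->name) == 0) {
                fprintf(stderr, "error: '%s' already declared in this scope\n", sym->name);
                return 0;
            }
        sym->next = scope->symbols;
        scope->symbols = sym;
        return 1;
    }

    /* Rule: every referenced identifier must be declared in some enclosing scope. */
    struct Symbol *resolve(struct Scope *scope, const char *name) {
        for (; scope; scope = scope->parent)
            for (struct Symbol *s = scope->symbols; s; s = s->next)
                if (strcmp(s->name, name) == 0)
                    return s;
        fprintf(stderr, "error: '%s' is not declared\n", name);
        return NULL;
    }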

Intermediate Code Generation

The intermediate code generator produces a flow graph made up of tuples grouped into basic blocks. For the example above, we’d see:

[Figure: flow graph of basic blocks for the gcd example]
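
The exact tuples depend on the compiler, but a plausible three-address rendering of the gcd loop, grouped into basic blocks, would look roughly like this (the labels and temporary names are invented):

    B1:  t1 = x > 0
         if_false t1 goto B3
    B2:  temp = x
         t2 = y % x
         x = t2
         y = temp
         goto B1
    B3:  return y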

You can read more about intermediate representations elsewhere.

Machine Independent Code Improvement

Code improvement that is done on the semantic graph or on the intermediate code is called machine-independent code optimization. In practice there are zillions of known optimizations (er, improvements), but none really apply to our running example.

Code Generation

Code generation produces the actual target code, or something close. This is what I got when assembling with gcc 6.3 targeting the x86-64, without any optimizations:

Here is code for the ARM, using gcc 5.4, without optimizations:

And MIPS code with gcc 5.4, also unoptimized:

Machine Dependent Code Improvement

Usually the final phase in compilation is cleaning up and improving the target code. For the example above, I got this when setting the optimization level to -O3 :

Optimized ARM code:

Optimized MIPS code:

Analysis-Synthesis Model of Compilation


The analysis and synthesis phases of a compiler are:

Analysis Phase

Breaks the source program into constituent pieces and creates an intermediate representation.

The analysis part can be divided along the following phases:

1. Lexical Analysis

The program is treated as a sequence of characters. The lexical analyzer reads the program from left to right and groups the characters into tokens: lexical units with a collective meaning.

2. Syntax Analysis

Syntactic analysis is also called parsing. Tokens are grouped into grammatical phrases represented by a parse tree, which gives a hierarchical structure to the source program.

3. Semantic Analysis

The semantic analysis phase checks the program for semantic errors (type checking) and gathers type information for the subsequent phases. Type checking verifies the types of operands; for example, a real number cannot be used as an array index.
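
In a C-like language, the type checker would reject exactly that kind of construct:

    double d = 2.5;
    int a[10];
    a[d] = 1;   /* semantic error: the array index must have an integral type */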

Synthesis Phase

Generates the target program from the intermediate representation.

The synthesis part can be divided along the following phases:

[Figure: analysis and synthesis model of a compiler]

Intermediate Code Generator

An intermediate code is generated as a program for an abstract machine. The intermediate code should be easy to translate into the target program.

Code Optimizer

This phase attempts to improve the intermediate code so that faster-running machine code can be obtained. Different compilers adopt different optimization techniques.
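
One common machine-independent improvement is constant folding, usually combined with constant propagation: expressions whose operands are known at compile time are evaluated by the compiler. A sketch on three-address code (the temporaries are illustrative):

    ; before
    t1 = 60 * 60
    t2 = t1 * 24
    seconds = t2 * days

    ; after folding and propagating the constants
    seconds = 86400 * days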

Code Generator

This phase generates the target code, consisting of assembly code.

  • Memory locations are selected for each variable
  • Intermediate instructions are translated into a sequence of assembly instructions
  • Variables and intermediate results are assigned to registers


CS335 Compiler Design (2018-19 IInd Semester)


This course aims to teach various phases of Compiler Design.


Note that [ DragonBook ] refers to Compilers: Principles, Techniques, and Tools, Second edition, 2006. by Alfred V. Aho , Monica S. Lam , Ravi Sethi , Jeffrey D. Ullman. Almost all the content we cover in the class is also available in the older edition of the book [ OldDragonBook ], but the chapters/sections could be different.

There will be short assignments to give you a chance to apply the lecture material. Assignments will have some written and some programming tasks.

The course project gives you a chance to apply the concepts learnt in the class to build a prototype compiler. You will be required to implement various phases of a compiler, and perform an experimental evaluation of your implementation.

  • Project will be done in groups. The maximum (and preferable) size of a group will be three students.

The course will mainly cover topics from the following list (not necessarily in the same order).

  • Compiler structure : analysis-synthesis model of compilation, various phases of a compiler, tool based approach to compiler construction.
  • Lexical analysis : interface with input, parser and symbol table, token, lexeme and patterns. Difficulties in lexical analysis. Error reporting. Implementation. Regular definition, Transition diagrams, LEX.
  • Syntax analysis : CFGs, ambiguity, associativity, precedence, top down parsing, recursive descent parsing, transformation on the grammars, predictive parsing, bottom up parsing, operator precedence grammars, LR parsers (SLR, LALR, LR), YACC.
  • Syntax directed definitions : inherited and synthesized attributes, dependency graph, evaluation order, bottom up and top down evaluation of attributes, L- and S-attributed definitions.
  • Type checking : type system, type expressions, structural and name equivalence of types, type conversion, overloaded functions and operators, polymorphic functions.
  • Run time system : storage organization, activation tree, activation record, parameter passing, symbol table, dynamic storage allocation.
  • Intermediate code generation : intermediate representations, translation of declarations, assignments, control flow, boolean expressions and procedure calls. Implementation issues.
  • Code generation and instruction selection : issues, basic blocks and flow graphs, register allocation, code generation, dag representation of programs, code generation from dags, peep hole optimization, code generator generators, specifications of machine.


Most of the following software is available from the package repositories of Linux distributions. It is recommended to install from those repositories instead of building from source.

  • cscope and ctags : Code browsing utilities
  • Eclipse : A Java based Integrated Development Environment
  • LaTeX : A document preparation system

Reference books:

  • Aho, A., Lam, M., Sethi, R., Ullman, J., Compilers: Principles, Techniques, and Tools, Addison Wesley, 2007.
  • Appel, A., Modern Compiler Implementation in Java (or ML, or C), Cambridge University Press, 2002.
  • Cooper, K., Torczon, L., Engineering a Compiler, Morgan Kaufmann, 2004.
  • Muchnick, S., Advanced Compiler Design and Implementation, Morgan Kaufmann, 1997.
  • Allen, R., Kennedy, K., Optimizing Compilers for Modern Architectures: A Dependence-based Approach, Morgan Kaufmann, 2001.


Compiler Design: Analysis and Transformation

Helmut Seidl, Reinhard Wilhelm, Sebastian Hack. Springer, 2012.


Table of contents (3 chapters): Foundations and Intraprocedural Optimization; Interprocedural Optimization; Optimization of Functional Programs.

Keywords: functional programming, high-level programming, imperative programming, interpreters, logic programming, machine architectures, object-oriented programming, programming, virtual machines.

About this book

While compilers for high-level programming languages are large complex software systems, they have particular characteristics that differentiate them from other software systems. Their functionality is almost completely well-defined - ideally there exist complete precise descriptions of the source and target languages. Additional descriptions of the interfaces to the operating system, programming system and programming environment, and to other compilers and libraries are often available.

The book deals with the optimization phase of compilers. In this phase, programs are transformed in order to increase their efficiency. To preserve the semantics of the programs in these transformations, the compiler has to meet the associated applicability conditions. These are checked using static analysis of the programs. In this book the authors systematically describe the analysis and transformation of imperative and functional programs. In addition to a detailed description of important efficiency-improving transformations, the book offers a concise introduction to the necessary concepts and methods, namely to operational semantics, lattices, and fixed-point algorithms.

This book is intended for students of computer science. The book is supported throughout with examples, exercises and program fragments.

From the reviews:

“German academics … provide a concise, compact presentation on ‘methods to improve the efficiency of target programs by a compiler,’ i.e., a compiler’s optimizing phase. … The authors provide a wealth of information on analysis along with specific illustrations. … The authors walk through many of their examples with reference to various languages (such as Java). Since this book is aimed at students, it includes exercises at the end of each chapter. … Summing Up: Recommended. Upper-division undergraduates and graduate students.” (M. B. DuBois, Choice, Vol. 50 (10), June, 2013)

“The authors bring together many of the results from the last few decades in a coherent and detailed manner, and the result is an excellent resource for those wanting to understand some of the complex issues in building realistic, industrial-strength compilers. … The authors provide motivation and definitions for many of the concepts in static analysis, and illustrate these ideas through example programs that can be optimized.” (Sara Kalvala, Computing Reviews, April, 2013)

Authors and Affiliations

Helmut Seidl, Fakultät für Informatik, Technische Universität München, Garching, Germany

Reinhard Wilhelm, Compiler Research Group, Universität des Saarlandes, Saarbrücken, Germany

Sebastian Hack, Programming Group, Universität des Saarlandes, Saarbrücken, Germany

About the authors

The authors are among the established experts on compiler construction, with decades of related teaching experience. Prof. Dr. Reinhard Wilhelm is the head of the Compiler Design Lab of the Universität des Saarlandes, and his main research interests include compiler construction; Prof. Dr. Helmut Seidl heads the Institut für Informatik of the Technische Universität München, and his main research interests include automatic program analysis and the design and implementation of programming languages; Dr. Sebastian Hack is a Junior Professor in the Computer Science Programming Group of the Universität des Saarlandes, and his main research areas include compilers and code generation.

Bibliographic Information

Book Title : Compiler Design

Book Subtitle : Analysis and Transformation

Authors : Helmut Seidl, Reinhard Wilhelm, Sebastian Hack

DOI : https://doi.org/10.1007/978-3-642-17548-0

Publisher : Springer Berlin, Heidelberg

eBook Packages : Computer Science , Computer Science (R0)

Copyright Information : Springer-Verlag Berlin Heidelberg 2012

Hardcover ISBN : 978-3-642-17547-3 Published: 14 August 2012

Softcover ISBN : 978-3-662-50716-2 Published: 23 August 2016

eBook ISBN : 978-3-642-17548-0 Published: 13 August 2012

Edition Number : 1

Number of Pages : XII, 177

Topics : Programming Techniques , Programming Languages, Compilers, Interpreters


Introduction to Compiler Design

A compiler is software that converts a program written in a high-level language (the source language) into a low-level language (object/target/machine language, i.e., 0s and 1s).

A translator or language processor is a program that translates an input program written in a programming language into an equivalent program in another language. The compiler is a type of translator, which takes a program written in a high-level programming language as input and translates it into an equivalent program in low-level languages such as machine language or assembly language. 

The program written in a high-level language is known as a source program, and the program converted into a low-level language is known as an object (or target) program. Without compilation, no program written in a high-level language can be executed. For every programming language, we have a different compiler; however, the basic tasks performed by every compiler are the same. The process of translating the source code into machine code involves several stages, including lexical analysis, syntax analysis, semantic analysis, code generation, and optimization.

A compiler is a more intelligent program than an assembler: it verifies all kinds of limits, ranges, errors, and so on. However, it takes more time to run and occupies more memory, since it reads through the entire program and translates the full program at once. When a compiler runs on a machine and produces machine code for that same machine, it is called a self compiler or resident compiler. When a compiler runs on one machine and produces machine code for a different machine, it is called a cross compiler.

High-Level Programming Language

A high-level programming language is a language that abstracts away the attributes of the computer. High-level programming is more convenient for the user when writing a program.

Low-Level Programming Language

A low-level programming language is a language that provides little or no abstraction from the computer's hardware; the programmer works close to the machine's instructions rather than with high-level programming ideas and concepts.

Stages of Compiler Design

  • Lexical Analysis: The first stage of compiler design is lexical analysis , also known as scanning. In this stage, the compiler reads the source code character by character and breaks it down into a series of tokens, such as keywords, identifiers, and operators. These tokens are then passed on to the next stage of the compilation process.
  • Syntax Analysis: The second stage of compiler design is syntax analysis , also known as parsing. In this stage, the compiler checks the syntax of the source code to ensure that it conforms to the rules of the programming language. The compiler builds a parse tree, which is a hierarchical representation of the program’s structure, and uses it to check for syntax errors.
  • Semantic Analysis: The third stage of compiler design is semantic analysis . In this stage, the compiler checks the meaning of the source code to ensure that it makes sense. The compiler performs type checking, which ensures that variables are used correctly and that operations are performed on compatible data types. The compiler also checks for other semantic errors, such as undeclared variables and incorrect function calls.
  • Code Generation: The fourth stage of compiler design is code generation . In this stage, the compiler translates the parse tree into machine code that can be executed by the computer. The code generated by the compiler must be efficient and optimized for the target platform.
  • Optimization: The final stage of compiler design is optimization. In this stage, the compiler analyzes the generated code and makes optimizations to improve its performance. The compiler may perform optimizations such as constant folding, loop unrolling, and function inlining; a small before-and-after sketch of one of these follows this list.
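
As an illustration of one such optimization, loop unrolling replaces a short counted loop with straight-line code, removing the loop-control overhead. The snippet below is a hand-written illustration, not the output of any particular compiler:

    int a[4] = {1, 2, 3, 4};
    int sum = 0;

    /* before: the test and increment execute on every iteration */
    for (int i = 0; i < 4; i++)
        sum += a[i];

    /* after unrolling, the compiler may emit the equivalent of: */
    sum = 0;
    sum += a[0];
    sum += a[1];
    sum += a[2];
    sum += a[3];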

Overall, compiler design is a complex process that involves multiple stages and requires a deep understanding of both the programming language and the target platform. A well-designed compiler can greatly improve the efficiency and performance of software programs, making them more useful and valuable for users.

Compiler

  • Cross Compiler: a compiler that runs on a machine ‘A’ and produces code for another machine ‘B’. It is capable of creating code for a platform other than the one on which the compiler is running.
  • Source-to-source Compiler (also called a transcompiler or transpiler): a compiler that translates source code written in one programming language into the source code of another programming language.

Language Processing Systems

We know a computer is a logical assembly of software and hardware. The hardware understands a language that is hard for us to grasp; consequently, we tend to write programs in a high-level language that is much easier for us to comprehend and keep in our heads. These programs then go through a series of transformations so that they can be used by the machine. This is where language processing systems come in handy.

[Figure: from high-level language to machine code]

  • High-Level Language: A program that contains pre-processor directives such as #include or #define is written in a high-level language. Such languages are closer to humans but farther from the machine. The (#) tags are called pre-processor directives, and they tell the pre-processor what to do.
  • Pre-Processor: The pre-processor removes all #include directives by file inclusion and all #define directives by macro expansion. It performs file inclusion, augmentation, macro processing, etc. For example, if the source program contains #include "stdio.h", the pre-processor replaces that line with the contents of the file in its output (see the small example after this list).
  • Assembly Language: Neither binary nor high level; it is an intermediate form that combines machine instructions with other useful data needed for execution.
  • Assembler: For every platform (hardware + OS) there is an assembler, so assemblers are not universal. An assembler translates assembly language into machine code; its output is called an object file.
  • Interpreter: An interpreter converts a high-level language into low-level machine language, just like a compiler, but it reads the input differently. A compiler scans the entire program and translates it as a whole into machine code, whereas an interpreter translates and executes the program one statement at a time. Interpreted programs are therefore usually slower than compiled ones.
  • Linker: The basic job of a linker is to merge the object code produced by the compiler and assembler with standard library functions and operating-system resources into a single executable file.
  • Relocatable Machine Code: Code that can be loaded at any point in memory and run; the addresses within the program are arranged so that the code can be moved around.
  • Loader: The code generated by the compiler, assembler, and linker is generally relocatable, meaning its starting location is not fixed. The loader calculates the exact memory addresses, converts the relocatable code into absolute code, loads it into memory, and runs it, resulting in a running program or an error message (or sometimes both).
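
As a small illustration of the pre-processing step described above, consider the following fragment (the macro and values are made up for the example):

    /* input to the pre-processor */
    #include <stdio.h>
    #define SQUARE(x) ((x) * (x))

    int main(void) {
        printf("%d\n", SQUARE(5));   /* macro call */
        return 0;
    }

    /*
     * After pre-processing (for example with `gcc -E`), the #include line is
     * replaced by the full contents of stdio.h, and the macro call expands to:
     *
     *     printf("%d\n", ((5) * (5)));
     */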

Types of Compiler

There are mainly three types of compilers.

  • Single Pass Compilers
  • Two Pass Compilers
  • Multipass Compilers

Single Pass Compiler

When all the phases of the compiler are present inside a single module, it is simply called a single-pass compiler. It performs the work of converting source code to machine code.

Two Pass Compiler

A two-pass compiler processes the program in two passes: the front end translates the source into an intermediate form in the first pass, and the back end translates that intermediate form into the target code in the second pass.

Multipass Compiler

When several intermediate forms are created and the syntax tree is processed many times, the compiler is called a multi-pass compiler. It breaks the work of compilation into a sequence of smaller passes.

There are two major phases of compilation, which in turn have many parts. Each of them takes input from the output of the previous level and works in a coordinated way. 

Phases of Compiler

Analysis Phase

An intermediate representation is created from the given source code:

  • Lexical Analyzer
  • Syntax Analyzer
  • Semantic Analyzer
  • Intermediate Code Generator

The lexical analyzer divides the program into "tokens", the syntax analyzer recognizes "sentences" in the program using the syntax of the language, and the semantic analyzer checks the static semantics of each construct. The intermediate code generator then generates "abstract" code.

Synthesis Phase

An equivalent target program is created from the intermediate representation. It has two parts:

  • Code Optimizer
  • Code Generator

Code Optimizer optimizes the abstract code, and the final Code Generator translates abstract intermediate code into specific machine instructions. 
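
To see the two phases working together on something concrete, here is a sketch of what each stage might produce for the single statement x = a + b * 50. The token names, temporaries, tree layout, and target instructions are all illustrative, not the output of any particular compiler:

    source:             x = a + b * 50;

    tokens:             ID(x) = ID(a) + ID(b) * INTLIT(50) ;

    syntax tree:                =
                               / \
                              x   +
                                 / \
                                a   *
                                   / \
                                  b   50

    intermediate code:  t1 = b * 50
                        t2 = a + t1
                        x  = t2

    after optimization: t1 = b * 50      (the copy through t2 is removed)
                        x  = a + t1

    target code         LOAD  R1, b
    (hypothetical):     MUL   R1, #50
                        LOAD  R2, a
                        ADD   R2, R1
                        STORE R2, x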

Operations of Compiler

These are some operations that are done by the compiler.

  • It breaks source programs into smaller parts.
  • It enables the creation of symbol tables and intermediate representations.
  • It helps in code compilation and error detection.
  • It stores all the code and variables it encounters.
  • It analyses the full program and translates it.
  • It converts source code to machine code.

Advantages of Compiler Design

  • Efficiency: Compiled programs are generally more efficient than interpreted programs because the machine code produced by the compiler is optimized for the specific hardware platform on which it will run.
  • Portability: Once a program is compiled, the resulting machine code can be run on any computer or device that has the appropriate hardware and operating system, making it highly portable.
  • Error Checking: Compilers perform comprehensive error checking during the compilation process, which can help catch syntax, semantic, and logical errors in the code before it is run.
  • Optimizations: Compilers can make various optimizations to the generated machine code, such as eliminating redundant instructions or rearranging code for better performance.

Disadvantages of Compiler Design

  • Longer Development Time: Developing a compiler is a complex and time-consuming process that requires a deep understanding of both the programming language and the target hardware platform.
  • Debugging Difficulties: Debugging compiled code can be more difficult than debugging interpreted code because the generated machine code may not be easy to read or understand.
  • Lack of Interactivity: Compiled programs are typically less interactive than interpreted programs because they must be compiled before they can be run, which can slow down the development and testing process.
  • Platform-Specific Code: If the compiler is designed to generate machine code for a specific hardware platform, the resulting code may not be portable to other platforms.

GATE CS Corner Questions

Practicing the following questions will help you test your knowledge. All questions have been asked in GATE in previous years or GATE Mock Tests. It is highly recommended that you practice them. 

  • GATE CS 2011, Question 1
  • GATE CS 2011, Question 19
  • GATE CS 2009, Question 17
  • GATE CS 1998, Question 27
  • GATE CS 2008, Question 85
  • GATE CS 1997, Question 8
  • GATE CS 2014 (Set 3), Question 65
  • GATE CS 2015 (Set 2), Question 29

