Compiler Design Ch1

A compiler is a program that translates a program written in one language (the source language) into an equivalent program in another language (the target language). Compilers provide an essential interface between applications and architectures, and embody theoretical techniques. A compiler consists of analysis and synthesis phases. Analysis breaks down the source program into tokens, constructs a parse tree via syntax analysis, and performs type checking via semantic analysis. Synthesis constructs an output from the intermediate representation.


Chapter 1 Introduction

What is a compiler?

A program that reads a program written in one language (the source language) and translates it into an equivalent program in another language (the target language).

Why do we design compilers?

Why do we study compiler construction techniques?

 Compilers provide an essential interface between applications and architectures

 Compilers embody a wide range of theoretical techniques


Using a high-level language for programming has a large impact on how fast programs can be developed.

The main reasons for this are:

 Compared to machine language, the notation used by programming languages is closer to the way humans think
about problems.

 The compiler can spot some obvious programming mistakes.

 Programs written in a high-level language tend to be shorter than equivalent programs written in machine
language.

 The same program can be compiled to many different machine languages and, hence, be brought to run on
many different machines.

Since different platforms (hardware architectures together with operating systems such as Windows, Mac, or Unix) require different machine code, most programs must be compiled separately for each platform.

[Figure: one source program is translated by several compilers, one per target platform]

Programs related to compilers

1) Interpreter

Is a program that reads a source program and executes it

Works by analyzing and executing the source program commands one at a time

Does not translate the whole source program into object code

Interpretation is important when:

o Programmer is working in interactive mode and needs to view and update variables

o Running speed is not important

o Commands have simple formats, and thus can be quickly analyzed and executed

o Modification or addition to user programs is required as execution proceeds

Well-known examples of interpreters:

o Basic interpreter, Lisp interpreter, UNIX shell command interpreter, SQL interpreter, Java interpreter…

In principle, any programming language can be either interpreted or compiled:

o Some languages are designed to be interpreted, others are designed to be compiled

Interpreters involve large overheads:

o Execution speed degradation can vary from 10:1 to 100:1

o Substantial space overhead may be involved
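The analyze-one-command-at-a-time loop described above can be sketched as a toy in Python (used here only for brevity; the `interpret` function and its command format are invented for illustration, not taken from any real interpreter):

```python
# Toy interpreter: analyze and execute one command at a time,
# without ever translating the program into object code.
def interpret(lines, variables=None):
    variables = {} if variables is None else variables
    for line in lines:                    # one command at a time
        name, expr = line.split("=", 1)   # analyze the command...
        # ...then execute it immediately against the current variables
        variables[name.strip()] = eval(expr, {}, dict(variables))
    return variables

print(interpret(["x = 4 + 2", "y = x * 10"]))  # {'x': 6, 'y': 60}
```

Each command is re-analyzed every time it is reached, which is one source of the 10:1 to 100:1 slowdown noted above.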

E.g., Compiling Java Programs

The Java compiler produces bytecode, not machine code

Bytecode is converted into machine code using a Java Interpreter

You can run bytecode on any computer that has a Java Interpreter installed

[Figure: Java Program → Java compiler → bytecode → Java Interpreter]

2) Assemblers

Translator for the assembly language.

Assembly code is translated into machine code

Output is relocatable machine code.

3) Linker

o Links object files separately compiled or assembled

o Links object files to standard library functions

o Generates a file that can be loaded and executed

4) Loader

Loads the executable code produced by the linker into main memory.

5) Pre-processors

A pre-processor is a separate program that is called by the compiler before actual translation begins.

Such a pre-processor:

 Produces input to a compiler

 Deletes comments

 Performs macro processing (substitutions)

 Includes other files...


A compiler consists internally of a number of steps, or phases, that perform distinct logical operations.

The phases of a compiler are shown in the next slide, together with three auxiliary components that interact with some or all of the phases: the symbol table, the literal table, and the error handler.

There are two important parts in the compilation process: analysis and synthesis.

Analysis (front end)

o Breaks up the source program into constituent pieces and

o Creates an intermediate representation of the source program.

o During analysis, the operations implied by the source program are determined and recorded in a hierarchical structure called a tree.

Synthesis (back end)

o The synthesis part constructs the desired program from the intermediate representation.

Analysis of the source program

Analysis consists of three phases:

1. Linear/Lexical analysis

2. Hierarchical/Syntax analysis

3. Semantic analysis

1. Lexical analysis or Scanning

The stream of characters making up the source program is read from left to right and is grouped into tokens.

A token is a sequence of characters having a collective meaning.

A lexical analyzer, also called a lexer or a scanner, receives a stream of characters from the source program and groups
them into tokens.

Examples:

 Identifiers

 Keywords

 Symbols (+, -, …)

 Numbers …

Blanks, new lines, and tabulation marks are removed during lexical analysis.

Example

a[index] = 4 + 2;

a identifier

[ left bracket

index identifier

] right bracket

= assignment operator

4 number

+ plus operator

2 number

; semicolon

A scanner may perform other operations along with the recognition of tokens.

• It may enter identifiers into the symbol table, and

• It may enter literals into literal table
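The grouping of the character stream into tokens can be sketched as follows (a Python sketch for brevity; the token categories and the `tokenize` helper are invented for illustration, not the lex specification used later in the course):

```python
import re

# Token categories matching the example above (illustrative only).
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("SYMBOL", r"[\[\]=+\-;]"),
    ("SKIP",   r"[ \t\n]+"),   # blanks, new lines, tabs are discarded
]

def tokenize(source):
    """Read characters left to right and group them into (category, lexeme) tokens."""
    tokens, pos = [], 0
    while pos < len(source):
        for name, pattern in TOKEN_SPEC:
            m = re.match(pattern, source[pos:])
            if m:
                if name != "SKIP":
                    tokens.append((name, m.group()))
                pos += m.end()
                break
        else:
            raise SyntaxError(f"unexpected character {source[pos]!r}")
    return tokens

print(tokenize("a[index] = 4 + 2;"))  # the token sequence listed above
```

Real scanners are generated from such regular-expression specifications rather than written by hand, as the tools below show.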

Lexical Analysis Tools

There are tools available to assist in the writing of lexical analyzers.

a) lex - produces C source code (UNIX/Linux).

b) flex - produces C source code (GNU).

c) JLex - produces Java source code.

We will use Lex.

2. Syntax analysis or Parsing

The parser receives the source code in the form of tokens from the scanner and performs syntax analysis.

The results of syntax analysis are usually represented by a parse tree or a syntax tree.

Syntax tree: each interior node represents an operation, and the children of the node represent the arguments of the operation.

The syntactic structure of a programming language is defined by a context-free grammar (CFG).

Abstract syntax tree


Ex. Consider again the line of C code: a[index] = 4 + 2

Sometimes syntax trees are called abstract syntax trees, since they represent a further abstraction from parse trees.
Example is shown in the following figure.
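The syntax tree for the example can be sketched with a small node structure (a Python sketch; the `Node` class and its field names are invented for illustration):

```python
from dataclasses import dataclass

# Minimal syntax-tree node: an operation with child subtrees, or a leaf.
@dataclass
class Node:
    op: str
    children: tuple = ()

# a[index] = 4 + 2: interior nodes are operations, children are arguments.
tree = Node("=", (
    Node("[]", (Node("a"), Node("index"))),  # subscript expression
    Node("+",  (Node("4"), Node("2"))),      # right-hand side
))

def show(node, depth=0):
    """Print the tree with indentation reflecting its hierarchy."""
    print("  " * depth + node.op)
    for child in node.children:
        show(child, depth + 1)

show(tree)
```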

Syntax Analysis Tools

There are tools available to assist in the writing of parsers.

a) yacc - produces C source code (UNIX/Linux).

b) bison - produces C source code (GNU).

c) CUP - produces Java source code.

We will use yacc.

3. Semantic analysis

The semantics of a program are its meaning, as opposed to its syntax or structure.

The semantics consist of:

o Runtime semantics – behavior of program at runtime

o Static semantics – checked by the compiler

Static semantics include:

o Declarations of variables and constants before use

o Calling functions that exist (predefined in a library or defined by the user)

o Passing parameters properly

o Type checking.

The semantic analyzer does the following:

o Checks the static semantics of the language

o Annotates the syntax tree with type information
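A fragment of such a static checker might look like the following sketch (Python; the tuple-shaped nodes, the `check` helper, and the convention that an array of T has type "T[]" are all invented for illustration):

```python
# Minimal static-semantic checker for the example a[index] = 4 + 2.
# Nodes are tuples: ("num", n), ("var", name), ("+", l, r),
# ("index", array, subscript), ("assign", target, value).
def check(node, symbols):
    """Return the type of `node`, raising TypeError on a static-semantic error."""
    kind = node[0]
    if kind == "num":
        return "int"
    if kind == "var":
        if node[1] not in symbols:          # use before declaration
            raise TypeError(f"undeclared variable {node[1]!r}")
        return symbols[node[1]]
    if kind == "+":
        left, right = check(node[1], symbols), check(node[2], symbols)
        if left != right:
            raise TypeError("operands of + have different types")
        return left
    if kind == "index":
        if check(node[2], symbols) != "int":
            raise TypeError("subscript must be an integer")
        return check(node[1], symbols)[:-2]  # "int[]" -> "int"
    if kind == "assign":
        if check(node[1], symbols) != check(node[2], symbols):
            raise TypeError("assignment type mismatch")
        return check(node[1], symbols)

# Declarations recorded in a symbol table, then the statement itself.
symbols = {"a": "int[]", "index": "int"}
stmt = ("assign",
        ("index", ("var", "a"), ("var", "index")),
        ("+", ("num", 4), ("num", 2)))
print(check(stmt, symbols))  # int
```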

Ex. Consider again the line of C code: a[index] = 4 + 2

Synthesis of the target program

I. Intermediate code generator

II. Intermediate code optimizer

III. The target code generator

IV. The target code optimizer

Code Improvement

 Code improvement techniques can be applied to:

o Intermediate code – independent of the target machine

o Target code – dependent on the target machine

 Intermediate code improvement includes:

o Constant folding

o Elimination of common sub-expressions

o Improving loops

o Improving function calls

 Target code improvement includes:

o Allocation and use of registers

o Selection of better (faster) instructions and addressing modes

I. Intermediate code generator

Comes after syntax and semantic analysis

Separates the compiler front end from its back end

Intermediate representation should have 2 important properties:

o Should be easy to produce

o Should be easy to translate into the target program

Intermediate representation can have a variety of forms:

o Three-address code, P-code for an abstract machine, Tree or DAG representation

Intermediate code

Three address code for the original C expression a[index]=4+2 is:

t1 = 2

t2 = 4 + t1

a[index] = t2

II. IC optimizer

An IC optimizer reviews the code, looking for ways to reduce:

o the number of operations and

o the memory requirements.

A program may be optimized for speed or for size.

This phase changes the IC so that the code generator produces a faster and less memory-consuming program.

The optimized code does the same thing as the original (non-optimized) code but with less cost in terms of CPU time and
memory space.

There are several techniques of optimizing code and they will be discussed in the forthcoming chapters.

Ex. Loop-invariant code (i.e. code that computes the same value on every iteration and could be executed outside of the loop) is moved out of the loop:

for(i=1; i<10; i++){

x = y+1;

z = x+i; }

becomes

x = y+1;

for(i=1; i<10; i++)

z = x+i;

In our previous example, we have included an opportunity for source-level optimization; namely, the expression 4 + 2 can be precomputed by the compiler to the result 6 (this particular optimization is called constant folding).

This optimization can be performed directly on the syntax tree as shown below.

Many optimizations can be performed directly on the tree.

However, in a number of cases, it is easier to optimize a linearized form of the tree that is closer to assembly code.

A standard choice is Three-address code, so called because it contains the addresses of up to three locations in memory.

In our example, three address code for the original C expression might look like this:

o t1 = 2
o t2 = 4 + t1
o a[index] = t2

Now the optimizer would improve this code in two steps, first computing the result of the addition

o t = 4+2
o a[index] = t

And then replacing t by its value to get the three-address statement

o a[index] = 6
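The two folding steps above can be sketched as a single pass over the three-address code (a Python sketch; the tuple encoding of instructions and the `fold` helper are invented for illustration):

```python
# Constant propagation and folding over three-address code.
# An instruction (dest, op, a, b) means dest = a op b;
# op "copy" means dest = a. Temporaries are named t1, t2, ...
def fold(code):
    consts, out = {}, []
    for dest, op, a, b in code:
        a = consts.get(a, a)                 # propagate known constants
        b = consts.get(b, b)
        is_temp = isinstance(dest, str) and dest.startswith("t")
        if op == "copy" and isinstance(a, int) and is_temp:
            consts[dest] = a                 # t1 = 2, known at compile time
        elif op == "+" and isinstance(a, int) and isinstance(b, int) and is_temp:
            consts[dest] = a + b             # t2 = 4 + 2 -> 6
        else:
            out.append((dest, op, a, b))     # keep instructions with real effects
    return out

tac = [("t1", "copy", 2, None),              # t1 = 2
       ("t2", "+", 4, "t1"),                 # t2 = 4 + t1
       ("a[index]", "copy", "t2", None)]     # a[index] = t2
print(fold(tac))  # [('a[index]', 'copy', 6, None)]
```

The three original instructions collapse to the single store a[index] = 6, exactly as derived above.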

III. Code generator

The machine code generator receives the (optimized) intermediate code, and then it produces either:

o Machine code for a specific machine, or

o Assembly code for a specific machine and assembler.

Code generator

o Selects appropriate machine instructions

o Allocates memory locations for variables

o Allocates registers for intermediate computations

The code generator takes the IR code and generates code for the target machine.

Here we will write target code in assembly language: a[index]=6

MOV R0, index ;; value of index -> R0

MUL R0, 2 ;; double value in R0

MOV R1, &a ;; address of a ->R1

ADD R1, R0 ;; add R0 to R1

MOV *R1, 6 ;; constant 6 -> address in R1

&a - the address of a (the base address of the array)

*R1 - indirect register addressing (the last instruction stores the value 6 to the address contained in R1)
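The emission of this sequence from the single store a[index] = 6 can be sketched as a template (a Python sketch; the register names and mnemonics follow the example above, and `gen_indexed_store` with a two-byte element size is an invented helper, not a real code generator API):

```python
# Emit assembly for an indexed store: array[index_var] = value.
def gen_indexed_store(array, index_var, value, elem_size=2):
    return [
        f"MOV R0, {index_var}",   # value of index -> R0
        f"MUL R0, {elem_size}",   # scale index by the element size
        f"MOV R1, &{array}",      # base address of the array -> R1
        "ADD R1, R0",             # R1 now holds the element address
        f"MOV *R1, {value}",      # store value at the address in R1
    ]

for line in gen_indexed_store("a", "index", 6):
    print(line)
```

A real code generator would also choose instructions by cost and assign registers, as the bullets above note.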

IV. The target code optimizer

In this phase, the compiler attempts to improve the target code generated by the code generator.

Such improvement includes:

 Choosing addressing modes to improve performance

 Replacing slow instruction by faster ones

 Eliminating redundant or unnecessary operations

In the sample target code given, one improvement is to use a shift instruction to replace the multiplication in the second instruction.

Another is to use a more powerful addressing mode, such as indexed addressing to perform the array store.

With these two optimizations, our target code becomes:

MOV R0, index ;; value of index -> R0

SHL R0 ;; double value in R0

MOV &a[R0], 6 ;; constant 6 -> address a + R0

Grouping of phases

The discussion of phases deals with the logical organization of a compiler.

In practice most compilers are divided into:

Front end - language-specific and machine-independent.

Back end - machine-specific and language-independent.

Compiler passes:

A pass consists of reading an input file and writing an output file.

Several phases may be grouped in one pass.

For example, the front-end phases of lexical analysis, syntax analysis, semantic analysis, and intermediate code
generation might be grouped together into one pass.

Single pass

o is a compiler that passes through the source code of each compilation unit only once.

o A one-pass compiler does not "look back" at code it previously processed.

o One-pass compilers are faster than multi-pass compilers,

o but they are unable to generate programs that are as efficient, because of the limited scope of program information available.

Multi pass

o is a type of compiler that processes the source code or abstract syntax tree of a program several times.

o A collection of phases is done multiple times

Major Data and Structures in a Compiler

Token

o Represented by an integer value or an enumeration literal

o Sometimes it is necessary to preserve the string of characters that was scanned

o For example, the name of an identifier or the value of a literal

Syntax Tree

o Constructed as a pointer-based structure

o Dynamically allocated as parsing proceeds

o Nodes have fields containing information collected by the parser and semantic analyzer

Symbol Table

o Keeps information associated with all kinds of tokens:

 Identifiers, numbers, variables, functions, parameters, types, fields, etc.

o Tokens are entered by the scanner and parser

o Semantic analyzer adds type information and other attributes

o Code generation and optimization phases use the information in the symbol table

Performance Issues

o Insertion, deletion, and search operations need to be efficient because they are frequent

o Hash table with constant-time operations is usually the preferred choice

o More than one symbol table may be used
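The use of more than one symbol table can be sketched as a chain of hash tables, one per scope (Python dicts are hash tables with constant-time operations; the `SymbolTable` class and its method names are invented for illustration):

```python
# Scoped symbol table: each scope is a hash table chained to its
# enclosing scope, so lookup walks outward until the name is found.
class SymbolTable:
    def __init__(self, parent=None):
        self.entries = {}        # name -> attributes (a hash table)
        self.parent = parent     # enclosing scope, if any

    def insert(self, name, **attrs):
        self.entries[name] = attrs

    def lookup(self, name):
        table = self
        while table is not None:
            if name in table.entries:
                return table.entries[name]
            table = table.parent
        return None              # use of an undeclared name

globals_ = SymbolTable()
globals_.insert("a", type="int[]")
locals_ = SymbolTable(parent=globals_)
locals_.insert("index", type="int")
print(locals_.lookup("a"))  # {'type': 'int[]'}, found in the enclosing scope
```

Insertion and lookup here are constant time per scope, matching the performance requirement above.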

Literal Table

o Stores constant values and string literals in a program.

o One literal table applies globally to the entire program.

o Used by the code generator to:

 Assign addresses for literals.

o Avoids the replication of constants and strings.

o Quick insertion and lookup are essential.

Compiler construction tools

Various tools are used in the construction of the various parts of a compiler.

Scanner generators

o Ex. Lex, flex, JLex

o These tools generate a scanner (lexical analyzer) from a regular-expression description of the tokens.

Parser Generators

o Ex. Yacc, Bison, CUP

o These tools produce a parser (syntax analyzer) from a Context Free Grammar (CFG) that describes the syntax of the source language.

Syntax directed translation engines

o Ex. Cornell Synthesizer Generator

o It produces a collection of routines that walk the parse tree and execute some tasks.

Automatic code generators

o Take a collection of rules that define the translation of the IC to target code and produce a code generator.

