CONTEXT FREE GRAMMAR
3.1 INTRODUCTION
Grammar: A Grammar is a set of rules which checks whether a string belongs to a particular language or not.
A program consists of various strings of characters. But every string is not a proper or meaningful string. So, to identify valid strings in a language, some rules should be specified to check whether a string is valid or not. These rules are nothing but what actually make a Grammar.
E.g. In the English language, Grammar checks whether a string of characters is acceptable or not, i.e., checks whether nouns, verbs, adverbs, etc. are in proper sequence.

String → English Grammar (Rules) → Acceptable / Not Acceptable

Fig. 3.1: Grammar
Context Free Grammar: It is a notation used to specify the syntax of a language. Context Free Grammars are used to design parsers.
The lexical analyzer generates a string of tokens which is given to the parser to construct the parse tree. But, before constructing the parse tree, these tokens are grouped so that the result of grouping is a valid construct of the language.
So, to specify the constructs of a language, a suitable notation is used which is precise & easy to understand. This notation is Context Free Grammar.
Formally, a Context Free Grammar (G) can be defined as a 4-tuple (V, Σ, P, S) where:
1. V is a set of Non-Terminals or Variables
2. Σ is a set of terminals
3. P is a set of Productions or set of rules
4. S is the starting symbol.
G is Context Free if every Production (P) is of the form A → α, where A ∈ V and α ∈ (V ∪ Σ)*.
Example 1: Write down the Grammar for the language L = {aⁿ | n ≥ 1}.
Ans. Let G = (V, Σ, P, S)
where V = {S}
Σ = {a}
P = {S → aS, S → a}
These productions generate the language aⁿ, i.e.
S ⇒ a
S ⇒ aS ⇒ aa
S ⇒ aS ⇒ aaS ⇒ aaa
and so on.
In a Context Free Grammar, Variables or Non-terminals are written on the left side of → (arrow). These symbols are expanded until the required string is generated.
The terminal symbols are basically the tokens used in a language.
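The productions of Example 1 can be checked mechanically. The following Python sketch (our own illustration; the name `derive_all` is not from the text) expands the non-terminal S of every sentential form and collects the terminal strings up to a length bound:

```python
# Enumerate the strings generated by S -> aS | a, up to a length bound.
def derive_all(max_len):
    results = set()
    frontier = ["S"]            # sentential forms still to expand
    while frontier:
        form = frontier.pop()
        if "S" not in form:     # all terminals: a candidate string
            if len(form) <= max_len:
                results.add(form)
            continue
        if len(form) > max_len: # cannot shrink again; prune
            continue
        i = form.index("S")
        for rhs in ("aS", "a"): # the two productions of S
            frontier.append(form[:i] + rhs + form[i + 1:])
    return sorted(results)
```

`derive_all(3)` returns `['a', 'aa', 'aaa']`, confirming that the productions generate aⁿ for n ≥ 1 up to that bound.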
Example 2: Find out the language generated by the Grammar
G = ({S}, {a, b}, {S → aSb, S → ab}, S).
Ans. S ⇒ aSb
⇒ aaSbb
⇒ aaaSbbb
⇒ aaaabbbb
∴ Language L = {aⁿbⁿ | n ≥ 1}
$= (0, 1,2, 3 ensen +4,
a Pe S> an
;
> + I>
> <
i> > 0111213. eae 19
Example 3: For the Grammar S → aSa | bSb | c, the derivation for the string abcba is given by:
S ⇒ aSa
⇒ abSba
⇒ abcba
Example 4: Write a Grammar G for the language which generates palindromes of binary numbers.
Ans. S → 0S0 | 1S1 | 0 | 1 | ε
E.g., to generate the string 0110, the derivation is:
S ⇒ 0S0
⇒ 01S10
⇒ 0110 (using S → ε)
3.2 DERIVATIONS
Derivation means replacing a given string's non-terminal by the right-hand side of a production rule. The sequence of applications of rules that produces the final string of terminals from the starting symbol is called a derivation.
We can derive terminal strings, beginning with the start symbol, by repeatedly replacing a variable by the body of some production. The language of a CFG is the set of terminal strings we can derive in this way. This language is called a Context Free Language.
Derivations are denoted by ⇒.
Types of Derivation:
(a) Leftmost Derivation: A derivation A ⇒* w is called a leftmost derivation if we apply a production only to the leftmost variable at every step. Here, * means 0, 1, 2, ... number of derivation steps.
(b) Rightmost Derivation: A derivation A ⇒* w is a rightmost derivation if we apply a production to the rightmost variable at every step. It is also called a Canonical Derivation.
Example 6: Let G be a CFG with productions
S → AA
A → aB
B → b
Find the (1) Leftmost (2) Rightmost derivation for the string abab.
Ans. (1) Leftmost Derivation:
S ⇒ AA
⇒ aBA
⇒ abA
⇒ abaB
⇒ abab
(2) Rightmost Derivation:
S ⇒ AA
⇒ AaB
⇒ Aab
⇒ aBab
⇒ abab
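The two derivation orders differ only in which variable is expanded at each step. A small Python sketch (our own illustration; each non-terminal of Example 6 has a single production, so no choice of rule arises) makes the difference concrete:

```python
# Grammar of Example 6: S -> AA, A -> aB, B -> b.
GRAMMAR = {"S": ["AA"], "A": ["aB"], "B": ["b"]}

def derive(start, mode="left"):
    """Return the sequence of sentential forms, expanding the
    leftmost ("left") or rightmost ("right") variable each step."""
    form = start
    steps = [form]
    while any(c in GRAMMAR for c in form):
        positions = [i for i, c in enumerate(form) if c in GRAMMAR]
        i = positions[0] if mode == "left" else positions[-1]
        form = form[:i] + GRAMMAR[form[i]][0] + form[i + 1:]
        steps.append(form)
    return steps
```

`derive("S", "left")` reproduces S ⇒ AA ⇒ aBA ⇒ abA ⇒ abaB ⇒ abab, while `derive("S", "right")` reproduces the rightmost sequence; both end in the same string abab.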
3.3 PARSE TREES (DERIVATION TREES)
A Parse Tree for a CFG G = (V, Σ, P, S) is a tree satisfying the following conditions:
1. The root has label S.
2. Every vertex has a label which can be a variable (V), a terminal (Σ) or ε.
3. If A → C₁C₂...Cₙ is a production, then C₁, C₂, ..., Cₙ are the children of the node labeled A.
4. Leaf nodes are terminals (Σ) & interior nodes are variables (V).
Yield: The yield of a Derivation Tree is the concatenation of the labels of the leaves in left-to-right ordering.
Example 7: If a CFG has productions
S → aAS | a
A → SbA | SS | ba
Show that S ⇒* aabbaa & construct a parse tree whose yield is aabbaa.
Ans. S ⇒ aAS
⇒ aSbAS
⇒ aabAS
⇒ aabbaS
⇒ aabbaa
∴ S ⇒* aabbaa
Derivation Tree: [figure]
Yield = left-to-right ordering of leaves = aabbaa
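The yield can be computed by a simple left-to-right walk over the leaves. A Python sketch (our own encoding: a node is a `(label, children)` pair, with an empty child list marking a leaf) using the tree of Example 7:

```python
# Yield of a derivation tree: concatenate leaf labels left to right.
def yield_of(tree):
    label, children = tree
    if not children:                       # a leaf
        return label
    return "".join(yield_of(c) for c in children)

# Parse tree of Example 7 for "aabbaa":
#   S -> aAS, A -> SbA, S -> a, A -> ba, S -> a
tree = ("S", [("a", []),
              ("A", [("S", [("a", [])]),
                     ("b", []),
                     ("A", [("b", []), ("a", [])])]),
              ("S", [("a", [])])])
```

`yield_of(tree)` returns "aabbaa", the yield stated above.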
Example 8: Consider the CFG
S → bB | aA
A → b | bS | aAA
B → a | aS | bBB
Find the (a) Leftmost (b) Rightmost derivation for the string bbaababa. Also find the derivation trees.
ISHAN'S Compiler Design
Ans. (a) Leftmost Derivation:
S ⇒ bB
⇒ bbBB
⇒ bbaB
⇒ bbaaS
⇒ bbaabB
⇒ bbaabaS
⇒ bbaababB
⇒ bbaababa
Derivation Tree for Leftmost Derivation: [figure]
(b) Rightmost Derivation:
S ⇒ bB
⇒ bbBB
⇒ bbBaS
⇒ bbBabB
⇒ bbBabaS
⇒ bbBababB
⇒ bbBababa
⇒ bbaababa
Derivation Tree for Rightmost Derivation: [figure]
Example 9: Consider the grammar
S → (T) | a
T → T,S | S
Find the leftmost and rightmost derivation for:
(i) (a, (a, a))
(ii) ((a, a), a, (a), a)
Ans. (i) (a, (a, a))
Leftmost Derivation:
S ⇒ (T)
⇒ (T,S)
⇒ (S,S)
⇒ (a,S)
⇒ (a,(T))
⇒ (a,(T,S))
⇒ (a,(S,S))
⇒ (a,(a,S))
⇒ (a,(a,a))
Rightmost Derivation:
S ⇒ (T)
⇒ (T,S)
⇒ (T,(T))
⇒ (T,(T,S))
⇒ (T,(T,a))
⇒ (T,(S,a))
⇒ (T,(a,a))
⇒ (S,(a,a))
⇒ (a,(a,a))
(ii) ((a, a), a, (a), a)
Leftmost Derivation:
S ⇒ (T)
⇒ (T,S)
⇒ (T,S,S)
⇒ (S,S,S,S)
⇒ ((T),S,S,S)
⇒ ((T,S),S,S,S)
⇒ ((S,S),S,S,S)
⇒ ((a,S),S,S,S)
⇒ ((a,a),S,S,S)
⇒ ((a,a),a,S,S)
⇒ ((a,a),a,(T),S)
⇒ ((a,a),a,(S),S)
⇒ ((a,a),a,(a),S)
⇒ ((a,a),a,(a),a)
Rightmost Derivation:
S ⇒ (T)
⇒ (T,S)
⇒ (T,a)
⇒ (T,S,a)
⇒ (T,(T),a)
⇒ (T,(S),a)
⇒ (T,(a),a)
⇒ (T,S,(a),a)
⇒ (T,a,(a),a)
⇒ (S,a,(a),a)
⇒ ((T),a,(a),a)
⇒ ((T,S),a,(a),a)
⇒ ((T,a),a,(a),a)
⇒ ((S,a),a,(a),a)
⇒ ((a,a),a,(a),a)
Example 10: Consider the Grammar given below:
E → E+E | E*E | id
Find the (1) Leftmost (2) Rightmost derivation for the string id + id + id.
Ans. (1) Leftmost Derivation:
E ⇒ E+E
⇒ E+E+E
⇒ id+E+E
⇒ id+id+E
⇒ id+id+id
(2) Rightmost Derivation:
E ⇒ E+E
⇒ E+E+E
⇒ E+E+id
⇒ E+id+id
⇒ id+id+id
3.4 AMBIGUITY
A Context Free Grammar G is said to be ambiguous if there exists more than one derivation tree for a given string.
or
A Grammar that produces more than one Leftmost Derivation (or Rightmost Derivation) for the same sentence is called an ambiguous Grammar.
E.g. Consider the following sentence in English:
"In Books selected information is given"
It gives two different meanings as:
→ In Books, "selected information is given"
→ In Books selected, "information is given"
So, the above sentence can be said to be an ambiguous one in the English language.
Example 11: Verify whether the following Grammar is ambiguous or not.
E → E+E | E*E | id
Ans. For the string id + id * id, there exist two parse trees.
(1) E ⇒ E+E
⇒ id+E
⇒ id+E*E
⇒ id+id*E
⇒ id+id*id
(2) E ⇒ E*E
⇒ E+E*E
⇒ id+E*E
⇒ id+id*E
⇒ id+id*id
So, the same string is generated using two different leftmost derivations, each having a different parse tree.
∴ Two different parse trees exist for the string id + id * id.
∴ The Grammar is ambiguous.
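Ambiguity of this grammar can also be confirmed by brute force: counting the distinct leftmost derivations of the string. A Python sketch (our own; it prunes sentential forms that are already too long, since every E expands to at least "id"):

```python
# Count distinct leftmost derivations of `target` under E -> E+E | E*E | id.
# Two or more derivations means the grammar is ambiguous for that string.
PRODS = ["E+E", "E*E", "id"]

def count_leftmost(target):
    def go(form):
        if "E" not in form:
            return 1 if form == target else 0
        # Every E becomes at least "id" (2 chars), so prune long forms.
        if len(form) + form.count("E") > len(target):
            return 0
        i = form.index("E")                  # leftmost non-terminal
        return sum(go(form[:i] + p + form[i + 1:]) for p in PRODS)
    return go("E")
```

`count_leftmost("id+id*id")` returns 2, matching the two leftmost derivations shown above.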
Conversion of Ambiguous Grammar to Unambiguous Grammar
A production is ambiguous if it contains more than one occurrence of a given non-terminal on its right-hand side; e.g., E → E+E contains two E's on the R.H.S.
We can convert these types of ambiguous productions to unambiguous productions by introducing a new non-terminal and moving the rightmost of these non-terminals farther down the parse tree.
Example 12: Consider the Grammar
E → E+E | E*E | (E) | id
Convert this ambiguous Grammar into an unambiguous Grammar.
Ans. Step 1: Introduce a non-terminal T (term) which cannot be a further addition of two terms:
E → E+T | T
Step 2: Similarly, introduce a non-terminal F (factor) to remove the ambiguity of *:
T → T*F | F
F → (E) | id
The unambiguous Grammar is:
E → E+T | T
T → T*F | F
F → (E) | id

PARSING TECHNIQUES
4.1 PARSERS
Syntax Analysis or Parsing is the second phase of compilation. The Parser takes as its input the tokens generated from the previous phase, i.e., the Lexical Analyzer phase, and groups them in such a way that their syntax can be recognized.
It takes tokens as input & converts them into a Parse Tree.

Tokens → Syntax Analysis (Parser) → Parse Tree

Fig. 4.1: Parser
E.g. The statement a = b + c will be converted to the sequence of tokens a, b, c, =, + by the Lexical Analyzer, and these tokens will be converted to a Parse Tree by the Parser.
[figure]
Fig. 4.2: Conversion to Parse Tree
4.1.1 Types of Parser
There are 2 types of Parser:
1. Bottom-up Parser: It generates the parse tree from leaves to root for a given input string. The input string will be reduced to the starting symbol of the Grammar.
E.g. Shift Reduce Parser, Operator Precedence Parser, SLR, CLR & LALR parsers.
2. Top-Down Parser: It generates the Parse Tree from root to leaves.
E.g. Predictive Parsers, Recursive Descent Parser.
4.1.2 Parsing Techniques
The following diagram shows the various types of Parsing Techniques or Parsers.
[figure]
Fig. 4.3: Types of Parser
4.2 SHIFT REDUCE PARSING
A Shift Reduce Parser generates the Parse Tree from leaves to root, i.e., it is a bottom-up parser. In Shift Reduce Parsing, the input string will be reduced to the starting symbol. This reduction can be achieved by applying the rightmost derivation (from the starting symbol to the input string) in reverse.
A Shift Reduce Parser requires 2 data structures:
(a) Input Buffer
(b) Stack
Example 1: Perform Bottom-up Parsing for the input string abbcde on the following Grammar:
S → aABe
A → Abc | b
B → d
Ans. Rightmost derivation:
S ⇒ aABe
⇒ aAde
⇒ aAbcde
⇒ abbcde
We can reduce the string abbcde to the starting symbol S by applying this rightmost derivation in reverse at each step:
abbcde ⇒ aAbcde ⇒ aAde ⇒ aABe ⇒ S (Reduction)
Handle: Each replacement of the right side of a production by the left side in this process is called a "handle".
Example 2: Consider the Grammar
E → E+E
E → E*E
E → (E)
E → id
Perform a Rightmost Derivation to generate the string id₁ + id₂ * id₃ & find the handles at each step.
Ans.
E ⇒ E+E
⇒ E+E*E
⇒ E+E*id₃
⇒ E+id₂*id₃
⇒ id₁+id₂*id₃
Handles at each step:
id₁ + id₂ * id₃   (handle: id₁)
E + id₂ * id₃   (handle: id₂)
E + E * id₃   (handle: id₃)
E + E * E   (handle: E*E)
E + E   (handle: E+E)
E
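The reduction sequence above can be found mechanically. The sketch below (our own; a real shift-reduce parser consults a parsing table rather than searching) tries every legal reduction and then a shift, backtracking on failure, using the grammar of Example 1:

```python
# Exhaustive shift-reduce sketch for S -> aABe, A -> Abc | b, B -> d.
PRODS = [("S", "aABe"), ("A", "Abc"), ("A", "b"), ("B", "d")]

def parse(tokens):
    """Return the list of shift/reduce moves accepting `tokens`, or None."""
    def go(stack, rest, moves):
        if stack == "S" and not rest:
            return moves                       # reduced to the start symbol
        for lhs, rhs in PRODS:                 # try every reduce (handle)
            if stack.endswith(rhs):
                r = go(stack[:-len(rhs)] + lhs, rest,
                       moves + [f"reduce {lhs}->{rhs}"])
                if r is not None:
                    return r
        if rest:                               # otherwise shift a token
            return go(stack + rest[0], rest[1:], moves + [f"shift {rest[0]}"])
        return None
    return go("", tokens, [])
```

For the input abbcde of Example 1, the final move is `reduce S->aABe`, matching the reduction shown there.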
Fig. 4.6: Parse Tree for Precedence Relations
Example 9: Find the FIRST & LAST terminals of each non-terminal in the following grammar:
E → E+T | T
T → T*F | F
F → (E) | id
Ans. On seeing the productions, we can judge that:
+ is the first terminal of E
* is the first terminal of T
(, id are the first terminals of F
But E → T → F.
∴ The first terminals of F are contained in the first terminals of T, and the first terminals of T are contained in the first terminals of E.
FIRST (F) = {(, id}
FIRST (T) = {*} ∪ FIRST (F) = {*, (, id}
FIRST (E) = {+} ∪ FIRST (T) = {+, *, (, id}
Similarly, the Last terminals can be found.

Non-Terminal | FIRST Terminal | LAST Terminal
F | (, id | ), id
T | *, (, id | *, ), id
E | +, *, (, id | +, *, ), id
Example 10: Find out all precedence relations between the various operators & symbols in the following Grammar & show them using a precedence table.
E → E+T | T
T → T*F | F
F → (E) | id

TOP DOWN PARSING
5.1 TOP DOWN PARSING
In top down parsing, the parse tree is generated from top to bottom, i.e., from root to leaves, and is expanded till all leaves are generated.
It constructs the parse tree containing the root as the starting symbol of the Grammar. It starts derivation from the start symbol of the Grammar & performs a leftmost derivation at each step.
5.1.1 Types of Top Down Parsing
1. Top Down Parsing with Backtracking: In Backtracking, the parser can make repeated scans of the input. If the required input string is not achieved by applying one production rule, then another production rule can be applied at each step to get it.
2. Top Down Parsing without Backtracking: Once a production rule is applied, it cannot be undone.
The parsers which follow Top Down Parsing without Backtracking are given below:
1. Recursive Descent Parser 2. Predictive Parser 3. LL(1) Parser
[Diagram: Top Down Parsing → With Backtracking / Without Backtracking (Recursive Descent Parser, Predictive Parser)]
Fig. 5.1: Types of Top Down Parsing
5.2 TOP DOWN PARSING WITH BACKTRACKING
In Top Down Parsing with Backtracking, the Parser will try different rules or productions to find the match for the input string by backtracking at each step of derivation. So, if the applied production does not give the input string as required, or does not match the required string, then we can undo that move.
Example 1: Consider the Grammar
S → aAd
A → bc | b
Make the parse tree for the string abd. Also show the parse tree when Backtracking is required because a wrong alternative is chosen.
Ans. The derivation for the string abd will be:
S ⇒ aAd ⇒ abd (Required String)
If bc is substituted in place of the non-terminal A, then the string obtained will be abcd, i.e.
S ⇒ aAd ⇒ abcd (Wrong Alternative)
Figure 5.2 (a) represents S → aAd.
Figure 5.2 (b) represents expansion of the tree with production A → bc, which gives abcd. This does not match the string abd, so the parser backtracks & chooses the other alternative, i.e., A → b in figure 5.2 (c), which matches abd.
Fig. 5.2: Backtracking
5.2.1 Algorithm for Backtracking
Example 2: Write an Algorithm for Top Down Parsing with Backtracking for the following Grammar:
S → aAd
A → bc | b
Ans. There is a procedure for each non-terminal S & A. The terminal symbols a, b, c, d of the Grammar are the input symbols or look-ahead symbols to be read by the Parser.
advance(): It is a function used to move the input pointer to the next input symbol.
Procedure S()
{
  If input symbol = 'a' then
    advance();
  If A() then                  // S → aAd
    If input symbol = 'd' then
      advance();
      declare success;
  Else
    error;
}
Procedure A()
{
  isave = inputpointer;
  If input symbol = 'b' then
    advance();
  If input symbol = 'c' then   // A → bc
    advance();
    declare success;
  inputpointer = isave;        // backtrack
  If input symbol = 'b' then   // A → b
    advance();
    declare success;
  else error;
}
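The same procedures can be written as a runnable sketch. Here `parse_A` is a generator that yields every position at which A can finish, which plays the role of saving `isave` and retrying the next alternative (the function names are our own):

```python
# Backtracking recognizer for S -> aAd, A -> bc | b.
def parse_A(s, i):
    """Yield every index just past a match of A starting at position i."""
    if s[i:i + 2] == "bc":     # alternative A -> bc
        yield i + 2
    if s[i:i + 1] == "b":      # alternative A -> b (tried on backtracking)
        yield i + 1

def parse_S(s):
    """True if the whole string s matches S -> aAd."""
    if s[:1] != "a":
        return False
    return any(s[j:] == "d" for j in parse_A(s, 1))
```

For the input abd, the alternative A → bc fails and the generator falls back to A → b, exactly as in Fig. 5.2; `parse_S("abd")` and `parse_S("abcd")` both return True.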
5.2.2 Limitations of Top Down Parsing with Backtracking
Following are some problems which occur in Top Down Parsing with Backtracking:
1. Backtracking: Backtracking looks very simple & easy to implement, but choosing a different alternative causes a lot of problems, which are given below:
(a) Undoing semantic actions requires a lot of overhead.
(b) Entries made in the symbol table during parsing have to be removed while backtracking.
Due to these reasons, backtracking is not used for practical compilers.
2. Left Recursion: A grammar is said to be left recursive if it has a production of the form
A → Aα | β
which causes the parser to enter into an infinite loop.
3. Order of trying alternatives: The order in which the alternatives are tried can affect the language accepted.
4. Difficult to locate errors: When a failure occurs, it is difficult to know where the error actually occurred.
5.2.3 Elimination of Left Recursion
Left Recursion can be eliminated by introducing a new non-terminal A' such that:
A → Aα | β   is changed to   A → βA', A' → αA' | ε
Fig. 5.3: Removal of Left Recursion
This type of recursion is also called Immediate Left Recursion.
In a Left Recursive Grammar, expansion of A will generate Aα, Aαα, Aααα, ... at each step, causing it to enter into an infinite loop.
Example 3: Consider the left recursive Grammar
E → E+T | T
T → T*F | F
F → (E) | id
Eliminate the left recursion.
Ans. Comparing E → E+T | T with A → Aα | β:
A = E, α = +T, β = T
A → Aα | β is changed to A → βA', A' → αA' | ε
A → βA' means E → TE'   ...(1)
A' → αA' | ε means E' → +TE' | ε   ...(2)
Comparing T → T*F | F with A → Aα | β:
A = T, α = *F, β = F
A → βA' means T → FT'   ...(3)
A' → αA' | ε means T' → *FT' | ε   ...(4)
The production F → (E) | id does not have any left recursion.   ...(5)
Combining productions 1, 2, 3, 4, 5 we get:
E → TE'
E' → +TE' | ε
T → FT'
T' → *FT' | ε
F → (E) | id
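The comparison step above can be automated for immediate left recursion. A Python sketch (our own; it assumes single-character grammar symbols, with ε written as the empty string):

```python
# A -> A a1 | ... | b1 | ...  becomes  A -> b1 A' | ..., A' -> a1 A' | ... | eps
def remove_immediate_left_recursion(nt, prods):
    """nt: non-terminal name; prods: list of RHS strings. Returns the
    new productions as a dict; "" stands for epsilon."""
    rec = [p[len(nt):] for p in prods if p.startswith(nt)]   # the alphas
    non = [p for p in prods if not p.startswith(nt)]         # the betas
    if not rec:
        return {nt: prods}          # nothing to do
    new = nt + "'"
    return {nt: [b + new for b in non],
            new: [a + new for a in rec] + [""]}
```

Applied to E → E+T | T this yields E → TE' and E' → +TE' | ε, the productions (1) and (2) above.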
Example 4: Eliminate the left recursion for the following grammar:
S → (T) | a
T → T,S | S
Ans. We have immediate left recursion in the T-productions.
Comparing T → T,S | S with A → Aα | β, where A = T, α = ,S and β = S:
Removal of left recursion gives A → βA', A' → αA' | ε, i.e.
T → ST'
T' → ,ST' | ε
∴ The complete Grammar will be:
S → (T) | a
T → ST'
T' → ,ST' | ε
5.2.3.1 Non-Immediate Left Recursion
In the following grammar, we don't have immediate or direct left recursion. But here, we have non-immediate left recursion:
S → Ab
A → Sc | d
Here, S is left recursive, because S ⇒ Ab ⇒ Scb.
Algorithm to Remove Left Recursion:
Input: Grammar G with no cycles or ε-productions.
Output: Equivalent Grammar with no left recursion.
Method:
1. Order the non-terminals as A₁, A₂, ..., Aₙ.
2. for i = 1 to n
   {
     for j = 1 to i-1
       replace each production of the form Aᵢ → Aⱼγ by Aᵢ → δ₁γ | δ₂γ | ... | δₖγ, where Aⱼ → δ₁ | δ₂ | ... | δₖ are all the current Aⱼ-productions;
     eliminate the immediate left recursion among the Aᵢ-productions;
   }
Example 5: Eliminate the left recursion from:
S → Ab
A → Sc | d
Ans. These productions don't have immediate left recursion, so it cannot be eliminated directly. We can use the algorithm here to remove the left recursion.
Step (1): Order the non-terminals S, A, i.e., mark S = A₁, A = A₂:
A₁ → A₂b   ...(1)
A₂ → A₁c | d   ...(2)
Step (2): Put the value of (1) in (2):
A₁ → A₂b
A₂ → A₂bc | d
Step (3): Again substitute A₁ = S and A₂ = A:
S → Ab   ...(3)
A → Abc | d   ...(4)
Step (4): Statement (4) has immediate left recursion. Removing it, we get:
S → Ab
A → dA'
A' → bcA' | ε
Example 6: Eliminate the left recursion from:
S → Aa | b
A → Ac | Sd | ε
Ans. As we don't have only immediate left recursion, we have to apply the algorithm here.
Step (1): Rename S, A as A₁, A₂ respectively:
A₁ → A₂a | b   ...(1)
A₂ → A₂c | A₁d | ε   ...(2)
Step (2): Substitute the value of A₁ of statement (1) in the R.H.S of statement (2):
A₁ → A₂a | b
A₂ → A₂c | A₂ad | bd | ε
Step (3): Again, substitute A₁ = S, A₂ = A:
S → Aa | b   ...(3)
A → Ac | Aad | bd | ε   ...(4)
Step (4): Statement (4) has immediate left recursion. Removing the immediate left recursion, we get:
S → Aa | b
A → bdA' | A'
A' → cA' | adA' | ε
5.2.4 Left Factoring
If more than one production is available to expand a non-terminal and it is not clear which production or alternative should be used, the grammar needs left factoring.
E.g. A → abc | abd
Both productions have the same prefix ab on the R.H.S. Here, it is not possible for us to decide which alternative to choose.
So such a Grammar can be left factored in the following way:
A → αβ₁ | αβ₂   is changed to   A → αA', A' → β₁ | β₂
Fig. 5.4: Left Factoring
Similarly, the above Grammar can be left factored as:
A → abc | abd   becomes   A → abA', A' → c | d
Here, α = ab, β₁ = c, β₂ = d
Example 7: Perform Left Factoring on the Grammar:
S → iCtS | iCtSeS | a
C → b
Ans. In productions (1) and (2), we have iCtS as the common prefix between them.
Comparing S → iCtS | iCtSeS with A → αβ₁ | αβ₂:
Here, α = iCtS, β₁ = ε, β₂ = eS
S → iCtSS'
S' → ε | eS
The complete grammar will be:
S → iCtSS' | a
S' → ε | eS
C → b
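A hedged sketch of the same transformation (our own; `os.path.commonprefix` finds α, ε is written as the empty string, and the grammar symbols are treated as plain characters):

```python
import os

# A -> a b1 | a b2  becomes  A -> a A', A' -> b1 | b2 ("" stands for epsilon).
def left_factor(nt, prods):
    prefix = os.path.commonprefix(prods)   # the common alpha
    if not prefix:
        return {nt: prods}                 # nothing shared; leave as is
    new = nt + "'"
    return {nt: [prefix + new],
            new: [p[len(prefix):] for p in prods]}
```

Applied to the two S-productions of Example 7, `left_factor("S", ["iCtS", "iCtSeS"])` gives S → iCtSS' with S' → ε | eS, matching the answer above.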
5.3 TOP DOWN PARSING WITHOUT BACKTRACKING
Backtracking looks more powerful, since we can select different alternatives. But Backtracking cannot be applied or implemented so easily in parsing. The various limitations of Backtracking, which we have discussed in the earlier section, make backtracking not a preferable choice. So, to avoid these limitations, we can use Top Down Parsing without Backtracking. Mainly two types of parsers use the technique of Top Down Parsing without Backtracking:
1. Recursive Descent Parser.
2. Predictive Parser.
Left recursion should be removed before top down parsing without backtracking can be applied. Also, perform left factoring on the Grammar before doing top down parsing without backtracking.
Backtracking Vs Non-Backtracking

   Top Down Parsing with Backtracking | Top Down Parsing without Backtracking
1. Parser can try all alternatives in any order till it successfully parses the string. | Parser has to select the correct alternative at each step.
2. Backtracking takes a lot of time, i.e., exponential time, to perform parsing. | Takes less time.
3. Grammar can have left recursion. | Left recursion will be removed before doing parsing.
4. A lot of overhead will be there while undoing the moves. | Less overhead.
5. Information once inserted will be removed from the symbol table while backtracking. | Information will not be removed.
6. Difficult to find where the error has actually occurred. | Easy to find the location of the error.
5.4 RECURSIVE DESCENT PARSER
A Recursive Descent Parser uses the technique of Top Down Parsing without backtracking.
Definition: A Recursive Descent Parser can be defined as a Parser that uses various recursive procedures to process the input string with no backtracking. Recursive Descent Parsers can be easily implemented using a language which is recursive in nature.
The first symbol of the string of the R.H.S of a production will uniquely determine the correct alternative to choose.
E.g. In the following Grammar, the first symbols, i.e., if, while & begin, uniquely determine which of the alternatives to choose, as a statement starting with if will be a conditional statement & a statement starting with while will be an iterative statement.
Stmt → if condition then Stmt else Stmt
     | while condition do Stmt
     | begin Stmt end
Example 8: Write down the algorithm using recursive procedures to implement the following grammar:
E → TE'
E' → +TE' | ε
T → FT'
T' → *FT' | ε
F → (E) | id
Ans.
Procedure E()
{
  T();
  E'();                        // E → TE'
}
Procedure E'()
{
  If input symbol = '+' then
    advance();
    T();
    E'();                      // E' → +TE'
}                               // else E' → ε
Procedure T()
{
  F();
  T'();                        // T → FT'
}
Procedure T'()
{
  If input symbol = '*' then
    advance();
    F();
    T'();                      // T' → *FT'
}                               // else T' → ε
Procedure F()
{
  If input symbol = 'id' then
    advance();
  else if input symbol = '(' then
    advance();
    E();
    If input symbol = ')' then
      advance();
    else error();
  else error();
}
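The pseudocode procedures translate directly into a runnable Python sketch (our own naming: E' and T' become `Ep` and `Tp`; tokens are assumed pre-split, e.g. `["id", "+", "id"]`):

```python
# Recursive descent parser for E -> TE', E' -> +TE'|eps,
# T -> FT', T' -> *FT'|eps, F -> (E)|id.
class Parser:
    def __init__(self, tokens):
        self.toks = tokens + ["$"]     # end marker
        self.pos = 0
    def look(self):
        return self.toks[self.pos]
    def match(self, t):
        if self.look() != t:
            raise SyntaxError(f"expected {t}, got {self.look()}")
        self.pos += 1                  # advance()
    def E(self):
        self.T(); self.Ep()
    def Ep(self):
        if self.look() == "+":
            self.match("+"); self.T(); self.Ep()   # E' -> +TE'
        # else E' -> eps
    def T(self):
        self.F(); self.Tp()
    def Tp(self):
        if self.look() == "*":
            self.match("*"); self.F(); self.Tp()   # T' -> *FT'
        # else T' -> eps
    def F(self):
        if self.look() == "id":
            self.match("id")
        elif self.look() == "(":
            self.match("("); self.E(); self.match(")")
        else:
            raise SyntaxError(f"unexpected {self.look()}")

def accepts(tokens):
    p = Parser(tokens)
    try:
        p.E()
        return p.look() == "$"         # all input consumed
    except SyntaxError:
        return False
```

`accepts(["id", "+", "id", "*", "id"])` returns True, while an incomplete input such as `["id", "+"]` is rejected.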
5.5 PREDICTIVE PARSERS
A Predictive Parser is another method that implements the technique of Top Down Parsing without Backtracking. A Predictive Parser has the following components:
1. Input Buffer: The input buffer contains the string to be parsed, followed by an end marker $ to indicate the end of the string, e.g.:

a + b $

Here a, +, b are terminal symbols.
2. Stack: It contains a combination of grammar symbols, with $ on the bottom of the stack. At the start of parsing, the stack contains $ and the start symbol of the Grammar.
3. Parsing Table: A two-dimensional array M[A, a], where A is a non-terminal & a is a terminal or $.
4. Parsing Program: It reads the symbol on top of the stack & the current input symbol and decides the action to be taken.
5. Actions: The Parsing Program takes various actions depending upon the symbol on top of the stack & the current input symbol. The various actions taken are:
1. If the stack is empty, i.e., it only contains $, and the current input symbol is also $: parsing is successful & will be stopped.
2. If the symbol at the top of the stack and the current input symbol to be read are both terminals and are the same (e.g., a and a): pop the symbol from the stack & advance to the next input symbol.
3. If both the top of the stack & the current input symbol are terminals and top of stack ≠ current input symbol (e.g., a ≠ b): error.
4. If the top of the stack is a non-terminal X & the input symbol is a terminal a: refer to the entry M[X, a] in the Parsing Table. If M[X, a] = X → ABC, then pop X from the stack & push C, B, A onto the stack.
5.5.1 Construction of Predictive Parser
Steps to be followed while parsing a string using a predictive parser:
1. Computation of FIRST & FOLLOW for the different symbols of the grammar.
2. Construction of the Predictive Parsing Table using FIRST and FOLLOW.
3. Parsing of the input string with the help of the Predictive Parsing Table.
5.5.2 Computation of FIRST
FIRST(α) is defined as the collection of terminal symbols which are the first letters of strings derived from α:
FIRST(α) = {a | α ⇒* aβ for some string β}
If X is a grammar symbol, then FIRST(X) will be:
1. If X is a terminal symbol, then FIRST(X) = {X}.
2. If X → ε, then FIRST(X) = {ε}.
3. If X is a non-terminal & X → aα, then FIRST(X) = {a}.
4. If X → Y₁Y₂Y₃...Yₙ, then FIRST(X) will be:
(a) If Y₁ is a terminal, then FIRST(X) = FIRST(Y₁Y₂Y₃) = {Y₁}.
(b) If Y₁ is a non-terminal and Y₁ does not derive the empty string, i.e., if FIRST(Y₁) does not contain ε, then FIRST(X) = FIRST(Y₁Y₂Y₃) = FIRST(Y₁).
(c) If FIRST(Y₁) contains ε, then
FIRST(X) = FIRST(Y₁Y₂Y₃) = FIRST(Y₁) - {ε} ∪ FIRST(Y₂Y₃)
Similarly, FIRST(Y₂Y₃) = {Y₂} if Y₂ is a terminal; otherwise, if Y₂ is a non-terminal, then
FIRST(Y₂Y₃) = FIRST(Y₂), if FIRST(Y₂) does not contain ε.
If FIRST(Y₂) contains ε, then
FIRST(Y₂Y₃) = FIRST(Y₂) - {ε} ∪ FIRST(Y₃)
Similarly, this method is repeated for the further grammar symbols Y₃, ..., Yₙ.
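The rules above amount to a fixed-point computation. A Python sketch (our own encoding: productions are lists of symbols, ε is the empty production, and "eps" marks ε in the resulting sets) for the expression grammar used later in this chapter:

```python
# FIRST sets for E -> TE', E' -> +TE'|eps, T -> FT', T' -> *FT'|eps, F -> (E)|id.
GRAMMAR = {
    "E":  [["T", "E'"]],
    "E'": [["+", "T", "E'"], []],      # [] is the epsilon production
    "T":  [["F", "T'"]],
    "T'": [["*", "F", "T'"], []],
    "F":  [["(", "E", ")"], ["id"]],
}

def first_of_seq(seq, first):
    """FIRST of a sequence of symbols (rule 4, applied left to right)."""
    out = set()
    for sym in seq:
        if sym not in GRAMMAR:          # terminal: rule 4(a)
            out.add(sym)
            return out
        out |= first[sym] - {"eps"}     # rule 4(b)/(c)
        if "eps" not in first[sym]:
            return out
    out.add("eps")                      # the whole sequence can vanish
    return out

def first_sets(grammar):
    first = {nt: set() for nt in grammar}
    changed = True
    while changed:                      # iterate until nothing changes
        changed = False
        for nt, prods in grammar.items():
            for prod in prods:
                new = first_of_seq(prod, first) - first[nt]
                if new:
                    first[nt] |= new
                    changed = True
    return first
```

For this grammar the result is FIRST(E) = FIRST(T) = FIRST(F) = {(, id}, FIRST(E') = {+, ε} and FIRST(T') = {*, ε}.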
5.5.3 Computation of FOLLOW
FOLLOW(A) is defined as the collection of terminal symbols that appear immediately to the right of A:
FOLLOW(A) = {a | S ⇒* αAaβ, where α, β can be any strings}
Rules to find FOLLOW:
1. If S is the start symbol, FOLLOW(S) = {$}.
2. If a production is of the form A → αBβ, β ≠ ε:
(a) If FIRST(β) does not contain ε, then FOLLOW(B) = {FIRST(β)}.
(b) If FIRST(β) contains ε, then
FOLLOW(B) = FIRST(β) - {ε} ∪ FOLLOW(A)
i.e., when β derives ε, the terminal after A will follow B.
3. If a production is of the form A → αB, then FOLLOW(B) = {FOLLOW(A)}.
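FOLLOW is computed the same way, by iterating the three rules until nothing changes. A self-contained Python sketch (our own), checked against the grammar of the next example (S → AaA | BbB, A → bB, B → ε):

```python
# FIRST and FOLLOW for S -> AaA | BbB, A -> bB, B -> eps.
GRAMMAR = {"S": [["A", "a", "A"], ["B", "b", "B"]],
           "A": [["b", "B"]],
           "B": [[]]}                   # [] is the epsilon production

def first_of_seq(seq, first):
    out = set()
    for sym in seq:
        if sym not in GRAMMAR:          # terminal
            out.add(sym)
            return out
        out |= first[sym] - {"eps"}
        if "eps" not in first[sym]:
            return out
    out.add("eps")                      # whole sequence can vanish
    return out

def compute_first():
    first = {nt: set() for nt in GRAMMAR}
    changed = True
    while changed:
        changed = False
        for nt, prods in GRAMMAR.items():
            for prod in prods:
                new = first_of_seq(prod, first) - first[nt]
                if new:
                    first[nt] |= new
                    changed = True
    return first

def compute_follow(first, start="S"):
    follow = {nt: set() for nt in GRAMMAR}
    follow[start].add("$")              # rule 1
    changed = True
    while changed:
        changed = False
        for lhs, prods in GRAMMAR.items():
            for prod in prods:
                for i, sym in enumerate(prod):
                    if sym not in GRAMMAR:
                        continue
                    tail = first_of_seq(prod[i + 1:], first)
                    add = tail - {"eps"}        # rule 2(a)
                    if "eps" in tail:           # rules 2(b) and 3
                        add |= follow[lhs]
                    new = add - follow[sym]
                    if new:
                        follow[sym] |= new
                        changed = True
    return follow
```

For this grammar the fixed point gives FOLLOW(S) = {$}, FOLLOW(A) = {$, a} and FOLLOW(B) = {$, a, b}, agreeing with the worked example that follows.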
Example 9: Find FIRST & FOLLOW for the following Grammar:
S → AaA | BbB
A → bB
B → ε
Ans. Computation of FIRST:
1. A → bB
∴ FIRST(A) = {b}
2. B → ε
∴ FIRST(B) = {ε}
3. S → AaA
Applying rule (4) of FIRST, i.e., comparing S → AaA with X → Y₁Y₂Y₃:
FIRST(S) = FIRST(AaA) = FIRST(A) = {b}
∴ FIRST(S) = {b}
4. S → BbB
FIRST(B) contains ε, i.e., B derives ε. ∴ Applying rule (4c):
FIRST(S) = FIRST(BbB) = FIRST(B) - {ε} ∪ FIRST(bB) = {ε} - {ε} ∪ {b} = {b}
∴ FIRST(A) = {b}
FIRST(B) = {ε}
FIRST(S) = {b}
Computation of FOLLOW:
Applying rule (1) of FOLLOW: FOLLOW(S) = {$}   ...(1)
1. S → AaA
Applying rule (2) of FOLLOW, comparing S → AaA with A → αBβ: α = ε, B = A, β = aA.
FIRST(β) = FIRST(aA) = {a}, which does not contain ε.
∴ Rule 2(a) of FOLLOW: FOLLOW(A) = {a}   ...(2)
Applying rule (3) of FOLLOW, comparing S → AaA with A → αB: α = Aa, B = A.
∴ FOLLOW(A) = {FOLLOW(S)}   ...(3)
2. S → BbB
Applying rule (2) of FOLLOW, comparing S → BbB with A → αBβ: α = ε, B = B, β = bB.
FIRST(β) = FIRST(bB) = {b}, which does not contain ε.
∴ Rule 2(a) of FOLLOW: FOLLOW(B) = {b}   ...(4)
Applying rule (3) of FOLLOW, comparing S → BbB with A → αB: α = Bb, B = B.
∴ FOLLOW(B) = {FOLLOW(S)}   ...(5)
3. A → bB
Rule (2) cannot be applied, as we can't match A → bB with A → αBβ: if we compare α = b and B = B, then β becomes ε, which is not allowed.
Applying rule (3) of FOLLOW, comparing A → bB with A → αB: α = b, B = B.
∴ FOLLOW(B) = {FOLLOW(A)}   ...(6)
From statements (1) to (6):
FOLLOW(S) = {$}   ...(1)
FOLLOW(A) = {a}   ...(2)
FOLLOW(A) = {FOLLOW(S)}   ...(3)
FOLLOW(B) = {b}   ...(4)
FOLLOW(B) = {FOLLOW(S)}   ...(5)
FOLLOW(B) = {FOLLOW(A)}   ...(6)
∴ FOLLOW(S) = {$}
From (1), (2), (3): FOLLOW(A) = {$, a}
From (4), (5), (6): FOLLOW(B) = {$, a, b}
Example 10: Find FIRST & FOLLOW for the following grammar:
E → E+T | T
T → T*F | F
F → (E) | id
Ans. Computation of FIRST:
1. E → E+T | T
Since FIRST(E) does not contain ε:
FIRST(E) = FIRST(E+T) = FIRST(E)
As E → T,
∴ FIRST(E) = {FIRST(T)}   ...(1)
2. T → T*F | F
As FIRST(T) does not contain ε, i.e., T does not derive ε:
FIRST(T) = FIRST(T*F) = {FIRST(T)}
As T → F, FIRST(T) = {FIRST(F)}   ...(2)
3. F → (E) | id
∴ By rule (3) of FIRST:
FIRST(F) = {(, id}   ...(3)
From (1), (2) & (3):
FIRST(F) = {(, id}
FIRST(T) = {FIRST(F)}
FIRST(E) = {FIRST(T)}
∴ FIRST(E) = FIRST(T) = FIRST(F) = {(, id}
Computation of FOLLOW:
E → E+T | T
T → T*F | F
F → (E) | id
Applying rule (1): FOLLOW(E) = {$}   ...(1)
(1) E → E+T
Applying rule (2) of FOLLOW, comparing E → E+T with A → αBβ: A = E, α = ε, B = E, β = +T.
Since FIRST(β) = FIRST(+T) = {+} does not contain ε:
Rule (2a) of FOLLOW: FOLLOW(E) = FIRST(+T) = {+}   ...(2)
Applying rule (3) of FOLLOW, comparing E → E+T | T with A → αB: α = E+, B = T (and α = ε, B = T).
∴ FOLLOW(T) = {FOLLOW(E)}   ...(3)
(2) T → T*F
Applying rule (2), comparing T → T*F with A → αBβ: A = T, α = ε, B = T, β = *F.
Since FIRST(β) = {*} does not contain ε:
Rule (2a) of FOLLOW: FOLLOW(T) = FIRST(*F) = {*}   ...(4)
Applying rule (3), comparing T → T*F | F with A → αB: α = T*, B = F (and α = ε, B = F).
∴ FOLLOW(F) = {FOLLOW(T)}   ...(5)
(3) F → (E)
Comparing F → (E) with A → αBβ: α = (, B = E, β = ).
Since FIRST(β) = {)} does not contain ε:
∴ Rule (2a) of FOLLOW: FOLLOW(E) = FIRST()) = {)}   ...(6)
Rule (3) cannot be applied to this production: it has ) at the right end, i.e., we can't compare F → (E) with A → αB.
Combining all 6 statements, we get:
FOLLOW(E) = {$}   ...(1)
FOLLOW(E) = {+}   ...(2)
FOLLOW(T) = {FOLLOW(E)}   ...(3)
FOLLOW(T) = {*}   ...(4)
FOLLOW(F) = {FOLLOW(T)}   ...(5)
FOLLOW(E) = {)}   ...(6)
∴ From rules (1), (2) & (6): FOLLOW(E) = {$, +, )}
From rules (3), (4) & (5): FOLLOW(T) = FOLLOW(F) = {+, *, ), $}
5.5.4 Construction of Predictive Parsing Table
After finding FIRST & FOLLOW for each grammar symbol, we will construct the Predictive Parsing Table. The algorithm to find the entries of the table requires FIRST & FOLLOW to be already computed.
Algorithm to construct the Predictive Parsing Table:
Input: Context Free Grammar G
Output: Predictive Parsing Table M
Method:
1. For each production A → α of Grammar G:
2. For each terminal a in FIRST(α), add A → α to M[A, a].
3. If ε is in FIRST(α), add A → α to M[A, b] for each terminal b in FOLLOW(A).
4. If ε is in FIRST(α) and $ is in FOLLOW(A), then add A → α to M[A, $].
5. All remaining entries in Table M are errors.
Fig. 5.6: Algorithm to construct the Predictive Parsing Table
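The algorithm of Fig. 5.6 is a short loop once FIRST & FOLLOW are known. A Python sketch (our own; FIRST and FOLLOW are hard-coded from the worked expression-grammar example, and `first_of` only needs the first symbol of a body because no nullable symbol starts a body in this grammar):

```python
# Build the predictive parsing table for the left-factored expression grammar.
EPS = "eps"
PRODS = [("E", ["T", "E'"]), ("E'", ["+", "T", "E'"]), ("E'", [EPS]),
         ("T", ["F", "T'"]), ("T'", ["*", "F", "T'"]), ("T'", [EPS]),
         ("F", ["(", "E", ")"]), ("F", ["id"])]
FIRST = {"E": {"(", "id"}, "T": {"(", "id"}, "F": {"(", "id"},
         "E'": {"+", EPS}, "T'": {"*", EPS}}
FOLLOW = {"E": {")", "$"}, "E'": {")", "$"},
          "T": {"+", ")", "$"}, "T'": {"+", ")", "$"},
          "F": {"+", "*", ")", "$"}}

def first_of(body):
    sym = body[0]
    if sym == EPS:
        return {EPS}
    return FIRST.get(sym, {sym})        # terminal: its own singleton

def build_table():
    table = {}
    for lhs, rhs in PRODS:
        f = first_of(rhs)
        for a in f - {EPS}:
            table[(lhs, a)] = rhs       # rule 2
        if EPS in f:
            for b in FOLLOW[lhs]:       # rules 3 and 4
                table[(lhs, b)] = rhs
    return table
```

`build_table()` reproduces the entries of Fig. 5.7: e.g., M[E, id] = E → TE' and M[E', )] = E' → ε, while M[E, +] stays an error entry.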
5.5.5 Predictive Parsing Program
After finding the various entries of the Parsing Table using FIRST & FOLLOW, parsing is performed using the Parsing Program. It takes the action depending upon the symbol on top of the stack & the current input symbol.
Algorithm for Predictive Parsing:
Input: String to be parsed & Parsing Table M for Grammar G.
Output: The stack will become empty, or the program will give an error.
Method:
while Stack is not empty do
{
  Let X be the top symbol on the stack
  Let a be the next input symbol
  If X is a terminal or $ then
    If X = a then
      Pop X from the stack & remove a from the input
    else error();
  else
    If M[X, a] = A → Y₁Y₂...Yₖ then
      Pop X and push Yₖ, ..., Y₂, Y₁ onto the stack (so that Y₁ is on top)
    else error();
}
Example 13: Construct the Predictive Parsing Table for the following grammar & also check whether the string id + id * id is accepted or not.
E → TE'
E' → +TE' | ε
T → FT'
T' → *FT' | ε
F → (E) | id
Ans. Step 1: Elimination of left recursion & left factoring.
As there is no left recursion in the Grammar, we will proceed as it is. Also, there is no need of left factoring.
Step 2: Computation of FIRST (taken from previous examples):
FIRST(E) = FIRST(T) = FIRST(F) = {(, id}
FIRST(E') = {+, ε}
FIRST(T') = {*, ε}
Step 3: Computation of FOLLOW:
FOLLOW(E) = FOLLOW(E') = {), $}
FOLLOW(T) = FOLLOW(T') = {+, ), $}
FOLLOW(F) = {+, *, ), $}
Step 4: Construction of the Predictive Parsing Table.
Create the table, i.e., write all non-terminals row-wise & all terminals column-wise.
The rows of M[A, a] are E, E', T, T', F and the columns are id, +, *, (, ), $.
Now, fill the table by applying the rules from the construction of the Predictive Parsing Table.
lL E+TE
Comparing E+ TE' withA+a
A> ASEa=TE
.. FIRST (@) = FIRST (TE = FIRST ON = ( i
_ Applying Rule (1) of Predictive Parsing Table:
Add E+ TE'to. MOB, (J and M [E, id]
= write:E + TE in front of Row (B) and Columns ((, id} | A)
2.
og Comparing itwithA+a
Po |+TE ,
A>
ASE a=+TE
: FIRST @) = FIRST GTE) = {4
“Add E' + +TE' toM[E', +]
> + TH in front of Row (E’) and Column (+) \ 1.42)
[ write production B
Bre
FIRST (@) =e}
* Applying Rule (2) of
Predictive Pareing Table.132
$32
Find FOLLOW (B') = {), $} /
= ‘Add Production poet, »)| and M(E’, $]
+ write B' > ¢ in front, n front of Row (B') and Columnt{ §, )}
4, TFT --
Comparing it with A > @
wal
Ts |FT
A> |@
a
.A=T, a=FT
FIRST (a) = FIRST (FT) =.FIRST (F) = {(, id}
Add Production T > FT" to M [T, (] and M[T, id]
[write T > > PT in front of Row (1) and Column {( i . wl’
5.0 T'>«FT'
Comparing it with A >a
Ts
A>
FIRST (@)= FIRST GFT) = {+}
Add Production T' > +FT" to-M [1 4].
: write T! + «FT" in front of Row {t) and Column (#)
G Ti+e
Comparing it with A> a:
15
ase
PIRST (a) = FIRST {e} = f@)
Applying Rule (2) of Predictive P.
Find FOLLOW (A) = FOLLOW =
Add Fos, 4), M i
‘arsing Table .
a4, ), 9}
and M ['T’, $]
7.  F → (E)
    FIRST (α) = FIRST ((E)) = {(}
    Add F → (E) to M [F, (]
    i.e. write F → (E) in front of Row (F) and Column (()     ...(7)
8.  F → id
    FIRST (α) = FIRST (id) = {id}
    Add F → id to M [F, id]
    i.e. write F → id in front of Row (F) and Column (id)     ...(8)
Combining statements (1) to (8) will generate the following Predictive Parsing Table:

          id       +          *          (        )        $
    E     E→TE'                          E→TE'
    E'              E'→+TE'                       E'→ε     E'→ε
    T     T→FT'                          T→FT'
    T'              T'→ε       T'→*FT'            T'→ε     T'→ε
    F     F→id                           F→(E)
Fig. 5.7 : Predictive Parsing Table
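Step 4 can be mechanized: Rule (1) places A → α under every terminal in FIRST (α), and Rule (2) places an ε-production under every terminal in FOLLOW (A). The sketch below is ours, not the book's; the FIRST/FOLLOW dictionaries simply restate Steps 2 and 3, and a table cell holds a list so that a multiply defined entry would be visible.

```python
EPS = "eps"  # stands for epsilon

PRODUCTIONS = [
    ("E",  ("T", "E'")),
    ("E'", ("+", "T", "E'")), ("E'", (EPS,)),
    ("T",  ("F", "T'")),
    ("T'", ("*", "F", "T'")), ("T'", (EPS,)),
    ("F",  ("(", "E", ")")), ("F", ("id",)),
]
FIRST = {"E": {"(", "id"}, "E'": {"+", EPS}, "T": {"(", "id"},
         "T'": {"*", EPS}, "F": {"(", "id"}}
FOLLOW = {"E": {")", "$"}, "E'": {")", "$"}, "T": {"+", ")", "$"},
          "T'": {"+", ")", "$"}, "F": {"+", "*", ")", "$"}}

def first_of_rhs(rhs):
    # FIRST of a right-hand side; a terminal's FIRST is the symbol itself
    result = set()
    for sym in rhs:
        f = FIRST.get(sym, {sym})
        result |= f - {EPS}
        if EPS not in f:
            return result
    result.add(EPS)
    return result

def build_table():
    table = {}                             # (non-terminal, terminal) -> productions
    for A, rhs in PRODUCTIONS:
        f = first_of_rhs(rhs)
        for a in f - {EPS}:                # Rule (1)
            table.setdefault((A, a), []).append((A, rhs))
        if EPS in f:                       # Rule (2)
            for b in FOLLOW[A]:
                table.setdefault((A, b), []).append((A, rhs))
    return table

M = build_table()
```

For this grammar every filled cell ends up with exactly one production, matching Fig. 5.7.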
Step 5: Checking Acceptance of String id + id * id using the Predictive Parsing
Program:
Initially the stack will contain the starting symbol E, with $ at the bottom of
the stack. The input buffer will contain the string with $ attached at the right end.
If top of stack = current input symbol, then the symbol from the top of the stack
will be popped and the input pointer advances, i.e. the next symbol is read.
    Stack        Input          Action
    $E           id+id*id$      E → TE'
    $E'T         id+id*id$      T → FT'
    $E'T'F       id+id*id$      F → id
    $E'T'id      id+id*id$      Remove id
    $E'T'        +id*id$        T' → ε
    $E'          +id*id$        E' → +TE'
    $E'T+        +id*id$        Remove +
    $E'T         id*id$         T → FT'
    $E'T'F       id*id$         F → id
    $E'T'id      id*id$         Remove id
    $E'T'        *id$           T' → *FT'
    $E'T'F*      *id$           Remove *
    $E'T'F       id$            F → id
    $E'T'id      id$            Remove id
    $E'T'        $              T' → ε
    $E'          $              E' → ε
    $            $              Accept
Fig. 5.8: Sequence of Moves by Predictive Parser
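The loop traced in Fig. 5.8 can be sketched as a short table-driven parser. This is an illustrative implementation, not the book's; `parse`, the dictionary `M`, and the token-list representation are our own choices. An empty list stands for an ε-production, so expanding it pushes nothing.

```python
# Table entries of Fig. 5.7, keyed by (non-terminal, lookahead terminal).
M = {
    ("E", "id"): ["T", "E'"],  ("E", "("): ["T", "E'"],
    ("E'", "+"): ["+", "T", "E'"], ("E'", ")"): [], ("E'", "$"): [],
    ("T", "id"): ["F", "T'"],  ("T", "("): ["F", "T'"],
    ("T'", "*"): ["*", "F", "T'"],
    ("T'", "+"): [], ("T'", ")"): [], ("T'", "$"): [],
    ("F", "id"): ["id"],       ("F", "("): ["(", "E", ")"],
}
NONTERMS = {"E", "E'", "T", "T'", "F"}

def parse(tokens):
    stack = ["$", "E"]                 # $ at the bottom, start symbol on top
    tokens = tokens + ["$"]            # $ attached at the right end of input
    pos = 0
    while stack:
        top = stack.pop()
        a = tokens[pos]
        if top == "$":
            return a == "$"            # accept only if input is exhausted too
        if top in NONTERMS:
            if (top, a) not in M:
                return False           # blank (error) entry: reject
            stack.extend(reversed(M[(top, a)]))   # expand the production
        elif top == a:
            pos += 1                   # match: pop terminal, advance pointer
        else:
            return False               # terminal mismatch
    return False
```

Calling `parse(["id", "+", "id", "*", "id"])` performs exactly the moves of Fig. 5.8 and accepts the string.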
5.5.6 LL (1) Grammar
In LL (1):
    the first L means left-to-right scanning of the input,
    the second L means leftmost derivation, and
    1 is the number of lookahead symbols used by the parser at each step to
    make a parsing action decision.
If only one symbol is needed to decide which action to perform, then these
parsers are known as LL (1) Parsers. Grammars on which these parsers are
applied are called LL (1) Grammars.
A Grammar is LL (1) if it satisfies the following properties:
1.  The parsing table has no multiply defined entries.
2.  If a production in the grammar is of the form A → α | β then:
    (a) FIRST (α) ∩ FIRST (β) = φ, i.e. α and β do not both derive strings
        beginning with the same terminal symbol a.
    (b) At most one of α and β can derive the empty string ε.
    (c) If β ⇒* ε, then FIRST (α) ∩ FOLLOW (A) = φ, i.e. α does not derive
        any string beginning with a terminal in FOLLOW (A).
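As a sketch (the function name and set representation are ours, not the book's), condition 2 for a pair of alternatives A → α | β can be checked directly from the FIRST sets of the two alternatives and FOLLOW (A), with the string `"eps"` marking that an alternative derives ε:

```python
EPS = "eps"  # marker for the empty string inside a FIRST set

def ll1_pair_ok(first_alpha, first_beta, follow_A):
    # (a) the alternatives must not share a lookahead terminal
    if (first_alpha - {EPS}) & (first_beta - {EPS}):
        return False
    # (b) at most one alternative may derive the empty string
    if EPS in first_alpha and EPS in first_beta:
        return False
    # (c) if one alternative derives eps, the other must not start
    #     with any terminal in FOLLOW(A)
    if EPS in first_beta and (first_alpha & follow_A):
        return False
    if EPS in first_alpha and (first_beta & follow_A):
        return False
    return True
```

For Example 13's pair E' → +TE' | ε, `ll1_pair_ok({"+"}, {"eps"}, {")", "$"})` returns True, so that pair causes no conflict; a pair failing any check produces a multiply defined table entry.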
Example 14. Consider the following grammar:
    S  → iCtSS' | a
    S' → eS | ε
    C  → b
(1) Construct the Predictive Parsing table of the Grammar.
(2) Is this Grammar LL (1)?
Ans. Step 1: Eliminate left recursion and left factor the grammar if required.
As there is no left recursion and the grammar is already left factored, we
proceed as it is.
Step 2: Computation of FIRST
    (i)   FIRST (S)  = {i, a}
    (ii)  FIRST (S') = {e, ε}
    (iii) FIRST (C)  = {b}
Step 3: Computation of FOLLOW
Applying Rule (1) of FOLLOW:
    FOLLOW (S) = {$}                       ...(1)
ISHAN'S Compiler Design
Apply Rule (2) on S → i C t S S' with B = C and β = tSS'.
FIRST (tSS') = {t} does not contain ε.
By Rule (2a), FOLLOW (C) = FIRST (tSS')
    ⇒ FOLLOW (C) = {t}                     ...(2)
Again, apply Rule (2) on a different combination:
    S → iCt S S'
Here A = S, B = S (the inner S) and β = S'.
FIRST (β) = FIRST (S') = {e, ε} contains ε.
By Rule (2b), FOLLOW (S) = (FIRST (S') − {ε}) ∪ FOLLOW (S)
    FOLLOW (S) = ({e, ε} − {ε}) ∪ FOLLOW (S)
    FOLLOW (S) = {e} ∪ FOLLOW (S)          ...(3)
Applying Rule (3) of FOLLOW on
    S → iCtS S'
B = S' is at the right end of the production, so
    FOLLOW (S') = FOLLOW (S)               ...(4)
(2) S' → eS
Rule (2) cannot be applied on this production.
Applying Rule (3) of FOLLOW:
    S' → e S
B = S is at the right end of the production, so
    FOLLOW (S) = FOLLOW (S')               ...(5)
The remaining productions S → a, S' → ε and C → b do not match with any Rule.
Combining statements (1) to (5):
    FOLLOW (S) = {$}                       ...(1)
    FOLLOW (C) = {t}                       ...(2)
    FOLLOW (S) = {e} ∪ FOLLOW (S)          ...(3)
    FOLLOW (S') = FOLLOW (S)               ...(4)
    FOLLOW (S) = FOLLOW (S')               ...(5)
Solving these, we get:
    FOLLOW (S) = FOLLOW (S') = {e, $}
    FOLLOW (C) = {t}
Step 4: Construction of Predictive Parsing Table
Write all Non-terminals S, S', C row-wise and all terminals a, b, e, i, t, $
column-wise.
(1) S → iCtSS'
Comparing S → iCtSS' with A → α:
    A = S, α = iCtSS'
FIRST (α) = FIRST (iCtSS') = {i}
Applying Rule (1) of Predictive Parsing Table:
Add S → iCtSS' to M [S, i]
i.e. write S → iCtSS' in front of Row (S) and Column (i)     ...(1)
(2) S → a
Comparing it with A → α:
    A = S, α = a
FIRST (α) = FIRST (a) = {a}
Add S → a to M [S, a]
i.e. write S → a in front of Row (S) and Column (a)     ...(2)
(3) S' → eS
Comparing it with A → α:
    A = S', α = eS
FIRST (α) = FIRST (eS) = {e}
Add S' → eS to M [S', e]
i.e. write S' → eS in front of Row (S') and Column (e)     ...(3)
(4) S' → ε
FIRST (α) = FIRST (ε) = {ε}
Applying Rule (2) of Predictive Parsing Table, since ε is in FIRST (α):
Find FOLLOW (S') = {e, $}
Add S' → ε to M [S', e] and M [S', $]
i.e. write S' → ε in front of Row (S') and Columns {e, $}     ...(4)
(5) C → b
Comparing it with A → α:
    A = C, α = b
FIRST (α) = FIRST (b) = {b}
Add C → b to M [C, b]
i.e. write C → b in front of Row (C) and Column (b)     ...(5)
Combining statements (1) to (5), we get the following Predictive Parsing Table:

          a       b       e               i            t       $
    S     S→a                             S→iCtSS'
    S'                    S'→eS, S'→ε                          S'→ε
    C             C→b
This Grammar is not LL (1) because in Row (S') and Column (e) we have
multiple entries, for the two productions:
    S' → eS and S' → ε
S' → eS appears there because FIRST (eS) = {e};
S' → ε appears there because FOLLOW (S') = {e, $} contains e.
Hence this Grammar is not an LL (1) Grammar.
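The conflict found above can be confirmed mechanically by rebuilding the table with Rules (1) and (2), just as in Example 13. This sketch is ours, not the book's; the FIRST and FOLLOW values are copied from Steps 2 and 3.

```python
EPS = "eps"  # stands for epsilon

PRODUCTIONS = [
    ("S",  ("i", "C", "t", "S", "S'")), ("S", ("a",)),
    ("S'", ("e", "S")), ("S'", (EPS,)),
    ("C",  ("b",)),
]
FIRST_RHS = {                   # FIRST of each right-hand side (from Step 2)
    ("i", "C", "t", "S", "S'"): {"i"}, ("a",): {"a"},
    ("e", "S"): {"e"}, (EPS,): {EPS}, ("b",): {"b"},
}
FOLLOW = {"S": {"e", "$"}, "S'": {"e", "$"}, "C": {"t"}}   # from Step 3

table = {}
for A, rhs in PRODUCTIONS:
    f = FIRST_RHS[rhs]
    for a in f - {EPS}:                    # Rule (1)
        table.setdefault((A, a), []).append(rhs)
    if EPS in f:                           # Rule (2)
        for b in FOLLOW[A]:
            table.setdefault((A, b), []).append(rhs)

# Any cell holding more than one production is a multiply defined entry.
conflicts = {cell: rhss for cell, rhss in table.items() if len(rhss) > 1}
```

The only conflict reported is the cell M [S', e], holding both S' → eS and S' → ε: exactly the dangling-else ambiguity that makes this grammar non-LL (1).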