
Lexical Analysis

sikeveo

back after sem2
Joined: Feb 22, 2004 · Messages: 1,794 · Location: North Shore · Gender: Male · HSC: 2005
I don't understand what it says in the SDD book.

Is the table made up from what is in the source code? Or is it populated with all the legitimate lexemes of the language? A quick rundown of the process would greatly help.
 
Joined: Nov 4, 2004 · Messages: 3,550 · Location: Sydney · Gender: Male · HSC: 2005
The tokens are categorised in the table during lexical analysis.
 
Joined: Nov 4, 2004 · Messages: 3,550 · Location: Sydney · Gender: Male · HSC: 2005
What?! That's actually what happens. Don't you have the rest of the definition?

Lexical analysis scans each element and assigns it a token, and this token is categorised in a table. Tokens can be categorised in two ways: either by the language syntax (i.e. operators) or by the programmer's own elements (e.g. variables).
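A rough sketch of that idea in Python (not from the SDD textbook; the token names, the reserved-word list and the sample source line are made up for illustration):

```python
# A toy scanner: each element of the source is matched and given a token,
# and each token is categorised either as language syntax (reserved words,
# operators) or as a programmer-created element (identifiers, constants).
import re

RESERVED = {"IF", "THEN", "ELSE", "ENDIF"}       # language syntax: reserved words
OPERATORS = {"+", "-", "*", "/", "=", "<", ">"}  # language syntax: operators

def tokenise(source):
    tokens = []
    for element in re.findall(r"[A-Za-z_]\w*|\d+|[^\s\w]", source):
        if element.upper() in RESERVED:
            tokens.append(("RESERVED_WORD", element))   # language syntax
        elif element in OPERATORS:
            tokens.append(("OPERATOR", element))        # language syntax
        elif element.isdigit():
            tokens.append(("CONSTANT", element))        # programmer element
        else:
            tokens.append(("IDENTIFIER", element))      # programmer element
    return tokens

print(tokenise("total = total + 1"))
# [('IDENTIFIER', 'total'), ('OPERATOR', '='), ('IDENTIFIER', 'total'),
#  ('OPERATOR', '+'), ('CONSTANT', '1')]
```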
 

sikeveo

back after sem2
Joined: Feb 22, 2004 · Messages: 1,794 · Location: North Shore · Gender: Male · HSC: 2005
So what goes into the table? The identifiers from the code, or is the table already full of all the correct identifiers, which the code is then compared against?
 

sunny

meh.
Joined: Jul 7, 2002 · Messages: 5,350 · Gender: Male · HSC: 2002
What goes in depends on the lexer. The main idea here is that after lexical analysis, parsing becomes much simpler: only tokens are being dealt with, which makes syntax checking a simpler task. Wikipedia has a good entry on this.
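To illustrate that point (using the same made-up token categories as the sketch above, not anything from a textbook): once the source has been reduced to tokens, a syntax check only has to look at token categories rather than raw characters.

```python
# After lexical analysis the parser sees only (category, lexeme) pairs.
# Hypothetical check that a statement has the shape IDENTIFIER = expression.
def is_assignment(tokens):
    return (
        len(tokens) >= 3
        and tokens[0][0] == "IDENTIFIER"
        and tokens[1] == ("OPERATOR", "=")
        and all(kind in ("IDENTIFIER", "CONSTANT", "OPERATOR")
                for kind, _ in tokens[2:])
    )

print(is_assignment([("IDENTIFIER", "total"), ("OPERATOR", "="),
                     ("IDENTIFIER", "total"), ("OPERATOR", "+"),
                     ("CONSTANT", "1")]))  # True
```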
 

MarsBarz

Member
Joined: Aug 1, 2005 · Messages: 282 · Location: NSW · Gender: Male · HSC: 2005
I don't have my notes with me right now, so this is just from memory.

Lexical analysis is the process that ensures that all reserved words, identifiers, constants and operators are legitimate members of the language's lexicon.
Process:
1. Redundant characters are removed (i.e. remarks, indentation, spaces, etc.).
2. Each element is identified and assigned a token. If an element cannot be identified, the translator generates an error message (see the sketch below).
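A minimal sketch of those two steps, assuming a made-up language where remarks start with ' and elements are separated by spaces (neither assumption comes from the textbook):

```python
# Step 1: strip redundant characters (remarks, indentation, extra spaces).
# Step 2: identify each remaining element and assign it a token; anything
#         that cannot be identified produces an error message.
import re

def lexical_analysis(line):
    line = re.sub(r"'.*$", "", line)   # drop a remark (assumed ' syntax)
    line = line.strip()                # drop indentation and trailing spaces
    tokens = []
    for element in re.findall(r"\S+", line):  # assumes space-separated elements
        if re.fullmatch(r"[A-Za-z_]\w*|\d+|[+\-*/=<>()]", element):
            tokens.append(element)     # recognisable element: give it a token
        else:
            raise SyntaxError(f"unrecognised element: {element}")
    return tokens

print(lexical_analysis("   count = count + 1   ' increment the counter"))
# ['count', '=', 'count', '+', '1']
```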



Whilst I don't think it is necessary to know much about the token table (it's not even mentioned in the Excel book), I'll try to answer your question with the Sam Davis book. According to him, the table initially contains all the reserved words and operators that are built-in elements of the language. As lexical analysis continues, identifiers and constants are added to the token table.

So from his explanation, it seems that at the end of lexical analysis the table should contain:
1. All the legitimate elements of the programming language, such as reserved words and operators (already present).
2. The actual constants and identifiers which the programmer has created whilst coding (see the sketch below).
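Based on that description, a token table might look roughly like this (my own sketch of what the Sam Davis explanation describes; the entries and names are invented for illustration):

```python
# Token table pre-loaded with the language's reserved words and operators;
# identifiers and constants created by the programmer are appended to it
# as lexical analysis encounters them.
token_table = [
    ("RESERVED_WORD", "IF"), ("RESERVED_WORD", "THEN"), ("RESERVED_WORD", "ENDIF"),
    ("OPERATOR", "="), ("OPERATOR", "+"), ("OPERATOR", "<"),
]

def add_to_table(kind, lexeme):
    # Identifiers and constants are only added the first time they appear.
    if (kind, lexeme) not in token_table:
        token_table.append((kind, lexeme))

# During lexical analysis of "total = total + 1":
add_to_table("IDENTIFIER", "total")
add_to_table("CONSTANT", "1")

for kind, lexeme in token_table:
    print(kind, lexeme)
# ...the pre-loaded reserved words and operators, followed by:
# IDENTIFIER total
# CONSTANT 1
```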


That is my overall understanding of the lexical analysis process, but I don't think it is necessary to learn it in that much detail.
 
