Lexical analysis

What is Lexical Analysis?
Lexical analysis is a concept that applies to computer science in much the same way as it applies to linguistics. In essence, lexical analysis means grouping a stream of letters or sounds into units that carry meaning. In linguistics this process is part of parsing; in computer science it can be called parsing or, more commonly, tokenizing.

The idea of lexical analysis in computer science is that it breaks an input stream down into 'lexemes', where each lexeme is classified as a token, the basic unit of meaning. The resulting stream of tokens is then handed to the next stage of the compiler, which analyzes the tokens in order to carry out the correct instructions. Both humans and computers perform lexical analysis, but computers do it differently and in a far more technical way. The way a computer performs lexical analysis does not have to be comprehensible to humans; it only has to be programmed into the computer system. Programs that perform lexical analysis in computer science are commonly referred to as lexers, tokenizers, or scanners.
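As a concrete illustration, here is a minimal tokenizer sketch in Python. The token names, the regular expressions, and the sample input are illustrative assumptions chosen for this example, not taken from any particular compiler; production lexers are usually generated by tools or written as hand-rolled state machines.

```python
import re

# Illustrative token categories for simple arithmetic expressions.
# The names and patterns below are assumptions for this sketch.
TOKEN_SPEC = [
    ("NUMBER",   r"\d+(?:\.\d+)?"),  # integer or decimal literal
    ("IDENT",    r"[A-Za-z_]\w*"),   # identifier (e.g. a variable name)
    ("OP",       r"[+\-*/=]"),       # arithmetic or assignment operator
    ("LPAREN",   r"\("),
    ("RPAREN",   r"\)"),
    ("SKIP",     r"\s+"),            # whitespace: recognized, then discarded
    ("MISMATCH", r"."),              # anything else: a lexical error
]
TOKEN_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    """Yield (token_type, lexeme) pairs for the input string."""
    for match in TOKEN_RE.finditer(source):
        kind = match.lastgroup   # name of the rule that matched
        lexeme = match.group()   # the matched substring (the lexeme)
        if kind == "SKIP":
            continue
        if kind == "MISMATCH":
            raise SyntaxError(f"unexpected character: {lexeme!r}")
        yield (kind, lexeme)

print(list(tokenize("total = 4 + 2 * price")))
# [('IDENT', 'total'), ('OP', '='), ('NUMBER', '4'), ('OP', '+'),
#  ('NUMBER', '2'), ('OP', '*'), ('IDENT', 'price')]
```

The scanner classifies each lexeme (such as "total" or "4") as a token, and the resulting token stream is what a parser or compiler would consume next.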
