Lexical analysis is the process of converting a string of characters into a sequence of tokens. A token is a string of characters with an assigned and thus identified meaning. The sequence of tokens, called the token stream, is used as input for the parser.
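As a minimal illustration of these terms, the sketch below shows what a token and a token stream could look like in C. The type names (TokenKind, Token) and the sample input are purely illustrative and are not taken from this repository.

```c
/* Illustrative sketch of a token and a token stream; not the
   definitions used in this repository. */
#include <stdio.h>

typedef enum {
    TOK_KEYWORD,      /* e.g. cns, var, Tensor */
    TOK_IDENTIFIER,
    TOK_NUMBER,
    TOK_OPERATOR,
    TOK_EOF
} TokenKind;

typedef struct {
    TokenKind kind;
    const char *lexeme;   /* the matched character string */
} Token;

int main(void) {
    /* The token stream for "var x = 10" as a parser would receive it. */
    Token stream[] = {
        {TOK_KEYWORD,    "var"},
        {TOK_IDENTIFIER, "x"},
        {TOK_OPERATOR,   "="},
        {TOK_NUMBER,     "10"},
        {TOK_EOF,        ""}
    };
    for (size_t i = 0; stream[i].kind != TOK_EOF; i++)
        printf("token %zu: %s\n", i, stream[i].lexeme);
    return 0;
}
```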
The lexer recognizes all basic keywords such as cns, var, and Tensor, as well as the operators used by our language.
It can also detect unclosed comments, unclosed strings, and overflows in constant values; a sketch of how such checks might work is given below.
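The following is only a hedged sketch of these ideas, with made-up names (is_keyword, scan_block_comment, constant_overflows); the actual implementation in this repository may differ.

```c
/* Illustrative sketch of keyword lookup and two of the error checks
   mentioned above. Names and behaviour are assumptions, not this
   repository's code. */
#include <errno.h>
#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Keyword table for the language: cns, var, Tensor. */
static const char *keywords[] = { "cns", "var", "Tensor" };

static bool is_keyword(const char *lexeme) {
    for (size_t i = 0; i < sizeof keywords / sizeof keywords[0]; i++)
        if (strcmp(lexeme, keywords[i]) == 0)
            return true;
    return false;
}

/* Scan a block comment body; '*pos' indexes just past the opening
   delimiter in 'src'. Returns false when end of input is reached
   before the closing delimiter, i.e. the comment is unclosed.
   An unclosed string could be caught the same way, by hitting end
   of input before the closing quote. */
static bool scan_block_comment(const char *src, size_t *pos) {
    while (src[*pos] != '\0') {
        if (src[*pos] == '*' && src[*pos + 1] == '/') {
            *pos += 2;            /* consume the closing delimiter */
            return true;
        }
        (*pos)++;
    }
    return false;                 /* unclosed comment */
}

/* Detect overflow in a decimal integer constant: strtol sets errno
   to ERANGE when the value does not fit in a long. */
static bool constant_overflows(const char *lexeme) {
    errno = 0;
    (void)strtol(lexeme, NULL, 10);
    return errno == ERANGE;
}

int main(void) {
    size_t pos = 0;
    printf("is_keyword(\"Tensor\") -> %d\n", is_keyword("Tensor"));
    printf("comment closed? -> %d\n", scan_block_comment("* unclosed", &pos));
    printf("overflow? -> %d\n", constant_overflows("99999999999999999999"));
    return 0;
}
```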
Go inside the Lexer folder and then run:
make
make test
This should run the test cases and report their results.
C Lexer