Program Lexer
Implement a lexer for the expanded compiler language (v2).
The lexer turns source text into a list of tokens with line and column information.
Function signature
func Lex(src string) ([]Token, error)
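The spec does not pin down the Token type itself. One shape that fits the requirements below is a kind enum plus the lexeme and the 1-based start position; the names in this sketch are assumptions, not part of the spec.

```go
// Sketch only: the spec does not define Token, so these names are assumptions.
// What matters is a kind, the lexeme, and the token's start position.
type Kind int

const (
	KindKeyword Kind = iota
	KindIdent
	KindNumber
	KindString
	KindOperator
	KindPunct
	KindEOF
)

type Token struct {
	Kind   Kind   // token class
	Lexeme string // token text; for strings, the unescaped value
	Line   int    // 1-based line where the token starts
	Col    int    // 1-based column where the token starts
}
```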
Token kinds
Your lexer must produce these token kinds:
Keywords
let, var, const, fn, if, else, while, return, true, false, nil
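A common way to handle keywords is to scan an identifier first and then check it against a lookup table; a small sketch of that table (the variable name is illustrative, not required):

```go
// keywords is the reserved-word set listed above. After scanning an
// identifier, look it up here to decide whether it is a keyword token.
var keywords = map[string]bool{
	"let": true, "var": true, "const": true, "fn": true,
	"if": true, "else": true, "while": true, "return": true,
	"true": true, "false": true, "nil": true,
}
```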
Identifiers
/[A-Za-z_][A-Za-z0-9_]*/
Numbers
/[0-9]+/ (decimal integers)
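Both identifiers and numbers can be scanned with a simple consume-while loop over bytes; a sketch matching the two patterns above (the helper names are made up):

```go
// scanWhile consumes bytes of src starting at i while pred holds and returns
// the lexeme together with the index just past it.
func scanWhile(src string, i int, pred func(byte) bool) (string, int) {
	start := i
	for i < len(src) && pred(src[i]) {
		i++
	}
	return src[start:i], i
}

func isDigit(c byte) bool { return c >= '0' && c <= '9' }

func isIdentStart(c byte) bool {
	return c == '_' || ('a' <= c && c <= 'z') || ('A' <= c && c <= 'Z')
}

func isIdentPart(c byte) bool { return isIdentStart(c) || isDigit(c) }
```

An identifier is an isIdentStart byte followed by scanWhile with isIdentPart; a number is scanWhile with isDigit.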
Strings
- Double-quoted: "..."
- Supported escapes: \", \\, \n, \t
- Newlines are not allowed inside a string
- The token's Lexeme should be the unescaped string value (no quotes)
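A sketch of these string rules, assuming a byte-indexed scanner; the function name and error wording are assumptions:

```go
import "fmt"

// scanString is called with i at the opening quote. It returns the unescaped
// value (without quotes) and the index just past the closing quote.
func scanString(src string, i int) (string, int, error) {
	i++ // skip the opening quote
	var out []byte
	for {
		if i >= len(src) || src[i] == '\n' {
			return "", i, fmt.Errorf("unterminated string")
		}
		switch c := src[i]; {
		case c == '"':
			return string(out), i + 1, nil
		case c == '\\':
			if i+1 >= len(src) {
				return "", i, fmt.Errorf("unterminated string")
			}
			switch src[i+1] {
			case '"':
				out = append(out, '"')
			case '\\':
				out = append(out, '\\')
			case 'n':
				out = append(out, '\n')
			case 't':
				out = append(out, '\t')
			default:
				return "", i, fmt.Errorf("invalid escape sequence \\%c", src[i+1])
			}
			i += 2
		default:
			out = append(out, c)
			i++
		}
	}
}
```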
Operators
+ - * / = == ! != < <= > >=
Punctuation
( ) { } , ;
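The two-character operators (==, !=, <=, >=) need one byte of lookahead; trying them before the single-character forms keeps <= from being split into < and =. A sketch of that max-munch matching (the helper is illustrative and assumes comments were already skipped, so / here is plain division):

```go
// scanOperator matches the longest operator or punctuation token at i and
// returns its lexeme plus the index just past it; ok is false if nothing matches.
func scanOperator(src string, i int) (lexeme string, next int, ok bool) {
	// Two-character operators first, so "<=" is not split into "<" and "=".
	if i+1 < len(src) {
		switch two := src[i : i+2]; two {
		case "==", "!=", "<=", ">=":
			return two, i + 2, true
		}
	}
	switch src[i] {
	case '+', '-', '*', '/', '=', '!', '<', '>',
		'(', ')', '{', '}', ',', ';':
		return src[i : i+1], i + 1, true
	}
	return "", i, false
}
```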
End of file
EOF
Whitespace and comments
- Skip spaces, tabs, and newlines.
- Line comments start with // and run to the end of the line.
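Both kinds of trivia can be skipped by one helper called before each token; a sketch (the name is an assumption):

```go
// skipTrivia advances past spaces, tabs, newlines, and // line comments and
// returns the index of the next significant character (or len(src)).
func skipTrivia(src string, i int) int {
	for i < len(src) {
		switch {
		case src[i] == ' ' || src[i] == '\t' || src[i] == '\n':
			i++
		case src[i] == '/' && i+1 < len(src) && src[i+1] == '/':
			for i < len(src) && src[i] != '\n' {
				i++
			}
		default:
			return i
		}
	}
	return i
}
```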
Positions
Each token must record the line and column where the token starts.
- Lines start at 1.
- Columns start at 1.
- The EOF token should use the line/column after the last character.
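Position tracking is easiest if every character goes through a single advance helper that updates the counters; a sketch under the 1-based convention above (the scanner struct and its fields are assumptions):

```go
// scanner walks src by byte and keeps 1-based line/column counters.
type scanner struct {
	src  string
	pos  int // byte offset into src
	line int // starts at 1
	col  int // starts at 1
}

// advance consumes one byte and keeps line/col in sync. Record line/col
// *before* consuming a token's first byte to get the token's start position.
func (s *scanner) advance() byte {
	c := s.src[s.pos]
	s.pos++
	if c == '\n' {
		s.line++
		s.col = 1
	} else {
		s.col++
	}
	return c
}
```

With this bookkeeping, the EOF token can simply reuse the current line and col once the loop runs off the end of the input.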
Errors
Return a non-nil error for:
- unexpected characters
- unterminated strings
- invalid escape sequences
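The spec only requires a non-nil error, but carrying the position makes failures much easier to debug; one possible shape, with an assumed type name:

```go
import "fmt"

// LexError is one possible error shape; any non-nil error satisfies the spec.
type LexError struct {
	Line, Col int
	Msg       string
}

func (e *LexError) Error() string {
	return fmt.Sprintf("%d:%d: %s", e.Line, e.Col, e.Msg)
}
```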
Notes
- Prefer readability over cleverness.
- The lexer should be deterministic and easy to extend.
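As a quick sanity check before running the tests, a hypothetical caller (assuming Lex and the Token fields sketched earlier live in the same package) might print the token stream:

```go
package main

import "fmt"

func main() {
	toks, err := Lex(`let x = 41 + 1; // the answer`)
	if err != nil {
		fmt.Println("lex error:", err)
		return
	}
	for _, t := range toks {
		fmt.Printf("%d:%d %v %q\n", t.Line, t.Col, t.Kind, t.Lexeme)
	}
}
```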