Tiny Lexer

easy · compilers, lexing, tokens

Implement a lexer for the tiny language used in this pack.

The lexer turns source text into a list of tokens, each annotated with the line and column where it starts.

Function signature

func Lex(src string) ([]Token, error)
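
The signature assumes a Token type. If the pack leaves its shape up to you, a minimal sketch like the following is enough (all names here are illustrative, not part of the spec):

// A minimal Token sketch; any shape works as long as it records
// the kind, the source text, and the starting position.
type TokenKind int

const (
	KindKeyword TokenKind = iota
	KindIdent
	KindNumber
	KindOperator
	KindPunct
	KindEOF
)

type Token struct {
	Kind   TokenKind
	Lexeme string // exact source text of the token
	Line   int    // 1-based line where the token starts
	Col    int    // 1-based column where the token starts
}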

Token kinds

Your lexer must produce these token kinds:

  • Keywords: let, fn, if, else, return
  • Identifiers: /[A-Za-z_][A-Za-z0-9_]*/
  • Numbers: /[0-9]+/ (decimal integers)
  • Operators (when a one-character operator is a prefix of a two-character one, prefer the longer match; see the sketch after this list):
    • + - * /
    • = ==
    • ! !=
    • < <=
    • > >=
  • Punctuation: ( ) { } , ; .
  • End of file: EOF
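
Several two-character operators share a first character with a one-character operator, so the lexer must look one byte ahead and take the longer match when it fits. A sketch of that dispatch as a standalone helper (the name matchOperator is hypothetical, not part of the required API):

// matchOperator returns the operator lexeme starting at src[i],
// or "" if src[i] does not start an operator.
func matchOperator(src string, i int) string {
	switch src[i] {
	case '=', '!', '<', '>':
		// Prefer the two-character form when the next byte is '='.
		if i+1 < len(src) && src[i+1] == '=' {
			return src[i : i+2] // ==  !=  <=  >=
		}
		return src[i : i+1] // =  !  <  >
	case '+', '-', '*':
		return src[i : i+1]
	case '/':
		// Only an operator when it does not start a // comment;
		// run the comment check before calling this helper.
		return src[i : i+1]
	}
	return "" // not an operator
}

Keywords fall out the same way: scan a full identifier first, then check the lexeme against a small set (for example, map[string]bool{"let": true, "fn": true, ...}) before deciding between the keyword and identifier kinds.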

Whitespace and comments

  • Skip spaces, tabs, and newlines.
  • Line comments start with // and run to the end of the line.
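
A sketch of the skip loop, assuming a byte index i into src plus line and col counters maintained as described under Positions below (all illustrative names):

// Skip whitespace and line comments before reading the next token.
skip:
for i < len(src) {
	switch {
	case src[i] == ' ' || src[i] == '\t':
		i++
		col++
	case src[i] == '\n':
		i++
		line++
		col = 1
	case src[i] == '/' && i+1 < len(src) && src[i+1] == '/':
		// Line comment: consume everything up to the newline; the
		// '\n' case above then updates line and resets col.
		for i < len(src) && src[i] != '\n' {
			i++
			col++
		}
	default:
		break skip // the next token starts at src[i]
	}
}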

Positions

Each token must record the line and column where the token starts.

  • Lines start at 1.
  • Columns start at 1.
  • The EOF token should use the line/column after the last character.
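
If col advances once per consumed character and resets to 1 after every newline (as in the skip-loop sketch above), the EOF position falls out for free: when the main loop ends, the cursor already sits one past the last character. A sketch, reusing the same illustrative names:

// tokens is the slice Lex has been accumulating (illustrative).
// (line, col) now point one past the last character consumed,
// which is exactly what the EOF token should record.
tokens = append(tokens, Token{Kind: KindEOF, Line: line, Col: col})
return tokens, nil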

Errors

If you encounter a character that cannot start any token, return a non-nil error.
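
Including the position in the message makes failures easy to diagnose. A minimal sketch, reusing the illustrative cursor names from above and assuming fmt is imported:

// Report the offending character and where it was found.
return nil, fmt.Errorf("lex: unexpected character %q at line %d, col %d",
	src[i], line, col)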

Notes

  • Prefer readability over cleverness.
  • The lexer should be deterministic and easy to extend.