Classes
GoStateMachineLexer
Go lexer using composable mixins.
Go has clean, regular syntax, making it one of the simpler lexers.
Token Classification:
- Declaration keywords: func, type, struct, interface, const, var
- Namespace keywords: import, package
- Constants: true, false, nil, iota
- Types: Primitive types (int, string, bool, etc.)
- Builtins: make, len, cap, append, etc.
Special Handling:
- Exported names (starting with uppercase) → NAME_CLASS
- Raw strings (backticks) can span multiple lines
- Runes (character literals) use single quotes
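The classification lists above correspond to simple lookup tables. The sketch below is a minimal illustration of those categories, assuming the lexer keeps them as frozen sets; the set names, and any entries beyond the examples documented above, are assumptions rather than the lexer's actual attributes.

```python
# Illustrative keyword tables for the categories listed above.
# Set names, and entries beyond the documented examples, are assumptions.
DECLARATION_KEYWORDS = frozenset({"func", "type", "struct", "interface", "const", "var"})
NAMESPACE_KEYWORDS = frozenset({"import", "package"})
CONSTANTS = frozenset({"true", "false", "nil", "iota"})
PRIMITIVE_TYPES = frozenset({
    "bool", "byte", "rune", "string", "error",
    "int", "int8", "int16", "int32", "int64",
    "uint", "uint8", "uint16", "uint32", "uint64", "uintptr",
    "float32", "float64", "complex64", "complex128",
})
BUILTINS = frozenset({
    "make", "len", "cap", "append", "new", "copy", "delete",
    "close", "panic", "recover", "print", "println",
})
```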
Methods
tokenize
Tokenize Go source code.
def tokenize(self, code: str, config: LexerConfig | None = None) -> Iterator[Token]
Parameters
| Name | Type | Description |
|---|---|---|
| code | str | Go source code to tokenize. |
| config | LexerConfig \| None | Default: None |
Returns
Iterator[Token]
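A usage sketch of `tokenize`, assuming the class is importable under the import path shown (an assumption) and that tokens print usefully. The embedded Go sample exercises the raw-string and rune handling noted under Special Handling.

```python
# Hypothetical usage; the import path is an assumption.
from go_lexer import GoStateMachineLexer

source = '''package main

import "fmt"

func main() {
    msg := `raw strings
can span multiple lines`  // backtick raw string
    r := 'x'                // rune literal
    fmt.Println(msg, r)
}
'''

lexer = GoStateMachineLexer()
for token in lexer.tokenize(source):  # config defaults to None
    print(token)
```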
Internal Methods
_classify_word
Classify an identifier.
def _classify_word(self, word: str) -> TokenType
Parameters
| Name | Type | Description |
|---|---|---|
| word | str | The identifier to classify. |
Returns
TokenType
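A minimal sketch of what such a classification could look like, assuming lookup tables like those sketched under the class description and TokenType members with the names used below (both are assumptions, not the lexer's real internals); the final uppercase check is the exported-name rule from Special Handling.

```python
# Illustrative only: a plausible shape for the classification, not the lexer's real code.
# TokenType member names and the (abbreviated) keyword sets are assumptions.
from enum import Enum, auto

class TokenType(Enum):
    KEYWORD_DECLARATION = auto()
    KEYWORD_NAMESPACE = auto()
    KEYWORD_CONSTANT = auto()
    KEYWORD_TYPE = auto()
    NAME_BUILTIN = auto()
    NAME_CLASS = auto()   # exported identifiers (uppercase first letter)
    NAME = auto()

_DECLARATIONS = {"func", "type", "struct", "interface", "const", "var"}
_NAMESPACES = {"import", "package"}
_CONSTANTS = {"true", "false", "nil", "iota"}
_TYPES = {"int", "string", "bool"}            # abbreviated; see the tables sketched above
_BUILTINS = {"make", "len", "cap", "append"}  # abbreviated

def classify_word(word: str) -> TokenType:
    if word in _DECLARATIONS:
        return TokenType.KEYWORD_DECLARATION
    if word in _NAMESPACES:
        return TokenType.KEYWORD_NAMESPACE
    if word in _CONSTANTS:
        return TokenType.KEYWORD_CONSTANT
    if word in _TYPES:
        return TokenType.KEYWORD_TYPE
    if word in _BUILTINS:
        return TokenType.NAME_BUILTIN
    if word[:1].isupper():
        return TokenType.NAME_CLASS  # exported name, per Special Handling above
    return TokenType.NAME
```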