Module tokens

Token and TokenType definitions for the Patitas lexer.

The lexer produces a stream of Token objects that the parser consumes. Each Token has a type, value, and source location.

Thread Safety:

Token is frozen (immutable) and safe to share across threads. TokenType is an enum (inherently immutable).
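For orientation, a minimal sketch of a parser-side loop over the token stream. Only EOF and BLANK_LINE are named on this page, and the import path patitas.tokens is an assumption, not a documented entry point.

    # Sketch of a consumer loop; the import path is an assumption.
    from patitas.tokens import Token, TokenType

    def count_blank_lines(tokens: list[Token]) -> int:
        """Walk the stream until EOF, counting blank-line tokens."""
        count = 0
        for tok in tokens:
            if tok.type is TokenType.EOF:
                break
            if tok.type is TokenType.BLANK_LINE:
                count += 1
        return count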

Classes

TokenType

Token types produced by the lexer.

Organized by category for clarity:

  • Document structure (EOF, BLANK_LINE)
  • Block elements (headings, code blocks, quotes, lists)
  • Inline elements (text, emphasis, links, images)
  • Directives and roles
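A rough sketch of how a category-grouped enum like this could be laid out; apart from EOF and BLANK_LINE, the member names below are illustrative assumptions, not the real members.

    from enum import Enum, auto

    class TokenType(Enum):
        # Document structure (EOF and BLANK_LINE are documented above)
        EOF = auto()
        BLANK_LINE = auto()
        # Block elements (hypothetical member names)
        HEADING = auto()
        CODE_BLOCK = auto()
        QUOTE = auto()
        LIST_ITEM = auto()
        # Inline elements (hypothetical member names)
        TEXT = auto()
        EMPHASIS = auto()
        LINK = auto()
        IMAGE = auto()
        # Directives and roles (hypothetical member names)
        DIRECTIVE = auto()
        ROLE = auto()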
Token

A token produced by the lexer.

Tokens are the atomic units passed from lexer to parser. Each token has a type, string value, and source location.

Thread Safety: Frozen dataclass ensures immutability for safe sharing.

Attributes

Name         Type            Description
type         TokenType       The token type (from the TokenType enum)
value        str             The raw string value from the source
location     SourceLocation  Source location used in error messages
line_indent  int             Pre-computed indent level of the line, in spaces (tabs expand to 4). Set by the lexer at token creation; -1 if not computed.
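Putting the attributes together, the class is presumably shaped roughly like the frozen dataclass below; the TokenType stub, the SourceLocation field names, and the -1 default for line_indent are assumptions drawn from the descriptions above, not the real definitions.

    from dataclasses import dataclass
    from enum import Enum, auto

    class TokenType(Enum):          # minimal stub; see the sketch above
        EOF = auto()
        BLANK_LINE = auto()

    @dataclass(frozen=True)
    class SourceLocation:           # assumed field names
        line: int
        column: int

    @dataclass(frozen=True)
    class Token:
        """Frozen, so instances are immutable and safe to share across threads."""
        type: TokenType
        value: str
        location: SourceLocation
        line_indent: int = -1       # -1 means the lexer did not compute the indent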

Methods

lineno
Line number (convenience accessor).
property
def lineno(self) -> int
Returns: int

col
Column offset (convenience accessor).
property
def col(self) -> int
Returns: int
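Both accessors presumably just delegate to the token's location. A sketch using stand-in classes, assuming SourceLocation exposes line and column fields (the stand-in names _Location and _Token are purely illustrative).

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class _Location:                # stand-in for SourceLocation
        line: int
        column: int

    @dataclass(frozen=True)
    class _Token:                   # stand-in showing only the accessors
        location: _Location

        @property
        def lineno(self) -> int:
            """Line number (convenience accessor)."""
            return self.location.line

        @property
        def col(self) -> int:
            """Column offset (convenience accessor)."""
            return self.location.column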
Internal Methods

__repr__
Compact repr for debugging.
def __repr__(self) -> str
Returns: str
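The exact repr format is not documented here; a plausible shape, written as a standalone helper so the sketch stays self-contained, is:

    def _token_repr(tok) -> str:
        """Roughly what a compact debugging repr might produce (format is an assumption)."""
        return f"Token({tok.type.name}, {tok.value!r} @ {tok.lineno}:{tok.col})"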