Anssi Halmeaho edited this page Dec 10, 2022 · 1 revision

stdlex

Provides access to FunL interpreter lexer.

Functions

tokenize

Tokenizes the string given as argument into a list of tokens (each represented as a map).

The return value is a list of three items:

  1. bool: true if tokenizing was successful, false otherwise
  2. string: error description if tokenizing failed
  3. list: one map per token

Format:

call(stdlex.tokenize <source-text:string>) -> list(<ok> <err> <tokens-list>)

Each token is a map with the following key-value pairs:

| Key | Value |
| --- | ----- |
| 'type' | type of the token (string) |
| 'value' | value of the token as a string |
| 'line' | line number in the source code (int) |
| 'pos' | position within the line (int) |
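
As a minimal sketch, tokenizing a short piece of source and printing the type of the first token could look like the following (assuming `stdio.printline` and the `head` and `get` built-ins behave as in the FunL documentation):

```
ns main

import stdlex
import stdio

main = proc()
	# unpack the three-item result list
	ok err tokens = call(stdlex.tokenize 'x = 10')
	# each token is a map; read its fields with get
	first = head(tokens)
	call(stdio.printline get(first 'type'))
end

endns
```

On success `ok` is true and `err` is an empty error description; the `tokens` list can then be walked like any other FunL list.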

The following token types are supported:

| Token type | Meaning |
| ---------- | ------- |
| 'tokenSymbol' | symbol |
| 'tokenAs' | `as` keyword |
| 'tokenStartNS' | `ns` keyword |
| 'tokenEndNS' | `endns` keyword |
| 'tokenNumber' | number |
| 'tokenString' | string |
| 'tokenDot' | `.` character |
| 'tokenOpenBracket' | opening bracket |
| 'tokenClosingBracket' | closing bracket |
| 'tokenComma' | `,` character |
| 'tokenTrue' | `true` (bool) |
| 'tokenFalse' | `false` (bool) |
| 'tokenFuncBegin' | `func` keyword |
| 'tokenProcBegin' | `proc` keyword |
| 'tokenFuncEnd' | `end` keyword |
| 'tokenImport' | `import` keyword |
| 'tokenEqualsSign' | `=` character |
| 'tokenExpander' | `:` character |
| 'tokenLineComment' | line comment (starting with `#`) |
| 'tokenMultiLineComment' | multiline comment (`/* ... */`) |
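
Note that comments are returned as ordinary tokens ('tokenLineComment', 'tokenMultiLineComment'). A caller interested only in code tokens can filter them out; a sketch, assuming `stdfu.filter` takes the list followed by a predicate function:

```
ns main

import stdlex
import stdfu

# true if the token is not a comment token
isCode = func(tok)
	type = get(tok 'type')
	and(
		not(eq(type 'tokenLineComment'))
		not(eq(type 'tokenMultiLineComment'))
	)
end

# drops comment tokens from a token list
dropComments = func(tokens)
	call(stdfu.filter tokens isCode)
end

endns
```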