bbuck/go-lexer


This package provides a Lexer that works along the lines of the lexer design Rob Pike describes in his talk "Lexical Scanning in Go."

You define your own token types using the `lexer.TokenType` type (an `int`):

```go
const (
        StringToken lexer.TokenType = iota
        IntegerToken
        // and so on...
)
```

You then define your own state functions (`lexer.StateFunc`) that handle analyzing the input:

```go
func StringState(l *lexer.L) lexer.StateFunc {
        l.Next()   // eat the opening "
        l.Ignore() // drop it from the current value
        for l.Peek() != '"' {
                l.Next()
        }
        l.Emit(StringToken)

        return SomeStateFunction
}
```

It should be easy to make this Lexer consumable by a parser generated by goyacc with something along the lines of the following:

```go
type MyLexer struct {
        lexer.L
}

func (m *MyLexer) Lex(lval *yySymType) int {
        tok, done := m.NextToken()
        if done {
                return EOFToken
        }
        lval.val = tok.Value
        return tok.Type
}
```

License

MIT

About

Lexer based on Rob Pike's talk on YouTube (view README)
