First version of a Turtle Parser (cleaned) #152

Merged
merged 1 commit into ad-freiburg:master on Nov 19, 2018
Conversation

niklas88
Member

Niklas: This is a `git rebase --interactive master` cleaned version of @joka921's code. I think the test failure in the original PR #140 is actually a result of his master branch not having its submodules synced (`git submodule sync && git submodule update --init --recursive`).

- Uses recursive descent parsing (LL(1) grammar); a minimal sketch follows after this list.
- Currently, lexing is done manually using Google's RE2 regexes.
- Sometimes lexing is done with custom C++ code where that is faster (e.g. for whitespace).
- It would probably be better to write a proper lexer using longest-match rules, or to use a lexer generator.
- Bzip2 decompression still has some flaws.
- Passing a *.ttl file with the -n argument of IndexBuilderMain will use the Turtle parser.
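To make the approach above concrete, here is a minimal, self-contained sketch of recursive-descent (LL(1)) parsing with RE2-based lexing. It is not the code from this PR: the class and token names (`TurtleSketchParser`, `Triple`, `kIriref`, ...) are invented for illustration, it only handles a toy subset of Turtle, and it assumes the classic `re2::StringPiece` / `RE2::Consume` API.

```cpp
// Minimal sketch: a hand-written LL(1) recursive-descent parser over tokens
// lexed with RE2 regexes, for a toy subset of Turtle (see lead-in above).
#include <re2/re2.h>

#include <iostream>
#include <optional>
#include <stdexcept>
#include <string>
#include <vector>

struct Triple {
  std::string subject, predicate, object;
};

class TurtleSketchParser {
 public:
  explicit TurtleSketchParser(std::string input)
      : _input(std::move(input)), _remaining(_input) {}

  // turtleDoc := triple* ; each triple is "subject predicate object ."
  std::vector<Triple> parse() {
    std::vector<Triple> result;
    skipWhitespaceAndComments();
    while (!_remaining.empty()) {
      result.push_back(parseTriple());
      skipWhitespaceAndComments();
    }
    return result;
  }

 private:
  Triple parseTriple() {
    Triple t;
    t.subject = expect(iriref(), "IRI subject");
    t.predicate = expect(iriref(), "IRI predicate");
    // LL(1): one token of lookahead decides between IRI and string literal.
    if (auto iri = iriref()) {
      t.object = *iri;
    } else {
      t.object = expect(literal(), "IRI or string literal object");
    }
    if (!RE2::Consume(&_remaining, kDot)) throw std::runtime_error("expected '.' after triple");
    return t;
  }

  // "Manual lexing" with RE2: token regexes are consumed directly from the
  // current input position.
  std::optional<std::string> iriref() { return consumeToken(kIriref); }
  std::optional<std::string> literal() { return consumeToken(kLiteral); }

  std::optional<std::string> consumeToken(const RE2& regex) {
    skipWhitespaceAndComments();
    std::string token;
    if (RE2::Consume(&_remaining, regex, &token)) return token;
    return std::nullopt;
  }

  // Whitespace and comments are skipped with plain C++ instead of a regex,
  // in the spirit of the "custom C++ code where it is faster" bullet.
  void skipWhitespaceAndComments() {
    while (!_remaining.empty()) {
      char c = _remaining[0];
      if (c == ' ' || c == '\t' || c == '\n' || c == '\r') {
        _remaining.remove_prefix(1);
      } else if (c == '#') {
        auto end = _remaining.find('\n');
        _remaining.remove_prefix(
            end == re2::StringPiece::npos ? _remaining.size() : end);
      } else {
        break;
      }
    }
  }

  static std::string expect(std::optional<std::string> token, const char* what) {
    if (!token) throw std::runtime_error(std::string("expected ") + what);
    return *token;
  }

  inline static const RE2 kIriref{R"(<([^>]*)>)"};
  inline static const RE2 kLiteral{R"("([^"]*)")"};
  inline static const RE2 kDot{R"(\.)"};

  std::string _input;
  re2::StringPiece _remaining;
};

int main() {
  TurtleSketchParser parser(
      "<http://example.org/s> <http://example.org/p> \"object\" .\n"
      "# a comment\n"
      "<http://example.org/s> <http://example.org/p> <http://example.org/o> .\n");
  for (const Triple& t : parser.parse()) {
    std::cout << t.subject << " " << t.predicate << " " << t.object << "\n";
  }
}
```

A real Turtle parser additionally has to handle prefixes, blank nodes, collections, numeric and typed literals, and escape sequences, which this sketch deliberately omits.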

niklas88 merged commit ccba014 into ad-freiburg:master on Nov 19, 2018.
niklas88 deleted the joka921-f.turtleParser branch on July 19, 2019.
Labels: None yet
Projects: None yet
Linked issues: None yet
2 participants