Support parsing with prepared tokens #13
Merged
Rationale for this change
This change improves the flexibility of the SQL parser by allowing clients to pass a pre-tokenized input stream and an explicit dialect for parsing.
It enables custom tokenization workflows — including external tokenizers, dialect-specific preprocessing, or token manipulation — to be performed outside the DataFusion crate, without modifying internal parser logic.
The default behavior remains unchanged, ensuring full backward compatibility.
What changes are included in this PR?
- Added `DFParser::from_dialect_and_tokens` for creating a parser with externally provided tokens.
- Added `DFParser::parse_tokens_with_dialect` for parsing SQL statements from a given token sequence and dialect (see the usage sketch after the note below).
- Updated `DFParser`'s initialization (`new`, `new_with_dialect`) and parsing functions (`parse_sql`, `parse_sql_with_dialect`) to internally delegate to the new functions, preserving the original semantics.
Note: The `sqlparser` dependency remains pinned to a forked version (`tarantool/datafusion-sqlparser-rs`, branch `release-42.0.0`) that supports external tokenization workflows via iterator-based tokenizers. The parser itself does not depend on these tokenizers directly.
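For illustration, here is a minimal sketch of how a client could drive the new entry points with a pre-tokenized input stream. The exact parameter and return types of `from_dialect_and_tokens` and `parse_tokens_with_dialect` are assumptions (not taken from the diff); only `DFParser::parse_statement`, `Tokenizer`, and `GenericDialect`, plus the usual `datafusion::sql` re-exports, are pre-existing APIs.

```rust
use datafusion::sql::parser::DFParser;
use datafusion::sql::sqlparser::dialect::GenericDialect;
use datafusion::sql::sqlparser::tokenizer::Tokenizer;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let dialect = GenericDialect {};
    let sql = "SELECT a FROM t WHERE a > 1";

    // Tokenize outside of the DataFusion parser. This is the step that can be
    // swapped for an external or dialect-specific tokenizer, or followed by
    // arbitrary token manipulation, before the tokens are handed to DFParser.
    let tokens = Tokenizer::new(&dialect, sql).tokenize_with_location()?;

    // One-shot parsing of a prepared token sequence (assumed signature).
    let statements = DFParser::parse_tokens_with_dialect(tokens.clone(), &dialect)?;
    println!("parsed {} statement(s)", statements.len());

    // Or construct a parser from the tokens and drive it manually
    // (assumed constructor shape; parse_statement is the existing API).
    let mut parser = DFParser::from_dialect_and_tokens(&dialect, tokens)?;
    let statement = parser.parse_statement()?;
    println!("{statement:?}");

    Ok(())
}
```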
Are these changes tested?
No new tests were added. The changes are fully integrated into the existing parsing flow, and all existing tests pass, which confirms that the new code paths preserve the current behavior.
Are there any user-facing changes?
Yes, two new opt-in APIs are now exposed in the parser module, enabling advanced usage scenarios with pre-tokenized SQL input. These changes do not affect existing behavior or interfaces and are fully backward-compatible.