Fix redshift specific character #475
```diff
@@ -361,6 +361,25 @@ impl<'a> Tokenizer<'a> {
         Ok(tokens)
     }
 
+    fn consume_sharp(
+        &self,
+        chars: &mut Peekable<Chars<'_>>,
+    ) -> Result<Option<Token>, TokenizerError> {
+        match chars.peek() {
+            Some('>') => {
+                chars.next();
+                match chars.peek() {
+                    Some('>') => {
+                        chars.next();
+                        Ok(Some(Token::HashLongArrow))
+                    }
+                    _ => Ok(Some(Token::HashArrow)),
+                }
+            }
+            _ => Ok(Some(Token::Sharp)),
+        }
+    }
+
     /// Get the next token or return None
     fn next_token(&self, chars: &mut Peekable<Chars<'_>>) -> Result<Option<Token>, TokenizerError> {
         //println!("next_token: {:?}", chars.peek());
```
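The extracted helper above can be exercised in isolation. Below is a standalone sketch that re-creates the `consume_sharp` dispatch with a pared-down `Token` enum (the real crate's `Token` has many more variants): after a leading `#` has been consumed, it peeks ahead to distinguish `#`, `#>`, and `#>>`.

```rust
use std::iter::Peekable;
use std::str::Chars;

// Pared-down stand-in for sqlparser's Token enum (illustration only).
#[derive(Debug, PartialEq)]
enum Token {
    Sharp,         // `#`
    HashArrow,     // `#>`
    HashLongArrow, // `#>>`
}

// Same peek-then-advance logic as the helper in the diff, minus `&self`
// and the `Result<Option<_>, _>` wrapper, so it can run standalone.
fn consume_sharp(chars: &mut Peekable<Chars<'_>>) -> Token {
    match chars.peek() {
        Some('>') => {
            chars.next();
            match chars.peek() {
                Some('>') => {
                    chars.next();
                    Token::HashLongArrow
                }
                _ => Token::HashArrow,
            }
        }
        _ => Token::Sharp,
    }
}

fn main() {
    assert_eq!(consume_sharp(&mut ">>".chars().peekable()), Token::HashLongArrow);
    assert_eq!(consume_sharp(&mut ">x".chars().peekable()), Token::HashArrow);
    assert_eq!(consume_sharp(&mut "x".chars().peekable()), Token::Sharp);
    println!("ok");
}
```

The helper only consumes as many characters as the token it recognizes, leaving the rest of the stream for the next tokenizer step.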
```diff
@@ -422,7 +441,11 @@ impl<'a> Tokenizer<'a> {
                     s += s2.as_str();
                     return Ok(Some(Token::Number(s, false)));
                 }
-                Ok(Some(Token::make_word(&s, None)))
+                if s == "#" {
+                    self.consume_sharp(chars)
+                } else {
+                    Ok(Some(Token::make_word(&s, None)))
+                }
             }
             // string
             '\'' => {
```

Contributor (author): @alamb What do you think about this change?

Contributor (author): Maybe we should check if the dialect is `RedshiftSqlDialect`?

Contributor: The change doesn't feel right to me (because it effectively skips
|
```diff
@@ -624,19 +647,7 @@ impl<'a> Tokenizer<'a> {
             }
             '#' => {
                 chars.next();
-                match chars.peek() {
-                    Some('>') => {
-                        chars.next();
-                        match chars.peek() {
-                            Some('>') => {
-                                chars.next();
-                                Ok(Some(Token::HashLongArrow))
-                            }
-                            _ => Ok(Some(Token::HashArrow)),
-                        }
-                    }
-                    _ => Ok(Some(Token::Sharp)),
-                }
+                self.consume_sharp(chars)
             }
             '@' => self.consume_and_return(chars, Token::AtSign),
             '?' => self.consume_and_return(chars, Token::Placeholder(String::from("?"))),
```
|
Contributor: Yeah, I am torn on this one - I worry about causing trouble for someone who is relying on `#` not being a valid pg identifier. I don't really have a great understanding of what people are doing / using sqlparser for, and I would say it is not particularly consistent about disallowing constructs that are not valid in other dialects of SQL 🤔 What do you think about adding a `RedshiftSqlDialect` to avoid unintended consequences? I think it could be pretty straightforward, something like:
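(The snippet the commenter attached is cut off in this excerpt.) As an illustration only, and not the commenter's actual proposal: a minimal sketch of such a dialect, assuming trait hooks like sqlparser's `is_identifier_start`/`is_identifier_part` (the trait is stubbed locally here so the example is self-contained), might look like:

```rust
// Simplified local stand-in for sqlparser's `Dialect` trait (illustration only).
trait Dialect {
    fn is_identifier_start(&self, ch: char) -> bool;
    fn is_identifier_part(&self, ch: char) -> bool;
}

#[derive(Debug)]
struct RedshiftSqlDialect;

impl Dialect for RedshiftSqlDialect {
    // Hypothetical rule: accept `#` as an identifier start, since Redshift
    // uses `#name` for temporary tables; other dialects would reject it.
    fn is_identifier_start(&self, ch: char) -> bool {
        ch.is_ascii_alphabetic() || ch == '_' || ch == '#'
    }

    fn is_identifier_part(&self, ch: char) -> bool {
        ch.is_ascii_alphanumeric() || ch == '_' || ch == '$' || ch == '#'
    }
}

fn main() {
    let dialect = RedshiftSqlDialect;
    assert!(dialect.is_identifier_start('#'));
    assert!(!dialect.is_identifier_start('1'));
    assert!(dialect.is_identifier_part('3'));
    println!("ok");
}
```

The tokenizer could then consult the active dialect before treating `#` as the start of a word, leaving the Postgres `#>`/`#>>` JSON operators intact under other dialects.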