**Before you file an issue**
- Make sure you specify the "read" dialect eg. `parse_one(sql, read="spark")`
- Make sure you specify the "write" dialect eg. `ast.sql(dialect="duckdb")`
**Fully reproducible code snippet**
Please include a fully reproducible code snippet or the input sql, dialect, and expected output.
Presto query:

```sql
SELECT JSON_EXTRACT_SCALAR(
  TRY(
    FILTER(
      CAST(JSON_EXTRACT(context, '$.active_sessions') AS ARRAY(MAP(VARCHAR, VARCHAR))),
      x -> x['event_data_schema'] = 'PresentationSession'
    )[1]['event_data']
  ),
  '$.thread_id'
)
```
SQLGlot translates it to the following incorrect Spark SQL:

```sql
SELECT GET_JSON_OBJECT(
  TRY(
    FILTER(
      CAST(GET_JSON_OBJECT(context, '$.active_sessions') AS ARRAY<MAP<STRING, STRING>>),
      x -> x['event_data_schema'] = 'PresentationSession'
    )[0]['event_data']
  ),
  '$.thread_id'
)
```
The correct Spark SQL is as follows:

```sql
SELECT GET_JSON_OBJECT(
  FILTER(
    FROM_JSON(GET_JSON_OBJECT(context, '$.active_sessions'), 'ARRAY<MAP<STRING, STRING>>'),
    x -> x['event_data_schema'] = 'PresentationSession'
  )[0]['event_data'],
  '$.thread_id'
)
```

The translated query fails in Spark because Spark cannot CAST a JSON string directly to a complex type such as `ARRAY<MAP<STRING, STRING>>` (`FROM_JSON` must be used instead), and Spark SQL has no generic `TRY()` function.
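For completeness, a minimal reproduction sketch (assuming sqlglot is installed; the exact rendering may vary by version):

```python
import sqlglot

# The Presto query from the report above.
presto_sql = """
SELECT JSON_EXTRACT_SCALAR(
  TRY(
    FILTER(
      CAST(JSON_EXTRACT(context, '$.active_sessions') AS ARRAY(MAP(VARCHAR, VARCHAR))),
      x -> x['event_data_schema'] = 'PresentationSession'
    )[1]['event_data']
  ),
  '$.thread_id'
)
"""

# Transpile Presto -> Spark. The current output keeps CAST(... AS ARRAY<MAP<...>>)
# and TRY(...), neither of which Spark supports, instead of using FROM_JSON(...).
print(sqlglot.transpile(presto_sql, read="presto", write="spark")[0])
```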
**Official Documentation**
Please include links to official SQL documentation related to your issue.