query_runner -> query_results: improve logging, handle unhandled data types #6905

Open: wants to merge 24 commits into master (showing changes from 15 commits).
redash/query_runner/query_results.py (5 additions, 1 deletion)

@@ -109,9 +109,12 @@ def flatten(value):
         return json_dumps(value)
     elif isinstance(value, decimal.Decimal):
         return float(value)
-    elif isinstance(value, datetime.timedelta):
+    elif isinstance(value, (datetime.date, datetime.time, datetime.datetime, datetime.timedelta)):
         return str(value)
     else:
+        if logger.isEnabledFor(logging.DEBUG):
+            if not isinstance(value, (type(None), str, float, int)):
+                logger.debug("flatten() found unhandled type: %s", str(type(value)))
         return value
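
For context, a self-contained sketch (not part of the PR) of how the patched flatten() behaves; the branch cut off above the hunk is assumed to be the list/dict case, and json.dumps stands in for Redash's json_dumps helper:

import datetime
import decimal
import json
import logging

logger = logging.getLogger("redash.query_runner.query_results")

def flatten(value):
    if isinstance(value, (list, dict)):      # assumed: the condition sits above the hunk
        return json.dumps(value)             # Redash uses its own json_dumps helper here
    elif isinstance(value, decimal.Decimal):
        return float(value)
    elif isinstance(value, (datetime.date, datetime.time, datetime.datetime, datetime.timedelta)):
        return str(value)
    else:
        # Only spend time on the extra isinstance() check when DEBUG logging is enabled.
        if logger.isEnabledFor(logging.DEBUG):
            if not isinstance(value, (type(None), str, float, int)):
                logger.debug("flatten() found unhandled type: %s", str(type(value)))
        return value

flatten(datetime.date(2024, 1, 1))      # "2024-01-01" (previously passed through unchanged)
flatten(datetime.timedelta(hours=2))    # "2:00:00"
flatten(decimal.Decimal("1.5"))         # 1.5
flatten(object())                       # returned as-is; logs the unhandled type at DEBUG level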


@@ -134,6 +137,7 @@ def create_table(connection, table_name, query_results):
         column_list=column_list,
         place_holders=",".join(["?"] * len(columns)),
     )
+    logger.debug("INSERT template: %s", insert_template)

justinclift (Member):
Pretty sure we don't want debugging statements being run unconditionally. At least in the change above this one, the check seems to be done conditionally.

vtatarin (Author):
@justinclift thank you for the reply! There are two things here:

  • logger only logs messages (prints to stdout/stderr); nothing is actually run/executed. The default log level is INFO, so logger.debug is effectively conditional: the data is only emitted when the more verbose DEBUG level is enabled.
  • for the other added logger call there is an explicit check of the current log level, which avoids running the extra isinstance checks unless the level is DEBUG (not the default); see the sketch below.
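
A minimal, standalone illustration of that point (not from the PR), assuming the default level is INFO:

import logging

logging.basicConfig(level=logging.INFO)   # assumed default; DEBUG records are filtered out
logger = logging.getLogger("redash.query_runner.query_results")

# Effectively a no-op at INFO level: the call returns after the level check and the
# %-formatting never happens (though the argument expressions are still evaluated,
# which is why the explicit guard below is useful).
logger.debug("flatten() found unhandled type: %s", str(type(object())))

# The guard skips the extra isinstance() checks entirely unless an operator has
# turned DEBUG logging on.
if logger.isEnabledFor(logging.DEBUG):
    print("only reached when the effective level is DEBUG or lower")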

justinclift (Member):
@vtatarin Sorry, but I'm short on time for the next few weeks.

I'm getting some (Redash-related) stuff deployed to a data centre, and that's taking the majority of my focus time. When that's done I'll be able to look at PRs properly. 😄


     for row in query_results["rows"]:
         values = [flatten(row.get(column)) for column in columns]
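
A rough, standalone sketch (not from the PR) of what the new debug line surfaces; the format string that builds insert_template sits outside this hunk, so the INSERT shape, table name, and columns below are assumptions:

import sqlite3

table_name = "query_123"                 # hypothetical
columns = ["id", "name"]                 # hypothetical
query_results = {"rows": [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]}

column_list = ", ".join('"{}"'.format(c) for c in columns)
insert_template = "INSERT INTO {table} ({column_list}) VALUES ({place_holders})".format(
    table=table_name,
    column_list=column_list,
    place_holders=",".join(["?"] * len(columns)),
)
# The added logger.debug call would emit something like:
#   INSERT template: INSERT INTO query_123 ("id", "name") VALUES (?,?)

connection = sqlite3.connect(":memory:")
connection.execute('CREATE TABLE query_123 ("id", "name")')
for row in query_results["rows"]:
    # The real loop passes each value through flatten() first.
    values = [row.get(column) for column in columns]
    connection.execute(insert_template, values)
connection.commit()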