v0.31 (#1378)
* Changes to alter cgi dependency to email.Messages

* feat: allow viz height and width parameters

* fix: use python3.8 syntax

* fix: python3.8 syntax

* docs: comment PDF viz dimensions XOR
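
  A minimal sketch of how the viz dimension parameters added above might be used. It assumes the new values are exposed as `viz_height`/`viz_width` keyword arguments on `PDFRequestOptions` and that, per the XOR note, they must be supplied together; server URL and view LUID are placeholders.

  ```python
  import tableauserverclient as TSC

  tableau_auth = TSC.PersonalAccessTokenAuth("token-name", "token-value", site_id="site-name")
  server = TSC.Server("https://tableau.example.com", use_server_version=True)

  with server.auth.sign_in(tableau_auth):
      view = server.views.get_by_id("view-luid")

      # Assumption: viz_height and viz_width are new keyword arguments and must be
      # given together (supplying only one is rejected), hence the XOR docs note.
      pdf_options = TSC.PDFRequestOptions(
          page_type=TSC.PDFRequestOptions.PageType.A4,
          orientation=TSC.PDFRequestOptions.Orientation.Landscape,
          viz_height=1080,
          viz_width=1920,
      )
      server.views.populate_pdf(view, pdf_options)
      with open("view.pdf", "wb") as f:
          f.write(view.pdf)
  ```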

* Add support for System schedule type

I'm not fully clear on where these might come from, but this change should let TSC
work in such cases.

Fixes #1349

* Add failing test retrieving a task with a 24-hour (aka daily) interval

* Add 24 (hours) as a valid interval which can be returned from the server
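
  A rough sketch of reading such a schedule back from the server. It assumes extract refresh tasks expose their embedded schedule via `schedule_item`, as the new test suggests; credentials and URL are placeholders.

  ```python
  import tableauserverclient as TSC

  tableau_auth = TSC.PersonalAccessTokenAuth("token-name", "token-value", site_id="site-name")
  server = TSC.Server("https://tableau.example.com", use_server_version=True)

  with server.auth.sign_in(tableau_auth):
      tasks, _ = server.tasks.get()
      for task in tasks:
          # A daily extract refresh may now come back with a 24-hour interval
          # instead of failing to parse.
          schedule = task.schedule_item
          if schedule is not None and schedule.interval_item is not None:
              print(task.id, schedule.interval_item)
  ```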

* Add Python 3.12 to test matrix

* Tweak test action to stop double-running everything

* feat: add description support on wb publish
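
  A short sketch of the new description support on workbook publish. The server URL, project LUID, and file path are placeholders, and whether the description is sent on publish in this exact form is an assumption based on the commit message.

  ```python
  import tableauserverclient as TSC

  tableau_auth = TSC.PersonalAccessTokenAuth("token-name", "token-value", site_id="site-name")
  server = TSC.Server("https://tableau.example.com", use_server_version=True)

  with server.auth.sign_in(tableau_auth):
      new_workbook = TSC.WorkbookItem(project_id="project-luid", name="Sales Overview")
      # Assumption: the description set here is now included in the publish request
      new_workbook.description = "Quarterly sales overview, published via TSC"
      new_workbook = server.workbooks.publish(
          new_workbook, "sales_overview.twbx", TSC.Server.PublishMode.CreateNew
      )
      print(new_workbook.id, new_workbook.description)
  ```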

* Add Data Acceleration and Data Freshness Policy support (#1343)

* Add data acceleration & data freshness policy functions

* Add unit tests and raise errors on missing params

* fix types & spell checks

* addressed some feedback

* addressed feedback

* cleanup code

* Revert "Merge branch 'add_data_acceleration_and_data_freshness_policy_support' of https://github.com/tableau/server-client-python into add_data_acceleration_and_data_freshness_policy_support"

This reverts commit 5b30e57, reversing
changes made to 5789e32.

* fix formatting

* Address feedback

* mypy & formatting changes

* Improve robustness of Pager results

In some cases, Tableau Server may report a different advertised total number of objects
than the actual number returned via the Pager.

This change adds one more check to prevent errors from happening in these
situations.

Fixes #1304
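
  For reference, the typical Pager usage this change hardens looks roughly like the following; credentials and URL are placeholders.

  ```python
  import tableauserverclient as TSC

  tableau_auth = TSC.PersonalAccessTokenAuth("token-name", "token-value", site_id="site-name")
  server = TSC.Server("https://tableau.example.com", use_server_version=True)

  with server.auth.sign_in(tableau_auth):
      # Pager fetches page after page behind a plain iterator; with this change a
      # stale advertised total should no longer cause an error part-way through.
      for workbook in TSC.Pager(server.workbooks):
          print(workbook.name)
  ```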

* Add Cloud Flow Task endpoint
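
  It is not obvious from the message alone which endpoint this maps to; one hedged reading is that cloud flow run tasks surface through the existing tasks endpoint with the RunFlow task type, sketched below with placeholder credentials.

  ```python
  import tableauserverclient as TSC
  from tableauserverclient import TaskItem

  tableau_auth = TSC.PersonalAccessTokenAuth("token-name", "token-value", site_id="site-name")
  server = TSC.Server("https://your-pod.online.tableau.com", use_server_version=True)

  with server.auth.sign_in(tableau_auth):
      # Assumption: flow run tasks are returned by the tasks endpoint when the
      # RunFlow task type is requested.
      flow_tasks, _ = server.tasks.get(task_type=TaskItem.Type.RunFlow)
      for task in flow_tasks:
          print(task.id, task.task_type, task.target)
  ```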

* cleanup

* black format

* add xml

* edit test initialization

* fix task initialization

* third time's the charm

* cleanup

* fix formatting

* feat: pass parameters in request options
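
  A hedged sketch of passing viz parameters alongside the existing view-filter support when exporting a view. The `parameter()` helper name is an assumption based on the commit message; URL and view LUID are placeholders.

  ```python
  import tableauserverclient as TSC

  tableau_auth = TSC.PersonalAccessTokenAuth("token-name", "token-value", site_id="site-name")
  server = TSC.Server("https://tableau.example.com", use_server_version=True)

  with server.auth.sign_in(tableau_auth):
      view = server.views.get_by_id("view-luid")

      csv_options = TSC.CSVRequestOptions(maxage=1)
      csv_options.vf("Region", "West")  # existing view-filter support
      # Assumption: a parameter() helper now forwards viz parameters in the request
      csv_options.parameter("Report Date", "2024-06-01")
      server.views.populate_csv(view, csv_options)
      with open("view.csv", "wb") as f:
          f.write(b"".join(view.csv))
  ```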

* chore: pin typing_extensions version

---------

Co-authored-by: markm <mark@securehst.com>
Co-authored-by: Mark Moreno <45011486+markm-io@users.noreply.github.com>
Co-authored-by: Jordan Woods <13803242+jorwoods@users.noreply.github.com>
Co-authored-by: Brian Cantoni <bcantoni@salesforce.com>
Co-authored-by: Brian Cantoni <bcantoni@bcantoni-ltm.internal.salesforce.com>
Co-authored-by: ltiffanydev <148500608+ltiffanydev@users.noreply.github.com>
Co-authored-by: liu.r <liu.r@salesforce.com>
8 people committed Jun 3, 2024
1 parent f84d7d5 commit 4018a0f
Showing 43 changed files with 1,428 additions and 38 deletions.
9 changes: 7 additions & 2 deletions .github/workflows/run-tests.yml
@@ -1,14 +1,19 @@
 name: Python tests

-on: [push, pull_request]
+on:
+  pull_request: {}
+  push:
+    branches:
+      - development
+      - master

 jobs:
   build:
     strategy:
       fail-fast: false
       matrix:
         os: [ubuntu-latest, macos-latest, windows-latest]
-        python-version: ['3.8', '3.9', '3.10', '3.11']
+        python-version: ['3.8', '3.9', '3.10', '3.11', '3.12']

     runs-on: ${{ matrix.os }}

1 change: 1 addition & 0 deletions pyproject.toml
@@ -16,6 +16,7 @@ dependencies = [
'packaging>=23.1', # latest as at 7/31/23
'requests>=2.31', # latest as at 7/31/23
'urllib3==2.0.7', # latest as at 7/31/23
'typing_extensions>=4.0.1',
]
requires-python = ">=3.7"
classifiers = [
109 changes: 109 additions & 0 deletions samples/update_workbook_data_acceleration.py
@@ -0,0 +1,109 @@
####
# This script demonstrates how to update workbook data acceleration using the Tableau
# Server Client.
#
# To run the script, you must have Python 3.7 or later installed.
####


import argparse
import logging

import tableauserverclient as TSC
from tableauserverclient import IntervalItem


def main():
    parser = argparse.ArgumentParser(description="Creates sample schedules for each type of frequency.")
    # Common options; please keep those in sync across all samples
    parser.add_argument("--server", "-s", help="server address")
    parser.add_argument("--site", "-S", help="site name")
    parser.add_argument("--token-name", "-p", help="name of the personal access token used to sign into the server")
    parser.add_argument("--token-value", "-v", help="value of the personal access token used to sign into the server")
    parser.add_argument(
        "--logging-level",
        "-l",
        choices=["debug", "info", "error"],
        default="error",
        help="desired logging level (set to error by default)",
    )
    # Options specific to this sample:
    # This sample has no additional options, yet. If you add some, please add them here

    args = parser.parse_args()

    # Set logging level based on user input, or error by default
    logging_level = getattr(logging, args.logging_level.upper())
    logging.basicConfig(level=logging_level)

    tableau_auth = TSC.PersonalAccessTokenAuth(args.token_name, args.token_value, site_id=args.site)
    server = TSC.Server(args.server, use_server_version=False)
    server.add_http_options({"verify": False})
    server.use_server_version()
    with server.auth.sign_in(tableau_auth):
        # Get workbooks
        all_workbooks, pagination_item = server.workbooks.get()
        print("\nThere are {} workbooks on site: ".format(pagination_item.total_available))
        print([workbook.name for workbook in all_workbooks])

        if all_workbooks:
            # Pick one workbook to try data acceleration on.
            # Note that data acceleration has a couple of requirements; please check the Tableau help page
            # to verify that your workbook/view is eligible for data acceleration.

            # For sample purposes, assume this workbook is eligible
            sample_workbook = all_workbooks[2]

            # Enable acceleration for all the views in the workbook
            enable_config = dict()
            enable_config["acceleration_enabled"] = True
            enable_config["accelerate_now"] = True

            sample_workbook.data_acceleration_config = enable_config
            updated: TSC.WorkbookItem = server.workbooks.update(sample_workbook)
            # Since we did not set any specific view, acceleration is enabled for all views in the workbook
            print("Enabled acceleration for all the views in the workbook " + updated.name + ".")

            # Disable acceleration on one of the views in the workbook.
            # You have to call populate_views first, then set the views of the workbook
            # to the ones you want to update.
            server.workbooks.populate_views(sample_workbook)
            view_to_disable = sample_workbook.views[0]
            sample_workbook.views = [view_to_disable]

            disable_config = dict()
            disable_config["acceleration_enabled"] = False
            disable_config["accelerate_now"] = True

            sample_workbook.data_acceleration_config = disable_config
            # To get the acceleration status in the response, set includeViewAccelerationStatus=true.
            # Note that you have to call populate_views first to get the acceleration status, since
            # acceleration status is per view (not per workbook)
            updated: TSC.WorkbookItem = server.workbooks.update(sample_workbook, True)
            view1 = updated.views[0]
            print('Disabled acceleration for 1 view "' + view1.name + '" in the workbook ' + updated.name + ".")

            # Get the acceleration status of the views in the workbook using workbooks.get_by_id;
            # this does not require calling populate_views beforehand
            my_workbook = server.workbooks.get_by_id(sample_workbook.id)
            view1 = my_workbook.views[0]
            view2 = my_workbook.views[1]
            print(
                "Fetching acceleration status for views in the workbook "
                + updated.name
                + ".\n"
                + 'View "'
                + view1.name
                + '" has acceleration_status = '
                + view1.data_acceleration_config["acceleration_status"]
                + ".\n"
                + 'View "'
                + view2.name
                + '" has acceleration_status = '
                + view2.data_acceleration_config["acceleration_status"]
                + "."
            )


if __name__ == "__main__":
    main()
218 changes: 218 additions & 0 deletions samples/update_workbook_data_freshness_policy.py
@@ -0,0 +1,218 @@
####
# This script demonstrates how to update the workbook data freshness policy using the Tableau
# Server Client.
#
# To run the script, you must have Python 3.7 or later installed.
####


import argparse
import logging

import tableauserverclient as TSC
from tableauserverclient import IntervalItem


def main():
    parser = argparse.ArgumentParser(description="Creates sample schedules for each type of frequency.")
    # Common options; please keep those in sync across all samples
    parser.add_argument("--server", "-s", help="server address")
    parser.add_argument("--site", "-S", help="site name")
    parser.add_argument("--token-name", "-p", help="name of the personal access token used to sign into the server")
    parser.add_argument(
        "--token-value", "-v", help="value of the personal access token used to sign into the server"
    )
    parser.add_argument(
        "--logging-level",
        "-l",
        choices=["debug", "info", "error"],
        default="error",
        help="desired logging level (set to error by default)",
    )
    # Options specific to this sample:
    # This sample has no additional options, yet. If you add some, please add them here

    args = parser.parse_args()

    # Set logging level based on user input, or error by default
    logging_level = getattr(logging, args.logging_level.upper())
    logging.basicConfig(level=logging_level)

    tableau_auth = TSC.PersonalAccessTokenAuth(args.token_name, args.token_value, site_id=args.site)
    server = TSC.Server(args.server, use_server_version=False)
    server.add_http_options({"verify": False})
    server.use_server_version()
    with server.auth.sign_in(tableau_auth):
        # Get workbooks
        all_workbooks, pagination_item = server.workbooks.get()
        print("\nThere are {} workbooks on site: ".format(pagination_item.total_available))
        print([workbook.name for workbook in all_workbooks])

        if all_workbooks:
            # Pick one workbook that has a live datasource connection.
            # For sample purposes, assume this workbook meets the criteria.
            # Data Freshness Policy is not available on extract & file-based datasources.
            sample_workbook = all_workbooks[2]

            # Get more info from the selected workbook.
            # Troubleshooting: if sample_workbook_extended.data_freshness_policy.option raises AttributeError,
            # it could mean the selected workbook does not have a live connection, which means it doesn't have
            # a data freshness policy. Change to another workbook with a live datasource connection.
            sample_workbook_extended = server.workbooks.get_by_id(sample_workbook.id)
            try:
                print(
                    "Workbook "
                    + sample_workbook.name
                    + " has data freshness policy option set to: "
                    + sample_workbook_extended.data_freshness_policy.option
                )
            except AttributeError:
                print(
                    "Workbook does not have a data freshness policy, possibly because the selected workbook "
                    "does not have a live connection. Change to another workbook using a live datasource connection."
                )

            # Update Workbook Data Freshness Policy to "AlwaysLive"
            sample_workbook.data_freshness_policy = TSC.DataFreshnessPolicyItem(
                TSC.DataFreshnessPolicyItem.Option.AlwaysLive
            )
            updated: TSC.WorkbookItem = server.workbooks.update(sample_workbook)
            print(
                "Workbook "
                + updated.name
                + " updated data freshness policy option to: "
                + updated.data_freshness_policy.option
            )

            # Update Workbook Data Freshness Policy to "SiteDefault"
            sample_workbook.data_freshness_policy = TSC.DataFreshnessPolicyItem(
                TSC.DataFreshnessPolicyItem.Option.SiteDefault
            )
            updated: TSC.WorkbookItem = server.workbooks.update(sample_workbook)
            print(
                "Workbook "
                + updated.name
                + " updated data freshness policy option to: "
                + updated.data_freshness_policy.option
            )

            # Update Workbook Data Freshness Policy to a "FreshEvery" schedule.
            # Set the schedule to be fresh every 10 hours.
            # Once data_freshness_policy is already populated (e.g. due to previous calls),
            # it is possible to change the option & other parameters directly, as below.
            sample_workbook.data_freshness_policy.option = TSC.DataFreshnessPolicyItem.Option.FreshEvery
            fresh_every_ten_hours = TSC.DataFreshnessPolicyItem.FreshEvery(
                TSC.DataFreshnessPolicyItem.FreshEvery.Frequency.Hours, 10
            )
            sample_workbook.data_freshness_policy.fresh_every_schedule = fresh_every_ten_hours
            updated: TSC.WorkbookItem = server.workbooks.update(sample_workbook)
            print(
                "Workbook "
                + updated.name
                + " updated data freshness policy option to: "
                + updated.data_freshness_policy.option
                + " with frequency of "
                + str(updated.data_freshness_policy.fresh_every_schedule.value)
                + " "
                + updated.data_freshness_policy.fresh_every_schedule.frequency
            )

            # Update Workbook Data Freshness Policy to a "FreshAt" schedule.
            # Set the schedule to be fresh at 10AM every day
            sample_workbook.data_freshness_policy.option = TSC.DataFreshnessPolicyItem.Option.FreshAt
            fresh_at_ten_daily = TSC.DataFreshnessPolicyItem.FreshAt(
                TSC.DataFreshnessPolicyItem.FreshAt.Frequency.Day, "10:00:00", "America/Los_Angeles"
            )
            sample_workbook.data_freshness_policy.fresh_at_schedule = fresh_at_ten_daily
            updated: TSC.WorkbookItem = server.workbooks.update(sample_workbook)
            print(
                "Workbook "
                + updated.name
                + " updated data freshness policy option to: "
                + updated.data_freshness_policy.option
                + " with frequency of "
                + str(updated.data_freshness_policy.fresh_at_schedule.time)
                + " every "
                + updated.data_freshness_policy.fresh_at_schedule.frequency
            )

            # Set the schedule to be fresh at 6PM every week on Wednesday and Sunday
            sample_workbook.data_freshness_policy = TSC.DataFreshnessPolicyItem(
                TSC.DataFreshnessPolicyItem.Option.FreshAt
            )
            fresh_at_6pm_wed_sun = TSC.DataFreshnessPolicyItem.FreshAt(
                TSC.DataFreshnessPolicyItem.FreshAt.Frequency.Week,
                "18:00:00",
                "America/Los_Angeles",
                [IntervalItem.Day.Wednesday, "Sunday"],
            )

            sample_workbook.data_freshness_policy.fresh_at_schedule = fresh_at_6pm_wed_sun
            updated: TSC.WorkbookItem = server.workbooks.update(sample_workbook)
            new_fresh_at_schedule = updated.data_freshness_policy.fresh_at_schedule
            print(
                "Workbook "
                + updated.name
                + " updated data freshness policy option to: "
                + updated.data_freshness_policy.option
                + " with frequency of "
                + str(new_fresh_at_schedule.time)
                + " every "
                + new_fresh_at_schedule.frequency
                + " on "
                + new_fresh_at_schedule.interval_item[0]
                + ","
                + new_fresh_at_schedule.interval_item[1]
            )

            # Set the schedule to be fresh at 12AM on the last day of every month
            sample_workbook.data_freshness_policy = TSC.DataFreshnessPolicyItem(
                TSC.DataFreshnessPolicyItem.Option.FreshAt
            )
            fresh_at_last_day_of_month = TSC.DataFreshnessPolicyItem.FreshAt(
                TSC.DataFreshnessPolicyItem.FreshAt.Frequency.Month, "00:00:00", "America/Los_Angeles", ["LastDay"]
            )

            sample_workbook.data_freshness_policy.fresh_at_schedule = fresh_at_last_day_of_month
            updated: TSC.WorkbookItem = server.workbooks.update(sample_workbook)
            new_fresh_at_schedule = updated.data_freshness_policy.fresh_at_schedule
            print(
                "Workbook "
                + updated.name
                + " updated data freshness policy option to: "
                + updated.data_freshness_policy.option
                + " with frequency of "
                + str(new_fresh_at_schedule.time)
                + " every "
                + new_fresh_at_schedule.frequency
                + " on "
                + new_fresh_at_schedule.interval_item[0]
            )

            # Set the schedule to be fresh at 12AM on the 1st, 13th, and 20th day of the month
            fresh_at_dates_of_month = TSC.DataFreshnessPolicyItem.FreshAt(
                TSC.DataFreshnessPolicyItem.FreshAt.Frequency.Month,
                "00:00:00",
                "America/Los_Angeles",
                ["1", "13", "20"],
            )

            sample_workbook.data_freshness_policy.fresh_at_schedule = fresh_at_dates_of_month
            updated: TSC.WorkbookItem = server.workbooks.update(sample_workbook)
            new_fresh_at_schedule = updated.data_freshness_policy.fresh_at_schedule
            print(
                "Workbook "
                + updated.name
                + " updated data freshness policy option to: "
                + updated.data_freshness_policy.option
                + " with frequency of "
                + str(new_fresh_at_schedule.time)
                + " every "
                + new_fresh_at_schedule.frequency
                + " on "
                + str(new_fresh_at_schedule.interval_item)
            )


if __name__ == "__main__":
    main()
1 change: 1 addition & 0 deletions tableauserverclient/__init__.py
@@ -10,6 +10,7 @@
DailyInterval,
DataAlertItem,
DatabaseItem,
DataFreshnessPolicyItem,
DatasourceItem,
FavoriteItem,
FlowItem,
1 change: 1 addition & 0 deletions tableauserverclient/models/__init__.py
@@ -5,6 +5,7 @@
from .data_acceleration_report_item import DataAccelerationReportItem
from .data_alert_item import DataAlertItem
from .database_item import DatabaseItem
from .data_freshness_policy_item import DataFreshnessPolicyItem
from .datasource_item import DatasourceItem
from .dqw_item import DQWItem
from .exceptions import UnpopulatedPropertyError
