Instructions for creating a usage dashboard for your Google Workspace Add-on

By: Dave Abouav
Last Updated: December 18, 2023

This is not an officially supported Google product.

At the moment, Add-ons in Google Workspace offer only basic usage analytics via the Workspace Marketplace SDK. These include install data broken out by domains and seats (for Add-ons installed by Workspace admins), as well as individual end-user installs. This is useful information, but it doesn't tell you much about who is actively using your Add-on, nor does it let you break that usage down by different dimensions.

The code and instructions in this repo will help you gather and visualize Add-on usage data, such as active usage of your Add-on broken out by user characteristics. They also show you how to log specific events that correspond to actions you want to track (e.g. new installs, uses of particular features, etc.). You can copy the contents of dashboard.gs into your Apps Script project to get started. This project was developed to help me study usage of my open-source Sheets Add-on Flubaroo, which lets teachers grade assignments. Hence you'll see references to "grading assignments" below when discussing app-specific logging. The majority of the code and examples are generic, though, and can be used with any Add-on.

Step 1: Requirements and Initial Setup

Requirements

We assume here that you already have a published Add-on in the Workspace Marketplace, and hence also a GCP project with billing enabled. You should also have a Google Cloud Identity or Workspace account with access to the Google Cloud Console.

This project requires use of Google BigQuery, which (unlike Workspace APIs) is not free to use. While the cost is typically not high (a few dollars a month in my experience), you should be aware that your billing account will be charged, so keep it up to date. Note that Google Cloud sometimes offers up to $300 in free credits, which you could apply towards this accrued cost. While in general there can be costs associated with long-term log storage, we'll be using the default log expiration window of 30 days, which is free.

You will also need Google Cloud IAM permissions in this GCP project to enable the Cloud Logging API, create Logging sinks, and create a BigQuery dataset in which you can create and write to tables, as well as create and schedule queries. If you don't have this type of GCP access, you can ask someone who does to follow these steps on your behalf. In that case you will only need to be granted read access to the final BigQuery tables that are used to generate dashboards in Looker Studio.

Initial Setup

To start, make sure to enable the Cloud Logging API in GCP for your project. You can find this API by searching for it in the Cloud Console, like so:

Step 2: Add Logging to Your Add-on

Customize the Code

You should copy dashboard.gs into your project and customize it according to the instructions below:

  • Change the const variable addOnName to a string that identifies your Add-on. This string will appear in Cloud Logging, and is needed to distinguish this Add-on from any others you may want a separate dashboard for. (A minimal sketch of these customizations follows this list.)
  • The code in dashboard.gs assumes you have the user's timezone accessible (stored in user properties). If you don't already have a way to collect or store this, you can just edit the function _getUserTimezone() to return a string like "unknown".
  • The code calls MailApp.getRemainingDailyQuota() as part of determining whether the user is a regular consumer user (i.e. @gmail.com) or a Google Workspace user. This will result in the scope https://www.googleapis.com/auth/script.send_mail being requested. If you don't want your app to request this scope, you'll need to delete the code related to isWorkspaceUserAccount, as well as the references to it in the BigQuery queries shown later in these instructions.
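Here is a minimal sketch of the first two customizations (the 'timezone' user-property key and the fallback value are illustrative assumptions, not part of dashboard.gs):

// A unique string identifying this Add-on in Cloud Logging.
const addOnName = 'myAddOn';

// Return the user's timezone if your Add-on stores one in user properties;
// otherwise fall back to a fixed string like "unknown".
function _getUserTimezone() {
  var tz = PropertiesService.getUserProperties().getProperty('timezone');
  return tz ? tz : 'unknown';
}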

Add Active Usage Logging

Place a call to logDailyPing() somewhere in your code. The function takes no arguments. Because this is used to track daily active usage, you should consider what "active usage" means for your app and place the call in a related part of your code. For example, in Flubaroo I consider "active usage" to be only when an assignment gets graded, versus just when a user loads the Sheet or clicks a menu. Hence I placed my call to logDailyPing() in the function that's called when grading takes place. This same function is called both when the teacher grades the assignment by clicking a menu item, and when grading happens automatically (on Google Form submission). Hence anytime an assignment gets graded, regardless of what triggered it, that usage is tracked. You may wish to track active usage differently, and may even choose to place calls to logDailyPing() in more than one location, which is fine. The tracking that results from logDailyPing() will only happen once per 24-hour period, so there is no problem if it gets called multiple times per day.
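For instance, a sketch of the Flubaroo-style placement (gradeAssignment() is a hypothetical stand-in for whichever function represents active usage in your Add-on):

// Called both from the menu item and from the form-submit trigger.
function gradeAssignment() {
  // ... your grading logic ...

  // Counts as "active usage". Safe to call from multiple code paths:
  // it only logs once per 24-hour period.
  logDailyPing();
}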

Add Event Logging

The function logEvent(eventName, eventSpecificDetails) logs details about a specific event that took place during the execution of your Add-on, so call it wherever and whenever you wish to track an event. You could call it to record when a user first installs your Add-on, or when they take a certain action like making a selection or processing a file. In Flubaroo, for example, I call it to track the first time a user installs the Add-on, and whenever an assignment is graded.

The first argument eventName is a string that identifies the event being logged. The string can be anything, but be sure you use it for this event type only. That is, have a unique event name for each type of event you wish to log, and use it consistently.

The second argument eventSpecificDetails is an optional object with details you wish to log related to the specific event. If no details are to be logged, pass null as the second argument, or simply omit it. Note: only object fields of type 'string', 'number', and 'boolean' are supported for logging purposes (others are skipped).

Here's an example of how you might log an event that takes place when a user first installs your Add-on (i.e. a new user): logEvent('EVENT_FIRST_INSTALL')

Here's an example of how you might log an event that takes place when your Add-on has successfully processed a file: logEvent('EVENT_FILE_PROCESSED', {'fileName': 'report.pdf', 'fileSizeKB': 68.8})
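Combining the above, here is a sketch that highlights which detail fields get logged (the field names are hypothetical):

// Strings, numbers, and booleans are logged; fields of other types are skipped.
logEvent('EVENT_FILE_PROCESSED', {
  fileName: 'report.pdf',   // string: logged
  fileSizeKB: 68.8,         // number: logged
  ocrApplied: true,         // boolean: logged
  pageSizes: [8.5, 11]      // array: skipped
});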

Once your calls are in place, confirm that you see the corresponding log entries in Google Cloud Logs Explorer. If you don't see them, be sure you're checking logs for the correct GCP project, have waited a minute or two for the log entries to show up, and have the correct time window set. As an example, a Flubaroo log entry from logDailyPing() is shown in the image below. Note that the actual usage data is in the jsonPayload object:

Step 3: Route Logs to BigQuery

In order for your active usage and event logging to end up in BigQuery for analysis, we'll need to set up log routing. This is done from the Google Cloud Log Router.

Route Active Usage Logs

On the Log Router page, click "Create sink". For the information requested, enter the following in each of the sections:

  • Sink details:
    • Sink name: addOnName + "UsageLogSink" (e.g. flubarooUsageLogSink)
  • Sink destination:
    • Select sink service: BigQuery dataset
    • Select BigQuery dataset:
      • Select "Create new BigQuery dataset"
      • In the "Create dataset" panel that opens, enter a dataset ID such as addOnName + "UsageData" (i.e. flubarooUsageData). Remember this data set name as you'll reference it again later when setting up your BigQuery query.
      • Ensure "Enable table expiration" is not checked.
      • Accept all other defaults and create the dataset.
  • Choose logs to include in sink:
    Copy/paste the filter shown below into the "inclusion filter" box, but replace "flubarooDailyPing" with addOnName + "DailyPing", and replace "<project-id-your-project-id-here>" with the ID of your GCP project.

Click "Create Sink". If you encounter errors trying to create the BigQuery dataset, or the new sink, ensure that you have all of the necessary IAM permissions, and have enabled Cloud Logging API in your project.

Inclusion Filter:

"flubarooDailyPing"
resource.type="app_script_function"
logName="projects/<project-id-your-project-id-here>/logs/script.googleapis.com%2Fconsole_logs"
severity=DEBUG

Route Event Logs

Repeat the same exact steps in "Route Active Usage Logs" above, but with these changes:

  • For "Sink name", use addOnName + "EventLogSink" (i.e. flubarooEventLogSink)
  • When creating a new BigQuery dataset, enter a dataset ID such as addOnName + "EventData" (i.e. flubarooEventData)
  • For the inclusion filter, change the first line from "flubarooDailyPing" to addOnName + "EventLog" (i.e. flubarooEventLog)
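For example, with addOnName set to "flubaroo", the event sink's inclusion filter would be:

"flubarooEventLog"
resource.type="app_script_function"
logName="projects/<project-id-your-project-id-here>/logs/script.googleapis.com%2Fconsole_logs"
severity=DEBUG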

Here is an example of Flubaroo's Active Usage log routing sink:

Step 4: Create and Schedule BigQuery Queries

Next we'll want to create two BigQuery queries and schedule them to run daily. To get started, visit the [BigQuery Studio](https://console.cloud.google.com/bigquery) page in the GCP console, and make sure your project is selected in the project selector at the top.

Active Usage BigQuery Query

Click the "Compose a New Query" button to start a blank query. Copy the contents of dauQuery.txt file in this repository, and paste the contents into the query. Save the query with a name like "DAU Query", but don't run the query yet.

At the top of the query are instructions to replace the Project ID and dataset name. Review them, then use the query editor's search-and-replace tool (Control+F on PC or Chrome OS, Command+F on Mac) to replace the occurrences of <your-gcp-project-id> and <yourDatSetName>.

Save the query again, then run it. If it fails to execute, be sure you copied and pasted it correctly, and that you correctly entered the name of your GCP project (it should be the same one your Add-on is associated with) and the name of your dataset (the same dataset you created when setting up log routing).

Once it executes successfully, you should see a new BigQuery dataset and table, similar to the ones shown below:

Next, search for the line that starts with "CREATE OR REPLACE TABLE". Comment out this line by adding two dashes (--) in front of it. On the line immediately following, remove the two dashes that precede the word "INSERT". Once done, save your change. Now when the query runs in the future, it will insert new data into the table you just created, rather than creating a new table each time it runs.

Finally, you'll need to schedule this query to run each day. To do so, click the "Schedule" button at the top of the query editor. In the "New scheduled query" pane that opens, enter the following:

  • Give your query a name, like "Scheduled DAU Query".
  • For the "Schedule options" section, select a "Repeat frequency" value of "Days".
  • For the "Schedule options" section, select a "At" time of your choosing. Note that because this determines the approximate time your query will run, it also determines the ending time window used when evaluating logs for that day's table entries. For example, if you select a start time of 12:00 UTC, then your daily active users (1DAU) for that day will be those with log entry timestamps from 12:00:00 UTC on the day the script runs, back to 12:00:00 UTC the previous day.
  • Ensure the script is configured to "Start now" and "End never".
  • Leave all other fields with their default value. In particular, do not choose to "Set a destination table for query results", as this is already taken care of in the query.

Once configured, click "Save". You can see your scheduled query in the Scheduled Queries section of BigQuery. Note that the schedule stores a copy of your query as of the time the schedule is created. So if you make changes to your query in the future, you will need to delete the old scheduled query and then create a new schedule.

Event BigQuery Query

As in the prior section, click the "Compose a New Query" button to start a blank query. This time copy the contents of the eventsQuery.txt file in this repository, and paste them into the query editor. Save the query with a name like "Events Query", but don't run it yet.

At the top of the query are instructions to replace the Project ID and dataset name. Review them, then use the query editor's search-and-replace tool (Control+F on PC or Chrome OS, Command+F on Mac) to replace the occurrences of <your-gcp-project-id> and <yourDatSetName>.

Save the query again, then run it. If it fails to execute, be sure you copied and pasted it correctly, and that you correctly entered the name of your GCP project (it should be the same one your Add-on is associated with) and the name of your dataset (the same dataset you created when setting up log routing).

Once it executes successfully, you should see a new BigQuery dataset and table, similar to the ones shown in the image in the prior section.

Next, search for the line that starts with "CREATE OR REPLACE TABLE". Comment out this line by adding two dashes (--) in front of it. On the line immediately following, remove the two dashes that precede the word "INSERT". Once done, save your change. Now when the query runs in the future, it will insert new data into the table you just created, rather than creating a new table each time it runs.

Finally, you'll need to schedule this query to run each day. To do so, click the "Schedule" button at the top of the query editor. In the "New scheduled query" pane that opens, enter the following:

  • Give your query a name, like "Scheduled Events Query".
  • For the "Schedule options" section, select a "Repeat frequency" value of "Days".
  • For the "Schedule options" section, select a "At" time of your choosing. Note that because this determines the approximate time your query will run, it also determines the ending time window used when evaluating logs for that day's table entries. For example, if you select a start time of 12:00 UTC, then your events included for that day will be those with log entry timestamps from 12:00:00 UTC on the day the script runs, back to 12:00:00 UTC the previous day.
  • Ensure the script is configured to "Start now" and "End never".
  • Leave all other fields with their default value. In particular, do not choose to "Set a destination table for query results", as this is already taken care of in the query.

Once configured, click "Save". You can see your scheduled query in the Scheduled Queries section of BigQuery. Note that the schedule stores a copy of your query as of the time the schedule is created. So if you make changes to your query in the future, you will need to edit the copy contained in the scheduled query entry, or just delete the old scheduled query and create a new schedule from the original query.

Step 5: Creating Dashboards in Looker Studio

Looker Studio (formerly Data Studio) is a free tool from Google for visualizing data and sharing reports or dashboards. It can connect to a variety of data sources, including BigQuery tables. This section discusses how to set up a dashboard in Looker Studio to visualize data for your Add-on.

Setup Data Sources and Report

In Looker Studio, choose "Create > Data source". Select "BigQuery" as the type of data source. You will be asked to authorize access and to select the GCP project associated with your BigQuery tables. After selecting it, you should see the BigQuery datasets you created in Step 4. Select the one associated with active usage, and then click on the name of the table in that dataset (dailyActiveUsers). Your selection should look similar to this:

Give the data source the name "DailyActiveUsers Table" in the top-left. Click the "Connect" button in the top-right once ready to add this data source to your Looker Studio instance.

Repeat this same process a second time for the event data, naming the data source "EventTallies Table".

Finally, from the Looker Studio home page click "Create > Report". You will be prompted to add a data source to the report. Click the "My data sources" tab and select the "DailyActiveUsers Table" you just added as a data source. Once you add it, you will be dropped into the Looker Studio report editor. Next, add the "EventTallies Table" as a data source for this report too, by clicking the "Add data" button in the editor's toolbar and selecting it from the list in "My data sources".

Active Usage Dashboard Page

We can add a number of different charts to visualize active usage. Below are a few examples.

Tip: If you run out of room on the page for your charts, you can increase the size of the page from "Page > Current page settings > Style".

Time Series of DAU30 (30 Day Active Users)

Click the "Add a chart" button in the LookerStudio toolbar and select "Time series", as shown below:

Place the time series chart object somewhere on the page, and use these options for the Chart "Setup":

  • Data Source: DailyActiveUsers Table
  • Date Range Dimension: date
  • Dimension: date
  • Metric: SUM dau30

The image below shows the settings:

Time Series of DAU1 (Single Day Daily Active Users)

Duplicate the "Time Series of 30DAU" chart you just created. In the duplicate chart, change "Metric > dau30" to "Metric > dau1".

Time Series of DAU30 - By Country

Duplicate the "Time Series of 30DAU" chart you created earlier. In the duplicate chart, for "Drill Down > Breakdown Dimension" add the "country" field. This will give you a chart of 30DAU usage with a seperate series per country.

Map of Usage by Country

Click the "Add a chart" button in the LookerStudio toolbar and select "Geo Chart".

Place the geo chart object somewhere on the page, and use these options for the Chart "Setup":

  • Data Source: DailyActiveUsers Table
  • Date Range Dimension: date
  • Geo Dimension: country
  • Metric: SUM dau30
  • Default date range: Custom > Today

Table of Usage by Country

Click the "Add a chart" button in the LookerStudio toolbar and select "Table with bars".

Place the table chart object somewhere on the page, and use these options for the Chart "Setup":

  • Data Source: DailyActiveUsers Table
  • Date Range Dimension: date
  • Dimension: country
  • Metric: SUM dau30
  • Rows per page: 100
  • Sort: SUM dau30 Descending
  • Default date range: Custom > Today

Customizing Your Dashboard

You can add text boxes, chart headers, filters, and various other styling to customize your dashboard to your liking. The image below shows the final product for the usage page of Flubaroo's dashboard:

Event Dashboard Page

Next we'll add a second page to this dashboard that is specific to events. Click "Add Page" from the toolbar, and insert a new chart of type "Table". Configure the chart using the following options:

  • Data Source: EventTallies Table
  • Date Range Dimension: date
  • Dimensions:
    • eventName
    • eventSpecificDetailsString
  • Metric:
    • eventCount
    • userCount
  • Rows per page: 100
  • Sort: SUM eventCount Descending

The image below shows these options configured:

To allow for filtering of the table by date, eventName, etc., add one or more "controls" using "Add a control > Drop-down list", placing the control above the table of events. Use the "Control field" setting to adjust which field the drop-down filters on. See the image below for an example, where the 'country' control drop-down is being configured:

Specific Event Drilldown Dashboard Page

At this point you have a very useful dashboard that highlights usage and lets you monitor specific events. Regarding the events, it may be helpful to create further pages in your dashboard that drill into the specifics of particular events. For example, I may want a page just for events related to assignments that have been graded, where I can drill down on the different event-specific details related to that event. To do this, you will need to go back to BigQuery and create a scheduled query that queries against the eventTallies table, after which you can visualize that data in your dashboard.

The query below shows an example of how this could be done:

-- Flatten the event-specific details of the "assignment graded" event
-- into their own table, ready for charting in Looker Studio.
CREATE OR REPLACE TABLE `<project-id>.flubarooEventData.gradingData` AS
SELECT date,
       country,
       -- Extract individual fields from the JSON details logged with the event.
       JSON_VALUE(eventSpecificDetailsJson.isSampleAssignment) as isSampleAssignment,
       JSON_VALUE(eventSpecificDetailsJson.gradedWithAutoGrade) as gradedWithAutoGrade,
       JSON_VALUE(eventSpecificDetailsJson.usedCustomFormulas) as usedCustomFormulas,
       event_count as gradingOccurrences,
       user_count as userCount
FROM `<project-id>.flubarooEventData.eventTallies`
WHERE event_name = "FLB_EVENT_ASSIGNMENT_GRADED";

Step 6: Share Your Dashboard

Once your dashboard has been completed and tested, you can share it with others. Click the blue "Share" button in the top-right of the Looker Studio report editor and add one or more individuals or groups with "Viewer" access.
