---
title: Catchpoint data pipeline to Grafana
description: Learn to ingest data from Catchpoint into Google Cloud for visualization and analysis with Grafana.
author: drit
tags: telemetry, probes, monitors
date_published: 2021-04-20
---

Dritan Suljoti | Chief Product and Technology Officer | Catchpoint Systems, Inc.

Contributed by the Google Cloud community. Not official Google documentation.

Catchpoint’s digital experience monitoring platform provides an extensive fleet of network telemetry probes, as well as tools for capturing real user experience metrics, which give instant insight into the performance of networks, apps, and digital services. You can use Cloud Monitoring, in conjunction with the open-source analytics and interactive visualization web application Grafana, for all of your network performance monitoring and analysis.

This tutorial and its companion tutorial provide two methods of ingesting and visualizing data from Catchpoint within Google Cloud:

  • This tutorial shows you how to deliver data to Grafana for visualization and analysis.
  • The companion tutorial shows you how to set up a pipeline that ingests data captured by Catchpoint into Cloud Monitoring and use Metrics Explorer for visualization and analysis.

This tutorial uses the Cloud Console and gcloud commands on the Cloud SDK command line, as well as a Go webhook application and a Java Dataflow pipeline.

The fully configured data pipeline from Catchpoint to Grafana is illustrated in the following diagram:

[Diagram: Catchpoint-to-Grafana integration pipeline]

  1. Catchpoint posts data to an HTTP webhook set up in App Engine.
  2. App Engine publishes the data to a Pub/Sub topic.
  3. A Dataflow job subscribed to the Pub/Sub topic transforms the incoming messages.
  4. The Dataflow job inserts the data into BigQuery tables that match the Catchpoint schema.
  5. Grafana, using its BigQuery plugin as a data source, queries BigQuery to visualize the data.

Objectives

  1. Create a Pub/Sub topic.
  2. Build a webhook in Google Cloud.
  3. Configure Catchpoint.
  4. Build your pipeline.
  5. Configure Grafana.

Costs

This tutorial uses billable components of Google Cloud, including the following:

  • App Engine
  • Pub/Sub
  • Dataflow
  • BigQuery

Use the pricing calculator to generate a cost estimate based on your projected usage.

Initial setup

  1. Create a new Google Cloud project or select an existing project.

    For information about creating and selecting projects, see Creating and managing projects.

    You need the Google Cloud project ID when you configure the Catchpoint script.

  2. Create two BigQuery tables with the following schemas (a sample bq command for creating them follows this list):

    • Main table:

      {
        "fields": [
          {
            "mode": "NULLABLE",
            "name": "TestName",
            "type": "STRING"
          },
          {
            "mode": "NULLABLE",
            "name": "TestURL",
            "type": "STRING"
          },
          {
            "mode": "NULLABLE",
            "name": "TimeStamp",
            "type": "TIMESTAMP"
          },
          {
            "mode": "NULLABLE",
            "name": "NodeName",
            "type": "STRING"
          },
          {
            "mode": "NULLABLE",
            "name": "DNSTime",
            "type": "NUMERIC"
          },
          {
            "mode": "NULLABLE",
            "name": "Connect",
            "type": "NUMERIC"
          },
          {
            "mode": "NULLABLE",
            "name": "SSL",
            "type": "NUMERIC"
          },
          {
            "mode": "NULLABLE",
            "name": "SendTime",
            "type": "NUMERIC"
          },
          {
            "mode": "NULLABLE",
            "name": "WaitTime",
            "type": "NUMERIC"
          },
          {
            "mode": "NULLABLE",
            "name": "Total",
            "type": "NUMERIC"
          }
        ]
      }
      
    • Dead letter table:

      {
        "fields": [
          {
            "mode": "NULLABLE",
            "name": "inputData",
            "type": "STRING"
          },
          {
            "mode": "NULLABLE",
            "name": "errorMessage",
            "type": "STRING"
          }
        ]
      }
      
  3. Deploy Grafana and the BigQuery Plugin for Grafana.

    For details about deploying Grafana and adding the BigQuery plugin for Grafana, see the Grafana deployment guide and BigQuery Grafana plugin guide.
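If you prefer the command line, you can create the dataset and tables with the bq tool. The dataset and table names below (catchpoint_data, catchpoint_metrics, and catchpoint_dead_letter) are placeholders chosen for this tutorial; save the two schemas above as local JSON files first:

    bq mk --dataset catchpoint_data
    bq mk --table catchpoint_data.catchpoint_metrics main_schema.json
    bq mk --table catchpoint_data.catchpoint_dead_letter dead_letter_schema.json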

Create a Pub/Sub topic

This section covers creating a Pub/Sub topic and subscription with the Cloud Console. You can also configure Pub/Sub using the gcloud command-line tool or the API. For more information about these methods, see the gcloud documentation.

  1. Go to the Pub/Sub topics page in the Cloud Console.
  2. Click Create a Topic.
  3. Enter a unique topic name in the Topic ID field. This example uses catchpoint-topic.
  4. Click Save.
  5. Display the menu for the topic you just created, and click New Subscription.
  6. Enter a name for the subscription. This example uses catchpoint-bq-dataset.
  7. Leave the delivery type set to Pull.
  8. Click Create.

For more information about configuring Pub/Sub in the Cloud Console, see Quickstart: Using the Cloud Console.
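The equivalent gcloud commands, using the names from this tutorial, are as follows:

    gcloud pubsub topics create catchpoint-topic
    gcloud pubsub subscriptions create catchpoint-bq-dataset \
        --topic=catchpoint-topic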

Build a webhook in Google Cloud

A webhook (web application) provides a URL where vendors can post data to your application. The app listens on the defined URL and pushes posted data to the Pub/Sub topic created in the previous step.

  1. Download the Go script from the Cloud Storage bucket for this tutorial.

  2. Edit the script and replace the DefaultCloudProjectName value with your project ID.

  3. If you chose a Pub/Sub topic name other than catchpoint-topic, change the CatchpointTopicProd value to your chosen topic name.

    You may keep /cppush as the CatchpointPushURL value or use another value of your choosing. After deploying the script, be sure to capture the entire webhook URL, which you need when configuring Catchpoint.

  4. Deploy the script on App Engine by following the App Engine deployment instructions.
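For orientation, here is a minimal sketch in Go of what the webhook does: it listens on the push URL and publishes each POST body to the Pub/Sub topic. This is an illustration using the constant names described above, not the tutorial's actual script:

    // webhook.go: minimal sketch of a Catchpoint webhook that relays
    // posted payloads to Pub/Sub. Not the tutorial's actual script.
    package main

    import (
        "context"
        "io"
        "log"
        "net/http"
        "os"

        "cloud.google.com/go/pubsub"
    )

    const (
        DefaultCloudProjectName = "my-gcp-project"   // replace with your project ID
        CatchpointTopicProd     = "catchpoint-topic" // topic from the previous section
        CatchpointPushURL       = "/cppush"          // path that Catchpoint posts to
    )

    func main() {
        ctx := context.Background()
        client, err := pubsub.NewClient(ctx, DefaultCloudProjectName)
        if err != nil {
            log.Fatalf("pubsub.NewClient: %v", err)
        }
        topic := client.Topic(CatchpointTopicProd)

        http.HandleFunc(CatchpointPushURL, func(w http.ResponseWriter, r *http.Request) {
            body, err := io.ReadAll(r.Body)
            if err != nil {
                http.Error(w, "could not read request body", http.StatusBadRequest)
                return
            }
            // Publish the raw Catchpoint payload and wait for the server's ack.
            if _, err := topic.Publish(ctx, &pubsub.Message{Data: body}).Get(ctx); err != nil {
                http.Error(w, "publish failed", http.StatusInternalServerError)
                return
            }
            w.WriteHeader(http.StatusOK)
        })

        port := os.Getenv("PORT") // App Engine sets PORT
        if port == "" {
            port = "8080"
        }
        log.Fatal(http.ListenAndServe(":"+port, nil))
    }

After deployment with gcloud app deploy, the full webhook URL is typically https://[PROJECT_ID].appspot.com/cppush when you keep the default push path.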

Configure Catchpoint

  1. Go to Catchpoint API Detail.

  2. Select Add URL under Test Data Webhook.

  3. Enter the entire URL for your webhook in the URL field.

  4. Under Format, choose JSON to have Catchpoint send its default data payload in JSON format, or choose Template to customize the data payload.

    If you choose Template, then do the following:

    1. Click Select Template.
    2. Click Add New.
    3. Enter a name for this template and select JSON as the format.
    4. Enter valid JSON specifying the format of the payload that will be posted to the webhook. Each value in the template is set using a macro, which will be replaced with actual data at run time. See Test Data Webhook Macros for all available options.

    Here is a sample JSON template containing recommended macros:

        {
          "TestName": "${TestName}",
          "TestURL": "${testurl}",
          "TimeStamp": "${timestamp}",
          "NodeName": "${nodeName}",
          "PacketLoss": "${pingpacketlosspct}",
          "RTTAvg": "${pingroundtriptimeavg}",
          "DNSTime": "${timingdns}",
          "Connect": "${timingconnect}",
          "SSL": "${timingssl}",
          "SendTime": "${timingsend}",
          "WaitTime": "${timingwait}",
          "Total": "${timingtotal}"
        }
    
  5. Click Save at the bottom of the page.

For more information about configuring Catchpoint, see the Catchpoint webhook document.
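With the sample template above, the payload that Catchpoint posts to your webhook would look something like the following. All values here are illustrative, not real measurements:

    {
      "TestName": "Homepage Test",
      "TestURL": "https://www.example.com/",
      "TimeStamp": "2021-04-20 12:00:00",
      "NodeName": "New York",
      "PacketLoss": "0",
      "RTTAvg": "25",
      "DNSTime": "18",
      "Connect": "12",
      "SSL": "30",
      "SendTime": "1",
      "WaitTime": "85",
      "Total": "146"
    }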

Build your pipeline

  1. Clone the data loader repository.

  2. Change the metric.java file to match Catchpoint’s test data schema. You can download a ready-made metric.java file from the Cloud Storage bucket for this tutorial.

  3. Switch to Java 8 in Cloud Shell:

    sudo update-java-alternatives \
      -s java-1.8.0-openjdk-amd64 && \
      export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64/jre
    
  4. Build the uber JAR file (the JAR file with dependencies) by running the following command in the project root directory:

    ./gradlew clean && ./gradlew shadowJar
    
  5. Replace the placeholders in the following command with values for your environment, and then run the command in the project root directory:

    cd build/libs && java -jar perf-data-loader-1.0.jar \
      --dataSet=[TARGET_DATASET] \
      --table=[TARGET_TABLE] \
      --deadLetterDataSet=[DEAD_LETTER_DATASET] \
      --deadLetterTable=[DEAD_LETTER_TABLE] \
      --runner=DataflowRunner \
      --project=[GOOGLE_CLOUD_PROJECT_NAME] \
      --subscription=projects/[GOOGLE_CLOUD_PROJECT_NAME]/subscriptions/[PUBSUB_SUBSCRIPTION] \
      --jobName=[PIPELINE_JOB_NAME]
    

    If you need to update or change the pipeline, run the command with updated values and include --update as an additional argument.
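For example, with the placeholder names used in this tutorial (dataset catchpoint_data, tables catchpoint_metrics and catchpoint_dead_letter, subscription catchpoint-bq-dataset, and a hypothetical project ID my-gcp-project), the command would look like this:

    cd build/libs && java -jar perf-data-loader-1.0.jar \
      --dataSet=catchpoint_data \
      --table=catchpoint_metrics \
      --deadLetterDataSet=catchpoint_data \
      --deadLetterTable=catchpoint_dead_letter \
      --runner=DataflowRunner \
      --project=my-gcp-project \
      --subscription=projects/my-gcp-project/subscriptions/catchpoint-bq-dataset \
      --jobName=catchpoint-data-pipeline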

If the job deployed successfully, then you should see it listed in the Jobs view:

[Screenshot: the deployed pipeline job listed in the Dataflow Jobs view]

At this point, your data pipeline configuration is complete. Data posted to the webhook by Catchpoint should be propagating to your BigQuery tables and available for visualization in Grafana. For details, see the Grafana documentation.
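As a starting point for a dashboard, a Grafana panel backed by the BigQuery data source could chart total response time per node with a query along the following lines. The table name is the placeholder used earlier in this tutorial, and $__timeFilter is a time-range macro provided by the BigQuery Grafana plugin:

    SELECT
      TimeStamp AS time,
      NodeName,
      Total
    FROM
      `my-gcp-project.catchpoint_data.catchpoint_metrics`
    WHERE
      $__timeFilter(TimeStamp)
    ORDER BY
      time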

Cleaning up

To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, you can delete the project:

  1. In the Cloud Console, go to the Projects page.
  2. In the project list, select the project that you want to delete and click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.