GoodData CL Commands

GoodData CL supports the following groups of commands:

  • Project Management Commands: create/delete/open/remember a project (CreateProject, DeleteProject, OpenProject, RememberProject, UseProject)

  • Connector Commands either generate the XML configuration for a specific data source (Generate<Source-Type>Config) or load the source data (Use<Source-Type>). Connector commands require a project to be activated via a project management command before they are invoked.

  • Metadata Management Commands: work with project metadata (reports, dashboards, metrics, folders)
    RetrieveMetadataObject, StoreMetadataObject, DropMetadataObject, RetrieveAllObjects, CopyObjects

  • Logical Model Management Commands generate & execute the MAQL DDL script for a connector that has been previously initialized via a Use<Source-Type> command.
    GenerateMaql, GenerateUpdateMaql, ExecuteMaql

  • Data Transfer Commands transform, package, and transfer the data from a previously initialized (Use<Source-Type>) connector.
    TransferAllSnapshots, TransferSnapshots, TransferLastSnapshot, ListSnapshots, DropSnapshots

Project Initialization Workflow

Usually you initialize your project with the following commands:

  1. Create a new project (CreateProject) or open an existing one (OpenProject).
  2. Optionally, generate the XML configuration for your data source using a Generate<Source-Type>Config command. The resulting XML config file describes your data structure and how the GoodData Logical Data Model will be generated. You may want to review the config file and make changes. You'll probably want to comment out the Generate<Source-Type>Config command after the first run.
  3. Initialize your data source Connector using a Use<Source-Type> command. The Use<Source-Type> command requires the XML config file and specific parameters that define the source data or query (e.g. a SQL query).
  4. Generate and execute the MAQL DDL for your data source using the GenerateMaql and ExecuteMaql commands. The MAQL DDL creates your project's Logical Data Model (LDM) and Data Loading Interface (DLI). The DLI is later used by the Data Transfer commands. You need to generate the LDM and DLI only once per project, which is why scripts that transfer data on a regular basis don't use the GenerateMaql and ExecuteMaql commands.
  5. Transfer the data using the TransferAllSnapshots or TransferLastSnapshot command; these transform, package, and transfer the data (see the example script after this list).
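
For illustration, a minimal initialization script along these lines might look as follows, using the CSV connector. The file names, project name, and the # annotation lines are assumptions for this sketch, not prescribed by the tool:

    # 1. create the project and save its identifier for later runs
    CreateProject(name="Quotes", desc="Stock quotes demo");
    RememberProject(fileName="quotes.pid");
    # 2. generate the XML config (comment this out after the first run)
    GenerateCsvConfig(csvHeaderFile="data/quotes.csv", configFile="quotes.config.xml");
    # 3. initialize the CSV connector
    UseCsv(csvDataFile="data/quotes.csv", configFile="quotes.config.xml", header="true");
    # 4. generate and execute the MAQL DDL (needed only once per project)
    GenerateMaql(maqlFile="quotes.maql");
    ExecuteMaql(maqlFile="quotes.maql");
    # 5. transform, package, and transfer the data
    TransferLastSnapshot(incremental="false", waitForFinish="true");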

Project Data Loading Workflow

Ongoing data loading scripts usually perform the following steps:

  1. Open the project using the OpenProject or UseProject command.
  2. Initialize your data source Connector using a Use<Source-Type> command. The Use<Source-Type> command requires the XML config file and specific parameters that define the source data or query (e.g. a SQL query).
  3. Transfer the data using the TransferAllSnapshots or TransferLastSnapshot command (see the example script after this list).
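
A corresponding recurring load script might look like this; again, the file names are hypothetical:

    # open the project saved by RememberProject during initialization
    UseProject(fileName="quotes.pid");
    # re-initialize the connector (the XML config already exists)
    UseCsv(csvDataFile="data/quotes.csv", configFile="quotes.config.xml", header="true");
    # append only the new data
    TransferLastSnapshot(incremental="true", waitForFinish="true");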

Commands Reference

The following paragraphs describe the specific GoodData CL commands.

Project Management Commands

  • CreateProject(name=<project-name>, desc=<description>, templateUri=<templateUri>) - create a new project on the <hostname> server

    • project-name - name of the new project
    • description - (optional) project description
    • templateUri - (optional) project template to create the project from
  • DeleteProject(id=<project-id>) - drop the project on the <hostname> server

    • project-id - (optional) project id; if not specified, the command tries to drop the current project
  • OpenProject(id=<identifier>) - open an existing project for data modeling and data upload. Equivalent to providing the project identifier using the "-e" command line option.

    • identifier - id of an existing project (takes the form of an MD5 hash)
  • RememberProject(fileName=<file>) - saves the current project identifier into the specified file

    • fileName - file to save the project identifier
  • UseProject(fileName=<file>) - loads the current project identifier from the specified file

    • fileName - file to load the project identifier from
  • InviteUser(email=<email>, msg=<msg>[, role=<admin|editor|dashboard only>]) - invites a new user to the project

    • email - the invited user's e-mail
    • msg - optional invitation message
    • role - (optional) initial role of the invited user
  • Lock(path=<file>) - prevents concurrent runs of multiple script instances sharing the same lock file. Lock files older than 1 hour are discarded. (See the example script after this list.)
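
A short sketch of these commands together; the e-mail address and file names are hypothetical:

    # fail early if another instance of this script is already running
    Lock(path="sales.lock");
    # create the project and save its identifier
    CreateProject(name="Sales", desc="Sales pipeline project");
    RememberProject(fileName="sales.pid");
    # invite a collaborator with the editor role
    InviteUser(email="analyst@example.com", msg="Join the Sales project", role="editor");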

Metadata Management Commands

  • RetrieveMetadataObject(id=<object-id>, file=<file-to-store-the-object>) - retrieves a metadata object and stores it in a file, must call CreateProject or OpenProject before

    • object-id - valid object id (integer number)
    • file - file where the object content (JSON) is going to be stored
  • StoreMetadataObject([id=<object-id>,] file=<file-with-the-object-content>) - stores a metadata object with a content (JSON) in file to the metadata server, must call CreateProject or OpenProject before

    • object-id - valid object id (integer number); if the id is specified, the object is modified, otherwise a new object is created
    • file - file where the object content (JSON) is stored
  • DropMetadataObject(id=<object-id>) - drops the object with specified id from the project's metadata, must call CreateProject or OpenProject before

    • object-id - valid object id (integer number)

  • RetrieveAllObjects(dir=<directory>) - download all metadata objects (reports, dashboards, metrics, folders) and store them locally in the specified directory

    • directory - an existing directory where the CL tool will store the object files

  • CopyObjects(dir=<directory>, overwrite=<true | false>) - load a whole directory of object files (created by RetrieveAllObjects) into an existing project (see the example after this list)

    • directory - a directory with object files
    • overwrite - overwrite existing objects (if conflicts are discovered)
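
A sketch of a metadata backup-and-restore flow built from these commands; the directory name is hypothetical:

    # dump all reports, dashboards, metrics and folders from the current project
    RetrieveAllObjects(dir="backup/objects");
    # open the target project and load the saved objects into it
    OpenProject(id="<target-project-id>");
    CopyObjects(dir="backup/objects", overwrite="true");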

Logical Model Management Commands

  • GenerateMaql(maqlFile=<maql>) - generate a MAQL DDL script describing the data model from the local config file, must call CreateProject or OpenProject and a Use<Source-Type> command before

    • maqlFile - path to MAQL file (will be overwritten)
  • GenerateUpdateMaql(maqlFile=<maql>) - generate a MAQL DDL alter script that creates the columns available in the local configuration but missing in the remote GoodData project, must call CreateProject or OpenProject and a Use<Source-Type> command before

    • maqlFile - path to MAQL file (will be overwritten)
  • ExecuteMaql(maqlFile=<maql> [, ifExists=<true | false>]) - run the MAQL DDL script on the server to generate the data model, must call CreateProject or OpenProject and a Use<Source-Type> command before (see the example after this list)

    • maqlFile - path to the MAQL file (relative to PWD)
    • ifExists - if set to true the command quits silently if the maqlFile does not exist, default is false
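
A sketch of an incremental model update using these commands, assuming a connector was initialized earlier in the script; the file name is hypothetical:

    # generate only the MAQL for columns present locally but missing on the server
    GenerateUpdateMaql(maqlFile="update.maql");
    # execute it; ifExists makes the command quit silently if update.maql is missing
    ExecuteMaql(maqlFile="update.maql", ifExists="true");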

Data Transfer Commands

  • TransferAllSnapshots([incremental=<true | false>] [, waitForFinish=<true | false>]) - upload data (all snapshots) to the server, must call CreateProject or OpenProject and a Use<Source-Type> command before. Not allowed for a dataset defining a connection point unless only one snapshot is present.

    • incremental - incremental transfer (true | false), default is false
    • waitForFinish - waits for the server-side processing (true | false), default is true
  • TransferSnapshots(firstSnapshot=snapshot-id, lastSnapshot=snapshot-id [, incremental=<true | false>] [, waitForFinish=<true | false>]) - uploads all snapshots between the firstSnapshot and the lastSnapshot (inclusive). Only one snapshot is allowed for a dataset defining a connection point.

    • firstSnapshot - the first transferred snapshot id
    • lastSnapshot - the last transferred snapshot id
    • incremental - incremental transfer (true | false), default is false
    • waitForFinish - waits for the server-side processing (true | false), default is true
  • TransferLastSnapshot([incremental=<true | false>] [, waitForFinish=<true | false>]) - uploads only the last snapshot (see the example after this list)

    • incremental - incremental transfer (true | false), default is false
    • waitForFinish - waits for the server-side processing (true | false), default is true
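
For illustration, a possible combination of these commands; the snapshot ids are hypothetical:

    # initial full load of a range of snapshots
    TransferSnapshots(firstSnapshot="0", lastSnapshot="5", incremental="false", waitForFinish="true");
    # later runs append only the newest snapshot
    TransferLastSnapshot(incremental="true", waitForFinish="true");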

Connector Commands

CSV Connector Commands

  • GenerateCsvConfig(csvHeaderFile=<file>, configFile=<config> [, defaultLdmType=<mode>] [, folder=<folder>] [, separator=<separator-char>]) - generate a sample XML config file based on the fields from your CSV file. If the config file already exists, only new columns are added. The config file must be edited afterwards, as the LDM types (attribute | fact | label etc.) are assigned randomly.

    • csvHeaderFile - path to CSV file (only the first header row will be used)
    • configFile - path to configuration file (will be overwritten)
    • defaultLdmType - LDM mode to be associated with new columns (only ATTRIBUTE mode is supported by the ProcessNewColumns task at this time)
    • folder - (optional) folder in which to place the new attributes
    • separator - optional field separator, the default is ','
  • UseCsv(csvDataFile=<data>, configFile=<config>, header=<true | false> [, separator=<separator-char>]) - load a CSV data file using the config file that describes its structure, must call CreateProject or OpenProject before (see the example after this list)

    • csvDataFile - path to CSV datafile
    • configFile - path to XML configuration file (see the GenerateCsvConfig command that generates the config file template)
    • header - true if the CSV file has a header in the first row, false otherwise
    • separator - optional field separator, the default is ','
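
A minimal CSV connector sketch; file names are hypothetical:

    # first run only: derive a config template from the CSV header,
    # then review and edit the generated LDM types by hand
    GenerateCsvConfig(csvHeaderFile="data/payroll.csv", configFile="payroll.config.xml", separator=",");
    # every run: load the data described by the config
    UseCsv(csvDataFile="data/payroll.csv", configFile="payroll.config.xml", header="true", separator=",");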

GoogleAnalytics Connector Commands

  • GenerateGoogleAnalyticsConfig(name=<name>, configFile=<config>, dimensions=<pipe-separated-ga-dimensions>, metrics=<pipe-separated-ga-metrics>) - generate an XML config file based on the fields from your GA query.

    • name - the new dataset name
    • configFile - path to configuration file (will be overwritten)
    • dimensions - pipe (|) separated list of Google Analytics dimensions (see GData Reference)
    • metrics - pipe (|) separated list of Google Analytics metrics (see GData Reference)
  • UseGoogleAnalytics(configFile=<config>, [token=<ga-token> | username=<ga-username>, password=<ga-password>], profileId=<ga-profile-id>, dimensions=<pipe-separated-ga-dimensions>, metrics=<pipe-separated-ga-metrics>, startDate=<date>, endDate=<date>, filters=<ga-filter-string>) - load Google Analytics data using the config file that describes its structure, must call CreateProject or OpenProject before (see the example after this list)

    • configFile - path to the XML configuration file (see the GenerateGoogleAnalyticsConfig command that generates it)
    • token - Google Analytics AuthSub token (you must specify either the token or username/password)
    • username - Google Analytics username (you must specify either the token or username/password)
    • password - Google Analytics password (you must specify either the token or username/password)
    • profileId - Google Analytics profile ID (this is a value of the id query parameter in the GA url)
    • dimensions - pipe (|) separated list of Google Analytics dimensions (see GData Reference)
    • metrics - pipe (|) separated list of Google Analytics metrics (see GData Reference)
    • startDate - the GA start date in the yyyy-mm-dd format
    • endDate - the GA end date in the yyyy-mm-dd format
    • filters - the GA filters (see GData Documentation)
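
A Google Analytics sketch; the account, profile id, dimension/metric lists, date range, and filter are hypothetical:

    GenerateGoogleAnalyticsConfig(name="ga_traffic", configFile="ga.config.xml", dimensions="ga:date|ga:browser", metrics="ga:visits|ga:pageviews");
    UseGoogleAnalytics(configFile="ga.config.xml", username="user@example.com", password="secret", profileId="ga:1234567", dimensions="ga:date|ga:browser", metrics="ga:visits|ga:pageviews", startDate="2010-01-01", endDate="2010-03-31", filters="ga:country==Czech Republic");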

JDBC Connector Commands

  • GenerateJdbcConfig(name=<name>, configFile=<config>, driver=<jdbc-driver>, url=<jdbc-url>, query=<sql-query> [, username=<jdbc-username>] [, password=<jdbc-password>]) - generate an XML config file based on the fields from your JDBC query.

    • name - the new dataset name
    • configFile - path to configuration file (will be overwritten)
    • driver - JDBC driver string (e.g. "org.apache.derby.jdbc.EmbeddedDriver"); you'll need to place the JAR with the JDBC driver into the lib subdirectory
    • url - JDBC url (e.g. "jdbc:derby:mydb")
    • query - SQL query (e.g. "SELECT employee,dept,salary FROM payroll")
    • username - JDBC username
    • password - JDBC password
  • UseJdbc(configFile=<config>, driver=<jdbc-driver>, url=<jdbc-url>, query=<sql-query> [, username=<jdbc-username>] [, password=<jdbc-password>]) - load data via a JDBC query using the config file that describes its structure, must call CreateProject or OpenProject before (see the example after this list)

    • configFile - path to the XML configuration file (see the GenerateJdbcConfig command that generates it)
    • driver - JDBC driver string (e.g. "org.apache.derby.jdbc.EmbeddedDriver"); you'll need to place the JAR with the JDBC driver into the lib subdirectory
    • url - JDBC url (e.g. "jdbc:derby:mydb")
    • query - SQL query (e.g. "SELECT employee,dept,salary FROM payroll")
    • username - JDBC username
    • password - JDBC password
  • ExportJdbcToCsv(dir=<dir>, driver=<jdbc-driver>, url=<jdbc-url> [, username=<jdbc-username>] [, password=<jdbc-password>]) - exports all tables from the database to CSV files

    • dir - target directory
    • driver - JDBC driver string (e.g. "org.apache.derby.jdbc.EmbeddedDriver"); you'll need to place the JAR with the JDBC driver into the lib subdirectory
    • url - JDBC url (e.g. "jdbc:derby:mydb")
    • username - JDBC username
    • password - JDBC password
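
A JDBC sketch reusing the example values above; the dataset name and config file are hypothetical:

    GenerateJdbcConfig(name="payroll", configFile="payroll.config.xml", driver="org.apache.derby.jdbc.EmbeddedDriver", url="jdbc:derby:mydb", query="SELECT employee,dept,salary FROM payroll");
    UseJdbc(configFile="payroll.config.xml", driver="org.apache.derby.jdbc.EmbeddedDriver", url="jdbc:derby:mydb", query="SELECT employee,dept,salary FROM payroll");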

Salesforce Connector Commands

  • GenerateSfdcConfig(name=<name>, configFile=<config>, query=<soql-query>, username=<sfdc-username>, password=<sfdc-password>, token=<sfdc-security-token>) - generate an XML config file based on the fields from your SFDC query.

    • name - the new dataset name
    • configFile - path to configuration file (will be overwritten)
    • query - SOQL query (e.g. "SELECT Id, Name FROM Account"), see Salesforce API
    • username - SFDC username
    • password - SFDC password
    • token - SFDC security token (you may append the security token to the password instead of using this parameter)
  • UseSfdc(configFile=<config>, query=<soql-query>, username=<sfdc-username>, password=<sfdc-password>, token=<sfdc-security-token>) - load Salesforce data using the config file that describes its structure, must call CreateProject or OpenProject before (see the example after this list)

    • configFile - path to the XML configuration file (see the GenerateSfdcConfig command that generates it)
    • query - SOQL query (e.g. "SELECT Id, Name FROM Account"), see Salesforce API
    • username - SFDC username
    • password - SFDC password
    • token - SFDC security token (you may append the security token to the password instead of using this parameter)
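
A Salesforce sketch reusing the documented example query; credentials and file names are hypothetical:

    GenerateSfdcConfig(name="accounts", configFile="account.config.xml", query="SELECT Id, Name FROM Account", username="user@example.com", password="secret", token="sfdc-security-token");
    UseSfdc(configFile="account.config.xml", query="SELECT Id, Name FROM Account", username="user@example.com", password="secret", token="sfdc-security-token");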

Time Dimension Connector Commands

  • UseDateDimension(name=<name>) - load a new time dimension into the project, must call CreateProject or OpenProject before (see the example after this list)

    • name - the time dimension name differentiates this time dimension from the others. This is typically something like "closed", "created" etc.
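
A time dimension sketch, assuming the dimension is materialized with GenerateMaql/ExecuteMaql like any other connector; the name and file are hypothetical:

    # add a date dimension for the deal closing date
    UseDateDimension(name="closed");
    GenerateMaql(maqlFile="closed_date.maql");
    ExecuteMaql(maqlFile="closed_date.maql");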