Andrey Gavrikov edited this page Sep 10, 2013 · 13 revisions

This page contains example configurations for various frequent scenarios. In most cases the connection part (username, password, proxy) is omitted.
For a more detailed description of each parameter see ./config/sample-configuration.properties

Scenario 1 - extracting specific fields

Extract all data (unconditionally) from the Account and Contact objects of a Production organisation

  • Account fields: Id, Name, Custom_Email__c
  • Contact fields: Id, FirstName, LastName
# output details
outputFolder=c:/extract
backup.objects=Account, Contact
backup.soql.Account=select Id, Name, Custom_Email__c from Account
backup.soql.Contact=select Id, FirstName, LastName from Contact

#connection details
sf.username = name@company.com
sf.password = passwordAndToken

# https://login.salesforce.com points to Production Org
sf.serverurl = https://login.salesforce.com
# for Sandbox use https://test.salesforce.com

Assuming the above config is saved in a file c:/myextract.properties you can execute it like so: java -jar backup-force.com-0.3.2.jar --config c:/myextract.properties

Scenario 2 - per-object conditions

Conditionally extract data from the Account and Contact objects of a Sandbox organisation

  • Account
    fields: Id, Name
    condition: NumberOfEmployees > 100 and Type = 'Prospect'
  • Contact
    fields: all fields
    condition: contacts that belong to the extracted accounts
# output details
outputFolder=c:/extract
backup.objects=Account, Contact
backup.soql.Account=select Id, Name from Account where NumberOfEmployees > 100 and Type = 'Prospect'
backup.soql.Contact=select * from Contact where Account.NumberOfEmployees > 100 and Account.Type = 'Prospect'

#connection details
sf.username = name@company.com
sf.password = passwordAndToken

# https://login.salesforce.com points to Production Org
# for Sandbox use https://test.salesforce.com
sf.serverurl = https://test.salesforce.com

Scenario 3 - all data from all objects

Extract all data (unconditionally) from all objects of a Production organisation

Note: at the time of writing (v0.3.2) backup-force.com cannot extract large Chatter/Content/Files (see Limitations), so in salesforce Orgs with large Chatter/Content/Files we have to add an extra condition to limit the size of ContentVersion records.

# output details
outputFolder=c:/extract
backup.objects=*

# workaround to avoid problems with extra large files
backup.soql.ContentVersion=select * from ContentVersion where ContentSize < 20000000

#connection details
sf.username = name@company.com
sf.password = passwordAndToken

# https://login.salesforce.com points to Production Org
sf.serverurl = https://login.salesforce.com
# for Sandbox use https://test.salesforce.com

Scenario 4 - incremental extract

Extract (incrementally) all fields from the Account and Contact objects and full (non-incremental) data from the Opportunity object.
Incremental extract means that every time the process runs it fetches only the records changed since the last run.

In this scenario we introduce two new parameters: lastRunOutputFile and backup.global.where

# output details
outputFolder=c:/extract
lastRunOutputFile=c:/lastRun.properties
backup.objects=Account, Contact, Opportunity
backup.soql.Opportunity=select * from Opportunity
backup.global.where=LastModifiedDate >= $Object.LastModifiedDate

#connection details
# NOTE: connection details omitted

Scenario 5 - incremental extract done properly

The incremental extract example in Scenario 4 works well for integration purposes but not for backup purposes, because the extracted files (Account.csv, Contact.csv, etc.) are overwritten every time the process runs. Ideally we want to keep previously extracted files. One potential solution is to create a new extract folder every time the process runs.

In this scenario we introduce shell expressions.
A shell expression is a valid command-line expression enclosed in back-ticks.

# output details
# for Windows use the date.exe utility from UnxUtils: http://sourceforge.net/projects/unxutils/files/
outputFolder=c:/extract/`date.exe +%Y%m%d-%H%M`
# for Linux/Unix/OSX use the built-in date command
#outputFolder=c:/extract/`date +%Y%m%d-%H%M`

lastRunOutputFile=c:/lastRun.properties
backup.objects=Account, Contact, Opportunity
backup.soql.Opportunity=select * from Opportunity
backup.global.where=LastModifiedDate >= $Object.LastModifiedDate

#connection details
# NOTE: connection details omitted
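
The back-tick expression is evaluated when the process starts, so every run gets its own time-stamped folder. As a quick illustration of what the expression expands to (a sketch assuming a Unix-like shell with the standard date command; on Windows substitute date.exe as above):

```shell
# Expand the same format string the config uses; yields e.g. 20130910-1425
stamp=$(date +%Y%m%d-%H%M)
echo "c:/extract/${stamp}"
```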

Scenario 6 - extraction details in one config, connection details in another

In many cases you will have several extract configurations set up against the same salesforce.com Org. To avoid duplicating connection details in each extract config you can take advantage of the fact that backup-force.com supports multiple --config parameters on the command line. If two or more configs contain the same parameter (e.g. outputFolder) then the config specified last wins.

Suppose we have 2 files:

  • AccountExtract.properties - contains outputFolder, backup.objects, etc
  • MyOrgConnection.properties - contains sf.username, sf.password, etc

We could run the extract process like so: java -jar backup-force.com-0.3.2.jar --config c:/AccountExtract.properties --config c:/MyOrgConnection.properties
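
The "last one wins" merge rule can be sketched in plain shell. This is not the tool's actual merge code, just an illustration using two hypothetical config fragments (and a naive key=value split that ignores values containing '='):

```shell
# Two hypothetical config fragments that both define outputFolder
printf 'outputFolder=c:/extract\nbackup.objects=Account\n' > /tmp/extract.properties
printf 'outputFolder=c:/override\nsf.username=name@company.com\n' > /tmp/connection.properties

# Merge in command-line order: a key seen in a later file overrides the
# earlier value, so outputFolder ends up as c:/override
awk -F= '{val[$1]=$2; if (!($1 in seen)) {seen[$1]=1; order[++n]=$1}}
         END {for (i=1; i<=n; i++) print order[i] "=" val[order[i]]}' \
    /tmp/extract.properties /tmp/connection.properties
```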

Scenario 7 - override config file parameters from command line

Suppose you have a config file describing a backup/extract process and you want to run a very similar extract, changing just a couple of parameters. You have two main options:

  • Create yet another config and add it via --config, e.g. ... --config c:/extract.properties --config c:/connection.properties
  • Specify parameter value directly on the command line

In some cases a quick command line option is easier. For example, if your original process was executed via command:
java -jar backup-force.com-0.3.2.jar --config c:/AccountExtract.properties

then to add or override something (e.g. outputFolder) inside the AccountExtract.properties config file you could do something like this:
java -jar backup-force.com-0.3.2.jar --config c:/AccountExtract.properties --outputFolder c:/extract2

Options defined on the command line will always override options defined in configuration files specified via --config.

Scenario 8 - export deleted records

You can easily include deleted records in your export by adding IsDeleted = true to the where clause. Deleted records are those currently in the salesforce Recycle Bin.

The example below shows a config that exports 3 objects:

  • Accounts (only existing ones)
  • Contacts (existing and deleted)
  • Opportunities (only names of deleted Opportunities)
backup.soql.Account=select * from Account
backup.soql.Contact=select * from Contact where IsDeleted = true or IsDeleted = false
backup.soql.Opportunity=select Name from Opportunity where IsDeleted = true

# output/connection details
# NOTE: output and connection details are omitted

Scenario 9 - export documents/attachments as files

When an export config includes objects like Attachment, Document or ContentVersion it is possible to save those documents as normal files in addition to the CSV data. The file name pattern is controlled by the backup.extract.file key.

The example below shows how to make sure that document-type data is also saved as files with the naming convention <file-name>-<sfdc-record-id>.<extension>, e.g. Screenshot-01560000000Jezo.jpg

outputFolder=c:/export/
backup.objects=Attachment, Document, ContentVersion
# if files need to be saved as real files then use backup.extract.file 
# available values: $id, $name, $ext
backup.extract.file=$name-$id.$ext
# connection details
# NOTE: connection details are omitted
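
The $name, $id and $ext placeholders are simple text substitutions performed by backup-force.com. As an illustration only, the sketch below mimics the substitution with sed for the example record mentioned above:

```shell
# Illustrate how the backup.extract.file pattern maps onto a record
name="Screenshot"; id="01560000000Jezo"; ext="jpg"
pattern='$name-$id.$ext'
printf '%s\n' "$pattern" | sed "s/\$name/${name}/; s/\$id/${id}/; s/\$ext/${ext}/"
# -> Screenshot-01560000000Jezo.jpg
```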

Scenario 10 - run user defined script upon process completion

backup-force.com supports several hooks which can be used to execute user-defined scripts. Such scripts may be useful for checking log files for errors, archiving exported data, starting further processing of the extracted data, and so on.

Hooks allow you to run a user-defined script at the following points in time:

  • before the whole process is started (useful when more than 1 Object Type is exported)
  • before each Object type extract is started
  • after each Object type extract is completed
  • after the end of the whole process (useful when more than 1 Object Type is exported)

Each script can be executed on its own or with user-defined parameters.
Regardless of whether user-defined parameters have been specified, each script is called with several predefined parameters; see ./config/sample-configuration.properties for more details.

The following example demonstrates a config with a user-defined global.after script which receives 2 user-defined parameters.

outputFolder=c:/export/
lastRunOutputFile=/full/path/to/lastRun.properties
backup.objects=Account, Contact
hook.global.after=/home/user/check_errors.sh
hook.global.after.params=log_file_name.log myextract
# connection details
# NOTE: connection details are omitted

The above config will execute check_errors.sh like so:

/home/user/check_errors.sh log_file_name.log myextract c:/export/ /full/path/to/lastRun.properties

If hook.global.after.params were not specified then the script would be called like so:

/home/user/check_errors.sh c:/export/ /full/path/to/lastRun.properties
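
The hook script itself is entirely user-defined; the sketch below is one hypothetical shape for check_errors.sh, written as a function so it is easy to try out. It assumes the argument order shown above: the two user-defined parameters first, then the predefined outputFolder and lastRunOutputFile values.

```shell
# Hypothetical sketch of a check_errors.sh hook (not part of the tool)
check_errors() {
    log_file="$1"       # user-defined param 1: log file to scan
    extract_name="$2"   # user-defined param 2: extract label
    output_folder="$3"  # predefined: outputFolder
                        # $4 (lastRunOutputFile) is unused in this sketch
    if [ -f "$log_file" ] && grep -qi 'error' "$log_file"; then
        echo "Errors found in ${extract_name} run (see ${log_file})"
        return 1
    fi
    echo "Extract '${extract_name}' finished cleanly into ${output_folder}"
}

# Example invocation mirroring the call shown above
printf 'extract finished, 0 failures\n' > /tmp/myextract.log
check_errors /tmp/myextract.log myextract c:/export/ /full/path/to/lastRun.properties
```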