Merged
13 changes: 4 additions & 9 deletions command-snapshot.json
Original file line number Diff line number Diff line change
@@ -37,10 +37,9 @@
"alias": [],
"command": "data:delete:bulk",
"flagAliases": ["apiversion", "csvfile", "sobjecttype", "targetusername", "u"],
"flagChars": ["a", "f", "o", "s", "w"],
"flagChars": ["f", "o", "s", "w"],
"flags": [
"api-version",
"async",
"file",
"flags-dir",
"hard-delete",
@@ -88,7 +87,6 @@
"flags": [
"all-rows",
"api-version",
"async",
"column-delimiter",
"flags-dir",
"json",
@@ -141,10 +139,9 @@
"alias": [],
"command": "data:import:bulk",
"flagAliases": [],
"flagChars": ["a", "f", "o", "s", "w"],
"flagChars": ["f", "o", "s", "w"],
"flags": [
"api-version",
"async",
"column-delimiter",
"file",
"flags-dir",
@@ -213,10 +210,9 @@
"alias": [],
"command": "data:update:bulk",
"flagAliases": [],
"flagChars": ["a", "f", "o", "s", "w"],
"flagChars": ["f", "o", "s", "w"],
"flags": [
"api-version",
"async",
"column-delimiter",
"file",
"flags-dir",
@@ -260,10 +256,9 @@
"alias": [],
"command": "data:upsert:bulk",
"flagAliases": ["apiversion", "csvfile", "externalid", "sobjecttype", "targetusername", "u"],
"flagChars": ["a", "f", "i", "o", "s", "w"],
"flagChars": ["f", "i", "o", "s", "w"],
"flags": [
"api-version",
"async",
"column-delimiter",
"external-id",
"file",
8 changes: 2 additions & 6 deletions messages/data.export.bulk.md
@@ -6,9 +6,9 @@ Bulk export records from an org into a file using a SOQL query. Uses Bulk API 2.

You can use this command to export millions of records from an org, either to migrate data or to back it up.

Use a SOQL query to specify the fields of a standard or custom object that you want to export. Specify the SOQL query either at the command line with the --query flag or read it from a file with the --query-file flag; you can't specify both flags. The --output-file flag is required, which means you can only write the records to a file, in either CSV or JSON format.

Bulk exports can take a while, depending on how many records are returned by the SOQL query. If the command times out, or you specified the --async flag, the command displays the job ID. To see the status and get the results of the job, run "sf data export resume" and pass the job ID to the --job-id flag.
Bulk exports can take a while, depending on how many records are returned by the SOQL query. If the command times out, the command displays the job ID. To see the status and get the results of the job, run "sf data export resume" and pass the job ID to the --job-id flag.

IMPORTANT: This command uses Bulk API 2.0, which limits the type of SOQL queries you can run. For example, you can't use aggregate functions such as count(). For the complete list of limitations, see the "SOQL Considerations" section in the "Bulk API 2.0 and Bulk API Developer Guide" (https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/queries.htm).

@@ -22,10 +22,6 @@ IMPORTANT: This command uses Bulk API 2.0, which limits the type of SOQL queries

<%= config.bin %> <%= command.id %> --query "SELECT Id, Name, Account.Name FROM Contact" --output-file export-accounts.json --result-format json --wait 10 --all-rows

- Export asynchronously; the command immediately returns a job ID that you then pass to the "sf data export resume" command:

<%= config.bin %> <%= command.id %> --query "SELECT Id, Name, Account.Name FROM Contact" --output-file export-accounts.json --result-format json --async

# flags.wait.summary

Time to wait for the command to finish, in minutes.
2 changes: 1 addition & 1 deletion messages/data.export.resume.md
@@ -4,7 +4,7 @@ Resume a bulk export job that you previously started. Uses Bulk API 2.0.

# description

When the original "data export bulk" command either times out or is run with the --async flag, it displays a job ID. To see the status and get the results of the bulk export, run this command by either passing it the job ID or using the --use-most-recent flag to specify the most recent bulk export job.
When the original "data export bulk" command times out, it displays a job ID. To see the status and get the results of the bulk export, run this command by either passing it the job ID or using the --use-most-recent flag to specify the most recent bulk export job.

Using either `--job-id` or `--use-most-recent` will properly resolve to the correct org where the bulk job was started based on the cached data by "data export bulk".

6 changes: 1 addition & 5 deletions messages/data.import.bulk.md
@@ -8,7 +8,7 @@ You can use this command to import millions of records into the object from a fi

All the records in the CSV file must be for the same Salesforce object. Specify the object with the `--sobject` flag.

Bulk imports can take a while, depending on how many records are in the CSV file. If the command times out, or you specified the --async flag, the command displays the job ID. To see the status and get the results of the job, run "sf data import resume" and pass the job ID to the --job-id flag.
Bulk imports can take a while, depending on how many records are in the CSV file. If the command times out, the command displays the job ID. To see the status and get the results of the job, run "sf data import resume" and pass the job ID to the --job-id flag.

For information and examples about how to prepare your CSV files, see "Prepare Data to Ingest" in the "Bulk API 2.0 and Bulk API Developer Guide" (https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/datafiles_prepare_data.htm).

@@ -18,10 +18,6 @@ For information and examples about how to prepare your CSV files, see "Prepare D

<%= config.bin %> <%= command.id %> --file accounts.csv --sobject Account --wait 10 --target-org my-scratch

- Import asynchronously and use the default org; the command immediately returns a job ID that you then pass to the "sf data import resume" command:

<%= config.bin %> <%= command.id %> --file accounts.csv --sobject Account --async

# flags.async.summary

Don't wait for the command to complete.
2 changes: 1 addition & 1 deletion messages/data.import.resume.md
@@ -4,7 +4,7 @@ Resume a bulk import job that you previously started. Uses Bulk API 2.0.

# description

When the original "sf data import bulk" command either times out or is run with the --async flag, it displays a job ID. To see the status and get the results of the bulk import, run this command by either passing it the job ID or using the --use-most-recent flag to specify the most recent bulk import job.
When the original "sf data import bulk" command times out, it displays a job ID. To see the status and get the results of the bulk import, run this command by either passing it the job ID or using the --use-most-recent flag to specify the most recent bulk import job.

# examples

6 changes: 1 addition & 5 deletions messages/data.update.bulk.md
@@ -8,7 +8,7 @@ You can use this command to update millions of Salesforce object records based o

All the records in the CSV file must be for the same Salesforce object. Specify the object with the `--sobject` flag. The first column of every line in the CSV file must be an ID of the record you want to update. The CSV file can contain only existing records; if a record in the file doesn't currently exist in the Salesforce object, the command fails. Consider using "sf data upsert bulk" if you also want to insert new records.

Bulk updates can take a while, depending on how many records are in the CSV file. If the command times out, or you specified the --async flag, the command displays the job ID. To see the status and get the results of the job, run "sf data update resume" and pass the job ID to the --job-id flag.
Bulk updates can take a while, depending on how many records are in the CSV file. If the command times out, the command displays the job ID. To see the status and get the results of the job, run "sf data update resume" and pass the job ID to the --job-id flag.

For information and examples about how to prepare your CSV files, see "Prepare Data to Ingest" in the "Bulk API 2.0 and Bulk API Developer Guide" (https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/datafiles_prepare_data.htm).

@@ -18,10 +18,6 @@ For information and examples about how to prepare your CSV files, see "Prepare D

<%= config.bin %> <%= command.id %> --file accounts.csv --sobject Account --wait 10 --target-org my-scratch

- Update asynchronously and use the default org; the command immediately returns a job ID that you then pass to the "sf data update resume" command:

<%= config.bin %> <%= command.id %> --file accounts.csv --sobject Account --async

# flags.async.summary

Don't wait for the command to complete.
2 changes: 1 addition & 1 deletion messages/data.update.resume.md
@@ -4,7 +4,7 @@ Resume a bulk update job that you previously started. Uses Bulk API 2.0.

# description

When the original "sf data update bulk" command either times out or is run with the --async flag, it displays a job ID. To see the status and get the results of the bulk update, run this command by either passing it the job ID or using the --use-most-recent flag to specify the most recent bulk update job.
When the original "sf data update bulk" command times out, it displays a job ID. To see the status and get the results of the bulk update, run this command by either passing it the job ID or using the --use-most-recent flag to specify the most recent bulk update job.

Using either `--job-id` or `--use-most-recent` will properly resolve to the correct org where the bulk job was started based on the cached data by "data update bulk".

13 changes: 2 additions & 11 deletions src/bulkIngest.ts
@@ -38,7 +38,6 @@ type ResumeCommandIDs = 'data import resume' | 'data update resume' | 'data upse
*
* It will create the specified bulk ingest job, set up the oclif/MSO stages and return the job info.
* */
// eslint-disable-next-line complexity
export async function bulkIngest(opts: {
resumeCmdId: ResumeCommandIDs;
stageTitle: string;
@@ -49,7 +48,6 @@ export async function bulkIngest(opts: {
externalId?: JobInfoV2['externalIdFieldName'];
conn: Connection;
cache: BulkUpdateRequestCache | BulkImportRequestCache | BulkUpsertRequestCache;
async: boolean;
wait: Duration;
file: string;
jsonEnabled: boolean;
@@ -63,7 +61,7 @@
throw new SfError('External ID is only required for `sf data upsert bulk`.');
}

const timeout = opts.async ? Duration.minutes(0) : opts.wait ?? Duration.minutes(0);
const timeout = opts.wait ?? Duration.minutes(0);
const async = timeout.milliseconds === 0;

// CSV file for `delete/HardDelete` operations only have 1 column (ID), we set it to `COMMA` if not specified but any delimiter works.
@@ -344,7 +342,7 @@ export const lineEndingFlag = Flags.option({
})();

/**
 * Use only for commands that maintain sfdx compatibility.
*
* @deprecated
*/
@@ -371,13 +369,6 @@ export const baseUpsertDeleteFlags = {
summary: messages.getMessage('flags.wait.summary'),
min: 0,
defaultValue: 0,
exclusive: ['async'],
}),
async: Flags.boolean({
char: 'a',
summary: messages.getMessage('flags.async.summary'),
exclusive: ['wait'],
deprecated: true,
}),
};
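For context on the change above: with the deprecated `--async` flag gone, `bulkIngest` derives the async behavior purely from `--wait`. A minimal sketch of that resolution, where `Duration` is a simplified stand-in for the class from `@salesforce/kit` (not the real import):

```typescript
// Simplified stand-in for @salesforce/kit's Duration (assumption, not the real API).
type Duration = { milliseconds: number };
const minutes = (n: number): Duration => ({ milliseconds: n * 60_000 });

// With --async removed, the wait flag alone decides the mode:
// omitting --wait (or passing --wait 0) means "fire and forget".
function resolveTimeout(wait?: Duration): { timeout: Duration; async: boolean } {
  const timeout = wait ?? minutes(0);
  return { timeout, async: timeout.milliseconds === 0 };
}

console.log(resolveTimeout());            // async: true  (no wait given)
console.log(resolveTimeout(minutes(10))); // async: false (wait up to 10 minutes)
```

This is why no behavior is lost by deleting the flag: `--async` was always equivalent to leaving `--wait` unset.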

1 change: 0 additions & 1 deletion src/commands/data/delete/bulk.ts
@@ -42,7 +42,6 @@ export default class Delete extends SfCommand<BulkResultV2> {
columnDelimiter: undefined,
conn: flags['target-org'].getConnection(flags['api-version']),
cache: await BulkDeleteRequestCache.create(),
async: flags.async,
wait: flags.wait,
file: flags.file,
jsonEnabled: this.jsonEnabled(),
8 changes: 1 addition & 7 deletions src/commands/data/export/bulk.ts
@@ -38,12 +38,6 @@ export default class DataExportBulk extends SfCommand<DataExportBulkResult> {
char: 'w',
helpValue: '<minutes>',
unit: 'minutes',
exclusive: ['async'],
}),
async: Flags.boolean({
summary: messages.getMessage('flags.async.summary'),
exclusive: ['wait'],
deprecated: true,
}),
query: Flags.string({
summary: messages.getMessage('flags.query.summary'),
@@ -108,7 +102,7 @@

const conn = flags['target-org'].getConnection(flags['api-version']);

const timeout = flags.async ? Duration.minutes(0) : flags.wait ?? Duration.minutes(0);
const timeout = flags.wait ?? Duration.minutes(0);

// `flags['query-file']` will be present if `flags.query` isn't. oclif's `exclusive` isn't quite that clever
const soqlQuery = flags.query ?? fs.readFileSync(flags['query-file'] as string, 'utf8');
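The comment above describes the query/query-file fallback. A hedged sketch of that logic, with file reading replaced by a plain string argument so it runs standalone (`resolveQuery` is a hypothetical helper, not part of the plugin):

```typescript
// Hypothetical helper mirroring the fallback in DataExportBulk.run().
// In the real command, queryFileContents comes from
// fs.readFileSync(flags['query-file'], 'utf8'); here it is passed in directly.
function resolveQuery(query?: string, queryFileContents?: string): string {
  const soql = query ?? queryFileContents;
  if (soql === undefined) throw new Error('provide --query or --query-file');
  return soql.trim();
}

console.log(resolveQuery('SELECT Id FROM Account'));
console.log(resolveQuery(undefined, 'SELECT Name FROM Contact\n'));
```

The `as string` cast in the real code is safe only because oclif's flag validation guarantees `--query-file` is present whenever `--query` is not; `exclusive` alone would not enforce that.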
8 changes: 0 additions & 8 deletions src/commands/data/import/bulk.ts
@@ -26,12 +26,6 @@ export default class DataImportBulk extends SfCommand<DataImportBulkResult> {
public static readonly examples = messages.getMessages('examples');

public static readonly flags = {
async: Flags.boolean({
summary: messages.getMessage('flags.async.summary'),
char: 'a',
exclusive: ['wait'],
deprecated: true,
}),
file: Flags.file({
summary: messages.getMessage('flags.file.summary'),
char: 'f',
@@ -48,7 +42,6 @@ export default class DataImportBulk extends SfCommand<DataImportBulkResult> {
summary: messages.getMessage('flags.wait.summary'),
char: 'w',
unit: 'minutes',
exclusive: ['async'],
}),
'target-org': Flags.requiredOrg(),
'line-ending': Flags.option({
@@ -71,7 +64,6 @@
columnDelimiter: flags['column-delimiter'],
conn: flags['target-org'].getConnection(flags['api-version']),
cache: await BulkImportRequestCache.create(),
async: flags.async,
wait: flags.wait,
file: flags.file,
jsonEnabled: this.jsonEnabled(),
6 changes: 0 additions & 6 deletions src/commands/data/update/bulk.ts
@@ -26,11 +26,6 @@ export default class DataUpdateBulk extends SfCommand<DataUpdateBulkResult> {
public static readonly examples = messages.getMessages('examples');

public static readonly flags = {
async: Flags.boolean({
summary: messages.getMessage('flags.async.summary'),
char: 'a',
deprecated: true,
}),
wait: Flags.duration({
summary: messages.getMessage('flags.wait.summary'),
char: 'w',
@@ -65,7 +60,6 @@
columnDelimiter: flags['column-delimiter'],
conn: flags['target-org'].getConnection(flags['api-version']),
cache: await BulkUpdateRequestCache.create(),
async: flags.async,
wait: flags.wait,
file: flags.file,
jsonEnabled: this.jsonEnabled(),
1 change: 0 additions & 1 deletion src/commands/data/upsert/bulk.ts
@@ -46,7 +46,6 @@ export default class Upsert extends SfCommand<BulkResultV2> {
externalId: flags['external-id'],
conn: flags['target-org'].getConnection(flags['api-version']),
cache: await BulkUpsertRequestCache.create(),
async: flags.async,
wait: flags.wait,
file: flags.file,
jsonEnabled: this.jsonEnabled(),
7 changes: 1 addition & 6 deletions test/commands/data/bulk/results.nut.ts
@@ -10,7 +10,6 @@ import { EOL } from 'node:os';
import { execCmd, TestSession } from '@salesforce/cli-plugins-testkit';
import { expect } from 'chai';
import { ensureString } from '@salesforce/ts-types';
import { Duration, sleep } from '@salesforce/kit';
import { validateCsv } from '../../../testUtil.js';
import { DataImportBulkResult } from '../../../../src/commands/data/import/bulk.js';
import { DataBulkResultsResult } from '../../../../src/commands/data/bulk/results.js';
@@ -63,17 +62,13 @@ describe('data bulk results NUTs', () => {
const csvFile = await generateAccountsCsv(session.project.dir, 5000);

const bulkImportAsync = execCmd<DataImportBulkResult>(
`data import bulk --file ${csvFile} --sobject account --async --json`,
`data import bulk --file ${csvFile} --sobject account --wait 3 --json`,
{ ensureExitCode: 0 }
).jsonOutput?.result as DataImportBulkResult;

expect(bulkImportAsync.jobId).not.to.be.undefined;
expect(bulkImportAsync.jobId).to.be.length(18);

// wait 2 minutes for the async bulk import above to finish.
// we can't use `data import resume` because we expect record failures to happen.
await sleep(Duration.minutes(2));

const results = execCmd<DataBulkResultsResult>(`data bulk results --job-id ${bulkImportAsync.jobId} --json`, {
ensureExitCode: 0,
}).jsonOutput?.result as DataBulkResultsResult;
4 changes: 2 additions & 2 deletions test/commands/data/export/resume.nut.ts
@@ -47,7 +47,7 @@ describe('data export resume NUTs', () => {

it('should resume export in csv format', async () => {
const outputFile = 'export-accounts.csv';
const command = `data export bulk -q "${soqlQuery}" --output-file ${outputFile} --async --json`;
const command = `data export bulk -q "${soqlQuery}" --output-file ${outputFile} --json`;

const exportAsyncResult = execCmd<DataExportBulkResult>(command, { ensureExitCode: 0 }).jsonOutput?.result;

@@ -71,7 +71,7 @@

it('should resume export in json format', async () => {
const outputFile = 'export-accounts.json';
const command = `data export bulk -q "${soqlQuery}" --output-file ${outputFile} --async --result-format json --json`;
const command = `data export bulk -q "${soqlQuery}" --output-file ${outputFile} --result-format json --json`;

const exportAsyncResult = execCmd<DataExportBulkResult>(command, { ensureExitCode: 0 }).jsonOutput?.result;

4 changes: 2 additions & 2 deletions test/commands/data/import/resume.nut.ts
@@ -35,7 +35,7 @@ describe('data import resume NUTs', () => {
const csvFile = await generateAccountsCsv(session.dir);

const importAsyncRes = execCmd<DataImportBulkResult>(
`data import bulk --file ${csvFile} --sobject account --async --json`,
`data import bulk --file ${csvFile} --sobject account --json`,
{ ensureExitCode: 0 }
).jsonOutput?.result;

@@ -59,7 +59,7 @@
it('should resume bulk import via --use-most-recent', async () => {
const csvFile = await generateAccountsCsv(session.dir);

const command = `data import bulk --file ${csvFile} --sobject account --async --json`;
const command = `data import bulk --file ${csvFile} --sobject account --json`;

const exportAsyncResult = execCmd<DataImportBulkResult>(command, { ensureExitCode: 0 }).jsonOutput?.result;

2 changes: 1 addition & 1 deletion test/commands/data/update/resume.nut.ts
@@ -60,7 +60,7 @@ describe('data update resume NUTs', () => {
);

const dataUpdateAsyncRes = execCmd<DataUpdateBulkResult>(
`data update bulk --file ${updatedCsv} --sobject account --async --json`,
`data update bulk --file ${updatedCsv} --sobject account --json`,
{ ensureExitCode: 0 }
).jsonOutput?.result;
