SalesforceBulkWriter

Not available in Community Designer

Short Description
Ports
Metadata
SalesforceBulkWriter Attributes
Details
Examples
Compatibility
See also

Short Description

SalesforceBulkWriter writes, updates, or deletes records in Salesforce using Bulk API.

Component             Data output  Input ports  Output ports  Transformation  Transf. req.  Java  CTL  Auto-propagated metadata
SalesforceBulkWriter  database     1            2             yes             no            no    yes  yes

Icon

Ports

Port type  Number  Required  Description                                           Metadata
Input      0       yes       For data records to be inserted, updated, or deleted  input0
Output     0       no        For accepted data records                             input0 plus ObjectId field
Output     1       no        For rejected data records                             input0 plus Error message field

If you do not map the error port and an error occurs, the component fails.

Metadata

SalesforceBulkWriter propagates metadata from left to right.

The metadata on the first output port has an additional field, ObjectId. If the operation is upsert, the output metadata on the first output port also has an additional field, Created.

The metadata on the second output port has an additional field, Error.

Metadata cannot use Autofilling Functions.

SalesforceBulkWriter Attributes

Basic

Connection (required)
    A Salesforce connection. See Salesforce connection.
    E.g. MySFConnection

Salesforce object (required)
    The object affected by the operation.
    E.g. Account

Operation
    The operation performed on the Salesforce object.
    insert (default) | update | upsert | delete | hardDelete

Input mapping
    Mapping of fields to be inserted/updated/deleted in Salesforce. Unmapped mandatory output fields are marked with an exclamation mark icon.
    Map by name (default)

Output mapping
    Mapping of successfully inserted/updated/deleted fields.
    Map by name (default)

Error mapping
    Mapping of records that were not inserted/updated/deleted.
    Map by name (default)

Advanced

Result polling interval (seconds)
    Time between queries for the result of batch processing. The default value is taken from the connection configuration.
    5 (default)

Upsert external ID field (required for upsert)
    The field of the object specified in Salesforce object that is used to match records in the upsert operation. Mandatory for the upsert operation; not used in any other operation.
    E.g. Id

Job concurrency mode
    In parallel mode, all batches in the job run at once. Parallel mode improves processing speed and lowers the number of API requests needed (fewer polling requests for job status), but it can introduce heavy lock contention on Salesforce objects. See the documentation on parallel and serial modes: https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/asynch_api_planning_guidelines.htm
    parallel (default) | serial

Batch size
    The size of one batch. The default value is 10000.
    E.g. 10000

Details

Supported Operations

Insert - inserts new records.

Update - updates existing records.

Upsert - inserts or updates records.

Delete - moves records to the recycle bin.

HardDelete - removes records permanently. This operation requires a special permission.

Depending on the operation you have chosen, different output metadata is displayed in the Input mapping transform editor.

[Note]Note

Bulk API calls are asynchronous. Therefore, data records written to Salesforce may appear in the Salesforce web GUI only after several seconds or even minutes.

SOAP or Bulk API

If you write more than 1,500-2,000 records, it is better to use the Bulk API, because it uses fewer API requests.
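
The rule of thumb above can be expressed as a trivial check; the function name and the exact threshold used here are hypothetical, not part of the product:

```python
# Hypothetical rule of thumb: prefer the Bulk API once the number of
# records to write exceeds roughly 1,500-2,000; below that, the SOAP API
# (SalesforceWriter) is usually sufficient.
def preferred_api(record_count: int, threshold: int = 2000) -> str:
    """Return which Salesforce API is likely cheaper in API requests."""
    return "bulk" if record_count > threshold else "soap"
```

For example, `preferred_api(100)` suggests the SOAP API, while `preferred_api(50_000)` suggests the Bulk API.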

Mapping Dialogs

Mapping dialogs use the SOAP API to extract metadata fields on the Salesforce side of the dialog.

Names of Objects and fields

When specifying Salesforce objects and fields, always use the API Name of the element.

Notes and Limitations

SalesforceBulkWriter does not support writing attachments.

Details on API Calls

SalesforceBulkWriter automatically groups records into batches and uploads them to Salesforce. The batch size limit is 10,000 records or 10 MB of data; this limit is imposed by the Salesforce Bulk API.
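
The grouping described above can be sketched as follows. This is an illustrative sketch, not the component's actual code; the function name and the CSV-row representation of a record are assumptions, while the two limits are the ones imposed by the Salesforce Bulk API:

```python
# Sketch: split serialized records into batches that respect both
# Bulk API limits -- at most 10,000 records and at most 10 MB per batch.
MAX_RECORDS = 10_000
MAX_BYTES = 10 * 1024 * 1024

def make_batches(records):
    """Yield lists of serialized records, each list within both limits."""
    batch, batch_bytes = [], 0
    for rec in records:
        size = len(rec.encode("utf-8"))  # size of one serialized row
        # Start a new batch if adding this record would break either limit.
        if batch and (len(batch) >= MAX_RECORDS or batch_bytes + size > MAX_BYTES):
            yield batch
            batch, batch_bytes = [], 0
        batch.append(rec)
        batch_bytes += size
    if batch:
        yield batch

# 25,000 small rows fit in 3 batches: 10,000 + 10,000 + 5,000.
batches = list(make_batches([f"row{i}\n" for i in range(25_000)]))
```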

SalesforceBulkWriter uses multiple API calls during its run. All of them count towards your Salesforce API request limit. The precise call flow is:

  1. Login.

  2. Extract fields of the object specified in the Salesforce object attribute.

  3. Create a bulk job.

  4. Upload batches. The number of calls is the same as the number of batches.

  5. Close the bulk job.

  6. Get the job completion status. This call is repeated at the interval specified by the Result polling interval attribute until the job is completed.

  7. Download batch results. The number of calls is the same as the number of batches.
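
As a rough illustration of the call flow above, the total number of API requests for one run can be estimated as follows. The function and its parameters are hypothetical; in practice the number of status polls depends on the Result polling interval and on how long the job runs:

```python
import math

def total_api_calls(num_records: int, batch_size: int = 10_000,
                    status_polls: int = 1) -> int:
    """Estimate Salesforce API requests for one SalesforceBulkWriter run,
    following the call flow above."""
    batches = max(1, math.ceil(num_records / batch_size))
    return (
        1              # 1. login
        + 1            # 2. extract object fields
        + 1            # 3. create bulk job
        + batches      # 4. one upload call per batch
        + 1            # 5. close bulk job
        + status_polls # 6. job completion polls
        + batches      # 7. one result download per batch
    )

# 25,000 records -> 3 batches -> 11 calls with a single status poll.
calls = total_api_calls(25_000)
```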

Examples

Insert records into Salesforce
Updating records
Upserting records
Deleting records
Hard-Deleting records

Insert records into Salesforce

This example shows a basic use case with insertion of records.

Insert records with new products into Salesforce. The input data fields have the same names as the Salesforce data fields.

Solution

Connect input port of SalesforceBulkWriter with data source.

Create a Salesforce connection.

In SalesforceBulkWriter, fill in Connection and Salesforce object.

Attribute          Value
Connection         Connection from the second step
Salesforce object  Product2

We do not have to specify the operation, as insert is the default one. We do not have to specify Input mapping, as the metadata field names are the same as the Salesforce field names and the default mapping (by name) is used.

You can attach an edge to the first output port to obtain object ids of inserted records. You can attach an edge to the second output port to obtain records that have not been inserted.

Updating records

This example shows updating of Salesforce objects.

We do not sell products from our 'PaddleSteamer' family anymore. Set IsActive to false for all products of this family.

Solution

In this example, we first have to read the IDs of the objects to be updated (with SalesforceBulkReader). Then we set IsActive to false. Finally, we update the records in Salesforce (with SalesforceBulkWriter).

Create a Salesforce connection. In SalesforceBulkReader, set Connection, SOQL query, and Output mapping.

Attribute       Value
Connection      Connection from the first step
SOQL query      SELECT Id FROM Product2 WHERE Family = 'PaddleSteamer'
Output mapping  See the code below
//#CTL2

function integer transform() {
 	$out.0.Id = $in.0.Id;
 	$out.0.IsActive = false;

 	return ALL;
}

In SalesforceBulkWriter, set Connection, Salesforce object, and Operation.

Attribute          Value
Connection         Connection from the first step
Salesforce object  Product2
Operation          Update

In SalesforceBulkWriter, no mapping needs to be specified, as the default mapping by field names is used.

Upserting records

This example shows usage of upsert operation.

There is a list of companies and their websites. Update the websites in Salesforce.

Solution

Read records with the company name and object ID from the Account object. Match the companies with their websites. Write only the new records or the records that have been updated.

Create a Salesforce connection.

In SalesforceBulkWriter, use the attributes Connection, Salesforce object, Operation, Input mapping, and Upsert external ID field.

Attribute                 Value
Connection                Connection from the first step
Salesforce object         Account
Operation                 Upsert
Input mapping             See the code below
Upsert external ID field  Id
//#CTL2

function integer transform() {
 	$out.0.Id = $in.0.Id;

 	return ALL;
}
[Note]Note

The records containing a valid record ID are updated; the records with the ID set to null are inserted.
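
The upsert semantics described in the note can be sketched as a simple decision (this is illustrative Python, not CloverETL code; the record representation and function name are assumptions):

```python
# Sketch of the upsert decision: a record carrying a valid Id is updated,
# a record whose Id is null (None) is inserted as a new object.
def upsert_action(record: dict) -> str:
    """Return which operation the upsert performs for one record."""
    return "update" if record.get("Id") else "insert"

# A record with a (hypothetical) valid Salesforce Id is updated:
upsert_action({"Id": "001xx000003DGb2AAG", "Website": "https://example.com"})
# A record with Id set to null is inserted:
upsert_action({"Id": None, "Website": "https://example.com"})
```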

Deleting records

This example shows deleting records.

The product called 'abc' has been inserted multiple times. Furthermore, we no longer have the 'abc' product. Remove the 'abc' product.

Solution

Create a Salesforce connection.

Read the object IDs of the 'abc' product with SalesforceBulkReader.

In SalesforceBulkWriter, set Connection, Salesforce object, and Operation.

Attribute          Value
Connection         Connection from the first step
Salesforce object  Product2
Operation          Delete

Hard-Deleting records

This example shows usage of Hard Delete.

Permanently delete records with the specified IDs from the Account object. The IDs are received from an input edge.

Solution

Create a Salesforce connection.

In SalesforceBulkWriter, set Connection, Salesforce object, Operation, and Input mapping.

Attribute          Value
Connection         Connection from the first step
Salesforce object  Account
Operation          Hard Delete
Input mapping      See the code below.
//#CTL2

function integer transform() {
 	$out.0.Id = $in.0.Id;

 	return ALL;
}
[Note]Note

To use the Hard Delete operation, the user must have the Bulk API Hard Delete permission.

See https://help.salesforce.com/apex/HTViewSolution?id=000171306&language=en_US.

Compatibility

4.3.0-M2

SalesforceBulkWriter is available since 4.3.0-M2. It uses Salesforce Bulk API version 37.0.

4.5.0-M2

Since CloverETL 4.5.0-M2, SalesforceBulkWriter uses Salesforce Bulk API version 39.0.

4.5.0

Since CloverETL 4.5.0, you can set job concurrency mode and batch size.

See also

SalesforceBulkReader
SalesforceWriter
SalesforceWaveWriter
Common Properties of Components
Specific Attribute Types
Common Properties of Readers
Readers Comparison
Salesforce connection
Extracting Metadata from Salesforce