Short Description |
Ports |
Metadata |
SalesforceBulkWriter Attributes |
Details |
Examples |
Compatibility |
See also |
SalesforceBulkWriter writes, updates, or deletes records in Salesforce using Bulk API.
Component | Data output | Input ports | Output ports | Transformation | Transf. req. | Java | CTL | Auto-propagated metadata |
---|---|---|---|---|---|---|---|---|
SalesforceBulkWriter | database | 1 | 2 |
Port type | Number | Required | Description | Metadata |
---|---|---|---|---|
Input | 0 | yes | For data records to be inserted, updated, or deleted | input0 |
Output | 0 | no | For accepted data records | input0 plus ObjectId field |
Output | 1 | no | For rejected data records | input0 plus Error message field |
If you do not map an error port and an error occurs, the component fails.
SalesforceBulkWriter propagates metadata from left to right.
The metadata on the first output port has an additional field, ObjectId. If the operation is upsert, the metadata on the first output port also has an additional field, Created.
The metadata on the second output port has an additional field, Error.
Metadata cannot use Autofilling Functions.
Attribute | Req | Description | Possible values |
---|---|---|---|
Basic ||||
Connection | yes | A Salesforce connection. See Salesforce connection. | e.g. MySFConnection |
Salesforce object | yes | An object affected by the operation. | e.g. Account |
Operation | | An operation performed on the Salesforce object. | insert (default) | update | upsert | delete | hardDelete |
Input mapping | | Mapping of fields to be inserted/updated/deleted in Salesforce. Unmapped mandatory output fields have an exclamation mark icon. | Map by name (default) |
Output mapping | | Mapping of successfully inserted/updated/deleted records. | Map by name (default) |
Error mapping | | Mapping of records that were not inserted/updated/deleted. | Map by name (default) |
Advanced ||||
Result polling interval (seconds) | | Time between queries for the result of batch processing. The default value is taken from the connection configuration. | 5 (default) |
Upsert external ID field | (yes) | A field of the object specified in Salesforce object that is used to match records in the upsert operation. Mandatory for the upsert operation; not used in any other operation. | e.g. Id |
Job concurrency mode | | In parallel mode, all batches in the job run at once. Parallel mode improves processing speed and reduces the number of API requests (fewer polling requests for job status), but it can introduce lock contention on Salesforce objects. See the Salesforce documentation on parallel and serial modes: https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/asynch_api_planning_guidelines.htm | parallel (default) | serial |
Batch size | | Size of the batch. The default value is 10,000 records. | e.g. 10000 |
Insert - inserts new records
Update - updates existing records
Upsert - inserts or updates records
Delete - moves records to the Recycle Bin
HardDelete - removes records permanently. This operation requires a special permission.
Depending on the operation you have chosen, different target metadata is displayed in the Input mapping transform editor.
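For example, with the delete or hardDelete operation only the record Id is needed on the Salesforce side, so a minimal Input mapping can look like the following sketch:

```
//#CTL2
// Minimal input mapping for the delete / hardDelete operations:
// only the record Id is mapped.
function integer transform() {
    $out.0.Id = $in.0.Id;
    return ALL;
}
```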
Note | |
---|---|
Bulk API calls are asynchronous. Therefore, data records written to Salesforce may appear in the Salesforce web UI only after several seconds or even minutes. |
If you write more than 1,500-2,000 records, it is better to use the Bulk API, as it uses fewer API requests.
The mapping dialogs use the SOAP API to extract metadata fields on the Salesforce side of the dialog.
When specifying Salesforce objects and fields, always use the API Name of the element.
SalesforceBulkWriter does not support writing attachments.
SalesforceBulkWriter automatically groups records into batches and uploads them to Salesforce. The batch size limit is 10,000 records or 10 MB of data; this limit is imposed by the Salesforce Bulk API.
SalesforceBulkWriter uses multiple API calls during its run. All of them count towards your Salesforce API request limit. The precise call flow is:
Login.
Extract the fields of the object specified in the Salesforce object attribute.
Create a bulk job.
Upload batches. The number of calls equals the number of batches.
Close the bulk job.
Get the job completion status. This call is repeated at the interval specified by the Result polling interval attribute until the job is completed.
Download batch results. The number of calls equals the number of batches.
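The call flow above can be summarized with a simple estimate. As a sketch (the exact count may vary, e.g. with retries), a run with a given number of batches and status polls makes roughly:

```
//#CTL2
// Rough estimate of API calls per run:
// 4 fixed calls (login, field extraction, job create, job close)
// + 2 calls per batch (upload + result download)
// + 1 call per status poll.
function integer apiCalls(integer batches, integer polls) {
    return 4 + 2 * batches + polls;
}
```

For example, 25,000 records with the default batch size of 10,000 produce 3 batches, so a run with 4 status polls makes about 14 API calls.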
Insert records into Salesforce |
Updating records |
Upserting records |
Deleting records |
Hard-Deleting records |
This example shows a basic use case with insertion of records.
Insert records with new products into Salesforce. The input data fields have the same names as the Salesforce data fields.
Connect input port of SalesforceBulkWriter with data source.
Create a Salesforce connection.
In SalesforceBulkWriter, fill in Connection and Salesforce object.
Attribute | Value |
---|---|
Connection | Connection from the second step |
Salesforce object | Product2 |
We do not have to specify the operation, as insert is the default. We do not have to specify Input mapping either, as the metadata field names are the same as the Salesforce field names, so the default mapping by name is used.
You can attach an edge to the first output port to obtain the object IDs of inserted records, and an edge to the second output port to obtain the records that could not be inserted.
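The default mapping by name is equivalent to an explicit Input mapping such as the following sketch (Name and ProductCode are standard fields of the Product2 object; your input metadata may contain different fields):

```
//#CTL2
// Explicit equivalent of the default map-by-name input mapping.
function integer transform() {
    $out.0.Name = $in.0.Name;
    $out.0.ProductCode = $in.0.ProductCode;
    return ALL;
}
```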
This example shows updating of Salesforce objects.
We do not sell products from our 'PaddleSteamer' family anymore. Set IsActive to false for all products of this family.
In this example, we first read the IDs of the objects to be updated (with SalesforceBulkReader), then set IsActive to false, and finally update the records in Salesforce (with SalesforceBulkWriter).
Create a Salesforce connection. In SalesforceBulkReader, set Connection, SOQL query, and Output mapping.
Attribute | Value |
---|---|
Connection | Connection from the first step |
SOQL query | SELECT Id FROM Product2 WHERE Family = 'PaddleSteamer' |
Output mapping | See the code below |
//#CTL2
function integer transform() {
    $out.0.Id = $in.0.Id;
    $out.0.IsActive = false;
    return ALL;
}
In SalesforceBulkWriter, set Connection, Salesforce object, and Operation.
Attribute | Value |
---|---|
Connection | Connection from the first step |
Salesforce object | Product2 |
Operation | Update |
In SalesforceBulkWriter, no mapping needs to be specified, as the default mapping by field names is used.
This example shows usage of upsert operation.
There is a list of companies and their websites. Update the websites in Salesforce.
Read records with the company name and object ID from the Account object. Match the companies with their websites. Write only the new records or the records that have been updated.
Create a Salesforce connection.
In SalesforceBulkWriter, set the attributes Connection, Salesforce object, Operation, Input mapping, and Upsert external ID field.
Attribute | Value |
---|---|
Connection | Connection from the first step |
Salesforce object | Account |
Operation | Upsert |
Input mapping | See the code below |
Upsert external ID field | Id |
//#CTL2
function integer transform() {
    $out.0.Id = $in.0.Id;
    return ALL;
}
Note | |
---|---|
The records containing valid record ID are updated; the records with ID set to null are inserted. |
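To also write the new website values in this example, the Input mapping would map the Website field as well; a sketch (Website is a standard field of the Account object, assumed here to be present in the input metadata):

```
//#CTL2
// Upsert mapping sketch: records with a valid Id are updated,
// records with Id set to null are inserted.
function integer transform() {
    $out.0.Id = $in.0.Id;
    $out.0.Website = $in.0.Website; // assumed input field matching Account.Website
    return ALL;
}
```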
This example shows deleting records.
The product called 'abc' has been inserted multiple times. Furthermore, we no longer offer the 'abc' product. Remove the 'abc' product.
Create a Salesforce connection.
Read object IDs of 'abc' product with SalesforceBulkReader.
In SalesforceBulkWriter, set Connection, Salesforce object, and Operation.
Attribute | Value |
---|---|
Connection | Connection from the first step |
Salesforce object | Product2 |
Operation | Delete |
This example shows usage of Hard Delete.
Permanently delete records of specified IDs from Account object. The IDs are received from an input edge.
Create a Salesforce connection.
In SalesforceBulkWriter, set Connection, Salesforce object, Operation, and Input mapping.
Attribute | Value |
---|---|
Connection | Connection from the first step |
Salesforce object | Account |
Operation | Hard Delete |
Input mapping | See the code below. |
//#CTL2
function integer transform() {
    $out.0.Id = $in.0.Id;
    return ALL;
}
Note | |
---|---|
The user has to have the Bulk API Hard Delete permission to use the Hard Delete operation. See https://help.salesforce.com/apex/HTViewSolution?id=000171306&language=en_US. |
SalesforceBulkWriter is available since 4.3.0-M2. It uses Salesforce Bulk API version 37.0.
Since CloverETL 4.5.0-M2, SalesforceBulkWriter uses Salesforce Bulk API version 39.0.
Since CloverETL 4.5.0, you can set job concurrency mode and batch size.