Tuning Operations by Configuring Batch Sizes

The Salesforce API allows you to specify the maximum number of rows that can be sent in a single Bulk API request. Two settings control how rows are batched: the Bulk API Batch Size datastore option specifies how many rows at a time the Simba Salesforce SAP Data Services Adapter can send to Salesforce.com, and the Loader Batch Size adapter target parameter specifies how many rows at a time SAP Data Services can send to the target table. You can configure these options to tune operations that involve rows containing large amounts of data.

It is recommended that you do not set the Loader Batch Size parameter in the target table to a value that is greater than the Bulk API Batch Size datastore option. Doing so forces the adapter to split each loader batch into multiple Bulk API requests, which increases the total number of API calls and can cause the Simba Salesforce SAP Data Services Adapter to handle operations less efficiently.

Example: Configuring Batch Sizes to Optimize Adapter Performance

If Loader Batch Size is set to 7500, Bulk API Batch Size is set to 5000, and SAP Data Services is sending 15000 rows, then the following occurs:

  • Data Services sends a batch of 7500 rows to the adapter, causing the adapter to send a batch of 5000 rows and then a second batch of 2500 rows to Salesforce.com.
  • Then, Data Services sends a second batch of 7500 rows to the adapter, again causing the adapter to send a batch of 5000 rows and then a second batch of 2500 rows to Salesforce.com.

In this scenario, the adapter uses a total of 4 API calls to send 15000 rows.

Setting these batch size options properly can optimize how the Simba Salesforce SAP Data Services Adapter handles operations. For example, if Loader Batch Size and Bulk API Batch Size are both set to 5000 and SAP Data Services is sending 15000 rows, then the following occurs:

  • Data Services sends a batch of 5000 rows to the adapter, causing the adapter to send a batch of 5000 rows to Salesforce.com.
  • Data Services then sends the remaining rows in two more batches of 5000, each of which the adapter forwards to Salesforce.com in a single Bulk API request.

In this scenario, the adapter uses only 3 API calls to send 15000 rows.
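The batching arithmetic in the two scenarios above can be sketched as follows. This is a minimal illustration, assuming each loader batch is split independently into Bulk API requests as the examples describe; the function name `count_api_calls` is hypothetical and not part of the adapter or SAP Data Services:

```python
import math

def count_api_calls(total_rows, loader_batch_size, bulk_batch_size):
    """Estimate the number of Bulk API calls the adapter makes.

    Assumes Data Services sends rows to the adapter in batches of
    loader_batch_size, and the adapter splits each of those batches
    into requests of at most bulk_batch_size rows.
    """
    calls = 0
    remaining = total_rows
    while remaining > 0:
        # Data Services sends one loader batch (possibly a final partial one).
        loader_batch = min(loader_batch_size, remaining)
        # The adapter splits that batch into ceil(batch / bulk size) requests.
        calls += math.ceil(loader_batch / bulk_batch_size)
        remaining -= loader_batch
    return calls

# First scenario: Loader Batch Size 7500, Bulk API Batch Size 5000.
print(count_api_calls(15000, 7500, 5000))  # 4 API calls

# Second scenario: both batch sizes set to 5000.
print(count_api_calls(15000, 5000, 5000))  # 3 API calls
```

As the sketch shows, keeping Loader Batch Size at or below Bulk API Batch Size avoids splitting loader batches, so every loader batch maps to exactly one API call.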