Automate Salesforce Data Import/Export Using Data Loader CLI

One of the key facets of working with Salesforce is importing and exporting data. Data Loader is a great tool provided by Salesforce for this purpose. However, the Data Loader user interface requires manual work for each operation, and there are many scenarios where we need to automate the process, such as running an export every night at midnight. For these scenarios we need the Data Loader CLI (Command Line Interface) to run the process and Windows Task Scheduler to schedule it. In this article, we will walk through exporting data from Salesforce using the Data Loader CLI and scheduling it with Windows Task Scheduler. Here are the required steps:

  1. Install Data Loader in your local system.
  2. Create a secret key file using the encrypt.bat utility.
  3. Create an encrypted password using the secret key.
  4. Create the process-conf.xml file.
  5. Create a field mapping (SDL) file if you are doing an import operation.
  6. Execute Data Loader from the command-line interface.
  7. Schedule the process using Windows Task Scheduler.

Install Data Loader In Your Local System:

You can download and install Data Loader from the Salesforce Data Management section.

Create a secret key file using the encrypt.bat file:

  1. Navigate to the Data Loader \bin directory by entering this command: cd "C:\Program Files (x86)\salesforce.com\Apex Data Loader 36.0\bin". Replace the path with your own \bin directory.
  2. Generate an encryption key using this command: encrypt.bat -g <yourSecretKey>. Replace <yourSecretKey> with any string you like.
  3. Copy the generated key from the command line into a text file named key.txt and note down the path of the file.

Create a password using the secret key created:

In the same command window, type the following command: encrypt.bat -e "YourOrgPassword" "path\to\key.txt". The output is your encrypted password; copy it for use in process-conf.xml.
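Steps 2 and 3 put together look like the following Command Prompt session. This is a sketch: the install path, the secret phrase, the key.txt location, and the org password shown here are placeholder assumptions, so substitute your own values.

```shell
rem Windows Command Prompt session (paths and values are illustrative)
cd "C:\Program Files (x86)\salesforce.com\Apex Data Loader 36.0\bin"

rem Generate the encryption key from your secret phrase
encrypt.bat -g MySecretPhrase
rem Copy the key printed on the console into a file, e.g. C:\Data Loader\bin\key.txt

rem Encrypt your org password with that key file; the printed value is what
rem goes into the sfdc.password entry of process-conf.xml
encrypt.bat -e "MyOrgPassword" "C:\Data Loader\bin\key.txt"
```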

Create the "process-conf.xml" file:

The process-conf.xml file contains all the Salesforce configuration data that Data Loader requires for the process.
Each bean tag in this file describes a single process, so you can define multiple operations in a single file.
Open the process-conf.xml file from the \samples\conf directory in a text editor and adjust the following configuration.

<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
    <beans>
    <bean id="Account"
        class="com.salesforce.dataloader.process.ProcessRunner"
        singleton="false">
        <description>Extract Account data.</description>
        <property name="name" value="Account"/>
        <property name="configOverrideMap">
            <map>
                <entry key="sfdc.debugMessages" value="true"/>
                <entry key="sfdc.debugMessagesFile" value="C:\Users\Ravikant Maurya\Desktop\Key1.log"/>
                <entry key="sfdc.endpoint" value="https://login.salesforce.com/"/>
                <entry key="sfdc.username" value="*************************"/>
                <entry key="sfdc.password" value="********************************"/>
                <entry key="process.encryptionKeyFile" value="C:\Data Loader\bin\key.txt"/>
                <entry key="sfdc.timeoutSecs" value="600"/>
                <entry key="sfdc.useBulkApi" value="true"/>
                <entry key="sfdc.loadBatchSize" value="200"/>
                <entry key="sfdc.externalIdField" value="id"/>
                <entry key="sfdc.entity" value="Account"/>
                <entry key="sfdc.extractionRequestSize" value="10000"/>
                <entry key="sfdc.extractionSOQL" value="SELECT Id, Name FROM Account"/>
                <entry key="process.operation" value="extract"/>
                <entry key="process.mappingFile" value="C:\Data Loader\samples\conf\accountExtractMap.sdl"/>
                <entry key="dataAccess.name" value="C:\SalesforceData\Account\account.csv"/>
                <entry key="dataAccess.type" value="csvWrite"/>
                <entry key="process.initialLastRunDate" value="2005-12-01T00:00:00.000-0800"/>
            </map>
        </property>
    </bean>
</beans>
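Because each bean is an independent process, an import can live in the same file alongside the extract above. Here is a sketch of what an insert bean might look like; the bean id, CSV path, and file names are illustrative assumptions, not values from the article.

```xml
<bean id="AccountInsert"
    class="com.salesforce.dataloader.process.ProcessRunner"
    singleton="false">
    <description>Insert Account data from a CSV file.</description>
    <property name="name" value="AccountInsert"/>
    <property name="configOverrideMap">
        <map>
            <entry key="sfdc.endpoint" value="https://login.salesforce.com/"/>
            <entry key="sfdc.username" value="*************************"/>
            <entry key="sfdc.password" value="********************************"/>
            <entry key="process.encryptionKeyFile" value="C:\Data Loader\bin\key.txt"/>
            <entry key="sfdc.entity" value="Account"/>
            <entry key="process.operation" value="insert"/>
            <entry key="process.mappingFile" value="C:\Data Loader\samples\conf\accountInsertMap.sdl"/>
            <!-- csvRead instead of csvWrite: the CSV is the data source, not the target -->
            <entry key="dataAccess.name" value="C:\SalesforceData\Account\accountToInsert.csv"/>
            <entry key="dataAccess.type" value="csvRead"/>
        </map>
    </property>
</bean>
```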

Create a field mapping (SDL) file if you are doing an import operation:

Copy the following content into a file named accountInsertMap.sdl. This mapping is for data import, so the left side of the equals sign is the column header from your CSV file and the right side is the Salesforce field API name. For complex mappings, you can generate the file from the mapping dialog in the Data Loader user interface.

#Mapping values
#Thu May 26 16:19:33 GMT 2011
Name=Name
NumberOfEmployees=NumberOfEmployees
Industry=Industry
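For the extract process defined in process-conf.xml above, the referenced accountExtractMap.sdl runs in the opposite direction: the left side is the Salesforce field and the right side is the CSV column header to write. A minimal sketch (the CSV column names here are illustrative):

```
#Extract mapping: SalesforceField=CSV column header
Id=ACCOUNT_ID
Name=ACCOUNT_NAME
```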

Execute Data Loader from the command-line interface:

This step is optional, for testing the process before scheduling it. Enter the following command to execute the process.
process.bat "path to the directory containing process-conf.xml" "process name"
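With the bean defined above (id "Account"), a test run might look like the following. The conf directory path is an assumption based on a default install; use the directory where your process-conf.xml actually lives.

```shell
rem First argument: the directory that contains process-conf.xml
rem Second argument: the bean id (process name) to run
process.bat "C:\Program Files (x86)\salesforce.com\Data Loader\samples\conf" "Account"
```

If the run succeeds, the extracted records appear in the CSV file configured under dataAccess.name.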

Schedule the process using Windows Task Scheduler:

Open the process.bat file in the \bin directory and replace its contents with the following batch script.

@echo off
:run
set PROCESS_OPTION=process.name=Account
IF "%JAVA_HOME%" == "" (
for /f "tokens=*" %%i in ('dataloader-37.0.0-java-home.exe') do (
IF EXIST "%%i" (
set JAVA_HOME=%%i
) ELSE (
echo %%i
)
)
)
IF "%JAVA_HOME%" == "" (
echo To run process.bat, set the JAVA_HOME environment variable to the directory where the Java Runtime Environment ^(JRE^) is installed.
) ELSE (
IF NOT EXIST "%JAVA_HOME%" (
echo We couldn't find the Java Runtime Environment ^(JRE^) in directory "%JAVA_HOME%". To run process.bat, set the JAVA_HOME environment variable to the directory where the JRE is installed.
) ELSE (
"%JAVA_HOME%\bin\java" -cp ..\dataloader-37.0.0-uber.jar -Dsalesforce.config.dir="C:\Program Files (x86)\salesforce.com\Data Loader\samples\conf" com.salesforce.dataloader.process.ProcessRunner %PROCESS_OPTION%
)
)
:end

Here PROCESS_OPTION=process.name=Account refers to the bean id in the process-conf.xml file you created above, and -Dsalesforce.config.dir="C:\Program Files (x86)\salesforce.com\Data Loader\samples\conf" is the directory containing that process-conf.xml file.
Use Windows Task Scheduler to schedule the process.bat file so records are imported or exported at the time you require (daily, weekly, or monthly).
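Instead of clicking through the Task Scheduler UI, the same schedule can be registered from the command line with schtasks. The task name, time, and install path below are illustrative assumptions.

```shell
rem Create a task that runs process.bat every night at midnight
schtasks /Create /SC DAILY /ST 00:00 /TN "DataLoaderAccountExtract" ^
    /TR "\"C:\Program Files (x86)\salesforce.com\Data Loader\bin\process.bat\""
```

Run `schtasks /Query /TN "DataLoaderAccountExtract"` afterwards to confirm the task was registered.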
