Salesforce Data Migration Process

Best Practices for Data Migration

Migrating data in a Salesforce project may look like an easy task to Salesforce developers, but it rarely is. Getting data uploaded successfully into a Salesforce instance takes genuine, sustained effort.

It is, in fact, a challenging and time-consuming task that demands ample planning and a good understanding of the current system.

If data is imported incorrectly, the implications are huge. Imagine what would happen if revenue figures were imported wrongly, record owners were set incorrectly, or tasks meant for one user were assigned to another user in the org.

This article covers the basics of data migration planning: what good planning looks like, the essential questions to ask, and how to plan for risks.

Data migration is a topic that is often daunting and risky, with conflicting advice online, which is why I hope to use my experience to make the subject clearer.

All of this becomes even more troublesome when migrating data into a live instance that is already being used by several users, as opposed to a brand-new one.

No matter how many projects you work on, you will learn something new from each one of them.


1. Check the time zone of the target org and make sure you have configured your Data Loader accordingly.

2. Turn off email deliverability so that you don't send emails to the users associated with records during your data load.

3. Determine the order of migration of objects. In Salesforce, the relationships and dependencies that exist between objects dictate the order of migration.

4. Create a data template for each object, in Excel, using a data export from the Data Loader (use the export file as your template). Use those templates to import data back into the system.
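Step 4 can be scripted. In this sketch the export is a tiny in-memory sample rather than a real Data Loader file, and the column names are invented for illustration:

```python
import csv
import io

# A tiny stand-in for a real Data Loader export; in practice you would
# open the exported .csv file instead of this sample string.
export = io.StringIO(
    "Id,Name,OwnerId,Legacy_Id__c\n"
    "001xx0000001,Acme,005xx0000001,A-100\n"
)

# The first row of the export is the header: column names exactly as
# the target org expects them.
header = next(csv.reader(export))

# The template is just that header row -- fill it with source rows later.
template = io.StringIO()
csv.writer(template).writerow(header)

print(template.getvalue().strip())
```

Reusing the export's own header this way avoids hand-typing API field names and the typos that come with it.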


1. As we initiate the data migration process, one of the first things to do is user mapping. We need a clear understanding of how the user IDs of the existing system map to the new user IDs on the Salesforce platform.
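A minimal sketch of the user mapping in step 1, applied to exported rows (all IDs and column names below are invented; a real mapping usually comes from a spreadsheet agreed with the business):

```python
import csv
import io

# Hypothetical mapping from legacy user ids to new Salesforce user ids.
USER_MAP = {
    "OLD-USER-01": "005xx000000001AAA",
    "OLD-USER-02": "005xx000000002AAA",
}

def remap_owner(rows, user_map):
    """Replace each legacy OwnerId with the new org's user id, flagging gaps."""
    unmapped = []
    for row in rows:
        new_id = user_map.get(row["OwnerId"])
        if new_id is None:
            unmapped.append(row["OwnerId"])  # decide a fallback owner for these
        else:
            row["OwnerId"] = new_id
    return rows, unmapped

source = csv.DictReader(io.StringIO(
    "Name,OwnerId\nAcme,OLD-USER-01\nGlobex,OLD-USER-03\n"
))
rows, unmapped = remap_owner(list(source), USER_MAP)
print(rows[0]["OwnerId"], unmapped)  # 005xx000000001AAA ['OLD-USER-03']
```

Collecting the unmapped IDs rather than failing silently matters: every record whose owner cannot be mapped needs an explicit decision (a fallback owner or a fixed mapping) before load day.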

2. This user mapping information is critical to ensure that record ownership is set up correctly in the new system. It is also a good time to revisit which users hold which kinds of licenses. As you set up all the new data, make sure that all page layouts, and the mapping of page layouts to profiles, are set up properly for every record type. If you are migrating data from one Salesforce instance to another, plan to keep a couple of licenses on the old instance for a few months after the cutover date; if issues surface after the migration, the old instance will come in extremely useful for investigating them.

3. When inserting records into the new system, it is extremely important to identify the order in which objects are inserted into Salesforce. As a trivial example, you will need to insert all the accounts first, and then all the contacts, so that the relationship between contacts and accounts is set up properly. In real situations the relationships between objects can be fairly complex. Besides the ordering of objects, you may need to update some fields of the same object in a second pass so that all the lookups (for example, self-referencing ones) are set up properly.
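The load order in step 3 can be derived automatically with a topological sort over a dependency map. The map below is a hypothetical example, not a complete model of any real org, and uses only Python's standard library:

```python
from graphlib import TopologicalSorter

# Hypothetical dependency map: each object lists the objects its lookup or
# master-detail fields point to, so parents are loaded before children.
DEPENDS_ON = {
    "Account": [],
    "Contact": ["Account"],
    "Opportunity": ["Account"],
    "OpportunityContactRole": ["Opportunity", "Contact"],
}

# static_order() yields each object only after everything it depends on.
order = list(TopologicalSorter(DEPENDS_ON).static_order())
print(order)
```

Self-referencing lookups (such as a parent-account field) cannot be solved by ordering alone; those are the fields that need the second update pass mentioned above.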

4. If you are migrating code from one Salesforce instance to another, make sure that the Apex classes do not have any hard-coded instance names like ns1. These can easily be avoided by using Apex methods such as `URL.getSalesforceBaseUrl()`. Also, if there are hard-coded Account or Opportunity IDs in your classes or test cases, they may lead to incorrect behavior of the Apex programs in the new instance.

5. As you move additional data into the Salesforce instance, keep a watch on the storage you are consuming. If you are approaching your permitted storage limits, it is time to call your Salesforce representative to purchase additional space.

6. Revisit the installed AppExchange apps. Most Salesforce instances that have been active for many years tend to accumulate apps that are not really being used. A large data migration is a good time to uninstall the unused ones.

7. It is important to plan out the data migration. Users should be informed well in advance about the cut-off date and the issues that may arise. It is also a good idea to have a few pilot users available to test over the weekend on which a major data migration is planned.

8. From a developer perspective, it is important to plan sufficient time for testing the results of the data migration before rolling out the instance to users. Just because the Data Loader executed without errors does not mean the migration is complete. Use the Developer Console to run basic queries such as the total number of accounts of a certain record type, the number of accounts without any contacts, or the number of accounts owned by user XYZ.

9. It is also important to do sanity testing directly in Salesforce. Log in as different types of users and view a few records of different objects. Compare these same records manually with the original system. Although all kinds of tools are available, there is no replacement for manual verification when testing a data migration.

10. Before you run a tool like the Data Loader to import data into your new instance, revisit the active workflows and triggers. Evaluate whether the workflows and triggers on the impacted objects should be disabled before uploading data. Often there are active workflows that send emails to customers when a certain stage is reached. Imagine the situation if unwanted or incorrect emails get sent to thousands of customers as you upload all the contacts.

11. Typically CreatedById, CreatedDate, LastModifiedById, and LastModifiedDate are read-only fields. However, during data migrations it is often desirable to insert records with legacy dates and IDs that match the source system. You can raise a case with Salesforce Support and they will make these fields writable for a limited time. Mention in the case that you would like to enable "Create Audit Fields".

12. When inserting date fields, the Salesforce Data Loader uses the time zone of the user inserting the records, so the user ID being used to insert records must have the proper time zone settings.
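One defensive approach to step 12 is to convert legacy timestamps to ISO 8601 strings with an explicit UTC offset before loading, so the result does not silently depend on the loading user's time zone. This is a sketch; the legacy value and its assumed UTC zone are made up:

```python
from datetime import datetime, timezone

# Hypothetical legacy value: a naive timestamp from the source system,
# here assumed to already be in UTC.
legacy_close_date = datetime(2017, 3, 31, 18, 30)

# Attach the zone explicitly and emit an ISO 8601 string with an offset
# suffix, so the value is unambiguous regardless of user settings.
aware = legacy_close_date.replace(tzinfo=timezone.utc)
print(aware.strftime("%Y-%m-%dT%H:%M:%S.000%z"))  # 2017-03-31T18:30:00.000+0000
```

If the source system stored local times instead, the conversion to UTC has to happen here, before the offset is attached, or every date in the org will be shifted.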

13. Last but not least, make sure you take a backup of the existing data in the new Salesforce organization before you initiate the data migration.

Steps to validate data after migration:

  • Generate reports on source and target to verify record counts.
  • Check data integrity by verifying some test records.
  • Check whether duplicates already exist in the source; if you have already migrated duplicates, use one of the free AppExchange apps to identify and eliminate them.
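The record-count check above is easy to script. The two CSV strings here stand in for real source and target exports produced by your reporting tools:

```python
import csv
import io

# Stand-ins for the source-system export and a post-load target export.
source_csv = "Id,Name\n1,Acme\n2,Globex\n3,Initech\n"
target_csv = "Id,Name\nA,Acme\nB,Globex\nC,Initech\n"

def record_count(text):
    """Count data rows in a CSV, excluding the header."""
    return sum(1 for _ in csv.DictReader(io.StringIO(text)))

src_n, tgt_n = record_count(source_csv), record_count(target_csv)
assert src_n == tgt_n, f"count mismatch: source={src_n} target={tgt_n}"
print(src_n, tgt_n)  # 3 3
```

Counts alone are a weak signal (records can be wrong and still counted), which is why the spot-checking of individual records in the second bullet is still needed.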


Common challenges in data migration:

  1. DATA QUALITY: We may find that the data used in the legacy application is of poor quality for the new or upgraded application. In such cases, data quality has to be improved to meet business standards. Factors like flawed assumptions, data conversions after migration, invalid data entered in the legacy application itself, and poor data analysis lead to poor data quality. The result is high operational costs, increased data integration risks, and deviation from business objectives.
  2. DATA MISMATCH: Data migrated from the legacy to the new or upgraded application may turn out to be mismatched in the new one. This may be due to a change in data type, the format of data storage, or a redefinition of the purpose for which the data is used. The result is a large effort to either correct the mismatched data or accept it and adapt it to the new purpose.
  3. DATA LOSS: Data might be lost while migrating from the legacy to the new or upgraded application, in mandatory or non-mandatory fields. If the lost data is in non-mandatory fields, the record is still valid and can be updated again. But if a mandatory field's data is lost, the record itself becomes void and cannot be recovered. This results in significant data loss, and the data has to be retrieved either from the backup database or from audit logs, if they were captured correctly.
  4. DATA VOLUME: Huge data volumes may require more time to migrate than the downtime window of the migration activity allows. E.g.: scratch cards in the telecom industry, or users on an intelligent network platform. The challenge here is that by the time the legacy data has been migrated, a large amount of new data will have been created, which needs to be migrated again. Automation is the solution for huge data migrations.
  5. Simulation of a real-time environment (with the actual data): Simulating a real-time environment in the testing lab is another real challenge, where testers run into kinds of issues with the real data and the real system that never surface in testing. So data sampling, replication of the real environment, and identification of the volume of data involved in the migration are quite important while carrying out data migration testing.
  6. Simulation of the volume of data: Teams need to study the data in the live system very carefully and come up with a typical analysis and sampling of the data, e.g.: users in the age group below 10 years, 10-30 years, etc. As far as possible, data should be obtained from the live system; if not, data creation needs to be done in the testing environment. Automated tools should be used to create large volumes of data, and extrapolation can be applied wherever the volume cannot be simulated.


Ways to mitigate these challenges:

  • Standardize the data used in the legacy system, so that when migrated, standard data is available in the new system.
  • Enhance the quality of the data, so that when migrated, there is qualitative data to test, giving the feel of testing as an end user.
  • Clean the data before migrating, so that duplicate data is not carried into the new system and the entire system stays clean.
  • Recheck the constraints, stored procedures, and complex queries that must yield accurate results, so that when migrated, correct data is returned in the new system as well.
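A simple dedupe pass before loading might look like the sketch below. The matching rule (normalized name plus billing city) is an assumption for illustration; real projects use richer matching rules or a dedicated AppExchange tool:

```python
import csv
import io

def dedupe(rows, keys=("Name", "BillingCity")):
    """Split rows into unique records and duplicates, matching on
    the given columns after trimming whitespace and lowercasing."""
    seen, unique, dupes = set(), [], []
    for row in rows:
        fingerprint = tuple(row[k].strip().lower() for k in keys)
        if fingerprint in seen:
            dupes.append(row)
        else:
            seen.add(fingerprint)
            unique.append(row)
    return unique, dupes

rows = list(csv.DictReader(io.StringIO(
    "Name,BillingCity\nAcme,Pune\nACME ,pune\nGlobex,Mumbai\n"
)))
unique, dupes = dedupe(rows)
print(len(unique), len(dupes))  # 2 1
```

Keeping the rejected duplicates in a separate list, rather than discarding them, lets the business review what was dropped before the load is final.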
