
Data Loader

It is a data management tool that works for both Standard and Custom
objects. It can process more than 50,000 records, which the Import Wizard
cannot do.

When you click the Export All button, the exported csv/excel file contains an
IsDeleted column. That column tells whether a record is active in the object
or soft deleted (in the Recycle Bin). If IsDeleted = true, the record was
exported from the Recycle Bin; if IsDeleted = false, the record is active.

Limits:

- Can process a maximum of 5 million records at a time.
- Duplicates cannot be ignored.
- Doesn't support the User object.
- Can't work on more than one object at a time.
- Maximum batch size is 10,000.

Sign In options:

There are two options to sign in to Data Loader: 1) Password Authentication
and 2) OAuth.

When you click Password Authentication, the following wizard opens. Enter
your username, and in the password field enter your password followed by
your Security Token, with no space in between. The security token can be
generated from your profile settings.
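For illustration, a minimal Python sketch of how that password field value is
formed (the password and token below are hypothetical placeholders):

    # Hypothetical credentials, for illustration only
    password = "MyP@ssw0rd"
    security_token = "AbC123xYz"

    # The password field expects password + token, with no space in between
    password_field_value = password + security_token
    print(password_field_value)  # MyP@ssw0rdAbC123xYz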

When you click OAuth, the following wizard opens. There you need to select
where you want to load your data to, or download it from. You will see two
options: Production and Sandbox. Select one of them and another wizard
opens. Enter your Salesforce org's login id and password, click Login, then
click Allow Access, and the login process is done.
Settings in Data Loader:

*** Where you want to connect your Data Loader, either to Sandbox or to
Production (Live Environment), should be written in the “Server Host” text
area in the settings of Data Loader.

If you want to connect to the Sandbox (Development), use
https://test.salesforce.com, and if you want to connect to Production (Live
Environment), use https://login.salesforce.com.
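The same sandbox-versus-production choice exists when logging in
programmatically. A minimal sketch using the third-party simple_salesforce
Python library (an assumption for illustration, not part of Data Loader
itself), where domain="test" routes the login through test.salesforce.com
and the default routes it through login.salesforce.com:

    from simple_salesforce import Salesforce

    # Sandbox logins go through https://test.salesforce.com (domain="test");
    # production logins go through https://login.salesforce.com (the default).
    sf = Salesforce(
        username="[email protected]",  # hypothetical credentials
        password="MyP@ssw0rd",
        security_token="AbC123xYz",
        domain="test",                 # omit this line for production
    )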

*** “Batch Size” means the number of records moved/transferred from the
source file to the destination at a time. The default batch size is 200, and
the maximum is 10,000, no matter what number greater than 10k you enter.

While transferring records, a lot of code (automation such as triggers and
workflow rules) might be running in the background for these records; this
puts extra load on the server and you may get a “server time out” error. The
best practice here is to keep the batch size low.

For example, if you are getting the error at batch size 200, reduce it to
100; if you still get the error, reduce it to 90, and so on.
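A minimal Python sketch of that back-off idea (load_batch is a hypothetical
stand-in for whatever call actually pushes records to the server, and the
size is halved here rather than stepped down manually, just for simplicity):

    # Sketch: shrink the batch size until the server stops timing out.
    def load_with_backoff(records, load_batch, batch_size=200, min_size=10):
        while batch_size >= min_size:
            try:
                for i in range(0, len(records), batch_size):
                    load_batch(records[i:i + batch_size])
                return batch_size  # succeeded at this batch size
            except TimeoutError:
                batch_size //= 2   # server timed out: try a smaller batch
        raise RuntimeError("Server keeps timing out even at the minimum batch size")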

*** There is just a small difference between the Export and Export All
buttons on the Data Loader interface.
Export pulls the data that is present in the object itself, while Export All
pulls the data present in the object as well as the data related to that
object sitting in the Recycle Bin.
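In query terms, Export corresponds to an ordinary query and Export All to a
query that also returns soft-deleted rows. A sketch with simple_salesforce
(its include_deleted=True flag asks for Recycle Bin records too; assumes the
authenticated sf connection from the earlier sketch):

    # Export: only active records
    active = sf.query("SELECT Id, Name FROM Account")

    # Export All: active records plus soft-deleted ones from the Recycle Bin
    everything = sf.query("SELECT Id, Name, IsDeleted FROM Account",
                          include_deleted=True)

    # IsDeleted = true marks rows that came from the Recycle Bin
    deleted = [r for r in everything["records"] if r["IsDeleted"]]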
Situations when to use Export All:
1) In some banking/financial projects where we needed to keep an “audit” of
“everything”, even cancelled or deleted transactions/deals. To be on the
safe side, we downloaded all data, even from the Recycle Bin, and stored it
somewhere else.
2) In some “data-sensitive projects”, if anyone deletes a record by mistake,
Export All helped us keep a backup of that data while it still sat in the
Recycle Bin, because after some days all content in the Recycle Bin is wiped
out.

*** In settings, if you check the Insert Null Values checkbox, then for any
cell in the csv that is blank or has no value, Data Loader will clear the
corresponding field in the Salesforce table, even if that field currently
holds a value.

If this checkbox is unchecked, a blank or null cell in the csv file does not
affect the field in the Salesforce table at all. For example, with the
checkbox checked, a blank Phone cell wipes the record's Phone field; with it
unchecked, the existing Phone value stays.

The Insert Null Values option can only be checked or unchecked while the Use
Bulk API checkbox is unchecked. If Use Bulk API is checked, the Insert Null
Values option is disabled, and blank or null values won't be inserted.
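A small Python sketch of that rule, purely as a mental model of the setting
(SKIP is a made-up sentinel meaning "leave the field as it is"; this is not
Data Loader's actual code):

    SKIP = object()  # sentinel: leave the existing field value untouched

    def value_to_load(cell, insert_null_values):
        if cell in ("", None):
            # checked: clear the field; unchecked: skip the field entirely
            return None if insert_null_values else SKIP
        return cell

    print(value_to_load("", True) is None)   # True  -> field gets wiped
    print(value_to_load("", False) is SKIP)  # True  -> field left unchanged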

Working with Commands:

After every operation performed with Data Loader, two files are created on
the local disk: one Success file and one Error file. When you perform an
Upsert operation, the “Status” column in the Success file says ‘Item
Updated’ for records in the csv file that were there to be updated, and
‘Item Created’ for records that were there to be inserted.
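For example, a short Python sketch that tallies the Status column of a
success file (the file name and exact column header are assumptions; check
the files Data Loader actually wrote on your disk):

    import csv
    from collections import Counter

    # Hypothetical file name; Data Loader reports the real one after the run.
    with open("success042023.csv", newline="") as f:
        statuses = Counter(row["Status"] for row in csv.DictReader(f))

    # e.g. Counter({'Item Updated': 120, 'Item Created': 80}) after an Upsert
    print(statuses)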
