Abhijit,
There are no specific features that activate when a data load crosses a particular size threshold, but note that the Console Data Loader is specifically designed for scalability. In other words, you will not see a linear increase in job execution time as the number of records/entities in the job grows: the Data Loader analyzes the job's size and complexity during pre-processing and allocates resources accordingly. That said, as a best practice we often recommend "chunking" very large data loads into smaller batches, simply to reduce the chance of a single point of failure impacting the entire load. In your case, for instance, it might make sense to break the 300M-record job into, say, 6 jobs of 50M records apiece.
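As a rough illustration of the chunking idea, here is a minimal Python sketch that splits one large CSV export into smaller batch files, each of which could then be submitted as a separate Data Loader job. This is not a Reltio API; the function name, file layout, and chunk size are all illustrative, and it assumes a CSV source with a single header row.

```python
import csv
import os

def chunk_csv(src_path, out_dir, rows_per_chunk):
    """Split src_path into numbered CSV files of at most rows_per_chunk
    data rows each, repeating the header row in every chunk.
    Returns the list of chunk file paths (hypothetical helper)."""
    os.makedirs(out_dir, exist_ok=True)
    chunk_paths = []
    out = None
    writer = None
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)  # assumes first row is the header
        for i, row in enumerate(reader):
            if i % rows_per_chunk == 0:
                # Start a new chunk file every rows_per_chunk rows.
                if out:
                    out.close()
                path = os.path.join(out_dir, f"chunk_{len(chunk_paths):03d}.csv")
                chunk_paths.append(path)
                out = open(path, "w", newline="")
                writer = csv.writer(out)
                writer.writerow(header)
            writer.writerow(row)
        if out:
            out.close()
    return chunk_paths
```

For a 300M-record file, calling this with `rows_per_chunk=50_000_000` would produce the six batch files suggested above, which you could then load one at a time.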
------------------------------
Best regards,
Jack Hain
Senior Product Manager
Reltio, Inc.
------------------------------
Original Message:
Sent: 02-16-2023 10:17
From: Abhijit Auddy
Subject: What features are available in Reltio if we need to load 300 million records as an initial data load?
What features are available in Reltio if we need to load 300 million records as an initial data load? A normal load via the Data Loader may take a huge amount of time.
------------------------------
Abhijit Auddy
Cognizant
------------------------------