Thanks for the reply, Ankur. Yes, we do have the Snowflake connector in place, and Reltio data is being written to Snowflake. However, I'm looking for guidance on using Snowflake as the source of the data we need to master in Reltio: we have multiple systems with customer data, but we don't want to connect to each of these platforms individually because we already have a Snowflake data replication process in place. The question, then, is whether we can use RIH to extract the data in bulk directly from Snowflake, or whether we need to use MuleSoft/Kafka streaming to get to the data. We also have a requirement that this must run in bulk on an hourly/daily schedule and be able to determine the delta for that period that needs to be processed as new/updated source records.
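For what it's worth, if RIH (or whatever scheduler drives the job) can issue arbitrary SQL against Snowflake, the hourly/daily delta can often be handled with a simple timestamp-watermark query (Snowflake Streams are another option for change tracking). A minimal Python sketch of building such a query, assuming a hypothetical CUSTOMER table with a LAST_UPDATED_TS column; all names here are illustrative, not Reltio or RIH APIs:

```python
from datetime import datetime, timezone

def build_delta_query(table: str, watermark_col: str, since: datetime) -> str:
    """Build a Snowflake SELECT that pulls only rows changed since the
    last successful run (a simple timestamp-watermark approach)."""
    ts = since.strftime("%Y-%m-%d %H:%M:%S")
    return (
        f"SELECT * FROM {table} "
        f"WHERE {watermark_col} >= TO_TIMESTAMP_NTZ('{ts}')"
    )

# Example: pull the last hour's changes from a hypothetical CUSTOMER table.
since = datetime(2026, 3, 3, 11, 0, tzinfo=timezone.utc)
sql = build_delta_query("CRM_DB.PUBLIC.CUSTOMER", "LAST_UPDATED_TS", since)
print(sql)
```

The watermark (the `since` value) would be persisted after each successful run so the next run picks up exactly where the previous one left off.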
Original Message:
Sent: 03-03-2026 12:48
From: Ankur Gupta
Subject: Guidance on Data Ingestion and Consumption Between Snowflake and Reltio MDM
Hello Christo and Rushyendar,
Could you please help clarify the following:
- For ingestion: what data do you plan to bring in? If it is just interactions, we have zero-copy integration with Snowflake. Please check the following documentation.
- For consumption: we have a Reltio Data Pipeline for Snowflake offering. Please see the following documentation.
Please let us know if you have any further questions on the above.
------------------------------
Ankur Gupta
Original Message:
Sent: 03-02-2026 10:29
From: Christo Pretorius
Subject: Guidance on Data Ingestion and Consumption Between Snowflake and Reltio MDM
Hi
I have the same requirement: to consume data from Snowflake into Reltio. It was my understanding that the connector only works from Reltio to Snowflake (as you indicated), but I need to create a pipeline where Snowflake is the source, with the ability to process data deltas after an initial sync done with CSV files through the bulk loader. Is there any guidance on how to accomplish this through the RIH utility? I would hate to look at other solutions, like MuleSoft, if this could be accomplished directly.
Regards
------------------------------
Christo Pretorius
Global Payments
Dublin
Original Message:
Sent: 01-24-2025 10:20
From: Curt Pearlman
Subject: Guidance on Data Ingestion and Consumption Between Snowflake and Reltio MDM
Hi Rushyendar,
If I follow you correctly, your bullet "Leverage the Snowflake connector in Reltio Integration Hub (RIH) to set up near real-time updates" under the "Data Ingestion (Snowflake to Reltio)" section suggests a possible misunderstanding of the Reltio Snowflake Connector. The connector is unidirectional only, delivering data FROM Reltio TO Snowflake. It does not support updates from Snowflake back into Reltio.
Let me know if I've misinterpreted your steps.
Curt
------------------------------
Curt Pearlman
Reltio
Original Message:
Sent: 01-23-2025 21:04
From: Rushyendar Akula
Subject: Guidance on Data Ingestion and Consumption Between Snowflake and Reltio MDM
Hello Reltio Community,
I am currently working on a project where we need to set up data ingestion and consumption between our Snowflake database and Reltio MDM, and I'd like to get your feedback on the approach we're considering.
Data Ingestion (Snowflake to Reltio):
Initial Load:
- Use an ETL tool to extract data from Snowflake and create files.
- Load these files into Reltio using the Data Loader tool.
Incremental Load:
- Leverage the Snowflake connector in Reltio Integration Hub (RIH) to set up near real-time updates.
- Snowflake updates happen in batch mode, with data changes often in the range of a few hundred thousand records per batch.
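The extract-to-file stage of the initial load above can be sketched as splitting the extracted rows into CSV batches sized for the Data Loader. A minimal Python sketch; the batch size, column names, and in-memory output are illustrative assumptions, not Reltio requirements:

```python
import csv
import io

def write_csv_batches(rows, fieldnames, batch_size=2):
    """Split extracted rows into CSV batches for a bulk loader.
    Returns one CSV string per batch (in practice, write each to a file)."""
    batches = []
    for start in range(0, len(rows), batch_size):
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows[start:start + batch_size])
        batches.append(buf.getvalue())
    return batches

# Hypothetical extracted customer rows.
rows = [
    {"id": "1", "name": "Acme"},
    {"id": "2", "name": "Globex"},
    {"id": "3", "name": "Initech"},
]
batches = write_csv_batches(rows, ["id", "name"])
print(len(batches))  # 3 rows with batch_size=2 -> 2 batches
```

For volumes in the hundreds of thousands of records per batch (as mentioned below), a much larger batch size and streaming writes to disk would be the practical choice.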
Data Consumption (Reltio to Snowflake):
For sending data back to Snowflake from Reltio, here's my understanding:
Initial Export:
- Similar to the initial data load, we can use an ETL tool to export data from Reltio and send it to Snowflake.
Incremental Updates:
- The Snowflake connector in Reltio Integration Hub (RIH) can be used to send incremental updates directly to Snowflake.
Questions for the Community:
Data Ingestion:
- Is our proposed approach for data ingestion correct?
- Some data architects are questioning the need to create files for incremental loads when real-time integration via the Snowflake connector is available. Are there better approaches or best practices we should consider?
Data Consumption:
- Can the Snowflake connector in Reltio Integration Hub handle sending both incremental updates and large initial exports effectively?
- Are there alternative approaches for exporting data to Snowflake that are more efficient or commonly used?
I'd really appreciate any insights, validation, or suggestions you can share to improve our understanding and implementation of this process.
Looking forward to learning from your experiences!
------------------------------
Rushy A
------------------------------