Reltio Connect

  • 1.  Reltio Streaming Events: Aggregation and Traffic Reduction Guidelines and Best Practices: Webinar Follow-up

    Posted 08-01-2021 09:01
    I attended the 7/29 webinar "Event-driven integration with Reltio, including a look at Reltio with Kafka" and posted this per recommendation during webinar to do so.

    We are implementing a co-existence style and looking to feed Reltio changes both back to the contributing applications and to additional destination(s) for centralized storage of the domains we are working with.

    We are currently working with the Organization and Contact domains, connected by a single relationship supported by a reference attribute pointing to a generated ID at the Organization level.

    We are planning to use Reltio Streaming to send events to an SQS queue, and then leverage the Talend ETL tool to process the queued entries and generate canonical JSON documents in a format that clearly shows, for each attribute, the operational values, the contributing crosswalks, and the RDM-translated values in each crosswalk, along with an identification of the changed attributes and the type of change(s) reflected in the generated JSON payload. For example, the type of change would include Create Entity, Change Entity, etc.

    We would like to minimize the traffic to consuming applications by aggregating co-occurring and closely occurring events related to the same entity or related entities.
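    To illustrate the kind of aggregation we have in mind, here is a minimal Python sketch that collapses a batch of queue entries into one record per entity, keeping the latest timestamp and the ordered list of change types seen in the burst. The event fields and type names are illustrative only, not Reltio's actual payload schema:

    ```python
    def aggregate_events(events):
        """Collapse a batch of events into one per entity.

        Each event is a dict with (hypothetical) keys 'entity_id',
        'type', and 'timestamp'. The aggregated record keeps the
        newest timestamp and every change type seen, in order.
        """
        merged = {}
        for ev in sorted(events, key=lambda e: e["timestamp"]):
            slot = merged.setdefault(ev["entity_id"], {"types": []})
            slot["entity_id"] = ev["entity_id"]
            slot["timestamp"] = ev["timestamp"]  # newest wins after sort
            slot["types"].append(ev["type"])
        return list(merged.values())
    ```

    A consumer would then receive a single message per entity per batch, with the change-type history preserved so it can still distinguish, say, a create-then-change burst from a plain change.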

    Consuming applications will need to reshuffle data to reflect Reltio merge operations and also manage the relationship between Organization and Contact.

    We are looking for guidelines, best practices, suggestions, and/or warnings to refine our planned approach.

    ------------------------------
    Mark Burlock
    Dodge Data & Analytics
    Hamilton NJ
    ------------------------------


  • 2.  RE: Reltio Streaming Events: Aggregation and Traffic Reduction Guidelines and Best Practices: Webinar Follow-up

    Reltio Employee
    Posted 08-02-2021 17:51

    Mark,
       How would you define closely occurring events?  Is there a time window that you are looking to achieve this in?  If so, you could read groups of messages from the queue in order to process them downstream.  That does force you into a much more batch-centric view of the integration architecture though. 

    I would suggest as an alternative that processing all events in near real time will save you processing power and get your data to the business faster.  Under normal real-time loads, it's unlikely that a single object is being updated over and over and over, but even if this does happen, your consumers can use the event timestamps on the messages to ignore messages whose updated values they already have, if that is desired.
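    The timestamp check described above can be sketched in a few lines of Python. This is a hypothetical consumer-side filter, not Reltio or SQS API code; it keeps a high-water mark per entity and drops anything at or below it:

    ```python
    def should_process(event, last_seen):
        """Return True if this event is newer than the latest timestamp
        already processed for its entity; otherwise drop it as stale.

        `last_seen` maps entity_id -> newest timestamp applied downstream,
        and is updated as a side effect when the event is accepted.
        """
        eid, ts = event["entity_id"], event["timestamp"]
        if ts <= last_seen.get(eid, float("-inf")):
            return False  # stale or duplicate: consumer already has newer data
        last_seen[eid] = ts
        return True
    ```

    Because SQS standard queues can deliver messages out of order or more than once, a check like this also doubles as a dedup guard on the consumer side.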


    ------------------------------
    Mike Frasca
    ------------------------------



  • 3.  RE: Reltio Streaming Events: Aggregation and Traffic Reduction Guidelines and Best Practices: Webinar Follow-up

    Posted 08-03-2021 12:52
    Mike,

    Thinking of closely occurring as within a couple of minutes; the use case of biggest concern would be an auto-merge separately performing multiple mini-merges until it gets to "done".

    Leaning away from batch and will consider timestamp filtering by downstream applications. 

    We'll also likely include a delivery delay of about 30 seconds or so (time to be adjusted based on experience) on the Reltio event queue.
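    One way to realize that delay is a settle buffer: hold the latest event per entity and release it only after no new change has arrived for that entity within the delay window, so an auto-merge's chain of mini-merges collapses into one outbound message. This is a hypothetical sketch with illustrative field and type names, not Reltio's schema:

    ```python
    class DelayBuffer:
        """Buffer the latest event per entity; flush an entity's event only
        once it has been quiet for `delay` seconds, coalescing bursts."""

        def __init__(self, delay=30):
            self.delay = delay
            self.pending = {}  # entity_id -> (last_arrival_time, event)

        def add(self, event, now):
            # A newer event for the same entity replaces the buffered one
            # and resets its settle timer.
            self.pending[event["entity_id"]] = (now, event)

        def flush(self, now):
            ready = [eid for eid, (t, _) in self.pending.items()
                     if now - t >= self.delay]
            return [self.pending.pop(eid)[1] for eid in ready]
    ```

    The trade-off is added latency (up to the delay window) in exchange for downstream consumers seeing only the settled state of a merge chain rather than each intermediate step.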

    Thanks for your response.

    Mark

    ------------------------------
    Mark Burlock
    Dodge Data & Analytics
    Hamilton NJ
    ------------------------------