Blogs

Community Show: Stream Kafka with confidence for real-time, AI-ready operations

By Sara Brams-Miller posted 12 hours ago

  

Our most recent community show explored how organizations can stream trusted data with more speed and less complexity using Confluent Kafka support in Reltio external queues. The session, led by Karthikeyan Mani from Product Management, walked through the new capability, key configuration steps, filter options, and a live demo showing how unified data can move in real time to downstream systems.

Real-time streaming for faster, more reliable operations

Karthik opened by framing the challenge many teams face today: real-time operations and AI initiatives depend on timely, reliable data reaching the right systems at the right moment. While Reltio already supports several event streaming ecosystems, this release adds out-of-the-box support for Confluent Kafka—giving customers a more direct way to stream trusted data to the systems, applications, and pipelines that rely on it.

Key takeaways from the session

  • Confluent Kafka support: Reltio external queues now support Confluent Kafka as part of the April release

  • Direct streaming: Customers can stream trusted, unified data directly to Kafka without relying on intermediate queue systems and ETL handoffs

  • Event-driven activation: Create, update, delete, and merge events can trigger downstream processing in near real time

  • Flexible configuration: Teams can configure payload formats, object filters, event filters, field selection, and payload types

  • Monitoring options: Users can track outbound events through external queue monitoring and activity logs inside Reltio

  • Customer-managed Kafka: Customers bring their own Confluent Kafka environment, including broker, topic, API key, and secret

This new support helps reduce infrastructure complexity, minimize inconsistency across systems, and align with enterprise streaming standards already in place.
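On the consumer side, the event types listed above (create, update, delete, merge) lend themselves to a simple dispatch pattern. A minimal sketch in Python, assuming a JSON message with a `type` field; the event-type strings, handler names, and message shape here are illustrative assumptions, not Reltio's actual payload schema:

```python
import json

# Illustrative handlers -- in practice these might update a search index,
# a cache, or a downstream application.
def on_create(entity):
    return f"indexed {entity['id']}"

def on_update(entity):
    return f"re-indexed {entity['id']}"

def on_delete(entity):
    return f"removed {entity['id']}"

def on_merge(entity):
    return f"re-indexed merged profile {entity['id']}"

# Assumed event-type names -- check your tenant's event documentation
# for the exact values emitted by the external queue.
HANDLERS = {
    "ENTITY_CREATED": on_create,
    "ENTITY_CHANGED": on_update,
    "ENTITY_REMOVED": on_delete,
    "ENTITIES_MERGED": on_merge,
}

def dispatch(raw_message: str) -> str:
    """Route one Kafka message to the handler for its event type."""
    event = json.loads(raw_message)
    handler = HANDLERS.get(event["type"])
    if handler is None:
        return f"skipped unknown event type {event['type']}"
    return handler(event["object"])
```

In a real consumer loop, `dispatch` would be called once per message polled from the topic.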

A practical walkthrough of setup and streaming

The session covered the essentials required to get started. Karthik explained that users must first enable message streaming and the message streaming API in tenant management. He also highlighted an important detail: JMS event filtering fields configured at the tenant level can take precedence over the fields selected in the external queue configuration.

From there, the session walked through the Confluent Kafka setup requirements, including:

  • Topic name

  • Broker bootstrap server and port

  • API key and secret

  • Required user roles and access
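On the client side, the same connection details (bootstrap server, API key, and secret) map onto standard Confluent Kafka client properties. A minimal sketch of a consumer configuration for the `confluent-kafka` Python client, with placeholder values; Confluent Cloud API keys authenticate as SASL/PLAIN credentials over TLS:

```python
# Placeholder values -- substitute your own cluster and credential details.
BOOTSTRAP = "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092"
API_KEY = "MY_API_KEY"
API_SECRET = "MY_API_SECRET"

consumer_config = {
    "bootstrap.servers": BOOTSTRAP,
    "security.protocol": "SASL_SSL",   # Confluent Cloud requires TLS
    "sasl.mechanisms": "PLAIN",        # API key/secret act as SASL credentials
    "sasl.username": API_KEY,
    "sasl.password": API_SECRET,
    "group.id": "reltio-events-consumer",   # illustrative group name
    "auto.offset.reset": "earliest",
}

# With confluent-kafka installed, the config would be used as:
#   from confluent_kafka import Consumer
#   consumer = Consumer(consumer_config)
#   consumer.subscribe(["reltio-events"])  # the topic configured in the external queue
```

The topic name in `subscribe` is whatever was entered in the external queue setup; the rest of the properties are standard librdkafka configuration keys.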

The demo then showed how to configure an external queue, create and update a profile in Reltio, and immediately view those payloads arriving in Kafka. It also highlighted available payload options such as snapshot, selected fields, delta, and snapshot with delta.
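The difference between the snapshot and delta payload types can be illustrated with a small example: a snapshot carries the full current state of a profile, while a delta carries only what changed. A conceptual sketch, with illustrative dict shapes that are not Reltio's actual payload format:

```python
def delta(before: dict, after: dict) -> dict:
    """Return only the fields that changed between two versions of a profile."""
    changed = {k: v for k, v in after.items() if before.get(k) != v}
    removed = {k: None for k in before if k not in after}
    return {**changed, **removed}

before = {"firstName": "Ada", "lastName": "Lovelace", "city": "London"}
after = {"firstName": "Ada", "lastName": "Lovelace", "city": "Paris"}

snapshot_payload = after                # full current state of the profile
delta_payload = delta(before, after)    # only the changed field(s)
snapshot_with_delta = {"snapshot": after, "delta": delta_payload}
```

A delta payload keeps messages small for high-churn profiles, while a snapshot spares consumers from having to reassemble state; snapshot-with-delta combines both at the cost of larger messages.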

Questions from the community

The Q&A reinforced the value of this launch for customers already using Kafka as their enterprise streaming standard. Karthik noted that many teams previously had to route events through supported queue systems and then move data into Kafka with additional tools. With direct Kafka support, customers can simplify that architecture and avoid extra infrastructure and handoffs.

The discussion also covered roadmap interest in OAuth-based authentication, monitoring considerations, and the current availability of the feature in production as part of the April upgrade.

Conclusion

This community show offered a practical look at how direct Confluent Kafka support can help organizations activate trusted data in real time for downstream applications, analytics, and AI-driven workflows. From simplified architecture to flexible filtering and monitoring, the session highlighted how Reltio can help teams build a stronger real-time data foundation for faster decisions and more confident operations.

#CommunityWebinar

#Featured

#AIML
