Episodes
-
Want your real-time data streaming initiative to stick? Success hinges on more than pipelines; it's about people, governance, and business impact. Jeffrey Jonathan Jennings (J3), managing principal at signalRoom, shares how to bring it all together.
In this episode, J3 shares how he's used impactful proofs of concept to demonstrate value early, then scaled effectively by shifting left with governance and stronger cross-team collaboration.
You'll learn about:
- Why proofs of concept are key to securing buy-in and demonstrating ROI early
- How data governance as a shared language creates consistency across teams
- Strategies for establishing a data streaming center of excellence
- The role of business outcomes in guiding streaming data adoption strategies

If you're building or scaling a data streaming practice, this episode goes beyond the technology, showing you how to drive real impact.
About the Guest:
Jeffrey Jonathan Jennings is the managing principal of signalRoom, a dedicated father, avid traveler, and EDM enthusiast whose creativity and energy shape both his personal and professional life. As a cloud-native data streaming expert, he specializes in integrating ML/AI technologies to drive transformative change and improve business outcomes. With a focus on innovation, he designs scalable data architectures that enable real-time insights and smarter decision-making. Committed to continuous learning, Jeffrey stays ahead of technological advancements to help businesses navigate the evolving digital landscape and achieve lasting growth.
Guest Highlight:
"We need to speak the same language. The only way to speak the same language is to have a Schema Registry. I don't think there's an option. You just have to do this. We share a common language and therefore we build common libraries."
Episode Timestamps:
*(01:13) - J3's Data Streaming Journey
*(07:07) - Data Streaming Goodness: Strategies to Demonstrate Value
*(26:38) - The Runbook: Data Streaming Center of Excellence
*(37:00) - Data Streaming Street Cred: Improve Data Streaming Adoption
*(42:35) - Quick Bytes
*(45:00) - Joseph's Top 3 Takeaways
Dive Deeper into Data Streaming:
EP1 - Stream On: Unleashing Innovation with Data Streaming | Life Is But A Stream
EP2 - Processing Without Pause: Continuous Stream Processing and Apache Flink® | Life Is But A Stream
EP3 - The Connective Tissue: Shift Left to Turn Data Chaos to Clarity | Life Is But A Stream
Links & Resources:
Connect with Joseph: @thedatagiant
Joseph's LinkedIn: linkedin.com/in/thedatagiant
J3's LinkedIn: linkedin.com/in/jeffreyjonathanjennings/
J3's GitHub: github.com/j3-signalroom
"Fall in Love with the Problem, Not the Solution" by Uri Levine
"Ask Your Developer" by Jeff Lawson
Learn more at Confluent.io
Our Sponsor:
Your data shouldn't be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io.
-
Despite its value, legacy data can feel like a roadblock in a fast-paced digital world. Henry Schein One is clearing the path forward with real-time data streaming.
In this episode, Chris Kapp, Software Architect at Henry Schein One (HS1), shares how his team modernizes data management to stay competitive and unlock real-time insights.
You'll learn about:
- How a tagging strategy, an immutable audit log, and governance keep data secure and reliable
- The challenges (and wins) of getting leadership buy-in for data modernization
- HS1's approach to decentralized data ownership, domain-driven design, and the importance of stream processing for scaling
- The role of GenAI in the future of real-time stream processing

Get ready to future-proof your data strategy with this must-listen episode for technology leaders facing scalability, governance, or integration challenges.
About the Guest:
Chris Kapp is a Software Architect at Henry Schein One specializing in domain-driven design and event-driven patterns. He has 34 years of experience in the software industry, including roles at Target and Henry Schein One. He is passionate about teaching patterns for scalable data architectures. He's currently focused on the One-Platform initiative to allow Henry Schein applications to work together as a single suite of products.
Guest Highlight:
"It's important to collect the data, try to eliminate our biases, and go towards what is delivering quickly. The key is to start small, agile, iterative, and build something small with the people that are excited and willing to learn new things. If it doesn't work, then be agile, adjust, and find the thing that does work."
Episode Timestamps:
*(01:18) - Chris' Data Streaming Journey
*(03:35) - Data Streaming Goodness: AI-Driven Reporting & Data Streaming
*(21:08) - The Playbook: Data Revitalization & Event-Driven Architecture
*(31:55) - Data Streaming Street Cred: Executive Alignment & Engineering Collaboration
*(32:03) - Quick Bytes
*(40:14) - Joseph's Top 3 Takeaways
Links & Resources:
Connect with Joseph: @thedatagiant
Joseph's LinkedIn: linkedin.com/in/thedatagiant
Chris' LinkedIn: linkedin.com/in/chris-kapp-87868a4
Designing Event-Driven Systems eBook
Designing Data-Intensive Applications eBook
Current 2025 - The Data Streaming Event
Learn more at Confluent.io
Our Sponsor:
Your data shouldn't be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io.
-
In the final episode of our 3-part series on the basics of data streaming, we take a deep dive into data integration, covering everything from data governance to data quality.
Our guests, Mike Agnich, General Manager of Data Streaming Platform, and David Araujo, Director of Product Management at Confluent, explain why connectors are must-haves for integrating systems.
You'll learn:
- Why real-time ETL outperforms the old-school approach
- How shifting left with governance saves time and pain later
- The overlooked role of schemas in data quality
- And more…

About the Guests:
Mike Agnich is the General Manager and VP of Product for Confluent's Data Streaming Platform (DSP). Mike manages a product portfolio that includes stream processing, connectors and integrations, governance, partnerships, and developer tooling. Over the last six years at Confluent, Mike has held various product leadership roles spanning Apache Kafka®, Confluent Cloud, and Confluent Platform, working closely with customers, partners, and R&D to drive adoption and execution of Confluent products. Prior to his work at Confluent, Mike was the founder and CEO of Terrain Data (acquired by Confluent in 2018).
David Araujo is a Director of Product Management at Confluent, focusing on data governance with products such as Schema Registry, Data Catalog, and Data Lineage. He previously held positions at companies including Amobee, Turn, WeDo Technologies Australia, and Saphety, where he worked on various aspects of data management, analytics, and infrastructure. With a background in Computer Science from the University of Évora, David brings a strong foundation of technical expertise and leadership experience in the tech industry.
Guest Highlights:
"If a ton of raw data shows up on your doorstep, it's like shipping an unlabeled CSV into a finance organization and telling them to build their annual forecast. By shifting that cleaning and structure into streaming, we remove a massive amount of toil for our organizations⊠Instead of punting the problem down to our analytics friends, we can solve it because we're the ones that created the data." - Mike Agnich
"We've had data contracts in Kafka long before it became a buzzwordâwe called them schemas⊠But more recently, we've evolved this concept beyond just schemas. In streaming, a data contract is an agreement between producers and consumers on both the structure (schema) and the semantics of data in motion. It serves as a governance artifact, ensuring consistency, reliability, and quality while providing a single source of truth for understanding streaming data." - David Araujo
Links & Resources:
Connect with Joseph: @thedatagiant
Joseph's LinkedIn: linkedin.com/in/thedatagiant
Mike's LinkedIn: linkedin.com/in/magnich
David's LinkedIn: linkedin.com/in/davidaraujo
What Is a Data Streaming Platform (DSP)
Learn more at Confluent.io
Episode Timestamps:
*(02:00) - Mike and David's Journey in Data Streaming
*(13:55) - Data Streaming 101: Data Integration
*(40:06) - The Playbook: Tools & Tactics for Data Integration
*(53:25) - Voices from the World of Data Streaming
*(59:33) - Quick Bytes
*(1:05:20) - Joseph's Top 3 Takeaways
Our Sponsor:
Your data shouldn't be a problem to manage. It should be your superpower. The Confluent Data Streaming Platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true Data Streaming Platform from the pioneers in data streaming. Learn more at confluent.io.
-
We're diving even deeper into the fundamentals of data streaming to explore stream processing: what it is, the best tools and frameworks, and its real-world applications.
Our guests, Anna McDonald, Distinguished Technical Voice of the Customer at Confluent, and Abhishek Walia, Staff Customer Success Technical Architect at Confluent, break down what stream processing is, how it differs from batch processing, and why tools like Flink are game changers.
You'll learn:
- The key differences between stream and batch processing
- How different frameworks like Flink, Kafka Streams, and ksqlDB approach stream processing
- The role of POCs and observability in real-time data workflows
- And more…

About the Guests:
Anna McDonald is the Distinguished Technical Voice of the Customer at Confluent. She loves designing creative solutions to challenging problems. Her focus is on event-driven architectures, reactive systems, and Apache Kafka®.
Abhishek Walia is a Staff Customer Success Technical Architect at Confluent. He has years of experience implementing innovative, performance-driven, and highly scalable enterprise-level solutions for large organizations. Abhishek specializes in architecting, designing, developing, and delivering integration solutions across multiple platforms.
Guest Highlights:
"Flink is more approachable because it blends approaches together and says, 'If you need this, you still can use this.' It's the most powerful at this point." - Abhishek Walia
"If you're somebody who's ever gone from normal to eventing, at some point you probably would have gone, 'When does [the data] stop?' It doesn't stop." - Anna McDonald
"Start with a fully managed service. That's probably going to save a lot of cycles for you." - Abhishek Walia
Episode Timestamps:
*(01:35) - Anna & Abhishek's Journey in Data Streaming
*(12:30) - Data Streaming 101: Stream Processing
*(26:30) - The Playbook: Tools & Tactics for Stream Processing
*(50:20) - Voices from the World of Data Streaming
*(56:13) - Quick Bytes
*(58:57) - Top 3 Takeaways
Links & Resources:
Connect with Joseph: @thedatagiant
Joseph's LinkedIn: linkedin.com/in/thedatagiant
Anna's LinkedIn: linkedin.com/in/jbfletch
Abhishek's LinkedIn: linkedin.com/in/abhishek-walia
Introducing Derivative Event Sourcing
Designing Event-Driven Systems
Learn more at Confluent.io
Our Sponsor:
Your data shouldn't be a problem to manage. It should be your superpower. The Confluent Data Streaming Platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true Data Streaming Platform from the pioneers in data streaming. Learn more at confluent.io.
-
Real-time data streaming is shaking up everything we know about modern data systems. If you're ready to dive in but unsure where to begin, no worries. That's why we're here.
Our first episode breaks down the basics of data streaming: from what it is to its pivotal role in processing and transferring data in a fast-paced digital environment. Your guide is Tim Berglund, VP of Developer Relations at Confluent, where he and his team work to make streaming data and its emerging toolset accessible to all developers.
You'll learn:
- The fundamentals of data streaming
- Data streaming advantages vs. other technologies
- What an Event-Driven Architecture (EDA) is
- And much more…

About the Guest:
Tim Berglund serves as the VP of Developer Relations at Confluent, where he and his team work to make streaming data and its emerging toolset accessible to all developers. He is a regular speaker at conferences and a presence on YouTube explaining complex technology topics in an accessible way.
Guest Highlights:
"The basic intellectual habit that you have in building a data streaming system isn't first, 'What are the things?' But it's, 'What is happening?'"
"With batch processing, I've got my data in a pile: I know where it starts, where it ends, and I can work through it. With streaming, it's not a pile, it's a pipe."
"The future of data streaming is real-time everything: flows, insights, and actions. There's no more 'take the data here and think about it later.' The insight is now, ready to be consumed by anyone who needs it. Businesses built on this model can respond to the world as it changes, right away."
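Tim's pile-versus-pipe contrast can be sketched in a few lines of Python (a toy illustration only, not Flink or Kafka Streams code; `batch_total` and `running_total` are hypothetical names): a batch job reduces a bounded collection once it has all the data, while a stream processor emits an updated result for every event as it arrives.

```python
from typing import Iterable, Iterator

def batch_total(pile: list[int]) -> int:
    """Batch: the data is a bounded 'pile' with a known start and end,
    so we can compute one final answer."""
    return sum(pile)

def running_total(pipe: Iterable[int]) -> Iterator[int]:
    """Stream: the data is an unbounded 'pipe'; emit a result per event
    instead of waiting for an end that may never come."""
    total = 0
    for event in pipe:
        total += event
        yield total

print(batch_total([3, 1, 4]))          # 8, once the pile is complete
print(list(running_total([3, 1, 4])))  # [3, 4, 8], one answer per event
```

The same generator would keep producing answers if the input were an endless feed, which is exactly the property batch code lacks.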
Episode Timestamps:
*(01:44) - Tim's Journey in Data Streaming
*(14:35) - Data Streaming 101: Unlocking the Power of Data
*(38:56) - The Playbook: Tools & Tactics for Data Streaming
*(49:00) - Voices from the World of Data Streaming
*(53:35) - Quick Bytes
*(57:10) - Top 3 Takeaways
Links & Resources:
Connect with Joseph: @thedatagiant
Joseph's LinkedIn: linkedin.com/in/thedatagiant
Tim's LinkedIn: linkedin.com/in/tlberglund
Explore the 2024 Data Streaming Report
Learn more at Confluent.io
Our Sponsor:
Your data shouldn't be a problem to manage. It should be your superpower. The Confluent Data Streaming Platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true Data Streaming Platform from the pioneers in data streaming. Learn more at confluent.io.
-
Life Is But A Stream is the web show for tech leaders who must respond the second data is created. This show is your guide to staying ahead of the curve and leading the data revolution, not getting left behind.
Data streaming isn't just a buzzword; it's the driving force behind instant analytics, enhanced customer experiences, and cutting-edge innovation. From foundational concepts to advanced techniques like stream processing and real-time data integration, this web show provides the insights you need to revolutionize your business.
Hosted by Joseph Morais, Technical Champion and Data Streaming Evangelist at Confluent, each episode features technical leaders and industry pioneers sharing real examples of how theyâre leveraging real-time data and event-driven architecture to transform their operations. Learn strategies to accelerate decision-making, amplify business impact, and gain a competitive edge.
Lead the data revolution: subscribe to Life Is But A Stream today.
Watch on YouTube or listen on Apple Podcasts, Spotify, or your favorite audio platform.
Powered by the data streaming experts at Confluent.
Links:
Watch and Subscribe on YouTube: www.youtube.com/@Confluent
Listen and Subscribe on: Apple Podcasts, Spotify, Amazon Music
Learn more at Confluent.io
Our Sponsor:
Confluent is pioneering a fundamentally new category of data infrastructure focused on data in motion. Our cloud-native offering is the foundational platform for data in motion, designed to be the intelligent connective tissue enabling real-time data, from multiple sources, to constantly stream across the organization. With Confluent, our customers can meet the new business imperative of delivering rich, digital customer experiences and real-time business operations. Our mission is to help every organization harness data in motion so they can compete and thrive in the modern world. Learn more at Confluent.io.