Episodes

  • Want your real-time data streaming initiative to stick? Success hinges on more than pipelines—it’s about people, governance, and business impact. Jeffrey Johnathon Jennings (J3), managing principal at signalRoom, shares how to bring it all together.

    In this episode, J3 shares how he’s used impactful proofs of concept to demonstrate value early, then scaled effectively through shift left with governance and stronger cross-team collaboration.

    You’ll learn about:

    Why proofs of concept are key to securing buy-in and demonstrating ROI early
    How data governance as a shared language creates consistency across teams
    Strategies for establishing a data streaming center of excellence
    The role of business outcomes in guiding streaming data adoption strategies

    If you’re building or scaling a data streaming practice, this episode goes beyond the technology, showing you how to drive real impact.

    About the Guest:
    Jeffrey Johnathan Jennings is the managing principal of signalRoom, a dedicated father, avid traveler, and EDM enthusiast whose creativity and energy shape both his personal and professional life. As a cloud-native data streaming expert, he specializes in integrating ML/AI technologies to drive transformative change and improve business outcomes. With a focus on innovation, he designs scalable data architectures that enable real-time insights and smarter decision-making. Committed to continuous learning, Jeffrey stays ahead of technological advancements to help businesses navigate the evolving digital landscape and achieve lasting growth.

    Guest Highlight:
    “We need to speak the same language. The only way to speak the same language is to have a Schema Registry. I don't think there's an option. You just have to do this. We share a common language and therefore we build common libraries.”
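    J3’s point can be sketched in a few lines of code. This is an illustrative toy only, not Confluent Schema Registry’s actual API (which is a REST service): an in-memory registry that producing and consuming teams both resolve, so every team validates records against the same agreed-upon shape. The subject and field names are invented for the example.

    ```python
    # Toy "schema registry": one shared source of truth for record shapes.
    REGISTRY: dict[str, dict] = {}

    def register_schema(subject: str, schema: dict) -> None:
        """Teams publish the agreed-upon record shape under a subject name."""
        REGISTRY[subject] = schema

    def validate(subject: str, record: dict) -> bool:
        """Every team validates against the same registered schema."""
        schema = REGISTRY[subject]
        return set(record) == set(schema["fields"])

    # Producer and consumer both resolve "orders-value" to one schema,
    # so they literally speak the same language about an order.
    register_schema("orders-value", {"fields": ["order_id", "amount", "currency"]})

    assert validate("orders-value", {"order_id": 1, "amount": 9.99, "currency": "USD"})
    assert not validate("orders-value", {"order_id": 1})  # incomplete record rejected
    ```

    Because both sides consult one registry rather than private copies of the schema, common serialization and validation libraries can be built on top of it, which is the "common libraries" half of the quote.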

    Episode Timestamps:
    *(01:13) - J3’s Data Streaming Journey
    *(07:07) - Data Streaming Goodness: Strategies to Demonstrate Value
    *(26:38) - The Runbook: Data Streaming Center of Excellence
    *(37:00) - Data Streaming Street Cred: Improve Data Streaming Adoption
    *(42:35) - Quick Bytes
    *(45:00) - Joseph’s Top 3 Takeaways

    Dive Deeper into Data Streaming:

    EP1—Stream On: Unleashing Innovation with Data Streaming | Life Is But A Stream
    EP2—Processing Without Pause: Continuous Stream Processing and Apache Flink® | Life Is But A Stream
    EP3—The Connective Tissue: Shift Left to Turn Data Chaos to Clarity | Life Is But A Stream

    Links & Resources:

    Connect with Joseph: @thedatagiant
    Joseph’s LinkedIn: linkedin.com/in/thedatagiant
    J3’s LinkedIn: linkedin.com/in/jeffreyjonathanjennings/
    J3’s GitHub: github.com/j3-signalroom
    “Fall in Love with the Problem, Not the Solution” by Uri Levine
    “Ask Your Developer” by Jeff Lawson
    Learn more at Confluent.io

    Our Sponsor:
    Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io.

  • Despite its value, legacy data can feel like a roadblock in a fast-paced digital world—Henry Schein One is clearing the path forward with real-time data streaming.

    In this episode, Chris Kapp, Software Architect at Henry Schein One (HS1), shares how his team modernizes data management to stay competitive and unlock real-time insights.

    You’ll learn about:

    How a tagging strategy, an immutable audit log, and governance keep data secure and reliable
    The challenges (and wins) of getting leadership buy-in for data modernization
    HS1’s approach to decentralized data ownership, domain-driven design, and the importance of stream processing for scaling
    The role of GenAI in the future of real-time stream processing

    Get ready to future-proof your data strategy with this must-listen episode for technology leaders facing scalability, governance, or integration challenges.

    About the Guest:
    Chris Kapp is a Software Architect at Henry Schein One specializing in domain-driven design and event-driven patterns. He has 34 years of experience in the software industry at companies including Target and Henry Schein One. He is passionate about teaching patterns for scalable data architectures. He’s currently focused on the One-Platform initiative to allow Henry Schein applications to work together as a single suite of products.

    Guest Highlight:
    “It's important to collect the data, try to eliminate our biases and go towards what is delivering quickly. The key is to start small, agile, iterative, and build something small with the people that are excited and willing to learn new things. If it doesn't work, then be agile, adjust, and find the thing that does work.”

    Episode Timestamps:
    *(01:18) - Chris’ Data Streaming Journey
    *(03:35) - Data Streaming Goodness: AI-Driven Reporting & Data Streaming
    *(21:08) - The Playbook: Data Revitalization & Event-Driven Architecture
    *(31:55) - Data Streaming Street Cred: Executive Alignment & Engineering Collaboration
    *(32:03) - Quick Bytes
    *(40:14) - Joseph’s Top 3 Takeaways

    Links & Resources:

    Connect with Joseph: @thedatagiant
    Joseph’s LinkedIn: linkedin.com/in/thedatagiant
    Chris’ LinkedIn: linkedin.com/in/chris-kapp-87868a4
    Designing Event-Driven Systems eBook
    Designing Data-Intensive Applications eBook
    Current 2025—The Data Streaming Event
    Learn more at Confluent.io

    Our Sponsor:
    Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io.

  • In the final episode of our 3-part series on the basics of data streaming, we take a deep dive into data integration—covering everything from data governance to data quality.

    Our guests, Mike Agnich, General Manager of Data Streaming Platform, and David Araujo, Director of Product Management at Confluent, explain why connectors are must-haves for integrating systems.

    You’ll learn:

    Why real-time ETL outperforms the old-school approach
    How shifting left with governance saves time and pain later
    The overlooked role of schemas in data quality
    And more


    About the Guests:

    Mike Agnich is the General Manager and VP of Product for Confluent's Data Streaming Platform (DSP). Mike manages a product portfolio that includes stream processing, connectors and integrations, governance, partnerships, and developer tooling. Over the last six years at Confluent, Mike has held various product leadership roles spanning Apache Kafka®, Confluent Cloud, and Confluent Platform, working closely with customers, partners, and R&D to drive adoption and execution of Confluent products. Prior to his work at Confluent, Mike was the founder and CEO of Terrain Data (acquired by Confluent in 2018).

    David Araujo is a Director of Product Management at Confluent, focusing on data governance with products such as Schema Registry, Data Catalog, and Data Lineage. David previously held positions at companies like Amobee, Turn, WeDo Technologies Australia, and Saphety, where David worked on various aspects of data management, analytics, and infrastructure. With a background in Computer Science from the University of Évora, David has a strong foundation of technical expertise and leadership roles in the tech industry.

    Guest Highlights:

    "If a ton of raw data shows up on your doorstep, it's like shipping an unlabeled CSV into a finance organization and telling them to build their annual forecast. By shifting that cleaning and structure into streaming, we remove a massive amount of toil for our organizations. Instead of punting the problem down to our analytics friends, we can solve it because we're the ones that created the data." - Mike Agnich

    "We've had data contracts in Kafka long before it became a buzzword—we called them schemas. But more recently, we've evolved this concept beyond just schemas. In streaming, a data contract is an agreement between producers and consumers on both the structure (schema) and the semantics of data in motion. It serves as a governance artifact, ensuring consistency, reliability, and quality while providing a single source of truth for understanding streaming data." - David Araujo
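    David’s definition has two layers, which a small sketch makes concrete. This is illustrative only, not a Confluent API: the field names and the semantic rules are hypothetical, but the shape of the check is the point — a record must pass both the structural test (the schema) and the semantic one (the agreed meaning of the data).

    ```python
    # A toy data contract: structure (field names and types) plus semantics.
    SCHEMA = {"user_id": int, "event_ts": float, "amount": float}

    def meets_contract(record: dict) -> bool:
        # Structural check: exactly the agreed fields, with the agreed types.
        if set(record) != set(SCHEMA):
            return False
        if not all(isinstance(record[k], t) for k, t in SCHEMA.items()):
            return False
        # Semantic check: the producer and consumer also agree on meaning,
        # e.g. amounts are non-negative and timestamps are positive.
        return record["amount"] >= 0 and record["event_ts"] > 0

    good = {"user_id": 42, "event_ts": 1700000000.0, "amount": 19.95}
    bad = {"user_id": 42, "event_ts": 1700000000.0, "amount": -5.0}

    assert meets_contract(good)
    assert not meets_contract(bad)  # structurally valid, semantically invalid
    ```

    The second record is the interesting case: a schema alone would accept it, which is why the contract has to cover semantics as well.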

    Links & Resources:

    Connect with Joseph: @thedatagiant
    Joseph’s LinkedIn: linkedin.com/in/thedatagiant
    Mike’s LinkedIn: linkedin.com/in/magnich
    David’s LinkedIn: linkedin.com/in/davidaraujo
    What Is a Data Streaming Platform (DSP)
    Learn more at Confluent.io

    Episode Timestamps:

    *(02:00) - Mike and David’s Journey in Data Streaming
    *(13:55) - Data Streaming 101: Data Integration
    *(40:06) - The Playbook: Tools & Tactics for Data Integration
    *(53:25) - Voices from the World of Data Streaming
    *(59:33) - Quick Bytes
    *(1:05:20) - Joseph’s Top 3 Takeaways

    Our Sponsor:

    Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent Data Streaming Platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true Data Streaming Platform from the pioneers in data streaming. Learn more at confluent.io

  • We’re diving even deeper into the fundamentals of data streaming to explore stream processing—what it is, the best tools and frameworks, and its real-world applications.

    Our guests, Anna McDonald, Distinguished Technical Voice of the Customer at Confluent, and Abhishek Walia, Staff Customer Success Technical Architect at Confluent, break down what stream processing is, how it differs from batch processing, and why tools like Flink are game changers.

    You’ll learn:

    The key differences between stream and batch processing
    How different frameworks like Flink, Kafka Streams, and ksqlDB approach stream processing
    The role of POCs and observability in real-time data workflows
    And more


    About the Guests:

    Anna McDonald is the Distinguished Technical Voice of the Customer at Confluent. She loves designing creative solutions to challenging problems. Her focus is on event-driven architectures, reactive systems, and Apache Kafka®.

    Abhishek Walia is a Staff Customer Success Technical Architect at Confluent. He has years of experience implementing innovative, performance-driven, and highly scalable enterprise-level solutions for large organizations. Abhishek specializes in architecting, designing, developing, and delivering integration solutions across multiple platforms.

    Guest Highlights:

    “Flink is more approachable because it blends approaches together and says, ‘If you need this, you still can use this.’ It's the most powerful at this point.” - Abhishek Walia

    “If you're somebody who's ever gone from normal to eventing, at some point you probably would have gone, ‘When does [the data] stop?’ It doesn't stop.” - Anna McDonald

    “Start with a fully managed service. That's probably going to save a lot of cycles for you.” - Abhishek Walia

    Episode Timestamps:

    *(01:35) - Anna & Abhishek’s Journey in Data Streaming
    *(12:30) - Data Streaming 101: Stream Processing
    *(26:30) - The Playbook: Tools & Tactics for Stream Processing
    *(50:20) - Voices from the World of Data Streaming
    *(56:13) - Quick Bytes
    *(58:57) - Top 3 Takeaways

    Links & Resources:

    Connect with Joseph: @thedatagiant
    Joseph’s LinkedIn: linkedin.com/in/thedatagiant
    Anna’s LinkedIn: linkedin.com/in/jbfletch
    Abhishek’s LinkedIn: linkedin.com/in/abhishek-walia
    Introducing Derivative Event Sourcing
    Designing Event-Driven Systems
    Learn more at Confluent.io

    Our Sponsor:

    Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent Data Streaming Platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true Data Streaming Platform from the pioneers in data streaming. Learn more at confluent.io.

  • Real-time data streaming is shaking up everything we know about modern data systems. If you’re ready to dive in but unsure where to begin, no worries. That’s why we’re here.

    Our first episode breaks down the basics of data streaming—from what it is, to its pivotal role in processing and transferring data in a fast-paced digital environment. Your guide is Tim Berglund, VP of Developer Relations at Confluent, where he and his team work to make streaming data and its emerging toolset accessible to all developers.

    You’ll learn:

    The fundamentals of data streaming
    Data streaming advantages vs. other technologies
    What an Event-Driven Architecture (EDA) is
    And much more


    About the Guest:

    Tim Berglund serves as the VP of Developer Relations at Confluent, where he and his team work to make streaming data and its emerging toolset accessible to all developers. He is a regular speaker at conferences and a presence on YouTube explaining complex technology topics in an accessible way.

    Guest Highlights:

    “The basic intellectual habit that you have in building a data streaming system isn't first, ‘What are the things?’ But it's, ‘What is happening?’”

    “With batch processing, I’ve got my data in a pile—I know where it starts, where it ends, and I can work through it. With streaming, it’s not a pile—it’s a pipe.”

    “The future of data streaming is real-time everything—flows, insights, and actions. There’s no more ‘take the data here and think about it later.’ The insight is now, ready to be consumed by anyone who needs it. Businesses built on this model can respond to the world as it changes, right away."
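    Tim’s "pile vs. pipe" contrast can be sketched with plain Python generators standing in for a real stream (this is an illustration, not any Confluent tooling): a batch job sees a bounded pile it can total in one pass, while a stream processor folds each event into an updated result the moment it arrives.

    ```python
    def batch_total(pile):
        """Batch: the data has a start and an end, so sum it all at once."""
        return sum(pile)

    def streaming_totals(pipe):
        """Stream: no defined end — emit an updated total after every event."""
        total = 0
        for amount in pipe:
            total += amount
            yield total

    events = [5, 3, 7]  # a bounded stand-in for an unbounded pipe

    assert batch_total(events) == 15                           # one answer, afterward
    assert list(streaming_totals(iter(events))) == [5, 8, 15]  # an answer per event
    ```

    The streaming version never needs to know where the data ends, which is exactly the "it doesn't stop" point Anna makes in the stream processing episode: the insight is available after every event, not only after the pile is complete.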

    Episode Timestamps:

    *(01:44) - Tim’s Journey in Data Streaming
    *(14:35) - Data Streaming 101: Unlocking the Power of Data
    *(38:56) - The Playbook: Tools & Tactics for Data Streaming
    *(49:00) - Voices from the World of Data Streaming
    *(53:35) - Quick Bytes
    *(57:10) - Top 3 Takeaways

    Links & Resources:

    Connect with Joseph: @thedatagiant
    Joseph’s LinkedIn: linkedin.com/in/thedatagiant
    Tim’s LinkedIn: linkedin.com/in/tlberglund
    Explore the 2024 Data Streaming Report
    Learn more at Confluent.io

    Our Sponsor:

    Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent Data Streaming Platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true Data Streaming Platform from the pioneers in data streaming. Learn more at confluent.io.

  • Life Is But A Stream is the web show for tech leaders who must respond the second data is created. This show is your guide to staying ahead of the curve and leading the data revolution, not getting left behind.

    Data streaming isn’t just a buzzword; it’s the driving force behind instant analytics, enhanced customer experiences, and cutting-edge innovation. From foundational concepts to advanced techniques like stream processing and real-time data integration, this web show provides the insights you need to revolutionize your business.

    Hosted by Joseph Morais, Technical Champion and Data Streaming Evangelist at Confluent, each episode features technical leaders and industry pioneers sharing real examples of how they’re leveraging real-time data and event-driven architecture to transform their operations. Learn strategies to accelerate decision-making, amplify business impact, and gain a competitive edge.

    Lead the data revolution – subscribe to Life Is But A Stream today.

    Watch on YouTube or listen on Apple Podcasts, Spotify, or your favorite audio platform.

    Powered by the data streaming experts at Confluent.

    Links:

    Watch and Subscribe on YouTube: www.youtube.com/@Confluent
    Listen and Subscribe on: Apple Podcasts, Spotify, Amazon Music
    Learn more at Confluent.io

    Our Sponsor:
    Confluent is pioneering a fundamentally new category of data infrastructure focused on data in motion. Our cloud-native offering is the foundational platform for data in motion—designed to be the intelligent connective tissue enabling real-time data, from multiple sources, to constantly stream across the organization. With Confluent, our customers can meet the new business imperative of delivering rich, digital customer experiences and real-time business operations. Our mission is to help every organization harness data in motion so they can compete and thrive in the modern world. Learn more at Confluent.io.