Episodes
-
The convergence of Master Data Management (MDM) and Artificial Intelligence (AI) is transforming how businesses harness data to drive innovation and efficiency. MDM provides the foundation by organising, standardising, and maintaining critical business data, ensuring consistency and accuracy across an organisation.
When paired with AI, this clean and structured data becomes a powerful asset, enabling advanced analytics, predictive insights, and intelligent automation. Together, MDM and AI help businesses uncover hidden patterns, streamline operations, and make more informed decisions in real time.
By integrating MDM with AI, organisations can move beyond simply managing data to actively leveraging it for competitive advantage. AI algorithms thrive on high-quality, well-structured data, and MDM ensures just that—minimising errors and redundancies that could compromise results. This synergy empowers companies to personalise customer experiences, optimise supply chains, and respond proactively to market changes.
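To make the "clean data in, better AI out" point concrete, here is a minimal, illustrative Python sketch (not from the episode) that merges duplicate customer records into a single golden record before the data is handed to an AI model or used to ground a prompt. The field names, sample records, and matching rule are assumptions for the example, not any vendor's data model.

```python
from collections import defaultdict

# Hypothetical raw records from three source systems; field names are assumptions.
raw_records = [
    {"source": "crm",  "email": "a.smith@example.com", "name": "A. Smith",    "phone": None},
    {"source": "erp",  "email": "A.Smith@Example.com", "name": "Alice Smith", "phone": "+1-555-0100"},
    {"source": "shop", "email": "a.smith@example.com", "name": "Alice Smith", "phone": "+1-555-0100"},
]

def golden_records(records):
    """Group duplicates by a normalised key and keep the most complete value per field."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec["email"].strip().lower()].append(rec)  # naive match rule: same email

    merged = []
    for email, recs in groups.items():
        golden = {"email": email}
        for field in ("name", "phone"):
            # Simple survivorship rule: prefer the longest non-empty value across sources.
            values = [r[field] for r in recs if r.get(field)]
            golden[field] = max(values, key=len) if values else None
        merged.append(golden)
    return merged

for rec in golden_records(raw_records):
    print(rec)  # one consistent record per real-world customer, ready for AI use
```

A real MDM platform applies far richer matching and survivorship rules, but the principle is the same: the model only ever sees one consistent version of each entity.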
In this episode, Kevin Petrie, VP of Research at BARC US, speaks to Jesper Grode, Director of Product Innovation at Stibo Systems, about the intersection between AI and MDM.
Key Takeaways:
AI and master data management should be integrated for better outcomes.
Master data improves the quality of inputs for AI models.
Accurate data is crucial for training machine learning models.
Generative AI can enhance product launch processes.
Prompt engineering is essential for generating accurate AI responses.
AI can optimise MDM processes and reduce operational costs.
Fast prototyping is vital for successful AI implementation.
Chapters:
00:00 - Introduction to AI and Master Data Management
02:59 - The Synergy Between AI and Master Data
05:49 - Generative AI and Master Data Management
09:12 - Leveraging Master Data for Small Language Models
11:58 - AI's Role in Optimizing Master Data Management
14:53 - Best Practices for Implementing AI in MDM
-
As cloud adoption grows, so do the challenges of managing costs effectively. Cloud environments offer scalability and flexibility but often come with hidden fees, unpredictable expenses, and resource sprawl that can quickly inflate budgets. Without the right tools and strategies, businesses may struggle to track spending, identify waste, and maintain budget alignment.
Usage-based reporting is pivotal in this process, providing the granular visibility needed to understand real-time consumption patterns and optimise costs. Businesses can align expenses directly with value-driven activities by tracking how, where, and when resources are used. From preventing overspending to fostering accountability, usage-based reporting empowers teams to proactively manage their cloud expenses, turning cloud cost management into a strategic advantage rather than a recurring headache.
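As a rough illustration of what usage-based reporting means in practice, the Python sketch below joins hypothetical usage records with a rate card to attribute spend to the teams that consumed the resources. The field names, rates, and tagging scheme are assumptions for the example, not Vantage's data model.

```python
from collections import defaultdict

# Hypothetical usage records tagged by team; structure and rates are illustrative only.
usage = [
    {"team": "checkout",  "service": "compute", "units": 120.0},   # instance-hours
    {"team": "checkout",  "service": "storage", "units": 800.0},   # GB-months
    {"team": "analytics", "service": "compute", "units": 340.0},
    {"team": "analytics", "service": "egress",  "units": 1500.0},  # GB transferred
]

rate_card = {"compute": 0.096, "storage": 0.023, "egress": 0.09}   # $ per unit (assumed)

def cost_by_team(usage_rows, rates):
    """Attribute spend to the teams that actually consumed the resources."""
    totals = defaultdict(float)
    for row in usage_rows:
        totals[row["team"]] += row["units"] * rates[row["service"]]
    return dict(totals)

for team, cost in sorted(cost_by_team(usage, rate_card).items(), key=lambda kv: -kv[1]):
    print(f"{team:10s} ${cost:,.2f}")
```

The same join of "who used what" with "what it costs" is what lets teams compare spend against the value-driven activities mentioned above.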
In this episode, George Firican, Founder of LightsOnData, speaks to Rem Baumann, Resident FinOps Expert at Vantage, about usage-based reporting and its benefits.
Key Takeaways:
Organisations face challenges in tracking complex cloud costs.
Usage-based reporting provides context to cloud spending.
Metrics should align with business goals for effective decision-making.
Communication between finance and engineering teams is crucial.
Identifying cost optimisation opportunities can lead to significant savings.
Different industries require customised cost metrics.
Cloud providers offer basic tools, but deeper insights are needed.
Regular monitoring of metrics ensures financial transparency.
Chapters:
00:00 - Introduction to Cloud Cost Management
03:03 - Understanding Cloud Complexity and Cost Tracking
05:53 - The Role of Usage-Based Reporting
09:06 - Metrics for Cost Optimization
12:02 - Industry-Specific Applications of Cost Metrics
14:49 - Aligning Cloud Costs with Business Goals
18:09 - Conclusion and Key Takeaways
-
Data custodianship today involves managing and protecting vast quantities of sensitive information, requiring organisations to ensure security, regulatory compliance, and ethical usage. It’s not just about protecting data from breaches but also about responsible storage, access, and deletion that aligns with strict industry standards and evolving privacy regulations.
The ethical dimensions of data custodianship add further complexity as organisations balance the need for data-driven insights with privacy rights and transparent usage. Mismanagement can lead to significant financial, legal, and reputational risks, making effective custodianship essential for maintaining customer trust and regulatory compliance.
In this episode, Paulina Rios Maya, Head of Industry Relations, speaks to Debbie Reynolds, Founder and Chief Data Privacy Officer at Debbie Reynolds Consulting, about compliance with global regulations, the role of AI in data management, and the necessity of human oversight in technology.
Key Takeaways:
Data custodianship emphasises that data belongs to individuals, not companies.
Organisations must have a comprehensive plan for data management throughout its lifecycle.
Transparency and communication with consumers are essential in data handling.
Different types of data require different levels of protection based on risk.
Building trust with consumers requires responsible data practices.
Organisations need to prioritise basic data protection strategies over compliance with every regulation.
Chapters:
00:00 - Introduction to Data Custodianship
03:03 - Understanding Responsibilities in Data Handling
05:59 - Balancing Innovation and Data Protection
08:45 - Building Trust Through Responsible Data Practices
12:07 - Navigating Compliance and Data Governance
14:54 - Leveraging AI for Enhanced Data Custodianship
18:06 - The Role of Humans in Technology and Data Management
-
Generative AI and unstructured data are transforming how businesses improve customer experiences and streamline internal processes. As technology evolves, companies find new ways to gain insights, automate tasks, and personalize interactions, unlocking new growth opportunities.
The integration of these technologies is reshaping operations, driving efficiency, and enhancing decision-making, helping businesses stay competitive and agile in a rapidly changing landscape. Organizations that embrace these innovations can better adapt to customer needs and market demands, positioning themselves for long-term success.
In this episode, Doug Laney speaks to Katrina M. Conn, Senior Practice Director of Data Science at Teradata, and Sri Raghavan, Principal of Data Science and Analytics at AWS, about sustainability efforts and the ethical considerations surrounding AI.
Key Takeaways:
Generative AI is being integrated into various business solutions.
Unstructured data is crucial for enhancing customer experiences.
Real-time analytics can improve customer complaint resolution.
Sustainability is a key focus in AI resource management.
Explainability in AI models is essential for ethical decision-making.
The combination of structured and unstructured data enhances insights.
AI innovations are making analytics more accessible to users.
Trusted AI frameworks are vital for security and governance.
Chapters:
00:00 - Introduction to the Partnership and Generative AI
02:50 - Technological Integration and Market Expansion
06:08 - Leveraging Unstructured Data for Insights
08:55 - Innovations in Customer Experience and Internal Processes
11:48 - Sustainability and Resource Optimization in AI
15:08 - Ensuring Ethical AI and Explainability
23:57 - Conclusion and Future Directions
-
In this episode, Rachel Thornton, Fivetran's CMO, discusses the highlights of Big Data London 2024, including the launch of Fivetran Hybrid Deployment, which addresses the needs of organisations with mixed IT environments.
The conversation delves into integrating AI into business operations, emphasizing the importance of a robust data foundation. Additionally, data security and compliance challenges in the context of GDPR and other regulations are explored. The episode concludes with insights on the benefits of hybrid deployment for organisations.
Key Takeaways:
Big Data London 2024 is a significant event for data leaders.
Fivetran Hybrid Deployment caters to organizations with mixed IT environments.
AI integration requires a strong data foundation.
Data security and compliance are critical in today's landscape.
Organizations must understand their data sources for effective AI use.
Hybrid deployment allows for secure data management.
Compliance regulations are becoming increasingly stringent.
Data readiness is essential for AI integration.
Chapters:
00:00 - Introduction to Big Data London 2024
02:46 - Launch of Fivetran Hybrid Deployment
06:06 - Integrating AI into Business Operations
08:54 - Data Security and Compliance Challenges
11:50 - Benefits of Hybrid Deployment
-
Managing network traffic efficiently is essential to control cloud costs. Network flow reports are critical in providing detailed insights into data movement across cloud environments. These reports help organisations identify usage patterns, track bandwidth consumption, and uncover inefficiencies that may lead to higher expenses.
With a clear understanding of how data flows, businesses can make informed decisions to optimise traffic, reduce unnecessary data transfers, and allocate resources more effectively. This helps lower cloud costs, improves network performance, and enhances security by revealing unusual or potentially harmful traffic patterns.
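To illustrate the kind of analysis a network flow report supports, here is a small Python sketch (not Vantage's implementation) that aggregates simplified flow-log-style records by source/destination pair to surface the flows moving the most bytes. The sample records and field layout are assumptions, loosely modelled on VPC Flow Log fields.

```python
from collections import Counter

# Simplified flow records: (source address, destination address, bytes transferred).
flows = [
    ("10.0.1.12", "10.0.2.40", 9_800_000_000),      # chatty service-to-service traffic
    ("10.0.1.12", "52.94.133.131", 1_200_000_000),  # traffic leaving the VPC (potential egress charges)
    ("10.0.3.7",  "10.0.2.40", 350_000_000),
]

def top_talkers(records, n=10):
    """Sum bytes per (source, destination) pair and return the heaviest flows first."""
    totals = Counter()
    for src, dst, nbytes in records:
        totals[(src, dst)] += nbytes
    return totals.most_common(n)

for (src, dst), nbytes in top_talkers(flows):
    print(f"{src} -> {dst}: {nbytes / 1e9:.2f} GB")
```

Ranking flows like this is typically the first step toward spotting unnecessary data transfers and the traffic patterns that drive cost.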
In this episode, Wayne Eckerson from Eckerson Group speaks to Ben Schaechter, CEO of Vantage, about optimising network traffic costs with Vantage's Network Flow Reports.
Key Takeaways:
● Network Flow Reports provide detailed insights into AWS costs.
● They help identify specific resources driving network traffic costs.
● Organisations can reduce costs by up to 90% with proper configuration.
● The shift towards cost management in cloud services is critical.
● FinOps teams are becoming essential for cloud cost optimization.
● Anomaly detection can alert teams to unexpected cost spikes.
● Vantage integrates with multiple cloud providers for comprehensive cost management.
● Effective cost management does not have to impact production workflows.
Chapters:
00:00 - Introduction to Vantage and Network Flow Reports
02:52 - Understanding Network Flow Reports and Their Impact
06:09 - Real-World Applications and Case Studies
09:03 - The Shift in Cost Management Focus
11:54 - Tangible Benefits of Implementing Network Flow Reports
15:07 - The Role of FinOps in Cost Optimization
18:00 - Conclusion and Future Insights
-
Safe Software’s FME is transforming Omaha’s approach to urban mobility with groundbreaking solutions for asset management, e-scooter tracking, and parking management.
FME’s robust data integration capabilities are at the core of Omaha’s advancements. The data integration platform enables real-time tracking of e-scooters, offering precise data on their locations and usage patterns. This innovation enhances the management and accessibility of e-scooters, making urban mobility more efficient and user-friendly.
Automated parking management processes, facilitated by FME, streamline city operations and reduce manual efforts. This automation leads to smoother parking experiences for residents and visitors, while dynamic rate adjustments, powered by FME, ensure that parking fees are responsive to real-time demand, optimising availability and revenue.
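Omaha's actual pipeline runs inside FME's visual workbench, but as a rough sketch of the feed-polling step described above, here is a Python example that reads a GBFS-style free_bike_status feed, the open format that micromobility providers such as Lime commonly publish. The URL is hypothetical, and writing the results to the city's GIS is only hinted at in a comment.

```python
import requests

# Hypothetical GBFS endpoint; real providers publish their own feed URLs.
FEED_URL = "https://example.com/gbfs/en/free_bike_status.json"

def fetch_scooter_positions(url=FEED_URL):
    """Return (vehicle_id, lat, lon) tuples for vehicles currently available on the street."""
    payload = requests.get(url, timeout=10).json()
    vehicles = payload["data"]["bikes"]  # GBFS keeps all vehicle types under "bikes"
    return [
        (v["bike_id"], v["lat"], v["lon"])
        for v in vehicles
        if not v.get("is_disabled") and not v.get("is_reserved")
    ]

if __name__ == "__main__":
    for vehicle_id, lat, lon in fetch_scooter_positions():
        print(vehicle_id, lat, lon)  # in a real workflow these would be written to the city's GIS
```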
In this episode, Wayne Eckerson from Eckerson Group speaks to Jacob Larson, an Applications Analyst from the City of Omaha, to discuss Omaha’s usage of FME.
Key Takeaways:
FME helps automate the tracking of e-scooters in real time.
Data sharing agreements with providers like Lime enhance tracking capabilities.
Omaha's parking management has been transformed through automation.
FME allows for dynamic changes in parking rates based on events.
The integration of GIS data with third-party APIs is crucial for parking management.
Omaha is pioneering a real-time parking information system in the US.
Chapters:
00:00 - Introduction to Omaha's Data Initiatives
01:03 - FME's Role in Asset Management
04:53 - Real-Time Tracking of E-Scooters
07:48 - Automating Parking Management
10:06 - Innovations in Parking Availability
12:59 - Dynamic Parking Rate Management
-
Open source technologies are transforming how businesses manage real-time data on cloud platforms. By leveraging flexible, scalable, and cost-effective open-source tools, organisations can process and analyse large volumes of data with speed and precision. These technologies offer unmatched transparency, customisation, and community-driven innovation, making them ideal for real-time monitoring, analytics, and IoT applications.
As data demands grow, open-source solutions ensure that businesses stay agile, reduce vendor lock-in, and maintain full control over their cloud infrastructure. The result? Faster insights, smarter decision-making, and enhanced performance—all powered by open source.
In this episode, Paulina Rios Maya, Head of Industry Relations at EM360Tech, speaks to Mikhail Epikhin, Chief Technology Officer at DoubleCloud, about The Power of Open Source in Cloud Platforms.
Key Takeaways:
Open-source technologies provide standard building blocks for products.
Community-driven innovation is essential for the evolution of technology.
Flexibility in data infrastructure is crucial for real-time processing.
Observability and monitoring are vital for performance optimisation.
Managed services can accelerate product development and feature implementation.
Chapters:
00:00 - The Power of Open Source in Cloud Platforms
05:24 - Apache Airflow: Enhancing Real-Time Data Management
10:08 - Balancing Open Source and Managed Services
13:57 - Best Practices for Scalability and Performance
-
Big Data LDN 2024, the UK’s leading data, analytics, and AI event, is less than a week away – promising two days filled with ground-breaking stories, expert insights, and endless innovation.
Taking place at the Kensington Olympia in London on September 18-19, this year’s event features fifteen theatres and over 300 expert speakers sharing insights on some of the industry’s hottest topics – from generative AI to data analytics and privacy.
With the event less than a week away, EM360Tech’s Head of Podcast Production, Paulina Rios Maya, grabbed Big Data LDN’s Event Director, Andy Steed, for a chat about his expectations for this year’s event and its growing importance in the data world.
In the episode, they discuss:
The exciting themes or breakthroughs attendees can expect to see showcased this year
How Big Data London remains relevant in such a rapidly evolving field
The unique networking opportunities or interactive experiences attendees have at the conference
The standout sessions or keynote speakers at the conference
Chapters:
00:00 - Introduction to Big Data LDN 2024
01:35 - Showcasing Data Stories, Transformations, and Challenges
02:33 - The Networking Opportunities with Industry Leaders and Peers at Big Data LDN 2024
05:01 - Staying Relevant with a Focus on Generative AI and Real-World Use Cases
06:55 - The Importance of Data Events for Community Building and Learning
About Big Data LDN 2024
Big Data London is the UK's largest data and analytics event, attracting over 16,500 visitors each year. Taking place at the Olympia in London on September 18-19, this year’s event features fifteen theatres and over 300 expert speakers across the two-day conference.
Attendees can meet face-to-face with tech providers and consultants to find solutions to their data challenges, and view the latest product releases and software demos to enhance their business's data capabilities.
It’s also a great opportunity for attendees to strengthen their business network with new and existing partners, immerse themselves in the data community, and network with speakers, colleagues, and practitioners, all in two days at Big Data LDN.
-
Sustainable sourcing is essential for businesses committed to environmental and social responsibility, but achieving it requires accurate and reliable data. Master Data Management (MDM) ensures that all sourcing data—such as supplier information, certifications, and compliance records—is consistent and up-to-date. This enables organisations to make informed decisions that align with their sustainability goals, reduce waste, and promote ethical practices throughout their supply chain.
MDM is the foundation of a successful sustainability strategy. By providing a single source of truth for all critical data, MDM helps businesses monitor and track their sustainability efforts effectively. With accurate data, companies can identify opportunities to improve resource efficiency, reduce carbon footprints, and ensure compliance with environmental standards, ultimately leading to a more sustainable and resilient business model.
In this episode, George Firican, Founder of LightsOnData, speaks to Matthew Cawsey, Director of Product Marketing and Solution Strategy, and Paarijat Bose, Customer Success Manager at Stibo Systems, to discuss sustainable sourcing and why accurate data matters.
Key Takeaways:
Sustainable sourcing involves understanding the provenance and environmental impact of products, ensuring compliance with regulations, and meeting sustainability goals.
Data completeness and accuracy are crucial in meeting regulatory requirements and avoiding issues like greenwashing.
Managing sustainability data requires a solid foundation of MDM to ensure data accuracy, stewardship, and semantic consistency.
MDM solutions help companies collect, manage, and share sustainability data, enabling them to meet compliance requirements and achieve their sustainability goals.
Chapters:
00:00 - Introduction and Overview
01:07 - The Challenge of Collecting Data for Compliance and Reporting
02:31 - Data Accuracy and Completeness in the Supply Chain
05:23 - Regulations and the Demand for Transparent and Complete Data
08:41 - The Role of Master Data Management in Sustainability
15:51 - How Data Management Technology Solutions Help Achieve Sustainability Goals
21:02 - The Need to Start Early and Engage with Data Management Solutions
22:01 - Conclusion and Call to Action
-
Data provenance is essential for maintaining trust and integrity in data management. It involves tracking the origin of data and understanding how it has been processed and handled over time. By focusing on fundamental principles such as identity, timestamps, and the content of the data, organisations can ensure that their data remains accurate, consistent, and reliable.
Implementing data provenance does not require significant changes or large investments. Existing technologies and techniques can be seamlessly integrated to provide greater transparency and control over data. With data provenance, businesses can confidently manage their data, enhancing decision-making and fostering stakeholder trust.
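As a minimal illustration of the three principles named above (identity, timestamp, and content), the Python sketch below builds a provenance entry that fingerprints a piece of data and chains it to the previous entry. It is only a sketch of the idea using standard-library tools, not the SCITT working group's actual format, and the identities shown are made up.

```python
import hashlib
import json
from datetime import datetime, timezone

def provenance_entry(data: bytes, identity: str, previous_hash: str = "") -> dict:
    """Record who handled what, and when, in a way that later tampering would break."""
    entry = {
        "identity": identity,                                 # who produced or handled the data
        "timestamp": datetime.now(timezone.utc).isoformat(),  # when it happened
        "content_hash": hashlib.sha256(data).hexdigest(),     # fingerprint of the data itself
        "previous": previous_hash,                            # link to the prior entry in the chain
    }
    entry_bytes = json.dumps(entry, sort_keys=True).encode()
    entry["entry_hash"] = hashlib.sha256(entry_bytes).hexdigest()
    return entry

# Build a two-step chain: the original dataset, then a cleaned version derived from it.
first = provenance_entry(b"raw sensor export v1", "ingest-service@example.org")
second = provenance_entry(b"cleaned sensor export v1", "etl-job@example.org", first["entry_hash"])
print(json.dumps([first, second], indent=2))
```

In practice each entry would also be signed with the handler's key, but even this bare structure shows how identity, time, and content can be checked against one another.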
In this episode, Jon Geater, Co-Chair of the Supply Chain Integrity Transparency and Trust (SCITT) Working Group, speaks to Paulina Rios Maya, Head of Industry Relations, about data provenance.
Key Takeaways:
Data provenance is knowing where data comes from and how it has been handled, ensuring trust and integrity.
The fundamental principles of data provenance include identity, timestamps, and the content of the data.
Data provenance can be implemented by integrating existing technologies and techniques without significant changes or investments.
Data provenance helps with compliance, such as GDPR, by providing a transparent record of data handling and demonstrating compliance with requests.
Chapters:
00:00 - Introduction and Background
02:01 - Understanding Data Provenance
05:47 - Implementing Data Provenance
10:01 - Data Provenance and Compliance
13:50 - Success Stories and Industry Applications
18:10 - Conclusion and Call to Action
-
FME is a vital tool in disaster management and response. It enables the integration and transformation of geospatial data for real-time tracking of disasters and hazards. By ensuring accurate and timely data analysis, it provides essential decision support for disaster management professionals.
During the Maui wildfires, FME and the Pacific Disaster Center were crucial in managing and analysing critical data, allowing for effective coordination and response. By facilitating seamless data sharing and collaboration among stakeholders, FME helps ensure that the correct information reaches the right people at the right time.
In this episode of the EM360 Podcast, Alejandro Leal, an Analyst at KuppingerCole, speaks to Jorma Rodieck, a GIS Specialist at the Pacific Disaster Center, about the importance of FME.
Key Takeaways:
FME is an essential tool in disaster management and response, allowing for the integration and transformation of geospatial data.
FME enables real-time data analysis and decision support for disaster management professionals.
During the Maui wildfires, FME was instrumental in managing and analyzing critical data, providing a common operating picture for response efforts.
FME ensures effective data sharing and collaboration among various stakeholders, enabling smooth interoperability between departments and agencies.
Chapters:
00:00 - Introduction and Background
02:35 - The Role of FME in Disaster Management
06:44 - Managing and Analyzing Critical Data with FME
10:34 - FME's Impact during the Maui Wildfires
11:59 - Ensuring Effective Data Sharing and Collaboration
15:20 - The Future of FME in the Pacific Disaster Center
18:15 - Conclusion
-
Open source real-time analytics offers unparalleled advantages, providing businesses with freedom and independence to maintain operations seamlessly, even if a vendor issue arises. However, the journey isn't without its challenges. Open source solutions can often be clunky and require specialised expertise to manage effectively.
This is where DoubleCloud comes in, offering a managed platform that addresses these obstacles by handling crucial responsibilities such as backups, high availability, and security updates, allowing businesses to focus on leveraging their data.
In this podcast, Christina Stathopoulos speaks to Vladimir Borodin, Co-Founder and CEO of DoubleCloud, about open source strategies and the advantages of the DoubleCloud solution.
Key Takeaways:
DoubleCloud's managed platform helps overcome the challenges of open source, such as clunkiness and a lack of expertise.
Successful customer use cases demonstrate the performance and cost benefits of DoubleCloud's solution.
The transition phase to DoubleCloud's solution depends on the complexity of the application.
Using open source whenever possible is recommended.
Chapters:
00:00 - Introduction and Background
02:29 - The Advantages of Open Source
04:21 - Challenges of Open Source
06:47 - The Power of Real-Time Analytics
09:11 - Success Stories: Improved Performance and Reduced Costs
12:54 - Navigating the Transition to DoubleCloud's Solution
15:14 - The Importance of Using Open Source
-
Privacy by Default and Design is a fundamental principle of the General Data Protection Regulation (GDPR). It prioritises transparency, user control, and data security from the outset. This approach ensures that privacy is integrated into systems and processes by default rather than as an afterthought.
By embedding these practices, organisations enhance trust and accountability while meeting regulatory requirements. However, challenges such as resistance to change and the need for cultural transformation must be addressed to implement this principle effectively.
In this episode of the Don’t Panic It’s Just Data podcast, Tudor Galos, Senior Privacy Consultant, speaks to Paulina Rios Maya, Head of Industry Relations, about how the impact of privacy by default and design extends to user experience, where issues like consent fatigue and the need for user-friendly interfaces arise.
Key Takeaways:
Organisations face challenges in implementing privacy by default and design, including resistance to change and the need for cultural transformation.
Privacy by default and design impact user experience, with issues like consent fatigue and the need for user-friendly interfaces.
Regulations like GDPR and CCPA incorporate privacy by default and design principles, emphasising compliance and accountability.
Chapters:
00:00 - Introduction and Overview
01:00 - Core Principles of Privacy by Default and Design
02:19 - Difference from Traditional Privacy Practices
04:09 - Challenges in Implementing Privacy by Default and Design
05:33 - Impact of Privacy by Default on User Experience
08:14 - Alignment of Privacy by Default with Regulations
09:04 - Ensuring Compliance and Trust
11:24 - Implications of Emerging Technologies on Privacy
13:15 - Innovations in Privacy-Enhancing Technologies
15:50 - Conclusion
-
Safe Software's Feature Manipulation Engine (FME) plays a pivotal role in the City of Fremont's operations, particularly in ensuring accurate and efficient data submissions under the Racial and Identity Profiling Act (RIPA). By automating complex workflows and enhancing data quality, FME not only ensures seamless compliance with RIPA requirements but also optimises processes for their ITS and GIS divisions.
FME also drives innovation in projects like the DroneSense programme and their Cityworks asset management integration. With seamless data integration and powerful visualisations, FME empowers the City of Fremont to enhance operations, improve asset management, and support informed decision-making.
In this episode, Jonathan Reichental, founder at Human Future, speaks to John Leon, GIS Manager for the City of Fremont, to discuss:
FME
RIPA
Public Safety
Chapters:
00:00 - Introduction and Overview of the City of Fremont and IT/GIS Division
03:01 - Explanation of the Racial and Identity Profiling Act (RIPA)
04:27 - Challenges in Meeting RIPA Standards and Utilizing FME
06:21 - How FME Ensures Error-Free RIPA Data Submissions
09:40 - Benefits of Using FME for RIPA Compliance
10:39 - Other Innovative Projects Utilizing FME in the City of Fremont
13:30 - Future Plans for FME in the City of Fremont
17:17 - Recommendations for Government Agencies: Leverage FME for Data Submissions
-
Real-time data insights help identify performance bottlenecks, manage data efficiently, and drive innovation. Despite the growing need for these capabilities, organisations often face challenges in implementing effective real-time analytics.
Achieving high-concurrency data processing is crucial for overcoming performance bottlenecks in real-time analytics. Embracing real-time analytics is not just a necessity, but a way to transform your data into actionable insights, optimise performance, and fuel business growth.
Yellowbrick is a modern data platform built on Kubernetes for enterprise data warehousing, ad-hoc and streaming analytics, and AI and BI workloads, and it ensures comprehensive data security, unparalleled flexibility, and high performance.
In this podcast, Doug Laney, a Data Strategy Innovation Fellow with West Monroe, speaks to Mark Cusack, the CTO of Yellowbrick, about the power of real-time analytics.
Key Takeaways:
Real-time analytics enables faster business decisions based on up-to-date data and focuses on enabling actions.
Using a SQL data platform like Yellowbrick, designed for high-concurrency data processing, can address performance bottlenecks in real-time analytics.
Chapters:
00:00 - Introduction and Overview
01:07 - The Benefits of Real-Time Analytics
06:23 - Overcoming Challenges in Implementing Real-Time Analytics
06:51 - High Concurrency Data Processing for Real-Time Analytics
13:59 - Yellowbrick: A Secure and Efficient SQL Data Platform
-
Accurate and reliable data is essential for training effective AI models. High-quality data ensures precision, reduces bias, and builds trust in AI systems. Similarly, Master Data Management (MDM) systems enhance data quality by integrating data from multiple sources, enforcing data governance, and providing a single source of truth. This helps eliminate discrepancies and maintain data integrity.
Integrating Product Information Management (PIM) with MDM ensures accurate and consistent product data across all channels, crucial for data-driven marketing. This combination centralises customer and product data, enabling precise targeting and personalised experiences. MDM and PIM integration leads to higher ROI and improved customer satisfaction by supporting effective marketing strategies.
In this episode of the EM360 Podcast, Paulina Rios Maya speaks to Philipp Krueger about integrating PIM and MDM functionalities and how this integration streamlines operations, improves data accuracy, and supports data-driven marketing strategies.
Chapters
00:00 - Introduction and Importance of Data Quality in AI Models
05:27 - Core Capabilities of an MDM System
08:13 - The Role of Data Governance in Data Management
13:37 - Enhancing Customer Experience and Driving Sales with Pimcore
19:47 - Integration of PIM and MDM Functionalities for Data-Driven Marketing Strategies
22:59 - The Impact of Accurate Data on Revenue Growth
27:28 - Simplifying Data Management with a Single Platform
-
One of the biggest challenges businesses face when it comes to data visualisation is handling the volume of data and the need for faster processing methods.
There's a common misconception that effective data visualisation must be fancy and interactive, but simple visuals can be just as powerful. Ann K. Emery, an expert in the field, believes that accessibility doesn't have to be time-consuming or expensive.
In this podcast, she shares actionable strategies for creating accessible visualizations with Paulina Rios Maya, Head of Industry Relations at EM360Tech.
Key Takeaways:
Avoiding red-green colour combinations
Ensuring proper colour contrast
Using direct labelling instead of legends
Avoiding using all-caps
Using grey to highlight important information
Employing small multiples to simplify complex visualisations
(A minimal plotting sketch illustrating two of these tips follows the chapter list below.)
Chapters:
00:00 - Introduction
00:54 - Defining Accessibility in Data Visualization
02:17 - Big A Accessibility Tips
06:36 - Little a Accessibility Strategies
12:28 - The Future of Data Accessibility
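As referenced above, here is a minimal matplotlib sketch of two of the accessibility tips discussed in the episode: direct labels instead of a legend, and grey for context so the important series stands out. The data, series names, and styling choices are made up for the example.

```python
import matplotlib.pyplot as plt

years = [2020, 2021, 2022, 2023, 2024]
series = {
    "Region A": [12, 14, 15, 16, 18],  # the series we want readers to focus on
    "Region B": [11, 11, 12, 12, 13],
    "Region C": [9, 10, 10, 11, 11],
}

fig, ax = plt.subplots(figsize=(7, 4))
for name, values in series.items():
    emphasised = name == "Region A"
    colour = "#1f77b4" if emphasised else "#bbbbbb"  # colour the key line, grey the context
    ax.plot(years, values, color=colour, linewidth=2.5 if emphasised else 1.5)
    # Direct labelling: put the series name at the end of its line instead of in a legend.
    ax.text(years[-1] + 0.1, values[-1], name, color=colour, va="center")

for side in ("top", "right"):
    ax.spines[side].set_visible(False)  # lighter chart frame
ax.set_title("Orders per region")       # sentence case, no all-caps
plt.tight_layout()
plt.show()
```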
-
Managing cloud costs effectively has become a significant challenge for organisations relying on public cloud services. FinOps addresses these challenges by ensuring efficient spending and governance of cloud resources. Key practices in FinOps include achieving complete visibility into cloud usage and costs, fostering cross-functional collaboration between finance, operations, and engineering teams, and utilising data-driven decision-making to optimise cloud investments.
By embracing a centralised team, organisations can instil a culture of governance and efficiency in cloud cost management. This approach can lead to enhanced resource utilisation and substantial cost savings. With Vantage, your organisation can cultivate a robust cloud cost governance and efficiency culture, ensuring your cloud investments yield maximum value.
In this episode of the EM360 Podcast, Kevin Petrie, VP of Research at BARC US, speaks to Ben Schaechter, CEO and co-founder of Vantage, to discuss:
FinOps
Vantage’s platform
Cloud costs and FinOps practices
Chapters:
00:00 - Introduction and Overview
02:02 - Understanding FinOps and Cloud Cost Governance
07:45 - Best Practices in FinOps: Centralization and Collaboration
13:50 - The Role of Data-Driven Insights in Optimizing Cloud Costs