Episodes
-
Organisations often find themselves trapped in an infinite loop. Often called an "infinity loop" or "patching loop," it is characterised by a continuous cycle of fixing problems, patching, and redoing data pipelines just to keep operations running.
"It's a trap that many financial institutions, banks, insurance companies, and more fall into. This loop or this cycle is a process of focusing on fixing problems, patching," explains Errol Rodericks.
In this episode of the Don't Panic It's Just Data podcast, Errol Rodericks, Product & Solutions Marketing and Sales Enablement Specialist at Denodo, discusses the challenges financial services firms face in escaping the infinity loop of reactive data management.
He emphasises the importance of breaking free from this cycle to achieve real innovation and success in AI-led initiatives. The conversation explores the significance of AI-ready data, the journey from bronze to gold data products, and how Denodo helps bridge the last-mile gap in data management.
The Cost of Poor Data

"The cost of poor data isn't just bad decisions. It's the decisions you never knew you could make. You never get around to that."
This statement sets the stage for the episode. It captures the hidden cost of being stuck in reactive data mode, and the conversation turns to how organisations can escape it.
Rodericks believes that financial services firms stuck in this loop are missing out on strategic opportunities that could redefine their market position.
Many financial institutions have invested heavily in centralised lakehouse architectures such as Snowflake or Databricks, but these alone are not enough. They often struggle to deliver the trusted, real-time insights that Gen AI and business teams require.
Missed Opportunities to Measurable Outcomes

It becomes even more challenging for financial institutions to deliver real-time, Gen AI-ready insights when dealing with the "missing mile" of data: roughly 30 per cent of crucial data that is often overlooked or inaccessible.
To overcome these Gen AI challenges, Denodo facilitates logical data management, providing direct access to live, relevant, and governed data at the moment it is needed, without copying it. That access is critical for achieving measurable outcomes from AI-led initiatives.
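To make the idea concrete, here is a minimal, hypothetical sketch of what a logical data layer does conceptually: one governed view assembled on demand from live sources rather than from copies. This is not Denodo's API; the source names, fields, and endpoint are illustrative assumptions.

```python
# Minimal sketch of a logical data layer: expose one governed "view" that
# federates live sources instead of copying them into yet another pipeline.
# Source names, fields, and the REST endpoint are illustrative assumptions.
import json
import sqlite3
import urllib.request


def customers_from_warehouse(db_path: str) -> list[dict]:
    """Pull live customer rows from an operational SQL store."""
    with sqlite3.connect(db_path) as conn:
        conn.row_factory = sqlite3.Row
        rows = conn.execute("SELECT customer_id, segment FROM customers")
        return [dict(r) for r in rows]


def balances_from_api(base_url: str) -> dict[str, float]:
    """Fetch current balances from a (hypothetical) REST service."""
    with urllib.request.urlopen(f"{base_url}/balances") as resp:
        return {b["customer_id"]: b["balance"] for b in json.load(resp)}


def governed_customer_view(db_path: str, base_url: str) -> list[dict]:
    """Join the two live sources on demand: no copies, one consistent view."""
    balances = balances_from_api(base_url)
    return [
        {**c, "balance": balances.get(c["customer_id"])}
        for c in customers_from_warehouse(db_path)
    ]
```

Because the view is assembled at query time, every consumer sees the same, current state of both systems, which is the "live, governed access" the episode describes.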
For Chief Financial Officers (CFOs), Rodericks offers a succinct but powerful message: "Modern finance isn't about reports... It's about your ability to predict, to personalise, and to prevent – the three Ps."
Takeaways
● The infinity loop is a trap for financial institutions.
● Reactive data handling leads to missed insights and customer churn.
● Breaking out of the infinity loop is essential for innovation.
● AI projects often fail due to unreliable data inputs.
● AI-ready data must be trusted, timely, contextual, and reusable.
● The journey from bronze to gold data products is strategic.
● Ownership of data products is crucial for success.
● Timeliness of data is critical in financial services.
● Denodo provides real-time access to data without copying it.
● Modern finance focuses on prediction, personalisation, and prevention.
Chapters
00:00 Understanding the Infinity Loop in Financial Services
04:23 Breaking Free from the Infinity Loop
06:03 The Cost of Firefighting Mode
09:14 The Importance of AI-Ready Data
19:42 The Journey from Bronze to Gold Data Products
22:53 Bridging the Last Mile Gap
27:06 Real-World Examples of
-
Blending financial planning directly into existing business intelligence (BI) platforms such as Microsoft Power BI and Qlik is on the rise. But have you questioned why, and what drives this shift?
In this episode of the Don’t Panic It’s Just Data podcast, Kevin Petrie, BARC analyst, sits down to chat with Thomas Gorr, Director of Product Management for xP&A and BI at insightsoftware, and Henri Rufin, Head of Responsible Data & Analytics at Radiall.
The speakers stress that the integration of financial planning into BI platforms is driven by the need to move beyond traditional Excel-based planning, which often leads to data silos and errors.
Their conversation spotlights how integrated BI and planning solutions improve collaboration across departments. Going deeper, BI can also provide a unified view of data and help organizations be more agile and proactive in volatile markets.
Gorr and Rufin explain why keeping financial planning data separate from BI makes less and less sense, and why traditional tools like Excel are no longer sufficient.
Watch this podcast to discover how embedding planning capabilities within BI platforms, such as Qlik and Power BI, offers a seamless experience, greater flexibility, and real-time collaboration.
Learn how organisations are adapting to volatile market conditions through agile planning, and gain a glimpse into the evolving landscape of financial planning.
Tune in to gain expert insights and practical strategies for a more collaborative and data-driven future!
Takeaways
● The integration of planning into BI platforms is essential for collaboration.
● Excel is prone to errors and creates silos in data management.
● Organisations need to adapt to dynamic market conditions for effective planning.
● Stakeholder engagement is crucial for successful financial planning.
● Advanced planning solutions offer flexibility and real-time collaboration.
● Data governance is necessary to support planning processes.
● The total cost of ownership is lower with integrated planning solutions.
● BI platforms provide a unified experience for users.
● Future planning will focus on platform integration and advanced analytics.
● Companies must evolve their planning capabilities to remain competitive.
Chapters
00:00 Introduction to Financial Data Management
03:04 The Shift Towards Integrated Planning
08:30 Collaboration in Dynamic Markets
11:05 The Role of Stakeholders in Planning
15:10 Moving Beyond Excel
19:19 Total Cost of Ownership in Planning Solutions
25:06 Future of Integrated Financial Planning
About insightsoftware
insightsoftware is a global provider of comprehensive solutions for the Office of the CFO. They believe actionable business strategies begin and end with financial data that's accessible and easy to understand.
They offer solutions across financial planning and analysis (FP&A), accounting, and operations. This transforms how teams operate, empowering leaders to make timely and informed decisions.
-
In this episode of the Don’t Panic, It’s Just Data podcast, Kevin Petrie, VP of Research at BARC, is joined by Nidhi Ram, Vice President of Global Services Strategy and Operational Excellence at Precisely.
The duo explore the idea of focusing on data modernisation and improving accessibility rather than constantly implementing new technologies.
Both Petrie and Ram highlight the continued importance of traditional mainframes in modern data strategies. They delve into how companies can combine cloud tools with mainframe data to handle diverse data systems, removing the need to replace the mainframe while keeping the data accessible to all users.
Turning to the critical role of data quality and governance, especially in the age of AI, Ram emphasises that "garbage in, garbage out" has never been more relevant: AI outputs are only as good as the data feeding the model.
What's needed is a more comprehensive approach to data integration, quality, governance, and enrichment that helps ensure data is always ready for confident business decisions.
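As a rough illustration of what "ready for confident business decisions" can mean in practice, the sketch below runs a few basic quality checks (completeness, uniqueness, freshness) before records reach an AI pipeline. The field names and thresholds are illustrative assumptions, not Precisely's product behaviour.

```python
# Minimal sketch of "garbage in, garbage out" guardrails: simple data quality
# metrics computed before records are handed to an AI pipeline. Field names
# and the 0.8 threshold are illustrative assumptions.
from datetime import datetime, timedelta


def quality_report(records: list[dict], max_age_days: int = 7) -> dict:
    """Return basic completeness, uniqueness, and freshness scores (0..1)."""
    total = len(records)
    if not total:
        return {"completeness": 0.0, "uniqueness": 0.0, "freshness": 0.0}
    missing_email = sum(1 for r in records if not r.get("email"))
    unique_ids = len({r.get("customer_id") for r in records})
    cutoff = datetime.now() - timedelta(days=max_age_days)
    stale = sum(
        1 for r in records if datetime.fromisoformat(r["updated_at"]) < cutoff
    )
    return {
        "completeness": 1 - missing_email / total,
        "uniqueness": unique_ids / total,
        "freshness": 1 - stale / total,
    }


records = [
    {"customer_id": "c1", "email": "a@example.com", "updated_at": "2025-01-02T00:00:00"},
    {"customer_id": "c1", "email": None, "updated_at": "2025-01-03T00:00:00"},
]
report = quality_report(records)
# Gate the AI pipeline on the scores rather than feeding data in blindly.
ready_for_ai = all(score >= 0.8 for score in report.values())
print(report, ready_for_ai)
```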
Listen to this latest episode to learn how Precisely’s Data Integrity Suite provides a comprehensive approach to data modernisation.
Takeaways
● Data modernisation is about accessibility, not technology.
● The mainframe continues to play a crucial role in data strategies.
● High-quality data is essential for successful AI initiatives.
● Data governance is critical to comply with regulations and ensure data quality.
● Cloud solutions offer flexibility, but on-premise systems provide control.
● Companies need to adapt to a heterogeneous data environment.
● Integrating people and processes is key to successful data programs.
● Future data roles will require broad functional knowledge rather than deep technical skills.
Chapters
00:00 Introduction to Data Modernization
02:56 Understanding Data Modernization
06:01 Challenges in Data Modernization
08:53 The Role of AI in Data Strategies
11:56 Data Quality and Governance
15:08 Cloud vs On-Premise Data Solutions
18:14 Adapting to Diverse Data Environments
20:47 Advice for Modernizing Data Strategies
23:58 The Importance of People and Process
27:07 Future Skills for Data Teams
About Precisely
Precisely is a leading global data integrity provider, ensuring organisations have accurate, consistent, and contextual data. In today's data-driven world, where businesses rely on information for critical decisions, data integrity is crucial. Precisely offers a comprehensive portfolio of solutions designed to transform raw data into a reliable asset.
Trusted by over 12,000 organisations in more than 100 countries, Precisely plays a critical role in helping businesses effectively leverage their data. By providing reliable software and strategic services, Precisely empowers organisations to confidently embark on their AI, automation, and analytics initiatives.
High-quality, trustworthy data is the bedrock for successful AI models, efficient automation processes, and insightful analytics. Without it, these initiatives risk delivering inaccurate results and misleading insights. Precisely's commitment to data integrity allows businesses to make confident decisions and achieve strategic objectives.
-
In this episode of the Don't Panic, It's Just Data podcast, host and EM360Tech podcast producer Shubhangi Dua speaks to Donnie Owsley from Snohomish County, and Jeff Burton and Tom Lavery from the University of British Columbia. All of the speakers will be presenting at the upcoming Peak of Data and AI event, organised by Safe Software, the creators of FME.
Scheduled to take place in Seattle from May 5th to 8th, 2025, The Peak is an exciting gathering for data and AI innovators. This conversation offers a preview of some of the practical applications and insights that will be shared at the event.
The episode also explores the development of creative solutions for enhancing accessibility in urban environments. The UBC speakers describe their accessible university campus navigation system, a project that showcases the power of integrating FME with platforms like ArcGIS. This discussion spotlights the challenges and ingenuity involved in building inclusive wayfinding solutions that cater to the diverse needs of a community.
The conversation sheds light on some tangible ways in which FME is being used across different sectors to tackle specific challenges and boost creative innovations. It provides valuable context for the types of practical knowledge and problem-solving approaches that will be central to The Peak of Data and AI event.
For further information on what we’ve talked about and to register for The Peak of Data and AI event in Seattle, please head over to peakofdataintegration.com.
Key Highlights
● Discover how to use tools like FME for preemptive IT issue resolution.
● Learn the approach to creating inclusive navigation systems with FME and ArcGIS.
● Get practical insights into current industry applications.
● Preview actionable data and AI solutions.
● Explore the versatile application of FME in your organisation.
About Safe Software
Founded in 1993, Safe is headquartered in Surrey, BC with over 200 team members and counting. We're always looking for talented individuals with diverse backgrounds who are determined to learn and grow.
Over 20,000 organisations around the world use FME in industries like AEC, government, utilities, and transportation to maximise the value of their data.
-
Takeaways
● Finance teams are often resistant to change, clinging to outdated tools.
● Real-time data integration can help break down data silos and support better decision-making.
● The shift from FP&A to XP&A emphasises collaboration across departments.
● Data governance is crucial when handling financial data.
● Real-time data enables more accurate forecasting and budgeting.
● Organisations should define their data strategy before implementation.
● Proactive adaptation to data technologies is essential for future success.

Summary
In this episode of "Don't Panic, It's Just Data," host Christina Stathopoulos explores the world of real-time analytics and its impact on financial decision-making. She is joined by Thomas Gore, insightsoftware's Director of Product Management for extended planning and analysis (XP&A), and Cody Riemenschneider, Director of Solutions Engineering, and they discuss the challenges and opportunities of integrating real-time data into finance.
We explore the issue of data silos and how real-time data integration can break them down. Thomas explains: "Each department has its own data report, its own planning tools, which are mostly Excel files... They don't collaborate."
Listen to our latest podcast now for an insightful conversation on how real-time data is reshaping the future of financial planning and analysis.
For the latest tech insights visit: EM360Tech.com
-
"So you want trusted data, but you want it now? Building this trust really starts with transparency and collaboration. It's not just technology. It's about creating a single governed view of data that is consistent no matter who accesses it, " says Errol Rodericks, Director of Product Marketing at Denodo.
In this episode of the 'Don't Panic, It's Just Data' podcast, Shawn Rogers, CEO at BARC US, speaks with Errol Rodericks from Denodo. They explore the crucial link between trusted data and successful AI initiatives. They discuss key factors such as data orchestration, governance, and cost management within complex cloud environments.
We've all heard the horror stories – AI projects that fail spectacularly, delivering biased or inaccurate results. But what's the root cause of these failures? More often than not, it's a lack of focus on the data itself. Rodericks emphasises that "AI is only as good as the data it's trained on."
This episode explores how organisations can avoid the "garbage in, garbage out" scenario by prioritising data quality, lineage, and responsible AI practices.
Learn how to avoid AI failures and discover strategies for building an AI-ready data foundation that ensures trusted, reliable outcomes. Key topics include data bias, ETL processes, and data-sharing practices.
Takeaways
● Bad data leads to bad AI outputs.
● Trust in data is essential for effective AI.
● Organisations must prioritise data quality and orchestration.
● Transparency and collaboration are key to building trust in data.
● Compliance is a responsibility for the entire organisation, not just IT.
● Agility in accessing data is crucial for AI success.
Chapters
00:00 The Importance of Data Quality in AI
02:57 Building Trust in Data Ecosystems
06:11 Navigating Complex Data Landscapes
09:11 Top-Down Pressure for AI Strategy
11:49 Responsible AI and Data Governance
15:08 Challenges in Personalisation and Compliance
17:47 The Role of Speed in Data Utilisation
20:47 Advice for CFOs on AI Investments
About Denodo
Denodo is a leader in data management. The award-winning Denodo Platform is the leading logical data management platform for transforming data into trustworthy insights and outcomes for all data-related initiatives across the enterprise, including AI and self-service.
Denodo's customers across industries worldwide have delivered trusted AI-ready and business-ready data in a third of the time and with 10x better performance than with lakehouses and other mainstream data platforms alone.
-
Takeaways
● CMDM is essential for creating a unified and accurate view of the customer.
● Trust comes from how responsibly data is handled.
● Ethical data use is a competitive advantage and builds trust.
● Executive sponsorship and cross-team buy-in are crucial for MDM projects.
● A strong customer data strategy is a competitive edge, not a side project.

Summary
Customer Master Data Management (CMDM), as Matthew Cawsey, Director of Product Marketing at Stibo Systems, describes it, is "about having a master record of a customer, as opposed to many organisations where customer data gets created and authored in potentially dozens of different CRMs, ERPs, and finance systems."
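For readers who want to picture the mechanics, here is a deliberately simplified sketch of the "master record" idea: records for the same customer arriving from different systems are matched and merged into one golden record. The matching key, survivorship rule, and sample data are illustrative assumptions, not Stibo Systems' implementation.

```python
# Minimal sketch of CMDM's golden record: consolidate the same customer from
# several systems. Matching on a normalised email and keeping the most
# complete value per field are illustrative choices.
from collections import defaultdict

crm = [{"source": "crm", "email": "Ann.Lee@Example.com", "name": "Ann Lee", "phone": None}]
erp = [{"source": "erp", "email": "ann.lee@example.com", "name": "A. Lee", "phone": "+44 20 7946 0000"}]


def golden_records(*systems: list[dict]) -> list[dict]:
    """Group records by normalised email, then keep the best value per field."""
    grouped: dict[str, list[dict]] = defaultdict(list)
    for system in systems:
        for record in system:
            grouped[record["email"].strip().lower()].append(record)
    masters = []
    for email, records in grouped.items():
        master = {"email": email}
        for field in ("name", "phone"):
            # Survivorship rule: prefer the longest non-empty value across sources.
            values = [r[field] for r in records if r.get(field)]
            master[field] = max(values, key=len) if values else None
        masters.append(master)
    return masters


print(golden_records(crm, erp))
# [{'email': 'ann.lee@example.com', 'name': 'Ann Lee', 'phone': '+44 20 7946 0000'}]
```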
In this episode of our 'Don't Panic, It's Just Data' podcast series, Christina Stathopoulos, Founder at Dare to Data, speaks with Matthew Cawsey and Arnjah Dillard from Stibo Systems. Matthew and Arnjah explain the importance of CMDM and CXDC for maintaining positive customer engagement.
Learn how to move beyond basic personalisation, use data ethically, and leverage AI for a better customer experience. Let's transform your messy data into a competitive advantage.
For the latest tech insights visit: EM360Tech.com
-
In the latest episode of the Don’t Panic It’s Just Data podcast, we connected with speakers who previewed their presentations at the upcoming Peak of Data and AI event, organised by Safe Software and taking place in Seattle from May 5 to 8, 2025.
This premier gathering, hosted by Safe Software, the creators of FME, will be a hub for data and AI innovators, and this podcast episode offers an exclusive look into what attendees can expect.
Our conversation featured Margaret Smith and Reshma Joy from the West Virginia Department of Transportation. They shared their crucial work in ensuring data integrity through rigorous validations of their Linear Reference System data. This foundational work underpins much of their operational efficiency and decision-making.
They further revealed how they’ve achieved seamless integration between Survey123 and their R&H data, showcasing a strong example of how disparate systems can be harmonized for greater insight. This presentation will provide attendees with actionable strategies for enhancing data quality and interoperability.
We also spoke to Bruno Blanco, a GIS Engineer from Shelby County 9-1-1. Bruno walked us through how FME supports critical aspects of their 911 addressing workflow—particularly data aggregation, QA/QC, and attribution—within a larger automation framework.
This work highlighted the power of automation in critical public safety infrastructure. By streamlining their addressing processes, Shelby County 9-1-1 is improving response times and ensuring more accurate location data, ultimately saving lives. Bruno’s presentation will offer valuable insights into how organisations can leverage FME to automate complex workflows and enhance operational efficiency.
This episode serves as a compelling preview for the main event at The Peak of Data and AI. If you’d like to learn more about Bruno and Shelby County 9-1-1’s story, check out their success story with Safe. For further information on what we’ve talked about and to register for The Peak of Data and AI event in Seattle, please head over to peakofdataintegration.com.
Takeaways
● Data validation is essential for accurate operations.
● FME enables seamless integration of disparate systems.
● Automation of critical processes improves public safety.
● Networking and community learning are key benefits of The Peak.
● Breakout sessions provide valuable hands-on FME knowledge.
● AI is increasingly influencing data integration workflows.
-
"There's been a lot of wrangling of data, a lot of wrangling of humans as well, which is a big theme for today," says Warwick Leitch, Product Management Director at insightsoftware.
In this episode of the 'Don't Panic, It's Just Data' podcast, Debbie Reynolds, CEO and Chief Data Privacy Officer at Debbie Reynolds Consulting LLC, speaks with Leitch from insightsoftware. They discuss the vital role of financial strategy and collaborative planning, particularly as it pertains to the decisions made by IT executives.
The question they address is: In a world awash with data, how do we transform it into actionable insights? Warwick shares his wealth of experience, offering practical advice and illuminating the path to successful budgeting and forecasting.
One such challenge addressed in the podcast is how organisations are securing executive buy-in. "And 51 percent of people find it difficult to engage senior executives to buy into the process, which is a roadblock. And 57 percent of organisations struggle cross-functionally," Warwick reveals.
It's not just about the numbers. Warwick also emphasises the human element, reminding us that "Without people, we don't have anything." In an era where AI looms large, it's crucial to remember that technology serves to enhance, not replace, human collaboration.
Tune in to the podcast and learn how to navigate the complexities of financial strategy and collaborative planning.
Takeaways
● Executive buy-in is crucial for successful budgeting.
● A clear vision helps guide the budgeting process.
● Thoughtful execution is key to effective planning.
● Fostering a culture of collaboration enhances participation.
● Data accuracy is vital in today's fast-paced environment.
● Avoid overcomplicating the budgeting process.
● Gamification can improve engagement in budgeting.
● AI can significantly streamline forecasting and reporting.
● Regularly updating forecasts leads to better accuracy.
● Understanding business measures is essential for effective planning.

Chapters
00:00 Introduction to Collaborative Financial Planning
05:01 The Importance of Executive Buy-In
10:09 Thoughtful Execution in Budgeting
15:01 Fostering a Culture of Collaboration
19:48 Defining Business Measures and Data Accuracy
24:52 Common Pitfalls in Collaborative Budgeting
29:52 The Future of Collaborative Planning with AI
-
The digital age is fueled by data, and the engines powering that data are data centres. However, this growth comes at a significant energy cost. In the latest episode of the EM360Tech Don’t Panic It’s Just Data podcast, Shubhangi Dua speaks with Rolf Bienert, Technical & Managing Director of the OpenADR Alliance, to shed light on the urgent need for sustainable energy practices within the data centre industry.
In this episode, we discuss the stark reality of escalating energy consumption, driven by factors like the rise of AI, and the critical importance of moving beyond superficial "green" initiatives to implement genuine, impactful solutions.
From the historical context of data centre energy usage to the evolution of energy demands and the challenges of achieving net-zero goals, Rolf provides valuable insights into innovative solutions such as smart grids, microgrids, and virtual power plants. These hold immense potential for managing energy distribution efficiently and sustainably.
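To ground the demand-response concept these solutions rely on, here is a toy sketch of a data centre deferring flexible workloads when the grid signals high prices. The job model and price threshold are illustrative assumptions, not the OpenADR specification.

```python
# Minimal sketch of demand response: run everything when power is cheap,
# defer flexible load when the grid signals scarcity. Threshold and job
# attributes are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Job:
    name: str
    deferrable: bool  # e.g. batch training vs. latency-sensitive serving
    kw: float         # estimated power draw


def schedule(jobs: list[Job], grid_price_per_kwh: float, threshold: float = 0.30) -> dict:
    """Split jobs into those to run now and those to defer under a price signal."""
    run_now, deferred = [], []
    for job in jobs:
        if grid_price_per_kwh > threshold and job.deferrable:
            deferred.append(job.name)
        else:
            run_now.append(job.name)
    return {"run_now": run_now, "deferred": deferred}


jobs = [Job("checkout-api", False, 40.0), Job("nightly-training", True, 300.0)]
print(schedule(jobs, grid_price_per_kwh=0.45))
# {'run_now': ['checkout-api'], 'deferred': ['nightly-training']}
```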
Beyond technological solutions, the podcast addresses the critical role of regulatory frameworks and industry standards in fostering sustainable practices. These frameworks need to adapt to modern energy consumption patterns, ensuring interoperability and reducing costs. The episode also spotlights the importance of collaboration between IT and utility sectors, as well as open communication with the public, to address concerns about energy consumption and build trust.
Takeaways
● Data centres are increasingly becoming significant consumers of energy.
● Sustainability in data centres is often perceived as branding rather than genuine effort.
● AI's demand for processing power is escalating energy needs.
● Smart grids are essential for managing energy distribution effectively.
● Microgrids and virtual power plants offer promising solutions for energy sustainability.
● Enterprises can leverage renewable energy to become energy providers.
● Regulatory frameworks need to adapt to modern energy consumption patterns.
● Standards are crucial for ensuring interoperability and reducing costs.
● Collaboration between IT and utility sectors is vital for sustainable energy management.
● Open communication is key to addressing public concerns about energy consumption.

Chapters
00:00 Introduction to Data Centres Sustainability
03:22 Historical Perspective on Data centres Energy Consumption
08:32 The Role of Smart Grids in Energy Management
12:40 Understanding Microgrids and Virtual Power Plants
21:30 Enterprise Strategies for Sustainable Data Centres
29:51 Regulatory Challenges and Opportunities
32:34 The Importance of Standards in Data Centres Growth
-
As organisations strive to stay competitive in the age of AI, data trust has become a critical factor for success. Without it, even the most advanced AI initiatives are bound to fall short.
With the rapid advancement of technology, prioritising trust in data is essential for unlocking AI's full potential and driving meaningful results. Conversely, a lack of data trust can undermine decision-making, operational efficiency, and the success of AI initiatives.
In this episode, Christina Stathopoulos, Founder at Dare to Data, speaks to Jay Limburn, Chief Product Officer at Ataccama, to explore these pressing topics. Together, they share actionable insights, real-world examples, and innovative strategies to help businesses harness the power of trusted data.
Key Takeaways
● Data trust is essential for confident decision-making.
● AI can significantly reduce mundane tasks.
● Organizations must focus on their data strategy.
● Customer experience is a key area for AI application.
● Data teams are crucial for successful AI initiatives.
● Proactive data management is becoming the norm.
● The chief data officer's influence is growing.
● Data quality and security are critical challenges.
● AI can enhance regulatory reporting processes.
● Trust in data is vital for successful AI projects.

Chapters
00:00 - Introduction to Data Trust and AI Integration
09:36 - The Role of AI in Operational Efficiency
12:47 - Balancing Short-term and Long-term Data Priorities
15:00 - Enhancing Customer Experience through AI
19:08 - Aligning Workforce and Culture for AI Success
21:03 - Innovative Strategies for Data Quality and Security
24:35 - Final Thoughts on Data Trust and AI Success
-
The San Antonio River Authority (SARA) has experienced a transformative shift in data management, thanks to the powerful capabilities of FME. By integrating FME, SARA has streamlined data integration, improved efficiency, and enhanced decision-making processes across multiple departments. FME’s ability to automate data transformation, standardise formats, and manage large volumes of spatial data has allowed the authority to optimise workflows, reduce manual errors, and accelerate project timelines.
A key highlight of SARA’s success with FME is its use in predictive flood modelling and the standardisation of data workflows. By leveraging FME, SARA can more accurately predict flood risks, improving public safety and response times.
This innovation not only enhances internal operations but also helps SARA lead in sustainable water management. With its versatility in handling diverse data sources and streamlining communication between systems, FME is a powerful investment for organisations seeking to improve operational efficiency and long-term strategic decision-making.
In this episode, Debbie Reynolds, Founder and Chief Data Privacy Officer at Debbie Reynolds Consulting, speaks to Jordan Merson, Enterprise Applications Supervisor at San Antonio River Authority, about the game-changing impact of FME.
Key Takeaways:
● Data management challenges often stem from a lack of standardisation.
● FME allows for the integration of various data sources seamlessly.
● Predictive modelling can enhance emergency response efforts.
● FME provides tools for real-time data monitoring and alerts.
● The user-friendly interface of FME accelerates onboarding for new team members.
● FME can handle both spatial and non-spatial data effectively.
● Collaboration and knowledge sharing are key to successful data management.

Chapters:
00:00 - Introduction to Data Management and FME
02:30 - Jordan's Journey in IT and Data Management
05:43 - Challenges Before FME Implementation
08:36 - Transformative Impact of FME on Data Processes
10:02 - Real-World Applications of FME at San Antonio River Authority
14:06 - Predictive Flood Modeling and Emergency Operations
16:31 - Standardization and Efficiency with FME
17:59 - Final Thoughts and Recommendations on FME
-
Summary
This discussion explores the complexities and strategies surrounding edge computing and data management, highlighting the importance of security, the challenges of vendor lock-in, the implications of data repatriation, and the necessity of ensuring high-quality data for AI systems. It emphasises the need for organisations to balance edge processing with centralised storage while future-proofing their data strategies against rapid technological changes.
Building on this discussion, Jimmy Tam highlights the transformative role of edge computing in modern data management, emphasising the importance of governance, compliance, and interoperability to address the challenges of data sprawl and vendor lock-in.
Takeaways
● Edge computing is transforming how organisations manage data.
● Security at the edge is paramount to prevent intrusions.
● Data sprawl poses significant challenges for edge data management.
● Governance and compliance are essential for effective data management.
● Vendor lock-in can limit flexibility and adaptability in technology.
● Data interoperability is crucial for avoiding vendor lock-in.
● Data repatriation is a growing trend among organisations.
● AI systems require access to comprehensive data for training.
● Speed of data relevance is critical for effective AI applications.
● Flexibility in data strategies is essential for future-proofing organisations.

Sound Bites
"Data sprawl is a significant problem."
"Governance and compliance are crucial."
"Data repatriation is absolutely real."
"Speed of data relevance is critical."
Chapters
00:00 Introduction to Edge Computing and Data Management
02:53 Security Strategies for Edge Data
06:06 Vendor Lock-In and Data Interoperability
09:00 Data Repatriation and Cost Optimisation
11:57 Ensuring Quality Data for AI Systems
14:46 Balancing Edge Processing and Centralised Storage
17:59 Future-Proofing Data Strategies
-
In today’s data-driven world, real-time analytics has become a cornerstone for businesses seeking to make smarter, faster decisions. From enhancing user experiences to enabling continuous intelligence, the ability to process data in real-time is transforming industries. Yet, challenges such as legacy systems and the demand for innovative data management approaches persist.
This episode explores the evolution of real-time analytics and its crucial role in modern data processing. We delve into how technology is reshaping the way businesses interact with data and the importance of user-centric design in creating powerful data applications.
Joining Christina Stathopoulos, Founder of Dare to Data, is Rahul Rastogi, Chief Innovation Officer at SingleStore. Together, they discuss the necessity of real-time data in today’s fast-paced business environment, tackle the challenges organizations face in adapting to this shift, and highlight how data serves as the foundation for AI-driven innovation.
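As a small illustration of what "real-time" means here, the sketch below keeps a rolling per-user metric up to date as events arrive, instead of recomputing it in a nightly batch. The event shape and one-minute window are illustrative assumptions rather than SingleStore functionality.

```python
# Minimal sketch of continuous, event-driven analytics: maintain a rolling
# per-user metric as events arrive. Event fields and the 60-second window
# are illustrative assumptions.
from collections import defaultdict, deque

WINDOW_SECONDS = 60
events_by_user: dict[str, deque] = defaultdict(deque)


def record_event(user_id: str, amount: float, ts: float) -> float:
    """Ingest one event and return the user's spend over the last minute."""
    window = events_by_user[user_id]
    window.append((ts, amount))
    # Evict events that have fallen out of the rolling window.
    while window and window[0][0] < ts - WINDOW_SECONDS:
        window.popleft()
    return sum(amount for _, amount in window)


# Events are processed as they arrive; each call reflects the latest state.
print(record_event("u1", 10.0, ts=0))   # 10.0
print(record_event("u1", 5.0, ts=30))   # 15.0
print(record_event("u1", 2.0, ts=90))   # 7.0 (the ts=0 event has expired)
```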
Don’t miss this insightful discussion packed with practical strategies and forward-looking ideas!
Key Takeaways
● Real-time analytics has evolved from a luxury to a necessity.
● Streaming technologies like Kafka and Spark have revolutionized data processing.
● Legacy systems are often monolithic and ill-suited for real-time analytics.
● Modern data platforms enable easier data management and integration.
● Continuous intelligence requires a solid analytics foundation.
● User experience is critical for the adoption of data applications.
● Organizations must treat data as a valuable asset.
● Data governance and quality are essential for effective analytics.
● The separation of compute from storage enhances scalability.
● Real-time processing with low latency improves user satisfaction.

Chapters
00:00 - Introduction to Real-Time Analytics
06:14 - The Evolution of Technology in Data Processing
10:09 - Challenges of Legacy Systems
14:23 - Innovative Approaches to Data Management
18:06 - Building a Foundation for AI Innovations
21:27 - User Experience in Data Applications
-
The convergence of Master Data Management (MDM) and Artificial Intelligence (AI) is transforming how businesses harness data to drive innovation and efficiency. MDM provides the foundation by organising, standardising, and maintaining critical business data, ensuring consistency and accuracy across an organisation.
When paired with AI, this clean and structured data becomes a powerful asset, enabling advanced analytics, predictive insights, and intelligent automation. MDM and AI help businesses uncover hidden patterns, streamline operations, and make more informed decisions in real-time.
By integrating MDM with AI, organisations can move beyond simply managing data to actively leveraging it for competitive advantage. AI algorithms thrive on high-quality, well-structured data, and MDM ensures just that—minimising errors and redundancies that could compromise results. This synergy empowers companies to personalise customer experiences, optimise supply chains, and respond proactively to market changes.
In this episode, Kevin Petrie, VP of Research at BARC US, speaks to Jesper Grode, Director of Product Innovation at Stibo Systems, about the intersection between AI and MDM.
Key Takeaways:
● AI and master data management should be integrated for better outcomes.
● Master data improves the quality of inputs for AI models.
● Accurate data is crucial for training machine learning models.
● Generative AI can enhance product launch processes.
● Prompt engineering is essential for generating accurate AI responses.
● AI can optimise MDM processes and reduce operational costs.
● Fast prototyping is vital for successful AI implementation.

Chapters:
00:00 - Introduction to AI and Master Data Management
02:59 - The Synergy Between AI and Master Data
05:49 - Generative AI and Master Data Management
09:12 - Leveraging Master Data for Small Language Models
11:58 - AI's Role in Optimizing Master Data Management
14:53 - Best Practices for Implementing AI in MDM
-
As cloud adoption grows, so do the challenges of managing costs effectively. Cloud environments offer scalability and flexibility but often come with hidden fees, unpredictable expenses, and resource sprawl that can quickly inflate budgets. Without the right tools and strategies, businesses may struggle to track spending, identify waste, and maintain budget alignment.
Usage-based reporting is pivotal in this process, providing the granular visibility needed to understand real-time consumption patterns and optimise costs. Businesses can align expenses directly with value-driven activities by tracking how, where, and when resources are used. From preventing overspending to fostering accountability, usage-based reporting empowers teams to proactively manage their cloud expenses, turning cloud cost management into a strategic advantage rather than a recurring headache.
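A toy example of usage-based reporting: raw metered usage is rolled up by team so spend can be attributed to the activity that drove it. The record fields, rates, and tags are illustrative assumptions, not Vantage's data model.

```python
# Minimal sketch of usage-based reporting: attribute cloud cost to teams from
# their metered consumption. Fields and rates are illustrative assumptions.
from collections import defaultdict

usage_records = [
    {"team": "payments", "service": "compute", "hours": 120, "rate": 0.40},
    {"team": "payments", "service": "storage", "gb_months": 500, "rate": 0.02},
    {"team": "search", "service": "compute", "hours": 300, "rate": 0.40},
]


def cost_by_team(records: list[dict]) -> dict[str, float]:
    """Roll metered usage up into a cost figure per team."""
    totals: dict[str, float] = defaultdict(float)
    for r in records:
        quantity = r.get("hours", r.get("gb_months", 0))
        totals[r["team"]] += quantity * r["rate"]
    return dict(totals)


print(cost_by_team(usage_records))
# {'payments': 58.0, 'search': 120.0}
```

Reports like this give finance and engineering a shared, usage-grounded view of spend, which is the accountability the episode describes.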
In this episode, George Firican, Founder of LightsOnData, speaks to Rem Baumann, Resident FinOps Expert at Vantage, about usage-based reporting and its benefits.
Key Takeaways:
● Organisations face challenges in tracking complex cloud costs.
● Usage-based reporting provides context to cloud spending.
● Metrics should align with business goals for effective decision-making.
● Communication between finance and engineering teams is crucial.
● Identifying cost optimisation opportunities can lead to significant savings.
● Different industries require customised cost metrics.
● Cloud providers offer basic tools, but deeper insights are needed.
● Regular monitoring of metrics ensures financial transparency.

Chapters:
00:00 - Introduction to Cloud Cost Management
03:03 - Understanding Cloud Complexity and Cost Tracking
05:53 - The Role of Usage-Based Reporting
09:06 - Metrics for Cost Optimization
12:02 - Industry-Specific Applications of Cost Metrics
14:49 - Aligning Cloud Costs with Business Goals
18:09 - Conclusion and Key Takeaways
-
Data custodianship today involves managing and protecting vast quantities of sensitive information, requiring organisations to ensure security, regulatory compliance, and ethical usage. It’s not just about protecting data from breaches but also about responsible storage, access, and deletion that aligns with strict industry standards and evolving privacy regulations.
The ethical dimensions of data custodianship add further complexity as organisations balance the need for data-driven insights with privacy rights and transparent usage. Mismanagement can lead to significant financial, legal, and reputational risks, making effective custodianship essential for maintaining customer trust and regulatory compliance.
In this episode, Paulina Rios Maya, Head of Industry Relations, speaks to Debbie Reynolds, Founder and Chief Data Privacy Officer at Debbie Reynolds Consulting, about compliance with global regulations, the role of AI in data management, and the necessity of human oversight in technology.
Key Takeaways:
● Data custodianship emphasises that data belongs to individuals, not companies.
● Organisations must have a comprehensive plan for data management throughout its lifecycle.
● Transparency and communication with consumers are essential in data handling.
● Different types of data require different levels of protection based on risk.
● Building trust with consumers requires responsible data practices.
● Organisations need to prioritise basic data protection strategies over compliance with every regulation.

Chapters:
00:00 - Introduction to Data Custodianship
03:03 - Understanding Responsibilities in Data Handling
05:59 - Balancing Innovation and Data Protection
08:45 - Building Trust Through Responsible Data Practices
12:07 - Navigating Compliance and Data Governance
14:54 - Leveraging AI for Enhanced Data Custodianship
18:06 - The Role of Humans in Technology and Data Management
-
Generative AI and unstructured data are transforming how businesses improve customer experiences and streamline internal processes. As technology evolves, companies find new ways to gain insights, automate tasks, and personalize interactions, unlocking new growth opportunities.
The integration of these technologies is reshaping operations, driving efficiency, and enhancing decision-making, helping businesses stay competitive and agile in a rapidly changing landscape. Organizations that embrace these innovations can better adapt to customer needs and market demands, positioning themselves for long-term success.
In this episode, Doug Laney speaks to Katrina M. Conn, Senior Practice Director of Data Science at Teradata, and Sri Raghavan, Principal of Data Science and Analytics at AWS, about sustainability efforts and the ethical considerations surrounding AI.
Key Takeaways:
● Generative AI is being integrated into various business solutions.
● Unstructured data is crucial for enhancing customer experiences.
● Real-time analytics can improve customer complaint resolution.
● Sustainability is a key focus in AI resource management.
● Explainability in AI models is essential for ethical decision-making.
● The combination of structured and unstructured data enhances insights.
● AI innovations are making analytics more accessible to users.
● Trusted AI frameworks are vital for security and governance.

Chapters:
00:00 - Introduction to the Partnership and Generative AI
02:50 - Technological Integration and Market Expansion
06:08 - Leveraging Unstructured Data for Insights
08:55 - Innovations in Customer Experience and Internal Processes
11:48 - Sustainability and Resource Optimization in AI
15:08 - Ensuring Ethical AI and Explainability
23:57 - Conclusion and Future Directions
-
In this episode, Rachel Thornton, Fivetran's CMO, discusses the highlights of Big Data London 2024, including the launch of Fivetran Hybrid Deployment, which addresses the needs of organisations with mixed IT environments.
The conversation delves into integrating AI into business operations, emphasizing the importance of a robust data foundation. Additionally, data security and compliance challenges in the context of GDPR and other regulations are explored. The episode concludes with insights on the benefits of hybrid deployment for organisations.
Key Takeaways:
● Big Data London 2024 is a significant event for data leaders.
● Fivetran Hybrid Deployment caters to organizations with mixed IT environments.
● AI integration requires a strong data foundation.
● Data security and compliance are critical in today's landscape.
● Organizations must understand their data sources for effective AI use.
● Hybrid deployment allows for secure data management.
● Compliance regulations are becoming increasingly stringent.
● Data readiness is essential for AI integration.

Chapters:
00:00 - Introduction to Big Data London 2024
02:46 - Launch of Fivetran Hybrid Deployment
06:06 - Integrating AI into Business Operations
08:54 - Data Security and Compliance Challenges
11:50 - Benefits of Hybrid Deployment
-
Managing network traffic efficiently is essential to control cloud costs. Network flow reports are critical in providing detailed insights into data movement across cloud environments. These reports help organisations identify usage patterns, track bandwidth consumption, and uncover inefficiencies that may lead to higher expenses.
With a clear understanding of how data flows, businesses can make informed decisions to optimise traffic, reduce unnecessary data transfers, and allocate resources more effectively. This helps lower cloud costs, improves network performance, and enhances security by revealing unusual or potentially harmful traffic patterns.
In this episode, Wayne Eckerson from Eckerson Group speaks to Ben Schaechter, CEO of Vantage, about optimising network traffic costs with Vantage’s Network Flow Reports.
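To illustrate the kind of question a network flow report answers, the sketch below rolls flow-style records up by source and destination and ranks the routes by estimated transfer cost. The records, flat per-GB rate, and field names are illustrative assumptions, not Vantage's or AWS's actual schema or pricing.

```python
# Minimal sketch of a network flow report: find the (source, destination)
# pairs driving the most data-transfer cost. Records, field names, and the
# flat $/GB rate are illustrative assumptions.
from collections import defaultdict

flow_records = [
    {"src": "app-server", "dst": "nat-gateway", "bytes": 40 * 1024**3},
    {"src": "app-server", "dst": "s3-endpoint", "bytes": 5 * 1024**3},
    {"src": "batch-job", "dst": "nat-gateway", "bytes": 120 * 1024**3},
]

COST_PER_GB = 0.045  # assumed flat processing rate for the sketch


def top_cost_drivers(records: list[dict]) -> list[tuple[str, float]]:
    """Estimate cost per route and sort the worst offenders first."""
    cost: dict[tuple[str, str], float] = defaultdict(float)
    for r in records:
        gb = r["bytes"] / 1024**3
        cost[(r["src"], r["dst"])] += gb * COST_PER_GB
    return sorted(
        ((f"{s} -> {d}", c) for (s, d), c in cost.items()),
        key=lambda item: item[1],
        reverse=True,
    )


for route, dollars in top_cost_drivers(flow_records):
    print(f"{route}: ${dollars:.2f}")
# batch-job -> nat-gateway tops the list, pointing at where to optimise first.
```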
Key Takeaways:
● Network Flow Reports provide detailed insights into AWS costs.
● They help identify specific resources driving network traffic costs.
● Organisations can reduce costs by up to 90% with proper configuration.
● The shift towards cost management in cloud services is critical.
● FinOps teams are becoming essential for cloud cost optimization.
● Anomaly detection can alert teams to unexpected cost spikes.
● Vantage integrates with multiple cloud providers for comprehensive cost management.
● Effective cost management does not have to impact production workflows.
Chapters:
00:00 - Introduction to Vantage and Network Flow Reports
02:52 - Understanding Network Flow Reports and Their Impact
06:09 - Real-World Applications and Case Studies
09:03 - The Shift in Cost Management Focus
11:54 - Tangible Benefits of Implementing Network Flow Reports
15:07 - The Role of FinOps in Cost Optimization
18:00 - Conclusion and Future Insights
-