Episodes

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    - Effective data science education requires feedback and rapid iteration.

    - Building LLM applications presents unique challenges and opportunities.

    - The software development lifecycle for AI differs from traditional methods.

    - Collaboration between data scientists and software engineers is crucial.

    - Hugo's new course focuses on practical applications of LLMs.

    - Continuous learning is essential in the fast-evolving tech landscape.

    - Engaging learners through practical exercises enhances education.

    - POC purgatory refers to the challenges faced in deploying LLM-powered software.

    - Focusing on first principles can help overcome integration issues in AI.

    - Aspiring data scientists should prioritize problem-solving over specific tools.

    - Engagement with different parts of an organization is crucial for data scientists.

    - Quick paths to value generation can help gain buy-in for data projects.

    - Multimodal models are an exciting trend in AI development.

    - Probabilistic programming has potential for future growth in data science.

    - Continuous learning and curiosity are vital in the evolving field of data science.

    Chapters:

    09:13 Hugo's Journey in Data Science and Education

    14:57 The Appeal of Bayesian Statistics

    19:36 Learning and Teaching in Data Science

    24:53 Key Ingredients for Effective Data Science Education

    28:44 Podcasting Journey and Insights

    36:10 Building LLM Applications: Course Overview

    42:08 Navigating the Software Development Lifecycle

    48:06 Overcoming Proof of Concept Purgatory

    55:35 Guidance for Aspiring Data Scientists

    01:03:25 Exciting Trends in Data Science and AI

    01:10:51 Balancing Multiple Roles in Data Science

    01:15:23 Envisioning Accessible Data Science for All

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    - CFA is commonly used in psychometrics to validate theoretical constructs (the basic measurement model is sketched after this list).

    - Theoretical structure is crucial in confirmatory factor analysis.

    - Bayesian approaches offer flexibility in modeling complex relationships.

    - Model validation involves both global and local fit measures.

    - Sensitivity analysis is vital in Bayesian modeling to avoid skewed results.

    - Complex models should be justified by their ability to answer specific questions.

    - The choice of model complexity should balance fit and theoretical relevance.

    - Fitting models to real data builds confidence in their validity.

    - Divergences in model fitting indicate potential issues with model specification.

    - Factor analysis can help clarify causal relationships between variables.

    - Survey data is a valuable resource for understanding complex phenomena.

    - Philosophical training enhances logical reasoning in data science.

    - Causal inference is increasingly recognized in industry applications.

    - Effective communication is essential for data scientists.

    - Understanding confounding is crucial for accurate modeling.
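
    For readers new to CFA, a compact way to write the measurement model referred to above is the standard textbook formulation (general background, not specific to this episode): x_i is the vector of observed indicators for respondent i, eta_i the latent factors, Lambda the loadings, and Theta the (usually diagonal) residual covariance.

        x_i = \nu + \Lambda \eta_i + \varepsilon_i, \qquad \eta_i \sim \mathcal{N}(0, \Psi), \qquad \varepsilon_i \sim \mathcal{N}(0, \Theta)

    A Bayesian treatment simply places priors on \nu, \Lambda, \Psi, and \Theta, which is what makes the prior sensitivity analyses mentioned above both possible and necessary.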

    Chapters:

    10:11 Understanding Structural Equation Modeling (SEM) and Confirmatory Factor Analysis (CFA)

    20:11 Application of SEM and CFA in HR Analytics

    30:10 Challenges and Advantages of Bayesian Approaches in SEM and CFA

    33:58 Evaluating Bayesian Models

    39:50 Challenges in Model Building

    44:15 Causal Relationships in SEM and CFA

    49:01 Practical Applications of SEM and CFA

    51:47 Influence of Philosophy on Data Science

    54:51 Designing Models with Confounding in Mind

    57:39 Future Trends in Causal Inference

    01:00:03 Advice for Aspiring Data Scientists

    01:02:48 Future Research Directions

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy,

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    -------------------------

    Love the insights from this episode? Make sure you never miss a beat with Chatpods! Whether you're commuting, working out, or just on the go, Chatpods lets you capture and summarize key takeaways effortlessly.

    Save time, stay organized, and keep your thoughts at your fingertips.

    Download Chatpods directly from App Store or Google Play and use it to listen to this podcast today!

    https://www.chatpods.com/?fr=LearningBayesianStatistics

    -------------------------

    Takeaways:

    - Epidemiology focuses on health at various scales, while biology often looks at micro-level details.

    - Bayesian statistics helps connect models to data and quantify uncertainty.

    - Recent advancements in data collection have improved the quality of epidemiological research.

    - Collaboration between domain experts and statisticians is essential for effective research.

    - The COVID-19 pandemic has led to increased data availability and international cooperation.

    - Modeling infectious diseases requires understanding complex dynamics and statistical methods (the classic SIR model is sketched after this list).

    - Challenges in coding and communication between disciplines can hinder progress.

    - Innovations in machine learning and neural networks are shaping the future of epidemiology.

    - Understanding the context and limitations of data is essential in research.
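
    As a concrete anchor for the "complex dynamics" mentioned above, many infectious-disease analyses build on the classic SIR compartmental model (a textbook formulation offered here as background, not taken from the episode itself), where S, I, and R are the susceptible, infected, and recovered counts, N = S + I + R, beta is the transmission rate, and gamma the recovery rate:

        \frac{dS}{dt} = -\beta \frac{S I}{N}, \qquad \frac{dI}{dt} = \beta \frac{S I}{N} - \gamma I, \qquad \frac{dR}{dt} = \gamma I

    Bayesian inference then amounts to placing priors on \beta and \gamma (and on reporting or observation noise) and updating them with case data.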

    Chapters:

    00:00 Introduction to Bayesian Statistics and Epidemiology

    03:35 Guest Backgrounds and Their Journey

    10:04 Understanding Computational Biology vs. Epidemiology

    16:11 The Role of Bayesian Statistics in Epidemiology

    21:40 Recent Projects and Applications in Epidemiology

    31:30...

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    - Bob's research focuses on corruption and political economy.

    - Measuring corruption is challenging due to the unobservable nature of the behavior.

    - The challenge of studying corruption lies in obtaining honest data.

    - Innovative survey techniques, like randomized response, can help gather sensitive data (a simple variant is sketched after this list).

    - Non-traditional backgrounds can enhance statistical research perspectives.

    - Bayesian methods are particularly useful for estimating latent variables.

    - Bayesian methods shine in situations with prior information.

    - Expert surveys can help estimate uncertain outcomes effectively.

    - Bob's novel, 'The Bayesian Hitman,' explores academia through a fictional lens.

    - Writing fiction can enhance academic writing skills and creativity.

    - The importance of community in statistics is emphasized, especially in the Stan community.

    - Real-time online surveys could revolutionize data collection in social science.
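
    To make the randomized-response idea above concrete, here is one simple "forced response" variant (a standard textbook design, not necessarily the exact one discussed in the episode): each respondent privately answers the sensitive question truthfully with probability p and is forced to answer "yes" with probability 1 - p. If \bar{y} is the observed share of "yes" answers and \pi the true prevalence of the sensitive behavior, then

        \Pr(\text{yes}) = p\,\pi + (1 - p) \quad\Longrightarrow\quad \hat{\pi} = \frac{\bar{y} - (1 - p)}{p}

    Because no one can tell whether a given "yes" was forced or truthful, respondents get plausible deniability, while the analyst can still estimate \pi; a Bayesian version simply treats \pi as a latent parameter with a prior.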

    Chapters:

    00:00 Introduction to Bayesian Statistics and Bob Kubinec

    06:01 Bob's Academic Journey and Research Focus

    12:40 Measuring Corruption: Challenges and Methods

    18:54 Transition from Government to Academia

    26:41 The Influence of Non-Traditional Backgrounds in Statistics

    34:51 Bayesian Methods in Political Science Research

    42:08 Bayesian Methods in COVID Measurement

    51:12 The Journey of Writing a Novel

    01:00:24 The Intersection of Fiction and Academia

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell,...

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    - User experience is crucial for the adoption of Stan.

    - Recent innovations include adding tuples to the Stan language, new features, and improved error messages.

    - Tuples allow for more efficient data handling in Stan.

    - Beginners often struggle with the compiled nature of Stan.

    - Improving error messages is crucial for user experience.

    - BridgeStan allows for integration with other programming languages and makes it very easy for people to use Stan models.

    - Community engagement is vital for the development of Stan.

    - New samplers are being developed to enhance performance.

    - The future of Stan includes more user-friendly features.

    Chapters:

    00:00 Introduction to the Live Episode

    02:55 Meet the Stan Core Developers

    05:47 Brian Ward's Journey into Bayesian Statistics

    09:10 Charles Margossian's Contributions to Stan

    11:49 Recent Projects and Innovations in Stan

    15:07 User-Friendly Features and Enhancements

    18:11 Understanding Tuples and Their Importance

    21:06 Challenges for Beginners in Stan

    24:08 Pedagogical Approaches to Bayesian Statistics

    30:54 Optimizing Monte Carlo Estimators

    32:24 Reimagining Stan's Structure

    34:21 The Promise of Automatic Reparameterization

    35:49 Exploring BridgeStan

    40:29 The Future of Samplers in Stan

    43:45 Evaluating New Algorithms

    47:01 Specific Algorithms for Unique Problems

    50:00 Understanding Model Performance

    54:21 The Impact of Stan on Bayesian Research

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin...

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    - Designing experiments is about optimal data gathering.

    - The optimal design maximizes the amount of information (formalized as expected information gain after this list).

    - The best experiment reduces uncertainty the most.

    - Computational challenges limit the feasibility of BED in practice.

    - Amortized Bayesian inference can speed up computations.

    - A good underlying model is crucial for effective BED.

    - Adaptive experiments are more complex than static ones.

    - The future of BED is promising with advancements in AI.
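
    "Maximizing the amount of information" has a standard formalization in Bayesian experimental design, given here as general background in the usual notation rather than as anything specific to the episode: the optimal design d maximizes the expected information gain (EIG), i.e., the mutual information between the parameters theta and the not-yet-observed outcome y.

        \mathrm{EIG}(d) = \mathbb{E}_{p(y \mid d)}\Big[\mathrm{KL}\big(p(\theta \mid y, d)\,\|\,p(\theta)\big)\Big] = \mathbb{E}_{p(\theta)\,p(y \mid \theta, d)}\big[\log p(y \mid \theta, d) - \log p(y \mid d)\big]

    The nested expectation over p(y | d) is exactly what makes EIG expensive to compute, which is where the amortized and approximate methods mentioned above come in.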

    Chapters:

    00:00 Introduction to Bayesian Experimental Design

    07:51 Understanding Bayesian Experimental Design

    19:58 Computational Challenges in Bayesian Experimental Design

    28:47 Innovations in Bayesian Experimental Design

    40:43 Practical Applications of Bayesian Experimental Design

    52:12 Future of Bayesian Experimental Design

    01:01:17 Real-World Applications and Impact

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov,...

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    - Building an athlete management system and a scouting and recruitment platform are key goals in football analytics.

    - The focus is on informing training decisions, preventing injuries, and making smart player signings.

    - Avoiding false positives in player evaluations is crucial, and data analysis plays a significant role in making informed decisions.

    - There are similarities between different football teams, and the sport has social and emotional aspects.

    - Transitioning from on-premises SQL servers to cloud-based systems is a significant endeavor in football analytics.

    - Analytics is a tool that aids the decision-making process and helps mitigate biases.

    - The impact of analytics in soccer can be seen in the decline of long-range shots.

    - Collaboration and trust between analysts and decision-makers are crucial for successful implementation of analytics.

    - The limitations of available data in football analytics hinder the ability to directly measure decision-making on the field.

    - Analyzing the impact of coaches in sports analytics is challenging due to the difficulty of separating their effect from other factors.

    - Current data limitations make it hard to evaluate coaching performance accurately.

    - Predictive metrics and modeling play a crucial role in soccer analytics, especially in predicting the career progression of young players.

    - Improving tracking data and expanding its availability will be a significant focus in the future of soccer analytics.

    Chapters:

    00:00 Introduction to Ravi and His Role at Seattle Sounders 

    06:30 Building an Analytics Department

    15:00 The Impact of Analytics on Player Recruitment and Performance 

    28:00 Challenges and Innovations in Soccer Analytics 

    42:00 Player Health, Injury Prevention, and Training 

    55:00 The Evolution of Data-Driven Strategies

    01:10:00 Future of Analytics in Sports

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson,

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    - State space models and traditional time series models are well-suited to forecast loss ratios in the insurance industry, although actuaries have been slow to adopt modern statistical methods.

    - Working with limited data is a challenge, but informed priors and hierarchical models can help improve the modeling process.

    - Bayesian model stacking allows for blending together different model predictions and taking the best of both (or all, if more than two models) worlds. A minimal stacking sketch follows this list.

    - Model comparison is done using out-of-sample performance metrics, such as the expected log pointwise predictive density (ELPD). Brute-force leave-future-out cross-validation is often used due to the time-series nature of the data.

    - Stacking or averaging models are trained on out-of-sample performance metrics to determine the weights for blending the predictions.

    - Model stacking can be a powerful approach for combining predictions from candidate models. Hierarchical stacking in particular is useful when weights are assumed to vary according to covariates.

    - BayesBlend is a Python package developed by Ledger Investing that simplifies the implementation of stacking models, including pseudo Bayesian model averaging, stacking, and hierarchical stacking.

    - Evaluating the performance of these time series models requires considering multiple metrics, including log likelihood-based metrics like ELPD, as well as more absolute metrics like RMSE and mean absolute error.

    - Using robust variants of metrics like ELPD can help address issues with extreme outliers. For example, t-distribution estimators of ELPD as opposed to sample sum/mean estimators.

    - It is important to evaluate model performance from different perspectives and consider the trade-offs between different metrics.

    - Evaluating models based solely on traditional metrics can limit understanding and trust in the model. Consider additional factors such as interpretability, maintainability, and productionization.

    - Simulation-based calibration (SBC) is a valuable tool for assessing parameter estimation and model correctness. It allows for the interpretation of model parameters and the identification of coding errors.

    - In industries like insurance, where regulations may restrict model choices, classical statistical approaches still play a significant role. However, there is potential for Bayesian methods and generative AI in certain areas.
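
    As a rough illustration of ELPD-based stacking, here is a sketch that uses ArviZ's PSIS-LOO stacking on a built-in example dataset rather than BayesBlend or leave-future-out cross-validation, so treat it as a stand-in for the workflow described above, not as the episode's actual code:

        import arviz as az

        # Two fits of the same data, standing in for competing candidate models.
        idata_a = az.load_arviz_data("centered_eight")
        idata_b = az.load_arviz_data("non_centered_eight")

        # Estimate each model's ELPD with PSIS-LOO and pick stacking weights that
        # maximize the ELPD of the blended predictive distribution.
        comparison = az.compare(
            {"centered": idata_a, "non_centered": idata_b},
            ic="loo",
            method="stacking",
        )
        print(comparison)  # the "weight" column holds the stacking weights

    BayesBlend wraps the same idea, plus pseudo Bayesian model averaging and hierarchical stacking, behind a dedicated API, and for time-series problems the ELPD would come from leave-future-out rather than leave-one-out cross-validation.
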
  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    - Education and visual communication are key in helping athletes understand the impact of nutrition on performance.

    - Bayesian statistics are used to analyze player performance and injury risk.

    - Integrating diverse data sources is a challenge but can provide valuable insights.

    - Understanding the specific needs and characteristics of athletes is crucial in conditioning and injury prevention.

    - The application of Bayesian statistics in baseball science requires experts in Bayesian methods.

    - Traditional statistical methods taught in sports science programs are limited.

    - Communicating complex statistical concepts, such as Bayesian analysis, to coaches and players is crucial.

    - Conveying uncertainties and limitations of the models is essential for effective utilization.

    - Emerging trends in baseball science include the use of biomechanical information and computer vision algorithms.

    - Improving player performance and injury prevention are key goals for the future of baseball science.

    Chapters:

    00:00 The Role of Nutrition and Conditioning

    05:46 Analyzing Player Performance and Managing Injury Risks

    12:13 Educating Athletes on Dietary Choices

    18:02 Emerging Trends in Baseball Science

    29:49 Hierarchical Models and Player Analysis

    36:03 Challenges of Working with Limited Data

    39:49 Effective Communication of Statistical Concepts

    47:59 Future Trends: Biomechanical Data Analysis and Computer Vision Algorithms

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde,...

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    - Bayesian statistics is a powerful framework for handling complex problems, making use of prior knowledge, and excelling with limited data.

    - Bayesian statistics provides a framework for updating beliefs and making predictions based on prior knowledge and observed data.

    - Bayesian methods allow for the explicit incorporation of prior assumptions, which can provide structure and improve the reliability of the analysis.

    - There are several Bayesian frameworks available, such as PyMC, Stan, and Bambi, each with its own strengths and features.

    - PyMC is a powerful library for Bayesian modeling that allows for flexible and efficient computation (a minimal example follows this list).

    - For beginners, it is recommended to start with introductory courses or resources that provide a step-by-step approach to learning Bayesian statistics.

    - PyTensor leverages GPU acceleration and complex graph optimizations to improve the performance and scalability of Bayesian models.

    - ArviZ is a library for post-modeling workflows in Bayesian statistics, providing tools for model diagnostics and result visualization.

    - Gaussian processes are versatile non-parametric models that can be used for spatial and temporal data analysis in Bayesian statistics.
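
    For readers who have never seen PyMC, here is a deliberately tiny sketch of the workflow described above, from priors to sampling to ArviZ summaries; the data are simulated purely for illustration:

        import numpy as np
        import pymc as pm
        import arviz as az

        # Simulated data: 20 noisy measurements of an unknown quantity.
        rng = np.random.default_rng(42)
        y = rng.normal(loc=1.5, scale=1.0, size=20)

        with pm.Model():
            mu = pm.Normal("mu", mu=0.0, sigma=5.0)        # weakly informative prior
            sigma = pm.HalfNormal("sigma", sigma=2.0)
            pm.Normal("y_obs", mu=mu, sigma=sigma, observed=y)
            idata = pm.sample(1000, tune=1000, random_seed=42)

        print(az.summary(idata, var_names=["mu", "sigma"]))

    Bambi adds a formula interface on top of PyMC for regression-style models, and ArviZ handles the post-sampling diagnostics and visualization mentioned in the chapters below.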

    Chapters:

    00:00 Introduction to Bayesian Statistics

    07:32 Advantages of Bayesian Methods

    16:22 Incorporating Priors in Models

    23:26 Modeling Causal Relationships

    30:03 Introduction to PyMC, Stan, and Bambi

    34:30 Choosing the Right Bayesian Framework

    39:20 Getting Started with Bayesian Statistics

    44:39 Understanding Bayesian Statistics and PyMC

    49:01 Leveraging PyTensor for Improved Performance and Scalability

    01:02:37 Exploring Post-Modeling Workflows with ArviZ

    01:08:30 The Power of Gaussian Processes in Bayesian Modeling

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna,...

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    - Teaching Bayesian Concepts Using M&Ms: Tomi Capretto uses an engaging classroom exercise involving M&Ms to teach Bayesian statistics, making abstract concepts tangible and intuitive for students.

    - Practical Applications of Bayesian Methods: Discussion on the real-world application of Bayesian methods in projects at PyMC Labs and in university settings, emphasizing the practical impact and accessibility of Bayesian statistics.

    - Contributions to Open-Source Software: Tomi’s involvement in developing Bambi and other open-source tools demonstrates the importance of community contributions to advancing statistical software (a small Bambi example follows this list).

    - Challenges in Statistical Education: Tomi talks about the challenges and rewards of teaching complex statistical concepts to students who are accustomed to frequentist approaches, highlighting the shift to thinking probabilistically in Bayesian frameworks.

    - Future of Bayesian Tools: The discussion also touches on the future enhancements for Bambi and PyMC, aiming to make these tools more robust and user-friendly for a wider audience, including those who are not professional statisticians.
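
    A minimal sketch of the kind of formula-based workflow Bambi enables; the dataset and column names here are invented for illustration and are not from the episode:

        import numpy as np
        import pandas as pd
        import bambi as bmb
        import arviz as az

        # Fake classroom data: exam scores, study hours, and a grouping variable.
        rng = np.random.default_rng(0)
        df = pd.DataFrame({
            "hours": rng.normal(5, 2, size=120),
            "group": rng.choice(["a", "b", "c"], size=120),
        })
        df["score"] = 2.0 + 0.7 * df["hours"] + rng.normal(0, 1, size=120)

        # Fixed effect of hours, varying intercept by group, all in one formula.
        model = bmb.Model("score ~ hours + (1|group)", df)
        idata = model.fit(draws=1000, tune=1000)

        print(az.summary(idata))

    Under the hood this builds and samples a PyMC model, which is why improvements to PyMC and PyTensor flow directly into Bambi users' hands.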

    Chapters:

    05:36 Tomi's Work and Teaching

    10:28 Teaching Complex Statistical Concepts with Practical Exercises

    23:17 Making Bayesian Modeling Accessible in Python

    38:46 Advanced Regression with Bambi

    41:14 The Power of Linear Regression

    42:45 Exploring Advanced Regression Techniques

    44:11 Regression Models and Dot Products

    45:37 Advanced Concepts in Regression

    46:36 Diagnosing and Handling Overdispersion

    47:35 Parameter Identifiability and Overparameterization

    50:29 Visualizations and Course Highlights

    51:30 Exploring Niche and Advanced Concepts

    56:56 The Power of Zero-Sum Normal

    59:59 The Value of Exercises and Community

    01:01:56 Optimizing Computation with Sparse Matrices

    01:13:37 Avoiding MCMC and Exploring Alternatives

    01:18:27 Making Connections Between Different Models

    Thank you to my Patrons for making this episode...

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    - Communicating Bayesian concepts to non-technical audiences in sports analytics can be challenging, but it is important to provide clear explanations and address limitations.

    - Understanding the model and its assumptions is crucial for effective communication and decision-making.

    - Involving domain experts, such as scouts and coaches, can provide valuable insights and improve the model's relevance and usefulness.

    - Customizing the model to align with the specific needs and questions of the stakeholders is essential for successful implementation.

    - Understanding the needs of decision-makers is crucial for effectively communicating and utilizing models in sports analytics.

    - Predicting the impact of training loads on athletes' well-being and performance is a challenging frontier in sports analytics.

    - Identifying discrete events in team sports data is essential for analysis and development of models.

    Chapters:

    00:00 Bayesian Statistics in Sports Analytics

    18:29 Applying Bayesian Stats in Analyzing Player Performance and Injury Risk

    36:21 Challenges in Communicating Bayesian Concepts to Non-Statistical Decision-Makers

    41:04 Understanding Model Behavior and Validation through Simulations

    43:09 Applying Bayesian Methods in Sports Analytics

    48:03 Clarifying Questions and Utilizing Frameworks

    53:41 Effective Communication of Statistical Concepts

    57:50 Integrating Domain Expertise with Statistical Models

    01:13:43 The Importance of Good Data

    01:18:11 The Future of Sports Analytics

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew...

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    - Use mini-batch methods to efficiently process large datasets within Bayesian frameworks in enterprise AI applications.

    - Apply approximate inference techniques, like stochastic gradient MCMC and Laplace approximation, to optimize Bayesian analysis in practical settings (the prototypical SGLD update is sketched after this list).

    - Explore thermodynamic computing to significantly speed up Bayesian computations, enhancing model efficiency and scalability.

    - Leverage the Posteriors Python package for flexible and integrated Bayesian analysis in modern machine learning workflows.

    - Overcome challenges in Bayesian inference by simplifying complex concepts for non-expert audiences, ensuring the practical application of statistical models.

    - Address the intricacies of model assumptions and communicate effectively to non-technical stakeholders to enhance decision-making processes.
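
    For context on the "stochastic gradient MCMC" mentioned above, the prototypical algorithm is stochastic gradient Langevin dynamics (SGLD); in the standard formulation (general background, not code from the Posteriors package), a minibatch B_t of size n drawn from a dataset of size N drives the update

        \theta_{t+1} = \theta_t + \frac{\epsilon_t}{2}\left(\nabla \log p(\theta_t) + \frac{N}{n}\sum_{i \in B_t} \nabla \log p(x_i \mid \theta_t)\right) + \eta_t, \qquad \eta_t \sim \mathcal{N}(0, \epsilon_t I)

    The injected noise \eta_t is what turns a stochastic gradient step into (approximate) posterior sampling, and this minibatch structure is what makes Bayesian analysis of the large datasets discussed here feasible.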

    Chapters:

    00:00 Introduction to Large-Scale Machine Learning

    11:26 Scalable and Flexible Bayesian Inference with Posteriors

    25:56 The Role of Temperature in Bayesian Models

    32:30 Stochastic Gradient MCMC for Large Datasets

    36:12 Introducing Posteriors: Bayesian Inference in Machine Learning

    41:22 Uncertainty Quantification and Improved Predictions

    52:05 Supporting New Algorithms and Arbitrary Likelihoods

    59:16 Thermodynamic Computing

    01:06:22 Decoupling Model Specification, Data Generation, and Inference

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work !

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    - Bayesian methods align better with researchers' intuitive understanding of research questions and provide more tools to evaluate and understand models.

    - Prior sensitivity analysis is crucial for understanding the robustness of findings to changes in priors and helps in contextualizing research findings (a minimal example follows this list).

    - Bayesian methods offer an elegant and efficient way to handle missing data in longitudinal studies, providing more flexibility and information for researchers.

    - Fit indices in Bayesian model selection are effective in detecting underfitting but may struggle to detect overfitting, highlighting the need for caution in model complexity.

    - Bayesian methods have the potential to revolutionize educational research by addressing the challenges of small samples, complex nesting structures, and longitudinal data.

    - Posterior predictive checks are valuable for model evaluation and selection.
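
    A minimal sketch of what a prior sensitivity check can look like in practice, using PyMC and simulated data purely for illustration (the episode itself is tool-agnostic): refit the same model under increasingly diffuse priors and compare the posterior summaries.

        import numpy as np
        import pymc as pm
        import arviz as az

        rng = np.random.default_rng(1)
        y = rng.normal(0.3, 1.0, size=15)    # small sample, where priors matter most

        for prior_sd in [0.5, 1.0, 5.0]:     # tighter to more diffuse priors on the mean
            with pm.Model():
                mu = pm.Normal("mu", 0.0, prior_sd)
                sigma = pm.HalfNormal("sigma", 1.0)
                pm.Normal("y_obs", mu, sigma, observed=y)
                idata = pm.sample(1000, tune=1000, random_seed=1, progressbar=False)
            print(f"prior sd = {prior_sd}")
            print(az.summary(idata, var_names=["mu"]))

    If the substantive conclusions move materially across these fits, that is worth reporting alongside the main results.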

    Chapters:

    00:00 The Power and Importance of Priors

    09:29 Updating Beliefs and Choosing Reasonable Priors

    16:08 Assessing Robustness with Prior Sensitivity Analysis

    34:53 Aligning Bayesian Methods with Researchers' Thinking

    37:10 Detecting Overfitting in SEM

    43:48 Evaluating Model Fit with Posterior Predictive Checks

    47:44 Teaching Bayesian Methods

    54:07 Future Developments in Bayesian Statistics

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi...

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    - Convincing non-stats stakeholders in sports analytics can be challenging, but building trust and confirming their prior beliefs can help in gaining acceptance.

    - Combining subjective beliefs with objective data in Bayesian analysis leads to more accurate forecasts.

    - The availability of massive data sets has revolutionized sports analytics, allowing for more complex and accurate models.

    - Sports analytics models should consider factors like rest, travel, and altitude to capture the full picture of team performance.

    - The impact of budget on team performance in American sports and the use of plus-minus models in basketball and American football are important considerations in sports analytics (a standard plus-minus formulation is sketched after this list).

    - The future of sports analytics lies in making analysis more accessible and digestible for everyday fans.

    - There is a need for more focus on estimating distributions and variance around estimates in sports analytics.

    - AI tools can empower analysts to do their own analysis and make better decisions, but it's important to ensure they understand the assumptions and structure of the data.

    - Measuring the value of certain positions, such as midfielders in soccer, is a challenging problem in sports analytics.

    - Game theory plays a significant role in sports strategies, and optimal strategies can change over time as the game evolves.
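
    The plus-minus models mentioned above are, at their core, a large regularized regression; this is the standard textbook formulation, not a description of any particular team's implementation. For each stint s, a stretch of play with a fixed set of players on the floor or field, the score differential per possession (or per 100 possessions) is regressed on player indicators,

        y_s = \sum_j \beta_j x_{sj} + \varepsilon_s, \qquad x_{sj} = \begin{cases} +1 & \text{player } j \text{ on for the home side} \\ -1 & \text{player } j \text{ on for the away side} \\ 0 & \text{otherwise} \end{cases}

    Each \beta_j is player j's adjusted plus-minus; a ridge penalty, or equivalently a normal prior on the \beta's in a Bayesian version, keeps estimates for low-minute players from exploding.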

    Chapters:

    00:00 Introduction and Overview

    09:27 The Power of Bayesian Analysis in Sports Modeling

    16:28 The Revolution of Massive Data Sets in Sports Analytics

    31:03 The Impact of Budget in Sports Analytics

    39:35 Introduction to Sports Analytics

    52:22 Plus-Minus Models in American Football

    01:04:11 The Future of Sports Analytics

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi...

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    In this episode, Marvin Schmitt introduces the concept of amortized Bayesian inference, where the upfront training phase of a neural network is followed by fast posterior inference.

    Marvin will guide us through this new concept, discussing his work in probabilistic machine learning and uncertainty quantification, using Bayesian inference with deep neural networks. 

    He also introduces BayesFlow, a Python library for amortized Bayesian workflows, and discusses its use cases in various fields, while also touching on the concept of deep fusion and its relation to multimodal simulation-based inference.
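
    In symbols, the amortized setup works roughly like this (standard notation for neural simulation-based inference, offered as background rather than lifted from BayesFlow's documentation): a conditional density estimator q_phi(theta | x), typically a normalizing flow, is trained on simulated pairs by minimizing

        \mathcal{L}(\phi) = \mathbb{E}_{p(\theta)\,p(x \mid \theta)}\big[-\log q_\phi(\theta \mid x)\big]

    All the expensive work happens in this training phase; once \phi is learned, the approximate posterior for any new dataset is just a forward pass q_\phi(\theta \mid x_\text{obs}), which is what makes the inference fast and amortized across datasets.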

    A PhD student in computer science at the University of Stuttgart, Marvin is supervised by two LBS guests you surely know — Paul Bürkner and Aki Vehtari. Marvin’s research combines deep learning and statistics, to make Bayesian inference fast and trustworthy. 

    In his free time, Marvin enjoys board games and is a passionate guitar player.

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary and Blake Walters.

    Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

    Takeaways:

    Amortized Bayesian inference...

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    If there is one guest I don’t need to introduce, it’s mister Andrew Gelman. So… I won’t! I will refer you back to his two previous appearances on the show though, because learning from Andrew is always a pleasure. So go ahead and listen to episodes 20 and 27.

    In this episode, Andrew and I discuss his new book, Active Statistics, which focuses on teaching and learning statistics through active student participation. Like this episode, the book is divided into three parts: 1) The ideas of statistics, regression, and causal inference; 2) The value of storytelling to make statistical concepts more relatable and interesting; 3) The importance of teaching statistics in an active learning environment, where students are engaged in problem-solving and discussion.

    And Andrew is so active and knowledgeable that we of course touched on a variety of other topics — but for that, you’ll have to listen ;)

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary and Blake Walters.

    Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

    Takeaways:

    - Active learning is essential for teaching and learning statistics.

    - Storytelling can make...

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    In this episode, Andy Aschwanden and Doug Brinkerhoff tell us about their work in glaciology and the application of Bayesian statistics in studying glaciers. They discuss the use of computer models and data analysis in understanding glacier behavior and predicting sea level rise, and a lot of other fascinating topics.

    Andy grew up in the Swiss Alps, and studied Earth Sciences, with a focus on atmospheric and climate science and glaciology. After his PhD, Andy moved to Fairbanks, Alaska, and became involved with the Parallel Ice Sheet Model, the first open-source and openly-developed ice sheet model.

    His first PhD student was no other than… Doug Brinkerhoff! Doug did an MS in computer science at the University of Montana, focusing on numerical methods for ice sheet modeling, and then moved to Fairbanks to complete his PhD. While in Fairbanks, he became an ardent Bayesian after “seeing that uncertainty needs to be embraced rather than ignored”. Doug has since moved back to Montana, becoming faculty in the University of Montana’s computer science department.

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero and Will Geary.

    Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    GPs are extremely powerful… but hard to handle. One of the bottlenecks is learning the appropriate kernel. What if you could learn the structure of GP kernels automatically? Sounds really cool, but also a bit futuristic, doesn’t it?

    Well, think again, because in this episode, Feras Saad will teach us how to do just that! Feras is an Assistant Professor in the Computer Science Department at Carnegie Mellon University. He received his PhD in Computer Science from MIT, and, most importantly for our conversation, he’s the creator of AutoGP.jl, a Julia package for automatic Gaussian process modeling.

    Feras discusses the implementation of AutoGP, how it scales, what you can do with it, and how you can integrate its outputs in your models.

    Finally, Feras provides an overview of Sequential Monte Carlo and its usefulness in AutoGP, highlighting the ability of SMC to incorporate new data in a streaming fashion and explore multiple modes efficiently.
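
    For readers unfamiliar with Sequential Monte Carlo, the streaming behavior described here can be pictured with the simplest data-tempered (IBIS-style) weight update, stated as general background rather than as AutoGP's exact scheme: given weighted particles {theta^(i), w^(i)} targeting p(theta | y_{1:t-1}), a new observation y_t is absorbed by

        w_t^{(i)} \propto w_{t-1}^{(i)}\, p\big(y_t \mid \theta^{(i)}, y_{1:t-1}\big)

    followed by resampling and rejuvenation moves when the weights degenerate; because each particle can carry a different hypothesis (here, a different kernel structure), the ensemble can keep several modes alive at once, which is the property highlighted above.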

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell and Gal Kampel.

    Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

    Takeaways:

    - AutoGP is a Julia package for automatic Gaussian process modeling that learns the

  • Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    My Intuitive Bayes Online Courses

    1:1 Mentorship with me

    Changing perspective is often a great way to solve burning research problems. Riemannian spaces are such a perspective change, as Arto Klami, an Associate Professor of computer science at the University of Helsinki and member of the Finnish Center for Artificial Intelligence, will tell us in this episode.

    He explains the concept of Riemannian spaces, their application in inference algorithms, how they can help with sampling Bayesian models, and their similarity with normalizing flows, which we discussed in episode 98.
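
    To give a flavor of what using a Riemannian space means for sampling, here is a simplified textbook form of manifold MALA, with the position-dependent curvature correction terms omitted: given a metric G(theta), for example the Fisher information or a Hessian-based metric, the proposal becomes

        \theta' \sim \mathcal{N}\!\left(\theta + \tfrac{\epsilon^2}{2}\, G(\theta)^{-1} \nabla \log \pi(\theta),\; \epsilon^2\, G(\theta)^{-1}\right)

    Preconditioning by G^{-1} adapts the step to the local geometry of the posterior, which is the kind of efficiency gain discussed in the episode.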

    Arto also introduces PreliZ, a tool for prior elicitation, and highlights its benefits in simplifying the process of setting priors, thus improving the accuracy of our models.

    When Arto is not solving mathematical equations, you’ll find him cycling, or around a good board game.

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.

    Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

    Takeaways:

    - Riemannian spaces offer a way to improve computational efficiency and accuracy in Bayesian inference by considering the curvature of the posterior distribution.

    - Riemannian spaces can be used in Laplace approximation and Markov chain Monte Carlo...