My Intuitive Bayes Online Courses: www.intuitivebayes.com/
1:1 Mentorship with me: topmate.io/alex_andorra
This is part one of our series on HSGP, focusing on the mathematical foundations of the method. The two other parts will focus on Practical Tips & Tricks, and finally a Full Bayesian Workflow with examples. Subscribe to the channel to know when they are released!
📚 In this webinar, dive into the cutting-edge world of data analysis and modeling as we demystify the revolutionary Hilbert Space Gaussian Process (HSGP) approximation.
This technique is your key to leveraging Gaussian processes at an unprecedented scale, transforming complex data into actionable insights.
📈 Takeaways
- Gaussian Processes are powerful models that can be used in various applications, such as popularity modeling and time-varying coefficients in linear regression.
- The Hilbert Space GP (HSGP) approximation is a method to approximate the large kernel matrix in GPs, making its inversion computationally feasible.
- Setting priors on the kernel parameters and using the conditional method are key steps in implementing the HSGP approximation.
- Out-of-sample predictions with GPs can be challenging, especially when extrapolating beyond the range of the training data.
- HSGP allows for more efficient and scalable modeling with GPs, making them applicable to larger datasets.
- HSGPs use eigenvalues and eigenvectors to approximate functions and make predictions.
- HSGPs have limitations in higher dimensions and require careful selection of parameters.
- HSGPs offer more interpretability and flexibility compared to spline models.
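To make the takeaways above concrete, here is a minimal NumPy sketch of the HSGP idea in one dimension: approximate a squared-exponential kernel matrix with a truncated basis of Laplacian eigenfunctions, weighted by the kernel's spectral density. This is an illustrative sketch with our own function names and parameter values, not code from the webinar or its notebook.

```python
import numpy as np

# Eigenfunctions of the Laplacian on [-L, L] with Dirichlet boundaries:
#   phi_j(x) = sqrt(1/L) * sin(pi * j * (x + L) / (2L)),
# with square-root eigenvalues sqrt(lambda_j) = pi * j / (2L).

def spectral_density(omega, ell=0.3, sigma=1.0):
    """Spectral density of the squared-exponential kernel in 1D."""
    return sigma**2 * ell * np.sqrt(2 * np.pi) * np.exp(-0.5 * (ell * omega) ** 2)

def hsgp_kernel(x, m=50, L=1.5, ell=0.3, sigma=1.0):
    """Approximate the SE kernel matrix with m basis functions."""
    j = np.arange(1, m + 1)
    sqrt_lam = np.pi * j / (2 * L)                               # sqrt of eigenvalues
    phi = np.sqrt(1 / L) * np.sin(sqrt_lam * (x[:, None] + L))   # (n, m) basis matrix
    # Kernel approximation: Phi @ diag(S(sqrt(lambda))) @ Phi.T
    return phi @ np.diag(spectral_density(sqrt_lam, ell, sigma)) @ phi.T

x = np.linspace(-1, 1, 20)
K_exact = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.3**2)
K_approx = hsgp_kernel(x)
print(np.max(np.abs(K_exact - K_approx)))  # small approximation error
```

Note the design point the webinar emphasizes: the basis matrix `phi` depends only on the domain half-width `L`, not on the kernel hyperparameters, so changing the lengthscale only changes the cheap diagonal of spectral-density weights.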
🔍 What You Will Learn:
- Foundational Concepts Simplified: Begin your journey with an accessible introduction to Gaussian processes and the principles of spectral analysis methods. We break down complex ideas into understandable concepts, ensuring you grasp the fundamentals of HSGP approximation.
- Real-World Application: Through the lens of the seminal Birthdays dataset, witness the power of HSGP in action. We'll guide you step-by-step, demonstrating how this method can be applied to tangible data challenges, making the abstract concrete.
- Innovative Techniques Explained: At the heart of HSGP lies the spectral decomposition of the Laplacian: its eigenvalues and eigenvectors. Learn how a truncated linear combination of eigenvectors can approximate the kernel of a Gaussian process with high efficiency. Discover how the selection of coefficients, determined by the spectral density of the Gaussian process kernel, unlocks faster computations without sacrificing precision.
- Speed and Efficiency: Uncover the game-changing observation that the eigenvectors used in HSGP approximation remain constant across different hyperparameters of the Gaussian process kernel. This insight is crucial for accelerating computational processes, enabling you to handle larger datasets more efficiently than ever before.
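The two points above can be summarized in one formula. In the standard Hilbert-space formulation (the notation below is conventional for this method, not quoted from the webinar), the GP prior is approximated as a finite basis expansion:

```latex
f(x) \approx \sum_{j=1}^{m} \sqrt{S_\theta\!\left(\sqrt{\lambda_j}\right)}\, \beta_j\, \phi_j(x),
\qquad \beta_j \sim \mathcal{N}(0, 1),
```

where the eigenfunctions \(\phi_j\) and eigenvalues \(\lambda_j\) of the Laplacian depend only on the chosen domain, while the kernel hyperparameters \(\theta\) enter only through the spectral density \(S_\theta\). This is exactly why the basis never needs to be recomputed during inference: each new draw of \(\theta\) just rescales the \(m\) weights.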
🌟 Why Attend?
Don't miss this unique opportunity to enhance your machine learning and modeling expertise. Gain a comprehensive understanding of Gaussian Processes through the innovative lens of HSGP approximation.
Whether you're a data science enthusiast, a seasoned analyst, or a professional looking to refine your skill set, this webinar will equip you with the knowledge and tools to elevate your analytical capabilities.
🎙️ Our guest speaker, Juan Orduz, is a mathematician (Ph.D., Humboldt Universität Berlin) and data scientist. He is interested in interdisciplinary applications of mathematical methods, particularly time series analysis, Bayesian methods, and causal inference.
🎁 If you're a Patron of the Learning Bayesian Statistics podcast, you can submit questions in advance, enjoy early access to all webinar recordings, and get at least a 50% discount on future webinars ( / learnbayesstats ).
References:
Slides: juanitorduz.github.io/html/hs...
Notebook: github.com/juanitorduz/websit...
Previous webinars: • Modeling Webinars
HSGP Reference & First Steps: www.pymc.io/projects/examples...
GPs with Aki Vehtari: learnbayesstats.com/episode/m...
Automated GPs: learnbayesstats.com/episode/1...
Juan's website: juanitorduz.github.io/
Alex on Twitter: / alex_andorra
Alex on LinkedIn: / alex-andorra
Chapters:
00:00 Overview
12:31 Understanding GPs and Kernels
38:18 Understanding the Covariance Function
01:19:46 Implementing HSGPs in PyMC
01:45:38 HSGPs vs Splines