Andrew Rowan - Bayesian Deep Learning with Edward (and a trick using Dropout)

33,759 views

PyData


Filmed at PyData London 2017
Description
Bayesian neural networks have seen a resurgence of interest as a way of generating model uncertainty estimates. I use Edward, a new probabilistic programming framework built on Python and TensorFlow, for inference on deep neural networks across several benchmark data sets. This is compared with dropout training, which has recently been shown to be formally equivalent to approximate Bayesian inference.
Abstract
Deep learning methods represent the state-of-the-art for many applications such as speech recognition, computer vision and natural language processing. Conventional approaches generate point estimates of deep neural network weights and hence make predictions that can be overconfident since they do not account well for uncertainty in model parameters. However, having some means of quantifying the uncertainty of our predictions is often a critical requirement in fields such as medicine, engineering and finance. One natural response is to consider Bayesian methods, which offer a principled way of estimating predictive uncertainty while also showing robustness to overfitting.
Bayesian neural networks have a long history. Exact Bayesian inference on network weights is generally intractable and much work in the 1990s focused on variational and Monte Carlo based approximations [1-3]. However, these suffered from a lack of scalability for modern applications. Recently the field has seen a resurgence of interest, with the aim of constructing practical, scalable techniques for approximate Bayesian inference on more complex models, deep architectures and larger data sets [4-10].
Edward is a new, Turing-complete probabilistic programming language built on Python [11]. Probabilistic programming frameworks typically face a trade-off between the range of models that can be expressed and the efficiency of inference engines. Edward can leverage graph frameworks such as TensorFlow to enable fast distributed training, parallelism, vectorisation, and GPU support, while also allowing composition of both models and inference methods for a greater degree of flexibility.
In this talk I will give a brief overview of developments in Bayesian deep learning and demonstrate results of Bayesian inference on deep architectures implemented in Edward for a range of publicly available data sets. Dropout is an empirical technique which has been very successfully applied to reduce overfitting in deep learning models [12]. Recent work by Gal and Ghahramani [13] has demonstrated a surprising formal equivalence between dropout and approximate Bayesian inference in neural networks. I will compare some results of inference via the machinery of Edward with model averaging over neural nets with dropout training.
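The Gal and Ghahramani result mentioned above means that a network trained with dropout can yield uncertainty estimates almost for free: keep dropout active at test time and average over stochastic forward passes ("MC dropout"). The talk demonstrates this in Edward/TensorFlow; below is only a minimal NumPy sketch of the idea, with a hypothetical one-hidden-layer network and illustrative weight names (`W1`, `W2`), not code from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_dropout_predict(x, W1, W2, p=0.5, n_samples=100):
    """MC dropout sketch: sample stochastic forward passes with dropout
    left on at test time, then summarise the predictive distribution."""
    preds = []
    for _ in range(n_samples):
        h = np.maximum(x @ W1, 0.0)        # ReLU hidden layer
        mask = rng.random(h.shape) > p     # Bernoulli dropout mask
        h = h * mask / (1.0 - p)           # inverted-dropout rescaling
        preds.append(h @ W2)
    preds = np.stack(preds)
    # mean prediction and a per-point (epistemic) uncertainty estimate
    return preds.mean(axis=0), preds.std(axis=0)

# toy point-estimate weights: 3 inputs, 8 hidden units, 1 output
W1 = rng.standard_normal((3, 8))
W2 = rng.standard_normal((8, 1))
x = rng.standard_normal((5, 3))

mean, std = mc_dropout_predict(x, W1, W2)
print(mean.shape, std.shape)  # (5, 1) (5, 1)
```

The standard deviation across passes serves as the model-uncertainty signal that a single deterministic forward pass cannot provide.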
www.pydata.org
PyData is an educational program of NumFOCUS, a 501(c)3 non-profit organization in the United States. PyData provides a forum for the international community of users and developers of data analysis tools to share ideas and learn from each other. The global PyData network promotes discussion of best practices, new approaches, and emerging technologies for data management, processing, analytics, and visualization. PyData communities approach data science using many languages, including (but not limited to) Python, Julia, and R.
We aim to be an accessible, community-driven conference, with novice to advanced level presentations. PyData tutorials and talks bring attendees the latest project features along with cutting-edge use cases.
00:00 Welcome!

Comments: 3
@codetowin 2 years ago
Thanks for Posting such great content!
@codetowin 2 years ago
00:17 Roadmap
00:56 Deep Learning
02:19 Drawbacks of Standard Deep Learning
03:13 Probabilistic Machine Learning
04:17 Bayesian Neural Networks
07:28 History of Bayesian Neural Networks
08:15 Modern Revival: Bayesian Deep Learning
08:47 Probabilistic Programming with Edward
11:14 Edward
14:53 Inference in Edward
15:54 Variational Inference
18:24 Black Box Variational Inference with Edward
18:50 Dropout as a Bayesian Approximation
22:34 MC Dropout Experiments
22:54 Experiments
27:20 Model Specific versus Black Box
29:29 Current Research in Variational Inference
30:27 QnA
@codetowin 2 years ago
I am giving the timestamps as a tribute.