
Principled Amortized Bayesian Inference with Deep Learning

Authors
Stefan Radev
Rensselaer Polytechnic Institute, Cognitive Science
Abstract

This workshop will provide an introduction to deep learning methods and architectures for efficient Bayesian inference with complex models. It will include a self-contained theoretical part and a practical part, focusing on the topics of posterior estimation, model comparison, likelihood estimation, and model misspecification. In the theoretical part, participants will learn about neural density estimation with normalizing flows, simulation-based optimization, embedding networks, sequential and amortized inference, as well as the rationale of principled Bayesian workflows. In the practical part, participants will apply existing software packages for neural Bayesian inference (e.g., BayesFlow, SBI) to build their own amortized Bayesian workflows. Participants are highly encouraged to "bring" their own models and ideas to the workshop.
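As a taste of the practical part, below is a minimal sketch of an amortized posterior-estimation workflow using the SBI package mentioned in the abstract. The toy Gaussian simulator, the prior bounds, and the sample sizes are illustrative assumptions for this sketch, not workshop materials; the API shown corresponds to the sbi Python package circa 2023.

import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform

# Illustrative uniform prior over two parameters (bounds are arbitrary).
prior = BoxUniform(low=-2 * torch.ones(2), high=2 * torch.ones(2))

# Toy simulator: Gaussian noise around the parameter vector.
def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)

# Simulate a training set of (parameter, data) pairs from the prior.
theta = prior.sample((2000,))
x = simulator(theta)

# Train a neural posterior estimator once -- the amortization step.
inference = SNPE(prior=prior)
density_estimator = inference.append_simulations(theta, x).train()
posterior = inference.build_posterior(density_estimator)

# Posterior sampling for any new observation is now a cheap forward pass.
x_obs = torch.tensor([0.3, -0.7])
samples = posterior.sample((1000,), x=x_obs)

The design point this illustrates is amortization: the expensive simulation and training happen up front, after which the same trained network yields posterior samples for arbitrarily many observed datasets without re-fitting.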

Keywords

deep learning
Bayesian inference
simulation-based inference
cognitive modeling
amortized inference

Cite this as:

Radev, S. T. (2023, July). Principled Amortized Bayesian Inference with Deep Learning. Abstract published at MathPsych/ICCM/EMPG 2023. Via mathpsych.org/presentation/1020.