Dear all,
The next OWABI seminar (https://www.warwick.ac.uk/owabi), the last of this season, will be a special one, featuring two selected talks live from BayesComp (https://bayescomp2025.sg/)!
The two talks are scheduled for Thursday 19th June at 1pm UK time (note the different day and time!) and will be given by
* Filippo Pagani (University of Warwick): Approximate Bayesian Fusion
* Maurizio Filippone (KAUST): GANs Secretly Perform Approximate Bayesian Model Selection
with abstracts reported below.
The talks will be streamed on Zoom at the link: https://monash.zoom.us/j/81050994376?pwd=aF52Ab2oUlNYPO251d4nDw0jXRemia.1
Meeting ID: 810 5099 4376
Passcode: 137607
1st OWABI Talk: 1-1.30pm UK time
Speaker: Filippo Pagani (https://filippopagani.github.io/), University of Warwick
Title: Approximate Bayesian Fusion
Abstract: Bayesian Fusion is a powerful approach that enables distributed inference while maintaining exactness. However, the approach is computationally expensive. In this work, we propose a novel method that incorporates numerical approximations to alleviate the most computationally expensive steps, thereby achieving substantial reductions in runtime. Our approach retains the flexibility to approximate the target posterior distribution to an arbitrary degree of accuracy, and is scalable with respect to both the size of the dataset and the number of computational cores. Our method offers a practical and efficient alternative for large-scale Bayesian inference in distributed environments.
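For readers new to the topic, a minimal sketch of the usual Bayesian Fusion target (a common formulation of the setting, which may differ in detail from the talk): with the data split across C cores and each core c holding a subposterior f_c, fusion methods aim to sample the product-form target

\[
f(x) \;\propto\; \prod_{c=1}^{C} f_c(x),
\]

where, in a typical construction, f_c combines the c-th data shard with the prior raised to the power 1/C, so that the product recovers the full posterior. Exact fusion samples f without approximating this product; the cost of doing so exactly is what approximate schemes such as the one in the talk aim to reduce.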
2nd OWABI Talk: 1.30-2pm UK time
Speaker: Maurizio Filippone (https://www.kaust.edu.sa/en/study/faculty/maurizio-filippone), KAUST
Title: GANs Secretly Perform Approximate Bayesian Model Selection
Abstract: Generative Adversarial Networks (GANs) are popular models achieving impressive performance in various generative modeling tasks. In this work, we aim to explain the undeniable success of GANs by interpreting them as probabilistic generative models. In this view, GANs transform a distribution over latent variables Z into a distribution over inputs X through a function parameterized by a neural network, which is usually referred to as the generator. This probabilistic interpretation enables us to cast the GAN adversarial-style optimization as a proxy for marginal likelihood optimization. More specifically, it is possible to show that marginal likelihood maximization with respect to model parameters is equivalent to the minimization of the Kullback-Leibler (KL) divergence between the true data generating distribution and the one modeled by the GAN. By replacing the KL divergence with other divergences and integral probability metrics, we obtain popular variants of GANs such as f-GANs, Wasserstein-GANs, and Maximum Mean Discrepancy (MMD)-GANs. This connection has profound implications because of the desirable properties associated with marginal likelihood optimization, such as (i) lack of overfitting, which explains the success of GANs, and (ii) allowing for model selection, which opens up the possibility of obtaining parsimonious generators through architecture search.
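For context, the identity behind the marginal-likelihood interpretation can be sketched as follows (a standard decomposition; the notation, with p_theta denoting the density of the generator's push-forward distribution, is ours and not the speaker's):

\[
\mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[\log p_{\theta}(x)\right]
\;=\; -\,\mathrm{KL}\!\left(p_{\mathrm{data}} \,\middle\|\, p_{\theta}\right)
\;-\; H\!\left(p_{\mathrm{data}}\right),
\]

so maximizing the expected log marginal likelihood over the generator parameters is equivalent to minimizing the KL divergence between the data distribution and the model, since the data entropy term does not depend on the parameters; swapping the KL for other divergences or integral probability metrics yields the GAN variants mentioned in the abstract.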
We look forward to seeing you at the next OWABI seminar(s)!
Best,
Massimiliano, on behalf of the OWABI Organisers
------
Dr. Massimiliano Tamborrino
Reader (Associate Professor) and WIHEA Fellow
Department of Statistics, University of Warwick
https://warwick.ac.uk/tamborrino