Dear All,


The final OWABI seminar (www.warwick.ac.uk/owabi) of the season will feature two selected talks and will take place live from BayesComp 2025 (https://bayescomp2025.sg/)! The talks will take place on Thursday 19th June at 1pm UK time (note the different date and time!) and will be given by

Filippo Pagani (University of Warwick)
Maurizio Filippone (KAUST)

with abstracts reported below.


The talks will be live-streamed via the following link:

https://monash.zoom.us/j/81050994376?pwd=aF52Ab2oUlNYPO251d4nDw0jXRemia.1

Meeting ID: 810 5099 4376  

Passcode: 137607


The session will run in a hybrid format, taking place live from BayesComp 2025 (8pm-9pm Singapore time) in room LT50 at the conference venue.


We look forward to seeing you online or in person!


Best,
Massimiliano, on behalf of the OWABI Organisers

1st OWABI Talk: 1-1.30pm UK time
Speaker: Filippo Pagani (University of Warwick)
Title: Approximate Bayesian Fusion
Abstract: Bayesian Fusion is a powerful approach that enables distributed inference while maintaining exactness. However, the approach is computationally expensive. In this work, we propose a novel method that incorporates numerical approximations to alleviate the most computationally expensive steps, thereby achieving substantial reductions in runtime. Our approach retains the flexibility to approximate the target posterior distribution to an arbitrary degree of accuracy, and is scalable with respect to both the size of the dataset and the number of computational cores. Our method offers a practical and efficient alternative for large-scale Bayesian inference in distributed environments.
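As background (an illustrative sketch, not part of the speaker's abstract): in the standard Bayesian Fusion setting the full posterior is assumed to factorise across C cores as a product of sub-posteriors,

f(x) \propto \prod_{c=1}^{C} f_c(x),

where f_c denotes the posterior obtained from the data shard held on core c, and the fusion step combines samples drawn from the individual f_c into samples from the full posterior f.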

2nd OWABI Talk: 1.30-2pm UK time
Speaker: Maurizio Filippone (KAUST)
Title: GANs Secretly Perform Approximate Bayesian Model Selection
Abstract: Generative Adversarial Networks (GANs) are popular models achieving impressive performance in various generative modeling tasks. In this work, we aim to explain the undeniable success of GANs by interpreting them as probabilistic generative models. In this view, GANs transform a distribution over latent variables Z into a distribution over inputs X through a function parameterized by a neural network, which is usually referred to as the generator. This probabilistic interpretation enables us to cast the GAN adversarial-style optimization as a proxy for marginal likelihood optimization. More specifically, it is possible to show that marginal likelihood maximization with respect to model parameters is equivalent to the minimization of the Kullback-Leibler (KL) divergence between the true data-generating distribution and the one modeled by the GAN. By replacing the KL divergence with other divergences and integral probability metrics, we obtain popular variants of GANs such as f-GANs, Wasserstein-GANs, and Maximum Mean Discrepancy (MMD)-GANs. This connection has profound implications because of the desirable properties associated with marginal likelihood optimization, such as (i) lack of overfitting, which explains the success of GANs, and (ii) allowing for model selection, which opens up the possibility of obtaining parsimonious generators through architecture search.
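A brief note on the equivalence mentioned in the abstract (a standard identity, stated here under the assumption that the generator induces a density p_\theta(x) over inputs): writing p_{\mathrm{data}} for the true data-generating distribution,

\mathrm{KL}(p_{\mathrm{data}} \,\|\, p_\theta) = \mathbb{E}_{x \sim p_{\mathrm{data}}}[\log p_{\mathrm{data}}(x)] - \mathbb{E}_{x \sim p_{\mathrm{data}}}[\log p_\theta(x)],

so maximising the expected marginal log-likelihood \mathbb{E}_{x \sim p_{\mathrm{data}}}[\log p_\theta(x)] with respect to the model parameters \theta is equivalent to minimising the KL divergence, the first term being constant in \theta.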
 

------
Dr. Massimiliano Tamborrino
Reader (Associate Professor) and WIHEA Fellow
Department of Statistics
University of Warwick
https://warwick.ac.uk/tamborrino