SEMINARS IN STATISTICS @ COLLEGIO CARLO ALBERTO
https://www.carloalberto.org/events/category/seminars/seminars-in-statistics/
On Friday 12 April 2024, at the Collegio Carlo Alberto, Piazza Arbarello 8, Torino, the following *2 seminars* will be held:
------------------------------------------------
*11.00-12.00*
Speaker: Varun Jog (University of Cambridge, UK)
Title: *The sample complexity of simple binary hypothesis testing*
Abstract: The sample complexity of simple binary hypothesis testing is the smallest number of i.i.d. samples required to distinguish between two distributions p and q such that the Type-I and Type-II errors are smaller than pre-specified thresholds α and β, respectively. Our main contribution is deriving, under mild technical conditions, a formula for the sample complexity in terms of the parameters p, q, α, and β that is tight up to universal multiplicative constants.
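As a hedged illustration of the quantity the talk studies (not the talk's formula), the sketch below Monte Carlo-estimates the Type-I and Type-II errors of the likelihood-ratio test between two assumed Gaussians p = N(0,1) and q = N(μ,1) for several sample sizes n; the sample complexity is the smallest n driving both errors below the targets α and β.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 0.5          # separation between the two hypotheses (assumed value)
trials = 20_000   # Monte Carlo repetitions per sample size

def error_rates(n):
    # For these Gaussians the log-likelihood ratio of n i.i.d. samples
    # reduces to mu*sum(x) - n*mu^2/2; threshold 0 = equal priors.
    x_p = rng.normal(0.0, 1.0, size=(trials, n))   # data drawn from p
    x_q = rng.normal(mu, 1.0, size=(trials, n))    # data drawn from q
    llr_p = mu * x_p.sum(axis=1) - n * mu**2 / 2
    llr_q = mu * x_q.sum(axis=1) - n * mu**2 / 2
    type1 = np.mean(llr_p > 0)    # declare q although data came from p
    type2 = np.mean(llr_q <= 0)   # declare p although data came from q
    return type1, type2

for n in (5, 20, 80):
    a, b = error_rates(n)
    print(f"n={n:3d}  Type-I ~ {a:.3f}  Type-II ~ {b:.3f}")
```

Both error estimates shrink as n grows, which is exactly the trade-off the sample-complexity formula quantifies.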
------------------------------------------------
*12.00-13.00*
Speaker: Po-Ling Loh (University of Cambridge, UK)
Title: *Differentially private M-estimation via noisy optimization*
Abstract: We present a noisy composite gradient descent algorithm for differentially private statistical estimation in high dimensions. We begin by providing general rates of convergence for the parameter error of successive iterates under assumptions of local restricted strong convexity and local restricted smoothness. Our analysis is local, in that it ensures a linear rate of convergence when the initial iterate lies within a constant-radius region of the true parameter. At each iterate, multivariate Gaussian noise is added to the gradient in order to guarantee that the output satisfies Gaussian differential privacy. We then derive consequences of our theory for linear regression and mean estimation. Motivated by M-estimators used in robust statistics, we study loss functions which downweight the contribution of individual data points in such a way that the sensitivity of function gradients is guaranteed to be bounded, even without the usual assumption that our data lie in a bounded domain. We prove that the objective functions thus obtained indeed satisfy the restricted convexity and restricted smoothness conditions required for our general theory. We then show how the private estimators obtained by noisy composite gradient descent may be used to obtain differentially private confidence intervals for regression coefficients, by leveraging work on Lasso debiasing from the high-dimensional statistics literature. We complement our theoretical results with simulations that illustrate the favorable finite-sample performance of our methods.
------------------------------------------------
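A hedged sketch of the noisy-gradient idea from the second abstract (not the speakers' exact algorithm): for mean estimation, a Huber-type loss has a per-sample gradient bounded by a constant c, so the sensitivity of the averaged gradient is controlled without assuming bounded data, and Gaussian noise calibrated to that bound can be added at each step. The step size, noise scale, and initialization below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def huber_grad(r, c=1.0):
    # Gradient of the Huber loss in the residual r; magnitude bounded by c.
    return np.clip(r, -c, c)

def noisy_gd_mean(x, steps=200, lr=0.5, c=1.0, sigma=0.1):
    # Noisy gradient descent for a private location estimate of x.
    n = len(x)
    theta = 0.0                                 # initial iterate (assumed
                                                # near the true parameter)
    for _ in range(steps):
        g = huber_grad(theta - x, c).mean()     # bounded average gradient
        noise = rng.normal(0.0, sigma * c / n)  # noise scaled to the
                                                # gradient's sensitivity
        theta -= lr * (g + noise)
    return theta

x = rng.normal(2.0, 1.0, size=500)
print(noisy_gd_mean(x))   # close to the true mean 2.0
```

The bounded gradient is what replaces the usual bounded-domain assumption: one data point can shift the averaged gradient by at most 2c/n, so the Gaussian noise level needed for a given privacy budget does not depend on the range of the data.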
Both seminars can also be followed via streaming: Join Zoom Meeting https://us02web.zoom.us/j/88987605028?pwd=YXN1akl1bkN3TEFlS3QrMWFDWEF5dz09
The seminars are organized by the "de Castro" Statistics Initiative
www.carloalberto.org/stats