Dear all,
As usual, we will soon start with the NumPi seminars for the second
semester. As in the first semester, they will be online.
In order to minimize the overlap with teaching activities and other
commitments, we have prepared a Doodle where (if you wish to attend the
seminars) you can choose your preferred time slot(s).
Please note that the Doodle is for this week, but it is intended for a
"generic week" of this semester. The first seminar will likely be at
the beginning of March.
Doodle link: https://doodle.com/poll/a4fpmfwat7q62rnu
P.S.: In case you wish to propose some speakers (or yourself as a
speaker), feel free to drop us an e-mail.
Best wishes, -- Fabio Durastante and Leonardo Robol.
Dear all,
You are all invited to this week's NOMADS seminar at GSSI.
The seminar will be given on Wednesday February 17 at 17:00 (CET) by Michael
Schaub from RWTH Aachen University (Germany).
The talk will focus on a method for learning the structure of a
network from only a few observations of a diffusive process on the unknown graph.
Title and abstract are below.
To attend the seminar please use the following link:
https://us02web.zoom.us/j/87171939595
Further info about past and future meetings is available at the webpage:
https://num-gssi.github.io/seminar/
Hope to see you all on Wednesday! And, please feel free to distribute
this announcement as you see fit.
Francesco and Nicola
--------
Title: Learning from signals on graphs with unobserved edges
In many applications we are confronted with the following system
identification scenario: we observe a dynamical process that describes
the state of a system at particular times. Based on these observations
we want to infer the (dynamical) interactions between the entities we
observe. In the context of a distributed system, this typically
corresponds to a "network identification" task: find the (weighted)
edges of the graph of interconnections. However, the number of
samples we can obtain from such a process is often far too small to identify
the edges of the network exactly. Can we still reliably infer some
aspects of the underlying system?
Motivated by this question we consider the following identification
problem: instead of trying to infer the exact network, we aim to recover
a (low-dimensional) statistical model of the network based on the
observed signals on the nodes. More concretely, here we focus on
observations that consist of snapshots of a diffusive process that
evolves over the unknown network. We model the (unobserved) network as
generated from an independent draw from a latent stochastic blockmodel
(SBM), and our goal is to infer both the partition of the nodes into
blocks, as well as the parameters of this SBM. We present simple
spectral algorithms that provably solve the partition and parameter
inference problems with high accuracy.
We further discuss some possible variations and extensions of this
problem setup.
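For readers curious about the flavor of this setup, here is a small self-contained sketch (our own illustration, not the speaker's algorithm; all parameters are hypothetical): it draws a two-block SBM, simulates a handful of diffusion snapshots, and recovers the partition from the second dominant eigenvector of the snapshot covariance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-block stochastic blockmodel (hypothetical parameters for illustration).
n = 60
labels = np.repeat([0, 1], n // 2)
p_in, p_out = 0.5, 0.05                     # within- / between-block edge probabilities
P = np.where(labels[:, None] == labels[None, :], p_in, p_out)
A = np.triu((rng.random((n, n)) < P).astype(float), k=1)
A = A + A.T                                 # symmetric adjacency, no self-loops
Lap = np.diag(A.sum(axis=1)) - A            # graph Laplacian

# A few snapshots of a heat diffusion x(t) = exp(-t * Lap) x(0), random x(0).
lam, Q = np.linalg.eigh(Lap)
H = Q @ np.diag(np.exp(-0.5 * lam)) @ Q.T   # diffusion operator at time t = 0.5
X = H @ rng.standard_normal((n, 10))        # 10 snapshots, far fewer than n

# Spectral recovery: the 2nd dominant eigenvector of the snapshot covariance
# approximates the Fiedler vector, whose sign pattern splits the two blocks.
C = X @ X.T / X.shape[1]
_, V = np.linalg.eigh(C)                    # eigenvalues in ascending order
guess = (V[:, -2] > 0).astype(int)
accuracy = max(np.mean(guess == labels), np.mean(guess != labels))
print(f"partition recovered with accuracy {accuracy:.2f}")
```

Diffusion strongly damps the high-frequency Laplacian directions, so even a few snapshots concentrate on the community structure; the block parameters could then be estimated from edge densities within and across the recovered blocks.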
—
Francesco Tudisco
Assistant Professor
School of Mathematics
GSSI Gran Sasso Science Institute
Web: https://ftudisco.gitlab.io
--
You received this message because you are subscribed to the Google Groups "nomads-list" group.
To unsubscribe from this group and stop receiving emails from it, send an email to nomads-list+unsubscribe(a)gssi.it.
To view this discussion on the web visit https://groups.google.com/a/gssi.it/d/msgid/nomads-list/2f2274b3-9a34-5b45-….
For more options, visit https://groups.google.com/a/gssi.it/d/optout.
Good morning everyone,
This is just a gentle reminder about today's seminar "Large-scale
regression with non-convex loss and penalty" by Lothar Reichel (Kent
State University, USA). Abstract below.
The seminar is at 17:00 (CET). To attend please use the zoom link:
https://us02web.zoom.us/j/89724684523
Please feel free to distribute this announcement as you see fit.
Hope to see you there!
Francesco and Nicola
----------
Title:
Large-scale regression with non-convex loss and penalty
Description:
We consider non-convex optimization with applications to image restoration and
to regression problems for which a sparse solution is desired.
----------
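The description above is terse; as a generic illustration of sparse regression with a non-convex penalty (an iteratively reweighted least-squares sketch for an l_p penalty with 0 < p < 1, not necessarily the speaker's method; all parameters are made up):

```python
import numpy as np

rng = np.random.default_rng(2)

def irls_lp(A, b, lam=1e-4, p=0.5, iters=100):
    """Iteratively reweighted least squares for min ||Ax-b||^2 + lam*sum|x_i|^p,
    with 0 < p < 1 (non-convex). Generic sketch, not the method of the talk."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # start from min-norm LS solution
    eps = 1.0
    for _ in range(iters):
        w = (x**2 + eps) ** (p / 2 - 1)       # smoothed |x_i|^(p-2) weights
        x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ b)
        eps = max(eps * 0.7, 1e-8)            # gradually sharpen the smoothing
    return x

# Sparse ground truth recovered from few (noiseless) measurements.
n, d, k = 40, 100, 4
A = rng.standard_normal((n, d))
x_true = np.zeros(d)
x_true[rng.choice(d, size=k, replace=False)] = 3.0 * rng.standard_normal(k)
b = A @ x_true
x_hat = irls_lp(A, b)
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative recovery error: {rel_err:.2e}")
```

The reweighting turns the non-convex penalty into a sequence of linear solves; small coefficients receive ever-larger weights and are driven to zero, which is what promotes sparsity.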
Dear all,
You are all invited to this week's NOMADS seminar at GSSI.
The seminar will be given on *Thursday* (not Wednesday as usual)
*February 4 at 17:00 (CET)* by *Lothar Reichel* from Kent State
University (USA).
---
Title:
Large-scale regression with non-convex loss and penalty
Description:
We consider non-convex optimization with applications to image restoration and
to regression problems for which a sparse solution is desired.
---
To attend the seminar please use the following link:
https://us02web.zoom.us/j/89724684523
Further info about past and future meetings is available at the webpage:
https://num-gssi.github.io/seminar/
Please feel free to distribute this announcement as you see fit.
Hope to see you all on Thursday!
Francesco and Nicola
I am forwarding this announcement for potentially interested readers who are not GNCS members.
Have a good weekend,
-federico
-------- Forwarded Message --------
Subject: Summer School plus Conference on “Mathematics for Nonstationary Signals and applications in Geophysics and other fields” - L'Aquila (Italy) and online, July 2021
Date: Sat, 30 Jan 2021 10:29:15 +0100
From: Ruggiero Valeria <valeria.ruggiero(a)unife.it>
To: gncs-aderenti(a)altamatematica.it
Dear Colleagues,
We kindly inform you that a Summer School plus Conference on
“Mathematics for Nonstationary Signals and applications in Geophysics and other fields”
will take place at the Università degli Studi dell'Aquila, L'Aquila, Italy, and online on July 19-24, 2021.
The event will be hybrid, giving everyone the opportunity to join either in person or virtually.
During the Summer School, young researchers and PhD students will have the chance to learn and deepen
their knowledge of the Mathematics of Signal Processing, in particular new data analysis tools and techniques
for non-stationary time series and their theoretical foundations.
The summer school will take place during the first four days and will consist of three courses of 8 hours each.
Confirmed Lecturers:
Patrick Flandrin - ENS Lyon
Yang Wang - HKUST
Hau-tieng Wu - Duke University
At the end of the school there will be a two-and-a-half-day Conference and Poster Session during which
the speakers will present both applications of these techniques to real-life data
and the current frontiers of theoretical research.
Some slots for contributed talks and posters are still available.
Contributed talks will be 30 minutes long (25+5 for questions).
Submission deadline is April 30, 2021.
Applications for prospective students of the Summer School,
as well as for speakers of the conference and poster session, are now open.
Financial support is available for a limited number of participants.
For more information and to apply please visit http://www.cicone.com/NoSAG21.html
Best regards,
The local organizing committee
Antonio Cicone - DISIM - Università degli Studi dell'Aquila - L'Aquila
Giulia D'Angelo - INAF - Istituto di Astrofisica e Planetologia Spaziali - Roma
Enza Pellegrino - DIIIE - Università degli Studi dell'Aquila - L'Aquila
Mirko Piersanti - INFN - Universita di Roma "Tor Vergata" - Roma
Angela Stallone - INGV - Istituto Nazionale di Geofisica e Vulcanologia - Roma
Dear all,
The next GSSI Math Colloquium will be held on Thursday January 28
at 3pm (Italian time).
The speaker is Anders Hansen
(http://www.damtp.cam.ac.uk/research/afha/anders/), with a lecture
connecting computational mathematics with deep learning and AI. More
details below.
Anders Hansen is Associate Professor at the University of Cambridge, where
he leads the Applied Functional and Harmonic Analysis group, and Full
Professor of Mathematics at the University of Oslo.
To attend the talk please use the following Zoom link:
https://us02web.zoom.us/j/84038062394
Please feel free to distribute this announcement as you see fit.
Looking forward to seeing you all on Thursday!
Paolo Antonelli, Stefano Marchesani, Francesco Tudisco and Francesco Viola
---------------------
Title: On the foundations of computational mathematics, Smale's 18th
problem and the potential limits of AI
Abstract:
There is a profound optimism on the impact of deep learning (DL) and AI
in the sciences with Geoffrey Hinton concluding that 'They should stop
training radiologists now'. However, DL has an Achilles heel: it is
universally unstable so that small changes in the initial data can lead
to large errors in the final result. This has been documented in a wide
variety of applications. Paradoxically, the existence of stable neural
networks for these applications is guaranteed by the celebrated
Universal Approximation Theorem; however, stable neural networks are
never computed by current training approaches. We will address this
problem and the potential limitations of AI from a foundations point of
view. Indeed, the current situation in AI is comparable to the situation
in mathematics in the early 20th century, when David Hilbert’s optimism
(typically reflected in his 10th problem) suggested no limitations to
what mathematics could prove and no restrictions on what computers could
compute. Hilbert’s optimism was turned upside down by Goedel and Turing,
who established limitations on what mathematics can prove and which
problems computers can solve (however, without limiting the impact of
mathematics and computer science).
We predict a similar outcome for modern AI and DL, where the
limitations of AI (the main topic of Smale’s 18th problem) will be
established through the foundations of computational mathematics. We
sketch the beginning of such a program by demonstrating that there exist
neural networks approximating classical mappings in scientific
computing which, however, no algorithm (even a randomised one) can compute
to even 1-digit accuracy (with probability better than 1/2). We
will also show how instability is inherent in the methodology of DL,
demonstrating that there is no easy remedy, given the current
methodology. Finally, we will demonstrate basic examples in inverse
problems where there exist (untrained) neural networks that can easily
compute a solution to the problem, yet current DL techniques
will need 10^80 data points in the training set to get even a 1% success rate.
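As a toy illustration of the instability phenomenon mentioned in the abstract (our own two-dimensional example, not from the talk), a linear "classifier" with a large-norm weight vector flips its decision under a perturbation of norm 0.02:

```python
import numpy as np

# Toy 2D "classifier": the sign of w @ x. Large weights make it unstable.
w = np.array([50.0, -50.0])
classify = lambda x: int(w @ x > 0)

x = np.array([0.51, 0.50])              # classified as 1, since w @ x = 0.5 > 0
delta = 0.02 * w / np.linalg.norm(w)    # tiny perturbation aligned with w
x_perturbed = x - delta                 # visually indistinguishable from x

print(classify(x), classify(x_perturbed))  # the label flips: 1 -> 0
```

The perturbation shifts w @ x by 0.02 * ||w|| = 1.41, far more than the 0.5 margin; this is the same mechanism, in miniature, by which small input changes produce large output errors in deep networks.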
Good morning,
I am forwarding this announcement, which may be of interest to some
subscribers of this list.
**Postdoc Position, Krylov Methods, Charles Univ, Czech Rep**
A postdoc position is available within the framework of the Primus
Research Programme "A Lanczos-like Method for the Time-Ordered
Exponential" at the Faculty of Mathematics and Physics, Charles
University, Prague.
The appointment period is one year, with the possibility of
extension. The postdoc will start before the end of 2021. The start
date is negotiable.
We are looking for candidates with a strong background in numerical
linear algebra. In particular, we seek applicants with expertise in
matrix function approximation and Krylov subspace methods. The
applicant must hold a Ph.D. degree by the start date.
Application deadline: March 15, 2021.
More information and application instructions:
https://www.starlanczos.cz/open-positions
<https://www.starlanczos.cz/open-positions>
--
--federico poloni
Dipartimento di Informatica, Università di Pisa
https://www.di.unipi.it/~fpoloni/ tel:+39-050-2213143
Dear all,
on January 19 at 3 pm, Dario Bini will give a talk on "Solving Structured
Matrix Equations Encountered in the Analysis of Stochastic Processes".
The talk is part of the NEPA seminar series, and many more talks will
take place in the next weeks. Participation is free, but registration
is required to obtain the Zoom link [1].
[1] https://sites.google.com/unisa.it/nepaseminars
Best wishes, -- Leonardo.
Good morning everyone,
This is just a gentle reminder about today's seminar "Numerical
integrators for dynamical low-rank approximation" by Gianluca Ceruti
(Uni Tuebingen). Abstract below.
The seminar is at 17:00 (CET). To attend please use the zoom link:
https://us02web.zoom.us/j/82131676880
Hope to see you there!
Francesco and Nicola
-----------
Gianluca Ceruti <https://na.uni-tuebingen.de/~ceruti/> - University of
Tuebingen
Numerical integrators for dynamical low-rank approximation
Discretization of time-dependent high-dimensional PDEs suffers from an
undesired effect, known as the curse of dimensionality: the amount of data
to be stored and processed grows exponentially and exceeds the
capacity of common computational devices.
In this setting, time-dependent model order reduction techniques are
desirable.
In the present seminar, together with efficient numerical integrators,
we present a recently developed approach: dynamical low-rank approximation.
Dynamical low-rank approximation for matrices will be presented first,
and a numerical integrator with two remarkable properties will be
introduced: the matrix projector splitting integrator.
Based upon this numerical integrator, we will construct two equivalent
extensions to tensors (multi-dimensional arrays) in Tucker format, a
higher-order generalization of the matrix SVD. These
extensions are proven to preserve the excellent qualities of the matrix
integrator.
To conclude, via a novel compact formulation of the Tucker integrator,
we will further extend the matrix and Tucker projector splitting
integrators to the most general class of Tree Tensor Networks. Important
examples belonging to this class and of interest for applications
include, but are not restricted to, Tensor Trains.
This seminar is based upon a joint work with Ch. Lubich and H. Walach.
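For the curious reader, the matrix projector-splitting integrator mentioned in the abstract admits a compact three-substep (K, S, L) implementation. The sketch below (our own reconstruction, on a hypothetical rank-3 test problem) illustrates one of its remarkable properties, exactness: a solution that stays at rank r is reproduced up to round-off.

```python
import numpy as np

rng = np.random.default_rng(1)

def ksl_step(U, S, V, dA):
    """One step of the (first-order) matrix projector-splitting integrator,
    advancing the rank-r factorization Y = U S V^T by the increment dA."""
    K = U @ S + dA @ V                  # K-step: update the range
    U1, S_hat = np.linalg.qr(K)
    S_tilde = S_hat - U1.T @ dA @ V     # S-step: the backward core substep
    L = V @ S_tilde.T + dA.T @ U1       # L-step: update the co-range
    V1, S1T = np.linalg.qr(L)
    return U1, S1T.T, V1

# Hypothetical test problem: A(t) = exp(t) * W with W of exact rank 3.
m, n, r = 40, 30, 3
W = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
A = lambda t: np.exp(t) * W

# Initial rank-r factorization of Y(0) = A(0) via a truncated SVD.
U, s, Vt = np.linalg.svd(A(0.0), full_matrices=False)
U, S, V = U[:, :r], np.diag(s[:r]), Vt[:r, :].T

# One step over [0, 0.1]: the exactly rank-3 solution is reproduced.
U1, S1, V1 = ksl_step(U, S, V, A(0.1) - A(0.0))
err = np.linalg.norm(U1 @ S1 @ V1.T - A(0.1)) / np.linalg.norm(A(0.1))
print(f"relative error after one KSL step: {err:.1e}")
```

Each substep only requires QR factorizations of tall thin matrices, so the cost stays linear in the matrix dimensions; the Tucker and Tree Tensor Network extensions discussed in the talk generalize exactly these three substeps.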
Dear all,
We hope you all had joyful holidays and wish you a great start to the
new year!
We are ready to start with this year's NOMADS seminar at GSSI and would
like to invite you to this week's talk.
The seminar will be given on Wednesday January 13 at 17:00 (CET) by
Gianluca Ceruti from University of Tuebingen.
Title, abstract and zoom link are below.
Further info about past and future meetings is available at the webpage:
https://num-gssi.github.io/seminar/
Please feel free to distribute this announcement as you see fit.
Hope to see you all on Wednesday!
Francesco and Nicola
================================
Speaker: Gianluca Ceruti, University of Tuebingen
https://na.uni-tuebingen.de/~ceruti/
Zoom link:
https://us02web.zoom.us/j/82131676880
Numerical integrators for dynamical low-rank approximation
Discretization of time-dependent high-dimensional PDEs suffers from an
undesired effect, known as the curse of dimensionality: the amount of data to
be stored and processed grows exponentially and exceeds the capacity
of common computational devices.
In this setting, time-dependent model order reduction techniques are
desirable.
In the present seminar, together with efficient numerical integrators, we
present a recently developed approach: dynamical low-rank approximation.
Dynamical low-rank approximation for matrices will be presented first,
and a numerical integrator with two remarkable properties will be
introduced: the matrix projector splitting integrator.
Based upon this numerical integrator, we will construct two equivalent
extensions to tensors (multi-dimensional arrays) in Tucker format, a
higher-order generalization of the matrix SVD. These
extensions are proven to preserve the excellent qualities of the matrix
integrator.
To conclude, via a novel compact formulation of the Tucker integrator, we
will further extend the matrix and Tucker projector splitting integrators
to the most general class of Tree Tensor Networks. Important examples
belonging to this class and of interest for applications include, but are
not restricted to, Tensor Trains.
This seminar is based upon a joint work with Ch. Lubich and H. Walach.