Speaker: Fabio Durastante
Affiliation: IAC-CNR (soon moving to the University of Pisa)
Time: Tuesday, 01/12/2020, 16:00
Title: (Sparse) Linear Algebra at the Extreme Scales
Sparse linear algebra is essential for a wide variety of scientific
applications. The availability of highly parallel sparse solvers and
preconditioners lies at the core of nearly all multi-physics and
multi-scale simulations, and the technology is now expanding to target
exascale platforms. I am going to present some work on Algebraic
Multigrid Preconditioners in which we face these challenges to make
Exascale Computing possible.
On one side, the talk focuses on the theoretical aspects of
constructing the multigrid hierarchy, where the main novelty is the
design and implementation of new parallel smoothers and of a coarsening
algorithm that aggregates unknowns by means of weighted graph matching
techniques. On the other side, it covers the libraries developed to
provide parallel BLAS features for sparse matrices, capable of running
on machines with thousands of high-performance cores, and discusses how
the new smoothers and coarsening algorithm improve numerical
scalability at low operator complexity over the algorithms available in
previous releases of the package. I will present weak scalability
results on two of the most powerful supercomputers in Europe, for
linear systems with up to O(10^10) unknowns from a benchmark Poisson
problem, and strong scaling results for a wind-simulation benchmark
problem.
This is joint work with P. D’Ambra and S. Filippone, supported by the
EU under the Horizon 2020 Project Energy oriented Centre of
Excellence: toward exascale for energy (EoCoE-II), Project ID: 824158
Meeting link: <https://hausdorff.dm.unipi.it/b/leo-xik-xu4>
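The coarsening-by-matching idea at the heart of the talk can be illustrated in miniature. The sketch below (plain Python/SciPy, not the parallel library discussed in the talk; all function names are illustrative) builds a two-grid cycle for a 1-D Poisson matrix: aggregates are formed by greedily matching each node with its strongest unmatched neighbour, and the resulting piecewise-constant prolongation defines the coarse problem.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def poisson_1d(n):
    """Standard 1-D Poisson matrix, tridiag(-1, 2, -1)."""
    return sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")

def matching_aggregates(A):
    """Greedy pairwise matching: pair each node with its strongest
    unmatched neighbour (largest |a_ij|); leftovers become singletons."""
    C = abs(A - sp.diags(A.diagonal())).tolil()
    n = A.shape[0]
    agg = -np.ones(n, dtype=int)
    n_agg = 0
    for i in range(n):
        if agg[i] >= 0:
            continue
        best, w = -1, 0.0
        for j, v in zip(C.rows[i], C.data[i]):
            if agg[j] < 0 and v > w:
                best, w = j, v
        agg[i] = n_agg
        if best >= 0:
            agg[best] = n_agg
        n_agg += 1
    return agg, n_agg

def two_grid_solve(A, b, iters=50, omega=0.7):
    """Two-grid iteration with weighted-Jacobi smoothing and a
    piecewise-constant (unsmoothed aggregation) coarse space."""
    n = A.shape[0]
    agg, nc = matching_aggregates(A)
    P = sp.csr_matrix((np.ones(n), (np.arange(n), agg)), shape=(n, nc))
    Ac = (P.T @ A @ P).tocsc()
    Dinv = omega / A.diagonal()
    x = np.zeros(n)
    for _ in range(iters):
        x = x + Dinv * (b - A @ x)                       # pre-smoothing
        x = x + P @ spla.spsolve(Ac, P.T @ (b - A @ x))  # coarse correction
        x = x + Dinv * (b - A @ x)                       # post-smoothing
    return x

A = poisson_1d(200)
b = np.ones(200)
x = two_grid_solve(A, b)
rel_res = np.linalg.norm(b - A @ x) / np.linalg.norm(b)
```

In the talk's setting the matching is weighted by the smoother and computed in parallel at extreme scale; the toy above only conveys the structure of the hierarchy.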
Dear all,
You are all invited to this week's NOMADS seminar at GSSI.
The seminar will be given by Martin Stoll from TU-Chemnitz (Germany).
Title, abstract and zoom link are below.
Further info about past and future meetings is available at the webpage:
https://num-gssi.github.io/seminar/
Please feel free to distribute this announcement as you see fit.
Hope to see you all on Wednesday!
Francesco and Nicola
==================================================
Title:
From PDEs to data science: an adventure with the graph Laplacian
Abstract:
In this talk we briefly review some basic PDE models used to describe
phase separation in materials science. These models have since become
important tools in image processing, and in recent years
semi-supervised learning strategies have been built with these PDEs
at their core. The main ingredient is the graph Laplacian that stems from
a graph representation of the data. This matrix is large and typically
dense. We illustrate some of its crucial features and show how to
efficiently work with the graph Laplacian. In particular, we need some
of its eigenvectors, and for this the Lanczos process needs to be
implemented efficiently. Here, we suggest the use of the NFFT method for
evaluating the matrix-vector products without ever fully constructing
the matrix. We illustrate the performance on several examples.
Zoom:
https://us02web.zoom.us/j/81317396646
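As a small-scale illustration of the objects in the abstract, the sketch below builds the graph Laplacian of a toy point cloud and extracts a few of its smallest eigenpairs with SciPy's eigsh, a Lanczos-type method. Note this forms the dense similarity matrix explicitly; the NFFT-based matvecs proposed in the talk are precisely what avoids that at scale. All names here are illustrative.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import eigsh

rng = np.random.default_rng(0)
# A toy point cloud: two well-separated clusters in the plane
X = np.vstack([rng.normal(0, 1, (50, 2)),
               rng.normal(0, 1, (50, 2)) + [6.0, 0.0]])

# Dense Gaussian similarity matrix W; this is the matrix the NFFT
# approach avoids constructing explicitly for large data sets
W = np.exp(-cdist(X, X) ** 2 / 2.0)
np.fill_diagonal(W, 0.0)
D = np.diag(W.sum(axis=1))
L = csc_matrix(D - W)  # (unnormalised) graph Laplacian

# A few of the smallest eigenpairs via a Lanczos-type iteration;
# shift-invert around a small negative shift keeps the solve nonsingular
vals, vecs = eigsh(L, k=3, sigma=-1e-3, which="LM")
vals = np.sort(vals)
```

The smallest eigenvalue is zero (the constant vector), and the next eigenvector, the Fiedler vector, encodes the cluster structure used in the learning strategies of the talk.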
—
Francesco Tudisco
Assistant Professor
School of Mathematics
GSSI Gran Sasso Science Institute
Web: https://ftudisco.gitlab.io
Dear all,
I have just received the email below from Luca Zanni.
I am no longer a member of the GNCS.
If any of you wants to make arrangements with Luca about the
announcement to the GNCS, his mobile number is also given in the
email.
Best,
Paolo
-------- Forwarded Message --------
Subject: Passing of Prof. Alfonso Laratta
Date: Fri, 27 Nov 2020 08:30:56 +0100
From: Luca ZANNI <luca.zanni(a)unimore.it>
Organization: Università di Modena e Reggio Emilia
To: menchi(a)di.unipi.it, paolo.ghelardoni(a)unipi.it
Dear Ornella and Paolo,
I am writing to you because I am certain you had collaborations
with Alfonso.
Yesterday afternoon I learned of his passing; surely you are aware of
it as well.
I have prepared a notice for my department and was thinking of sending
an announcement to the GNCS too, perhaps together with some colleagues
from Pisa, since Alfonso's activity was split between the two institutions.
What do you think?
I leave you my mobile number in case you need to reach me: 331 6074422
Warm regards
L.
--
*********************************************************************
* Prof. Luca Zanni * office: +39 059 2055206 *
* Dipartimento di Scienze Fisiche, Informatiche e Matematiche *
* Universita' di Modena e Reggio Emilia * fax: +39 059 2055216 *
* Via Campi, 213/b *
* 41125 - Modena - ITALY * e-mail: luca.zanni(a)unimore.it *
*********************************************************************
--
Paolo Ghelardoni
Dipartimento di Matematica
Sede -- Via Buonarroti 1/C
56127 Pisa (Italia)
Tel: +39 050 2213867
E-mail: paolo.ghelardoni(a)unipi.it
Home Page: http://pagine.dm.unipi.it/ghelardoni
Dear all,
next week Fabio Durastante, who will soon officially join the
Department of Mathematics, will give a talk in the NumPi series.
Some of you recently told me that the usual time slot (Tue 11:00 --
12:00) does not work well anymore, due to new teaching obligations.
Hence, I am preparing a new Doodle to evaluate a change of day and
time for this (and possibly the next) seminar; I kindly ask that you
fill in the Doodle by Friday, so I can then make a decision and send
the announcement.
Here is the link for the Doodle:
https://doodle.com/poll/5uq983z7s686qdp6
Thanks!
Best wishes, -- Leonardo Robol.
Speaker: Francesco Tudisco
Affiliation: GSSI
Time: Tuesday, 17/11/2020, 11:00
Meeting link: https://hausdorff.dm.unipi.it/b/leo-xik-xu4
Title: A tensor method for semi-supervised learning
Semi-supervised learning is the problem of finding clusters in a graph
or a point-cloud dataset where we are given "few" initial input labels.
Label Spreading (LS) is a standard technique for this problem, which can
be interpreted as a diffusion process of the labels on the graph. While
there are many variants of LS, nearly all of them are linear models
which, for every node, only account for the information incoming from
its direct neighbors.
Recent work in network science has shown that in many graph algorithms a
great advantage can be obtained when accounting directly for
higher-order features. Such features may be built from the point-cloud
data or the adjacency matrix, for example by considering a triangle
involving nodes i, j and k. In other contexts, higher-order information
appears explicitly, for example, in a coauthorship network, a document
with three authors forms a natural triangle.
After reviewing the original LS algorithm, I will present our proposed
variation, which directly takes advantage of higher-order information.
A key point of the proposed method is that we replace the standard
Laplacian matrix with a nonlinear Laplacian-inspired map defined in
terms of an order-three tensor.
Just like standard LS, we can show convergence of the new nonlinear
diffusion process to the global minimum of a constrained
semi-supervised loss function that enforces local and global consistency
with the input labels. We demonstrate the efficiency and efficacy of our
approach on a variety of point cloud and network datasets, where the
proposed model outperforms classical label spreading, hypergraph
clustering methods, and graph neural networks. I will also point out
some open problems related to both the modeling part and the
computational one.
https://www.dm.unipi.it/webnew/it/seminari/tensor-method-semi-supervised-le…
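For readers unfamiliar with the baseline, classical (linear) Label Spreading iterates the diffusion F ← αSF + (1 − α)Y, with S the symmetrically normalised adjacency matrix; the talk replaces S with a nonlinear tensor-based map. A minimal sketch of the classical method on a toy graph, with all names illustrative:

```python
import numpy as np

def label_spreading(W, Y, alpha=0.9, iters=200):
    """Classical Label Spreading: F <- alpha*S*F + (1-alpha)*Y,
    with S = D^{-1/2} W D^{-1/2} the normalised adjacency matrix."""
    d = W.sum(axis=1)
    S = W / np.sqrt(np.outer(d, d))   # elementwise D^{-1/2} W D^{-1/2}
    F = Y.copy()
    for _ in range(iters):
        F = alpha * (S @ F) + (1 - alpha) * Y
    return F.argmax(axis=1)           # predicted class per node

# Toy graph: two triangles {0,1,2} and {3,4,5} joined by the edge (2,3)
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
W = np.zeros((6, 6))
for i, j in edges:
    W[i, j] = W[j, i] = 1.0

# One labelled node per class: node 0 -> class 0, node 5 -> class 1
Y = np.zeros((6, 2))
Y[0, 0] = 1.0
Y[5, 1] = 1.0

labels = label_spreading(W, Y)
```

Since α < 1, the iteration is a contraction and converges to the minimiser of the associated local/global consistency loss; each triangle inherits the label of its seed.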
Dear all,
We are happy to invite you all to the next NOMADS (Numerical ODEs,
Matrix Analysis and Data Science) seminar at GSSI. This week we will
have two talks, as per the following schedule:
1) Christian Lubich, University of Tuebingen, Germany
Wednesday November 18, 17h00 (CET)
2) Patricia Diaz De Alba, Gran Sasso Science Institute, Italy
Friday November 20, 17h15 (CET)
All seminars take place via Zoom. See below for additional information
(e.g. title, abstract and zoom link). Further info about past and future
meetings is available at the webpage:
https://num-gssi.github.io/seminar/
Please feel free to distribute this announcement as you see fit.
Hope to see you all on Wednesday and Friday!
Francesco Tudisco and Nicola Guglielmi
---------------------------------------------------------------------
November 18, 2020 (Wednesday) at 17h00 (Italian time)
Christian Lubich <https://na.uni-tuebingen.de/~lubich/>
*Dynamical low-rank approximation*
This talk reviews differential equations and their numerical solution on
manifolds of low-rank matrices or of tensors with a rank structure such
as tensor trains or general tree tensor networks. These low-rank
differential equations serve to approximate, in a data-compressed
format, large time-dependent matrices and tensors or multivariate
functions that are either given explicitly via their increments or are
unknown solutions to high-dimensional evolutionary differential
equations, with multi-particle time-dependent Schrödinger equations and
kinetic equations such as Vlasov equations as noteworthy examples of
applications.
Recently developed numerical time integrators are based on splitting the
projection onto the tangent space of the low-rank manifold at the
current approximation. In contrast to all standard integrators, these
projector-splitting methods are robust to the unavoidable presence of
small singular values in the low-rank approximation. This robustness
relies on exploiting geometric properties of the manifold of low-rank
matrices or tensors: in each substep of the projector-splitting
algorithm, the approximation moves along a flat subspace of the low-rank
manifold. In this way, high curvature due to small singular values does
no harm.
This talk is based on work done intermittently over the last decade with
Othmar Koch, Bart Vandereycken, Ivan Oseledets, Emil Kieri, Hanna Walach
and Gianluca Ceruti.
Zoom link <https://us02web.zoom.us/j/83782294125>
Add event to Google calendar
<https://calendar.google.com/event?action=TEMPLATE&tmeid=MW83YTByZm1ia2E0MGp…>
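As a concrete illustration of the projector-splitting idea in the abstract, here is a single first-order (Lie-Trotter, "KSL") step for a matrix increment ΔY, keeping the factored form U S Vᵀ throughout; the middle substep carries the splitting's characteristic minus sign (it runs "backwards"). This is a sketch under the usual rank-r assumptions, not the speaker's code; it reproduces the integrator's known exactness property when the increment stays on the rank-r manifold.

```python
import numpy as np

def ksl_step(U0, S0, V0, dY):
    """One Lie-Trotter projector-splitting (KSL) step for the
    low-rank factorisation Y = U S V^T, given the increment dY."""
    # K-step: evolve K = U*S with V frozen, then re-orthogonalise
    K = U0 @ S0 + dY @ V0
    U1, S_hat = np.linalg.qr(K)
    # S-step: note the minus sign -- this substep runs backwards
    S_tilde = S_hat - U1.T @ dY @ V0
    # L-step: evolve L = V*S^T with the new U frozen
    L = V0 @ S_tilde.T + dY.T @ U1
    V1, S1_T = np.linalg.qr(L)
    return U1, S1_T.T, V1

rng = np.random.default_rng(0)
r = 3
Y0 = rng.normal(size=(20, r)) @ rng.normal(size=(r, 15))
Y1 = rng.normal(size=(20, r)) @ rng.normal(size=(r, 15))

# Start from the rank-r factors of Y0 (via a truncated SVD)
U, s, Vt = np.linalg.svd(Y0, full_matrices=False)
U0, S0, V0 = U[:, :r], np.diag(s[:r]), Vt[:r].T

# With a rank-r endpoint, the step reproduces Y1 exactly
U1, S1, V1 = ksl_step(U0, S0, V0, Y1 - Y0)
err = np.linalg.norm(U1 @ S1 @ V1.T - Y1) / np.linalg.norm(Y1)
```

The robustness to small singular values mentioned in the abstract shows up here in that no inverse of S is ever formed: each substep moves along a flat subspace of the manifold.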
-------------------------------------------------------------------
November 20, 2020 (Friday) at 17h30 (Italian time)
Patricia Diaz De Alba <http://bugs.unica.it/~patricia/>
*Numerical treatment for inverse electromagnetic problems*
Electromagnetic induction surveys are among the most popular techniques
for non-destructive investigation of soil properties, in order to detect
the presence of both ground inhomogeneities and particular substances.
Frequency-domain electromagnetic instruments allow the collection of
data in different configurations, that is, varying the intercoil
spacing, the frequency, and the height above the ground.
Based on a non-linear forward model used to describe the interaction
between an electromagnetic field and the soil, the aim is to reconstruct
the distribution of either the electrical conductivity or the magnetic
permeability with respect to depth. To this end, the inversion of both
the real (in-phase) and the imaginary (quadrature) components of the
signal is studied by a regularized damped Gauss-Newton method. The
regularization part of the algorithm is based on a low-rank
approximation of the Jacobian of the non-linear model. Furthermore, in
many situations, a regularization scheme retrieving smooth solutions is
blindly applied, without taking into account the prior available
knowledge. An algorithm for a regularization method that promotes the
sparsity of the reconstructed electrical conductivity or magnetic
permeability distribution is available. This regularization strategy
incorporates a minimum gradient support stabilizer into a truncated
generalized singular value decomposition scheme. The whole inversion
algorithm has been enclosed in a MATLAB package, called FDEMtools,
allowing the user to experiment with synthetic and experimental data
sets, and different regularization strategies, in order to compare them
and draw conclusions.
The numerical effectiveness of the inversion procedure is demonstrated
on synthetic and real datasets using the FDEMtools package.
Zoom link <https://us02web.zoom.us/j/85165138386>
Add event to Google calendar
<https://calendar.google.com/event?action=TEMPLATE&tmeid=N2gxdW1xOGhjOTZsa2J…>
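In outline, the inversion engine described above is a regularized damped Gauss-Newton iteration. The fragment below only sketches the generic scheme on a toy exponential-decay model, with a simple Tikhonov term and backtracking damping; it is not the FDEMtools implementation, and all names are illustrative.

```python
import numpy as np

def gauss_newton(F, J, m0, data, lam=1e-8, iters=50):
    """Damped Gauss-Newton for min ||F(m) - data||^2 + lam*||m||^2.
    Backtracking halves the step until the residual decreases."""
    m = m0.astype(float)
    for _ in range(iters):
        r = F(m) - data
        Jm = J(m)
        # Tikhonov-regularised normal equations for the GN step
        step = np.linalg.solve(Jm.T @ Jm + lam * np.eye(m.size),
                               Jm.T @ r + lam * m)
        t = 1.0
        while t > 1e-8 and np.linalg.norm(F(m - t * step) - data) >= np.linalg.norm(r):
            t /= 2.0   # damping: backtrack on the step length
        m = m - t * step
    return m

# Toy forward model: f(t; a, b) = a * exp(-b t), true parameters a=2, b=0.5
t = np.linspace(0.0, 3.0, 20)

def F(m):
    return m[0] * np.exp(-m[1] * t)

def J(m):
    # Jacobian of F with respect to (a, b)
    return np.column_stack([np.exp(-m[1] * t),
                            -m[0] * t * np.exp(-m[1] * t)])

data = F(np.array([2.0, 0.5]))          # noise-free synthetic data
m = gauss_newton(F, J, np.array([1.0, 1.0]), data)
```

The low-rank-Jacobian and sparsity-promoting regularizations of the talk replace the plain lam*I term above with more structured stabilizers.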
--------------------------------------------------------------------
—
Francesco Tudisco
Assistant Professor
School of Mathematics
GSSI Gran Sasso Science Institute
Web: https://ftudisco.gitlab.io
This event has been canceled.
Title: [Numpi] Seminar on 17/11 (Francesco Tudisco)
Speaker: Francesco Tudisco
Affiliation: GSSI
Time: Tuesday, 17/11/2020, 11:00
Meeting link: https://hausdorff.dm.unipi.it/b/leo-xik-xu4
Title: A tensor method for semi-supervised learning
When: Tue Nov 17, 2020 11:00 – 12:00 Central European Time - Rome
Joining info: Join with Google Meet
https://meet.google.com/wtx-fpah-paw
Calendar: numpi(a)di.unipi.it
Who:
* enrico.facca(a)gmail.com - organizer
* numpi(a)di.unipi.it
* f.durastante(a)na.iac.cnr.it
* viviani(a)chalmers.se
* leonardo.robol(a)unipi.it
You have been invited to the following event.
Title: [Numpi] Seminar on 17/11 (Francesco Tudisco)
Speaker: Francesco Tudisco
Affiliation: GSSI
Time: Tuesday, 17/11/2020, 11:00
Meeting link: https://hausdorff.dm.unipi.it/b/leo-xik-xu4
Title: A tensor method for semi-supervised learning
Dear all,
this is just a reminder of today's seminar.
Best, -- Leonardo.
Speaker: Angelo Casulli
Affiliation: SNS
Time: Tuesday, 03/11/2020, 11:00
Meeting link: https://hausdorff.dm.unipi.it/b/leo-xik-xu4
Title: Rank-structured QR for Chebyshev rootfinding
The computation of the roots of polynomials expressed in the Chebyshev
basis has many applications; for instance, it is useful for computing
the real roots of smooth functions.
We present an algorithm for the rootfinding of Chebyshev polynomials
based on an improvement of the QR iteration presented in [Eidelman, Y.,
Gemignani, L., and Gohberg, I., Numer. Algorithms , 47.3 (2008): pp.
253-273]. We introduce an aggressive early deflation strategy, and we
show that the rank structure makes it possible to parallelize the
algorithm, avoiding data dependencies that would be present in the
unstructured QR. The method exploits the particular structure of the
colleague linearization to achieve quadratic complexity and linear
storage requirements. The (unbalanced) QR iteration used for Chebyshev
rootfinding does not guarantee backward stability on the polynomial
coefficients unless the vector of coefficients satisfies ||p|| ~ 1, a
hypothesis that is almost never verified for polynomials approximating
smooth functions. Even though the presented method is mathematically
equivalent to the latter algorithm, we show that exploiting the rank
structure allows us to guarantee a small backward error on the
polynomial, up to an explicitly computable amplification factor ɣ(p), which
depends on the polynomial under consideration. We show that this
parameter is almost always of moderate size, making the method accurate
on several numerical tests, in contrast with what happens in the
unstructured unbalanced QR. We also discuss the connection between the
size of this amplification factor and the existence of a good
balancing. This provides some insight on why the accuracy of our method
is often very close to the balanced QR iteration.
https://www.dm.unipi.it/webnew/it/seminari/rank-structured-qr-chebyshev-roo…
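For context, the dense baseline the talk accelerates is readily available in NumPy: chebroots computes the eigenvalues of the colleague (companion) matrix of the coefficient vector, at cubic cost and without the structured method's backward-error guarantees. A small illustration, finding the real roots of cos(10x) on [-1, 1] through a Chebyshev interpolant:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Degree-30 Chebyshev interpolant of a smooth function on [-1, 1]
f = lambda x: np.cos(10 * x)
coeffs = C.chebinterpolate(f, 30)

# chebroots builds the colleague (companion) matrix of the coefficient
# vector -- see C.chebcompanion(coeffs) -- and computes its eigenvalues;
# the structured QR of the talk performs this eigensolve in O(n^2)
all_roots = C.chebroots(coeffs)

# Keep the real roots inside the interpolation interval
real = all_roots[np.abs(np.imag(all_roots)) < 1e-8].real
roots = np.sort(real[np.abs(real) <= 1.0])

# cos(10x) = 0 at x = (2k+1)*pi/20; six such points lie in [-1, 1]
expected = np.array([(2 * k + 1) * np.pi / 20 for k in range(-3, 3)])
```

This is only a stand-in for the dense, unstructured approach; the structured method additionally controls the backward error through the amplification factor ɣ(p) discussed in the abstract.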