Reading Course in Numerical Analysis

What is it?

The reading course is a medium-term (typically one semester) group study of a specific topic, organized around periodic sessions. Each session lasts two hours, and sessions are held every two weeks (see the calendar below for further details).

Each meeting is devoted to a paper or a book chapter, presented by one of the attendees. The topics are chosen among current trends in Numerical Analysis and Linear Algebra, based on requests from the audience.

Anybody is welcome to join, and Ph.D. and master’s students are particularly encouraged to participate.

The reading course takes place at the Department of Mathematics. It is currently coordinated by Fabio Durastante, Stefano Massei, and Leonardo Robol (if you wish to join the group just let us know!).

How to join us

Just subscribe to the reading course mailing list. All communications will be delivered there.

Calendar and Topics

Spring 2023: Model Order Reduction

The new edition of the reading course will focus on Model Order Reduction, following chapters from the book Model Reduction and Approximation, edited by Peter Benner, Albert Cohen, Mario Ohlberger, and Karen Willcox (available on the SIAM website).

  • 21/03, 16:00 — 18:00, Aula Riunioni
    An Introduction to LTI systems and Balanced Truncation
    Speaker: Angelo Casulli, SNS.
  • 04/04, 16:00 — 18:00, Aula Riunioni
    Model Order Reduction by Rational Approximation: rational Krylov and IRKA
    Speaker: Alberto Bucci, University of Pisa
  • 18/04, 16:00 — 18:00, Aula Riunioni
    An introduction to Proper Orthogonal Decomposition
    Speaker: Chiara Faccio, SNS
  • 02/05, 16:00 — 18:00, Aula Riunioni
    POD for linear-quadratic control
    Speaker: Santolo Leveque, SNS
  • 16/05, 16:00 — 18:00, Aula Riunioni
    Model Order Reduction for conservative problems
    Speaker: Milo Viviani, SNS (to be confirmed)
  • 30/05, 16:00 — 18:00, Aula Riunioni
    Speaker: TBA

The topic(s) for the last meeting(s) will be decided during the course and are still subject to change. Options under consideration include: reduced basis methods, high-dimensional interpolation, the Loewner framework, the Discrete Empirical Interpolation Method (DEIM), and model reduction methods for complex network systems.
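As a small taste of the material covered in this edition, here is a minimal Proper Orthogonal Decomposition (POD) sketch in Python. The snapshot data, grid sizes, and energy tolerance are illustrative choices, not taken from the book: POD modes are computed as the left singular vectors of a snapshot matrix, truncated by a relative energy criterion.

```python
import numpy as np

# Illustrative setup: snapshots of a scalar field u(x, t) sampled on a grid.
nx, nt = 200, 50
x = np.linspace(0.0, 1.0, nx)
t = np.linspace(0.0, 1.0, nt)
# Synthetic snapshot matrix: each column is the state at one time instant.
S = np.sin(np.pi * x[:, None]) * np.cos(2 * np.pi * t[None, :]) \
    + 0.1 * np.sin(3 * np.pi * x[:, None]) * np.sin(4 * np.pi * t[None, :])

# POD: SVD of the snapshot matrix; the left singular vectors are the modes.
U, sigma, Vt = np.linalg.svd(S, full_matrices=False)

# Truncate to r modes chosen by a relative energy criterion (tolerance is
# an arbitrary illustrative value).
energy = np.cumsum(sigma**2) / np.sum(sigma**2)
r = int(np.searchsorted(energy, 1.0 - 1e-8)) + 1
Ur = U[:, :r]  # reduced basis; columns are orthonormal

# Project and reconstruct: the error is governed by the discarded
# singular values (Eckart-Young).
S_r = Ur @ (Ur.T @ S)
err = np.linalg.norm(S - S_r) / np.linalg.norm(S)
print(r, err)
```

Since the synthetic snapshot matrix above is a sum of two separable terms, it has rank two, and the POD basis with r = 2 reconstructs it to machine precision.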

Previous editions

Fall 2022: Randomized Linear Algebra

This edition focused on randomized methods for linear systems, eigenvalue problems, least squares, factorizations, low-rank approximation, and trace estimation.

    • 03/10/2022, 14:00--16:00, Aula Riunioni
      Introduction to randomized low-rank approximation: randomized embeddings [1, Chapters 6-10].
      Speaker: Angelo Casulli.

    • 17/10/2022, 14:00--16:00, Aula Riunioni
      Randomized range finder, interpolative decompositions, and Randomized SVD [1, Chapters 11-14] and [2].
      Speaker: Alberto Bucci.

    • 07/11/2022, 14:00--16:00, Aula Riunioni
      Randomized least squares: Blendenpik [3].
      Speaker: Chiara Faccio.

    • 21/11/2022, 14:00--16:00, Aula Riunioni
      Randomized methods for eigenvalues and linear systems [4].
      Speaker: Igor Simunec.

    • 05/12/2022, 14:00--16:00, Aula Riunioni
      Hutch++: Optimal stochastic trace estimation [5].
      Speaker: Michele Rinelli.

    • 20/12/2022, 11:00--13:00, Aula Magna
      Norm and trace estimation with random rank one vectors [6].
      Speaker: Nikita Deniskin.


[1] Randomized numerical linear algebra: Foundations and algorithms, Martinsson and Tropp, Acta Numerica, 2020.
[2] Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions, Halko et al., SIAM Review, 2011.
[3] Blendenpik: Supercharging LAPACK's least-squares solver, Avron et al., SISC, 2010.
[4] Fast & accurate randomized algorithms for linear systems and eigenvalue problems, Nakatsukasa and Tropp, arXiv, 2021.
[5] Hutch++: Optimal stochastic trace estimation, Meyer et al., SOSA, 2021.
[6] Norm and trace estimation with random rank one vectors, Bujanovic and Kressner, SIMAX, 2021.
[7] Input sparsity time low-rank approximation via ridge leverage score sampling, Cohen et al., SIAM, 2017.
[8] Random matrix theory, Edelman and Rao, Acta Numerica, 2005.
