15 June 2023

Programming Nonlinear Propagation for Efficient Optical Learning Machines

There is a long history linking optics and machine learning, going back to the 1980s, when optics was first used to implement neural networks. Interest in the optical implementation of neural networks has recently revived, driven by the explosion in the size of the networks being realized and the high energy consumption required to train and operate them digitally. In this presentation, I will focus primarily on multimode fibers and their use as nonlinear optical computing elements. I will show that, in a variety of classification tasks, the combination of nonlinear optical elements and digital co-processors [1] can reach classification accuracy competitive with very large digital multi-layer networks, but with lower energy consumption. A possible application area for this technology is autonomous robots, vehicles and drones, where low energy consumption is a critical need.

[1] Uğur Teğin, Mustafa Yıldırım, İlker Oğuz, Christophe Moser, Demetri Psaltis, "Scalable optical learning operator", Nature Computational Science, volume 1, pages 542–549 (2021)
Room C1.03 - East Campus USI-SUPSI

5 May 2023

After recalling the main methodological aspects of Hellenistic exact science and some of its well-documented results, a reconstruction is proposed of the mathematical aspects of a lost dynamical theory, applied to astronomy, to which Plutarch alludes in the dialogue "De facie quae in orbe lunae apparet". The reconstruction is based on passages from the pseudo-Aristotelian Mechanics, from Euclid's Elements, and from Archimedes' treatise On Spirals.
Room C1.02 - East Campus USI-SUPSI

27 April 2023

The distinction between deep learning and system identification can be quite intricate, as these fields have evolved through decades of research and contributions from diverse communities. In this talk, we aim to showcase how concepts from deep learning and system identification can be synergistically combined to create innovative algorithms and tools for data-driven modeling and analysis of nonlinear dynamical systems. Three main results will be presented:
- A novel neural network architecture, called dynoNet, which integrates transfer functions into a deep learning framework, providing a bridge between traditional system identification and modern deep learning techniques.
- A new algorithm for rapid model adaptation of neural network models, enabling fast and efficient fine-tuning to accommodate changes in system dynamics or operating conditions.
- Quantification of predictive uncertainty in deep-learning models describing nonlinear dynamical systems, providing insights into model confidence and facilitating robust decision-making.
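As a minimal illustration of the first idea, the sketch below shows a discrete-time transfer function used as a network layer: a linear IIR block followed by a static nonlinearity (a Wiener structure). This is written for this summary, not taken from the dynoNet code; the coefficients and the single-block structure are illustrative assumptions.

```python
import numpy as np

def linear_dynamical_block(u, b, a):
    """Filter input sequence u through the transfer function
    G(q) = (b[0] + b[1] q^-1 + ...) / (1 + a[0] q^-1 + a[1] q^-2 + ...).
    This is an ordinary IIR difference equation; in a dynoNet-style
    architecture, the coefficients b and a would be trainable parameters."""
    y = np.zeros(len(u))
    for t in range(len(u)):
        acc = sum(b[k] * u[t - k] for k in range(len(b)) if t - k >= 0)
        acc -= sum(a[j] * y[t - 1 - j] for j in range(len(a)) if t - 1 - j >= 0)
        y[t] = acc
    return y

# Wiener structure: linear dynamics followed by a static nonlinearity
u = np.zeros(10)
u[0] = 1.0                                            # unit impulse
y_lin = linear_dynamical_block(u, b=[0.5], a=[-0.8])  # G(q) = 0.5 / (1 - 0.8 q^-1)
y = np.tanh(y_lin)                                    # static nonlinear block
```

The impulse response of this G(q) is 0.5 * 0.8^t, so the linear block behaves exactly like a classical first-order system while remaining a composable layer.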
Room C1.02 - East Campus

20 April 2023

In the talk, I will present some recent research in Explainable Artificial Intelligence (XAI) and some novel applications of Inductive Learning of Answer Set Programs (ILASP) that I have carried out with my collaborators in three areas. The presentation will encompass both published results and ongoing work, reflecting the latest advancements in the field. ILASP is a powerful method for learning logic programs under the answer set semantics that has shown great potential in addressing explainability challenges. By exploring the use of ILASP in different contexts, we aim to demonstrate its versatility and the value it brings to the field of XAI. The talk will be organized into three main sections, each focusing on a distinct line of research: (i) ILASP for explaining reinforcement learning agents: we will present how ILASP can be utilised to generate human-understandable explanations for reinforcement learning agents' decision-making processes; (ii) ILASP for explaining preference learning systems: preference learning deals with the prediction of users' preferences based on observed data, and we will discuss the application of ILASP to create explainable models of preference learning systems in terms of weak constraints, making the process more transparent and interpretable; (iii) ILASP for learning Abstract Argumentation Frameworks: Abstract Argumentation Frameworks (AFs) are a powerful approach to reasoning about conflicting information, and we will take a glimpse at the use of ILASP to learn AF semantics. Remarks on challenges and future research directions in this rapidly evolving field will conclude the talk.
Room C2.09 - East Campus USI-SUPSI

22 March 2023

Often, time series are organized into a hierarchy. For example, the total number of visitors to a country can be divided by region, and the visitors of each region can be further divided by sub-region. This is a hierarchical time series. Hierarchical forecasts should be coherent; for instance, the sum of the forecasts for the different regions should equal the forecast for the total. Forecasts that do not satisfy such constraints are incoherent. Temporal hierarchies are another application of hierarchical time series, in which the same variable is predicted at different time scales (e.g., monthly, quarterly and yearly) and coherence across the different scales is needed. Reconciliation is the process of adjusting forecasts that are created independently for each time series so that they become coherent. I will discuss the state of the art in reconciliation algorithms.
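To make the coherence constraint concrete, here is a minimal sketch of one simple reconciliation method (ordinary-least-squares projection; one of several approaches, and the forecast numbers are made up for illustration). Incoherent base forecasts are projected onto the coherent subspace spanned by the summing matrix.

```python
import numpy as np

# Hierarchy: total = region_A + region_B, encoded by the summing matrix S,
# which maps bottom-level series to all levels of the hierarchy.
S = np.array([[1.0, 1.0],   # total
              [1.0, 0.0],   # region A
              [0.0, 1.0]])  # region B

# Base forecasts produced independently per series: incoherent (55 + 52 != 100)
y_base = np.array([100.0, 55.0, 52.0])

# OLS reconciliation: orthogonal projection onto the coherent subspace
P = S @ np.linalg.inv(S.T @ S) @ S.T
y_rec = P @ y_base  # now y_rec[0] == y_rec[1] + y_rec[2]
```

More refined methods (e.g., trace minimization) replace the identity weighting implicit in this projection with an estimate of the base-forecast error covariance, but the coherence constraint is the same.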
Room B1.17 - East Campus USI-SUPSI

3 March 2023

Transformers, the purely attention-based neural network architecture, have emerged as a powerful tool in sequence processing. But how does a transformer think? When we discuss the computational power of RNNs, or consider a problem that they have solved, it is easy to think in terms of automata and their variants (such as counter machines and pushdown automata). But when it comes to transformers, no such intuitive model is available. In this talk I will present a programming language, RASP (Restricted Access Sequence Processing), which we hope will serve the same purpose for transformers as finite state machines do for RNNs. In particular, we will identify the base computations of a transformer and abstract them into a small number of primitives, which are composed into a small programming language. We will go through some example programs in the language and discuss how a given RASP program relates to the transformer architecture.
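To give a flavor of the primitives, here is a small Python sketch written for this summary (not the reference RASP implementation): select builds an attention-pattern-like boolean matrix, aggregate averages values over the selected positions, and selector-width counts them. Two classic RASP-style programs follow.

```python
def select(keys, queries, predicate):
    """Attention-pattern primitive: matrix[i][j] = predicate(keys[j], queries[i])."""
    return [[predicate(k, q) for k in keys] for q in queries]

def aggregate(selector, values):
    """Average the selected values for each query position (uniform attention);
    positions that select nothing produce 0."""
    out = []
    for row in selector:
        chosen = [v for v, s in zip(values, row) if s]
        out.append(sum(chosen) / len(chosen) if chosen else 0)
    return out

def selector_width(selector):
    """Number of positions each query selects (a derived RASP operator)."""
    return [sum(row) for row in selector]

tokens = list("hello")
idx = list(range(len(tokens)))

# Histogram: for each position, how many tokens in the sequence match it
hist = selector_width(select(tokens, tokens, lambda k, q: k == q))

# Shift: each position reads the index of the previous position
prev = select(idx, idx, lambda k, q: k == q - 1)
shifted = aggregate(prev, idx)
```

For "hello", `hist` is [1, 1, 2, 2, 1] and `shifted` is [0, 0, 1, 2, 3]; each select/aggregate pair corresponds, roughly, to one attention head in the transformer.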
Room A1.02 - East Campus USI-SUPSI

10 February 2023

Range functions are an important tool for interval computations, and they can be employed for the problem of root isolation. In this talk, I will first introduce two new classes of range functions for real functions. They are based on the remainder form by Cornelius and Lohner (1984) and provide different improvements for the remainder part of this form. On the one hand, I will show how centered Taylor expansions can be used to derive a generalization of the classical Taylor form with higher than quadratic convergence. On the other hand, I will discuss a recursive interpolation procedure, in particular based on quadratic Lagrange interpolation, leading to recursive Lagrange forms with cubic and quartic convergence. These forms can be used to isolate the real roots of square-free polynomials with EVAL, a relatively recent algorithm that has been shown to be effective and practical. Finally, I will compare the performance of these new range functions against the standard Taylor form. Specifically, EVAL can exploit features of the recursive Lagrange forms that are not found in range functions based on Taylor expansion; experimentally, this yields at least a twofold speedup in EVAL.
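To illustrate the starting point of this line of work, the sketch below (written for this summary; the interval arithmetic is deliberately naive and ignores outward rounding) compares naive interval evaluation of a polynomial with the classical first-order centered form f(c) + f'(I)(I - c), whose overestimation shrinks quadratically with the width of I. The new forms in the talk refine the remainder term further.

```python
from itertools import product

def imul(x, y):
    """Interval multiplication: min/max over the four endpoint products."""
    p = [a * b for a, b in product(x, y)]
    return (min(p), max(p))

def iadd(x, y):
    return (x[0] + y[0], x[1] + y[1])

def ihorner(coeffs, I):
    """Naive interval evaluation of a polynomial (Horner, leading coeff first)."""
    acc = (coeffs[0], coeffs[0])
    for c in coeffs[1:]:
        acc = iadd(imul(acc, I), (c, c))
    return acc

def mean_value_form(coeffs, dcoeffs, I):
    """Classical first-order centered form: f(c) + f'(I) * (I - c)."""
    c = 0.5 * (I[0] + I[1])
    fc = ihorner(coeffs, (c, c))
    return iadd(fc, imul(ihorner(dcoeffs, I), (I[0] - c, I[1] - c)))

# f(x) = x^2 - x on I = [0.4, 0.6]; the true range is [-0.25, -0.24]
f, df, I = [1, -1, 0], [2, -1], (0.4, 0.6)
naive = ihorner(f, I)                 # ~(-0.36, -0.16), width ~0.20
centered = mean_value_form(f, df, I)  # ~(-0.27, -0.23), width ~0.04
```

Both results enclose the true range, but the centered form is five times tighter here; higher-order Taylor and recursive Lagrange forms push this further, which is what makes them attractive inside EVAL.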
Room B1.17