Wolfgang Pauli Institute (WPI) Vienna

Workshop on Neural Dynamical Systems and Time-Series Data

Location: Kolingasse, 1090 Wien. Wed, 23. Apr 25 (Opening: 9:00) to Fri, 25. Apr 25
Topics:
neural and controlled differential equations, reservoir computing, signature and signature-kernel methods, and more
with applications in modeling and optimization across fields like finance, biology, and medicine.
Organisation(s)
QUARIMAFI @ U.Wien
WPI
Inst CNRS Pauli
Organiser(s)
Linus Bleistein (Inria Montpellier and EPFL)
Christa Cuchiero (WPI c/o U.Wien)
Nina Drobac (LPSM)
Adeline Fermanian (Califrais)
Paul Hager (U.Wien)
Olivier Wintenberger (Inst. CNRS Pauli c/o Paris Sorbonne)

Talks in the framework of this event


Harang, Fabian (BI Norwegian Business School, Norway) Skylounge, 12th floor Fak.Math Univ. Wien, Oskar Morgensternplatz 1, 1090 Wien Wed, 23. Apr 25, 9:00
"Signatures with memory - exploring the Volterra signature"
In this talk, we discuss the Volterra signature, an extension of the classical rough path signature. This concept was originally proposed in H. and Tindel (2021) for Volterra-type processes driven by potentially singular kernels. Motivated by Volterra equations and their Picard iterations, we define the signature using a convolution-based algebraic structure. We explore its analytic and algebraic properties, focusing on universality in functional approximation and injectivity of the mapping from the signature to its driving signal. We also discuss the role of Chen’s relation, and computational aspects of this signature over "simple Volterra paths".
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Rauscher, Marco (Technical University Munich, Germany) Skylounge, 12th floor Fak.Math Univ. Wien, Oskar Morgensternplatz 1, 1090 Wien Wed, 23. Apr 25, 9:45
"A Fourier Inversion Formula for the Truncated Signature Group"
In this talk, we develop the unitary irreducible representations for the group of truncated signatures. We describe the concept of induced representations for locally compact, nilpotent Lie groups and characterize all equivalence classes of these representations. Finally, we exploit our characterization to determine the characters of the unitary irreducible representations to formulate and simplify the Fourier Inversion formula in this context.
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Hager, Paul (University of Vienna, Austria) Skylounge, 12th floor Fak.Math Univ. Wien, Oskar Morgensternplatz 1, 1090 Wien Wed, 23. Apr 25, 11:00
"Dynamic and Distributional Features on Path Space"
abstract tba
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Hager, Paul (University of Vienna, Austria) Skylounge, 12th floor Fak.Math Univ. Wien, Oskar Morgensternplatz 1, 1090 Wien Wed, 23. Apr 25, 11:30
Mini Course 1: "Distributional Features on Path Space" (Part 1)
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Yang, Lingyi (University of Oxford, United Kingdom) Skylounge, 12th floor Fak.Math Univ. Wien, Oskar Morgensternplatz 1, 1090 Wien Wed, 23. Apr 25, 13:45
"Detecting fake data through anomaly detection"
In the last couple of years, we have seen an astronomical increase in the popularity of generative deep learning, for example in the form of ChatGPT and DALL-E. At our fingertips, we can generate essays, art, music, and much more. With new technologies come new challenges. These tools can be maliciously used for misinformation and plagiarism, and detection tools therefore need to keep up with the evolution of generative AI. We frame the problem of detecting fake/generated data as an anomaly detection problem. Anomaly detection aims to identify whether new data points deviate significantly from the training corpus, and certain transformations may aid this. We present SigMahaKNN, a pipeline designed for scoring anomalous streams based on clean-corpus anomaly detection. The conformance scores of new samples are derived from a combination of path signatures with the Mahalanobis distance. Such an approach preserves desirable invariances, namely to affine transformations of the data and appending of metadata. Our pipeline is versatile and can be used with a wide array of non-stationary, multi-modal tick data with complex missingness patterns. We showcase its effectiveness in detecting time series obtained from deep generative models for voice cloning.
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)
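The core idea above, scoring signature features of a stream with a Mahalanobis distance against a clean corpus, can be sketched in a few lines. The following numpy-only sketch is purely illustrative and is not the authors' released SigMahaKNN pipeline: it computes depth-2 signatures by hand and scores two new paths (one in-distribution, one with an artificial drift) against a corpus of 2-D random walks.

```python
import numpy as np

def sig_level2(path):
    """Depth-2 signature of a piecewise-linear path, path shape (T, d)."""
    d = path.shape[1]
    a = np.zeros(d)            # running level 1: total increment so far
    S = np.zeros((d, d))       # running level 2: iterated integrals
    for step in np.diff(path, axis=0):
        S += np.outer(a, step) + 0.5 * np.outer(step, step)
        a += step
    return np.concatenate([a, S.ravel()])

def mahalanobis_scores(corpus_feats, new_feats, eps=1e-8):
    """Conformance score of each row of new_feats against the corpus."""
    mu = corpus_feats.mean(axis=0)
    cov = np.cov(corpus_feats, rowvar=False) + eps * np.eye(corpus_feats.shape[1])
    prec = np.linalg.inv(cov)
    diffs = new_feats - mu
    return np.sqrt(np.einsum("ij,jk,ik->i", diffs, prec, diffs))

rng = np.random.default_rng(0)
# clean corpus: driftless 2-D random walks
corpus = np.array([sig_level2(np.cumsum(rng.normal(size=(50, 2)), axis=0))
                   for _ in range(200)])
normal = sig_level2(np.cumsum(rng.normal(size=(50, 2)), axis=0))
anomaly = sig_level2(np.cumsum(rng.normal(loc=2.0, size=(50, 2)), axis=0))
scores = mahalanobis_scores(corpus, np.stack([normal, anomaly]))
```

In practice one would compute signatures at higher depth with a dedicated library and use a regularised covariance estimate; the invariances mentioned in the abstract come from the signature features themselves.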

Drobac, Nina (Sorbonne Université, France) Skylounge, 12th floor Fak.Math Univ. Wien, Oskar Morgensternplatz 1, 1090 Wien Wed, 23. Apr 25, 14:30
"Signed, Sealed, Predicted: Time Series Forecasting with Signatures"
Recent challenges in electricity load forecasting, such as the COVID-19 pandemic and the energy crisis, have highlighted the need for adaptive predictive models. In this talk, I will introduce a time series forecasting framework based on the signature - a non-parametric feature set for sequential data that effectively captures temporal dynamics and interaction patterns. I will first present theoretical guarantees that motivate the use of linear regression models on signatures calculated over a sliding window. To bridge the gap between theory and practice, I will showcase an online algorithm that exploits the algebraic structure of the signature space for more efficient computation. The presentation concludes with experimental results, demonstrating the framework’s potential in forecasting electricity load.
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)
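The framework outlined above, linear regression on signatures computed over a sliding window, can be illustrated with a small numpy sketch. Everything below (the toy periodic "load" series, the window length, the depth-2 truncation) is an assumption for illustration, not the speaker's setup:

```python
import numpy as np

def window_signature(w):
    """Depth-2 signature features of a time-augmented window."""
    t = np.linspace(0.0, 1.0, len(w))
    path = np.column_stack([t, w])        # augment values with time
    a = np.zeros(2)
    S = np.zeros((2, 2))
    for step in np.diff(path, axis=0):
        S += np.outer(a, step) + 0.5 * np.outer(step, step)
        a += step
    return np.concatenate([[1.0], a, S.ravel()])  # prepend constant term

rng = np.random.default_rng(1)
n, win = 400, 24
# toy "load" series: daily periodicity plus noise
y = np.sin(2 * np.pi * np.arange(n) / 24) + 0.05 * rng.normal(size=n)

X = np.array([window_signature(y[i:i + win]) for i in range(n - win)])
target = y[win:]                          # predict one step past each window
split = 300
beta, *_ = np.linalg.lstsq(X[:split], target[:split], rcond=None)
pred = X[split:] @ beta
rmse = np.sqrt(np.mean((pred - target[split:]) ** 2))
```

The online algorithm of the talk would update these signature features incrementally via Chen's relation instead of recomputing each window from scratch.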

Carrondo, Tomás (University of Vienna, Austria) Skylounge, 12th floor Fak.Math Univ. Wien, Oskar Morgensternplatz 1, 1090 Wien Wed, 23. Apr 25, 15:00
"Well-posedness and approximation properties of signature CDEs"
In this talk, I will introduce signature controlled differential equations (Sig-CDEs) as a natural and universal class within the framework of path-dependent controlled differential equations. A central open problem is the absence of a general existence and uniqueness theory, even in the simpler case of bounded variation drivers. I will present two equivalent formulations of Sig-CDEs and provide sufficient conditions for existence and uniqueness in each. Additionally, I will establish a stability result showing that any solution of a path-dependent CDE can be approximated, under mild assumptions and in a suitable sense, by a solution of a Sig-CDE. The aim is to offer new insights into the foundational theory of Sig-CDEs and their role in modeling path-dependent dynamics. This talk is based on recent and ongoing joint work with Christa Cuchiero, Paul Hager, and Fabian Harang.
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Pachschwöll, Julian (University of Vienna, Austria) Skylounge, 12th floor Fak.Math Univ. Wien, Oskar Morgensternplatz 1, 1090 Wien Wed, 23. Apr 25, 16:00
"Solving High-Dimensional Riccati Equations in Signature Volatility Models"
We study signature-based volatility models where the volatility is given as a linear combination of signature terms of an underlying primary process, specified as a multivariate time-extended Ornstein-Uhlenbeck process. Using the affine framework introduced by Cuchiero et al., we view the log price, enhanced with the signature of the primary process, as an infinite-dimensional affine process. Under certain non-trivial assumptions, this allows us to express the characteristic function of the log price as the solution to an infinite-dimensional Riccati ODE. Truncating this system provides a practical method for approximating the characteristic function, enabling option pricing via Fourier methods. This talk focuses on the numerical solution of the truncated Riccati system, highlighting implementation challenges, particularly the need for fast and efficient evaluation of the ODE function. We discuss these aspects in detail and present numerical results demonstrating the performance and limitations of the approach.
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Chan, Ric (Singapore Management University, Singapore) Skylounge, 12th floor Fak.Math Univ. Wien, Oskar Morgensternplatz 1, 1090 Wien Wed, 23. Apr 25, 16:30
"Signature-Based Clustering of Financial Time Series Using Weighted Signature Kernels"
This study introduces a novel methodology for clustering financial time series through the application of weighted signature kernels, a technique that blends stochastic analysis with modern machine learning. Leveraging the signature transform to effectively capture the sequential dynamics inherent in financial data, our approach computes a weighted signature kernel matrix that quantifies the similarity between multivariate time series. We validate our method on US equity data (NVDA, AAPL, MSFT, and GOOGL) spanning from 2022 to 2024, where log returns are standardized and fed into a PDE-based solver to obtain the signature kernel. The clustering pipeline integrates k-means with silhouette score optimization to determine the optimal number of clusters, thereby revealing distinct market regimes and anomaly patterns. In addition, we extend the analysis by incorporating an optimal measure framework and evaluating hyperbolic development kernels to quantify the alignment with Wiener measure. This dual analysis not only refines the clustering outcomes but also provides a robust error metric that compares the optimal discrete measure against theoretical benchmarks. Supplementary quality assessments, including metrics such as the Calinski-Harabasz and Davies-Bouldin scores, further support the efficacy of the proposed method. Overall, the methodology demonstrates that weighted signature kernels are a powerful tool for unveiling complex structural similarities in financial time series, offering promising applications in market segmentation, portfolio optimization, and anomaly detection.
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Friz, Peter (Technische Universität Berlin & Weierstrass Institute, Germany) Skylounge, 12th floor Fak.Math Univ. Wien, Oskar Morgensternplatz 1, 1090 Wien Wed, 23. Apr 25, 17:00
"On Expected Signature Kernels"
The expected signature kernel arises in statistical learning tasks as a similarity measure of probability measures on path space. Computing this kernel for known classes of stochastic processes is an important problem that, in particular, can help reduce computational costs. Building on the representation of the expected signature of inhomogeneous Lévy processes as the development of a smooth path in the extended tensor algebra [F.-H.-Tapia, Forum of Mathematics: Sigma (2022), "Unified signature cumulants and generalized Magnus expansions"], we extend the arguments developed for smooth rough paths in [Lemercier-Lyons (2024), "A high-order solver for signature kernels"] to derive a PDE system for the expected signature of inhomogeneous Lévy processes. As a specific example, we demonstrate that the expected signature kernel of Gaussian martingales satisfies a Goursat PDE. (Joint work with P. Hager)
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Riedel, Sebastian (FernUni Hagen, Germany) Seminar room 5, Kolingasse 14-16 Thu, 24. Apr 25, 9:00
"Stochastic control with signatures"
We present a new approach to study stochastic optimal control problems using the signature, an object originating from rough path theory. We will show how to solve the optimal stopping problem and furthermore study optimal control of stochastic differential equations with the signature. This is joint work with Peter Bank, Christian Bayer, Paul Hager, Tobias Nauen and John Schoenmakers.
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Ballarin, Giovanni (University of St. Gallen, Switzerland) Seminar room 5, Kolingasse 14-16 Thu, 24. Apr 25, 9:45
"From Many Models, One: Macroeconomic Forecasting with Reservoir Ensembles"
Model combination is a powerful approach for achieving better performance with a set of models than by just selecting any single one. We study both theoretically and empirically the effectiveness of ensembles of Multi-Frequency Echo State Networks (MF-ESNs), which have been shown to achieve state-of-the-art macroeconomic time series forecasting results (Ballarin et al., 2024). Hedge and Follow-the-Leader schemes are discussed, and their online learning guarantees are extended to the case of dependent data. In applications, our proposed Ensemble Echo State Networks show significantly improved predictive performance compared to individual MF-ESN models.
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)
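A minimal illustration of the ingredients above: several randomly initialised echo state networks with ridge readouts, combined into an ensemble. The talk studies Hedge and Follow-the-Leader combinations, which reweight models online; the sketch below uses plain equal-weight averaging only to keep things short, and all dimensions and the toy series are assumptions:

```python
import numpy as np

def esn_states(u, n_res=100, rho=0.9, seed=0):
    """Drive a random reservoir with a scalar series; return all states."""
    g = np.random.default_rng(seed)
    W = g.normal(size=(n_res, n_res))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))  # set spectral radius
    w_in = g.normal(size=n_res)
    h, states = np.zeros(n_res), []
    for x in u:
        h = np.tanh(W @ h + w_in * x)
        states.append(h)
    return np.array(states)

rng = np.random.default_rng(2)
n, split = 500, 400
y = np.sin(0.2 * np.arange(n)) + 0.05 * rng.normal(size=n)

preds = []
for seed in range(5):                 # a small ensemble of random reservoirs
    H = esn_states(y[:-1], seed=seed)
    target = y[1:]                    # one-step-ahead prediction task
    A, lam = H[:split], 1e-2          # ridge readout on training segment
    w = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ target[:split])
    preds.append(H[split:] @ w)
ens_pred = np.mean(preds, axis=0)     # equal-weight combination
rmse = np.sqrt(np.mean((ens_pred - y[1:][split:]) ** 2))
```

Only the linear readout is trained; the reservoirs themselves stay fixed, which is what makes maintaining many models cheap enough for ensembling.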

Hager, Paul (University of Vienna, Austria) Seminar room 5, Kolingasse 14-16 Thu, 24. Apr 25, 11:00
Mini Course 1: "Distributional Features on Path Space" (Part 2)
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Salvi, Cris (Imperial College London, United Kingdom) Seminar room 5, Kolingasse 14-16 Thu, 24. Apr 25, 11:30
"Signature Methods in Finance"
abstract tba
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Lemercier, Maud (University of Oxford, United Kingdom) Seminar room 5, Kolingasse 14-16 Thu, 24. Apr 25, 13:45
"High order solvers for signature kernels"
Signature kernels are at the core of several machine learning algorithms for analysing multivariate time series. The kernels of bounded-variation paths, such as piecewise linear interpolations of time series data, are typically computed by solving a linear hyperbolic second-order PDE. However, this approach becomes considerably less practical for highly oscillatory inputs, due to significant time and memory complexities. To mitigate this issue, I will introduce a high-order method which involves replacing the original PDE, which has rapidly varying coefficients, with a system of coupled equations with piecewise constant coefficients. These coefficients are derived from the first few terms of the log-signatures of the input paths and can be computed efficiently using existing Python libraries.
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)
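For context, the baseline that high-order methods improve upon is a first-order finite-difference solve of the Goursat problem ∂²k/∂s∂t = ⟨ẋ_s, ẏ_t⟩ k with k(·,0) = k(0,·) = 1 on the data grid. A minimal numpy sketch of that first-order scheme (not the high-order method of the talk):

```python
import numpy as np

def sig_kernel(x, y):
    """First-order Goursat finite-difference solver for the signature
    kernel PDE  d^2 k/(ds dt) = <dx/ds, dy/dt> k,  k(., 0) = k(0, .) = 1.
    x, y: piecewise-linear paths of shape (m+1, d) and (n+1, d)."""
    inner = np.diff(x, axis=0) @ np.diff(y, axis=0).T   # <dx_i, dy_j>
    m, n = inner.shape
    K = np.ones((m + 1, n + 1))                         # boundary data
    for i in range(m):
        for j in range(n):
            K[i + 1, j + 1] = (K[i + 1, j] + K[i, j + 1]
                               + (inner[i, j] - 1.0) * K[i, j])
    return K[-1, -1]

# two toy 2-D piecewise-linear paths
s = np.linspace(0.0, 1.0, 100)
x = np.column_stack([s, np.sin(2 * np.pi * s)])
y = np.column_stack([s, np.cos(2 * np.pi * s)])
k_xy = sig_kernel(x, y)
```

A useful sanity check: for the 1-D linear paths x(t) = y(t) = t on [0, 1], the exact kernel is Σ_k 1/(k!)² = I₀(2) ≈ 2.2796, which the scheme approaches as the grid is refined. The high-order methods of the talk replace the rapidly varying coefficients `inner[i, j]` with log-signature-derived piecewise-constant data, allowing far coarser grids.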

Hsieh, Ya-Ping (ETH Zürich, Switzerland) Seminar room 5, Kolingasse 14-16 Thu, 24. Apr 25, 15:00
"Schrödinger Bridge Framework for Modeling Snapshot Data"
The Schrödinger Bridge (SB) framework provides a principled approach to reconstructing dynamical processes from snapshot data, with deep connections to optimal transport and stochastic optimal control. In this talk, we present a novel training algorithm that establishes rigorous guarantees for learning SBs. Our method leverages classical optimization techniques, specifically mirror descent and stochastic approximation, to ensure efficient and theoretically grounded training.
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Akobian, Liana (University of Vienna, Austria) Seminar room 5, Kolingasse 14-16 Thu, 24. Apr 25, 16:00
"Uncovering Neural Control: A Dynamical Systems Approach to Disentangling Intrinsic and Controlled Neural Dynamics"
Understanding how neurons interact to produce behavior is a key challenge in neuroscience. The dynamics of these interacting neurons define the computations that underlie the processing of sensory information, decision making, and the generation of motor output. Recent advances in dynamical system modeling have formalized observed neural activity as the temporal evolution of states within a neural state space governed by dynamical laws. While many existing models assume autonomous evolution, they may not sufficiently capture external perturbations that influence neural computation. In this work, we introduce a controlled decomposed linear dynamical system (cdLDS). In an unsupervised way, this algorithm learns unknown inputs that modulate neural state transitions, extending prior work using autonomous dynamical system models (dLDS). We apply cdLDS to whole-brain activity data from C. elegans and demonstrate its ability to separate intrinsic neural dynamics from control signals. This decomposition provides insights into how external perturbations shape neural computation, offering a principled framework for understanding the impact of control mechanisms on neural dynamics. By bridging neuroscience and mathematical modeling, our work contributes to broader applications in biological systems and dynamical modeling.
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Hernandez-Vargas, Esteban (University of Idaho, United States of America) Seminar room 5, Kolingasse 14-16 Thu, 24. Apr 25, 16:30
"Hybrid Neural Differential Equations to model Unknown Mechanisms and States"
Efforts to model complex systems increasingly face challenges from ambiguous relationships within the model, such as through partially unknown mechanisms or unmodelled intermediate states. Hybrid neural differential equations are a recent modeling framework that has been previously shown to enable the identification and prediction of complex phenomena, especially in the context of partially unknown mechanisms. We extend the application of hybrid neural differential equations to enable the incorporation of theorized but unmodelled states within differential equation models. We find that beyond their capability to incorporate partially unknown mechanisms, hybrid neural differential equations provide an effective method to include knowledge of unmeasured states into differential equation models.
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Walker, Benjamin (University of Oxford, United Kingdom) Seminar room 5, Kolingasse 14-16 Thu, 24. Apr 25, 17:00
"Linear Neural Controlled Differential Equations"
Controlled differential equations (CDEs) describe the relationship between a control path and the evolution of a solution path. Neural CDEs (NCDEs) extend this concept by parameterising the CDE’s vector field with neural networks, treating time series as observations from a control path, and interpreting the solution as a continuously evolving hidden state. Their robustness to irregular sampling makes NCDEs highly effective for real-world data modelling. This talk highlights Linear Neural Controlled Differential Equations (LNCDEs), where the vector field is linear in the hidden state. LNCDEs combine the expressive power of non-linear recurrent neural networks with the computational parallelism of structured state-space models. However, their cubic computational cost in the hidden dimension limits their scalability. We introduce three novel architectures (sparse, Walsh–Hadamard, and block-diagonal LNCDEs), collectively called Structured Linear Controlled Differential Equations (SLiCEs). We theoretically show that SLiCEs maintain the expressiveness of dense LNCDEs while significantly reducing computational complexity. Empirical benchmarks on state-tracking tasks confirm their practical efficiency and scalability.
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)
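The recursion behind LNCDEs is linear in the hidden state: discretised, h_{k+1} = h_k + Σ_c Δx_k^c A_c h_k. Choosing each A_c block-diagonal is one of the structured variants mentioned above and reduces the per-step cost from O(d_h²) to O(d_h · block size) per channel. The sketch below uses made-up dimensions and untrained random parameters purely to show the cost structure:

```python
import numpy as np

rng = np.random.default_rng(3)
d_x, d_h, n_blocks = 3, 8, 4       # control channels; hidden dim; blocks
bs = d_h // n_blocks               # block size (here 2)

# hypothetical untrained parameters: per channel, n_blocks small blocks
A_block = 0.1 * rng.normal(size=(d_x, n_blocks, bs, bs))

def step_block(h, dx):
    """One discretised linear CDE step, h <- h + sum_c dx[c] * A_c h,
    with each A_c block-diagonal: cost O(d_x * d_h * bs) instead of
    O(d_x * d_h^2) for a dense A_c."""
    hb = h.reshape(n_blocks, bs)
    hb = hb + np.einsum("c,cbij,bj->bi", dx, A_block, hb)
    return hb.reshape(d_h)

x = np.cumsum(0.05 * rng.normal(size=(100, d_x)), axis=0)  # control path
h = np.ones(d_h)
for dx in np.diff(x, axis=0):
    h = step_block(h, dx)
```

Because each step is a (structured) matrix acting on the state, the whole solve can also be expressed as a product of matrices and evaluated with a parallel scan, which is the parallelism shared with state-space models.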

Salvi, Cris (Imperial College London, United Kingdom) Seminar room 5, Kolingasse 14-16 Thu, 24. Apr 25, 17:45
"Quantum Signature Kernels"
abstract tba
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

dos Reis, Goncalo (University of Edinburgh, United Kingdom) Seminar room 5, Kolingasse 14-16 Fri, 25. Apr 25, 8:30
"Deep Importance Sampling in sector-Index options"
abstract tba
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Moreno-Pino, Fernando (University of Oxford, United Kingdom) Seminar room 5, Kolingasse 14-16 Fri, 25. Apr 25, 9:15
"Rough Transformers: Lightweight and Continuous Time Series Modelling through Signature Patching"
Time-series data in real-world settings typically exhibit long-range dependencies and are observed at non-uniform intervals. In these settings, traditional sequence-based recurrent models struggle. To overcome this, researchers often replace recurrent architectures with Neural ODE-based models to account for irregularly sampled data and use Transformer-based architectures to account for long-range dependencies. Despite the success of these two approaches, both incur very high computational costs for input sequences of even moderate length. To address this challenge, we introduce the Rough Transformer, a variation of the Transformer model that operates on continuous-time representations of input sequences and incurs significantly lower computational costs. In particular, we propose multi-view signature attention, which uses path signatures to augment vanilla attention and to capture both local and global (multi-scale) dependencies in the input data, while remaining robust to changes in the sequence length and sampling frequency and yielding improved spatial processing. We find that, on a variety of time-series-related tasks, Rough Transformers consistently outperform their vanilla attention counterparts while obtaining the representational benefits of Neural ODE-based models, all at a fraction of the computational time and memory resources.
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Mohammadi, Hossein (University of Exeter, United Kingdom) Seminar room 5, Kolingasse 14-16 Fri, 25. Apr 25, 10:00
"Emulating Complex Dynamical Simulators with Random Fourier Features"
A Gaussian process (GP)-based methodology is proposed to emulate complex dynamical computer models (or simulators). The method relies on emulating the numerical flow map of the system over an initial (short) time step, where the flow map is a function that describes the evolution of the system from an initial condition to a subsequent value at the next time step. This yields a probabilistic distribution over the entire flow map function, with each draw offering an approximation to the flow map. The model output time series is then predicted (under the Markov assumption) by drawing a sample from the emulated flow map (i.e., its posterior distribution) and using it to iterate from the initial condition ahead in time. Repeating this procedure with multiple such draws creates a distribution over the time series. The mean and variance of this distribution at a specific time point serve as the model output prediction and the associated uncertainty, respectively. However, drawing a GP posterior sample that represents the underlying function across its entire domain is computationally infeasible, given the infinite-dimensional nature of this object. To overcome this limitation, one can generate such a sample in an approximate manner using random Fourier features (RFF). RFF is an efficient technique for approximating the kernel and generating GP samples, offering both computational efficiency and theoretical guarantees. The proposed method is applied to emulate several dynamic nonlinear simulators including the well-known Lorenz and van der Pol models. The results suggest that our approach has a promising predictive performance and the associated uncertainty can capture the dynamics of the system appropriately.
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)
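The key computational device above, drawing GP samples via random Fourier features, reduces a function-space draw to a finite weight vector: f(x) ≈ φ(x)ᵀw with w ~ N(0, I), where φ are random cosine features whose frequencies follow the kernel's spectral density. A minimal numpy sketch (the lengthscale, sizes, and input data are assumptions; the flow-map emulation loop itself is omitted):

```python
import numpy as np

def rff_features(X, n_feat=2000, lengthscale=1.0, seed=0):
    """Random Fourier features approximating the RBF kernel
    k(x, x') = exp(-|x - x'|^2 / (2 * lengthscale^2))."""
    g = np.random.default_rng(seed)
    W = g.normal(size=(X.shape[1], n_feat)) / lengthscale  # spectral freqs
    b = g.uniform(0.0, 2.0 * np.pi, size=n_feat)           # random phases
    return np.sqrt(2.0 / n_feat) * np.cos(X @ W + b)

rng = np.random.default_rng(4)
X = rng.uniform(-2.0, 2.0, size=(50, 1))      # e.g. candidate initial states
Phi = rff_features(X)
K_approx = Phi @ Phi.T                        # approximates the RBF Gram matrix
K_exact = np.exp(-0.5 * (X - X.T) ** 2)

# one draw of the (approximate) GP prior is just a finite weight vector,
# giving a function f(x) = phi(x) @ w evaluable anywhere in the domain
w = rng.normal(size=Phi.shape[1])
f_sample = Phi @ w
```

Because the sampled function is defined everywhere (not just at training inputs), it can be iterated as a flow map from an initial condition, which is exactly what the emulation procedure above requires.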

Salvi, Cris (Imperial College London, United Kingdom) Seminar room 5, Kolingasse 14-16 Fri, 25. Apr 25, 11:00
Mini Course 2: "Signature Methods in Finance" (Part 1 & 2)
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

Bayer, Christian (Weierstrass Institute, Germany) Seminar room 5, Kolingasse 14-16 Fri, 25. Apr 25, 12:00
"Pricing American options under rough volatility"
Rough volatility models are an important class of stock price models, which are widely recognised for allowing excellent fits to market prices of options. However, the roughness of the volatility dynamics, and, even more so, the lack of Markov property lead to considerable numerical challenges, especially regarding path-dependent options. We introduce a range of efficient numerical methods for pricing of American options under rough volatility based on path signatures. After providing theoretical analysis of the methods, we verify their accuracy using numerical examples. (Based on joint works with P. Hager, L. Pelizzari, S. Riedel, J. Schoenmakers, and J. J. Zhu.)
  • Event: Workshop on Neural Dynamical Systems and Time-Series Data (2025)

© WPI 2001-2004. www.wpi.ac.at