Signature methods provide a non-parametric way of extracting characteristic features from time series data, which is essential in machine learning tasks, dynamic stochastic modeling, and mathematical finance. Indeed, signature-based approaches allow for data-driven and thus more robust model selection mechanisms, while first principles (coming, e.g., from physics) can still be guaranteed.
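As a quick illustration (not part of the talk itself), the following minimal sketch computes the level-2 truncated signature of a piecewise-linear path with plain NumPy, using Chen's relation to accumulate the iterated integrals segment by segment. The function name and the example path are hypothetical choices for this sketch.

```python
import numpy as np

def signature_level2(path):
    """Level-2 truncated signature of a piecewise-linear path.

    path: (n, d) array of points.
    Returns (S1, S2) with S1[i] = int dx^i and
    S2[i, j] = int int_{s<t} dx^i_s dx^j_t.
    """
    inc = np.diff(path, axis=0)          # increments of each linear segment
    d = path.shape[1]
    S1 = inc.sum(axis=0)                 # level 1: total increment
    S2 = np.zeros((d, d))
    run = np.zeros(d)                    # running level-1 signature
    for dx in inc:
        # Chen's relation: cross term with the path so far,
        # plus the within-segment term (1/2) dx (x) dx
        S2 += np.outer(run, dx) + 0.5 * np.outer(dx, dx)
        run += dx
    return S1, S2
```

A useful sanity check is the shuffle identity at level 2, S2 + S2^T = S1 (x) S1, which any genuine path signature satisfies.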
One focus of this talk lies on the use of the signature as a universal linear regression basis for functionals of continuous paths, with applications in stochastic modeling. In these applications, the key quantities that have to be computed efficiently are the expected signature or the characteristic function of the signature of some underlying stochastic process. Surprisingly, this can be achieved for generic classes of (jump-)diffusions, called signature-SDEs (with possibly path-dependent characteristics), via techniques from so-called affine and polynomial processes. More precisely, we show how the signature process of these diffusions can be embedded in the framework of affine and polynomial processes, and how the infinite-dimensional Feynman-Kac PDE can be reduced to an ODE of either Riccati or linear type. As a concrete application, we present a portfolio selection problem which, thanks to the linear structure of the signature feature set, reduces to a convex quadratic optimization problem.
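To illustrate the "universal linear regression basis" idea in miniature (a toy sketch, not the talk's methodology), the snippet below regresses a path functional onto flattened level-2 signature features via ordinary least squares. The target here, the product of the terminal coordinates of a path started at the origin, is exactly linear in the level-2 signature, so the fit is exact; the data, function names, and dimensions are all hypothetical.

```python
import numpy as np

def sig_features(path):
    """Flattened level-2 signature features (1, S1, S2) of a
    piecewise-linear path given as an (n, d) array."""
    inc = np.diff(path, axis=0)
    d = path.shape[1]
    S1, S2, run = inc.sum(axis=0), np.zeros((d, d)), np.zeros(d)
    for dx in inc:
        S2 += np.outer(run, dx) + 0.5 * np.outer(dx, dx)
        run += dx
    return np.concatenate([[1.0], S1, S2.ravel()])

rng = np.random.default_rng(0)
# toy data: random 2-d piecewise-linear paths started at the origin
paths = [np.vstack([np.zeros(2),
                    np.cumsum(rng.normal(size=(10, 2)), axis=0)])
         for _ in range(200)]
X = np.array([sig_features(p) for p in paths])
# target functional: product of terminal coordinates,
# which equals S2[0,1] + S2[1,0] and is hence linear in the features
y = np.array([p[-1, 0] * p[-1, 1] for p in paths])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
```

The same linearity is what makes optimization over signature-based strategies tractable: a quadratic criterion in the payoff becomes a convex quadratic program in the regression coefficients.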
At the end we present some new research directions, including novel architectures that combine the signature with attention mechanisms, generative adversarial approaches to time series generation based on signature-SDEs, and neural signature kernel learning.
The talk is based on joint works with Janka Möller, Francesca Primavera, Sara Svaluto-Ferro and Josef Teichmann.