Week 25.02.2024 – 02.03.2024
Monday (26 Feb)
The study of stochastic PDEs has seen tremendous advances in recent years and, thanks to Hairer's theory of regularity structures and Gubinelli and Perkowski's paracontrolled approach, (local) existence and uniqueness of solutions of subcritical SPDEs are by now well understood. The goal of this talk is to move beyond the aforementioned theories and present novel tools to derive the scaling limit (in the so-called weak coupling scaling) for some stationary SPDEs at the critical dimension. Our techniques are inspired by the resolvent method developed by Landim, Olla, Yau, Varadhan, and many others, in the context of particle systems in the supercritical dimension. Time permitting, we will explain how it is possible to use our techniques to study a much wider class of statistical mechanics models at criticality, such as (self-)interacting diffusions in random environment.
Tuesday (27 Feb)
In this talk, we will focus on the group of Hamiltonian diffeomorphisms (and area-preserving homeomorphisms) of the 2-sphere. A tremendous amount of progress has been made in the study of these groups in the last few years, but many problems remain, including the Equator Conjecture. An equator on the 2-sphere is a simple closed curve whose complementary components have equal area. The Equator Conjecture predicts that for any positive K, there are pairs of equators such that any Hamiltonian diffeomorphism sending one equator to the other must have Hofer norm larger than K. We will prove an alternative conjecture, obtained by replacing "Hofer norm" with "quantitative fragmentation norm". To prove this, we construct new quasimorphisms defined on the group of all area-preserving homeomorphisms of the 2-sphere, using methods inspired by mapping class groups and geometric group theory. Joint work with Yongsheng Jia.
Wednesday (28 Feb)
By the end of their degree, we expect our students to be independent learners who can read and understand mathematical texts (e.g. textbooks and papers) and who study to understand the course material rather than simply to pass the exam. In practice, many of our students do not meet this expectation, in large part because we rarely teach these skills directly — they form part of the so-called hidden curriculum.
In this talk, I will present some activities I employed during an undergraduate calculus course which aimed to address these issues. I will also discuss how I borrow tools and techniques from my experience as a qualified English as a foreign language teacher to design my teaching sessions. Part of this was developed jointly with David Sheard (KCL).
Thursday (29 Feb)
In this talk, we introduce a fractional analogue of the Wiener chaos expansion. It is important to highlight that the fractional order relates to the order of the chaos decomposition elements, and not to the process itself, which remains the standard Wiener process. The central instrument in our fractional analogue of the Wiener chaos expansion is the function denoted by $\mathcal{H}_\alpha(x,y)$, referred to herein as a power-normalised parabolic cylinder function (and closely related to the Hermite function). Through careful analysis of several fundamental deterministic and stochastic properties, we affirm that this function essentially serves as a fractional extension of the Hermite polynomial.
The power-normalised parabolic cylinder function $\mathcal{H}_\alpha(W_t,t)$ is a martingale and can be interpreted as a fractional Itô integral with integrand 1, thereby drawing parallels with its non-fractional counterpart.
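The classical connection underlying the abstract above can be checked numerically: for nonnegative integer order $n$, Whittaker's parabolic cylinder function satisfies $D_n(x) = e^{-x^2/4}\,\mathrm{He}_n(x)$, where $\mathrm{He}_n$ is the probabilists' Hermite polynomial. The sketch below verifies this identity with SciPy's conventions; it illustrates the standard (integer-order) relationship, not the speaker's power-normalisation, whose exact definition is not given here.

```python
import numpy as np
from scipy.special import pbdv, eval_hermitenorm

# For integer order n, Whittaker's parabolic cylinder function satisfies
#   D_n(x) = exp(-x**2 / 4) * He_n(x),
# with He_n the probabilists' Hermite polynomial. pbdv returns the pair
# (D_v(x), D_v'(x)); we only need the value here.
xs = np.linspace(-3.0, 3.0, 13)
for n in range(6):
    for x in xs:
        D_n = pbdv(n, x)[0]
        hermite_side = np.exp(-x**2 / 4) * eval_hermitenorm(n, x)
        assert np.isclose(D_n, hermite_side, atol=1e-8)
```

For non-integer order, $D_\alpha$ no longer reduces to a polynomial times a Gaussian, which is precisely the regime where a "fractional Hermite" interpretation becomes nontrivial.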
We study the estimation of partial derivatives of nonparametric regression functions with many variables, with a view to conducting a significance test for the said derivatives. Our test is based on the moment generating function of the smoothed partial derivatives of an estimator of the regression function, where the estimator is a deep neural network. We demonstrate that in the context of modelling with neural networks, derivative estimation is in fact quite different from estimating the regression function itself, and hence the smoothing operation becomes important. To conduct an effective test with predictors of high or even diverging dimension, we assume, first, that the observed high-dimensional predictors arise from a factor model and, second, that only the lower-dimensional but latent factors and a subset of the marginals of the high-dimensional predictors drive the regression function. Moreover, we finely adjust the regression function estimator in order to achieve the desired asymptotic normality under the null hypothesis that the partial derivative in question is zero. We demonstrate the performance of our test in simulation studies.
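The point that derivative estimation behaves very differently from function estimation — and hence benefits from smoothing — can be illustrated in one dimension. The toy sketch below is not the authors' estimator: it replaces the neural network fit with a noisy function estimate, and the bandwidth `h` and Gaussian kernel are illustrative choices. Differentiating the noisy fit amplifies the noise, while kernel-smoothing the derivative estimate recovers most of the accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a fitted regression estimator: the true function sin(x)
# contaminated with small estimation noise.
x = np.linspace(0.0, 2.0 * np.pi, 400)
fhat = np.sin(x) + 0.05 * rng.standard_normal(x.size)

# Raw derivative by finite differences: the small noise in fhat is
# amplified by the 1/step factor and dominates the estimate.
raw = np.gradient(fhat, x)

# Kernel-smoothed derivative: Nadaraya-Watson weights with a Gaussian
# kernel; bandwidth h is a tuning choice for this illustration.
h = 0.3
K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
K /= K.sum(axis=1, keepdims=True)
smooth = K @ raw

true_deriv = np.cos(x)
mse_raw = np.mean((raw - true_deriv) ** 2)
mse_smooth = np.mean((smooth - true_deriv) ** 2)
```

Running this, `mse_smooth` is orders of magnitude below `mse_raw`, which is the one-dimensional shadow of why the smoothing step matters before feeding derivative estimates into a test statistic.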