Week 23.10.2023 – 29.10.2023

Monday (23 Oct)

Perla Sousi (University of Cambridge)
23 Oct at 15:00 - 16:00
KCL, Strand - Strand Building S4.29

Let X be a simple random walk in \mathbb{Z}_n^d with d\geq 3 and let t_{\rm{cov}} be the expected time it takes for X to visit all vertices of the torus. In joint work with Prévost and Rodriguez we study the set \mathcal{L}_\alpha of points that have not been visited by time \alpha t_{\rm{cov}} and prove that it exhibits a phase transition: there exists \alpha_* so that for all \alpha>\alpha_* and all \epsilon>0 there exists a coupling between \mathcal{L}_\alpha and two i.i.d. Bernoulli sets \mathcal{B}^{\pm} on the torus with parameters n^{-(\alpha\pm\epsilon)d}, with the property that \mathcal{B}^-\subseteq \mathcal{L}_\alpha\subseteq \mathcal{B}^+ with probability tending to 1 as n\to\infty. When \alpha\leq \alpha_*, we prove that there is no such coupling.
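The objects in the abstract are easy to play with numerically. The sketch below (not from the talk; `late_points` and all parameter choices are illustrative) runs one simple random walk on the torus \mathbb{Z}_n^d until every vertex is visited, and reports the set of "late points" still unvisited at a fraction alpha of the realized cover time. Note that t_{\rm{cov}} in the talk is an expectation, whereas a single run only gives one sample of the cover time.

```python
import random

def late_points(n=6, d=3, alpha=0.5, seed=0):
    """Run a simple random walk on the torus Z_n^d until it has
    visited all n**d vertices; return the realized cover time and
    the vertices first visited after time alpha * (cover time)."""
    random.seed(seed)
    pos = tuple(0 for _ in range(d))
    first_visit = {pos: 0}          # vertex -> time of first visit
    t = 0
    total = n ** d
    while len(first_visit) < total:
        t += 1
        axis = random.randrange(d)              # pick a coordinate
        step = random.choice((-1, 1))           # move +/-1 on it
        pos = tuple((c + step) % n if i == axis else c
                    for i, c in enumerate(pos))
        if pos not in first_visit:
            first_visit[pos] = t
    t_cov = t
    late = [v for v, s in first_visit.items() if s > alpha * t_cov]
    return t_cov, late
```

For alpha close to 1 the late set shrinks to the last few vertices covered; the theorem says that above \alpha_* its law is sandwiched between two i.i.d. Bernoulli sets.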

Posted by samuel.g.johnston@kcl.ac.uk

Tuesday (24 Oct)

Oscar Randal-Williams (University of Cambridge)
24 Oct at 15:00 - 16:00
KCL, Strand - S4.29

Kreck and Su have recently described, almost completely, the mapping class group of a smooth hypersurface in CP^4. There is a "monodromy" map from the fundamental group of the space of all smooth hypersurfaces in CP^4 to this mapping class group, and I will explain how the image of this map can be described. I will then give some idea of the differential topology methods which go into the proof.

Posted by mehdi.yazdi@kcl.ac.uk

Thursday (26 Oct)

Eugene Shargorodsky (KCL)
26 Oct at 11:00 - 12:00
KCL, Strand - S5.20

I will discuss sharp estimates for the norm of the operator "identity minus conditional expectation". They allow one to find the optimal constant in the bounded compact approximation property of L_p([0,1]), 1 < p < \infty. I will also discuss related open problems. The talk is based on a joint paper with T. Sharia.
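A finite-dimensional toy version of the operator in question (not the talk's estimates; the block-averaging sigma-algebra and function names here are illustrative) applies I - E to a vector, where E is the conditional expectation onto consecutive blocks of equal length, and measures the resulting l^p norm ratio:

```python
import numpy as np

def id_minus_cond_exp(f, block):
    """(I - E)f, where E averages f over consecutive blocks of
    length `block` (len(f) must be a multiple of `block`)."""
    f = np.asarray(f, dtype=float)
    means = f.reshape(-1, block).mean(axis=1, keepdims=True)
    return f - np.repeat(means, block, axis=1).ravel()

def norm_ratio(f, block, p):
    """||(I - E)f||_p / ||f||_p, a lower bound on the operator norm."""
    f = np.asarray(f, dtype=float)
    return np.linalg.norm(id_minus_cond_exp(f, block), p) / np.linalg.norm(f, p)
```

For p = 2 the operator is the complement of an orthogonal projection, so the norm is 1; for p != 2 the sharp constant is exactly the kind of quantity the talk addresses.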

Posted by felipe.marceca@kcl.ac.uk

Nicolás Hernández (UCL)
26 Oct at 14:00 - 15:00
KCL, Strand - S5.20

Gaussian Processes and the Kullback-Leibler divergence have been deeply studied in Statistics and Machine Learning. This paper marries these two concepts and introduces the local Kullback-Leibler divergence to learn about intervals where two Gaussian Processes differ the most. We also address the subtleties involved in estimating local divergences and the corresponding interval of maximum local divergence. The estimation performance and the numerical efficiency of the proposed method are showcased via a Monte Carlo simulation study. In a medical research context, we assess the potential of the devised tools in the analysis of electrocardiogram signals.
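As background for the talk, a crude proxy for "where two Gaussian processes differ most" (not the paper's local divergence estimator; `max_divergence_point` and the grid-of-marginals setup are assumptions made for illustration) is the pointwise KL divergence between the processes' marginal distributions on a grid, using the closed form for univariate Gaussians:

```python
import math

def kl_gaussians(mu0, sigma0, mu1, sigma1):
    """Closed-form KL( N(mu0, sigma0^2) || N(mu1, sigma1^2) )."""
    return (math.log(sigma1 / sigma0)
            + (sigma0**2 + (mu0 - mu1)**2) / (2 * sigma1**2)
            - 0.5)

def max_divergence_point(ts, mean0, sd0, mean1, sd1):
    """Pointwise KL of the two processes' marginals over the grid ts;
    return the grid point of largest divergence and its value."""
    kls = [kl_gaussians(m0, s0, m1, s1)
           for m0, s0, m1, s1 in zip(mean0, sd0, mean1, sd1)]
    i = max(range(len(ts)), key=kls.__getitem__)
    return ts[i], kls[i]
```

The pointwise proxy ignores the correlation structure along the index set, which is precisely what makes estimating an interval of maximum local divergence subtle.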

Posted by yu.luo@kcl.ac.uk