Kernel methods for spatiotemporal learning with public policy applications
Seminar | January 18 | 4-5 p.m. | 1011 Evans Hall
Seth Flaxman, Department of Statistics, Oxford
In this talk I will highlight the statistical machine learning methods that I am developing, in response to the needs of my social science collaborators, to address public policy questions. My research focuses on flexible nonparametric modeling approaches for spatiotemporal data, together with scalable inference methods that make it feasible to fit these models to large datasets. Most critically, my models and inference methods are tailored to answer scientific and public policy questions.
I will illustrate by talking about two ongoing criminology projects:
Predicting crime rates in a city neighborhood over weeks/months [Flaxman et al., ICML 2015]
We introduce a scalable Kronecker approach to Gaussian process inference using the Laplace approximation, an approximate Bayesian inference method. We show how the log-Gaussian Cox process, with an expressive kernel parameterization, can learn space/time structure in a large point pattern dataset. Our approach scales nearly linearly, allowing us to efficiently fit a point pattern dataset of n = 233,088 crime events spanning a decade in Chicago, discover spatially varying multiscale seasonal trends, and produce highly accurate long-range local-area forecasts.
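The computational core of the Kronecker approach is that a separable space/time kernel on a grid factorizes as K = Ks ⊗ Kt, so linear solves with the full n × n kernel reduce to operations on the two small factor matrices. A minimal NumPy sketch of that trick (an illustration of the general Kronecker identity, not the authors' implementation; `kron_solve` is a hypothetical helper name):

```python
import numpy as np

def kron_solve(Ks, Kt, y):
    """Solve (Ks kron Kt) x = y without forming the full kernel.

    Ks: (ns, ns) spatial kernel matrix; Kt: (nt, nt) temporal kernel
    matrix; y: vector of length ns * nt.  Cost is dominated by two
    small eigendecompositions rather than one O((ns*nt)^3) factorization.
    """
    ls, Us = np.linalg.eigh(Ks)              # Ks = Us diag(ls) Us^T
    lt, Ut = np.linalg.eigh(Kt)              # Kt = Ut diag(lt) Ut^T
    Y = y.reshape(len(ls), len(lt))          # row-major unfolding of y
    Z = Us.T @ Y @ Ut                        # rotate into the eigenbasis
    Z /= np.outer(ls, lt)                    # eigenvalues of Ks kron Kt
    return (Us @ Z @ Ut.T).reshape(-1)       # rotate back and re-stack
```

The same eigendecompositions also give the log-determinant of K (sum of log ls[i] * lt[j]), which is what makes Laplace-approximation inference for the log-Gaussian Cox process tractable at this scale.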
Is gun violence contagious? [Loeffler and Flaxman, arXiv:1611.06713]
We assess spatiotemporal clustering and contagion of shootings in Washington, DC, as recorded by an acoustic gunshot locator system. Using a Hawkes process model, we quantify the spatial and temporal scales over which shooting events diffuse. While we find robust evidence for spatiotemporal diffusion, the spatial and temporal scales are extremely short (126 meters and 10 minutes), and thus more consistent with a discrete gun fight lasting a matter of minutes than with a diffusing, infectious process linking violent events across hours, days, or weeks.
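A Hawkes process models "contagion" through a conditional intensity: a background rate plus a triggering kernel summed over past events, so each shooting temporarily raises the rate of nearby future shootings. A minimal sketch, assuming an exponential temporal kernel and a Gaussian spatial kernel (common but illustrative choices; the parameter values below merely echo the reported 126 m / 10 min scales and are not fitted estimates, and `hawkes_intensity` is a hypothetical name):

```python
import math

def hawkes_intensity(s, t, events, mu=0.5, alpha=0.3,
                     beta=1.0 / 600.0, sigma=126.0):
    """Conditional intensity lambda(s, t) of a spatiotemporal
    Hawkes process at location s = (x, y) and time t (seconds).

    mu     -- background rate (illustrative value)
    alpha  -- expected offspring per event (illustrative value)
    beta   -- temporal decay; 1/600 s^-1 ~ a 10-minute timescale
    sigma  -- spatial bandwidth in meters, ~ the reported 126 m
    events -- past events as (x, y, t_i) triples
    """
    lam = mu
    for (x, y, ti) in events:
        if ti < t:
            dt = t - ti
            d2 = (s[0] - x) ** 2 + (s[1] - y) ** 2
            # exponential decay in time, Gaussian decay in space
            lam += (alpha * beta * math.exp(-beta * dt)
                    * math.exp(-d2 / (2.0 * sigma ** 2))
                    / (2.0 * math.pi * sigma ** 2))
    return lam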