Cross-validation for dependent data

Håvard Rue (KAUST)

Tuesday 3rd October 2023, 16:00-17:00, Joseph Black A504

Abstract

I will discuss our new take on cross-validation (CV) for dependent data. Traditional uses of CV, like leave-one-out CV, are justified under independence-like assumptions. With dependent data, leave-one-out CV makes less sense, as it evaluates interpolation properties rather than prediction properties. We can adapt the CV idea to dependent data by removing a set of "nearby" data points (to be defined) before predicting, but the question is then how to do this in practice, which is less evident for more involved models. I will discuss our approach in the context of Latent Gaussian Models (LGMs), where we can automatically select appropriate groups of data to remove before predicting one data point. The new group-CV approach is available in the R-INLA package.
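
For readers who want to experiment, here is a minimal sketch (not from the talk) of how the group-CV idea can be exercised in R-INLA on simulated AR(1) data. The inla() fit and the CPO-based leave-one-out densities are standard R-INLA usage; the inla.group.cv() call follows the package's group-CV interface, but argument and output names may differ across INLA versions, and the num.level.sets value chosen here is an illustrative assumption.

library(INLA)

# Simulated data with strong serial dependence: under leave-one-out CV,
# predicting one point from its immediate neighbours mostly measures
# interpolation, which is the problem the abstract describes.
set.seed(1)
n <- 200
x <- as.numeric(arima.sim(model = list(ar = 0.9), n = n))
y <- x + rnorm(n, sd = 0.1)
dat <- data.frame(y = y, t = 1:n)

# Fit a latent Gaussian model with an AR(1) latent component;
# cpo = TRUE requests the leave-one-out predictive densities.
fit <- inla(y ~ -1 + f(t, model = "ar1"),
            data = dat,
            control.compute = list(cpo = TRUE))

# Leave-group-out CV: groups of "nearby" points are selected
# automatically from the fitted model; num.level.sets controls how
# many correlated neighbours are removed (an assumed choice of 3 here).
gcv <- inla.group.cv(result = fit, num.level.sets = 3)

# Compare mean log predictive scores:
# leave-one-out (interpolation) vs group-CV (prediction).
mean(log(fit$cpo$cpo), na.rm = TRUE)
mean(log(gcv$cv), na.rm = TRUE)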
