Effective degrees of freedom for nonlinear least squares lasso

Niels Richard Hansen
(University of Copenhagen)
Thiele Seminar
Thursday, 21 November, 2013, at 13:15-14:00, in Koll. G (1532-214)
Abstract:

The main motivation for the results presented in this talk is the modeling of large biological systems using ordinary or stochastic differential equation (ODE or SDE) models.

The ambition is to develop models and methods for continuous time dynamical systems that scale well with the dimension of the system. To this end, we have studied nonlinear least squares lasso estimation.

In the mean value space, the family of estimators consists of metric projections onto closed subsets, parametrized by tuning parameters. To select appropriate tuning parameters we need to estimate the generalization error. For least squares this amounts to estimating the effective degrees of freedom, a well-known concept for linear methods. The lasso is an inherently nonlinear method, but for linear least squares lasso with Gaussian errors the question of estimating the effective degrees of freedom has recently been completely settled.
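For concreteness, a standard formulation of the effective degrees of freedom in the Gaussian model (not stated explicitly in the abstract, but the quantity the talk refers to) combines Efron's covariance definition with Stein's divergence identity; for the linear lasso, the settled result is that the number of nonzero coefficients is an unbiased estimate:

```latex
% Gaussian model: y \sim N(\mu, \sigma^2 I_n), estimator \hat{\mu}(y).
\mathrm{df}(\hat{\mu})
  = \frac{1}{\sigma^2} \sum_{i=1}^{n} \operatorname{Cov}\bigl(\hat{\mu}_i(y), y_i\bigr)
  = \mathbb{E}\bigl[\nabla \cdot \hat{\mu}(y)\bigr],
% the second equality (Stein) requires \hat{\mu} to be weakly differentiable.
% For the linear least squares lasso with fit \hat{\beta}(y):
\widehat{\mathrm{df}} = \#\{\, j : \hat{\beta}_j(y) \neq 0 \,\}.
```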

For nonlinear least squares lasso with Gaussian errors we present a general result on the effective degrees of freedom, which relies on the fact that the metric projection onto a closed subset is of bounded variation. The projection is, moreover, Lebesgue almost everywhere differentiable, and the trace of its derivative underestimates the effective degrees of freedom. The result is closely related to Takeuchi's information criterion (TIC), but it is not based on asymptotic expansions.
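The relation between the trace of the derivative (the divergence) and the covariance definition of degrees of freedom can be illustrated in the simplest lasso setting, soft-thresholding under an orthogonal design, where the divergence is just the number of active coordinates. This is a minimal sketch, not the talk's nonlinear setting; the names `soft_threshold` and `df_divergence` are illustrative:

```python
import numpy as np

def soft_threshold(y, lam):
    # Lasso estimator under an orthogonal design: coordinatewise shrinkage.
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

def df_divergence(y, lam):
    # Trace of the derivative (divergence) of the soft-thresholding map:
    # the number of coordinates that survive thresholding.
    return int(np.sum(np.abs(y) > lam))

# Monte Carlo check of the covariance definition of degrees of freedom:
# df = (1 / sigma^2) * sum_i Cov(muhat_i(y), y_i).
rng = np.random.default_rng(0)
mu = np.array([3.0, 0.5, -2.0, 0.0])
sigma, lam, reps = 1.0, 1.0, 50_000
ys = mu + sigma * rng.standard_normal((reps, mu.size))
fits = soft_threshold(ys, lam)
df_cov = np.sum(np.mean(fits * (ys - mu), axis=0)) / sigma**2
df_div = np.mean(np.sum(np.abs(ys) > lam, axis=1))
# The two estimates agree up to Monte Carlo error.
```

In this orthogonal case the divergence is an unbiased estimate; the talk's point is what can still be said when the projection is onto a general closed set.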

We present an application of the results to estimation and model selection for systems whose mean solves a linear differential equation. The lasso penalization induces sparsity, and the implementation relies on fast sparse matrix exponentiation.
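For a linear system dx/dt = Ax the mean is x(t) = exp(tA) x(0), so fitting requires the action of a matrix exponential on a vector. The abstract does not show the implementation; a minimal sketch of the idea is a truncated Taylor series that needs only matrix-vector products, so A may be sparse. In practice `scipy.sparse.linalg.expm_multiply` computes this action robustly; the helper name `expm_action` below is illustrative:

```python
import numpy as np

def expm_action(A, v, t, terms=40):
    # Approximate exp(t*A) @ v by a truncated Taylor series. Only
    # matrix-vector products with A are needed, so A may be sparse.
    # Not numerically robust for large ||t*A||; prefer
    # scipy.sparse.linalg.expm_multiply for real problems.
    result = v.astype(float).copy()
    term = v.astype(float).copy()
    for k in range(1, terms):
        term = (t / k) * (A @ term)
        result += term
    return result

# Sanity check on a rotation generator: exp(t*A) rotates the plane,
# so the unit vector (1, 0) moves to (cos t, -sin t).
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
xt = expm_action(A, np.array([1.0, 0.0]), np.pi / 2)
```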

Organised by: The T.N. Thiele Centre
Contact person: Søren Asmussen