Optimal dimension selection in the functional linear regression model with functional outputs

André Mas, Institut de Modélisation Mathématiques de Montpellier

Abstract: Functional data arise when samples consist of discretized curves. In this framework, classical multivariate methods are usually irrelevant, first because the number of variables (i.e. the discretization times) may exceed the sample size (the number of curves), and second because the variables are highly correlated. Reconstructing the functions by approximation and signal-processing techniques makes it possible to propose regression models suited to random curves.

In this talk we consider a linear model with functional output. The model is expressed by means of an integral equation with an unknown kernel (to be estimated) plus a white noise. We provide the asymptotic mean square prediction error with exact constants and derive an optimal choice for the dimension-reduction parameter $k_n$. The rates obtained are optimal in the minimax sense and generalize those found when the output is real-valued. The main results hold under mild assumptions on the rate of decay of the eigenvalues of the input, which allows us to consider a wider class of parameters than in previous papers on this topic. We will also address the issue of a central limit theorem for the predictor and show that no weak convergence result can be obtained for the bare estimate (without weak topologies or smooth norms).
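To make the setting concrete, the kernel model $Y(t) = \int K(s,t)\,X(s)\,ds + \varepsilon(t)$ admits a standard estimator based on projecting the input curves onto their first $k$ empirical principal components (spectral cutoff). The sketch below is illustrative only, not the talk's method or constants; the function names `fit_flr` and `predict_flr` and the rank-2 simulation are assumptions for demonstration.

```python
import numpy as np

def fit_flr(X, Y, k):
    """Spectral-cutoff estimate of the kernel in Y(t) = ∫ K(s,t) X(s) ds + noise.
    X, Y: (n_curves, n_grid) arrays of curves sampled on a common grid over [0, 1].
    k: dimension-reduction parameter (number of principal components kept)."""
    n, p = X.shape
    dt = 1.0 / p                            # quadrature weight for the L2 integral
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    cov = Xc.T @ Xc / n * dt                # discretized covariance operator of X
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1][:k]      # k leading eigenvalues/eigenfunctions
    lam = vals[order]
    phi = vecs[:, order] / np.sqrt(dt)      # eigenfunctions, L2-normalized
    scores = Xc @ phi * dt                  # PCA scores <X_i, phi_j>
    b = (scores.T @ Yc) / (n * lam[:, None])  # coefficient curves b_j(t)
    return phi @ b                          # K_hat(s, t) = sum_j phi_j(s) b_j(t)

def predict_flr(K, Xnew, Xbar, Ybar):
    """Predict the output curve for a new input curve via the estimated kernel."""
    dt = 1.0 / K.shape[0]
    return Ybar + (Xnew - Xbar) @ K * dt
```

On noise-free data generated from a rank-2 kernel, choosing $k = 2$ recovers the fitted outputs essentially exactly; with noise, $k$ trades estimation variance against truncation bias, which is precisely the tension behind the optimal choice of $k_n$ discussed in the abstract.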