Nonparametric Inference in Functional Linear Quantile Regression by RKHS Approach
This paper studies the asymptotics of functional linear quantile regression in which the response is scalar while the covariate is a function. We apply a roughness regularization approach within a reproducing kernel Hilbert space (RKHS) framework. In this setting, narrow convergence with respect to the uniform topology fails to hold because that topology is too strong. To address this lack of uniform convergence, we propose a new approach based on Mosco convergence, whose topology is weaker than the uniform one. By applying narrow convergence with respect to the Mosco topology, we develop an infinite-dimensional version of the convexity argument and prove the asymptotic normality of argmin processes. Our new technique also yields asymptotic confidence intervals and a generalized likelihood ratio hypothesis test in a fully nonparametric setting.
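For concreteness, the following is a minimal sketch of a standard formulation of functional linear quantile regression with an RKHS roughness penalty; the notation (kernel $K$, slope $\beta_0$, tuning parameter $\lambda_n$) is assumed here for illustration and the paper's exact objective may differ.

% Assumed standard model: the conditional tau-quantile of the scalar
% response Y given the functional covariate X is linear in X through a
% slope function beta_0 lying in an RKHS H(K):
\[
  Q_\tau\bigl(Y \mid X\bigr) \;=\; \alpha_0 + \int_0^1 X(t)\,\beta_0(t)\,dt,
  \qquad \beta_0 \in \mathcal{H}(K).
\]
% Roughness-regularized estimator: minimize the empirical check loss
% rho_tau(u) = u(tau - 1{u < 0}) plus an RKHS-norm penalty with
% tuning parameter lambda_n > 0 (an assumed, generic form):
\[
  (\hat\alpha, \hat\beta)
  \;=\; \operatorname*{arg\,min}_{\alpha \in \mathbb{R},\, \beta \in \mathcal{H}(K)}
  \;\frac{1}{n}\sum_{i=1}^{n}
  \rho_\tau\!\Bigl(Y_i - \alpha - \int_0^1 X_i(t)\,\beta(t)\,dt\Bigr)
  \;+\; \lambda_n\,\|\beta\|_{\mathcal{H}(K)}^{2}.
\]

The asymptotic results described above concern the argmin process associated with objectives of this penalized form as $n \to \infty$ and $\lambda_n \to 0$.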