Optimal weighted least-squares methods
The SMAI Journal of computational mathematics, Tome 3 (2017), pp. 181-203

See the article record from the Numdam source

We consider the problem of reconstructing an unknown bounded function u defined on a domain X ⊂ ℝ^d from noiseless or noisy samples of u at n points (x^i)_{i=1,…,n}. We measure the reconstruction error in the norm of L^2(X, dρ) for some given probability measure dρ. Given a linear space V_m with dim(V_m) = m ≤ n, we study in general terms the weighted least-squares approximations from the spaces V_m based on independent random samples. It is well known that least-squares approximations can be inaccurate and unstable when m is too close to n, even in the noiseless case. Recent results from [6, 7] have shown the benefit of weighted least squares in reducing the number n of samples needed to achieve an accuracy comparable to that of best approximation in V_m, compared to standard least squares as studied in [4]. The contribution of the present paper is twofold. From the theoretical perspective, we establish results in expectation and in probability for weighted least squares in general approximation spaces V_m. These results show that, for an optimal choice of sampling measure dμ and weight w, which depends on the space V_m and on the measure dρ, stability and optimal accuracy are achieved under the mild condition that n scales linearly with m up to an additional logarithmic factor. In contrast to [4], the present analysis covers cases where the function u and its approximants from V_m are unbounded, which might occur for instance in the relevant case where X = ℝ^d and dρ is the Gaussian measure. From the numerical perspective, we propose a sampling method which allows one to generate independent and identically distributed samples from the optimal measure dμ. This method becomes of interest in the multivariate setting, where dμ is generally not of tensor product type. We illustrate this for particular examples of approximation spaces V_m of polynomial type, where the domain X is allowed to be unbounded and high or even infinite dimensional, motivated by certain applications to parametric and stochastic PDEs.
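
As an illustrative sketch (not taken from the paper's code), the following Python snippet shows how a weighted least-squares fit of the kind summarized above can be set up in a simple univariate case: X = [-1, 1], dρ the uniform measure, and V_m spanned by the first m orthonormal Legendre polynomials. It uses the optimal mixture density dμ = (1/m) Σ_j |L_j|^2 dρ and the corresponding weight w = dρ/dμ; the sampling is done by plain rejection against dρ, and all function names and parameter values are illustrative assumptions.

# Illustrative sketch of optimal weighted least squares on X = [-1, 1],
# dρ = dx/2, V_m = span of the first m L^2(dρ)-orthonormal Legendre polynomials.
import numpy as np
from numpy.polynomial import legendre

def onb_legendre(x, m):
    """Evaluate the first m L^2(dρ)-orthonormal Legendre polynomials at x."""
    V = legendre.legvander(x, m - 1)           # columns: P_0, ..., P_{m-1}
    norms = np.sqrt(2 * np.arange(m) + 1)      # ||P_j||_{L^2(dx/2)} = 1/sqrt(2j+1)
    return V * norms                           # L_j = sqrt(2j+1) * P_j

def sample_optimal(n, m, rng):
    """Draw n i.i.d. samples from dμ = (1/m) Σ_j |L_j|^2 dρ by rejection against dρ."""
    out = np.empty(0)
    while out.size < n:
        x = rng.uniform(-1.0, 1.0, size=4 * n * m)       # proposal ~ dρ (uniform)
        ratio = (onb_legendre(x, m) ** 2).mean(axis=1)   # dμ/dρ, bounded by m
        keep = rng.uniform(0.0, m, size=x.size) < ratio  # accept with prob ratio/m
        out = np.concatenate([out, x[keep]])
    return out[:n]

def weighted_ls(u, n, m, rng):
    """Weighted least-squares approximation of u in V_m from n optimal samples."""
    x = sample_optimal(n, m, rng)
    L = onb_legendre(x, m)
    w = m / (L ** 2).sum(axis=1)               # optimal weight w = dρ/dμ
    sw = np.sqrt(w / n)
    coef, *_ = np.linalg.lstsq(sw[:, None] * L, sw * u(x), rcond=None)
    return coef                                # coefficients in the basis (L_j)

rng = np.random.default_rng(0)
c = weighted_ls(np.exp, n=200, m=10, rng=rng)  # degree-9 approximation of exp

In line with the stability condition stated in the abstract, one would take n of the order of m log m in this setup; the values n = 200 and m = 10 above are only chosen to keep the example small.
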

DOI : 10.5802/smai-jcm.24
Classification: 41A10, 41A25, 41A65, 62E17, 93E24
Keywords: multivariate approximation, weighted least squares, error analysis, convergence rates, random matrices, conditional sampling, polynomial approximation.

Cohen, Albert 1 ; Migliorati, Giovanni 1

1 Sorbonne Universités, UPMC Univ Paris 06, CNRS, UMR 7598, Laboratoire Jacques-Louis Lions, 4, place Jussieu, 75005 Paris, France.
@article{SMAI-JCM_2017__3__181_0,
     author = {Cohen, Albert and Migliorati, Giovanni},
     title = {Optimal weighted least-squares methods},
     journal = {The SMAI Journal of computational mathematics},
     pages = {181--203},
     publisher = {Soci\'et\'e de Math\'ematiques Appliqu\'ees et Industrielles},
     volume = {3},
     year = {2017},
     doi = {10.5802/smai-jcm.24},
     mrnumber = {3716755},
     zbl = {1416.62177},
     language = {en},
     url = {http://geodesic.mathdoc.fr/articles/10.5802/smai-jcm.24/}
}
Cohen, Albert; Migliorati, Giovanni. Optimal weighted least-squares methods. The SMAI Journal of computational mathematics, Tome 3 (2017), pp. 181-203. doi: 10.5802/smai-jcm.24
