See the article record from the Numdam source.
We consider the problem of reconstructing an unknown bounded function u defined on a domain X ⊂ ℝ^d from noiseless or noisy samples of u taken at n points. We measure the reconstruction error in a norm L²(X, dρ) for some given probability measure dρ. Given a linear space V_m with dim(V_m) = m ≤ n, we study in general terms the weighted least-squares approximations from the spaces V_m based on independent random samples. It is well known that least-squares approximations can be inaccurate and unstable when m is too close to n, even in the noiseless case. Recent results from [6, 7] have shown the interest of using weighted least squares for reducing the number n of samples that is needed to achieve an accuracy comparable to that of best approximation in V_m, compared to standard least squares as studied in [4]. The contribution of the present paper is twofold. From the theoretical perspective, we establish results in expectation and in probability for weighted least squares in general approximation spaces V_m. These results show that, for an optimal choice of sampling measure dσ and weight w, which depend on the space V_m and on the measure dρ, stability and optimal accuracy are achieved under the mild condition that n scales linearly with m up to an additional logarithmic factor. In contrast to [4], the present analysis covers cases where the function u and its approximants from V_m are unbounded, which might occur for instance in the relevant case where X = ℝ^d and ρ is the Gaussian measure. From the numerical perspective, we propose a sampling method which allows one to generate independent and identically distributed samples from the optimal measure dσ. This method becomes of interest in the multivariate setting, where dσ is generally not of tensor product type. We illustrate this for particular examples of approximation spaces V_m of polynomial type, where the domain X is allowed to be unbounded and high or even infinite dimensional, motivated by certain applications to parametric and stochastic PDEs.
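As a minimal illustrative sketch (not the paper's own code), the optimal-measure idea can be written out in the univariate case X = [-1, 1] with ρ the uniform measure and V_m the polynomials of degree below m: for an orthonormal Legendre basis (L_j), one samples from dσ = (k_m/m) dρ with k_m(x) = Σ_{j<m} L_j(x)², and solves least squares with the weight w = m/k_m. The function names below are hypothetical; rejection sampling uses the crude bound k_m ≤ m² on [-1, 1].

```python
import numpy as np
from numpy.polynomial import legendre

def onb(x, m):
    # Orthonormal Legendre basis w.r.t. the uniform probability measure
    # on [-1, 1]: L_j(x) = sqrt(2j + 1) * P_j(x), j = 0, ..., m-1.
    V = legendre.legvander(x, m - 1)            # columns P_0, ..., P_{m-1}
    return V * np.sqrt(2 * np.arange(m) + 1)

def sample_optimal(n, m, rng):
    # Rejection sampling from d(sigma) = (k_m / m) d(rho), using the
    # envelope k_m(x) <= m^2, valid for Legendre polynomials on [-1, 1].
    out = []
    while len(out) < n:
        x = rng.uniform(-1.0, 1.0, size=n)
        k = (onb(x, m) ** 2).sum(axis=1)        # k_m(x), inverse Christoffel
        keep = rng.uniform(0.0, 1.0, size=n) < k / m**2
        out.extend(x[keep])
    return np.array(out[:n])

def weighted_ls(u, n, m, rng):
    # Weighted least squares: minimize (1/n) sum_i w(x_i) |u(x_i) - v(x_i)|^2
    # over v in V_m, with samples x_i ~ sigma and weight w = m / k_m.
    x = sample_optimal(n, m, rng)
    B = onb(x, m)
    w = m / (B ** 2).sum(axis=1)
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(sw[:, None] * B, sw * u(x), rcond=None)
    return coef                                 # coefficients in the L_j basis
```

In the noiseless case, any u already in V_m is recovered exactly (up to round-off) as soon as the random design matrix has full rank; the point of the optimal choice of dσ and w is that this stability holds with high probability once n scales linearly with m up to the logarithmic factor.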
DOI : 10.5802/smai-jcm.24
Keywords: multivariate approximation, weighted least squares, error analysis, convergence rates, random matrices, conditional sampling, polynomial approximation.
Cohen, Albert; Migliorati, Giovanni
@article{SMAI-JCM_2017__3__181_0,
     author = {Cohen, Albert and Migliorati, Giovanni},
     title = {Optimal weighted least-squares methods},
     journal = {The SMAI Journal of computational mathematics},
     pages = {181--203},
     publisher = {Soci\'et\'e de Math\'ematiques Appliqu\'ees et Industrielles},
     volume = {3},
     year = {2017},
     doi = {10.5802/smai-jcm.24},
     mrnumber = {3716755},
     zbl = {1416.62177},
     language = {en},
     url = {http://geodesic.mathdoc.fr/articles/10.5802/smai-jcm.24/}
}
Cohen, Albert; Migliorati, Giovanni. Optimal weighted least-squares methods. The SMAI Journal of computational mathematics, Tome 3 (2017), pp. 181-203. doi: 10.5802/smai-jcm.24
