Low-degree learning and the metric entropy of polynomials
Discrete Analysis (2023). This article was harvested from the source Scholastica.


Let $\mathscr{F}_{n,d}$ be the class of all functions $f:\{-1,1\}^n\to[-1,1]$ on the $n$-dimensional discrete hypercube of degree at most $d$. In the first part of this paper, we prove that any (deterministic or randomized) algorithm which learns $\mathscr{F}_{n,d}$ with $L_2$-accuracy $\varepsilon$ requires at least $\Omega((1-\sqrt{\varepsilon})2^d\log n)$ queries for large enough $n$, thus establishing the sharpness as $n\to\infty$ of a recent upper bound of Eskenazis and Ivanisvili (2021). To do this, we show that the $L_2$-packing numbers $\mathsf{M}(\mathscr{F}_{n,d},\|\cdot\|_{L_2},\varepsilon)$ of the concept class $\mathscr{F}_{n,d}$ satisfy the two-sided estimate $$c(1-\varepsilon)2^d\log n \leq \log \mathsf{M}(\mathscr{F}_{n,d},\|\cdot\|_{L_2},\varepsilon) \leq \frac{2^{Cd}\log n}{\varepsilon^4}$$ for large enough $n$, where $c, C>0$ are universal constants. In the second part of the paper, we present a logarithmic upper bound for the randomized query complexity of classes of bounded approximate polynomials whose Fourier spectra are concentrated on few subsets. As an application, we prove new estimates for the number of random queries required to learn approximate juntas of a given degree, functions with rapidly decaying Fourier tails, and constant-depth circuits of given size. Finally, we obtain bounds for the number of queries required to learn the polynomial class $\mathscr{F}_{n,d}$ without error in the query and random example models.
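The learning task in the abstract can be illustrated by the classical low-degree algorithm: given random examples $(x, f(x))$ with $x$ uniform on $\{-1,1\}^n$, one estimates every Fourier coefficient $\hat{f}(S) = \mathbb{E}[f(x)\chi_S(x)]$ with $|S| \le d$ empirically and outputs the resulting degree-$d$ polynomial. The sketch below is a minimal illustration of this standard approach, not the specific algorithm of the paper; all function names and the sample count are illustrative assumptions.

```python
import itertools
import random

def chi(S, x):
    """Character chi_S(x) = prod_{i in S} x_i on the hypercube {-1,1}^n."""
    p = 1
    for i in S:
        p *= x[i]
    return p

def learn_low_degree(f, n, d, samples=4000, seed=0):
    """Estimate the Fourier coefficients hat{f}(S), |S| <= d, of a
    function f: {-1,1}^n -> [-1,1] from uniformly random examples.

    Returns a dict mapping each tuple S to its empirical coefficient."""
    rng = random.Random(seed)
    xs = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(samples)]
    ys = [f(x) for x in xs]
    coeffs = {}
    for k in range(d + 1):
        for S in itertools.combinations(range(n), k):
            # Empirical average of f(x) * chi_S(x) approximates hat{f}(S).
            coeffs[S] = sum(y * chi(S, x) for x, y in zip(xs, ys)) / samples
    return coeffs

def evaluate(coeffs, x):
    """Evaluate the learned degree-d polynomial at a point x."""
    return sum(c * chi(S, x) for S, c in coeffs.items())
```

For instance, learning $f(x) = x_0 x_1$ (degree 2 on $n = 4$ variables) recovers a coefficient near $1$ on the set $\{0,1\}$ and near $0$ elsewhere; the number of coefficients estimated, $\sum_{k\le d}\binom{n}{k}$, is what the query-complexity bounds in the paper control.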
@article{DAS_2023_a5,
     author = {Alexandros Eskenazis and Paata Ivanisvili and Lauritz Streck},
     title = {Low-degree learning and the metric entropy of polynomials},
     journal = {Discrete analysis},
     year = {2023},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/DAS_2023_a5/}
}
Alexandros Eskenazis; Paata Ivanisvili; Lauritz Streck. Low-degree learning and the metric entropy of polynomials. Discrete Analysis (2023). http://geodesic.mathdoc.fr/item/DAS_2023_a5/