We propose a tool for high-dimensional approximation based on hyperbolic wavelet regression, where we allow only low-dimensional variable interactions. We adapt wavelets on the real axis to construct finite-dimensional periodic wavelet spaces. For functions in Sobolev or Besov-Nikolskij spaces we give a characterization in terms of the wavelet coefficients, which yields a certain decay of these coefficients. We study the problem of scattered-data approximation, where we evaluate the basis functions at the given sample points, assemble a matrix, and solve the resulting matrix equation with an LSQR algorithm to obtain an approximation. In our case this matrix is sparse, since we work with compactly supported wavelets. If we choose the number of parameters such that we have logarithmic oversampling, we can give an upper bound for the norm of the Moore-Penrose inverse, so that the LSQR algorithm gives useful results. For i.i.d. samples, we show that with high probability the approximation error decays at the same rate as the error of the projection onto the finite-dimensional function space. We can even bound the worst-case error for the whole class of functions in a Sobolev or Besov-Nikolskij space.
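The scattered-data step can be illustrated in one dimension with a minimal sketch. This is a toy stand-in, not the method of the paper: periodic hat functions (linear B-splines) replace the hyperbolic wavelet basis, and the grid size `n` and oversampling factor are arbitrary choices for illustration. The key point survives the simplification: compact supports make each row of the system matrix sparse, and SciPy's LSQR solves the overdetermined system.

```python
import numpy as np
from scipy.sparse import lil_matrix, csr_matrix
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)

# Stand-in basis: n periodic, compactly supported hat functions on [0, 1).
n = 64                      # number of basis functions (illustrative choice)
m = 8 * n                   # oversampled number of scattered samples
x = rng.random(m)           # i.i.d. uniform sample points

def hat(t):
    """Triangular bump supported on [-1, 1]."""
    return np.maximum(1.0 - np.abs(t), 0.0)

# Each sample lies in the support of at most two basis functions, so every
# row of A has at most two nonzeros: the matrix is sparse precisely because
# the basis functions are compactly supported.
A = lil_matrix((m, n))
for j in range(n):
    d = (x - j / n + 0.5) % 1.0 - 0.5   # periodic distance to center j/n
    vals = hat(n * d)
    idx = np.nonzero(vals)[0]
    A[idx, j] = vals[idx]
A = csr_matrix(A)

f = lambda t: np.sin(2 * np.pi * t) + 0.1 * np.cos(6 * np.pi * t)
y = f(x)

# Solve the overdetermined system A c ≈ y in the least-squares sense.
c = lsqr(A, y, atol=1e-12, btol=1e-12)[0]

# Evaluate the resulting approximation on a fine grid and check the error.
t = np.linspace(0, 1, 200, endpoint=False)
approx = np.zeros_like(t)
for j in range(n):
    d = (t - j / n + 0.5) % 1.0 - 0.5
    approx += c[j] * hat(n * d)
err = np.max(np.abs(approx - f(t)))
```

Because each row has at most two nonzeros, a matrix-vector product costs O(m) rather than O(mn), which is what makes iterative solvers such as LSQR attractive here.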

If the function has low effective dimension, we additionally determine the ANOVA decomposition of the approximant, which allows us to omit ANOVA terms with small variance in a second step in order to increase the accuracy.
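The truncation idea can be illustrated with a generic Monte Carlo sensitivity sketch; this is the classical pick-freeze estimator of first-order ANOVA variances, not the paper's algorithm, and the test function, sample size, and threshold are hypothetical choices. Terms whose variance estimate falls below the threshold would be omitted in the second step.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical test function with low effective dimension: the third
# variable contributes almost nothing to the total variance.
def f(x):
    return np.sin(2 * np.pi * x[:, 0]) + x[:, 1] + 0.01 * x[:, 2]

d, n = 3, 100_000
X = rng.random((n, d))
Z = rng.random((n, d))
fX = f(X)
mu = fX.mean()

# Pick-freeze estimate of the first-order ANOVA variances
# sigma_j^2 = Var(E[f | x_j]): two sample sets sharing only coordinate j.
variances = []
for j in range(d):
    Xj = Z.copy()
    Xj[:, j] = X[:, j]               # share the j-th coordinate with X
    cov = np.mean((fX - mu) * (f(Xj) - mu))
    variances.append(cov)

# Keep only the ANOVA terms whose variance exceeds a (hypothetical) threshold.
active = [j for j, v in enumerate(variances) if v > 0.01]
```

In this toy example the first two variables carry essentially all of the variance, so restricting the model to the active terms discards the negligible third coordinate.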