A. Litvinenko, D.E. Keyes, V. Khoromskaia, B.N. Khoromskij, H.G. Matthies
Computational Methods in Applied Mathematics, 19 (1), pp. 101–122, 2019
Fourier transform, Low-rank tensor approximation, Geostatistical optimal design, Kriging, Matérn covariance, Hilbert tensor, Kalman filter, Bayesian update, Loglikelihood surrogate
In this work, we describe advanced numerical tools for working with multivariate functions and for the analysis of large data sets. These tools drastically reduce the required computing time and storage cost, and therefore allow us to consider much larger data sets or finer meshes. Covariance matrices are crucial in spatio-temporal statistical tasks, but are often very expensive to compute and store, especially in 3D. Therefore, we approximate covariance functions by cheap surrogates in a low-rank tensor format. We apply the Tucker and canonical tensor decompositions to a family of Matérn- and Slater-type functions with varying parameters and demonstrate numerically that their approximations exhibit exponentially fast convergence. We prove the exponential convergence of the Tucker and canonical approximations in the tensor rank parameters. Several statistical operations are performed in this low-rank tensor format, including evaluation of the conditional covariance matrix, spatially averaged estimation variance, computation of a quadratic form, determinant, trace, loglikelihood, inverse, and Cholesky decomposition of a large covariance matrix. Low-rank tensor approximations reduce the computing and storage costs substantially. For example, the storage cost is reduced from an exponential O(n^d) to a linear scaling O(drn), where d is the spatial dimension, n is the number of mesh points in one direction, and r is the tensor rank. Prerequisites for applicability of the proposed techniques are the assumptions that the data locations and measurements lie on a tensor (axes-parallel) grid and that the covariance function depends on a distance, ‖x−y‖.
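To make the storage claim concrete, the following minimal NumPy sketch (not taken from the paper; grid size n, dimension d, and length scale ell are illustrative assumptions) uses a Gaussian covariance, which factorizes exactly as a Kronecker product of 1D covariance matrices on a tensor grid, i.e. a rank-1 instance of the separable tensor formats discussed above. It compares the O(n^{2d}) dense storage with the O(dn^2) factored storage and verifies a matrix-vector product computed purely from the 1D factors.

```python
import numpy as np

# Illustrative parameters (assumptions, not from the paper).
n, d, ell = 8, 3, 0.5
x = np.linspace(0.0, 1.0, n)  # 1D grid, reused in every direction

# 1D Gaussian covariance factor: C1[i, j] = exp(-(x_i - x_j)^2 / ell^2).
# A Gaussian kernel in d dimensions is the product of its 1D factors,
# so the full covariance is the d-fold Kronecker product of C1.
C1 = np.exp(-((x[:, None] - x[None, :]) / ell) ** 2)

C_full = C1
for _ in range(d - 1):
    C_full = np.kron(C_full, C1)

# Storage comparison: dense n^d x n^d matrix vs d small n x n factors.
dense_entries = (n ** d) ** 2     # 262144 for n = 8, d = 3
factored_entries = d * n ** 2     # 192

# Matrix-vector product using only the factors (C_full is never needed):
# reshape v into an n x ... x n tensor and apply C1 along each mode.
v = np.random.default_rng(0).standard_normal(n ** d)
T = v.reshape((n,) * d)
for mode in range(d):
    T = np.moveaxis(np.tensordot(C1, T, axes=([1], [mode])), 0, mode)
y_factored = T.reshape(-1)

y_dense = C_full @ v
print(dense_entries, factored_entries, np.allclose(y_dense, y_factored))
```

For a Matérn covariance the factorization is not exact, but, as the paper shows, it is approximated to high accuracy by a short sum of r such separable terms, which is what yields the O(drn) scaling.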