Many types of data are naturally represented as multiway arrays, or tensors, and as a result, tensor-based approaches have revolutionized data analysis, feature extraction, and data compression tasks. Despite these successes, high-dimensional data analysis tools suffer from a so-called "curse of multidimensionality;" that is, fundamental linear algebra properties break down in higher dimensions. Recent advances in matrix-mimetic tensor approaches have made it possible to preserve linear algebraic properties and, as a result, to obtain optimal representations of multiway data. Matrix-mimeticity arises from interpreting tensors as t-linear operators, which are, in effect, operations parameterized by invertible linear transformations. The choice of transformation is crucial to representation quality and, thus far, has been made heuristically. In this talk, we will learn data-dependent, orthogonal transformations by leveraging the optimality of matrix-mimetic representations. In particular, we will exploit the coupling between transformations and optimal tensor representations using variable projection. To preserve the invertibility of the linear transformation, we learn orthogonal transformations via Riemannian optimization. We will highlight the generality of our proposed approach through numerical experiments on a wide range of tasks, including portfolio optimization, image compression, and reduced-order modeling.
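As a rough illustration of the transformation-parameterized tensor products the abstract refers to, the sketch below implements one common formulation: transform a third-order tensor along its third mode by an invertible matrix M, multiply frontal slices pairwise, then transform back. The function names (`star_M`, `mode3_apply`) and the NumPy implementation are illustrative assumptions, not the authors' code.

```python
import numpy as np

def mode3_apply(A, M):
    # Multiply each mode-3 fiber (tube) of the tensor A by the matrix M.
    return np.einsum('ij,abj->abi', M, A)

def star_M(A, B, M):
    """Illustrative transformation-based tensor product:
    transform along mode 3, multiply frontal slices pairwise,
    then map back with the inverse transformation."""
    A_hat = mode3_apply(A, M)
    B_hat = mode3_apply(B, M)
    C_hat = np.einsum('ikn,kjn->ijn', A_hat, B_hat)  # facewise products
    return mode3_apply(C_hat, np.linalg.inv(M))

# Small sanity check with a random orthogonal M (QR factor),
# mirroring the orthogonal transformations learned in the talk.
rng = np.random.default_rng(0)
M, _ = np.linalg.qr(rng.standard_normal((4, 4)))
A = rng.standard_normal((3, 5, 4))
B = rng.standard_normal((5, 2, 4))
C = star_M(A, B, M)
print(C.shape)  # (3, 2, 4)
```

With M equal to the identity, this product reduces to slice-wise matrix multiplication, which is one way to see how matrix properties carry over to the tensor setting.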
Optimal Matrix-Mimetic Tensor Algebras via Variable Projection
Elizabeth Newman, Emory University
Authors: Elizabeth Newman, Katherine Keegan
2023 AWM Research Symposium
Women in Tensor Optimization [Organized by Longxiu Huang and Jing Qin]