#### Date of Degree

6-2020

#### Document Type

Dissertation

#### Degree Name

Ph.D.

#### Program

Computer Science

#### Advisor

Victor Pan

#### Committee Members

Feng Gu

Liang Zhao

Bo Yuan

#### Subject Categories

Theory and Algorithms

#### Keywords

Low rank matrix approximation, matrix sketch, random projection, dimension reduction, SRHT, SRAHT

#### Abstract

Recent advances in matrix approximation have emphasized randomization techniques in which the goal is to create a sketch of an input matrix. This sketch, a random submatrix of the input with far fewer rows or columns, still preserves its relevant features. In one such technique, random projections approximate the range of the input matrix. Dimension reduction transforms are obtained by multiplying the input matrix by one or more matrices which can be orthogonal, random, and amenable to fast matrix-vector multiplication. The Subsampled Randomized Hadamard Transform (SRHT) is the most popular of these transforms. An m x n matrix can be multiplied by an n x l SRHT matrix in O(mn log l) arithmetic operations, where typically l << min(m, n). This dissertation introduces an alternative, which we call the Subsampled Randomized Approximate Hadamard Transform (SRAHT), for which the complexity of multiplication by an input matrix decreases to O( (2n + l log n) m ) operations. We also prove that our sublinear cost variants of a popular subspace sampling algorithm output accurate low rank approximations (hereafter LRA) of a large class of inputs. Finally, we introduce new sublinear algorithms for CUR LRA, a matrix factorization consisting of a column subset C and a row subset R of the input matrix together with a connector matrix U. We prove that these CUR algorithms provide close LRA with high probability for a random input matrix admitting LRA.
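To make the SRHT sketching step concrete, here is a minimal numpy sketch of the standard construction the abstract builds on (Omega = sqrt(n/l) D H P, with D a random sign diagonal, H the normalized Walsh-Hadamard matrix, and P a uniform column sampler). The function name `srht_sketch` is our own for illustration; this is the classical SRHT, not the dissertation's SRAHT variant or its actual code.

```python
import numpy as np

def srht_sketch(A, l, rng=None):
    """Return the m x l sketch Y = A @ Omega, where Omega = sqrt(n/l) D H P
    is an n x l SRHT matrix (illustrative implementation).

    D: random +/-1 diagonal; H: normalized Walsh-Hadamard matrix
    (requires n to be a power of 2); P: uniform sampler of l columns.
    """
    if rng is None:
        rng = np.random.default_rng()
    m, n = A.shape
    assert n & (n - 1) == 0, "n must be a power of 2 for the Hadamard transform"
    # Random sign flip of the columns: A D
    signs = rng.choice([-1.0, 1.0], size=n)
    Y = A * signs
    # Fast Walsh-Hadamard transform applied to each row: (A D) H
    h = 1
    while h < n:
        Y = Y.reshape(m, -1, 2 * h)
        a = Y[:, :, :h].copy()
        b = Y[:, :, h:]
        Y[:, :, :h] = a + b
        Y[:, :, h:] = a - b
        Y = Y.reshape(m, n)
        h *= 2
    Y /= np.sqrt(n)  # normalize so H is orthogonal
    # Uniformly sample l columns and rescale: sqrt(n/l) (A D H) P
    cols = rng.choice(n, size=l, replace=False)
    return np.sqrt(n / l) * Y[:, cols]
```

Because D, H, and (for l = n) P are orthogonal up to the sqrt(n/l) scaling, a full sketch with l = n exactly preserves the Frobenius norm; for l << n it preserves it approximately, which is what makes the sketch a faithful range approximation.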

#### Recommended Citation

Svadlenka, John T., "Novel Fast Algorithms For Low Rank Matrix Approximation" (2020). *CUNY Academic Works.*

https://academicworks.cuny.edu/gc_etds/3834
