Dissertations, Theses, and Capstone Projects

Date of Degree

6-2022

Document Type

Dissertation

Degree Name

Ph.D.

Program

Computer Science

Advisor

Jun Li

Committee Members

Alexey Ovchinnikov

Victor Pan

Gueo Grantcharov

Subject Categories

Numerical Analysis and Scientific Computing | OS and Networks | Other Computer Sciences | Theory and Algorithms

Keywords

Coding Theory, Information Theory, Distributed Computing, Machine Learning, Matrix Multiplication

Abstract

A ubiquitous problem in computer science research is the optimization of computation on large data sets. Such computations are usually too large to be performed on one machine, so the task must be distributed amongst a network of machines. A common problem within distributed computing, however, is the mitigation of delays caused by faulty machines. These delays can be mitigated through coding theory, which optimizes the amount of redundancy needed to tolerate such faults. This problem differs from classical coding theory in that it is concerned with dynamic coded computation on data, rather than statically coding data without any consideration of the algorithms to be performed on it. Of particular interest is matrix multiplication, a fundamental operation in many big data and machine learning algorithms, which is the main focus of this dissertation. Two serendipitous consequences of the (bi-)linear nature of matrix multiplication are that it is highly parallelizable and that linear codes can be applied to it, making the coding-theoretic approach a fruitful avenue of research for this particular optimization of distributed computing.
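To illustrate the idea the abstract describes, here is a minimal sketch of straggler-tolerant coded matrix multiplication using a simple MDS-style replication-plus-sum code. This is an illustrative toy scheme, not the dissertation's specific construction: the matrix A is split row-wise between two workers, and a third worker computes with the coded block A1 + A2, so the product A @ B can be recovered from any two of the three workers.

```python
import numpy as np

# Toy coded matrix multiplication (illustrative; not the dissertation's scheme).
# Split A row-wise into A1, A2 and assign three worker tasks:
#   worker 1: A1 @ B,  worker 2: A2 @ B,  worker 3: (A1 + A2) @ B
# By linearity, (A1 + A2) @ B = A1 @ B + A2 @ B, so any 2 of the 3
# results suffice to recover A @ B, tolerating one straggler.

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((3, 5))

A1, A2 = A[:2], A[2:]
worker_results = {1: A1 @ B, 2: A2 @ B, 3: (A1 + A2) @ B}

def decode(results):
    """Recover A @ B from any two of the three worker results."""
    if 1 in results and 2 in results:
        return np.vstack([results[1], results[2]])
    if 1 in results and 3 in results:
        # A2 @ B = (A1 + A2) @ B - A1 @ B
        return np.vstack([results[1], results[3] - results[1]])
    # A1 @ B = (A1 + A2) @ B - A2 @ B
    return np.vstack([results[3] - results[2], results[2]])

# Simulate worker 2 straggling: decode from workers 1 and 3 only.
C = decode({1: worker_results[1], 3: worker_results[3]})
assert np.allclose(C, A @ B)
```

The decoding step works purely because matrix multiplication is linear in its first argument; the same principle underlies more sophisticated schemes (e.g., polynomial codes) that tolerate many stragglers with optimal redundancy.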
