ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Revision(s):

Revision #2 to TR15-128 | 2nd April 2017 12:16

Local Reconstruction of Low-Rank Matrices and Subspaces





Revision #2
Authors: Roee David, Elazar Goldenberg, Robert Krauthgamer
Accepted on: 2nd April 2017 12:16
Downloads: 730


Abstract:

We study the problem of \emph{reconstructing a low-rank matrix}, where the input is an $n\times m$ matrix $M$ over a field $\mathbb{F}$ and the goal is to reconstruct a (near-optimal) matrix $M'$ that is low-rank and close to $M$ under some distance function $\Delta$.
Furthermore, the reconstruction must be local, i.e., it must provide access to any desired entry of $M'$ by reading only a few entries of the input $M$ (ideally, a number independent of the matrix dimensions $n$ and $m$).
Our formulation of this problem is inspired by the local reconstruction framework of Saks and Seshadhri (SICOMP, 2010).

Our main result is a local reconstruction algorithm for the case where $\Delta$ is the normalized Hamming distance (between matrices).
Given $M$ that is $\epsilon$-close to a matrix of rank $d<1/\epsilon$ (together with $d$ and $\epsilon$), this algorithm computes with high probability a rank-$d$ matrix $M'$ that is $O(\sqrt{d\epsilon})$-close to $M$.
This is a local algorithm that proceeds in two phases.
The \emph{preprocessing phase} reads only $O(\sqrt{{d}/{\epsilon^3}})$ random entries of $M$,
and stores a small data structure.
The \emph{query phase} deterministically outputs a desired entry $M'_{i,j}$ by reading only the data structure and $2d$ additional entries of $M$.
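
To make the query budget concrete, below is a minimal Python sketch, not taken from the paper, of the two-phase interface in the noiseless special case where $M$ has rank exactly $d$ and its leading $d\times d$ minor is invertible; the anchor choice and the block-inverse rule are illustration-only assumptions. In this exact-rank toy case every entry satisfies $M_{i,j} = M_{i,C}\,(M_{R,C})^{-1} M_{R,j}$ for anchor index sets $R=C=\{1,\dots,d\}$, which is what the query computes from $2d$ read entries. The paper's preprocessing differs: it reads roughly $\sqrt{d/\epsilon^3}$ random entries (for instance, $d=4$ and $\epsilon=0.01$ give on the order of $\sqrt{4/10^{-6}}=2000$ samples) in order to handle matrices that are only $\epsilon$-close to rank $d$.

# A minimal sketch (not the paper's algorithm): it only illustrates the two-phase
# local access pattern, in the noiseless special case where M has rank exactly d
# and its leading d x d minor is invertible. Both assumptions, and the block-inverse
# query rule below, are made purely for illustration. The stored "small data
# structure" is the inverse of that minor; a query for M'_{i,j} then reads only
# 2d further entries of M, matching the query budget stated above.

import numpy as np

def preprocess(read_entry, d):
    """Preprocessing phase: read the leading d x d block of M through single-entry
    access and store its inverse as the small data structure."""
    block = np.array([[read_entry(i, j) for j in range(d)] for i in range(d)], dtype=float)
    return {"d": d, "inv": np.linalg.inv(block)}

def query(read_entry, data, i, j):
    """Query phase: output M'_{i,j} by reading 2d entries of M
    (d entries of row i and d entries of column j) plus the data structure."""
    d, inv = data["d"], data["inv"]
    row = np.array([read_entry(i, c) for c in range(d)], dtype=float)  # d entries of row i
    col = np.array([read_entry(r, j) for r in range(d)], dtype=float)  # d entries of column j
    return float(row @ inv @ col)  # exact when rank(M) = d and the leading minor is invertible

# Tiny usage example on a synthetic rank-2 matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 8))  # 6 x 8 matrix of rank 2
data = preprocess(lambda i, j: M[i, j], d=2)
print(np.isclose(query(lambda i, j: M[i, j], data, 4, 7), M[4, 7]))  # expected: True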

We also consider local reconstruction in an easier setting, where the algorithm can read an entire matrix column in a single operation. When $\Delta$ is the normalized Hamming distance between vectors,
we derive an algorithm that runs in polynomial time
by applying our main result for matrix reconstruction.
For comparison, when $\Delta$ is the truncated Euclidean distance and $\mathbb{F}=\mathbb{R}$,
we analyze sampling algorithms by using statistical learning tools.
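
For reference, and assuming the standard normalization by the number of entries (respectively, coordinates), the two Hamming-type distance functions used above are
\[
\Delta(M,M') \;=\; \frac{|\{(i,j) : M_{i,j}\neq M'_{i,j}\}|}{nm}
\quad\text{for matrices } M,M'\in\mathbb{F}^{n\times m},
\qquad
\Delta(x,y) \;=\; \frac{|\{i : x_i\neq y_i\}|}{n}
\quad\text{for vectors } x,y\in\mathbb{F}^{n}.
\]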



Changes to previous version:

Fixed several bugs in the proof appearing in Section 4.


Revision #1 to TR15-128 | 27th December 2015 15:14

Local Reconstruction of Low-Rank Matrices and Subspaces





Revision #1
Authors: Roee David, Elazar Goldenberg, Robert Krauthgamer
Accepted on: 27th December 2015 15:14
Downloads: 1347


Abstract:

We study the problem of \emph{reconstructing a low-rank matrix}, where the input is an $n\times m$ matrix $M$ over a field $\mathbb{F}$
and the goal is to reconstruct a (near-optimal) matrix $M'$
that is low-rank and close to $M$ under some distance function $\Delta$.
Furthermore, the reconstruction must be local, i.e., it must provide access to
any desired entry of $M'$ by reading only a few entries of the input $M$
(ideally, a number independent of the matrix dimensions $n$ and $m$).
Our formulation of this problem is inspired by the local reconstruction framework of Saks and Seshadhri (SICOMP, 2010).

Our main result is a local reconstruction algorithm for the case
where $\Delta$ is the normalized Hamming distance (between matrices).
Given $M$ that is $\epsilon$-close to a matrix of rank $d<1/\epsilon$
(together with $d$ and $\epsilon$),
this algorithm computes with high probability a rank-$d$ matrix $M'$
that is $O(\sqrt{d\epsilon})$-close to $M$.
This is a local algorithm that proceeds in two phases.
The \emph{preprocessing phase}
reads only $\tilde{O}(\sqrt{{d}/{\epsilon^3}})$ random entries of $M$,
and stores a small data structure.
The \emph{query phase} deterministically outputs a desired entry $M'_{i,j}$
by reading only the data structure and $2d$ additional entries of $M$.

We also consider local reconstruction in an easier setting,
where the algorithm can read an entire matrix column in a single operation.
When $\Delta$ is the normalized Hamming distance between vectors,
we derive an algorithm that runs in polynomial time
by applying our main result for matrix reconstruction.
For comparison, when $\Delta$ is the truncated Euclidean distance and $\mathbb{F}=\mathbb{R}$,
we analyze sampling algorithms by using statistical learning tools.


Paper:

TR15-128 | 10th August 2015 10:49

Local Reconstruction of Low-Rank Matrices and Subspaces


Abstract:

We study the problem of \emph{reconstructing a low-rank matrix}, where the input is an $n\times m$ matrix $M$ over a field $\mathbb{F}$ and the goal is to reconstruct a (near-optimal) matrix $M'$ that is low-rank and close to $M$ under some distance function $\Delta$.
Furthermore, the reconstruction must be local, i.e., it must provide access to any desired entry of $M'$ by reading only a few entries of the input $M$ (ideally, a number independent of the matrix dimensions $n$ and $m$).
Our formulation of this problem is inspired by the local reconstruction framework of Saks and Seshadhri (SICOMP, 2010).

Our main result is a local reconstruction algorithm for the case where $\Delta$ is the normalized Hamming distance (between matrices).
Given $M$ that is $\epsilon$-close to a matrix of rank $d<1/\epsilon$ (together with $d$ and $\epsilon$), this algorithm computes with high probability a rank-$d$ matrix $M'$ that is $O(\sqrt{d\epsilon})$-close to $M$.
This is a local algorithm that proceeds in two phases.
The \emph{preprocessing phase} reads only $O(\sqrt{{d}/{\epsilon^3}})$ random entries of $M$,
and stores a small data structure.
The \emph{query phase} deterministically outputs a desired entry $M'_{i,j}$ by reading only the data structure and $2d$ additional entries of $M$.

We also consider local reconstruction in an easier setting, where the algorithm can read an entire matrix column in a single operation. When $\Delta$ is the normalized Hamming distance between vectors,
we derive an algorithm that runs in polynomial time
by applying our main result for matrix reconstruction.
For comparison, when $\Delta$ is the truncated Euclidean distance and $\mathbb{F}=\mathbb{R}$,
we analyze sampling algorithms by using statistical learning tools.



ISSN 1433-8092