Background on random projections in Euclidean spaces

A *randomized embedding* between normed spaces $\mathbb{R}^d$ and $\mathbb{R}^{d'}$ with distortion $1+\epsilon$, with
$\epsilon > 0$, and failure probability $\delta$ is a probability distribution over mappings
$\mu : \mathbb{R}^d \rightarrow \mathbb{R}^{d'}$,
such that for every pair
$p, q \in \mathbb{R}^d$, the following property holds with probability $1-\delta$:

$$\frac{1}{1+\epsilon}\, \|p - q\| \;\leq\; \|\mu(p) - \mu(q)\| \;\leq\; (1+\epsilon)\, \|p - q\|$$

Hence, by choosing $d'$ such that the failure probability satisfies $\delta \leq 1/n^2$, a union bound over all pairs of an $n$-point set proves the following *Johnson-Lindenstrauss (JL) lemma*: for every set $S$ of $n$ points in $\mathbb{R}^d$, a randomized embedding $\mu : \mathbb{R}^d \rightarrow \mathbb{R}^{d'}$ with $d' = O(\epsilon^{-2} \log n)$ preserves, with high probability, all pairwise distances in $S$ up to distortion $1+\epsilon$.
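
The dimension bound can be checked numerically. The sketch below assumes the explicit constant $d' \geq 4 \ln n \,/\, (\epsilon^2/2 - \epsilon^3/3)$ from the Dasgupta-Gupta proof of the lemma (not stated in this section, which only gives the asymptotic form), and measures the worst-case pairwise distortion of a Normal random projection:

```python
import numpy as np
from itertools import combinations

def jl_dim(n, eps):
    # Explicit constant from the Dasgupta-Gupta proof of the JL lemma
    # (an assumption: the text above only states d' = O(eps^-2 log n)).
    return int(np.ceil(4 * np.log(n) / (eps**2 / 2 - eps**3 / 3)))

rng = np.random.default_rng(0)
n, d, eps = 100, 10_000, 0.5
dp = jl_dim(n, eps)                              # target dimension d'
X = rng.standard_normal((n, d))                  # n points in R^d
P = rng.standard_normal((dp, d)) / np.sqrt(dp)   # a Normal random projection
Y = X @ P.T                                      # projected points in R^d'

# Ratio ||mu(p) - mu(q)|| / ||p - q|| over all pairs: with high
# probability every ratio lies in [1/(1+eps), 1+eps].
ratios = [np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j])
          for i, j in combinations(range(n), 2)]
print(dp, round(min(ratios), 2), round(max(ratios), 2))
```

Note that $d' = 222$ here, far below $d = 10{,}000$, while the observed ratios stay well inside the allowed distortion interval.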

The embedding exhibited in [13] consists of random projections from
$\mathbb{R}^d$
into $\mathbb{R}^{d'}$, represented
by $d' \times d$ matrices whose rows are random orthonormal vectors.
Similar results may be obtained by using simpler embeddings, represented through random matrices
$P = \frac{1}{\sqrt{d'}}\, R$, where the entries $r_{ij}$ of $R$ are i.i.d. random variables such that:

$$\mathrm{E}[r_{ij}] = 0, \qquad \mathrm{Var}[r_{ij}] = 1$$

For the sake of simplicity, we call even this kind of embedding a random projection.
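
The zero-mean, unit-variance condition is exactly what makes the scaled map preserve squared norms in expectation, since $\mathrm{E}\big[\|\frac{1}{\sqrt{d'}} R x\|^2\big] = \|x\|^2$. A minimal numerical check of this identity, using $\pm 1$ entries as one concrete instance of the condition:

```python
import numpy as np

rng = np.random.default_rng(1)
d, dp, trials = 500, 50, 2000
x = rng.standard_normal(d)

sq_norms = []
for _ in range(trials):
    # Any zero-mean, unit-variance entries work; +-1 entries are used
    # here as one concrete instance of the condition.
    R = rng.choice([-1.0, 1.0], size=(dp, d))
    sq_norms.append(np.sum((R @ x / np.sqrt(dp)) ** 2))

# E[||(1/sqrt(d')) R x||^2] = ||x||^2, so the empirical mean of the
# projected squared norms should be close to the original squared norm.
print(round(np.mean(sq_norms) / np.sum(x ** 2), 2))
```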

We may build the random matrices $R$ in different ways to obtain random projections obeying the Johnson-Lindenstrauss lemma:

- The *Plus-Minus-One (PMO)* projection: $r_{ij} \in \{-1, +1\}$, randomly chosen such that:

  $$\Pr(r_{ij} = -1) = \Pr(r_{ij} = +1) = \frac{1}{2}$$

- The *Achlioptas* projection [1]:

  $$r_{ij} = \sqrt{3} \cdot \begin{cases} +1 & \text{with probability } 1/6 \\ 0 & \text{with probability } 2/3 \\ -1 & \text{with probability } 1/6 \end{cases}$$

- The *Generalized Achlioptas* projection:

  $$r_{ij} = \sqrt{s} \cdot \begin{cases} +1 & \text{with probability } 1/(2s) \\ 0 & \text{with probability } 1 - 1/s \\ -1 & \text{with probability } 1/(2s) \end{cases}$$

  with $s \geq 1$.

- A *Normal* projection: $r_{ij} \sim N(0, 1)$.

- Other projections whose entries $r_{ij}$ are i.i.d. with zero mean and unit variance.

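The constructions above can be sketched as matrix generators; this is an illustrative implementation (the parameterization of the Generalized Achlioptas case by $s$, with $s = 3$ recovering the original Achlioptas matrix, follows the list above):

```python
import numpy as np

rng = np.random.default_rng(0)

def pmo(dp, d):
    """Plus-Minus-One projection: entries uniform on {-1, +1}."""
    return rng.choice([-1.0, 1.0], size=(dp, d))

def achlioptas(dp, d, s=3.0):
    """Generalized Achlioptas projection; s = 3 gives the original [1]."""
    vals = rng.choice([1.0, 0.0, -1.0], size=(dp, d),
                      p=[1 / (2 * s), 1 - 1 / s, 1 / (2 * s)])
    return np.sqrt(s) * vals

def normal(dp, d):
    """Normal projection: i.i.d. N(0, 1) entries."""
    return rng.standard_normal((dp, d))

# Every construction yields zero-mean, unit-variance entries, so that
# x -> (1/sqrt(d')) R x preserves squared norms in expectation.
mats = [pmo(200, 5000), achlioptas(200, 5000), normal(200, 5000)]
for R in mats:
    print(round(float(R.mean()), 2), round(float(R.var()), 2))
```

The sparse Achlioptas variants are attractive in practice because, for large $s$, most entries of $R$ are zero and the projection can be computed with correspondingly fewer operations.
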
A particular case of randomized map is represented by *Random Subspace (RS)* projections [11].
Indeed, these projections do not satisfy the *JL lemma* and may induce large distortions in the projected data.
They are represented by $d' \times d$ matrices $R$
whose entries, uniformly chosen in $\{0, 1\}$, contain exactly one $1$ per row and at most one $1$ per column; in other words, $R$ selects $d'$ of the $d$ original coordinates uniformly at random.
It is worth noting that, in this case,
the "compressed" data set
$R D$ of $n$ points can be quickly computed in
$O(n d')$ time, independently of $d$.
Unfortunately, it is easy to see that in this case

$$\mathrm{E}\big[\|Rp - Rq\|^2\big] = \frac{d'}{d}\, \|p - q\|^2$$

and

$$\Pr\big(\|Rp - Rq\| = 0\big) > 0 \quad \text{whenever } p - q \text{ has at most } d - d' \text{ nonzero coordinates}$$

and hence
*RS* does not satisfy the *JL lemma*.
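
The coordinate-selection behaviour of RS, and the distance collapse it can cause, can be sketched as follows (the construction of $R$ via uniform sampling of column indices is one straightforward way to realize the row/column constraints above):

```python
import numpy as np

rng = np.random.default_rng(0)

def rs_matrix(dp, d):
    """Random Subspace projection: d' x d 0/1 matrix with exactly one 1
    per row and at most one 1 per column, i.e. it selects d' of the d
    coordinates uniformly at random."""
    sel = rng.choice(d, size=dp, replace=False)
    R = np.zeros((dp, d))
    R[np.arange(dp), sel] = 1.0
    return R

d, dp = 1000, 20
R = rs_matrix(dp, d)

# Applying R is just coordinate selection, hence O(n d') for n points,
# independently of d.
unselected = np.flatnonzero(R.sum(axis=0) == 0)

# Two points differing only in an unselected coordinate collapse to the
# same projected point: the distance ratio is 0, violating any JL bound.
p = np.zeros(d)
q = np.zeros(d)
q[unselected[0]] = 1.0
print(np.linalg.norm(p - q), np.linalg.norm(R @ p - R @ q))  # 1.0 0.0
```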