On Learning from Ghost Imaging without Imaging


Issei Sato
The University of Tokyo / RIKEN
sato@k.u-tokyo.ac.jp
Abstract

Computational ghost imaging is an imaging technique with which an object is imaged from light collected using a single-pixel detector with no spatial resolution. Recently, ghost cytometry was proposed as an ultrafast cell-classification method that combines ghost imaging and machine learning in flow cytometry. Ghost cytometry skips the reconstruction of cell images from signals and directly uses the signals for cell classification, because this reconstruction is the bottleneck in high-speed analysis. In this paper, we provide a theoretical analysis of learning from ghost imaging without imaging.



1 Introduction

Ghost imaging was first observed with entangled photon pairs and viewed as a quantum phenomenon [1]. It acquires object information through correlation calculations on the light-intensity fluctuations of two beams: an object beam and a reference beam [2, 3]. The object beam passes through the object and is detected using a single-pixel detector, while the reference beam does not interact with the object and is recorded using a multi-pixel detector with spatial resolution. It was experimentally demonstrated that ghost imaging can be achieved using only a single detector [4].

Computational ghost imaging is an imaging technique with which an object is imaged from light collected using a single-pixel detector with no spatial resolution [5, 6]. By replacing reference-beam measurements with computed illumination patterns, it requires only a single-pixel detector, which simplifies the experiment in comparison to traditional two-detector ghost imaging. Using the detected signals and the illumination patterns, we can computationally reconstruct object images.

Let $T(x,y)$ be the transmission function of an object. The object is illuminated by a speckle field generated by passing a laser beam through an optical diffuser, a material that scatters transmitted light. A detector measures the total intensity $s_i$ transmitted through the object, given by

$s_i = \int I_i(x,y)\, T(x,y)\, dx\, dy$,  (1)

where $I_i(x,y)$ is the $i$-th speckle field.

The detector measurements are cross-correlated with the computed speckle patterns. We can reconstruct the transmission function expressed by

$\hat{T}(x,y) = \frac{1}{M}\sum_{i=1}^{M} (s_i - \bar{s})\, I_i(x,y)$,  (2)

where $\bar{s} = \frac{1}{M}\sum_{i=1}^{M} s_i$.
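As a concrete illustration of Eqs. (1) and (2), the following minimal numpy sketch simulates single-pixel measurements under random speckle patterns and reconstructs the object by correlation. The object, pattern count, and grid size are hypothetical choices for this sketch, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, M = 16, 20000                 # hypothetical image size and pattern count

# Hypothetical object: transmission function T(x, y) on an n x n grid
T = np.zeros((n, n))
T[4:12, 6:10] = 1.0

# Random speckle fields I_i and single-pixel readings s_i  (cf. Eq. (1))
I = rng.random((M, n, n))
s = np.einsum('ixy,xy->i', I, T)

# Correlation reconstruction: average of (s_i - mean(s)) * I_i  (cf. Eq. (2))
T_hat = np.einsum('i,ixy->xy', s - s.mean(), I) / M

# The reconstruction is proportional to T up to noise that shrinks as M grows
corr = np.corrcoef(T.ravel(), T_hat.ravel())[0, 1]
```

As the correlation coefficient between the true and reconstructed objects indicates, the estimate sharpens as the number of speckle patterns $M$ grows.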

Ghost cytometry [7] is an ultrafast cell-classification method that combines ghost imaging and machine learning in flow cytometry. Flow cytometry is a technique for measuring, at high speed, the characteristics of a population of particles (cells, bacteria, etc.), such as cell size, cell count, cell morphology (shape and structure), and cell-cycle phase. The prefix cyto- and suffix -metry mean cell and measure, respectively. With flow cytometry, we can measure the information of a single cell. A sample including cells, e.g., blood cells, is injected into a flow cytometer, which is composed of three systems: fluidic, optical, and electronic. It detects the scattered light and fluorescence of cells. From the detected scattered-light and fluorescence signals, we can obtain information on the relative size and internal structure of a cell and on the cell membrane, cytoplasm, various antigens present in the nucleus, and quantities of nucleic acids.

Computational ghost imaging is well known as an imaging method. However, a breakthrough in ghost cytometry was that the reconstruction of cell images from raw signals can be skipped, because this reconstruction is the bottleneck in high-speed analysis. Ghost cytometry directly uses raw signals to classify cells. Compressive ghost imaging also uses multiple randomly generated illumination patterns to reconstruct an image. In ghost cytometry, however, cells pass through a randomly allocated static illumination pattern, and the signals are detected in time series using a single-pixel detector. That is, we do not need to switch the illumination pattern to obtain the fluorescence-intensity features extracted from multiple illumination patterns, which differs from conventional ghost imaging.
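The difference described above can be made concrete with a small sketch: a cell image translating past one static random mask produces a time-series signal whose samples are inner products of the image with shifted windows of the mask, i.e., many effective illumination patterns without any pattern switching. All sizes and the mask itself are hypothetical choices for this sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
H, W_cell, W_mask = 8, 8, 32          # hypothetical cell and mask widths

mask = rng.integers(0, 2, size=(H, W_mask)).astype(float)  # one static binary pattern
cell = rng.random((H, W_cell))                             # hypothetical cell image

# As the cell flows past the mask, the single-pixel detector records, at each
# time step, the total intensity through the current overlap window
signal = np.array([
    np.sum(cell * mask[:, t:t + W_cell])
    for t in range(W_mask - W_cell + 1)
])
# Each sample corresponds to a different shifted window of the same mask,
# so no pattern switching is needed to collect multiple projections
```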

In this paper, we provide a theoretical analysis of learning from ghost imaging without imaging, covering both the general ghost-imaging setting and the specific ghost-cytometry setting. The key idea in ghost cytometry is to approximate the radial basis function (RBF) kernel between cell images by using signals, without imaging. That is,

$k(X, X') \approx \tilde{k}(s, s')$,  (3)

where $k(X, X')$ is the RBF kernel between image objects $X$ and $X'$ and $\tilde{k}(s, s')$ is the RBF kernel between the corresponding signals $s$ and $s'$ in ghost cytometry.
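A rough numerical illustration of this approximation, under the assumption (mine, for this sketch) that the mean part of the signals is removed by centering the masks: squared distances between centered random-binary projections concentrate around a fixed multiple of squared image distances, so an RBF kernel with a rescaled bandwidth computed on the signals tracks the RBF kernel on the images. All dimensions and the bandwidth are placeholders.

```python
import numpy as np

rng = np.random.default_rng(2)
d, k = 256, 4096                    # pixels per image, number of masks

x = rng.random(d)                   # two hypothetical vectorized cell images
y = rng.random(d)

A = rng.integers(0, 2, size=(k, d)).astype(float)   # Bernoulli(1/2) masks
Ac = A - 0.5                        # centering removes the mean-intensity part

dist_img = np.sum((x - y) ** 2)
# E||Ac z||^2 = (k/4) ||z||^2 for centered Bernoulli(1/2) masks, so rescaling
# gives an unbiased estimate of the image-domain squared distance
dist_sig = np.sum((Ac @ (x - y)) ** 2) / (k / 4)

gamma = 0.05
k_img = np.exp(-gamma * dist_img)   # RBF kernel on images
k_sig = np.exp(-gamma * dist_sig)   # RBF kernel computed from signals alone
```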

The remainder of this paper is organized as follows. In Sections 2 and 4, we give the details of ghost features, which are the detected raw signals, in ghost imaging and ghost cytometry, respectively. In Sections 3 and 5, we theoretically analyze ghost features. In Section 6, we discuss how ghost cytometry captures the morphological features of cells.

2 Ghost Features in Ghost Imaging

Let $H_1, \ldots, H_M$ be $n \times n$-pixel random binary masks, where $H_i \in \{0,1\}^{n \times n}$. The $(x,y)$-th element, $H_{i,xy}$, indicates the $i$-th speckle field $I_i(x,y)$. We construct $H_{i,xy}$ by using

$H_{i,xy} \sim \mathrm{Bernoulli}(p)$,  (4)

where $p \in (0,1)$ is a parameter.

Denote the $n \times n$ matrix representing an object as $X$, i.e., the $(x,y)$-th element, $X_{xy}$, indicates the value of the transmission function of the object, given by $X_{xy} = T(x,y)$. Note that $X$ is nonnegative. Therefore, we reformulate the intensity $s_i$ measured using a detector, given by

$s_i = \sum_{x,y} H_{i,xy} X_{xy}$.  (5)

We can reconstruct

$\hat{X} = \frac{1}{M}\sum_{i=1}^{M} (s_i - \bar{s})\, H_i$,  (6)

where $\bar{s} = \frac{1}{M}\sum_{i=1}^{M} s_i$. However, we consider learning from ghost imaging without image reconstruction. We call $s = (s_1, \ldots, s_M)$ the ghost features of object $X$.

We define the $M$-dimensional vector function expressed by

$f(X) = (s_1, \ldots, s_M)^{\top}$  (7)
$= (\langle H_1, X \rangle, \ldots, \langle H_M, X \rangle)^{\top}$  (8)
$= A\,\mathrm{vec}(X)$,  (9)

where $\top$ denotes the transpose of a vector or matrix, $\langle H, X \rangle = \sum_{x,y} H_{xy} X_{xy}$, and $A$ is the $M \times n^2$ matrix whose $i$-th row is $\mathrm{vec}(H_i)^{\top}$.
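The construction above can be sketched as follows: draw $M$ random binary masks, take one detector reading per mask, and note that stacking the readings equals multiplying the vectorized object by a random binary matrix. The Bernoulli parameter and sizes are placeholders for this sketch.

```python
import numpy as np

rng = np.random.default_rng(3)
n, M, p = 16, 512, 0.5            # hypothetical image size, mask count, parameter

# M random binary n x n masks, each entry 1 with probability p
masks = (rng.random((M, n, n)) < p).astype(float)

X = rng.random((n, n))            # hypothetical nonnegative object matrix

# Ghost feature vector: one reading per mask, f_i = sum_xy H_i * X
f = np.einsum('ixy,xy->i', masks, X)

# Equivalently, a random projection of the vectorized object
A = masks.reshape(M, -1)
f_alt = A @ X.ravel()
```

The two computations agree exactly, which is why ghost features can be analyzed as random projections.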

3 Analysis of Ghost Features in Ghost Imaging

In this section, we analyze the ghost features obtained from Eq. (5). First, we analyze the basic statistics of ghost features and describe their various properties. We then present Theorem 1. Ghost features are regarded as a type of random projections [8, 9, 10, 11]. Thus, we analyze ghost features in terms of random projections.

Definition 1 ($L_2$ norm and Frobenius norm).

Denote the $L_2$ norm of a vector $v$ as $\|v\|_2$ and the Frobenius norm of a matrix $X$ as $\|X\|_F = (\sum_{x,y} X_{xy}^2)^{1/2}$.

Definition 2 (Summation of matrix elements).

Let the summation of matrix elements be

$\mathrm{sum}(X) = \sum_{x,y} X_{xy}$.  (10)

Note that

(11)

First, let us find the expectation and variance of a ghost feature.

Proposition 1 (Basic statistics of a ghost feature).
$\mathbb{E}[s_i] = p\,\mathrm{sum}(X), \quad \mathrm{Var}[s_i] = p(1-p)\,\|X\|_F^2$.  (12)
Proof.

Since $\mathbb{E}[H_{i,xy}] = p$,

$\mathbb{E}[s_i] = \sum_{x,y} \mathbb{E}[H_{i,xy}]\, X_{xy} = p\,\mathrm{sum}(X)$.  (13)

Since $\mathrm{Var}[H_{i,xy}] = p(1-p)$ and the entries $H_{i,xy}$ are independent,

$\mathrm{Var}[s_i] = \sum_{x,y} \mathrm{Var}[H_{i,xy}]\, X_{xy}^2 = p(1-p)\,\|X\|_F^2$.  (14)

Lemma 1 (Hoeffding’s Lemma).

Let $X$ be a bounded random variable with $a \le X \le b$. Then for all $\lambda \in \mathbb{R}$,

$\mathbb{E}[e^{\lambda (X - \mathbb{E}[X])}] \le \exp\left(\frac{\lambda^2 (b-a)^2}{8}\right)$.  (15)
Corollary 1.

Let $B$ be a Bernoulli random variable:

$\Pr(B = 1) = p$, $\Pr(B = 0) = 1 - p$,  (16)

where $0 \le p \le 1$. Then, for all $\lambda \in \mathbb{R}$,

$\mathbb{E}[e^{\lambda (B - p)}] \le \exp\left(\frac{\lambda^2}{8}\right)$.  (17)

We then consider the two parts of a ghost feature.

(18)
Lemma 2 (Basic statistics of Part I).
(19)
(20)
Proof.
(21)
(22)

Lemma 3 (Basic statistics of Part II).
(23)
(24)
Proof.
(25)
(26)

Next, we analyze the subGaussian property of a ghost feature.

Definition 3 ($\sigma^2$-subGaussian).

A random variable $X$ is said to be $\sigma^2$-subGaussian if $\mathbb{E}[X] = 0$ and there exists $\sigma > 0$ such that its moment generating function satisfies

$\mathbb{E}[e^{\lambda X}] \le \exp\left(\frac{\sigma^2 \lambda^2}{2}\right)$ for all $\lambda \in \mathbb{R}$.  (27)
Lemma 4 (Part I is -subGaussian).
(28)
Proof.
(29)

Lemma 5 (Part II is -subGaussian).
(30)
Proof.
(31)

Lemma 6.

Assume that $X_1$ is $\sigma_1^2$-subGaussian and $X_2$ is $\sigma_2^2$-subGaussian (not necessarily independent); then $X_1 + X_2$ is $(\sigma_1 + \sigma_2)^2$-subGaussian.

Proof.

On the basis of Hölder’s inequality, let us introduce $q$ with $1/p + 1/q = 1$ for some $p > 1$; thus, we have

$\mathbb{E}[e^{\lambda (X_1 + X_2)}] \le \left(\mathbb{E}[e^{p\lambda X_1}]\right)^{1/p} \left(\mathbb{E}[e^{q\lambda X_2}]\right)^{1/q} \le \exp\left(\frac{\lambda^2 (p\sigma_1^2 + q\sigma_2^2)}{2}\right)$.  (32)

When we set $p = (\sigma_1 + \sigma_2)/\sigma_1$, we have

$\mathbb{E}[e^{\lambda (X_1 + X_2)}] \le \exp\left(\frac{\lambda^2 (\sigma_1 + \sigma_2)^2}{2}\right)$.  (33)

Proposition 2.

Ghost feature is -subGaussian.

Proof.

From Lemmas 4 and 5,

  • is -subGaussian,

  • is -subGaussian.

Thus, is -subGaussian. ∎
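The subGaussianity asserted in Proposition 2 can be probed empirically. In the sketch below (my setup, with Bernoulli(1/2) masks and a hypothetical fixed object), a centered ghost feature is a weighted sum of bounded independent terms, so by Hoeffding's lemma it is subGaussian with variance proxy at most $\sum_i x_i^2/4$, and its Monte-Carlo tail indeed stays below the corresponding Gaussian-type bound.

```python
import numpy as np

rng = np.random.default_rng(4)
d, trials = 256, 20000
x = rng.random(d)                     # hypothetical fixed vectorized object

# Many independent centered ghost features a.x - E[a.x], a ~ Bernoulli(1/2)^d
a = rng.integers(0, 2, size=(trials, d))
z = a @ x - 0.5 * x.sum()

# Hoeffding: each term a_i * x_i lies in [0, x_i], so z is sigma^2-subGaussian
# with sigma^2 <= sum(x_i^2) / 4; compare the empirical tail with the bound
sigma2 = np.sum(x ** 2) / 4
t = 2.0 * np.sqrt(sigma2)
emp_tail = np.mean(z > t)
bound = np.exp(-t ** 2 / (2 * sigma2))  # Chernoff-type tail bound, exp(-2)
```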

We use notations and below.

Lemma 7.

Let $X$ be a random variable such that

$\Pr(|X| > t) \le 2\exp\left(-\frac{t^2}{2\sigma^2}\right)$.  (34)

Then, for any positive integer $k \ge 1$,

$\mathbb{E}[|X|^k] \le k\,(2\sigma^2)^{k/2}\,\Gamma(k/2)$,  (35)

where $\Gamma(\cdot)$ is the gamma function. Moreover,

$\mathbb{E}[X^2] \le 4\sigma^2$,  (36)

and

$\mathbb{E}[X^4] \le 16\sigma^4$.  (37)
Proof.

For $k \ge 1$,

$\mathbb{E}[|X|^k] = \mathbb{E}\left[\int_0^{|X|} k t^{k-1}\, dt\right]$.  (38)

By using Fubini’s theorem,

$\mathbb{E}[|X|^k] = \int_0^{\infty} k t^{k-1} \Pr(|X| > t)\, dt \le 2k \int_0^{\infty} t^{k-1} e^{-t^2/(2\sigma^2)}\, dt = k\,(2\sigma^2)^{k/2}\,\Gamma(k/2)$.  (39)

Moreover, and for any . Thus, for any

(40)

Lemma 8.

Let $X$ be $\sigma^2$-subGaussian with $\mathbb{E}[X] = 0$. Then,

$\Pr(|X| > t) \le 2\exp\left(-\frac{t^2}{2\sigma^2}\right)$.  (41)
Proof.

By the Chernoff bound, for any $\lambda > 0$,

$\Pr(X > t) \le e^{-\lambda t}\,\mathbb{E}[e^{\lambda X}] \le \exp\left(-\lambda t + \frac{\lambda^2 \sigma^2}{2}\right)$.  (42)

Thus, if $\lambda = t/\sigma^2$, we have

$\Pr(X > t) \le \exp\left(-\frac{t^2}{2\sigma^2}\right)$.  (43)

Applying the same argument to $-X$ gives the two-sided bound.

Lemma 9.

For ,

(44)
Proof.
(45)

We now have

(46)

That is,

(47)

Therefore, on the basis of Hölder’s inequality,

(48)

Since is -subGaussian, for ,

(49)

Thus,

(50)

Since is -subGaussian, for ,

(51)

By using , for , we have

(52)

Theorem 1.

For real matrices and and , we set

(53)
(54)

With probability at least ,

(55)

and

(56)
Proof.

This holds as a consequence of Lemmas 10 and 11 and Proposition 3 below.

From Lemmas 10 and 11, for every real matrix , with probability at least ,

(57)

On the basis of the linearity of a ghost feature (Proposition 3), substitute for