Revealing quantum chaos with machine learning


Abstract

Understanding the properties of quantum matter is an outstanding challenge in science. In this work, we demonstrate how machine learning methods can be successfully applied to the classification of regular and chaotic regimes in single-particle and many-body systems. We realize neural network algorithms that distinguish between regular and chaotic behavior in quantum billiard models with remarkably high accuracy. Taking this method further, we show that machine learning techniques make it possible to pin down the transition from integrability to many-body quantum chaos in Heisenberg XXZ spin chains. Our results pave the way for exploring the power of machine learning tools for revealing exotic phenomena in complex quantum many-body systems.


Introduction. The significant attention to machine learning techniques is related to their success in finding patterns in data, with applications in image recognition, speech analysis, computer vision, and many other domains LeCun2015 (). Quantum physics is well known to produce atypical patterns in data, which can in principle be revealed using machine learning methods Biamonte2017 (). This idea has stimulated intensive ongoing research on the subject. The scope so far includes the identification of phases of quantum matter and the detection of phase transitions Broecker2016 (); Melko2017 (); Schindler2017 (); Chng2017 (); Nieuwenburg2017 (); Ringel2018 (); Beach2018 (); Greitemann2018 (), as well as the representation of quantum states of many-body systems in regimes that are intractable for existing exact numerical approaches Troyer2017 (); Glasser2018 (); Lu2018 (). Another branch of research concerns applications of machine learning tools to the analysis of experimental data Troyer2018 (); Zhang2018 (); Sriarunothai2018 (). Recently, a machine learning approach has been used for processing data from quantum gas microscopes and for evaluating the predictions of competing theories of the doped Hubbard model without a bias towards any of them Knap2019 ().

Remarkable progress in building large-scale quantum simulators has opened fascinating prospects for the observation of novel quantum phases and exotic states Monroe2013 (); Rey2017 (); Lukin2017 (); Monroe2018 (). Such simulators also provide interesting insights into traditionally challenging problems in the study of complex quantum systems, such as quantum critical dynamics and quantum chaos Polkovnikov2016 (). Quantum systems with chaotic behaviour are of great interest in view of the possibility of exploring quantum scars in them Papic2018 (). Quantum many-body scars are potentially compatible with long-lived states, which are of importance for quantum information processing. A standard criterion for the separation between regular and chaotic regimes is based on the nearest-neighbor energy level statistics Berry1977 (); Bohigas1984 (): Poisson and Wigner-Dyson distributions correspond to integrable and chaotic systems, respectively. However, the energy level statistics of highly excited states is not always directly accessible in experiments.

From the machine learning perspective, an interesting problem is to understand whether it is possible to distinguish between regular and chaotic behavior based, in the best-case scenario, on experimentally accessible quantities such as data from projective measurements. Within this context, finding an appropriate criterion for identifying the transition with machine learning tools is essential.

Figure 1: Neural network approach for identifying a transition between chaotic and regular states in quantum billiards and Heisenberg spin chains. The input data contain the probability distribution in configuration space; the activations of two output neurons are used to identify the two regimes.
Figure 2: Convolutional neural network outputs for (a) the Sinai billiard, (b) the Bunimovich stadium, and (c) Pascal's limaçon as functions of the chaoticity parameter characterizing the billiard's boundary shape. The highlighted critical region corresponds to the region of "uncertainty" in the neural network output activation curves. The analysis of the chaotic/regular transition for the Bunimovich stadium is the most challenging due to its extreme sensitivity to variations of the chaoticity parameter (see Ref. Tao2008 ()).

In the present paper we implement a neural network based algorithm to classify regular and chaotic states in single-particle and many-body systems. The input data contain the wavefunction amplitudes of excited states, and the output is represented by two neurons corresponding to the integrable and chaotic classes (Fig. 1). In the single-particle case, we consider paradigmatic models of quantum billiards: the Sinai billiard, the Bunimovich stadium, and Pascal's limaçon billiard. We then apply the semisupervised "learning by confusion" scheme Nieuwenburg2017 () in order to detect the integrability/chaos transition and to evaluate the critical transition region. This approach is then extended to study the transition in the Heisenberg XXZ spin-1/2 chain in the presence of additional integrability-breaking interactions, namely a next-nearest neighbour spin-spin interaction and a coupling of a single spin to a local magnetic field (a magnetic impurity). In our work, the regular/chaos transitions are identified with high classification accuracy. We show that our results based on the machine learning approach are in good agreement with the analysis of level spacing distributions.

To address the problem of revealing the transition between regular and chaotic behaviour, we propose a learning approach based on a prior evaluation of the critical region and a subsequent detection of the critical point within its boundaries, performed with the confusion scheme Nieuwenburg2017 (). At the first stage, we train the network to distinguish states belonging to the extreme cases of the regular and chaotic regimes, i.e. to the limiting values of the chaoticity parameter. We then determine the critical domain where the neural network predicts a transition between the two regimes. At the second stage, we perform the "learning by confusion" scheme and identify the middle peak of the W-like performance curve of the neural network as the transition point Nieuwenburg2017 ().
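
The essence of the confusion scheme can be summarized in a few lines. The sketch below (a minimal Python illustration, not the paper's implementation) relabels the dataset according to a trial critical value of the chaoticity parameter, retrains the classifier, and records the test accuracy; the middle peak of the resulting W-shaped accuracy curve estimates the transition point. The helper `train_and_score`, which stands for the training and evaluation loop of the CNN or perceptron, is a hypothetical placeholder.

```python
import numpy as np

def confusion_curve(X, lams, lam_grid, train_and_score):
    """X: wavefunction snapshots; lams: their chaoticity parameters;
    lam_grid: trial critical points; train_and_score: placeholder that retrains
    the classifier on the proposed labels and returns its test accuracy."""
    accuracies = []
    for lam_c in lam_grid:
        y = (np.asarray(lams) > lam_c).astype(int)   # deliberate (re)labelling
        accuracies.append(train_and_score(X, y))
    return np.array(accuracies)   # W-shaped curve; the middle peak marks the transition
```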

Quantum billiards. Quantum billiards are among the simplest models exhibiting quantum chaos. The transition from regular to chaotic behaviour in quantum billiards has been intensively studied for decades Jain2017 (); it is controlled by the shape of the billiard boundary. Quantum billiards have been realized in various experimental setups including microwave cavities Sridhar1991 (), ultracold atoms Raizen2001 (), and graphene quantum dots Geim2008 (). Quantum scars Heller1993 (), which are regions of enhanced wavefunction amplitude in the vicinity of unstable classical periodic trajectories, are a hallmark of quantum chaos. Quantum scars are of great interest in quantum billiards Heller1993 (); Tao2008 (), and their analogs have recently been found in many-body systems Papic2018 ().

We consider three standard types of two-dimensional quantum billiards: the Sinai billiard, the Bunimovich stadium, and Pascal's limaçon (Robnik) billiard. For each billiard type we define a dimensionless chaoticity parameter determined by the billiard shape. In the Sinai billiard the chaoticity parameter is controlled by the ratio of the radius of the inner circle to the width/height of the external rectangle. In the case of the Bunimovich stadium the parameter is set by the length of the straight segments of the boundary, and the shape of Pascal's limaçon billiard is defined via a conformal map of the unit circle on the complex plane. In the limit of a vanishing chaoticity parameter these billiards have regular shapes and are therefore integrable. Varying the parameter allows one to trace out a continuous transition from integrability to quantum chaos.
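
For illustration, the following sketch generates boundary points for two of these billiard families. The parametrizations used here (the standard Robnik map w(z) = z + b z^2 for the limaçon and straight segments of length a for the stadium) are common conventions and may differ in detail from the ones used to produce the figures in the paper.

```python
import numpy as np

def limacon_boundary(b, n=400):
    """Pascal limaçon (Robnik) billiard: image of the unit circle |z| = 1 under
    the conformal map w(z) = z + b*z**2; b = 0 gives a circle (integrable)."""
    z = np.exp(1j * np.linspace(0.0, 2.0 * np.pi, n))
    w = z + b * z**2
    return w.real, w.imag

def stadium_boundary(a, n=400):
    """Bunimovich stadium: semicircles of unit radius joined by straight segments
    of length a; a = 0 gives a circle. Connecting consecutive points closes the
    straight segments automatically."""
    t = np.linspace(0.0, np.pi, n // 2)
    x = np.concatenate([a / 2 + np.cos(t - np.pi / 2), -a / 2 + np.cos(t + np.pi / 2)])
    y = np.concatenate([np.sin(t - np.pi / 2), np.sin(t + np.pi / 2)])
    return x, y
```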

Figure 3: Universal W-like NN performance curves in the "learning by confusion" scheme for the Sinai billiard (top panel) and Pascal's limaçon (bottom panel). The predicted transition point is highlighted. The estimated position of the transition point obtained from the KL divergence calculation [see Eq. (1)] based on the lowest 500 energy levels is shown with a red dot.
Figure 4: Unsupervised learning of regular and chaotic states in quantum billiards with a variational autoencoder (VAE). Latent-space representation of the wavefunctions in (a) the Bunimovich stadium and (b) the Sinai billiard; the axes are the coordinates of the two-dimensional latent space.

We use a supervised learning approach for revealing chaotic/regular transitions in quantum billiard models. We train a binary classifier based on a convolutional neural network (CNN) using real-space images of the probability density function (PDF). The training dataset consists of randomly sampled snapshots of the PDF in regions of interest (ROI) that exclude the billiard's boundary. The datasets are prepared separately for each billiard type. The wavefunctions are obtained from the numerical solution of the stationary Schrödinger equation (for details on the numerical solution and the dataset preparation, see Ref. SM ()). Since the information about the transition from the regular to the chaotic regime is mostly encoded in the properties of highly excited states, we use wavefunctions with sufficiently high level indices in our dataset.

Snapshots corresponding to the integrable limit of the chaoticity parameter are labeled "regular" (class 1), and snapshots corresponding to a large value of the chaoticity parameter are labeled "chaotic" (class 2). The activations of the neurons in the last layer allow us to classify chaotic/regular snapshots in the test dataset with high accuracy, see Fig. 2. The activation curves for each of the three billiard types (Sinai, Bunimovich, Pascal) as functions of the chaoticity parameter in Fig. 2 demonstrate that the CNN algorithm is able to learn the difference between regular and chaotic wavefunctions and reveals the existence of a transition region. Away from the transition region the CNN performs the binary classification with high confidence. The transition region determined by the CNN is highlighted in red in Fig. 2 for each billiard. The boundaries of the transition regions provided by the CNN classifier are in good agreement with the ones obtained from the analysis of the energy level spacing statistics SM ().

The transition region can be analyzed in more detail within the "learning by confusion" scheme Nieuwenburg2017 () by dynamically reassigning the class labels with respect to a given trial value of the chaoticity parameter. We present the NN "confusion curves" with the typical W-like shape for the Sinai and Pascal billiards in Fig. 3. The central peak of the W-like CNN performance curve gives an estimate of the position of the critical point separating regular and chaotic regimes. We note that a precise definition of the transition point is somewhat ambiguous and depends on the selected criteria, because all observables depend smoothly on the chaoticity parameter. Therefore, in our approach we only estimate the location of a characteristic critical point separating regular and chaotic regimes. The prior identification of the critical region is important in the "learning by confusion" scheme, since it guarantees the presence of the transition point inside the selected parameter range. The estimated positions of the critical points for the Sinai and Pascal limaçon billiards are marked in Fig. 3. We note that the analysis of the chaotic/regular transition for the Bunimovich stadium is challenging due to its extreme sensitivity to variations of the chaoticity parameter (see Ref. Tao2008 ()).

Figure 5: Neural network classification accuracy between integrable and chaotic XXZ spin chains. In order to independently pinpoint the location of the transition from integrability to chaos, we present the distributions of energy level spacings together with the Poisson and Wigner-Dyson distributions. Top panels: XXZ model with next-nearest neighbor interactions; bottom panels: XXZ model in the presence of a local magnetic field (a magnetic impurity) at the central site of the spin chain.

One of the key features that allows us to perform machine learning of the regular-to-chaos transition is the difference in the statistical properties of the wavefunction amplitudes in the two regimes. While in the chaotic case the wavefunction amplitudes have Gaussian statistics, in the regular case their probability distribution is non-universal and has a power-law singularity at small amplitudes Beugeling2017 ().

The standard approach to identifying a transition from integrability to quantum chaos is based on comparing the energy level spacing statistics with the Poisson and Wigner-Dyson distributions. In order to characterize the "degree of chaoticity" of the system, it is convenient to introduce a single scalar quantity, a measure of chaos. One example of such a measure is the average ratio of consecutive level spacings $\langle r \rangle$, with $r_n = \min(s_n, s_{n+1})/\max(s_n, s_{n+1})$ and $s_n = E_{n+1} - E_n$ Atas2013 (). In the present work we introduce a different measure based on the Kullback-Leibler (KL) divergence, defined as follows:

$$D_{\mathrm{KL}}(P\,\|\,P_{i}) = \int_{0}^{\infty} ds\, P(s) \ln\frac{P(s)}{P_{i}(s)}, \qquad i \in \{\mathrm{P}, \mathrm{WD}\}, \qquad (1)$$

where $P(s)$ is the level spacing distribution for a given value of the chaoticity parameter, and $P_{i}(s)$ is the Wigner-Dyson or Poisson distribution: $P_{\mathrm{WD}}(s) = \frac{\pi s}{2}\, e^{-\pi s^{2}/4}$, $P_{\mathrm{P}}(s) = e^{-s}$. Here $s$ is the unfolded nearest-neighbour energy level spacing.

In the transition region between regular and chaotic regimes the energy spacing distribution is neither Poisson nor Wigner-Dyson. The KL distance between $P(s)$ and $P_{\mathrm{P}}$ ($P_{\mathrm{WD}}$) is a measure of the integrability (chaoticity) of the system. There exists a point at which $P(s)$ is equidistant from the Poisson and Wigner-Dyson distributions in the KL metric, $D_{\mathrm{KL}}(P\,\|\,P_{\mathrm{P}}) = D_{\mathrm{KL}}(P\,\|\,P_{\mathrm{WD}})$, which we refer to as the "critical point". As shown in Fig. 3, the critical points predicted by the confusion scheme and by the KL divergence curves are in good agreement. It is important to note that the confusion scheme uses experimentally accessible quantities, whereas extracting the energy level statistics from experimental data is hardly achievable in condensed matter and atomic simulator experiments.
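
Both chaoticity measures can be estimated numerically from a list of energy levels. The sketch below is a minimal implementation assuming unfolded spacings with unit mean and a histogram estimate of $P(s)$; the binning choices are placeholders, not values taken from the paper.

```python
import numpy as np

def mean_r(levels):
    """Average ratio of consecutive level spacings,
    r_n = min(s_n, s_{n+1}) / max(s_n, s_{n+1})  [Atas et al.]."""
    s = np.diff(np.sort(levels))
    r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
    return r.mean()          # ~0.386 for Poisson, ~0.53 for GOE statistics

def kl_to_reference(spacings, ref="poisson", bins=50, smax=4.0):
    """Histogram estimate of D_KL(P || P_ref) for unfolded spacings (mean spacing = 1)."""
    hist, edges = np.histogram(spacings, bins=bins, range=(0.0, smax), density=True)
    s = 0.5 * (edges[:-1] + edges[1:])
    if ref == "poisson":
        p_ref = np.exp(-s)                                      # Poisson distribution
    else:
        p_ref = 0.5 * np.pi * s * np.exp(-0.25 * np.pi * s**2)  # Wigner-Dyson (GOE)
    mask = hist > 0
    return np.sum(hist[mask] * np.log(hist[mask] / p_ref[mask]) * np.diff(edges)[mask])
```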

An alternative approach to differentiating between regular and chaotic wavefunctions is to use unsupervised machine learning techniques, such as the variational autoencoder (VAE). VAEs are generative NN models that are able to directly learn statistical distributions in raw data and can be efficiently used for solving clustering problems Kingma2014 (); Sohn2015 (). A VAE consists of an encoding NN, a latent space, and a decoding NN (see Fig. S4). During training the VAE "learns" to reproduce the input data by optimizing the weights of the encoder and decoder NNs and the parameters of the latent layer. Training the VAE on images corresponding to the regular and chaotic cases and sampling from the latent space of dimension 2 results in two clearly separated clusters representing regular and chaotic wavefunctions. In Figs. 4(a) and 4(b) we show the latent space distributions for the Bunimovich and Sinai billiards. The separation into two clusters shows that the VAE is able to learn the difference in the statistical properties of the wavefunction amplitudes in regular and chaotic billiards. A similar approach was used for unsupervised learning of phase transitions Wetzel2017 (). Exploring the full potential of unsupervised machine learning methods for clustering quantum states is beyond the scope of the present work.

Quantum chaos in XXZ spin chains. While quantum billiards are an instructive example of single-particle quantum chaos, the most interesting and challenging problem is quantum chaos in many-body systems. Developing machine learning approaches to characterize and classify many-body states in chaotic and integrable regimes using only limited information from measurements is a non-trivial task; such techniques can, for example, benefit the analysis of experimental data from quantum simulators. As a prototypical example of a quantum many-body integrable system we consider the 1D Heisenberg XXZ spin chain, which is of great interest for realizing models of quantum magnetism with quantum simulators Bloch2017 (). Recent experimental advances have opened exciting prospects for exploiting the rich variety of tunable interactions in Rydberg atoms Browaeys2016 (); Lukin2016 (); Lukin2017 (); Browaeys2018 (); Browaeys2018-2 () and cold polar molecules Buchler2012 (); Ye2012 (); Rey2013 () for the engineering of spin Hamiltonians, including the XXZ model.

The Hamiltonian of the Heisenberg XXZ model reads:

$$H_{\mathrm{XXZ}} = \sum_{i=1}^{N-1} \left[ J \left( S^{x}_{i} S^{x}_{i+1} + S^{y}_{i} S^{y}_{i+1} \right) + J_{z}\, S^{z}_{i} S^{z}_{i+1} \right], \qquad (2)$$

where $N$ is the number of spins, $J$ and $J_{z}$ are the Heisenberg exchange constants, and $S^{x,y,z}_{i}$ are Pauli spin-1/2 operators acting on site $i$. For simplicity we consider only the antiferromagnetic XXZ model, $J, J_{z} > 0$; hereafter we set $J = 1$. The XXZ model is integrable and exactly solvable by the Bethe ansatz Buchler2012 (); however, it becomes non-integrable in the presence of additional interactions.

Here we consider two types of perturbations that violate the integrability of the XXZ model: (i) an antiferromagnetic next-nearest neighbour (NNN) spin-spin interaction, and (ii) a local static magnetic field acting on a single spin (impurity). We parametrize the perturbation Hamiltonians in the following form:

$$H_{\mathrm{NNN}} = \alpha \sum_{i=1}^{N-2} \left( S^{x}_{i} S^{x}_{i+2} + S^{y}_{i} S^{y}_{i+2} + S^{z}_{i} S^{z}_{i+2} \right), \qquad H_{\mathrm{imp}} = h\, S^{z}_{i_{0}}. \qquad (3)$$

Here $\alpha$ and $h$ denote the strengths of the NNN interaction and of the local field, respectively. We consider spin chains with an odd number of spins $N$, so that in case (ii) the local magnetic field acts on the spin in the middle of the chain, i.e. on the site $i_{0} = (N+1)/2$. Hence, the Hamiltonian of the perturbed XXZ model reads:

$$H = H_{\mathrm{XXZ}} + H', \qquad H' \in \{ H_{\mathrm{NNN}},\, H_{\mathrm{imp}} \}. \qquad (4)$$

We train a multilayer perceptron on a dataset containing the probabilities of the spin configurations in the $\sigma^{z}$ representation, i.e. the quantities $|\langle s|\psi\rangle|^{2}$, where $|s\rangle$ are basis states in the $\sigma^{z}$ representation; these are experimentally accessible data. The eigenfunctions are obtained by exact diagonalization of the spin-chain Hamiltonian. Similarly to the case of quantum billiards, we consider only highly excited states corresponding to levels lying in the middle of the energy spectrum.

Further, in order to pin down the transition in these systems, we evaluate the NN classification prediction for the test dataset as a function of the perturbation strength, see Fig. 5. The transition region is highlighted in red. The critical regions detected for XXZ + NNN and XXZ + impurity turn out to be in agreement with the level spacing distributions presented in Fig. 5. Within these critical regions we performed "learning by confusion", which resulted in W-like NN performance curves, see Fig. 6, and yielded the transition points for XXZ + NNN and XXZ + impurity. We note that these results are in reasonable agreement with the results based on the KL divergence calculations.

Figure 6: Reconstructed universal W-like NN performance curves for (a) the XXZ chain with NNN interactions and (b) the XXZ chain in the presence of an impurity; the predicted transition point is highlighted. The transition point predicted by the KL divergence calculation [Eq. (1)] for the energy spacing distribution is also shown.

Conclusion. In summary, we have shown the potential of classical supervised and unsupervised machine learning techniques for the classification of regular and chaotic regimes in single-particle and many-body systems. For quantum billiards and XXZ spin chains we demonstrated that neural networks can serve as binary classifiers distinguishing between the two regimes with remarkably high accuracy. We revealed the integrability-chaos transition region purely on the basis of machine learning techniques and located the transition point using the "learning by confusion" approach. Extensions of our work open a new avenue for studying chaotic and integrable regimes using experimentally accessible data in various many-body quantum systems, including atomic simulators. Harnessing machine learning methods could open up exciting possibilities for studying exotic many-body phenomena in controlled quantum many-body systems, such as many-body localization Altshuler2006 (), many-body quantum scars Papic2018 (), and ergodic/non-ergodic phase transitions Shlyapnikov2018 (), as well as the near-critical properties of these systems.

Acknowledgements. We are grateful to M.B. Zvonarev and V.V. Vyborova for valuable suggestions. We thank G.V. Shlyapnikov, V.I. Yudson, and B.L. Altshuler for fruitful discussions and useful comments. The work was supported by the RFBR (Grant No. 18-37-00096).

Supplementary material

s0.1 Numerical solution of the Schrödinger equation for quantum billiards.

We solve the stationary Schrödinger equation describing a single particle in a quantum billiard with the Dirichlet boundary condition:

$$-\frac{\hbar^{2}}{2m}\, \nabla^{2} \psi(\mathbf{r}) = E\, \psi(\mathbf{r}), \qquad \psi(\mathbf{r})\big|_{\mathbf{r} \in \partial\Omega} = 0, \qquad (S1)$$

where $\psi(\mathbf{r})$ is the wavefunction and $E$ is the energy of a particle in the billiard with the boundary $\partial\Omega$; $\nabla^{2}$ is the two-dimensional Laplace operator. Hereafter we set Planck's constant and the mass to unity, $\hbar = m = 1$. In order to solve Eq. (S1) for an arbitrary 2D billiard boundary shape we use the Matlab PDE toolbox. The PDE solver is based on the finite element method with an adaptive triangular mesh for a given boundary geometry. In order to reduce the computational complexity and to avoid additional complications due to degeneracies of eigenstates, we constrain the eigenfunctions to a specific symmetry (parity) sector. We remove degeneracies by considering the lowest-symmetry segments of the billiards. In the case of the Bunimovich stadium we consider a quarter of the billiard [see the inset of Fig. 2(b) in the main text]. For the Sinai billiard we consider a boundary with an incommensurate ratio of the vertical and horizontal dimensions of the external rectangle. In the case of the Pascal limaçon billiard, the degeneracy is lifted by considering only the upper part of the billiard.
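
For illustration of the same workflow outside of Matlab, the sketch below assembles a simple finite-difference Hamiltonian on a grid mask encoding the billiard interior (Dirichlet walls are imposed by dropping exterior points) and extracts the lowest eigenstates with a sparse eigensolver. This is an alternative illustration, not the authors' finite-element implementation.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

def billiard_eigenstates(mask, h, n_states=200):
    """mask: 2D boolean array, True inside the billiard; h: grid spacing.
    Returns eigenenergies and eigenvectors of H = -(1/2)*Laplacian (hbar = m = 1)."""
    ny, nx = mask.shape
    idx = -np.ones(mask.shape, dtype=int)
    idx[mask] = np.arange(mask.sum())              # enumerate interior grid points
    rows, cols, vals = [], [], []
    for iy, ix in zip(*np.nonzero(mask)):
        i = idx[iy, ix]
        rows.append(i); cols.append(i); vals.append(2.0 / h**2)   # diagonal of the 5-point stencil
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            jy, jx = iy + dy, ix + dx
            if 0 <= jy < ny and 0 <= jx < nx and mask[jy, jx]:
                rows.append(i); cols.append(idx[jy, jx]); vals.append(-0.5 / h**2)
    H = sp.csr_matrix((vals, (rows, cols)), shape=(mask.sum(), mask.sum()))
    energies, states = eigsh(H, k=n_states, sigma=0.0, which="LM")  # lowest eigenstates
    return energies, states
```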

s0.2 Dataset preparation for quantum billiards

Wavefunctions obtained from the numerical solution of the Schrödinger equation are converted into images of the PDF. From the original images we randomly select square fragments (regions of interest) that exclude the billiard boundary. In order to reduce the size of the images we perform a coarse graining (downsampling) to a lower resolution. The dataset for each billiard type contains wavefunctions corresponding to highly excited states. In order to increase the number of images in the dataset we perform data augmentation by adding horizontal and vertical reflections, discrete rotations, and rotations by random angles drawn from a uniform distribution. Trial samples from the dataset for the Bunimovich billiard are shown in Fig. S1.
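
A minimal sketch of this snapshot pipeline is given below; the crop size, output resolution, and number of crops are placeholders and are not the values used in the paper.

```python
import numpy as np
from scipy.ndimage import rotate, zoom

def make_snapshots(pdf_image, interior_mask, roi=128, out=32, n_crops=10, rng=None):
    """Crop random regions of interest away from the boundary, coarse-grain them,
    and augment with reflections and random rotations."""
    rng = np.random.default_rng() if rng is None else rng
    ys, xs = np.nonzero(interior_mask)
    snaps = []
    while len(snaps) < 4 * n_crops:
        k = rng.integers(len(ys))
        y, x = ys[k], xs[k]
        patch = pdf_image[y:y + roi, x:x + roi]
        if patch.shape != (roi, roi) or not interior_mask[y:y + roi, x:x + roi].all():
            continue                                    # crop touches the billiard boundary
        patch = zoom(patch, out / roi)                  # coarse graining (downsampling)
        snaps += [patch, patch[::-1], patch[:, ::-1],   # reflections
                  rotate(patch, rng.uniform(0.0, 360.0), reshape=False, mode="nearest")]
    return np.array(snaps)
```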


Figure S1: Sample images of the PDF in the dataset for the Bunimovich billiard in the regular and chaotic cases.

The training dataset consists of labeled images from class 1 (regular, integrable limit of the chaoticity parameter) and class 2 (chaotic, large value of the chaoticity parameter). The value of the chaoticity parameter used for the chaotic class is chosen independently for each billiard type. In order to check that the system is in the chaotic regime at this value, we compare the energy level spacing distribution with the Wigner-Dyson distribution. As long as this value is much greater than the critical one, the NN activation curves remain practically unchanged (see Fig. 2 in the main text).

The data are split into training and test sets. The test set for each billiard type consists of images for several values of the chaoticity parameter (including values not present in the training dataset); evaluating the NN output on the test images for each value results in the NN prediction curves presented in Fig. 2 of the main text.

s0.3 Convolutional neural network

The CNN consists of two convolutional layers followed by a fully connected layer and a final softmax layer. The output of the second convolutional layer is subject to dropout regularization and batch normalization. The cost function for the binary classifier is the cross-entropy. The neuron activation function is the ReLU. The scheme of the CNN architecture is presented in Fig. S2.

Figure S2: CNN used for recognizing chaotic regimes in quantum billiards.

The weights of the CNN are optimized with the Adam optimizer; the batch size is 60.
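
For concreteness, a minimal PyTorch sketch of such a classifier is given below; the filter counts, kernel sizes, and the assumed input resolution (32x32) are placeholders, not values taken from the paper.

```python
import torch
import torch.nn as nn

class BilliardCNN(nn.Module):
    """Two convolutional layers (batch normalization and dropout after the second),
    a fully connected layer, and softmax/cross-entropy over two output neurons."""
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.BatchNorm2d(32),
            nn.ReLU(), nn.Dropout(0.5), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 8 * 8, n_classes)   # logits for 32x32 inputs

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = BilliardCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()   # cross-entropy over the two classes (softmax implicit)
```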

Energy level spacing statistics in quantum billiards

Figure S3: Left column: the CNN activation curves (Fig. 2 of the main text). The histograms show the energy level spacing distributions (lowest 500 energy levels). In order to benchmark the NN prediction for the regular-to-chaos transition region, the energy level spacing distributions are compared with the standard Poisson/GOE distributions.

s0.4 Unsupervised learning with VAE

We perform unsupervised learning of the two classes ("regular" and "chaotic") using a variational autoencoder (VAE). The unlabeled dataset was prepared in a similar way as for supervised learning and consists of randomly sampled images of the PDF. The VAE was trained and tested on highly excited states of the Bunimovich and Sinai billiards, with the integrable limit of the chaoticity parameter corresponding to the "regular" class and a large value corresponding to the "chaotic" class. The VAE consists of an encoder, a decoder, a sampler, and a latent space of dimension 2 representing the two classes, "regular" and "chaotic". The sampler generates random latent-space variables with a given mean and variance. The encoder and decoder are fully connected NNs with two hidden layers. The objective function is the sum of the reconstruction loss (binary cross-entropy) and the KL divergence loss. The VAE was trained over 50 epochs using the Adam optimizer Kingma2014 ().
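
A minimal PyTorch sketch of this architecture is shown below; the hidden-layer width and the input size are placeholders, not values taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BilliardVAE(nn.Module):
    """Fully connected encoder/decoder with two hidden layers and a 2D latent space."""
    def __init__(self, d_in=32 * 32, hidden=256, d_z=2):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(d_in, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, d_z)
        self.logvar = nn.Linear(hidden, d_z)
        self.dec = nn.Sequential(nn.Linear(d_z, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden), nn.ReLU(),
                                 nn.Linear(hidden, d_in), nn.Sigmoid())

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)   # sampler (reparametrization)
        return self.dec(z), mu, logvar

def vae_loss(x, x_rec, mu, logvar):
    bce = F.binary_cross_entropy(x_rec, x, reduction="sum")        # reconstruction loss
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())  # KL divergence to N(0, 1)
    return bce + kld
```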

Figure S4: Architecture of variational autoencoder (VAE) for unsupervised learning of regular-chaos transition in quantum billiards.

s0.5 Exact diagonalization of the Hamiltonian of the XXZ model

We find the eigenstates of the Heisenberg XXZ model for an arbitrary value of the perturbation parameter by exact diagonalization based on the Lanczos algorithm Sandvik2011 (). We used the Python implementation provided by the QuSpin software package Weinberg2017 (). In order to avoid excessive computational costs, the size of the Hamiltonian matrix was reduced by considering only the eigenstates in certain parity and magnetization sectors of the XXZ Heisenberg model. Specifically, we find the eigenstates in the even parity sector and the lowest magnetization sector. The lowest magnetization sector corresponds to states with $|N_{\uparrow} - N_{\downarrow}| = 1$ (for odd spin chains), where $N_{\uparrow}$ and $N_{\downarrow}$ are the numbers of up and down spins, respectively.
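
As an illustration, a minimal QuSpin sketch for constructing and diagonalizing the XXZ chain with the NNN perturbation in a fixed magnetization and parity sector is given below; the chain length, coupling values, and sector choice are illustrative placeholders rather than the values used in the paper.

```python
import numpy as np
from quspin.basis import spin_basis_1d
from quspin.operators import hamiltonian

L, J, Jz, alpha = 13, 1.0, 1.0, 0.5            # placeholder couplings; alpha = NNN strength
basis = spin_basis_1d(L, Nup=(L - 1) // 2, pblock=1, pauli=False)  # fixed-Sz, even-parity sector

hop_nn  = [[0.5 * J, i, i + 1] for i in range(L - 1)]     # (J/2)(S+S- + S-S+) = J(SxSx + SySy)
zz_nn   = [[Jz, i, i + 1] for i in range(L - 1)]
hop_nnn = [[0.5 * alpha, i, i + 2] for i in range(L - 2)]
zz_nnn  = [[alpha, i, i + 2] for i in range(L - 2)]

static = [["+-", hop_nn], ["-+", hop_nn], ["zz", zz_nn],        # integrable XXZ part
          ["+-", hop_nnn], ["-+", hop_nnn], ["zz", zz_nnn]]     # NNN perturbation

H = hamiltonian(static, [], basis=basis, dtype=np.float64)
E, V = H.eigh()    # full spectrum; columns of V are eigenstates in the sector basis
```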

s0.6 Dataset preparation for Heisenberg XXZ chains

The dataset for the Heisenberg XXZ chains consists of vectors of probability densities (PDs) corresponding to integrable and chaotic Hamiltonians. We take the wavefunction corresponding to a quantum state with the energy lying in the center of the spectrum. In order to prepare a diverse dataset for a given value of the perturbation strength, we randomly select the anisotropy $J_{z}$ from a uniform distribution. Since the XXZ model is integrable for any value of $J_{z}$, we build a dataset corresponding to a set of different Hamiltonians by varying $J_{z}$. In the training set we include PDs for regular systems (vanishing perturbation) and chaotic systems (strong perturbation) and label the samples accordingly. The test set contains PDs corresponding to a discrete set of perturbation strengths covering the transition region. The training set contains 400 samples; the test set consists of 100 samples.
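
A minimal sketch of this dataset assembly is given below; the helper `build_hamiltonian` stands for the QuSpin construction sketched in the previous subsection, and the sampling range for $J_z$ and the number of samples per class are placeholders.

```python
import numpy as np

def make_spin_dataset(alphas, labels, build_hamiltonian, n_per_alpha=50, rng=None):
    """For each perturbation strength alpha, draw random anisotropies Jz, take the
    eigenstate in the middle of the spectrum, and store its probability density."""
    rng = np.random.default_rng() if rng is None else rng
    X, y = [], []
    for alpha, label in zip(alphas, labels):
        for _ in range(n_per_alpha):
            Jz = rng.uniform(0.5, 1.5)          # placeholder sampling range
            H = build_hamiltonian(Jz=Jz, alpha=alpha)
            E, V = H.eigh()
            psi = V[:, len(E) // 2]             # state in the middle of the spectrum
            X.append(np.abs(psi) ** 2)          # probabilities of spin configurations
            y.append(label)
    return np.array(X), np.array(y)
```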

s0.7 Multi-layer perceptron

Figure S5: Multilayer perceptron used for investigating integrable/chaotic transitions in Heisenberg XXZ chains.

We used a standard multilayer perceptron that consists of an input layer whose size equals the dimension of the vector of probability densities in the specified symmetry (parity and total magnetization) sector of the eigenstates, one hidden layer, and an output softmax layer. Each neuron of the hidden layer computes a weighted sum of its inputs and passes it through a sigmoid activation function. The hidden-layer outputs, multiplied by the corresponding weights, are passed to the neurons of the output layer, which finally yields a value between 0 and 1 for each class. The objective function is the binary cross-entropy. The neural network's weights are optimized using the Adam optimizer Kingma2014 (). The scheme of the neural network architecture is presented in Fig. S5.
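
A minimal PyTorch sketch of this perceptron is given below; the input dimension (the size of the symmetry sector) and the hidden-layer width are placeholders.

```python
import torch
import torch.nn as nn

d_in, d_hidden = 1716, 64        # placeholders: sector dimension and hidden-layer width
mlp = nn.Sequential(
    nn.Linear(d_in, d_hidden), nn.Sigmoid(),   # hidden layer with sigmoid activations
    nn.Linear(d_hidden, 2),                    # two output neurons (regular / chaotic)
)
optimizer = torch.optim.Adam(mlp.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()                # softmax + cross-entropy on the logits
```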


References

  1. Y. LeCun, Y. Bengio, and G. Hinton, Nature (London) 521, 436 (2015).
  2. J. Biamonte, P. Wittek, N. Pancotti, P. Rebentrost, N. Wiebe, and S. Lloyd, Nature (London) 549, 195 (2017).
  3. L. Wang, Phys. Rev. B 94, 195105 (2016).
  4. J. Carrasquilla and R.G. Melko, Nat. Phys. 13, 431 (2017).
  5. P. Broecker, J. Carrasquilla, R.G. Melko, and S. Trebst, Sci. Rep. 7, 8823 (2017).
  6. F. Schindler, N. Regnault, and T. Neupert, Phys. Rev. B 95, 245134 (2017).
  7. K. Ch'ng, J. Carrasquilla, R. G. Melko, and E. Khatami, Phys. Rev. X 7, 031038 (2017).
  8. E.P.L. van Nieuwenburg, Y.-H. Liu, and S.D. Huber, Nat. Phys. 13, 435 (2017).
  9. M. Koch-Janusz and Z. Ringel, Nat. Phys. 14, 578 (2018).
  10. M.J.S. Beach, A. Golubeva, and R.G. Melko, Phys. Rev. B 97, 045207 (2018).
  11. J. Greitemann, K. Liu, and L. Pollet, Preprint at arXiv.org:1804.08557.
  12. X.-Y. Dong, F. Pollmann, and X.-F. Zhang, Preprint at arXiv.org:1806.00829.
  13. B.S. Rem, N. Käming, M. Tarnowski, L. Asteria, N. Fläschner, C. Becker, K. Sengstock, and C. Weitenberg, Preprint at arXiv.org:1809.05519.
  14. K. Liu, J. Greitemann, and L. Pollet, Preprint at arXiv.org:1810.05538.
  15. G. Carleo and M. Troyer, Science 355, 602 (2017).
  16. I. Glasser, N. Pancotti, M. August, I.D. Rodriguez, and J.I. Cirac, Phys. Rev. X 8, 011006 (2018).
  17. S. Lu, X. Gao, and L.-M. Duan, Preprint at arXiv.org:1810.02352.
  18. G. Torlai, G. Mazzola, J. Carrasquilla, M. Troyer, R. Melko, and G. Carleo, Nat. Phys. 14, 447 (2017).
  19. T. Sriarunothai, S. Wölk, G.S. Giri, N. Friis, V. Dunjko, H.J. Briegel, and C. Wunderlich, Quantum Sci. Technol. 4, 015014 (2019).
  20. Y. Zhang, A. Mesaros, K. Fujita, S. D. Edkins, M. H. Hamidian, K. Ch'ng, H. Eisaki, S. Uchida, J.C. Séamus Davis, E. Khatami, and E.-A. Kim, Preprint at arXiv.org:1808.00479.
  21. A. Bohrdt, C.S. Chiu, G. Ji, M. Xu, D. Greif, M. Greiner, E. Demler, F. Grusdt, and M. Knap, Preprint at arXiv.org:1811.12425.
  22. R. Islam, C. Senko, W.C. Campbell, S. Korenblit, J. Smith, A. Lee, E.E. Edwards, C.-C. J. Wang, J.K. Freericks, and C. Monroe, Science 340, 583 (2013).
  23. M. Gärttner, J.G. Bohnet, A. Safavi-Naini, M.L. Wall, J.J. Bollinger, and A.M. Rey, Nat. Phys. 13, 781 (2017).
  24. H. Bernien, S. Schwartz, A. Keesling, H. Levine, A. Omran, H. Pichler, S. Choi, A.S. Zibrov, M. Endres, M. Greiner, V. Vuletić, and M.D. Lukin, Nature (London) 551, 579 (2017).
  25. J. Zhang, G. Pagano, P. W. Hess, A. Kyprianidis, P. Becker, H. Kaplan, A.V. Gorshkov, Z.-X. Gong, and C. Monroe, Nature (London) 551, 601 (2017).
  26. L. D’Alessio, Y. Kafri, A. Polkovnikov, and M. Rigol, Adv. Phys. 65, 239 (2016).
  27. C.J. Turner, A.A. Michailidis, D.A. Abanin, M. Serbyn, and Z. Papic, Nat. Phys. 14, 745 (2018).
  28. M.V. Berry and M. Tabor, P. Roy. Soc. Lond. A Mat. 356, 375 (1977).
  29. O. Bohigas, M. J. Giannoni, and C. Schmit, Phys. Rev. Lett. 52, 1 (1984).
  30. S.R. Jain and R. Samajdar, Rev. Mod. Phys. 89, 045005 (2017).
  31. S. Sridhar, Phys. Rev. Lett. 67, 785 (1991).
  32. V. Milner, J.L. Hanssen, W.C. Campbell, and M.G. Raizen, Phys. Rev. Lett. 86, 1514 (2001).
  33. L.A. Ponomarenko, F. Schedin, M.I. Katsnelson, R. Yang, E.H. Hill, K.S. Novoselov, and A.K. Geim, Science 320, 356 (2008).
  34. E.J. Heller, Phys. Rev. Lett. 53, 1515 (1984).
  35. T. Tao, Structure and Randomness: pages from year one of a mathematical blog (American Mathematical Society, 2008).
  36. W. Beugeling, A. Bäcker, R. Moessner, and M. Haque, Phys. Rev. B 98, 155102 (2018).
  37. Y.Y. Atas, E. Bogomolny, O. Giraud, and G. Roux, Phys. Rev. Lett. 110, 084101 (2013).
  38. See supplemental material.
  39. D.P. Kingma and M. Welling, Auto-encoding variational Bayes, ICLR (2014).
  40. K. Sohn, H. Lee, and X. Yan. Advances in Neural Information Processing Systems (NIPS, 2015).
  41. S. J. Wetzel, Phys. Rev. E 96, 022140 (2017).
  42. C. Gross and I. Bloch, Science 357, 995 (2017).
  43. D. Barredo, S. de Leseleuc, V. Lienhard, T. Lahaye, and A. Browaeys, Science 354, 1021 (2016).
  44. M. Endres, H. Bernien, A. Keesling, H. Levine, E.R. Anschuetz, A. Krajenbrink, C. Senko, V. Vuletić, M. Greiner, and M.D. Lukin, Science 354, 1024 (2016).
  45. D. Barredo, V. Lienhard, S. de Léséleuc, T. Lahaye, and A. Browaeys, Nature (London) 561, 79 (2018).
  46. S. de Léséleuc, S. Weber, V. Lienhard, D. Barredo, H.P. Büchler, T. Lahaye, and A. Browaeys, Phys. Rev. Lett. 120, 113602 (2018).
  47. D. Peter, S. Müller, S. Wessel, and H. P. Büchler, Phys. Rev. Lett. 109, 025303 (2012).
  48. A. Chotia, B. Neyenhuis, S.A. Moses, B. Yan, J.P. Covey, M. Foss-Feig, A.M. Rey, D.S. Jin, and J. Ye, Phys. Rev. Lett. 108, 080405 (2012).
  49. K.R.A. Hazzard, S.R. Manmana, M. Foss-Feig, and A.M. Rey, Phys. Rev. Lett. 110, 075301 (2013).
  50. D.M. Basko, I.L. Aleiner, and B.L. Altshuler, Ann. Phys. (Amsterdam) 321, 1126 (2006).
  51. X. Deng, V.E. Kravtsov, G.V. Shlyapnikov, and L. Santos, Phys. Rev. Lett. 120, 110602 (2018).
  52. A.W. Sandvik, Preprint at arXiv.org:1101.3281.
  53. P. Weinberg and M. Bukov, SciPost Phys. 2, 003 (2017).