Experimental Optimal Verification of Entangled States using Local Measurements
The initialization of a quantum system into a certain state is a crucial aspect of quantum information science. A variety of measurement strategies have been developed to characterize how well a system is initialized, but for any given one there is, in general, a trade-off between its efficiency and the information it reveals about the quantum state. Conventional quantum state tomography can characterize unknown states by reconstructing the density matrix; however, its exponentially expensive postprocessing is likely to produce inaccurate results. Alternatively, quantum state verification provides a technique to quantify the prepared state with significantly fewer measurements, especially for quantum entangled states. Here, we experimentally implement an optimal verification of entangled states with local measurements, where the estimated infidelity is inversely proportional to the number of measurements. The strategy is tolerant of the impurity of realistic states and is hence highly robust in a practical sense. Even more valuable, our method requires only local measurements, which incurs only a small constant-factor penalty compared to the globally optimal strategy requiring nonlocal measurements.
Quantum information is a field where information is encoded into quantum states, and by exploiting the "quantumness" of these states, we can perform computations more efficiently (1) and cryptography more securely (2) than with their classical counterparts. There are two basic preconditions for achieving these breakthroughs. On one hand, we need to reliably initialize quantum systems into the target quantum states. On the other hand, we have to develop precise and efficient techniques to characterize the prepared states. A variety of techniques have been developed for quantum state characterization, each designed and optimized for a specific scenario. Quantum state tomography (QST) (3) is designed for characterizing a completely unknown state, and the reconstructed density matrix provides the full information of the quantum state. However, conventional tomographic reconstruction of a state is an exponentially time-consuming and computationally difficult process. Adaptive QST relies on solving an optimization problem using previous results and is hence as computationally expensive as standard QST (4-7). With the prior knowledge that the measured states fall within some special category, compressed sensing (8, 9) and matrix product state tomography (10) can achieve significantly higher efficiency than QST. Entanglement witnesses extract only partial information about the state with far fewer measurements (11, 12). In recent years, a device-independent method, namely self-testing, has provided a guaranteed lower bound on the fidelity as the figure of merit of the tested devices (13-17).
From a practical point of view, in many important scenarios we merely need to know how well the system is prepared into a target state. In other words, we need to certify that the on-hand quantum device, designed to produce a particular quantum state, does indeed produce states sufficiently close to the target. QST can certainly be applied in these scenarios and provides complete information about the actual states; however, its exponential complexity prevents its application to large-scale systems. Alternatively, quantum state verification (QSV) overcomes this limitation at substantially lower complexity. Concretely, QSV is implemented by performing a sufficient number of measurements on the output states, after which the verifier reaches a conclusion from the statistical analysis, such as: "the device outputs copies of a state that has fidelity at least $1-\epsilon$ with the target, with confidence $1-\delta$". This conclusion can be read in two parts: first, the verifier certifies that the produced states lie within a ball of radius $\epsilon$ around the correct state; second, this conclusion might be incorrect with a probability of at most $\delta$.
Intuitively, optimizing the performance of a specific verification strategy means minimizing $\epsilon$ or $\delta$ for a given number of measurements $n$. Rigorously, this performance is quantified by the scaling of $\epsilon$ or $\delta$ with $n$. In the realm of physical parameter estimation, the precision is normally standard-quantum-limited, i.e., proportional to $1/\sqrt{n}$. Heisenberg-scaling metrology can achieve $1/n$ scaling; however, its realization generally requires precious quantum resources, e.g., quantum entanglement (18, 19). Quantum state measurement faces a similar situation; it is difficult to attain $1/n$ scaling with standard methods (20, 21). For the QSV of a multipartite quantum system, $1/n$ scaling can be attained via entangled measurements (22), which are also a rare quantum resource and difficult to implement in experiment (23). Recently, a theoretical breakthrough showed that it is also possible to realize $1/n$ scaling in QSV with local measurements (22). However, when applying this strategy in a real QSV experiment, the impurity of realistic states is likely to produce a rejection outcome within a very limited number of measurements, which prevents the QSV from reaching a valid conclusion. In this work, we experimentally investigate this strategy on a series of two-qubit entangled states with impurities as low as 0.005, and the statistical analysis is modified to be tolerant of single rejections. These features make our scheme robust to state imperfections and eventually yield a remarkable improvement in efficiency over previous works. Our results clearly indicate that an optimal verification of entangled states has been realized, with only nonadaptive, noncollective local measurements.
Robust and optimal QSV proposal with local measurements. We consider QSV in a real experimental situation. The quantum device produces a series of quantum states $\sigma_1, \sigma_2, \ldots, \sigma_n$, which are all supposed to be the target state $|\Psi\rangle$ under the assumption of independent and identical distribution. In practice, these states may deviate from $|\Psi\rangle$ and may even differ from each other. However, it is still reasonable to expect that all these states are close to $|\Psi\rangle$ in the sense of the following fidelity bound:

$$\langle\Psi|\sigma_j|\Psi\rangle > 1-\epsilon, \quad j = 1, \ldots, n. \qquad (1)$$
The task for the verifier is to certify that this is indeed the case and to minimize the uncertainty by performing a finite number of measurements. In particular, these measurements are expected to be local, nonadaptive, and noncollective. Obviously, for a pure target state, the globally optimal strategy is to project onto the target state $|\Psi\rangle$ and its orthogonal complement. Because a verification procedure consists of a finite number of measurements and the outcome of each measurement is inevitably probabilistic, states violating Eq. 1 still have a certain probability of passing the verification. The confidence level of the conclusion is defined to be $1-\delta$, with $\delta$ a small value standing for the probability of giving a wrong conclusion. With the globally optimal strategy, the minimum number of measurements required to achieve given values of $\epsilon$ and $\delta$ is then given by

$$n_{\mathrm{global}} \geq \frac{\ln\delta^{-1}}{\ln\left[(1-\epsilon)^{-1}\right]} \approx \epsilon^{-1}\ln\delta^{-1}. \qquad (2)$$
It can be seen that a $1/n$ scaling can be realized for the infidelity $\epsilon$ at a given confidence level. The coefficient $\ln\delta^{-1}$ remains a small constant even for a high confidence level. The globally optimal scaling allows for high verification efficiency; however, entangled measurements are intricate to implement and are regarded as precious resources in quantum information processing (24-27). In contrast, local measurements are more feasible in a practical sense. A recent theoretical breakthrough showed that it is possible to achieve $1/n$ scaling in the verification of two-qubit pure entangled states with only a few local measurement settings (22). Without loss of generality, consider the task of certifying the singlet state $|\Psi^-\rangle = (|HV\rangle - |VH\rangle)/\sqrt{2}$; one locally optimal strategy utilizing three measurement settings is written as

$$\Omega = \frac{1}{3}\left(P^-_{XX} + P^-_{YY} + P^-_{ZZ}\right), \qquad (3)$$
where $P^-_{XX}$ is the projector onto the negative eigensubspace of the tensor product of Pauli matrices $X \otimes X$ (and likewise for $P^-_{YY}$ and $P^-_{ZZ}$).
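As a sanity check on these definitions, the verification operator can be built numerically. The sketch below is our own illustration in plain NumPy (not part of the original experiment); it confirms that the singlet is accepted with certainty, while any orthogonal state passes each round with probability at most 1/3:

```python
import numpy as np

# Pauli matrices and the two-qubit identity
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I4 = np.eye(4, dtype=complex)

def p_minus(a, b):
    # Projector onto the negative eigenspace of a (x) b (eigenvalues +/-1)
    return (I4 - np.kron(a, b)) / 2

# Verification operator of Eq. (3)
omega = (p_minus(X, X) + p_minus(Y, Y) + p_minus(Z, Z)) / 3

# The singlet (|HV> - |VH>)/sqrt(2) passes with probability 1
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
print(np.real(singlet.conj() @ omega @ singlet))  # 1.0

# Remaining eigenvalues are all 1/3, so the gap is nu = 1 - 1/3 = 2/3
print(np.linalg.eigvalsh(omega).real)
```

The second-largest eigenvalue of 1/3 is exactly what yields the factor $\nu_\theta = 2/3$ quoted below for the singlet.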
For each state from the device, one of the measurement settings is applied, and the verifier accepts the outcome "-1" and rejects the outcome "+1". In the ideal case, where the states are perfectly pure and all received outcomes are "-1", the relation between the number of measurements $n$, the fidelity, and the confidence level can be written as

$$n \geq \frac{\ln\delta^{-1}}{\ln\left[(1-\nu_\theta\epsilon)^{-1}\right]} \approx \frac{1}{\nu_\theta\epsilon}\ln\delta^{-1}, \qquad (4)$$

where $\nu_\theta$ is the gap between the largest and second-largest eigenvalues of the strategy operator $\Omega$.
Evidently, a $1/n$ scaling of the infidelity $\epsilon$ is predicted by this equation, which approaches the globally optimal strategy, slower only by a constant factor of $1/\nu_\theta$. For the singlet state, $\nu_\theta$ is calculated to be 2/3, while for other target states it depends on the specific strategy. This equation can be read in different ways according to the given task. If the verifier wants to certify the state up to a certain fidelity $1-\epsilon$ and confidence level $1-\delta$, Eq. 4 gives the lowest number of measurements that must be performed. From another point of view, if the verifier has performed $n$ measurements with all outcomes being "-1", the achieved fidelity and confidence level can also be extracted from Eq. 4.
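The measurement budgets of the global and local strategies can be compared directly from the counting formula above. This small sketch (our own, with illustrative values $\epsilon = 0.01$ and $\delta = 0.05$) shows the constant-factor penalty of the three-setting singlet strategy:

```python
import math

def n_required(epsilon, delta, nu=1.0):
    """Measurements needed so that delta = (1 - nu*epsilon)^n, i.e.
    n >= ln(1/delta) / ln(1/(1 - nu*epsilon)).  nu = 1 corresponds to the
    globally optimal strategy, nu = 2/3 to the local singlet strategy."""
    return math.ceil(math.log(1 / delta) / -math.log(1 - nu * epsilon))

n_global = n_required(0.01, 0.05)            # globally optimal projection
n_local = n_required(0.01, 0.05, nu=2 / 3)   # three local settings
print(n_global, n_local)                     # 299 448
print(n_local / n_global)                    # penalty factor close to 1/nu = 1.5
```

Both budgets shrink as $1/\epsilon$; only the prefactor differs, which is the sense in which the local strategy remains optimal.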
As all these benefits are built on the premise that every outcome is "-1", a single appearance of "+1" halts the verification without any valid statement. In practice, experimental imperfections unavoidably yield a certain probability of observing "+1" in each measurement. Although the single-rejection probability is small, the probability that all outcomes are accepted decreases exponentially with the number of measurements; therefore, the original QSV strategy is severely limited for experimental implementation.
A modified strategy is thus developed here by considering the proportion of accepted outcomes, which is tolerant of a certain degree of state imperfection. Quantitatively, we have the corollary that if $\langle\Psi|\sigma_j|\Psi\rangle \leq 1-\epsilon$ for all the measured states, the probability for each outcome to be accepted is smaller than $1-\nu_\theta\epsilon$. As a result, when the verifier observes an accepted proportion $m/n > 1-\nu_\theta\epsilon$, it can be concluded that the actual states satisfy Eq. 1 with confidence level $1-\delta$, where $\epsilon$ and $\delta$ are calculated from the equation (28)

$$\delta = \exp\left[-n\, D\!\left(\frac{m}{n}\,\middle\|\, 1-\nu_\theta\epsilon\right)\right], \qquad (5)$$

where $D(x\|y) = x\ln\frac{x}{y} + (1-x)\ln\frac{1-x}{1-y}$ is the Kullback-Leibler divergence,
and $m$ is the number of accepted outcomes among the $n$ measurements. Benefiting from this modification, single rejections only lead to a decrease in the certified fidelity or confidence. As long as the final accepted proportion satisfies $m/n > 1-\nu_\theta\epsilon$, the verifier can still make a valid statement about the distance between the actual and target states.
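The Chernoff-bound analysis above is straightforward to evaluate numerically. The following sketch (our own illustration; the counts 996/1000 and the target $\epsilon = 0.02$ are made-up example values, not experimental data) shows how a handful of rejections still leaves a valid confidence statement:

```python
import math

def kl(x, y):
    # Kullback-Leibler divergence between Bernoulli(x) and Bernoulli(y)
    if x == 1.0:
        return -math.log(y)
    return x * math.log(x / y) + (1 - x) * math.log((1 - x) / (1 - y))

def confidence(m, n, epsilon, nu=2 / 3):
    """Confidence 1 - delta with delta = exp(-n * D(m/n || 1 - nu*epsilon)),
    valid only when the observed pass rate m/n exceeds 1 - nu*epsilon."""
    p_hat, p_max = m / n, 1 - nu * epsilon
    if p_hat <= p_max:
        return 0.0  # pass rate too low: no valid statement can be made
    return 1 - math.exp(-n * kl(p_hat, p_max))

# 996 of 1000 outcomes accepted still certifies epsilon = 0.02 with high confidence
print(confidence(996, 1000, 0.02))
```

If the observed pass rate dips below $1-\nu_\theta\epsilon$, the function returns 0, mirroring the requirement $m/n > 1-\nu_\theta\epsilon$ in the text.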
Experimental Implementation of QSV. The proposed QSV is performed with the setup shown in Fig. 2, which mainly consists of an entangled-photon-pair source and a random-sampling measurement apparatus. In the first part, tunable two-qubit entangled states are prepared by pumping a nonlinear crystal placed in a phase-stable Sagnac interferometer (SI) (see the Methods section for details). Polarization-entangled photon pairs are generated in a superposition of $|HV\rangle$ and $|VH\rangle$ ($H$ and $V$ denote the horizontally and vertically polarized components, respectively), whose relative amplitude, parameterized by an angle $\theta$, is controlled by the pump polarization. A maximally entangled singlet state, a partially entangled state, and a product state can each be generated at the corresponding setting of $\theta$. The projective measurements on both sides, which are jointly decided by the same quantum random number generator (QRG), are performed randomly. Randomness here means the measurement settings are unknown to the entangled photon pairs until they are measured.
In the experiment, we perform random projective measurements on the three classes of quantum states described above. For the maximally entangled singlet state, the three measurement settings are those in Eq. (3). For the partially entangled state, the proposed optimal strategy can be found in the Methods section. The situation changes for the product state, for which the locally optimal strategy coincides with the globally optimal strategy, and thus we only need to project the product state onto itself and its orthogonal complement. For all three states, we perform 40000 measurements in a single trial, and the values of $\epsilon$ and $\delta$ after each measurement are calculated from Eq. (5), as shown in Fig. 3. In Fig. 3(a), $\delta$ is calculated with the infidelity $\epsilon$ fixed at the same value for all three tested states. Within 1000 measurements, $\delta$ rapidly approaches 0, meaning that a near-unity confidence level can be achieved with high efficiency. Alternatively, we can set the confidence level to 0.95 and calculate $\epsilon$. Fig. 3(b) shows that for all three states the estimated $\epsilon$ descends below 0.01 after 40000 measurements. For both analysis methods, random rejections occur and cause abrupt increases in the estimated values of $\epsilon$ and $\delta$. As a result, neither $\epsilon$ nor $\delta$ can descend persistently with increasing measurements as predicted by Eq. (4). One example is the verified infidelity for the singlet state, which approaches the infidelity obtained from the tomography study after 40000 measurements.
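The effect of state impurity on the pass rate can be illustrated with a toy Monte Carlo simulation. The sketch below is our own illustration, not the experimental procedure: it assumes a hypothetical Werner-like noise model with an arbitrary visibility `v`, draws one of the three singlet settings uniformly per copy, and samples accept/reject from the Born rule:

```python
import numpy as np

rng = np.random.default_rng(7)

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I4 = np.eye(4, dtype=complex)
# The three acceptance projectors of the singlet strategy, Eq. (3)
projectors = [(I4 - np.kron(p, p)) / 2 for p in (X, Y, Z)]

singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
v = 0.99  # assumed visibility of a slightly mixed source (illustrative only)
rho = v * np.outer(singlet, singlet.conj()) + (1 - v) * I4 / 4

n, accepted = 40000, 0
for _ in range(n):
    proj = projectors[rng.integers(3)]            # random setting, weight 1/3 each
    if rng.random() < np.trace(rho @ proj).real:  # Born-rule pass probability
        accepted += 1

eps = 3 * (1 - v) / 4   # infidelity of rho with respect to the singlet
print(accepted / n)     # fluctuates around 1 - (2/3)*eps = (1 + v)/2
```

Even for this high-quality state, roughly $0.5\%$ of outcomes are rejections, which is why the unmodified all-accept analysis fails over tens of thousands of copies.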
To study the scaling of the infidelity with an increasing number of measurements, the results of a single trial of QSV are not convincing, since a single trial is too probabilistic given the random rejections caused by imperfect realistic states. To extract the true scaling behavior, we perform 50 trials of verification and average the results over the range of 20 to 80 measurements, as shown in Fig. 4. Owing to the high quality of the prepared states, rejection events in this range are rare, and the exhibited scaling can thus be regarded as the true performance of the verification proposal. The line for the product state almost overlaps with that of the globally optimal strategy predicted by Eq. (2). For the two entangled states, the observed scaling is also approximately parallel to the globally optimal line, indicating that $1/n$-scaling verification is achieved in our experiment. The use of local measurement settings incurs only a constant-factor penalty over the globally optimal strategy, which is 1.5 for the singlet state and a comparable factor for the partially entangled state.
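The $1/n$ behavior reported above follows directly from inverting Eq. (4) for $\epsilon$ at a fixed confidence level. This short sketch (our own, using the same 95% confidence level and the singlet value $\nu_\theta = 2/3$) shows that doubling the number of all-accept measurements roughly halves the certified infidelity:

```python
import math

def epsilon_verified(n, delta=0.05, nu=2 / 3):
    """Invert delta = (1 - nu*eps)^n for eps at a fixed confidence 1 - delta."""
    return (1 - delta ** (1 / n)) / nu

for n in (20, 40, 80):
    print(n, epsilon_verified(n))
# successive ratios are close to 2: the hallmark of 1/n scaling
```

The ratio between consecutive values approaches 2 from below because of the $\ln$ in the exact formula; for large $n$ the curve converges to $\epsilon \approx \ln(\delta^{-1})/(\nu_\theta n)$.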
Given a confidence level, the QSV strategy utilized here directly accesses the fidelity of the actual state to the target state. Although its description is not as comprehensive as that of a point estimation such as state tomography, it achieves superior performance in three respects. First, QSV requires much less complex postprocessing and offers higher efficiency than state tomography if the verifier is only concerned with the fidelity. Second, standard quantum state tomography uses maximum-likelihood estimation to obtain physical estimates of quantum states, which is strongly biased for small sample sizes (29). Third, point-estimation results may not accurately describe the actually prepared states, because the tomographic reconstruction is probabilistic and it is difficult to estimate the distance between the target and actual states. Previous works show that adaptive or collective measurements help achieve better results in tomography. Adaptive measurements require an experimental setup capable of feedback control and varying measurement settings, which are not easily calibrated in practice (30). The apparatus implementing collective measurements (31), such as quantum-walk circuits, is also intricate, because multipath interference is difficult to generate and maintain in a practical experiment.
To summarize, we experimentally realize an optimal QSV that is easy to implement and robust to realistic imperfections. The exhibited $1/n$ scaling results from the strategy itself, without entangled or adaptive measurements. Our results have clear implications for many quantum measurement tasks and may serve as a firm basis for subsequent work on more complex quantum systems.
IV. Materials and Methods
Generation of entangled photon pairs. Concretely, a 405.4-nm single-mode laser is used to pump a 5-mm-long bulk type-II periodically poled potassium titanyl phosphate (PPKTP) nonlinear crystal placed in a phase-stable SI to produce polarization-entangled photon pairs at 810.8 nm. A PBS followed by an HWP and a PCP is used to control the polarization mode of the pump beam. The lenses before and after the SI are used to focus the pump light and collimate the entangled photons, respectively. The interferometer is composed of two highly reflective, polarization-maintaining mirrors, a Di-HWP, and a Di-PBS; "Di" here means the element works at both 405.4 nm and 810.8 nm. The Di-HWP flips the polarization of passing photons, such that the type-II PPKTP can be pumped by the same horizontal polarization from both the clockwise and counterclockwise directions. A Di-IF and an LPF (long-pass filter) are used to remove the pump light. A BPF (band-pass filter) and an SMF are used for spectral and spatial filtering, which significantly increases the fidelity of the entangled states.
The whole setup, in particular the PPKTP, is sensitive to temperature fluctuations. Placing the PPKTP on a temperature controller and sealing the SI with an acrylic box help improve the temperature stability.
Tomography is performed on all three states with a large number of samples to yield a reliable reference for the verification results.
Implementation of QSV. The laser power is attenuated to decrease the generation rate of entangled photon pairs. As a result, for each randomly selected measurement there is, with near-unity probability, at most one coincidence recorded. The verifier accepts when Alice's outcome is opposite to Bob's; otherwise, it rejects.
For partially entangled pure states, the locally optimal strategy involves four measurement settings, following the construction of Ref. (22).
For each pair of entangled photons, one of the four measurements is randomly chosen with its corresponding weight; the four weights sum to 1. With these measurement settings, the gap $\nu_\theta$ is calculated accordingly.
The concrete angles of HWP and QWP used to construct the corresponding measurement settings for verification are shown in Table I.
- (1) P. W. Shor, Algorithms for quantum computation: discrete logarithms and factoring. Proceedings 35th Annual Symposium on Foundations of Computer Science, 124-134 (IEEE, 1994).
- (2) C. H. Bennett, G. Brassard, Quantum cryptography: public key distribution and coin tossing. Proceedings of the IEEE International Conference on Computers, Systems and Signal Processing, 175-179 (Bangalore, 1984).
- (3) D. F. V. James, P. G. Kwiat, W. J. Munro, A. G. White, Measurement of qubits. Phys. Rev. A 64, 052312 (2001).
- (4) D. H. Mahler, L. A. Rozema, A. Darabi, C. Ferrie, R. Blume-Kohout, A. M. Steinberg, Adaptive quantum state tomography improves accuracy quadratically. Phys. Rev. Lett. 111, 183601 (2013).
- (5) B. Qi, Z.-B. Hou, Y.-L. Wang, D.-Y. Dong, H.-S. Zhong, L. Li, G.-Y. Xiang, H. M. Wiseman, C.-F. Li, G.-C. Guo, npj Quantum Inf. 3, 19 (2017).
- (6) R. J. Chapman, C. Ferrie, A. Peruzzo, Experimental demonstration of self-guided quantum tomography. Phys. Rev. Lett. 117, 040402 (2016).
- (7) L. K. Yang, G. Chen, W.-H. Zhang, X.-X. Peng, S. Yu, X.-J. Ye, C.-F. Li, G.-C. Guo, Self-guided method to search maximal Bell violations for unknown quantum states. Phys. Rev. A 96, 052310 (2017).
- (8) S. T. Flammia, D. Gross, Y.-K. Liu, J. Eisert, Quantum tomography via compressed sensing: error bounds, sample complexity and efficient estimators. New J. Phys. 14, 095022 (2012).
- (9) D. Gross, Y.-K. Liu, S. T. Flammia, Quantum state tomography via compressed sensing. Phys. Rev. Lett. 105, 150401 (2010).
- (10) M. Cramer, M. B. Plenio, S. T. Flammia, R. Somma, D. Gross, S. D. Bartlett, O. Landon-Cardinal, D. Poulin, Y.-K. Liu, Efficient quantum state tomography. Nat. Commun. 1, 149 (2010).
- (11) G. Tóth, O. Gühne, Detecting genuine multipartite entanglement with two local measurements. Phys. Rev. Lett. 94, 060501 (2005).
- (12) G. Tóth, O. Gühne, Entanglement detection in the stabilizer formalism. Phys. Rev. A 72, 022340 (2005).
- (13) W.-H. Zhang, G. Chen, X.-X. Peng, X.-J. Ye, P. Yin, Y. Xiao, Z.-B. Hou, Z.-D. Cheng, Y.-C. Wu, J.-S. Xu, C.-F. Li, G.-C. Guo, Experimentally robust self-testing for bipartite and tripartite entangled states. Phys. Rev. Lett. 121, 240402 (2018).
- (14) W.-H. Zhang, G. Chen, P. Yin, X.-X. Peng, X.-M. Hu, Z.-B. Hou, Z.-Y. Zhou, S. Yu, X.-J. Ye, Z.-Q. Zhou, X.-Y. Xu, J.-S. Tang, J.-S. Xu, Y.-J. Han, B.-H. Liu, C.-F. Li, G.-C. Guo, Experimental demonstration of robust self-testing for bipartite entangled states. npj Quantum Inf. 5, 4 (2019).
- (15) W.-H. Zhang, G. Chen, X.-X. Peng, X.-J. Ye, P. Yin, X.-Y. Xu, J.-S. Xu, C.-F. Li, G.-C. Guo, Experimental realization of robust self-testing of Bell state measurements. Phys. Rev. Lett. 122, 090402 (2019).
- (16) J. Kaniewski, Analytic and nearly optimal self-testing bounds for the Clauser-Horne-Shimony-Holt and Mermin inequalities. Phys. Rev. Lett. 117, 070402 (2016).
- (17) A. Coladangelo, K. T. Goh, V. Scarani, All pure bipartite entangled states can be self-tested. Nat. Commun. 8, 15485 (2017).
- (18) G. Chen, N. Aharon, Y.-N. Sun, Z.-H. Zhang, W.-H. Zhang, D.-Y. He, J.-S. Tang, X.-Y. Xu, Y. Kedem, C.-F. Li, G.-C. Guo, Heisenberg-scaling measurement of the single-photon Kerr non-linearity using mixed states. Nat. Commun. 9, 93 (2018).
- (19) G. Chen, L. Zhang, W.-H. Zhang, X.-X. Peng, L. Xu, Z.-D. Liu, X.-Y. Xu, J.-S. Tang, Y.-N. Sun, D.-Y. He, J.-S. Xu, Z.-Q. Zhou, C.-F. Li, G.-C. Guo, Achieving Heisenberg-scaling precision with projective measurement on single photons. Phys. Rev. Lett. 121, 060506 (2018).
- (20) T. Sugiyama, P. S. Turner, M. Murao, Precision-guaranteed quantum tomography. Phys. Rev. Lett. 111, 160406 (2013).
- (21) S. T. Flammia, Y.-K. Liu, Direct fidelity estimation from few Pauli measurements. Phys. Rev. Lett. 106, 230501 (2011).
- (22) S. Pallister, N. Linden, A. Montanaro, Optimal verification of entangled states with local measurements. Phys. Rev. Lett. 120, 170502 (2018).
- (23) N. Gisin, Entanglement 25 years after quantum teleportation: testing joint measurements in quantum networks. arXiv:1809.10901.
- (24) N. Lütkenhaus, J. Calsamiglia, K.-A. Suominen, Bell measurements for teleportation. Phys. Rev. A 59, 3295 (1999).
- (25) L. Vaidman, N. Yoran, Methods for reliable teleportation. Phys. Rev. A 59, 116 (1999).
- (26) J. Calsamiglia, N. Lütkenhaus, Maximum efficiency of a linear-optical Bell-state analyzer. Appl. Phys. B 72, 67-71 (2001).
- (27) F. Ewert, P. van Loock, 3/4-efficient Bell measurement with passive linear optics and unentangled ancillae. Phys. Rev. Lett. 113, 140403 (2014).
- (28) A. Dimić, B. Dakić, Single-copy entanglement detection. npj Quantum Inf. 4, 11 (2018).
- (29) C. Schwemmer, L. Knips, D. Richart, H. Weinfurter, T. Moroder, M. Kleinmann, O. Gühne, Systematic errors in current quantum state tomography tools. Phys. Rev. Lett. 114, 080403 (2015).
- (30) R. Okamoto, M. Iefuji, S. Oyama, K. Yamagata, H. Imai, A. Fujiwara, S. Takeuchi, Experimental demonstration of adaptive quantum state estimation. Phys. Rev. Lett. 109, 130404 (2012).
- (31) Z. Hou, J.-F. Tang, J. Shang, H. Zhu, J. Li, Y. Yuan, K.-D. Wu, G.-Y. Xiang, C.-F. Li, G.-C. Guo, Deterministic realization of collective measurements via photonic quantum walks. Nat. Commun. 9, 1414 (2018).
Funding: This work was supported by the National Key Research and Development Program of China (Nos. 2016YFA0302700, 2017YFA0304100), National Natural Science Foundation of China (Grant Nos. 11874344, 61835004, 61327901, 11774335, 91536219, 11821404), Key Research Program of Frontier Sciences, CAS (No. QYZDY-SSW-SLH003), Anhui Initiative in Quantum Information Technologies (AHY020100, AHY060300), the Fundamental Research Funds for the Central Universities (Grant Nos. WK2030020019, WK2470000026), and the Science Foundation of the CAS (No. ZDRW-XH-2019-1). W.-H.Z. and Z.C. contributed equally to this work. Author Contributions: W.-H.Z. and G.C. planned and designed the experiment. Z.C. and X.-J.Y. proposed the framework of the theory and made the calculations. W.-H.Z. carried out the experiment assisted by P.Y., J.-S.X., S.Y., and X.-Y.X., whereas X.-X.P. designed the computer programs. G.C. and Y.-J.H. analyzed the experimental results and wrote the manuscript. G.-C.G. and C.-F.L. supervised the project. All authors discussed the experimental procedures and results. Competing interests: The authors declare that they have no competing interests. Data and materials availability: All data needed to evaluate the conclusions in the paper are present in the paper. Additional data related to this paper may be requested from the authors.