Local Control and v-Representability of Correlated Quantum Dynamics


S.E.B. Nielsen (Lundbeck Center for Theoretical Chemistry, Department of Chemistry, Aarhus University, 8000 Aarhus C, Denmark)
M. Ruggenthaler (Department of Physics, Nanoscience Center, University of Jyväskylä, 40014 Jyväskylä, Finland)
R. van Leeuwen (Department of Physics, Nanoscience Center, University of Jyväskylä, 40014 Jyväskylä, Finland; European Theoretical Spectroscopy Facility (ETSF))
July 15, 2019

We present a local control scheme to construct the external potential that, for a given initial state, produces a prescribed time-dependent density in an interacting quantum many-body system. The numerical method is efficient and stable even for large and rapid density variations irrespective of the initial state and the interactions. The method can at the same time be used to answer fundamental $v$-representability questions in density-functional theory. In particular, in the absence of interactions, it allows us to construct the exact time-dependent Kohn-Sham potential for arbitrary initial states. We illustrate the method in a correlated one-dimensional two-electron system with different interactions, initial states and densities. For a Kohn-Sham system with a correlated initial state we demonstrate the interplay between memory and initial-state dependence as well as the failure of any adiabatic approximation.

PACS numbers: 31.15.ee, 32.80.Qk, 71.15.Mb

The $v$-representability question is one of the outstanding problems in density functional theory (DFT) [DFT; Chayes1985; Lammert2010]. In time-dependent DFT (TDDFT) [RvL1999; Baer2008; Li2008; Verdozzi2008; KurthS2011; CarstenBook; GFPP; Farzanehpour2012], the question is whether, for a given initial state, there exists a local external potential that yields a prescribed density by solution of the time-dependent Schrödinger equation (TDSE). In the case of a non-interacting system, $v$-representability amounts to the existence of a Kohn-Sham (KS) system [DFT; CarstenBook]. The KS system has played a major role in the study of correlated many-body systems as it allows for the treatment of interacting systems in an effective one-particle framework. This feature greatly reduces computational costs [Andrade2012] and (TD)DFT has hence been one of the leading methods in electronic structure theory [Burke2012]. In practice the accuracy of the method is limited by the approximate nature of the density functionals that are used. In TDDFT the most commonly used density functionals are based on the adiabatic approximation, in which the KS potential depends only on the instantaneous density. These functionals can, however, fail in important cases [CarstenBook], and there is therefore a great need for better functionals. To develop and benchmark such new functionals the availability of exact time-dependent KS potentials is highly desirable. Although such potentials can be constructed in special cases [Lein2005; Godby2012], no general practical scheme has been available so far. In this Letter we provide such a scheme, based on a recently introduced fixed-point formulation of TDDFT [GFPP; GFPP 1-D].
It is at the same time an efficient local control scheme based on the density, which augments other important control methods [OCT1; OCT2; LCT1; LCT2] already in extensive use in laser physics, quantum optics [Mancini2005] and the physics of ultracold gases [CHU02]. The scheme is closely related to existing methods [LCT1; LCT2] but targets a spatially extended quantity instead. It is applicable to general interactions and initial states and can deal with fast and large density changes. We demonstrate the approach for an interacting one-dimensional two-electron system with different interactions, initial states and densities. For a KS system with a non-separable initial state we illustrate the connection between memory and initial-state dependence as well as the failure of any adiabatic approximation.

The global fixed-point method. We consider an $N$-electron system with a time-dependent Hamiltonian $\hat{H}(t) = \hat{T} + \hat{V}(t) + \hat{W}$, where $\hat{T}$ is the kinetic energy, $\hat{V}(t) = \int d\mathbf{r}\, v(\mathbf{r},t)\,\hat{n}(\mathbf{r})$ the time-dependent external potential and $\hat{W}$ the many-body interaction (which may even be time-dependent). The expectation values $n(\mathbf{r},t)$ and $\mathbf{j}(\mathbf{r},t)$ of the density and current operators (atomic units are used throughout)

$$\hat{n}(\mathbf{r}) = \sum_{i=1}^{N} \delta(\mathbf{r}-\hat{\mathbf{r}}_i), \qquad \hat{\mathbf{j}}(\mathbf{r}) = \frac{1}{2}\sum_{i=1}^{N} \left\{ \delta(\mathbf{r}-\hat{\mathbf{r}}_i),\, \hat{\mathbf{p}}_i \right\}$$
satisfy equations of motion given by

$$\partial_t n(\mathbf{r},t) = -\nabla\cdot \mathbf{j}(\mathbf{r},t), \qquad (1)$$
$$\partial_t \mathbf{j}(\mathbf{r},t) = -n(\mathbf{r},t)\nabla v(\mathbf{r},t) + \mathbf{q}(\mathbf{r},t). \qquad (2)$$
Here the internal local force $\mathbf{q}(\mathbf{r},t)$ is defined by

$$\mathbf{q}(\mathbf{r},t) = -i\,\langle \Psi(t)|\,[\hat{\mathbf{j}}(\mathbf{r}),\, \hat{T}+\hat{W}\,]\,|\Psi(t)\rangle,$$
where $\Psi(t)$ is the time-dependent many-body state obtained from the TDSE with potential $v$ and given initial state $\Psi_0$. Eqs. (1) and (2) imply

$$\partial_t^2 n(\mathbf{r},t) = \nabla\cdot\left[n(\mathbf{r},t)\nabla v(\mathbf{r},t)\right] - \nabla\cdot \mathbf{q}([v];\mathbf{r},t), \qquad (3)$$
where $\mathbf{q}([v];\mathbf{r},t)$ is regarded as a functional of $v$ through the state $\Psi([v];t)$. For a fixed density $n$ and initial state $\Psi_0$ this is an implicit equation for the potential. To solve it we define a sequence of potentials $v_k$ by the iterative solution of

$$\nabla\cdot\left[n(\mathbf{r},t)\nabla v_{k+1}(\mathbf{r},t)\right] = \partial_t^2 n(\mathbf{r},t) + \nabla\cdot \mathbf{q}([v_k];\mathbf{r},t).$$
In previous works [GFPP; GFPP 1-D] we proved, for general initial states and interactions, that under mild restrictions on the density the sequence $v_k$ converges, in the sense of a Banach-space norm, to a potential $v[n,\Psi_0]$ that is both a fixed point of the equation and produces the prescribed density $n$.
Although the fixed-point method itself is well-defined, it is highly non-trivial to develop a stable numerical algorithm. To do this we found it advantageous to make explicit use of the current as well (still a functional of the density). This is most easily done for one-dimensional systems, since the continuity equation (1) can be integrated analytically. We find $j(x,t) = j([n];x,t)$, where

$$j([n];x,t) = j(x_0,t) - \int_{x_0}^{x} dy\, \partial_t n(y,t), \qquad (4)$$
and where $x_0$ is an arbitrary reference point. For this reason, and for simplicity of presentation, we restrict ourselves to the one-dimensional case in this Letter. We first show how we can eliminate the internal force $q$ from our equations. By integrating Eq. (3) and using Eq. (4) we obtain

$$n(x,t)\,\partial_x v_{k+1}(x,t) = q([v_k];x,t) - \partial_t j([n];x,t) + a(t),$$
where $a(t)$ is an integration constant. From Eq. (2) for a system with potential $v_k$ we then find

$$n(x,t)\,\partial_x v_{k+1}(x,t) = n([v_k];x,t)\,\partial_x v_k(x,t) + \partial_t j([v_k];x,t) - \partial_t j([n];x,t) + a(t). \qquad (5)$$
To obtain an equation that depends on densities only we can use Eq. (1) and Eq. (4) to find

$$n(x,t)\,\partial_x v_{k+1}(x,t) = n([v_k];x,t)\,\partial_x v_k(x,t) + \int_{x_0}^{x} dy\,\left[\partial_t^2 n(y,t) - \partial_t^2 n([v_k];y,t)\right] + b(t), \qquad (6)$$
where $b(t)$ is a new constant. While mathematically equivalent, Eqs. (5) and (6) are not numerically equivalent, as their discretizations on a space-time grid generally differ. In practice it is therefore advantageous to use

$$n\,\partial_x v_{k+1} = n([v_k])\,\partial_x v_k + (1-\mu)\left[\partial_t j([v_k]) - \partial_t j([n])\right] + \mu \int_{x_0}^{x} dy\,\left[\partial_t^2 n - \partial_t^2 n([v_k])\right] + c(t), \qquad (7)$$

as follows immediately by multiplying Eq. (5) by $1-\mu$ and Eq. (6) by $\mu$ and adding the results. Here $\mu$ is a parameter at our disposal and $c(t)$ is a new constant. This equation defines an iterative procedure to determine $v_{k+1}$ from $v_k$. The constant $c(t)$ in this equation is uniquely determined by the spatial boundary conditions on $v_{k+1}$ (and hence depends on $k$). When $v_k \to v$ then $n([v_k]) \to n$ and $j([v_k]) \to j([n])$, and Eq. (7) implies that $v_{k+1} \to v$. Since after convergence we also obtain the many-body state $\Psi([v];t)$, we can calculate any observable, and in particular the current $j$. This is an explicit realization of the Runge-Gross result [Runge-Gross] that any observable is a functional of the density and the initial state.
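The analytic elimination of the current in favor of the density is easy to mirror numerically. The following sketch (Python with NumPy; the function name and the trapezoidal discretization are our own illustrative choices, not taken from the paper) integrates the 1D continuity equation on a uniform grid to recover the current from the time derivative of the density, in the spirit of Eq. (4).

```python
import numpy as np

def current_from_density(dn_dt, dx, j0=0.0):
    """Integrate the 1D continuity equation dj/dx = -dn/dt on a uniform grid.

    dn_dt : time derivative of the density on the grid
    dx    : grid spacing
    j0    : current at the reference point x0 (taken as the first grid point)
    Returns j(x) = j0 - cumulative integral of dn_dt from x0 to x.
    """
    # cumulative trapezoidal integral of dn_dt, starting at zero
    integral = np.concatenate(
        ([0.0], np.cumsum(0.5 * (dn_dt[1:] + dn_dt[:-1]) * dx)))
    return j0 - integral
```

For a rigidly translating density $n(x,t) = n_0(x - st)$ one has $\partial_t n = -s\,\partial_x n$, so the recovered current is $j(x) = s\,n(x)$ once $j(x_0)$ is chosen accordingly.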
Numerical procedure. The iterative method based on Eq. (7) should be implemented stepwise in time for high efficiency. We use a midpoint-based time-stepping method which uses the midpoint potentials $v(x,t_{j+1/2})$ to propagate the wave function on a time grid with time points $t_j = j\Delta t$. For this we implemented the split-operator and Lanczos methods [Leforestier1991]. Let us now suppose that we have obtained $v(x,t)$ for $t \le t_j$, and hence the state $\Psi(t_j)$ giving the required density for $t \le t_j$. Then, to determine $v(x,t_{j+1/2})$, and hence $\Psi(t_{j+1})$, we define an iterative procedure in which we guess an initial potential $v_0$ and loop over potentials $v_k$ until we converge to the desired $v$:

  1. Use $v_k(x,t_{j+1/2})$ to calculate $\Psi_k(t_{j+1})$ from $\Psi(t_j)$ by time-stepping.

  2. From $\Psi_k(t_{j+1})$ calculate $n([v_k];x,t_{j+1})$ and $j([v_k];x,t_{j+1})$.

  3. Calculate $v_{k+1}(x,t_{j+1/2})$ from

$$n\,\partial_x v_{k+1} = n([v_k])\,\partial_x v_k + \mu_1\,\Delta j_k + \mu_2 \int_{x_0}^{x} dy\, \Delta n_k(y) + c, \qquad (8)$$

    where $\Delta j_k(x) = j([v_k];x,t_{j+1}) - j([n];x,t_{j+1})$ and $\Delta n_k(y) = n(y,t_{j+1}) - n([v_k];y,t_{j+1})$.

Eq. (8) is obtained from Eq. (7) by a discretization w.r.t. time using only times $t \le t_{j+1}$ for the derivatives. We further used the fact that we have already converged up to time $t_j$, and replaced $n([v_k])$ by the prescribed density $n$ on the left-hand side of the equation, since we found that this does not affect the convergence. The constants $\mu_1$ and $\mu_2$ depend on the discretization scheme and the $\mu$ of Eq. (7), which effectively leaves the choice of their values at our disposal. The constant $c$ in Eq. (8) depends on the boundary conditions and hence the geometry of the system. Below we will present examples for a periodic system. In that case the constant $c$ is determined by the periodicity condition on the spatial interval $[0,L]$. This yields the condition
$$\int_0^L dx\, \partial_x v_{k+1}(x,t_{j+1/2}) = 0. \qquad (9)$$
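The three-step loop above can be sketched in code. The following single-particle illustration (Python with NumPy) is our own minimal sketch, not the authors' implementation: the spectral derivatives, the grid, and the values of $\mu_1$ and $\mu_2$ passed by the caller are illustrative assumptions. The update implements an Eq. (8)-type rule with the constant fixed by the periodicity condition.

```python
import numpy as np

def split_operator_step(psi, v_mid, dt, dx):
    """One split-operator step exp(-iV dt/2) exp(-iT dt) exp(-iV dt/2)
    for a single particle on a periodic grid (atomic units, T = p^2/2)."""
    k = 2 * np.pi * np.fft.fftfreq(psi.size, d=dx)   # momentum grid
    psi = np.exp(-0.5j * dt * v_mid) * psi           # half potential step
    psi = np.fft.ifft(np.exp(-0.5j * dt * k**2) * np.fft.fft(psi))
    return np.exp(-0.5j * dt * v_mid) * psi          # half potential step

def density_and_current(psi, dx):
    """n = |psi|^2 and j = Im(psi^* dpsi/dx), derivative taken spectrally."""
    k = 2 * np.pi * np.fft.fftfreq(psi.size, d=dx)
    dpsi = np.fft.ifft(1j * k * np.fft.fft(psi))
    return np.abs(psi)**2, np.imag(np.conj(psi) * dpsi)

def cumint(f, dx):
    """Cumulative trapezoidal integral from the first grid point."""
    return np.concatenate(([0.0], np.cumsum(0.5 * (f[1:] + f[:-1]) * dx)))

def potential_update(v_k, n, j, n_k, j_k, dx, mu1, mu2):
    """Eq. (8)-type update: solve
    n dv_{k+1}/dx = n_k dv_k/dx + mu1 (j_k - j) + mu2 cumint(n - n_k) + c
    for v_{k+1}, with c fixed by the periodicity condition (the integral
    of dv_{k+1}/dx over the ring must vanish)."""
    kgrid = 2 * np.pi * np.fft.fftfreq(v_k.size, d=dx)
    dv_k = np.fft.ifft(1j * kgrid * np.fft.fft(v_k)).real
    rhs = n_k * dv_k + mu1 * (j_k - j) + mu2 * cumint(n - n_k, dx)
    c = -np.sum(rhs / n) / np.sum(1.0 / n)  # enforces periodicity
    return cumint((rhs + c) / n, dx)        # v_{k+1}, up to a constant
```

One loop iteration then reads: propagate with $v_k$ (step 1), evaluate $n([v_k])$ and $j([v_k])$ from the propagated state (step 2), and call `potential_update` (step 3). Note that when $n([v_k]) = n$ and $j([v_k]) = j([n])$ the update reproduces $v_k$, i.e. the converged potential is a fixed point.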
During time propagation numerical errors can build up in the modulus and phase of the wave function. The density is determined by the modulus only, while the current also depends on the phase. This implies, for example, that errors in the phase can lead to inaccurate currents while still producing an almost correct density. This tends to happen when $\mu_1 = 0$, since in that case the procedure enforces the correct density without constraints on the current, as can be seen from Eq. (8). The opposite happens when $\mu_2 = 0$. By taking nonzero values for both $\mu_1$ and $\mu_2$ we control the accuracy of both the density and the current, and hence the modulus and phase of the wave function. This suffices to stabilize the algorithm in most cases. Perfect stability is obtained by spatially smoothening the potentials as they are obtained. Without smoothening the best algorithm is obtained when one of the two constants dominates while both remain nonzero; with smoothening the precise values are less important. We find that a few iterations generally suffice to converge and that the precision of the potential is limited mainly by the time-stepping method for the wave function (assuming a sufficient spatial resolution). By increasing the precision thereof, almost arbitrary precision can be achieved even when the density changes by orders of magnitude.
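One simple way to realize the spatial smoothening mentioned above is a Fourier low-pass filter on the periodic grid. The cutoff below is an illustrative choice of ours, not the one used in the paper.

```python
import numpy as np

def smooth_periodic(v, max_mode):
    """Smoothen a potential on a periodic grid by discarding all Fourier
    components with mode index above max_mode (an illustrative cutoff)."""
    vk = np.fft.fft(v)
    modes = np.abs(np.fft.fftfreq(v.size) * v.size)  # integer mode indices
    vk[modes > max_mode] = 0.0                       # zero high modes
    return np.fft.ifft(vk).real
```

A potential that already contains only low modes passes through unchanged, while high-frequency numerical noise is suppressed.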

Figure 1: (color online) The potentials $v_1$ and $v_2$ that produce the prescribed densities $n_1$ and $n_2$ (insets). Panels (a) and (b): the non-interacting case; panels (c) and (d): the interacting case. Note that in (a) we plotted minus the potential for better visibility.

Translating and splitting a given density. To illustrate the algorithm we consider two electrons on a quantum ring of length $L$ over a time period of length $T$. We start by calculating the singlet ground state $\Psi_0$ (which has a spatially symmetric wave function) of a (properly periodic) Hamiltonian with external potential $v_0(x)$ and two-body interaction of strength $\lambda$. The ground-state density is denoted by $n_0(x)$. We then construct the (spatially periodic) time-dependent densities $n_1(x,t)$ and $n_2(x,t)$ by

$$n_1(x,t) = n_0(x - Lt/T), \qquad n_2(x,t) = \tfrac{1}{2}\left[n_0(x - Lt/T) + n_0(x + Lt/T)\right].$$
The density $n_1$ describes a situation where the initial density is rigidly translated around the ring exactly once, whereas $n_2$ describes a situation where the initial density is split into equal halves that are rigidly translated in opposite directions so as to rejoin at times $T/2$ and $T$. We have used our algorithm to calculate the potentials $v_1$ and $v_2$ that produce these prescribed densities via time propagation of the initial state $\Psi_0$ by the TDSE. This was done both without interaction ($\lambda = 0$) and with interaction. In Fig. 1 we present the corresponding potentials and densities (insets). We see large differences in the potentials for the interacting case (panels (c) and (d)) as compared to the non-interacting case (panels (a) and (b)). The convergence of our algorithm shows that the prescribed densities are indeed $v$-representable and that the algorithm can be used for density changes of orders of magnitude.
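Target densities of this kind (a rigid translation once around the ring, and a split into counter-propagating halves that rejoin at $T/2$ and $T$) are conveniently generated on a periodic grid by Fourier shifts. The parameterization below, with constant translation speed $L/T$, is our own assumption consistent with the description in the text, not necessarily the exact profile used for the figures.

```python
import numpy as np

def translate_periodic(n0, shift, dx):
    """Rigidly translate a periodic density by `shift` via Fourier phases."""
    k = 2 * np.pi * np.fft.fftfreq(n0.size, d=dx)
    return np.fft.ifft(np.exp(-1j * k * shift) * np.fft.fft(n0)).real

def n1(n0, t, T, L, dx):
    """Initial density translated exactly once around the ring during [0, T]."""
    return translate_periodic(n0, L * t / T, dx)

def n2(n0, t, T, L, dx):
    """Initial density split into equal halves moving in opposite directions;
    the halves rejoin at t = T/2 (opposite side) and t = T (starting point)."""
    return 0.5 * (translate_periodic(n0, L * t / T, dx)
                  + translate_periodic(n0, -L * t / T, dx))
```

Both constructions conserve the particle number, and $n_1$ and $n_2$ return to $n_0$ at $t = T$ by construction.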

Figure 2: (color online) The explicitly time-dependent KS potential $v_s(x,t)$ that keeps the density (inset) static for the correlated initial state $\Psi_0$. We stress that the potential is not periodic in time.
Figure 3: (color online) Four snapshots of the square of the KS wave function at times where the KS potential in Fig. 2 is extremal. Note that the electrons are well separated at some of these times but are confined to the same region at others.

Exact KS potential for a non-separable initial state. As a second example we construct an exact KS system, i.e., a non-interacting system having the same time-dependent density as that of an interacting reference system. For the KS system we also need to specify an initial state $\Phi_0$ with the correct initial density. This state does not need to be the KS ground state (the ground state of a non-interacting system with the given density), as the Runge-Gross theorem [Runge-Gross] allows for general initial states (see [ElliotMaitra] for further discussion). Here we take the KS initial state to be identical to the true correlated ground state $\Psi_0$ of the interacting system. As the interacting reference system we consider a system forever kept in the interacting ground state of the previous example. The density is therefore stationary and equal to $n_0$. Since $\Psi_0$ is not an eigenstate of a non-interacting system, the KS state and potential will in general be time-dependent, but in such a way that they still produce the static density $n_0$. We denote the KS potential by $v_s(x,t)$; the KS Hamiltonian is thus given by



with . In this case the KS-potential is static and given by
$$v_s(x) = \frac{\partial_x^2 \sqrt{n_0(x)}}{2\sqrt{n_0(x)}}$$

up to an arbitrary constant. In this case the exact KS potential is static, as would also have been predicted by any adiabatic approximation. This explicitly demonstrates the interplay between memory and initial-state dependence [Maitra2002; MaitraBook].
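For a doubly occupied real orbital $\varphi = \sqrt{n_0/2}$, the static KS potential $v_s = \partial_x^2\sqrt{n_0}\,/\,(2\sqrt{n_0})$ (up to a constant) is straightforward to evaluate numerically. The sketch below is our own illustration with spectral derivatives on a periodic grid; it can be checked against an orbital whose potential is known in closed form.

```python
import numpy as np

def static_ks_potential(n0, dx):
    """Static KS potential v_s = (d^2/dx^2 sqrt(n0)) / (2 sqrt(n0)),
    valid for a doubly occupied real orbital sqrt(n0/2) on a periodic grid.
    The constant offset is left unfixed, as in the text."""
    k = 2 * np.pi * np.fft.fftfreq(n0.size, d=dx)
    s = np.sqrt(n0)
    # spectral second derivative of sqrt(n0)
    d2s = np.fft.ifft(-(k**2) * np.fft.fft(s)).real
    return d2s / (2 * s)
```

As a check, the orbital $\varphi(x) = e^{a\cos(bx)}$ with $b = 2\pi/L$ has $\varphi''/(2\varphi) = \tfrac{1}{2}\left[a^2 b^2 \sin^2(bx) - a b^2 \cos(bx)\right]$ exactly, which the numerical result reproduces to high accuracy.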
Outlook. We presented a stable and fast algorithm to construct the external potential that, for a given initial state, produces a prescribed time-dependent density in an interacting many-body system. The method will be valuable for further development of density functionals and local control theory. Especially exciting is the possibility to use more advanced (multi-configurational) initial states in DFT in combination with existing and new approximate functionals and to test them using our benchmarking algorithm. This can open up new possibilities for the study of strongly correlated systems within a DFT framework.

Acknowledgement. S.E.B.N. acknowledges support from the Lundbeck Foundation. M.R. acknowledges support by the Erwin Schrödinger Fellowship J 3016-N16 of the FWF (Austrian Science Fund). We further thank Prof. J. Olsen for valuable discussions.

