Spectral Density Estimation with the Gaussian Integral Transform

The spectral density operator ρ(ω) = δ(ω − H) plays a central role in linear response theory, as its expectation value, the dynamical response function, can be used to compute scattering cross-sections. In this work, we describe a near-optimal quantum algorithm providing an approximation to the spectral density with energy resolution Δ and error ϵ using O(√(log(1/ϵ)(log(1/Δ)+log(1/ϵ)))/Δ) operations. This is achieved without using expensive approximations of the time-evolution operator, exploiting instead qubitization to implement an approximate Gaussian Integral Transform (GIT) of the spectral density. We also describe appropriate error metrics to assess the quality of spectral function approximations more generally.
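For intuition, the sketch below (an illustrative assumption, not part of the paper) shows classically what the GIT produces: each delta peak δ(ω − E_j) in the spectral density is replaced by a Gaussian of width set by the resolution Δ. The function name `gaussian_spectral_density` and the use of exact diagonalization are choices made here for illustration; the paper's algorithm obtains this transform on a quantum computer via qubitization.

```python
import numpy as np

def gaussian_spectral_density(H, omegas, sigma):
    """Gaussian-broadened spectral density (classical reference sketch):
    rho_sigma(omega) = sum_j exp(-(omega - E_j)^2 / (2 sigma^2)) / sqrt(2 pi sigma^2),
    i.e. a Gaussian smoothing of rho(omega) = delta(omega - H).
    Uses exact diagonalization for illustration only."""
    energies = np.linalg.eigvalsh(H)              # eigenvalues E_j of H
    diffs = omegas[:, None] - energies[None, :]   # grid of (omega - E_j)
    kernel = np.exp(-diffs**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    return kernel.sum(axis=1)                     # broadened density at each omega

# Example: random 4x4 Hermitian "Hamiltonian", resolution sigma playing the role of Delta
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
H = (A + A.T) / 2
omegas = np.linspace(-4.0, 4.0, 200)
rho = gaussian_spectral_density(H, omegas, sigma=0.1)
```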