Author ORCID Identifier
https://orcid.org/0009-0003-2274-799X
Date of Award
Summer 8-27-2025
Document Type
Thesis (Ph.D.)
Department or Program
Physics and Astronomy
First Advisor
Lorenza Viola
Abstract
Quantum sensors operating at the microscale are an emerging branch of quantum technologies where tangible experimental successes have already been reported. State-of-the-art atomic interferometers make it possible to measure and estimate a variety of physical parameters with unprecedented precision. In principle, exploiting the full power of quantum mechanics would lead to quantitatively better performance bounds than the best possible classical strategies under the same set of resource constraints. However, the fragility of quantum systems to external disturbances has so far prevented most of these gains from being realized in practice, particularly in the limit of large probe number $N$. Parallel, {\em purely dephasing} noise is a ubiquitous source of decoherence in the relevant experimental platforms and proves especially difficult to mitigate. In this Thesis, we analyze the impact of pure dephasing on the \emph{asymptotic} ($N \gg 1$) accuracy and precision of quantum frequency estimation protocols.
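For orientation, the noiseless benchmarks implicitly referred to above can be summarized in standard Ramsey-interferometry notation (chosen here for illustration, and not drawn from the Thesis): with $N$ probes, single-shot interrogation time $t$, and total duration $T$,
\[
\Delta\omega_{\rm SQL} \sim \frac{1}{\sqrt{N\,t\,T}}, \qquad
\Delta\omega_{\rm HL} \sim \frac{1}{N\sqrt{t\,T}},
\]
so that entangled (e.g., GHZ-type) probes can in principle offer a $\sqrt{N}$ improvement over the standard quantum limit.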
First, we show that, in a regime of high prior knowledge, dephasing-induced loss of quantum coherence may render the standard strategy used to estimate the target frequency biased, or even ill-defined. We then provide a solution to this problem by modifying the interferometric sequence to construct an estimator which is asymptotically unbiased (accurate), with no detriment to precision, at the cost of doubling the resource overhead.
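As a minimal illustration of the mechanism (a sketch in standard single-probe Ramsey notation, not the specific protocol analyzed in the Thesis), suppose the measured population obeys
\[
p(t) = \tfrac{1}{2}\big[1 - e^{-\chi(t)}\cos(\omega t)\big],
\]
with $\chi(t)$ the dephasing exponent, while the frequency is inferred through the decoherence-free inversion $\hat{\omega} = \tfrac{1}{t}\arccos(1 - 2\hat{p})$. To leading order this yields $\tfrac{1}{t}\arccos\!\big(e^{-\chi(t)}\cos(\omega t)\big) \neq \omega$, i.e., a dephasing-induced bias, and the inversion becomes ill-conditioned once the decay nearly flattens the signal.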
Second, when the noise exhibits \emph{partial} spatial correlations, we prove that superclassical precision scaling can be restored by a clever spatial arrangement of the probes. In the absence of noise characterization, a precision intermediate between the optimal noiseless classical and quantum bounds is attained by randomizing the positions of the sensors. When access to noise spectroscopy is available, further gains may be reached by placing the sensors on a lattice with a tunable unit length. Both of these strategies fail, however, in the presence of full spatial correlations (collective noise), as the probes then effectively occupy a single position in space.
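A schematic way to picture the role of probe placement (an assumed exponential model, not necessarily the correlation functions treated in the Thesis) is through a two-point dephasing correlator
\[
C(x_i, x_j) \propto e^{-|x_i - x_j|/\xi},
\]
with correlation length $\xi$: probes separated by much more than $\xi$ dephase nearly independently, separations comparable to $\xi$ realize the partially correlated regime exploited here, and the collective limit corresponds to $\xi \to \infty$, where all positions become equivalent.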
Third, we establish rigorous, state-independent bounds on precision in the collective dephasing setting. These bounds can be saturated up to a constant factor by using a class of spin-squeezed input states and global measurements that are readily available in current platforms, regardless of the noise correlations. Importantly, while gains over the noiseless classical limit are possible for a dephasing process with a long-tailed spectrum, no quantum advantage can be reached for Markovian dephasing, or when the environment has a rapidly decaying spectrum.
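The spin-squeezed inputs referred to above are conveniently characterized by the standard Wineland squeezing parameter (stated here for context; the specific family of states used in the Thesis may be parametrized differently),
\[
\xi_R^2 \equiv \frac{N\,(\Delta J_\perp)^2}{|\langle \vec{J}\rangle|^2}, \qquad \Delta\phi = \frac{\xi_R}{\sqrt{N}},
\]
so that $\xi_R^2 < 1$ signals a metrologically useful reduction of projection noise below the coherent-spin-state (classical) level.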
Fourth, we explore whether the above limits to performance may be surpassed by including additional resources. When the signal can be encoded non-linearly, the noiseless precision bounds generalize to accommodate the entanglement buildup throughout the evolution period. We then show that, in the presence of collective dephasing, an improvement in absolute precision with respect to the usual, linear setting can be achieved. This precision bound may be saturated (up to a constant factor) by protocols using noise-robust states and collective measurements. For white noise and colored noise with a rapidly decaying spectrum, however, no gains over the non-linear noiseless classical limit are possible, hinting at a broader no-go. For both of these noise models, we turn back to the linear setting considered before, now augmented by the ability to apply instantaneous, \emph{pulsed} open-loop control throughout the evolution. We further prove that such dynamical decoupling sequences are unable to lift the bound on superclassical precision scaling, provided we additionally assume that the noise source is classical in nature. Extensions of this no-go to a quantum environment and to long-tailed spectra have not yet been established, but seem highly plausible.
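For context, the noiseless generalization alluded to at the start of this paragraph can be stated in a standard textbook-style form ($k$-body collective encoding, not the specific model of the Thesis):
\[
\Delta\omega \sim N^{-(k-1/2)} \ \ (\text{product states}), \qquad
\Delta\omega \sim N^{-k} \ \ (\text{entangled states}),
\]
which reduces to the standard-quantum-limit and Heisenberg scalings at $k=1$.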
Finally, we explore the benefits of \emph{continuous driving} as a resource to enhance interferometric performance in the presence of collective dephasing. We consider a large-$N$ limit in which the dynamics can be approximately described by a time-dependent quadratic bosonic Hamiltonian, and compute the resulting evolution. Formal expressions for the ultimate precision bounds and the optimal measurement scheme are derived. While the extent of the resulting advantage for general time-dependent driving fields remains to be fully understood, we explicitly show how the use of parametric squeezing allows us to improve the asymptotic scaling of precision with respect to the control-free scenario. It is our hope that these results pave the way towards resource-efficient, scalable, entanglement-assisted quantum metrology in realistic settings.
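A representative form of the large-$N$ effective dynamics invoked above (an illustrative Holstein-Primakoff-type sketch; the actual Hamiltonian and driving profile are derived in the Thesis) is the time-dependent quadratic bosonic model
\[
H(t) \;\approx\; \omega_0\, a^{\dagger} a \;+\; \frac{g(t)}{2}\left(a^{\dagger 2} + a^{2}\right),
\]
where $a$ is the collective bosonic mode emerging from the spin ensemble and $g(t)$ is a tunable parametric (two-photon) drive; a suitable choice of $g(t)$ generates the squeezing behind the improved precision scaling.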
Recommended Citation
Riberi, Francisco U., "ENTANGLEMENT-ASSISTED METROLOGY UNDER SPATIOTEMPORALLY CORRELATED QUANTUM NOISE" (2025). Dartmouth College Ph.D Dissertations. 425.
https://digitalcommons.dartmouth.edu/dissertations/425
