Spin Quantum Numbers of Fundamental Fermions and High-Energy Top-Quark Spin Measurements at the Large Hadron Collider
In an entangled system, the quantum state of one particle cannot be described independently of the other. The simplest example of an entangled system is a pair of quantum bits (qubits): two-level quantum systems that can exist in superposition. One of the simplest and most fundamental realizations of a qubit is the spin quantum number of a fundamental fermion. Among the fundamental fermions of the standard model of particle physics, the top quark is uniquely suited for high-energy spin measurements because of its properties: its immense mass gives it a lifetime (about 10⁻²⁵ s) notably shorter than the timescales over which the quantum numbers of a quark are shrouded by hadronization (around 10⁻²⁴ s) and spin decorrelation (approximately 10⁻²¹ s) effects (ref. 20). As a result, its spin information is transferred to its decay products, offering the chance to study a pseudo-bare quark free of the strong-interaction effects that shroud other quarks.
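To make this hierarchy of timescales concrete, the back-of-the-envelope sketch below converts an assumed top-quark decay width of roughly 1.4 GeV into a lifetime via τ = ħ/Γ and compares it with the hadronization and spin-decorrelation timescales quoted above; the width value is an approximation for illustration, not a number taken from this analysis.

```python
# Rough comparison of the timescales quoted above (values approximate).
HBAR_GEV_S = 6.582e-25          # reduced Planck constant in GeV*s
GAMMA_TOP_GEV = 1.4             # assumed top-quark decay width in GeV (approximate)

tau_top = HBAR_GEV_S / GAMMA_TOP_GEV          # top-quark lifetime, ~5e-25 s
tau_hadronization = 1e-24                     # typical hadronization timescale in s
tau_spin_decorrelation = 1e-21                # typical spin-decorrelation timescale in s

print(f"top-quark lifetime      ~ {tau_top:.1e} s")
print(f"hadronization timescale ~ {tau_hadronization:.0e} s")
print(f"spin decorrelation      ~ {tau_spin_decorrelation:.0e} s")
print("decays before hadronizing:", tau_top < tau_hadronization)
```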
The normalized differential cross-section for (t\bar{t}) production, expressed in terms of the charged-lepton directions, can be written as

$$\frac{1}{\sigma }\frac{{\rm{d}}\sigma }{{\rm{d}}{\varOmega }_{+}\,{\rm{d}}{\varOmega }_{-}}=\frac{1+{{\bf{B}}}^{+}\cdot {\widehat{{\bf{q}}}}_{+}-{{\bf{B}}}^{-}\cdot {\widehat{{\bf{q}}}}_{-}-{\widehat{{\bf{q}}}}_{+}\cdot C\cdot {\widehat{{\bf{q}}}}_{-}}{{(4{\rm{\pi }})}^{2}},$$

where ({\widehat{{\bf{q}}}}_{+}) is the antilepton direction in the rest frame of its parent top quark and ({\widehat{{\bf{q}}}}_{-}) is the lepton direction in the rest frame of its parent antitop quark; Ω+ is the solid angle associated with the antilepton and Ω− is the solid angle associated with the lepton. The vectors B+ and B− characterize the top-quark and antitop-quark polarizations, and the matrix C contains their spin correlations. These terms correspond to those appearing in the general form of the (t\bar{t}) spin density matrix,
$$\rho =\frac{1}{4}\left[{I}_{4}+\sum _{i}\left({B}_{i}^{+}\,{\sigma }^{i}\otimes {I}_{2}+{B}_{i}^{-}\,{I}_{2}\otimes {\sigma }^{i}\right)+\sum _{i,j}{C}_{ij}\,{\sigma }^{i}\otimes {\sigma }^{j}\right].$$

As the information about the polarizations and spin correlations of the short-lived top quarks is transferred to the decay leptons, their values can be extracted from a measurement of angular observables associated with these leptons, allowing us to reconstruct the (t\bar{t}) spin quantum state.
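To make the structure of ρ concrete, the sketch below (illustrative code, not part of the ATLAS analysis) assembles the 4 × 4 density matrix from example polarization vectors B± and a spin-correlation matrix C, and evaluates the entanglement marker D = tr(C)/3, for which D < −1/3 is a sufficient condition for an entangled (t\bar{t}) spin state. The numerical inputs are placeholders.

```python
import numpy as np

# Pauli matrices and 2x2 identity.
I2 = np.eye(2)
sigma = [np.array([[0, 1], [1, 0]]),
         np.array([[0, -1j], [1j, 0]]),
         np.array([[1, 0], [0, -1]])]

def spin_density_matrix(B_plus, B_minus, C):
    """rho = (I4 + sum_i [B+_i s_i x I2 + B-_i I2 x s_i] + sum_ij C_ij s_i x s_j) / 4."""
    rho = np.eye(4, dtype=complex)
    for i in range(3):
        rho += B_plus[i] * np.kron(sigma[i], I2)
        rho += B_minus[i] * np.kron(I2, sigma[i])
        for j in range(3):
            rho += C[i, j] * np.kron(sigma[i], sigma[j])
    return rho / 4.0

# Placeholder inputs: unpolarized tops with an isotropic negative spin correlation.
B_plus = np.zeros(3)
B_minus = np.zeros(3)
C = -0.5 * np.eye(3)

rho = spin_density_matrix(B_plus, B_minus, C)
D = np.trace(C).real / 3.0
print("trace(rho) =", np.trace(rho).real)   # should be 1
print("D =", D, "-> entangled" if D < -1/3 else "-> not necessarily entangled")
```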
The effect of non-relativistic corrections on the production of (t\bar{t}) events (ref. 28) in the signal and validation regions
Calibration procedures are performed in the signal region and in two validation regions to correct the data to a fiducial phase space at the particle level. All systematic uncertainties are included in all three regions. The results are consistent with expectations.
At the current accuracy of the measurement, the results in the validation regions do not allow any of the Monte Carlo setups used to be excluded. It is important to note that, close to the production threshold, non-relativistic QCD processes such as Coulomb bound-state effects affect the production of (t\bar{t}) events (ref. 28) and are not accounted for in the Monte Carlo generators. Their main effect is a change in the line shape of the (t\bar{t}) spectrum. The impact of these missing effects was tested by reweighting the Monte Carlo events and was found to be less than 1%. Systematic uncertainties in the top-quark decay (1.6%) and the top-quark mass (0.7%) change the line shape in a similar way within our experimental resolution and have a comparable or larger impact. The reweighting is not included because it would not affect the result within the quoted precision.
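The reweighting test mentioned above can be sketched schematically: each simulated event is weighted by the ratio of an assumed corrected (t\bar{t}) line shape to the nominal one near threshold, and the analysis observable is recomputed with and without the weights. Everything below (the toy spectrum, the hypothetical line-shape ratio and the stand-in observable) is invented for illustration and is not the ATLAS implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nominal m_tt spectrum near threshold (GeV); purely illustrative numbers.
m_tt = 345.0 + rng.exponential(scale=60.0, size=100_000)
observable = rng.normal(loc=-0.5, scale=0.3, size=m_tt.size)  # stand-in per-event observable

def lineshape_ratio(m, threshold=345.0, width=10.0, strength=0.05):
    """Hypothetical corrected/nominal line-shape ratio, enhancing the region just above threshold."""
    return 1.0 + strength * np.exp(-(m - threshold) / width)

weights = lineshape_ratio(m_tt)

nominal = observable.mean()
reweighted = np.average(observable, weights=weights)
print(f"nominal    = {nominal:.4f}")
print(f"reweighted = {reweighted:.4f}")
print(f"relative shift = {abs(reweighted - nominal) / abs(nominal):.2%}")
```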
A summary of the different sources of systematic uncertainty and their impact on the result is given in Table 1. The standard model prediction is calculated using POWHEG + PYTHIA, and the size of each systematic uncertainty depends on the value of D. The systematic uncertainties considered in the analysis are described in the section ‘Systematic uncertainties’.
For all detector-related uncertainties, the particle-level quantity is not affected and only the detector-level values change. For modelling uncertainties, effects at the particle level also cause shifts at the detector level. Uncertainties in the modelling of background processes affect how much background is subtracted from the data and therefore change the calibration curve. If a source of systematic uncertainty is expected to affect both the signal and background processes, it is treated as fully correlated between the signal and the background.
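As a minimal sketch of the fully correlated treatment, using placeholder yields rather than anything from the analysis: a single variation is applied coherently to the signal and background predictions, and its effect is propagated both to the background-subtracted data and to the signal prediction.

```python
# Illustrative, fully correlated treatment of one systematic source (placeholder numbers).
data = 10_000.0           # observed events
signal_mc = 9_200.0       # predicted signal yield
background_mc = 800.0     # predicted background yield

shift = 0.03              # one nuisance variation: +3%, applied coherently to signal and background

# Nominal and varied background subtraction from the data.
nominal_subtracted = data - background_mc
varied_subtracted = data - background_mc * (1.0 + shift)

# The same variation also moves the signal prediction used elsewhere in the analysis.
varied_signal_mc = signal_mc * (1.0 + shift)

print("change in background-subtracted data:", varied_subtracted - nominal_subtracted)
print("change in signal prediction:         ", varied_signal_mc - signal_mc)
```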
The pairs of particle-level and detector-level values of D are plotted in Fig. 2b. A straight line through these points forms a calibration curve, with which any measured detector-level value can be corrected to the particle level.
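A toy version of such a calibration curve is sketched below; the (detector-level, particle-level) pairs are invented, and a straight-line fit is used as a simple stand-in for the curve described above.

```python
import numpy as np

# Toy (detector-level, particle-level) pairs of the observable, e.g. from different MC variations.
detector_level = np.array([-0.35, -0.42, -0.48, -0.55, -0.61])
particle_level = np.array([-0.30, -0.38, -0.45, -0.53, -0.60])

# Straight-line calibration curve: particle = slope * detector + offset.
slope, offset = np.polyfit(detector_level, particle_level, deg=1)

def calibrate(measured_detector_value):
    """Correct a measured detector-level value to the particle level using the fitted line."""
    return slope * measured_detector_value + offset

measured = -0.5
print(f"calibrated particle-level value: {calibrate(measured):.3f}")
```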
Two processes contribute to the analysis: (t\bar{t}) production and the associated production of a single top quark with a W boson (tW). The generators used for the hard-scatter processes and for the showering are listed in the section ‘Monte Carlo simulation’. The procedure for reconstructing objects is the same for data and Monte Carlo events.
The background contribution from events with reconstructed objects that are misidentified as leptons, referred to as the ‘fake-lepton’ background, is estimated using a combination of Monte Carlo prediction and a correction based on data. This data-driven correction is obtained from a control region dominated by fake leptons, defined by the same selection criteria as above except that the two leptons must have electric charges of the same sign. The difference between the number of observed and predicted events in this region is used to derive a scale factor that is applied when evaluating the fake-lepton contribution in the signal region.
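With invented yields, one common way to implement such a correction looks like the following; subtracting the prompt-lepton prediction from the same-sign data before forming the ratio is an assumption about the exact definition, not a statement of the ATLAS procedure.

```python
# Placeholder yields in the same-sign control region.
data_ss = 1_250.0            # observed same-sign events
prompt_mc_ss = 450.0         # predicted events with two real (prompt) leptons
fake_mc_ss = 700.0           # predicted events with a misidentified (fake) lepton

# Scale factor: how much the simulation under- or over-estimates the fake-lepton contribution.
scale_factor = (data_ss - prompt_mc_ss) / fake_mc_ss

# Applied to the fake-lepton prediction in the (opposite-sign) signal region.
fake_mc_signal_region = 320.0
corrected_fakes = scale_factor * fake_mc_signal_region
print(f"scale factor = {scale_factor:.2f}, corrected fake-lepton estimate = {corrected_fakes:.0f}")
```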
The events were recorded during stable-beam conditions with all relevant detector components operational. Selected events must contain exactly one electron and one muon with electric charges of opposite sign. A minimum of two jets is required, and at least one of them must be identified as originating from a b-hadron.
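A simplified version of this selection, written as a filter over per-event records, is sketched below; the field names and event format are illustrative, not the ATLAS data model.

```python
def passes_selection(event):
    """Toy version of the event selection described above; field names are illustrative."""
    electrons = event["electrons"]
    muons = event["muons"]
    jets = event["jets"]

    # Exactly one electron and one muon with opposite-sign electric charges.
    if len(electrons) != 1 or len(muons) != 1:
        return False
    if electrons[0]["charge"] * muons[0]["charge"] >= 0:
        return False

    # At least two jets, at least one of them b-tagged.
    if len(jets) < 2:
        return False
    return any(jet["b_tagged"] for jet in jets)

example_event = {
    "electrons": [{"charge": -1}],
    "muons": [{"charge": +1}],
    "jets": [{"b_tagged": True}, {"b_tagged": False}],
}
print(passes_selection(example_event))  # True
```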
The ATLAS experiment is a multipurpose particle detector with a forward-backward symmetric cylindrical geometry and near-4π solid-angle coverage. It is used to record particles produced in LHC collisions through a combination of position and energy measurements. The coordinate system is described in the section ‘Object identification in the ATLAS detector’. The detector consists of an inner tracking detector immersed in a 2 T magnetic field, electromagnetic and hadronic calorimeters, and a muon spectrometer. The muon spectrometer surrounds the calorimeters and is based on three large superconducting air-core toroidal magnets with eight coils each, providing a field integral of between 2.0 T m and 6.0 T m across the detector. An extensive software suite is used in the reconstruction and analysis of the data and in the trigger and data-acquisition systems of the experiment. The complete dataset of pp collision events collected at a centre-of-mass energy of 13 TeV, corresponding to an integrated luminosity of 140 fb⁻¹, is used. The analysis uses the data sample recorded with single-electron or single-muon triggers (ref. 41).
In both validation regions, the measurement agrees with the predictions from the different Monte Carlo setups. This serves as a consistency check that validates the method used for the measurement.
In the signal region, the POWHEG + PYTHIA and POWHEG + HERWIG generators yield different predictions. The size of the observed difference is consistent with changing the method of shower ordering and is discussed in detail in the section ‘Parton shower and hadronization effects’.
How entanglement in top-quark pairs was observed for the first time
Scientists had little doubt that top-quark pairs can be entangled: the standard model of particle physics is built on quantum mechanics. Even so, researchers say that the latest measurement is valuable.
The fact that entanglement has not been rigorously explored at such high energies is justification enough. “People have realized that you can now start to use hadron colliders and other types of colliders for doing these tests,” Howarth says.
Part of what eventually made Afik and Muñoz de Nova’s proposal work is the top quark’s short lifetime. James Howarth, one of the physicists who took part in the analysis, says that lighter quarks wouldn’t work: quarks hate being separated, so after a mere 10⁻²³ seconds they start mixing to form hadrons. But a top quark decays quickly enough that it doesn’t have time to ‘hadronize’ and lose its spin information through mixing, Howarth says. Instead, all of that information “gets transferred to its decay particles”, he adds. By measuring the properties of those decay products, the researchers could infer the properties of the parent top quarks.
Pairs of top and antitop quarks created in the aftermath of a collision decay almost immediately into lighter particles.
Source: Quantum feat: physicists observe entangled quarks for first time
Entanglement with the Large Hadron Collider and quantum mechanics: Juan Aguilar-Saavedra
“You don’t really expect to break quantum mechanics, right?”, says Juan Aguilar-Saavedra, a theoretical physicist at the Institute of Theoretical Physics in Madrid. “Having an expected result must not prevent you from measuring things that are important.”
“It is really interesting because it’s the first time you can study entanglement at the highest possible energies obtained with the LHC,” says Giulia Negro, a particle physicist at Purdue University in West Lafayette, Indiana, who worked on the CMS analysis.