The 2022 international workshop on the high energy Circular Electron-Positron Collider (CEPC) will take place on October 24-28, 2022. The workshop will be held mainly online, hosted jointly by Nanjing University and IHEP. Meeting rooms will be provided at IHEP for local participants.
The workshop intends to gather scientists from around the world to study the physics potential of the CEPC, pursue international collaborations for accelerator and detector optimization, deepen R&D on critical technologies, and develop initial plans towards the Technical Design Reports (TDRs). The high energy Super proton-proton Collider (SppC), a possible upgrade of the CEPC, will also be discussed. Furthermore, industrial partnerships for technology R&D and preparations for the industrialization of the CEPC-SppC will be explored.
The workshop program consists of plenary, parallel and poster sessions. Presentations include talks selected by the conveners from the abstract submissions as well as invited talks. The workshop encourages participation, especially from graduate students and junior postdocs. Top posters will receive awards, selected through a vote by the SPC members, the conveners and the local organizers.
Driven by the physics program of precision measurements of the properties of the Higgs boson, the W and Z bosons, as well as the top quark, future lepton colliders require unprecedented jet energy resolution from their calorimetry systems. Based on the particle-flow paradigm, a novel highly granular crystal electromagnetic calorimeter (ECAL) with excellent three-dimensional spatial resolution as well as good energy and time resolution is proposed to address major challenges in jet reconstruction and to achieve an optimal EM energy resolution of around 2–3%/√E(GeV) with a homogeneous structure. Comprehensive R&D efforts have been carried out to evaluate the potential and requirements of the crystal ECAL, from the sensitive detection units to the full sub-detector system. The requirements on crystal candidates, photon sensors, and readout electronics are parametrized and quantified in Geant4 full simulation. Hardware R&D activities on the crystals and silicon photomultipliers (SiPMs) are performed to characterize the typical response of basic detector units and to improve the simulation. The physics performance of the crystal ECAL has been studied with the particle-flow algorithm “ArborPFA”, which is being optimized. Moreover, the development of small-scale detector modules is underway for future beam tests to study the performance for EM showers.
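For reference, the quoted EM resolution corresponds to the standard calorimeter parametrization, in which the stochastic term dominates for a homogeneous crystal design (the constant term b below is a generic placeholder, not a value from the study):

\[
\frac{\sigma_E}{E} \;=\; \frac{a}{\sqrt{E\,(\mathrm{GeV})}} \,\oplus\, b, \qquad a \approx 2\text{--}3\%,
\]

where ⊕ denotes addition in quadrature.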
A highly granular electromagnetic calorimeter has been designed within the CALICE collaboration for precision measurements of Higgs and electroweak physics at future lepton collider experiments, including the Circular Electron Positron Collider (CEPC). Scintillator strips read out by silicon photomultipliers (SiPMs) are instrumented as the sensitive layers, with tungsten-copper alloy plates as the absorber. The scintillator strips are individually wrapped with ESR foil and directly coupled to the SiPMs. A prototype with 32 sampling layers and over 6700 channels (around 600 × 600 × 400 mm³ in dimensions) was constructed and commissioned in 2020, followed by long-term cosmic-ray tests in 2021 for quantitative studies of the key performance. There will be a dedicated beam test at the CERN SPS in October 2022. This talk will cover key aspects of the prototype development and commissioning, as well as selected results of the cosmic-ray tests. The latest status of the CERN beam test will also be presented.
Based on the particle-flow paradigm, a novel hadronic calorimeter (HCAL) with high granularity is proposed to address major challenges in precision measurements of jets at future lepton collider experiments, such as the Circular Electron Positron Collider (CEPC). Compared with the baseline designs, a new design scheme based on glass scintillator (GS-HCAL) aims for further significant improvements of the hadronic energy resolution as well as the particle-flow performance, especially in the low energy region (typically below 10 GeV for major jet components), with a notable increase of the energy sampling fraction and hadronic response compensation due to its high density and doping with neutron-sensitive elements. An R&D group has been established to promote the investigation of high-performance glass scintillators with a density up to 6 g/cm³ and a light yield of 1500 ph/MeV. Simultaneously, the physics benchmark potential of the GS-HCAL in an optimized setup is explored in the CEPC software framework and in standalone simulation. In this contribution, the latest R&D progress on glass scintillators and the optimization of key parameters will be presented.
We discuss the time spectra of showers from photons, muons, and charged pions, simulated in the CEPC electromagnetic calorimeter (ECAL). We present an algorithm for timing reconstruction in highly granular calorimeters (HGC). Assuming the intrinsic hit time resolution measured by the CMS collaboration is achievable, the particle time-of-flight (ToF) can be measured with a resolution of 5–20 ps for electromagnetic (EM) showers and 80–160 ps for hadronic showers above 1 GeV. The ToF resolution depends linearly on the timing resolution of a single silicon sensor and improves statistically with increasing incident particle energy. A clustering algorithm that vetoes isolated hits further improves the ToF resolution. In addition, hadronic showers include extremely slow components. In Z → qq̄ events, around 1% (10%) of the ECAL (HCAL) energy is deposited after one microsecond, which may leak out of the triggering window of the corresponding event and pile up into subsequent events.
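For orientation, the statistical scaling described above can be summarized as (a schematic form consistent with the abstract, not an exact result from the talk)

\[
\sigma_{\mathrm{ToF}} \;\approx\; \frac{c\,\sigma_{\mathrm{hit}}}{\sqrt{N_{\mathrm{hits}}(E)}},
\]

where σ_hit is the single-sensor timing resolution, N_hits(E) grows with the incident particle energy, and c accounts for the hit selection.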
The future Circular Electron-Positron Collider (CEPC) is a large-scale experimental facility which aims to precisely measure the Higgs boson, electroweak physics and flavor physics. For the CEPC detector system, a highly granular crystal electromagnetic calorimeter is proposed to achieve an EM energy resolution of better than 3%. It is a homogeneous structure with long crystal scintillator bars as the active material. The energy deposited in one crystal bar ranges from about 500 keV to 10 GeV. The SiPM, as the preferred photon detector in the crystal-bar ECAL, should cover a dynamic range of at least 50000 photons. The response calibration of SiPMs over such a large dynamic range is challenging. We have developed an experiment which uses a laser as the light source and a PMT as the reference. By adjusting the bias voltage, we expanded the linear region of the PMT to cover the whole response range of the SiPMs. The final response curves are reasonable, and we have built a simulation model to describe them. Improvements to this experiment are ongoing, including the design of a large-dynamic-range PMT and the optimization of the SiPM electronics and the light source.
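As a minimal sketch of why the calibration is challenging, SiPM response saturates as the number of incident photons approaches the pixel count. A common single-parameter saturation model, fitted here to toy data (the pixel count, efficiency factor and noise level are illustrative assumptions, not values from the study):

import numpy as np
from scipy.optimize import curve_fit

# Common SiPM saturation model: N_fired = N_pix * (1 - exp(-k * N_photon / N_pix)),
# where k lumps together the photon detection efficiency.
def sipm_response(n_photon, n_pix, k):
    return n_pix * (1.0 - np.exp(-k * n_photon / n_pix))

# Toy calibration data (made up for illustration): photons injected vs. pixels fired,
# assuming e.g. a 6x6 mm^2, 25 um-pitch device (240 x 240 = 57600 pixels).
n_photon = np.logspace(1, 5, 50)
truth = sipm_response(n_photon, n_pix=57600, k=0.3)
measured = truth * np.random.normal(1.0, 0.02, truth.shape)  # 2% smearing

# Recover the model parameters from the measured response curve.
popt, _ = curve_fit(sipm_response, n_photon, measured, p0=[5e4, 0.5])
print("fitted N_pix = %.0f, k = %.3f" % tuple(popt))

In practice the measured curve over the full dynamic range would replace the toy data, with the PMT providing the linear light-intensity reference.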
Parton showers are ubiquitous theoretical tools in collider physics, providing a crucial link between theory and experiment. In this talk I will give a general introduction to final-state parton showers and review recent developments in understanding and improving their accuracy, all within the context of the PanScales collaboration. One limitation of most parton showers is that they are inherently classical rather than quantum mechanical. Since the fundamental particles of Quantum Chromodynamics carry both spin and colour, any faithful description of collider events must include quantum interference effects due to both. I will therefore discuss in more detail how to incorporate quantum interference effects in parton showers, focusing mainly on spin correlations, and how to potentially measure them at colliders like the CEPC.
I illustrate recent findings regarding the structure of linear power corrections to shape variables in e+e- annihilation. These new findings hint at the possibility of estimating linear power corrections directly in the three-jet region, rather than extrapolating them from the two-jet region. Some prospects for the measurement of $\alpha_s$ from shape variables at future e+e- colliders are discussed.
Scattering amplitudes provide theoretical descriptions of the hard interactions taking place at collider experiments. In practice, their computation must be performed within the confines of perturbation theory, to the order of approximation demanded by experiment. When considering interactions that involve many, possibly heavy, particles, the computation of the associated scattering amplitude becomes demanding. In this talk, we discuss the problems involved in frontier multi-scale amplitude computations at the two-loop level, and the modern techniques currently used to tackle them. We will discuss the state of the art at contemporary hadron colliders, and prospects for computations at future lepton colliders.
Identifying the flavour of reconstructed hadronic jets is critical for precision phenomenology and the search for new physics at colliders, as it allows one to pinpoint specific scattering processes and reject backgrounds. We propose a new approach to defining the flavour of jets, a flavour dressing algorithm, which is infrared and collinear safe and can be combined with any definition of a jet. We test the algorithm in 𝑒+𝑒− and 𝑝𝑝 environments and consider some practical applications.
The poster session will use the “Gather Town” virtual meeting room. Please click the following link to enter the room.
Gather Town
We give a comprehensive analysis of event shape observables in electron-positron annihilation using the Principle of Maximum Conformality (PMC), a rigorous scale-setting method that eliminates the renormalization scheme and scale ambiguities in perturbative QCD predictions. Conventionally, the renormalization scale is simply fixed to the center-of-mass energy $\sqrt{s}$, and only one value of the QCD coupling at the single scale $\sqrt{s}$ can be extracted from event shape observables. The PMC renormalization scales are determined by absorbing the non-conformal contributions. The resulting PMC scales change with the event shape kinematics, reflecting the virtuality of the underlying quark and gluon subprocess. The PMC scales thus yield the correct physical behavior of the scale, and the PMC predictions agree with precise experimental measurements. More importantly, we can precisely determine the running of the QCD coupling constant $\alpha_s(Q^2)$ over a wide range of $Q^2$ in the perturbative domain from event shape distributions measured at a single center-of-mass energy $\sqrt{s}$.
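For reference, couplings extracted at different PMC scales are connected, at leading order, by the one-loop running (a textbook relation quoted here for orientation, not taken from the talk):

\[
\alpha_s(Q^2) \;=\; \frac{\alpha_s(\mu^2)}{1 + \beta_0\,\dfrac{\alpha_s(\mu^2)}{4\pi}\,\ln\dfrac{Q^2}{\mu^2}}, \qquad \beta_0 = 11 - \tfrac{2}{3}\,n_f,
\]

where $n_f$ is the number of active quark flavours.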
We compute the total cross section of heavy-quark-pair production in 𝑒+𝑒− annihilation mediated by a virtual photon at the next-to-next-to-next-to-leading order (NNNLO) in Quantum Chromodynamics. The result is expressed as a piecewise function defined by several deeply expanded power series, and it significantly reduces the theoretical uncertainty. For example, at a collision energy of 500 GeV, the scale dependence is reduced from 0.72% at the next-to-next-to-leading order (NNLO) to 0.15% at NNNLO, which meets the requirements of future lepton colliders.
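Schematically (our notation, not the paper's), the cross section is a truncated expansion in the strong coupling evaluated at the renormalization scale $\mu$,

\[
\sigma(s,\mu) \;=\; \sigma^{(0)}\sum_{n=0}^{3} r_n(s,\mu)\,\left(\frac{\alpha_s(\mu)}{\pi}\right)^{n},
\]

and the quoted scale dependence is the residual variation of $\sigma$ under the conventional variation of $\mu$ by a factor of two around $\sqrt{s}$, which shrinks order by order.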
The energy-energy correlation (EEC) has been studied for almost 40 years, but its analytical calculation was completed only quite recently using integration-by-parts (IBP) identities. We present a bootstrap strategy to calculate the EEC up to the next-to-leading order (NLO) correction by crafting an ansatz based on the colour structure of the QCD amplitudes and the symbols of the master integrals relevant for the 𝛾∗→𝑞𝑞¯𝑔 process, on which we impose self-consistent constraints to reduce the number of parameters in the ansatz. We expect that the symmetry under √z→−√z, the end-point kinematics, and the colour structure of the QCD master integrals can constrain the ansatz significantly. The results will be presented in terms of classical polylogarithms.
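For orientation, the EEC is conventionally defined (up to overall normalization) as a weighted angular correlation of final-state energies, a standard definition included here for reference:

\[
\frac{d\Sigma}{d\cos\chi} \;=\; \sum_{a,b}\int d\sigma\;\frac{E_a E_b}{Q^2}\,\delta(\cos\chi - \cos\theta_{ab}),
\]

where the sum runs over all pairs of final-state particles, $\theta_{ab}$ is their opening angle, and $Q$ is the center-of-mass energy.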
Energy correlators measure the pattern of energy deposition in detectors. The collinear limit, where the angle between the detectors approaches zero, is of particular interest for describing the substructure of jets produced at colliders. By utilizing our factorization theorem and calculating the required ingredients, we perform the resummation of the logarithmically enhanced terms for the projected three-point energy correlator in the collinear limit through NNLL accuracy via renormalization group evolution.
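Schematically (our notation, not necessarily the authors'), the collinear logarithms are resummed by evolving the jet-function ingredient of the factorization theorem with a renormalization group equation of the generic form

\[
\mu^2\,\frac{d}{d\mu^2}\,J(\mu) \;=\; \gamma_J\big(\alpha_s(\mu)\big)\,J(\mu),
\]

whose solution exponentiates the enhanced terms order by order in the logarithmic counting.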
ALICE (A Large Ion Collider Experiment) is one of the four main LHC experiments and is optimised to study heavy ion collisions.
The ALICE detectors and readout system have undergone a major upgrade to increase the data acquisition rates to the required level.
The integrated luminosity is expected to increase by a factor of 100, enabled by raising the readout rate to 50 kHz for Pb-Pb and to 1 MHz for pp collisions.
A novel trigger and timing distribution system is implemented based on Passive Optical Network and GigaBit Transceiver technology.
To ensure backward compatibility, a triggered mode based on RD12 TTC technology is kept and re-implemented in the new Central Trigger System. A new universal ALICE Trigger Board based on the Xilinx Kintex UltraScale FPGA has been designed that can function as a Central Trigger Processor (CTP), Local Trigger Unit (LTU), and monitoring interface.
From 2022 the LHCb experiment will use a triggerless readout system collecting data at an event rate of 30 MHz and a data rate of 4 terabytes per second. A software-only High Level Trigger will enable unprecedented flexibility for trigger selections. During the first stage (HLT1), track reconstruction and vertex fitting for charged particles enable a broad and efficient selection process to reduce the event rate to 1 MHz. Tracking and vertexing at 30 MHz represents a significant computing challenge, and LHCb utilizes the inherent parallelism of the triggering process to meet throughput requirements with GPUs. A close integration with the DAQ and event building allows for a particularly compact system, with the GPUs hosted in the same servers as the FPGA cards receiving the detector data, which reduces the network to a minimum. This architecture also inherently eliminates latency considerations, allowing GPUs to be used despite the very high required throughput. We review the software and hardware design of this system, reflect on the challenges of developing for heterogeneous architectures, discuss how it meets LHCb's performance requirements, and show the commissioning status from LHC Run 3.
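For scale, the quoted rates imply an average event size of roughly

\[
\frac{4\ \mathrm{TB/s}}{30\ \mathrm{MHz}} \;\approx\; 130\ \mathrm{kB\ per\ event},
\]

a back-of-the-envelope estimate from the figures above rather than a number from the talk.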
ALICE (A Large Ion Collider Experiment) is a heavy-ion detector studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). During the second long shutdown of the LHC, the ALICE detector was upgraded to cope with an interaction rate of 50 kHz in Pb-Pb collisions, producing a sustained input of 3 TB/s into the online computing system.
The new data-acquisition system consists of 200 readout nodes, which collect the data transferred from over 8000 detector links into PC memory via dedicated PCI boards.
These machines also perform initial data processing tasks, such as compression and data quality monitoring, before sending the data over an InfiniBand network to a dedicated online processing farm, where full detector events are aggregated and processed.
The ALICE experiment at CERN's Large Hadron Collider underwent a major upgrade during Long Shutdown 2 (2018-2022), after which the detectors will generate up to 27 Tb/s of data during Runs 3 and 4. The upgrade includes a redesign of the computing system, now named O2 (Online-Offline), and a new suite of web application GUIs which will be used by multiple teams and operators in the control room of the experiment 24 hours a day. The new GUI suite includes a tool named Bookkeeping, whose main purpose is to keep track of the experiment and provide the state of the system at any point in time.
Bookkeeping allows users to manually record system updates, which can then be filtered and retrieved to quickly access the information they are looking for. Moreover, it provides interfaces that allow other systems to store and retrieve data automatically. Based on this input, it builds global and individual statistics about the system performance, which in turn helps improve the overall efficiency of the experiment.
The provided UI and API use modern web technologies and are based on a shared web framework developed in-house, ensuring the application is robust enough to meet the needs of Runs 3 and 4 while being easy to maintain and enhance.
This presentation describes the Bookkeeping functionalities, the purposes they serve and the means put in place to fulfill them.
Amplitudes with multiple electroweak vector bosons are known to exhibit poor high-energy behavior in individual Feynman diagrams, which causes many problems for numerical and theoretical analyses. Based on the Goldstone equivalence theorem (GET), we introduce a new representation of the Feynman rules that makes the GET manifest while reproducing the exact results of the amplitudes. The new helicity amplitudes involve no subtle gauge cancellations, and every diagram has a specific physical interpretation as the propagators approach the on-shell pole. We have implemented these new Feynman rules in the numerical HELAS (Helicity Amplitude Subroutines) code and studied several processes with the new HELAS.
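For reference, the Goldstone equivalence theorem relates high-energy amplitudes with longitudinal gauge bosons to amplitudes with the corresponding Goldstone bosons (a textbook statement, quoted schematically for orientation):

\[
\mathcal{M}\big(W_L^\pm,\dots\big) \;=\; \mathcal{M}\big(\phi^\pm,\dots\big) \;+\; \mathcal{O}\!\left(\frac{m_W}{E}\right),
\]

where $\phi^\pm$ are the charged Goldstone bosons and $E$ is the boson energy.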