Monte Carlo Production Chain

The production of simulated Monte Carlo (MC) data is essential for a variety of tasks: comparing theoretical models with experimental data, optimizing algorithms, and testing detector performance. These simulations allow researchers to predict and interpret the outcomes of experiments, making them indispensable in particle physics research.

The MC production process proceeds in four stages: event generation, detector simulation, digitization, and finally reconstruction of the simulated data. Each step is summarized below.

1. Event Generation

The event generation stage in ATLAS involves using software like Pythia and Herwig to simulate physics events. These simulations create sets of particle four-momenta, which are essentially the mathematical descriptions of particle properties like energy and momentum. During this phase, events can be selectively filtered to focus on those with certain characteristics, such as a leptonic decay or a significant amount of missing energy.

An important function of the event generator is to simulate prompt decays of particles (such as Z or W bosons). It also determines which particles are classified as "stable" for propagation through the detector. In this context, "stable" refers to particles that do not undergo immediate decay and thus are expected to travel some distance through the detector, allowing their properties to be recorded and analyzed.
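The filtering idea above can be sketched in a few lines of Python. This is a toy illustration, not generator code: the `(E, px, py, pz)` tuple format, function names, and the 100 GeV threshold are all assumptions for the example.

```python
import math

# Hypothetical minimal event record: each invisible particle (e.g. a neutrino)
# is a four-momentum tuple (E, px, py, pz) in GeV. Names and thresholds are
# illustrative only, not ATLAS or generator conventions.
def missing_et(invisible_particles):
    """Magnitude of the vector sum of transverse momenta of invisible particles."""
    px = sum(p[1] for p in invisible_particles)
    py = sum(p[2] for p in invisible_particles)
    return math.hypot(px, py)

def passes_met_filter(invisible_particles, threshold_gev=100.0):
    """Keep the event only if its missing transverse energy exceeds the threshold."""
    return missing_et(invisible_particles) >= threshold_gev
```

A generator-level filter like this discards uninteresting events before the (much more expensive) detector simulation is run.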

2. Detector Simulation

The next phase involves simulating the interaction of particles with the ATLAS detector, both sensitive and non-sensitive parts. Energy depositions in the detector are recorded as "hits", detailing energy, position, and time. The most common tools used for this step are Geant4 and AtlFast3.
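A "hit" as described above can be pictured as a small record of an energy deposition. The sketch below is a simplified stand-in; the field names and units are assumptions, not the actual ATLAS event data model.

```python
from dataclasses import dataclass

# Illustrative sketch of a simulation "hit": an energy deposition recorded
# with its position and time. Field names and units are assumptions.
@dataclass
class SimHit:
    energy_mev: float   # deposited energy
    x_mm: float         # position of the deposition in the detector
    y_mm: float
    z_mm: float
    time_ns: float      # time of the deposition

def total_energy(hits):
    """Sum the energy deposited across a collection of hits."""
    return sum(h.energy_mev for h in hits)
```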

3. Digitization

The hits from the detector simulation are processed to emulate the output of the detector readout. This includes signal collection, pulse shaping, readout emulation, and other effects, along with noise modeling and event pileup. Different types of events (e.g., hard scattering signal, minimum bias) can be overlaid at this stage, with detector noise added and the first level trigger simulated.
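The overlay idea can be sketched as follows: digitized responses from a hard-scatter event and several minimum-bias (pileup) events are summed channel by channel, and electronics noise is added on top. This is a hedged toy model; the channel-ID dictionaries and Gaussian noise model are assumptions for illustration.

```python
import random

# Toy sketch of event overlay in digitization. Each event is a dict mapping
# a readout channel id to an energy; the dict format and Gaussian noise
# model are illustrative assumptions, not the ATLAS implementation.
def overlay(signal, pileup_events, noise_sigma=0.0, rng=None):
    """Sum signal and pileup channel by channel, then add Gaussian noise."""
    rng = rng or random.Random(0)
    combined = dict(signal)
    for event in pileup_events:
        for channel, energy in event.items():
            combined[channel] = combined.get(channel, 0.0) + energy
    # model electronics noise on every active channel
    for channel in combined:
        combined[channel] += rng.gauss(0.0, noise_sigma)
    return combined
```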

4. Reconstruction

The output of digitization (for simulation) or the output from the detector (for detector data) is pieced together into “physics objects” that can be used for analysis. This process involves local pattern recognition, reconstruction of tracks, vertices, cells, clusters, and high-level objects like particles and jets. It is almost identical for both simulated and real data, with the exception of processing truth information in simulated data.
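The local pattern recognition step can be illustrated with a toy clustering example: neighbouring cells above a threshold are grouped and their energies summed into a single candidate object. A 1-D array of cell energies stands in for real detector geometry; the function and threshold are assumptions for the sketch.

```python
# Toy sketch of clustering during reconstruction: contiguous above-threshold
# cells are grouped into a cluster. A 1-D list of cell energies stands in
# for real detector geometry; this is not an ATLAS algorithm.
def cluster_cells(energies, threshold=0.1):
    """Return a list of (total energy, cell indices) for each contiguous cluster."""
    clusters, current = [], []
    for i, e in enumerate(energies):
        if e > threshold:
            current.append(i)
        elif current:
            clusters.append((sum(energies[j] for j in current), current))
            current = []
    if current:  # close a cluster that runs to the end of the array
        clusters.append((sum(energies[j] for j in current), current))
    return clusters
```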

Throughout event generation and detector simulation, "truth" information is recorded. In event generation, this includes the history of interactions and particles, and in simulation, it involves truth tracks and decays for certain particles. Simulated Data Objects (SDOs) are created from truth information during digitization, linking hits to particles in the truth record. This tells us how much energy a simulated particle deposits in our calorimeters during a simulated collision.
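The SDO bookkeeping described above amounts to summing deposited energy per truth particle across all linked hits. The sketch below assumes each hit carries a `(particle_id, energy)` link; that pair format is an illustrative simplification, not the actual SDO structure.

```python
from collections import defaultdict

# Illustrative sketch of the SDO idea: each simulated hit is linked to the
# truth particle that produced it, so deposited energy can be summed per
# particle. The (particle_id, energy) pair format is an assumption.
def energy_per_truth_particle(linked_hits):
    """Sum deposited energy for each truth particle across all linked hits."""
    totals = defaultdict(float)
    for particle_id, energy in linked_hits:
        totals[particle_id] += energy
    return dict(totals)
```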

The configuration details for each stage of the process are encapsulated in the dataset's name, using specific configuration tags. These are briefly described in the file naming convention section. Additionally, the [simulation tools section](simulation_tools.md) provides insights about some of the tools employed throughout this production chain. For a more comprehensive, though more technical, treatment, see the detailed publication on the ATLAS simulation infrastructure.