Conveners
LHC
- Gabriela Alejandra Navarro (Universidad Antonio Narino (CO))
- Carlos Arturo Avila Bernal (Universidad de los Andes)
- José David Ruiz-Álvarez (Universidad de Antioquia)
This work aims to study the calibration of boosted Higgs boson tagging using Z + jets events, optimizing the signal significance after tagging this topology in data, so that others can investigate this type of decay in different processes. The project also focuses on automating the event selection in this tagger, making the code safe and robust against errors,...
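The cut optimization described above can be sketched as a scan over a tagger-score threshold that maximizes an approximate significance. This is a minimal illustration with toy score distributions, not the analysis code; the distributions, the `sig_weight` normalization, and the s/√(s+b) figure of merit are all assumptions.

```python
import numpy as np

# Toy tagger scores standing in for boosted-Higgs signal and
# Z + jets background candidates (distributions are illustrative).
rng = np.random.default_rng(0)
sig_scores = rng.normal(0.7, 0.15, 10_000)   # signal-like: higher scores
bkg_scores = rng.normal(0.4, 0.15, 100_000)  # background-like: lower scores

def significance(cut, sig, bkg, sig_weight=0.05):
    """Approximate significance s / sqrt(s + b) after a score cut."""
    s = sig_weight * np.sum(sig > cut)
    b = float(np.sum(bkg > cut))
    return s / np.sqrt(s + b) if (s + b) > 0 else 0.0

# Scan the cut and keep the value that maximizes the significance.
cuts = np.linspace(0.0, 1.0, 101)
best_cut = max(cuts, key=lambda c: significance(c, sig_scores, bkg_scores))
print(f"optimal cut ~ {best_cut:.2f}")
```

In a real calibration the scores would come from the tagger applied to Z + jets data and simulation, and the significance would use properly weighted event yields.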
Particle identification methods are a powerful tool for filtering out unwanted data and thereby improving the final results of an experiment. In this particular case, the performance of different W boson identification methods is studied in collision events from the ATLAS experiment, with special attention to methods based on the so-called Lund Plane, the...
The upgrades under development at the LHC will allow an increased rate of proton collisions in the accelerator. The higher instantaneous luminosity will make it possible to investigate rare physical phenomena, obtain more precise measurements of known phenomena, and even probe events beyond the Standard Model. Meeting the requirements...
The ATLAS HL-LHC upgrade will allow jet reconstruction algorithms similar to those used offline to be implemented at the lowest trigger level. These algorithms aim to reconstruct jet parameters precisely, so as to maximize the consistency between the trigger data and the events selected in the analysis. The...
In today's digital world, constant change is transforming the way data is handled, stored, and distributed, and the emergence of new computing infrastructures and remote services makes it possible to carry out many tasks virtually, without sophisticated local computing resources. This is especially useful for analyses with open data from the...
Data processing and analysis are becoming increasingly relevant in scientific research, so there is currently an effort to facilitate the development of analysis tools that are not limited by a user's personal computing infrastructure. Thus, the development and use of software tools that provide a service...
In particle physics, computational algorithms are fundamental for selecting interesting events to study, which is done by reducing the amount of background in a given sample. This project uses the samples provided by the 2020 LHC Olympics challenge to optimize a BDT-based multivariate analysis capable of separating signal events and...
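A BDT separating signal from background, as described above, can be sketched with scikit-learn. This is not the project's code: the synthetic Gaussian features stand in for the LHC Olympics dijet variables, and the hyperparameters are illustrative.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for dijet features (e.g. invariant mass,
# substructure variables); real inputs would come from the
# LHC Olympics 2020 samples.
rng = np.random.default_rng(42)
n = 5000
bkg = rng.normal(0.0, 1.0, size=(n, 4))   # background events
sig = rng.normal(0.8, 1.0, size=(n, 4))   # signal events, shifted features
X = np.vstack([bkg, sig])
y = np.concatenate([np.zeros(n), np.ones(n)])

# Train a boosted decision tree and evaluate its separation power.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
bdt.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, bdt.predict_proba(X_te)[:, 1])
print(f"test AUC = {auc:.3f}")
```

The ROC AUC on a held-out set is a standard figure of merit for this kind of optimization; in practice one would tune the hyperparameters and validate against overtraining.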
The Probe for Luminosity Measurement (PLUME) detector is planned at the interaction point of the LHCb experiment to measure the beam status and the luminosity during Run 3 of the Large Hadron Collider. Using Geant4 simulations, the effects adjacent to the operation point are studied by means of the Cherenkov light emitted by particles coming from the...
The High Luminosity Large Hadron Collider (HL-LHC) will produce at least 250 inverse femtobarns of data per year. In order to analyze these data, we need to produce a substantial number of simulated events. This poses a considerable challenge to the already-optimized full CMS detector simulation based on Geant4. One avenue being explored is modifying the simulation parameters to process events even...
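The kind of parameter change mentioned above can be expressed in a Geant4 UI macro. This fragment is only an illustration of the mechanism, not the CMS configuration; the cut value is arbitrary, and loosening production cuts trades shower detail for speed.

```
# Hypothetical Geant4 macro fragment: raise the secondary-production
# range cut (value illustrative) before initializing the run.
/run/setCut 1 mm
/run/initialize
/run/beamOn 100
```

In the full CMS workflow such parameters are steered through the experiment's own configuration system rather than raw macros, so this should be read only as a sketch of the underlying Geant4 knob.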
Podio is an EDM toolkit that generates all event-data-model code from YAML descriptions. As part of the Key4Hep project, Podio needs some of the tools already available in the iLC software; we present the progress made during the summer student project implementing tools similar to anajob and dumpevent.
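For context, a podio datamodel is described in YAML roughly as follows. The datatype and member names here are illustrative (modeled on podio's own examples), not part of the project described above.

```yaml
# Illustrative podio YAML description: each member line is
# "<type> <name> // <comment>", from which podio generates the C++ classes.
datatypes:
  ExampleHit:
    Description: "A simple calorimeter hit"
    Author: "Summer student (hypothetical)"
    Members:
      - unsigned long long cellID // detector cell identifier
      - double x // x-coordinate [mm]
      - double y // y-coordinate [mm]
      - double z // z-coordinate [mm]
      - double energy // measured energy deposit [GeV]
```

Tools like anajob and dumpevent then only need the generated classes plus podio's I/O layer to iterate over events and print their contents.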
In the field of high energy physics, many tools are used to perform the statistical analyses needed for experimental and phenomenological research. Within the CMS collaboration, the Combine tool is heavily used to produce binned statistical models. Although Combine is open source and is based on other open-source tools such as RooFit and RooStats, and is built on top of HistFactory,...
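A binned statistical model in Combine is specified through a plain-text datacard. The fragment below is a generic single-channel example with made-up yields, shown only to illustrate the input format the abstract refers to.

```
# Illustrative Combine datacard (all numbers hypothetical)
imax 1  number of channels
jmax 1  number of backgrounds
kmax 1  number of nuisance parameters
----------------
bin         ch1
observation 85
----------------
bin         ch1    ch1
process     sig    bkg
process     0      1
rate        10     80
----------------
lumi  lnN   1.025  1.025
```

Combine builds a RooFit/RooStats likelihood from such a card, which is then used for limit setting, significance estimation, and fits.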
In recent years there has been a reproducibility crisis in most scientific fields, with researchers failing to reproduce other researchers' experiments and even their own. In HEP, the computational analysis of data obtained from experiments such as the LHC has become a new kind of experiment in itself. These computational experiments are bound to the environment and equipment used to perform the analysis....
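One common way to decouple an analysis from its original environment is to pin it inside a container image. The Dockerfile below is a generic sketch of that idea (file names and package versions are hypothetical), not the specific setup of the work described above.

```dockerfile
# Hypothetical container recipe: pinning the base image and package
# versions so the analysis environment can be recreated exactly.
FROM python:3.10-slim

RUN pip install --no-cache-dir numpy==1.24.4 uproot==5.0.7

COPY analysis.py /work/analysis.py
WORKDIR /work

CMD ["python", "analysis.py"]
```

Frameworks such as REANA build on the same principle, capturing the environment, the input data, and the workflow steps so a computational experiment can be rerun end to end.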
This presentation describes test beam studies of 3D silicon sensors (50×50 µm² pitch) exposed to a 120 GeV proton beam at the Fermilab Test Beam Facility. We show the pixel efficiency, cluster size, and hit resolution before and after irradiation. The 3D silicon sensors are being considered for the innermost layers of the Inner Tracker (IT) of the Phase-2 upgrade of the CMS detector. This detector...
Forward photons and electrons in the LHCb experiment are detected with the inner modules of the EM calorimeter. However, the granularity of the cells makes it difficult to precisely resolve the shape of the showers produced by those particles. As a result, photon and electron candidates are hard to differentiate, especially when nearly collinear particles hit the calorimeter. Geant4 simulations of...
Among the outstanding questions of particle physics, proof of the existence of a magnetic monopole is still one of great interest. Not only would the observation of a magnetically charged particle bring symmetry between electric and magnetic fields in Maxwell’s equations, but it would also explain the quantization of the electric charge. TeV-mass Dirac Magnetic Monopoles, which behave as a...