Description
Cadmium Zinc Telluride (CZT) detectors are a powerful platform for radiation detection in the keV energy range, combining room-temperature operation with a compact geometry. Their deployment in accelerator environments opens new experimental opportunities, but it also introduces complex, structured background conditions that can limit their sensitivity. In this work, we present a Machine Learning driven strategy for background optimization of CZT detectors operated at the DAΦNE accelerator at the INFN National Laboratories of Frascati. Detailed Geant4 simulations of the full experimental setup are used to generate labeled datasets that encode the physical origin and topology of signal and background events. Supervised learning models are trained on these simulations to learn multidimensional correlations that are not accessible through traditional cut-based analyses. We show that ML-based event classification significantly improves background rejection while maintaining high signal efficiency, effectively enhancing the achievable sensitivity of the detector system. This approach demonstrates how simulation-informed learning can become a key tool in the design and optimization of next-generation semiconductor detectors in accelerator-based experiments.
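To make the workflow concrete, the following is a minimal illustrative sketch of the approach described above: a supervised classifier trained on labeled signal/background events. The feature names, distributions, and classifier choice here are all assumptions for illustration (toy data standing in for Geant4 output; the actual analysis may use different features and models).

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Toy stand-in for simulation-labeled events (NOT real Geant4 output).
# Hypothetical per-event features: deposited energy (keV),
# hit multiplicity, interaction depth (mm).
sig = np.column_stack([
    rng.normal(140.0, 5.0, n),   # signal: narrow energy line
    rng.poisson(1.2, n),         # signal: mostly single-site events
    rng.normal(2.0, 0.5, n),     # signal: localized interaction depth
])
bkg = np.column_stack([
    rng.uniform(50.0, 250.0, n), # background: flat continuum
    rng.poisson(2.5, n),         # background: higher multiplicity
    rng.uniform(0.0, 5.0, n),    # background: uniform depth
])

X = np.vstack([sig, bkg])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = signal, 0 = background

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# A boosted-tree classifier picks up multidimensional correlations
# that a one-variable-at-a-time cut cannot exploit.
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"ROC AUC: {auc:.3f}")
```

In practice the classifier score would be thresholded to select a working point, trading signal efficiency against background rejection on the simulated samples before applying it to detector data.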