20–22 Jan 2026
Faculty of Science, Cairo University, Egypt
Africa/Cairo timezone

Transformers for particle collider analyses

20 Jan 2026, 11:10
35m
Ibn Sina Hall (Faculty of Science, Cairo University, Egypt)

Speaker

Ahmed Hammad (KEK, Japan)

Description

The Transformer network, originally introduced for machine translation, has recently achieved remarkable success, particularly with the emergence of the GPT family of models. In this talk, I will explore how Transformer architectures can be adapted for particle collider analyses. At the core of Transformer models lies the attention mechanism. I will review several types of attention, including self-attention, cross-attention, and sparse attention. Finally, I will discuss how different AI interpretability techniques can be applied to gain insight into the model's decision-making process, effectively turning machine learning from a "black box" into a more transparent "white box" system.

Author

Ahmed Hammad (KEK, Japan)
