The increasingly computational nature of research demands efficient management of code, automation of tasks, and access to high-performance computing resources. Acquiring these skills can improve researchers' productivity and increase throughput. This talk will provide an introduction to three fundamental tools: version control, shell scripting, and remote computing....
The Radio Neutrino Observatory in Greenland (RNO-G) is located at Summit Station and is designed to detect Askaryan emission from ultra-high energy (UHE) neutrinos above 100 PeV. The detector consists of an array of antennas buried at a depth of 100 meters that triggers on and reconstructs neutrino-like signals in the radio regime. Interferometry can be used to find the...
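As a rough illustration of the interferometric idea mentioned above, the sketch below implements delay-and-sum reconstruction: waveforms are time-shifted by the delays expected for a trial source direction and summed coherently, and the direction that maximizes the summed power is taken as the reconstruction. The array geometry, sampling rate, and wave speed are illustrative placeholders, not RNO-G's actual configuration.

```python
import numpy as np

# Minimal delay-and-sum interferometry sketch (illustrative geometry,
# not the actual RNO-G array). For each trial source direction we shift
# the antenna waveforms by the expected time delays and sum coherently;
# the direction that maximizes the summed power is the reconstruction.

c = 0.3   # wave speed in m/ns (in ice this would be reduced by the index of refraction)
fs = 2.0  # sampling rate in GHz, i.e. samples per ns
antennas = np.array([[0, 0, -100], [10, 0, -100],
                     [0, 10, -100], [10, 10, -100]], dtype=float)  # positions in m

def expected_delays(theta, phi):
    """Arrival-time delays (ns) for a plane wave from direction (theta, phi)."""
    k = -np.array([np.sin(theta) * np.cos(phi),
                   np.sin(theta) * np.sin(phi),
                   np.cos(theta)])  # propagation unit vector, toward the array
    return antennas @ k / c

def coherent_power(waveforms, theta, phi):
    """Shift each waveform by its expected delay, sum, and return total power."""
    delays = expected_delays(theta, phi)
    shifts = np.round((delays - delays.min()) * fs).astype(int)
    n = waveforms.shape[1] - shifts.max()
    summed = sum(w[s:s + n] for w, s in zip(waveforms, shifts))
    return np.sum(summed ** 2)

def reconstruct(waveforms, n_grid=60):
    """Scan a coarse direction grid and return the best (theta, phi)."""
    thetas = np.linspace(0, np.pi, n_grid)
    phis = np.linspace(0, 2 * np.pi, n_grid)
    _, theta, phi = max((coherent_power(waveforms, t, p), t, p)
                        for t in thetas for p in phis)
    return theta, phi
```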
Neutrinos are vital messengers for understanding the most extreme astrophysical processes, capable of traveling across cosmic distances without deflection. The Askaryan Radio Array (ARA), deployed at the South Pole, searches for these elusive particles by detecting radio pulses generated when neutrinos interact with Antarctic ice. However, identifying these rare signals is particularly...
Source modeling is the process of fitting the flux distributions of astronomical objects such as galaxies and stars to enable accurate photometric measurements. However, conventional source modeling pipelines applied to wide imaging surveys often struggle with large, nearby galaxies, which are frequently "shredded" into multiple smaller components. This fragmentation leads to inaccurate flux...
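To make the idea concrete, here is a minimal, hypothetical sketch of parametric source modeling: fitting a 2D Gaussian flux profile (a simple stand-in for the more realistic profiles, such as Sersic, that survey pipelines use) to a simulated image cutout. All names and values are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

# Toy source modeling: fit a 2D Gaussian flux profile to an image cutout.
# The total flux is a fitted parameter of one coherent model, so the
# photometry does not depend on the source being split into pieces.

def gaussian2d(coords, flux, x0, y0, sigma):
    x, y = coords
    norm = flux / (2 * np.pi * sigma**2)  # normalized so the profile integrates to `flux`
    return norm * np.exp(-((x - x0)**2 + (y - y0)**2) / (2 * sigma**2))

# Simulated 64x64 cutout: one source plus background noise.
rng = np.random.default_rng(0)
y, x = np.mgrid[0:64, 0:64]
truth = (500.0, 30.0, 34.0, 4.0)  # flux, x0, y0, sigma
image = gaussian2d((x, y), *truth) + rng.normal(0, 0.1, x.shape)

popt, _ = curve_fit(gaussian2d, (x.ravel(), y.ravel()), image.ravel(),
                    p0=(100.0, 32.0, 32.0, 2.0))
print(f"fitted total flux: {popt[0]:.1f} (true: {truth[0]})")
```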
This lecture will cover the relationship between "models" and "data", such that "learning" corresponds to making inferences about the parameters of our model. We will see that this paradigm covers everything from curve fitting to contemporary machine learning with neural networks. The theory of general model fitting will be explored with practical examples, with some discussion of the concepts...
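As a miniature version of that model/data/learning framing, the sketch below fits a straight-line model to noisy data by least squares, with "learning" amounting to inferring the model's two parameters; the data and model are illustrative.

```python
import numpy as np

# "Learning" as parameter inference, in miniature: fit the model
# y = a*x + b to noisy data by least squares. The same template
# (model + data + loss minimized over parameters) scales from curve
# fitting all the way up to neural networks.

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 50)
y = 2.5 * x + 1.0 + rng.normal(0, 1.0, x.size)  # data drawn from a known truth

# Closed-form least squares via the design matrix [x, 1].
A = np.column_stack([x, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"inferred parameters: a={a:.2f}, b={b:.2f} (truth: 2.5, 1.0)")
```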
Radio-based physics experiments like the Radar Echo Telescope frequently require analysis techniques that can recover signals at or below the level of noise. Singular Value Decomposition is a versatile tool for signal processing, allowing complex signals to be efficiently analyzed, denoised, and reconstructed, even when signal characteristics are not known. This talk highlights the methodology...
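Below is a minimal sketch of the kind of SVD-based denoising described above, assuming repeated noisy records of a signal stacked into a matrix; the signal shape, noise level, and truncation rank are illustrative choices, not the Radar Echo Telescope's actual analysis.

```python
import numpy as np

# SVD denoising sketch: stack repeated noisy records of a signal into a
# matrix, then keep only the leading singular components. A coherent
# signal concentrates in a few large singular values while noise spreads
# across all of them, so the rank-truncated reconstruction is a denoised
# estimate even without a template of the signal.

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 512)
signal = np.sin(2 * np.pi * 12 * t) * np.exp(-((t - 0.3) / 0.05) ** 2)
records = signal + rng.normal(0, 1.0, (100, t.size))  # per-record SNR well below 1

U, s, Vt = np.linalg.svd(records, full_matrices=False)
k = 1  # number of components to keep (this choice depends on the data)
denoised = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
print("residual RMS:", np.sqrt(np.mean((denoised.mean(axis=0) - signal) ** 2)))
```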
Monte Carlo methods are tools for estimating probability distributions or simulating physical processes, and they are commonly used in physics as well as in other fields such as chemistry, biology, finance, and artificial intelligence.
This talk will give a general introduction to the use of Monte Carlo methods for physics simulations, with some examples and practical considerations.
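As a minimal example of the sample-and-average pattern, the sketch below uses uniform random points to estimate pi; the same structure, with the error shrinking like 1/sqrt(N), carries over to physics simulations.

```python
import numpy as np

# Classic Monte Carlo estimate: draw uniform points in the unit square
# and count the fraction landing inside the quarter circle, which
# converges to pi/4 as the number of samples grows.

rng = np.random.default_rng(3)
for n in (10**3, 10**5, 10**7):
    pts = rng.random((n, 2))
    inside = np.sum(pts[:, 0]**2 + pts[:, 1]**2 < 1.0)
    print(f"N={n:>8}: pi ~ {4 * inside / n:.5f}")
```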
These days, machines often perform tasks that were previously far beyond their capabilities, and they are increasingly outperforming humans too! Machine learning has been around for a long time, however, so what has changed? Since ~2006 we have been in the “third wave” of deep learning, in which it has become possible for machines to learn complex “nested representations”, and this has...
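As a toy illustration of why learned intermediate representations matter, the sketch below trains a two-layer network on XOR, a problem no single linear layer can solve; the architecture and hyperparameters are arbitrary illustrative choices.

```python
import numpy as np

# Why "nested representations" matter, in miniature: XOR is not linearly
# separable, but a tiny two-layer network learns a hidden representation
# in which it becomes separable. Trained with plain gradient descent.

rng = np.random.default_rng(4)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)   # layer 1: 2 -> 4
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)   # layer 2: 4 -> 1
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for _ in range(5000):
    h = np.tanh(X @ W1 + b1)          # hidden ("nested") representation
    p = sigmoid(h @ W2 + b2)          # output probability
    # Backpropagate the cross-entropy loss through both layers.
    dp = p - y
    dW2, db2 = h.T @ dp, dp.sum(0)
    dh = (dp @ W2.T) * (1 - h**2)
    dW1, db1 = X.T @ dh, dh.sum(0)
    for P, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        P -= 0.1 * g

print(np.round(p.ravel(), 2))  # approaches [0, 1, 1, 0]
```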
Neutrino astronomy is a vibrant field of study in astrophysics, offering unique insights into the Universe’s most energetic phenomena. The combination of a low cross section and zero electromagnetic charge ensures that a neutrino retains most information about its original source while traversing the universe. On the other hand, these low cross sections, combined with a reduced flux at...
How can you use a machine learning model in your research? This talk will outline when and where a model is most useful in materials science, how such models are created with different methods, and some of the advantages and disadvantages of each. I will also discuss my own research on phase change materials, and the machine learning model built for it, as a more in-depth example.
Whether searching for dark matter or measuring Standard Model parameters, analyzing data from the Large Hadron Collider is no easy task. Leveraging machine learning in high energy physics (HEP) is not a new idea, but recent AI advancements have accelerated analysis efforts. Methods like transformer models, variational auto-encoders, and graph neural networks have strengthened HEP analysis...