In Cherenkov astronomy, the data processing stages involve both assumptions and comparisons to dedicated simulations. Such data can be misleading if these choices are not documented. This has implications for the format specification of the data that will be exposed for use together with observations at other frequencies for modeling.
Provenance information in astronomy is important to enable scientists to trace back the origin of a dataset, a document, or a device; to learn about the people and organizations involved in a project; and to assess the quality as well as the usefulness of that dataset, document, or device for their scientific work. Current efforts to model provenance information in astronomy have led to the development...
Since the beginning of its operation in 2003, the MAGIC telescopes have collected data from more than 60 TeV emitters. The collaboration distributes public FITS files with high-level data such as spectral energy distributions, light curves, and skymaps for every published result. Here we report on the efforts to complement these products with richer information (data quality, fit models, etc.) and...
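As a hedged illustration of how such public FITS products can be inspected with astropy, here is a minimal sketch; the file name and the HDU name "SED" are hypothetical, since the actual MAGIC products define their own layout and column names:

```python
from astropy.io import fits
from astropy.table import Table

# Hypothetical file name; real MAGIC public products use their own naming.
with fits.open("magic_sed_example.fits") as hdul:
    hdul.info()  # list the HDUs shipped with the product
    # Assume the SED points live in a binary-table HDU named "SED";
    # the real extension and column names depend on the published format.
    sed = Table(hdul["SED"].data)

print(sed.colnames)
```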
JetSeT is an open-source Python framework with a C numerical engine that reproduces the radiative and acceleration processes acting in relativistic jets and allows fitting the numerical models to observed data ([https://jetset.readthedocs.io/en/latest/][1], [https://github.com/andreatramacere/jetset][2]).
The main features of this framework are listed below; a brief usage sketch follows the list:
- handling observed data (rebinning, definition of...
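A minimal usage sketch, assuming the API described in the jetset documentation (parameter names and defaults may differ between versions):

```python
from jetset.jet_model import Jet

# Build a leptonic jet model with a log-parabola electron distribution
# and evaluate its spectral energy distribution.
my_jet = Jet(name="test_jet", electron_distribution="lppl")
my_jet.set_par("B", val=0.1)        # magnetic field in G
my_jet.set_par("z_cosm", val=0.03)  # source redshift
my_jet.eval()                        # run the C numerical engine
my_jet.show_model()                  # print a summary of the model
```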
Python is now the dominant language in scientific computing. Astroparticle physics and astronomy experiments in particular have embraced Python enthusiastically; the IceCube Neutrino Observatory is one example. CERN experiments are also steadily moving their analyses towards Python. ROOT is the foundational library in many HEP experiments, so I will give a brief summary of current developments in ROOT....
I will present examples of Python code optimization. First, we will focus on wrapping C++ code to significantly speed up analysis. Second, I will discuss function minimization with PyTorch and how it can be used, through two examples: likelihood minimization on GPU for ImPACT optimization and deep learning with the gamma-learn project.
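As a hedged sketch of the minimization idea (a toy stand-in for the ImPACT use case, not its actual likelihood): fit the mean and width of a Gaussian to sample data by gradient descent on the negative log-likelihood, running on GPU when one is available:

```python
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
data = torch.randn(10_000, device=device) * 2.0 + 1.0  # toy sample, sigma=2, mu=1

mu = torch.zeros(1, device=device, requires_grad=True)
log_sigma = torch.zeros(1, device=device, requires_grad=True)  # log-param keeps sigma > 0

optimizer = torch.optim.Adam([mu, log_sigma], lr=0.05)
for step in range(500):
    optimizer.zero_grad()
    sigma = log_sigma.exp()
    # Negative log-likelihood of a Gaussian, up to an additive constant.
    nll = (torch.log(sigma) + 0.5 * ((data - mu) / sigma) ** 2).sum()
    nll.backward()   # autograd supplies the gradients
    optimizer.step()

print(mu.item(), log_sigma.exp().item())  # should approach 1.0 and 2.0
```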
An introduction to Numba, a Python package which compiles Python code on the fly to produce efficient machine code, potentially providing huge improvements in execution time.
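A minimal sketch of the pattern (the function and data here are illustrative, not from the talk): decorate plain numeric Python loops with `@njit` and Numba compiles them to machine code on the first call:

```python
import numpy as np
from numba import njit

@njit
def pairwise_dist_sum(points):
    # Plain nested loops: slow in pure Python, fast once Numba
    # compiles them to machine code on the first call.
    n = points.shape[0]
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            d = 0.0
            for k in range(points.shape[1]):
                diff = points[i, k] - points[j, k]
                d += diff * diff
            total += np.sqrt(d)
    return total

pts = np.random.rand(500, 3)
print(pairwise_dist_sum(pts))  # first call compiles; subsequent calls run at full speed
```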
What are good choices / pros and cons of the various high-performance computing options in Python for our codes?