Lomonosov Moscow State University, Skobeltsyn Institute of Nuclear Physics
Abstract: Modern Imaging Atmospheric Cherenkov Telescopes (IACTs) generate huge amounts of data that must be classified automatically, ideally in real time. Machine learning-based solutions are increasingly being used for such classification problems, but these classifiers require properly prepared training data sets to work correctly. The problem with training neural networks on real IACT data is that the data must be pre-labeled, and such labeling is difficult and its results are only estimates. In addition, the distribution of incoming events is highly imbalanced. First, there is an imbalance between event types, since the number of detected gamma quanta is much smaller than the number of protons. Second, the energy distribution of particles of the same type is also imbalanced, since high-energy particles are extremely rare. This imbalance results in poorly trained classifiers that, once trained, do not handle rare events correctly. Solving this problem with conventional Monte Carlo event simulation alone is possible but extremely resource-intensive and time-consuming. Instead, we propose to augment the data with artificially generated events of the desired type and energy using conditional generative adversarial networks (cGANs), distinguishing classes by energy value. In this paper, we describe a simple algorithm for generating balanced data sets using cGANs. The proposed neural network model thus produces both imbalanced data sets for physical analysis and balanced data sets suitable for training other neural networks.
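The balancing idea described in this abstract can be sketched as follows. This is a minimal illustration with assumed names, sizes, and a toy generator standing in for a trained cGAN, not the authors' implementation: the generator receives a noise vector concatenated with a one-hot energy-class label, so a balanced data set is obtained simply by requesting an equal number of generated events per class.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CLASSES = 4      # assumed number of energy bins
NOISE_DIM = 16     # assumed latent dimension
IMAGE_DIM = 64     # assumed flattened event-image size

# Toy linear "generator" weights standing in for a trained cGAN generator.
W = rng.normal(size=(NOISE_DIM + N_CLASSES, IMAGE_DIM))

def one_hot(labels, n_classes):
    """One-hot encode energy-class labels for conditioning."""
    out = np.zeros((len(labels), n_classes))
    out[np.arange(len(labels)), labels] = 1.0
    return out

def generate(labels):
    """Generate one event image per label by conditioning on the label."""
    z = rng.normal(size=(len(labels), NOISE_DIM))
    cond_input = np.concatenate([z, one_hot(labels, N_CLASSES)], axis=1)
    return np.tanh(cond_input @ W)  # fake images scaled to (-1, 1)

def balanced_set(n_per_class):
    """Request an equal number of generated events from each energy class."""
    labels = np.repeat(np.arange(N_CLASSES), n_per_class)
    return generate(labels), labels

images, labels = balanced_set(n_per_class=100)
counts = np.bincount(labels, minlength=N_CLASSES)  # equal per class
```

An imbalanced set for physical analysis would be produced the same way, by drawing the labels from the physically expected (skewed) energy distribution instead of `np.repeat`.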
Abstract: The direction of an extensive air shower can be used to determine the source of the gamma quanta and plays an important role in estimating the energy of the primary particle. The data from HiSCORE, an array of non-imaging Cherenkov detector stations in the TAIGA experiment that record the number of photoelectrons and the detection time, can be used to estimate the shower direction with high accuracy. In this work, we use artificial neural networks trained on Monte Carlo-simulated TAIGA HiSCORE data for gamma quanta to obtain shower direction estimates. The neural networks are multilayer perceptrons with skip connections that take partial data from several HiSCORE stations as inputs; composite estimates are derived from multiple individual estimates produced by the neural networks. We apply a two-stage algorithm in which the direction estimates obtained in the first stage are used to transform the input data and refine the estimates. The mean error of the final estimates is less than 0.25 degrees. This approach will be used for multimodal analysis of data from the several types of detectors in the TAIGA experiment.
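Two ingredients of this abstract can be illustrated with a short sketch: a perceptron block with a skip connection, and the combination of several individual direction estimates into a composite one. All sizes are assumed, and averaging unit vectors with renormalisation is one plausible combination rule, not necessarily the authors' exact scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

def relu(x):
    return np.maximum(x, 0.0)

def skip_block(x, W1, b1, W2, b2):
    """One MLP block with a skip connection: output = x + F(x)."""
    return x + relu(x @ W1 + b1) @ W2 + b2

HIDDEN = 32  # assumed width; inputs would be per-station amplitude/time features
W1 = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN)); b1 = np.zeros(HIDDEN)
W2 = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN)); b2 = np.zeros(HIDDEN)

x = rng.normal(size=(5, HIDDEN))      # batch of 5 station subsets
h = skip_block(x, W1, b1, W2, b2)     # same shape as the input

def composite_direction(unit_vectors):
    """Combine individual direction estimates (unit vectors) into one
    composite estimate by averaging and renormalising (assumed rule)."""
    mean = np.asarray(unit_vectors).mean(axis=0)
    return mean / np.linalg.norm(mean)

# Three noisy estimates of a shower axis pointing roughly along +z.
estimates = [np.array([0.05, 0.00, 0.999]),
             np.array([0.00, 0.04, 0.999]),
             np.array([-0.03, -0.02, 0.999])]
estimates = [v / np.linalg.norm(v) for v in estimates]
direction = composite_direction(estimates)
```

In the two-stage algorithm, `direction` from the first stage would be used to transform the station coordinates and times before a second, refining pass through the networks.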
Abstract: In this work, we investigate the selection of rare VHE gamma rays with neural network methods in the case when the cosmic ray flux strongly dominates (by a ratio of up to 10^4) over the gamma radiation flux from a point source. This ratio holds for the Crab Nebula in the TeV energy range; the Crab is a well-studied source used for calibration and testing of various methods and installations in gamma astronomy. The part of the TAIGA experiment that includes three Imaging Atmospheric Cherenkov Telescopes also observes this gamma-ray source. Cherenkov telescopes record images of extensive air showers. These images can be analysed with Hillas parameters in the standard processing method, or processed with convolutional neural networks. In this work we describe the main steps and results of the gamma/hadron separation task for the Crab Nebula with neural network methods. The results are compared with the standard processing method used in the TAIGA collaboration, which is based on Hillas parameter cuts. We demonstrate that a signal at a significance level higher than 5.5σ was obtained from 21 hours of Crab Nebula observations after processing the experimental data with the neural network method.
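The Hillas parameters mentioned above are second moments of the light distribution in the camera image: the length and width of the shower ellipse are the square roots of the eigenvalues of the amplitude-weighted covariance matrix of the pixel coordinates. A minimal sketch (using a square pixel grid for simplicity, although the real camera is hexagonal):

```python
import numpy as np

def hillas_length_width(amplitudes, x, y):
    """Hillas LENGTH and WIDTH of an image: square roots of the
    eigenvalues of the amplitude-weighted covariance matrix of the
    pixel coordinates (semi-axes of the shower ellipse)."""
    w = amplitudes / amplitudes.sum()
    mx, my = np.sum(w * x), np.sum(w * y)
    cxx = np.sum(w * (x - mx) ** 2)
    cyy = np.sum(w * (y - my) ** 2)
    cxy = np.sum(w * (x - mx) * (y - my))
    eigvals = np.linalg.eigvalsh([[cxx, cxy], [cxy, cyy]])  # ascending
    width, length = np.sqrt(np.maximum(eigvals, 0.0))
    return length, width

# Synthetic elongated "shower" image along the x axis.
xs, ys = np.meshgrid(np.linspace(-1, 1, 21), np.linspace(-1, 1, 21))
amps = np.exp(-(xs / 0.6) ** 2 - (ys / 0.2) ** 2)
length, width = hillas_length_width(amps.ravel(), xs.ravel(), ys.ravel())
```

The standard method then selects gamma-like events by cutting on such parameters (gamma-ray images tend to be narrower and more compact than hadronic ones), whereas the neural network approach classifies the pixel images directly.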
Abstract: Modern detectors of cosmic gamma rays are a special type of imaging telescope (air Cherenkov telescopes) equipped with cameras containing a relatively large number of photomultiplier-based pixels. For example, the camera of the TAIGA-IACT telescope has 560 pixels arranged in a hexagonal structure. Images from such cameras can be analysed with deep learning techniques to extract numerous physical and geometrical parameters and/or to identify the incoming particle. In this study we implemented the most powerful deep learning technique for image analysis, the convolutional neural network (CNN). Two open-source machine learning libraries, PyTorch and TensorFlow, were tested as possible software platforms for particle identification in imaging air Cherenkov telescopes. Monte Carlo simulation was performed to analyse images of gamma rays and background particles (protons) as well as to estimate the identification accuracy. Further steps in the implementation and improvement of this technique are discussed.
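The classification pipeline can be sketched without any framework as a bare numpy forward pass: convolution, ReLU, global average pooling, and a linear layer producing two class logits (gamma vs. proton). All sizes are assumed, and the hexagonal 560-pixel camera image is taken to be remapped onto a square grid for the convolution; this is an illustrative sketch, not the authors' network.

```python
import numpy as np

rng = np.random.default_rng(2)

def conv2d(image, kernel):
    """Valid-mode 2D convolution (cross-correlation) of a single-channel image."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.empty((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def cnn_logits(image, kernels, W_out, b_out):
    """Conv -> ReLU -> global average pool -> linear layer -> class logits."""
    features = np.array([np.maximum(conv2d(image, k), 0.0).mean()
                         for k in kernels])
    return features @ W_out + b_out

# Assumed sizes: 32x32 remapped camera image, 8 filters of 3x3,
# two output classes (gamma vs. proton).
kernels = rng.normal(size=(8, 3, 3))
W_out = rng.normal(size=(8, 2))
b_out = np.zeros(2)

image = rng.normal(size=(32, 32))   # stands in for a Cherenkov image
logits = cnn_logits(image, kernels, W_out, b_out)
probs = np.exp(logits) / np.exp(logits).sum()   # softmax over the 2 classes
```

In practice the same structure would be expressed with `torch.nn.Conv2d`/`torch.nn.Linear` in PyTorch or `tf.keras.layers` in TensorFlow, the two platforms compared in the study, and trained on the Monte Carlo-simulated gamma and proton images.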