Abstract: Modern Imaging Atmospheric Cherenkov Telescopes (IACTs) generate vast amounts of data that must be classified automatically, ideally in real time. Machine learning-based solutions are increasingly being used to solve such classification problems. However, these classifiers require properly prepared training data sets to work correctly. The problem with training neural networks on real IACT data is that the data must be pre-labeled, whereas such labeling is difficult and its results are only estimates. In addition, the distribution of incoming events is highly imbalanced. First, there is an imbalance in event types, since the number of detected gamma quanta is significantly smaller than the number of protons. Second, the energy distribution of particles of the same type is also imbalanced, since high-energy particles are extremely rare. This imbalance results in poorly trained classifiers that, once trained, do not handle rare events correctly. Solving this problem with conventional Monte Carlo event simulation alone is possible, but extremely resource-intensive and time-consuming. To address this issue, we propose to perform data augmentation with artificially generated events of the desired type and energy using conditional generative adversarial networks (cGANs), distinguishing classes by energy values. In the paper, we describe a simple algorithm for generating balanced data sets using cGANs. Thus, the proposed neural network model produces both imbalanced data sets for physical analysis and balanced data sets suitable for training other neural networks.
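As an illustration of the augmentation step, the sketch below samples a conditional GAN generator to top up under-represented classes until each reaches a target count. It is a minimal sketch only: the PyTorch architecture, the image size, and the class/energy-bin encoding are assumptions made for this example, not the model described in the paper.

```python
# Illustrative sketch (not the authors' model): sampling a conditional GAN
# generator to augment rare event classes. Architecture, image size, and
# class/energy-bin labels are assumptions made for this example.
import torch
import torch.nn as nn

NOISE_DIM = 64          # latent vector size (assumed)
N_CLASSES = 6           # e.g. 2 particle types x 3 energy bins (assumed)
IMAGE_PIXELS = 30 * 30  # flattened camera image size (assumed)

class ConditionalGenerator(nn.Module):
    """Maps (noise, one-hot condition) -> synthetic shower image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM + N_CLASSES, 256),
            nn.ReLU(),
            nn.Linear(256, 512),
            nn.ReLU(),
            nn.Linear(512, IMAGE_PIXELS),
            nn.Sigmoid(),  # pixel amplitudes scaled to [0, 1]
        )

    def forward(self, noise, labels_onehot):
        return self.net(torch.cat([noise, labels_onehot], dim=1))

def augment_to_balance(generator, counts, target):
    """Generate enough synthetic events so every class reaches `target` counts."""
    generator.eval()
    synthetic = {}
    with torch.no_grad():
        for cls, n in counts.items():
            deficit = max(target - n, 0)
            if deficit == 0:
                continue
            noise = torch.randn(deficit, NOISE_DIM)
            labels = torch.zeros(deficit, N_CLASSES)
            labels[:, cls] = 1.0
            synthetic[cls] = generator(noise, labels)
    return synthetic

# Usage: assumes the generator has already been trained on MC events.
gen = ConditionalGenerator()
fake = augment_to_balance(gen, counts={0: 10_000, 5: 120}, target=10_000)
```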
Abstract: In this work, the ability to select rare VHE gamma rays with neural network methods is investigated in the case when the cosmic-ray flux strongly prevails (by a ratio of up to $10^4$) over the gamma-ray flux from a point source. This ratio holds for the Crab Nebula in the TeV energy range; the Crab is a well-studied source used for the calibration and testing of various methods and installations in gamma-ray astronomy. The part of the TAIGA experiment that includes three Imaging Atmospheric Cherenkov Telescopes also observes this gamma-ray source. Cherenkov telescopes obtain images of Extensive Air Showers. In the standard processing method, these images are analyzed using Hillas parameters; alternatively, the images can be processed with convolutional neural networks. In this work, we describe the main steps and results of gamma/hadron separation for the Crab Nebula observations with neural network methods. The results are compared with the standard processing method applied in the TAIGA collaboration, which is based on Hillas parameter cuts. It is demonstrated that a signal at a level above $5.5\sigma$ was obtained in 21 hours of Crab Nebula observations after processing the experimental data with the neural network method.
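For reference, detection significances of this kind are commonly computed in IACT analyses with the Li & Ma (1983, Eq. 17) formula from ON-source and OFF-source event counts; the sketch below shows that standard calculation with purely illustrative counts, since the abstract does not specify the exact counts or the significance estimator used.

```python
# Illustrative sketch: the Li & Ma (1983, Eq. 17) significance commonly used
# for source detection in IACT analyses. The counts and exposure ratio below
# are made-up example numbers, not the TAIGA Crab Nebula measurement.
import math

def li_ma_significance(n_on: float, n_off: float, alpha: float) -> float:
    """Detection significance from ON-source and OFF-source event counts.

    alpha is the ratio of ON to OFF exposures (t_on / t_off).
    """
    n_tot = n_on + n_off
    term_on = n_on * math.log((1.0 + alpha) / alpha * (n_on / n_tot))
    term_off = n_off * math.log((1.0 + alpha) * (n_off / n_tot))
    return math.sqrt(2.0 * (term_on + term_off))

# Example: 172 ON events vs. 300 OFF events with alpha = 1/3 gives ~5.5 sigma
# (illustrative numbers only).
print(li_ma_significance(n_on=172, n_off=300, alpha=1 / 3))
```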
Abstract: Imaging Atmospheric Cherenkov Telescopes (IACTs) of the TAIGA gamma-ray observatory detect Extensive Air Showers (EASs) originating from the interactions of cosmic rays or gamma rays with the atmosphere. In this way, the telescopes obtain images of the EASs. The ability to separate gamma-ray images from the hadronic cosmic-ray background is one of the main features of this type of detector. However, actual IACT observations require simultaneous observation of the background and the gamma-ray source. This observation mode (called wobbling) modifies the images of events, which affects the quality of selection by neural networks. Thus, in this work, we present the results of applying neural networks (NNs) to the image classification task on Monte Carlo (MC) images of the TAIGA IACTs. The wobbling mode is considered together with the image adaptation needed for adequate analysis by NNs. We also explore several neural network structures that classify events either directly from images or through Hillas parameters extracted from the images. In addition, MC simulation data are used to evaluate the quality of the NN-based selection of rare gamma-ray events, taking into account all necessary image modifications.
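The Hillas parameters mentioned above are weighted moments of the cleaned camera image (total amplitude, centroid, length, width, and related quantities). The sketch below computes these basic moments on a simple rectangular pixel grid; the actual TAIGA camera geometry and the image-cleaning step are not reproduced and are simplifying assumptions of this example.

```python
# Illustrative sketch: computing basic Hillas parameters (size, length, width)
# as weighted image moments on a rectangular pixel grid. The real TAIGA-IACT
# camera geometry and the image-cleaning step are omitted as simplifications.
import numpy as np

def hillas_parameters(image: np.ndarray) -> dict:
    """Return size, centroid, length and width of a 2-D amplitude image."""
    ys, xs = np.indices(image.shape)
    weights = image.astype(float)
    size = weights.sum()                      # total image amplitude
    x_mean = (weights * xs).sum() / size      # centroid coordinates
    y_mean = (weights * ys).sum() / size
    # second central moments (weighted covariance matrix)
    dx, dy = xs - x_mean, ys - y_mean
    cov = np.array([
        [(weights * dx * dx).sum(), (weights * dx * dy).sum()],
        [(weights * dx * dy).sum(), (weights * dy * dy).sum()],
    ]) / size
    eigvals = np.linalg.eigvalsh(cov)         # ascending order
    width, length = np.sqrt(np.maximum(eigvals, 0.0))
    return {"size": size, "cog": (x_mean, y_mean),
            "length": length, "width": width}

# Usage with a toy elliptical blob standing in for a shower image:
img = np.zeros((30, 30))
img[12:18, 8:24] = 1.0
print(hillas_parameters(img))
```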