Signal-background classification is a central problem in high-energy physics and plays a major role in the discovery of new fundamental particles. A recent method, the parametric neural network (pNN), leverages the signal mass hypothesis as an additional input feature to effectively replace a whole set of individual classifiers, each of which (in principle) provides the best response for its corresponding mass hypothesis. In this work we aim to deepen the understanding of pNNs in light of real-world usage. We uncover several peculiarities of parametric networks and provide intuition, metrics, and guidelines for them. We further propose an alternative parametrization scheme, resulting in a new parametrized neural network architecture, the AffinePNN, along with other generally applicable improvements such as a balanced training procedure. Finally, we extensively and empirically evaluate our models on the HEPMASS dataset, along with its imbalanced version (called HEPMASS-IMB), which we provide here for the first time to further validate our approach. Results are reported in terms of the impact of the proposed design decisions, classification performance, and interpolation capability.
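The core idea described above can be illustrated with a minimal NumPy sketch: the signal mass hypothesis is concatenated to the event features, so a single network can be evaluated under any mass point instead of training one classifier per mass. The toy weights, layer sizes, and normalized mass values below are hypothetical, purely for illustration, and do not reproduce the paper's trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

def pnn_forward(x, mass, W1, b1, W2, b2):
    """Toy parametric-network forward pass: the mass hypothesis is
    appended to each event's features as an extra input column."""
    m = np.full((x.shape[0], 1), mass)         # broadcast mass to all events
    z = np.concatenate([x, m], axis=1)         # features + mass hypothesis
    h = np.maximum(0.0, z @ W1 + b1)           # ReLU hidden layer
    logits = h @ W2 + b2
    return 1.0 / (1.0 + np.exp(-logits))       # sigmoid signal probability

n_features = 4                                  # hypothetical event features
W1 = rng.normal(size=(n_features + 1, 8))       # +1 input for the mass
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))
b2 = np.zeros(1)

x = rng.normal(size=(3, n_features))            # three toy events
# the same network scores the same events under two mass hypotheses
# (masses assumed pre-normalized, e.g. 500 GeV -> 0.5, 1000 GeV -> 1.0)
p_low = pnn_forward(x, 0.5, W1, b1, W2, b2)
p_high = pnn_forward(x, 1.0, W1, b1, W2, b2)
```

Because the mass enters as an ordinary input, the network can also be queried at intermediate masses, which is the interpolation capability the abstract refers to.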

Anzalone L., Diotalevi T., Bonacorsi D. (2022). Improving parametric neural networks for high-energy physics (and beyond). Machine Learning: Science and Technology, 3(3), 1-20 [10.1088/2632-2153/ac917c].

Improving parametric neural networks for high-energy physics (and beyond)

Anzalone L. (first author): Methodology
Diotalevi T. (second author): Data Curation
Bonacorsi D. (last author): Supervision
2022

Files in this item:
File: Anzalone_2022_Mach._Learn.__Sci._Technol._3_035017.pdf
Access: open access
Type: publisher's version (PDF)
License: Open Access license, Creative Commons Attribution (CC BY)
Size: 1.15 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/914554
Citations
  • Scopus: 2
  • Web of Science: 2