Models, code, and papers for "Stefano Carrazza":

Lund jet images from generative and cycle-consistent adversarial networks

Sep 03, 2019
Stefano Carrazza, Frédéric A. Dreyer

We introduce a generative model to simulate radiation patterns within a jet using the Lund jet plane. We show that using an appropriate neural network architecture with a stochastic generation of images, it is possible to construct a generative model which retrieves the underlying two-dimensional distribution to within a few percent. We compare our model with several alternative state-of-the-art generative techniques. Finally, we show how a mapping can be created between different categories of jets, and use this method to retroactively change simulation settings or the underlying process on an existing sample. These results provide a framework for significantly reducing simulation times through fast inference of the neural network as well as for data augmentation of physical measurements.

* 11 pages, 16 figures, code available at https://github.com/JetsGame/gLund and https://github.com/JetsGame/CycleJet 
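
As a rough, generic illustration of the generative setup described in the abstract (only the plain GAN part, not the cycle-consistent mapping between jet categories), the sketch below trains a minimal GAN on toy sparse binary images standing in for Lund jet images. It is not the gLund or CycleJet architecture: the network sizes, the toy data and the training schedule are placeholder assumptions, and PyTorch is used only for brevity.

    # Minimal GAN sketch on toy binary "Lund-plane-like" images.
    # All hyperparameters and the toy data are illustrative placeholders.
    import torch
    import torch.nn as nn

    latent_dim, img_dim = 64, 24 * 24   # toy sizes, not the paper's choices
    G = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                      nn.Linear(128, img_dim), nn.Sigmoid())        # noise -> image pixels in [0, 1]
    D = nn.Sequential(nn.Linear(img_dim, 128), nn.LeakyReLU(0.2),
                      nn.Linear(128, 1), nn.Sigmoid())              # image -> probability of being real
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    bce = nn.BCELoss()

    real = (torch.rand(512, img_dim) < 0.05).float()  # toy sparse images standing in for real data

    for step in range(200):
        # Discriminator update on a real batch and a generated (fake) batch
        idx = torch.randint(0, len(real), (64,))
        z = torch.randn(64, latent_dim)
        fake = G(z).detach()
        loss_d = bce(D(real[idx]), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
        opt_d.zero_grad()
        loss_d.backward()
        opt_d.step()
        # Generator update: try to make the discriminator label fakes as real
        z = torch.randn(64, latent_dim)
        loss_g = bce(D(G(z)), torch.ones(64, 1))
        opt_g.zero_grad()
        loss_g.backward()
        opt_g.step()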

Jet grooming through reinforcement learning

Mar 22, 2019
Stefano Carrazza, Frédéric A. Dreyer

We introduce a novel implementation of a reinforcement learning (RL) algorithm designed to find an optimal jet grooming strategy, a critical tool for collider experiments. The RL agent is trained with a reward function constructed to optimize the resulting jet properties, using both signal and background samples in a simultaneous multi-level training. We show that the grooming algorithm derived from the deep RL agent can match state-of-the-art techniques used at the Large Hadron Collider, resulting in improved mass resolution for boosted objects. Given a suitable reward function, the agent learns a policy that optimally removes soft wide-angle radiation, allowing for a modular grooming technique that can be applied in a wide range of contexts. These results are accessible through the corresponding GroomRL framework.

* 10 pages, 10 figures, code available at https://github.com/JetsGame/GroomRL 
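
As a toy illustration of the reinforcement-learning idea, the sketch below runs tabular Q-learning on an invented grooming environment: the state is a discretized (kt, angle) bin of a branching and the action is keep or remove. The environment, reward and discretization are made up for this sketch and are unrelated to the actual GroomRL reward function or training setup.

    # Toy tabular Q-learning sketch: an agent decides, branching by branching,
    # whether to keep or drop radiation. Everything here is a placeholder.
    import numpy as np

    rng = np.random.default_rng(0)
    n_kt, n_angle, n_actions = 10, 10, 2      # discretized state space; actions: 0 = keep, 1 = remove
    Q = np.zeros((n_kt, n_angle, n_actions))  # tabular action-value estimates
    alpha, eps = 0.1, 0.1                     # learning rate and exploration probability

    def toy_branching():
        """Return a discretized (kt, angle) bin plus a hidden 'soft wide-angle' flag."""
        kt, angle = rng.integers(n_kt), rng.integers(n_angle)
        soft_wide = (kt < 3) and (angle > 6)  # toy stand-in for background-like radiation
        return kt, angle, soft_wide

    for episode in range(5000):
        for _ in range(10):                   # a few branchings per toy "jet"
            kt, angle, soft_wide = toy_branching()
            a = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(Q[kt, angle]))
            # Toy reward: +1 for removing soft wide-angle branchings or keeping hard ones
            reward = 1.0 if (a == 1) == soft_wide else -1.0
            # One-step update; each grooming decision is treated as terminal here
            Q[kt, angle, a] += alpha * (reward - Q[kt, angle, a])

    groom_policy = np.argmax(Q, axis=-1)      # 1 where the agent has learned to groom away the branching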

Sampling the Riemann-Theta Boltzmann Machine

Apr 20, 2018
Stefano Carrazza, Daniel Krefl

We show that the visible sector probability density function of the Riemann-Theta Boltzmann machine corresponds to a Gaussian mixture model consisting of an infinite number of component multivariate Gaussians. The weights of the mixture are given by a discrete multivariate Gaussian over the hidden state space. This allows us to sample the visible sector density function in a straightforward manner. Furthermore, we show that the visible sector probability density function possesses an affine transform property, similar to the multivariate Gaussian density.

* 8 pages, 6 figures 
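
The two-step sampling strategy implied by this mixture picture can be sketched generically: draw an integer hidden state from a (truncated, enumerable) discrete Gaussian, then draw the visible vector from the corresponding multivariate Gaussian component. In the sketch below, the toy parameters, the truncation box and the mean/weight conventions are placeholders and do not reproduce the paper's exact reparameterization.

    # Two-step sampling of a toy Gaussian mixture whose weights follow a
    # (truncated) discrete Gaussian over integer hidden states.
    import numpy as np
    from itertools import product

    rng = np.random.default_rng(0)

    # Toy model: 2 visible units, 2 integer-valued hidden units (placeholder matrices)
    T = np.array([[2.0, 0.3], [0.3, 1.5]])    # visible-sector precision (positive definite)
    Q = np.array([[3.0, 0.5], [0.5, 3.0]])    # hidden-sector quadratic form
    W = np.array([[0.8, -0.4], [0.2, 0.6]])   # visible-hidden coupling

    # 1) Mixture weights: a discrete Gaussian over integer hidden states,
    #    truncated to a finite box so it can be enumerated and normalized.
    states = np.array(list(product(range(-5, 6), repeat=2)))
    log_w = np.array([-0.5 * h @ Q @ h for h in states])
    weights = np.exp(log_w - log_w.max())
    weights /= weights.sum()

    # 2) Given a hidden state h, the visible units are multivariate Gaussian,
    #    here with covariance inv(T) and a mean shifted by the coupling W h (toy convention).
    T_inv = np.linalg.inv(T)

    def sample_visible(n):
        idx = rng.choice(len(states), size=n, p=weights)     # draw the mixture component
        means = -(T_inv @ W @ states[idx].T).T               # component means
        return np.array([rng.multivariate_normal(m, T_inv) for m in means])

    samples = sample_visible(1000)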

Modelling conditional probabilities with Riemann-Theta Boltzmann Machines

May 27, 2019
Stefano Carrazza, Daniel Krefl, Andrea Papaluca

The probability density function for the visible sector of a Riemann-Theta Boltzmann machine can be taken conditional on a subset of the visible units. We show that the corresponding conditional density function is given by a reparameterization of the Riemann-Theta Boltzmann machine modelling the original probability density function. Therefore, the conditional densities can be inferred directly from the Riemann-Theta Boltzmann machine.

* 7 pages, 3 figures, in proceedings of the 19th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2019) 
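
For intuition, and only as a generic analogue, the sketch below conditions an ordinary two-component Gaussian mixture on one of its variables: each component is conditioned with the standard Gaussian formulae, and the mixture weights are reweighted by the components' marginal likelihoods of the observed value. The paper's result is stronger, since for the Riemann-Theta Boltzmann machine the conditional is again a machine of the same family with reparameterized parameters; the numbers below are placeholders.

    # Conditioning a toy two-component Gaussian mixture over (v1, v2) on v1.
    import numpy as np
    from scipy.stats import norm

    weights = np.array([0.6, 0.4])
    means = [np.array([0.0, 0.0]), np.array([2.0, 1.0])]
    covs = [np.eye(2), np.array([[1.0, 0.5], [0.5, 1.5]])]

    def conditional_on_v1(v1):
        """Return weights, means and variances of p(v2 | v1) as a one-dimensional mixture."""
        new_w, new_m, new_s = [], [], []
        for w, m, c in zip(weights, means, covs):
            # Standard Gaussian conditioning of one component on v1
            m2 = m[1] + c[1, 0] / c[0, 0] * (v1 - m[0])
            s2 = c[1, 1] - c[1, 0] ** 2 / c[0, 0]
            # The component weight picks up the marginal likelihood of the observed v1
            new_w.append(w * norm.pdf(v1, loc=m[0], scale=np.sqrt(c[0, 0])))
            new_m.append(m2)
            new_s.append(s2)
        new_w = np.array(new_w)
        return new_w / new_w.sum(), np.array(new_m), np.array(new_s)

    w_c, m_c, s_c = conditional_on_v1(1.0)   # mixture parameters of p(v2 | v1 = 1)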

Riemann-Theta Boltzmann Machine

Apr 06, 2018
Daniel Krefl, Stefano Carrazza, Babak Haghighat, Jens Kahlen

A general Boltzmann machine with continuous visible and discrete, integer-valued hidden states is introduced. Under mild assumptions about the connection matrices, the probability density function of the visible units can be solved for analytically, yielding a novel parametric density function involving a ratio of Riemann-Theta functions. The conditional expectation of a hidden state for given visible states can also be calculated analytically, yielding a derivative of the logarithmic Riemann-Theta function. The conditional expectation can be used as an activation function in a feedforward neural network, thereby increasing the modelling capacity of the network. Both the Boltzmann machine and the derived feedforward neural network can be successfully trained via standard gradient- and non-gradient-based optimization techniques.

* 25 pages, 11 figures, typos corrected 
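
For intuition about the special function involved, the sketch below evaluates the Riemann-Theta function, theta(z, Omega) = sum over integer vectors n of exp(2*pi*i*(n.Omega.n/2 + n.z)), by brute-force truncation of the lattice sum. The truncation radius and the example Omega and z are arbitrary choices; a practical implementation (such as the one needed to train the machine) requires controlled truncation and gradients, which this sketch does not provide.

    # Brute-force truncated lattice sum for the Riemann-Theta function.
    import numpy as np
    from itertools import product

    def riemann_theta(z, Omega, radius=6):
        """Truncated sum for theta(z, Omega); Im(Omega) must be positive definite."""
        g = len(z)
        total = 0.0 + 0.0j
        for n in product(range(-radius, radius + 1), repeat=g):
            n = np.array(n)
            total += np.exp(2j * np.pi * (0.5 * n @ Omega @ n + n @ z))
        return total

    # Toy 2-dimensional example with purely imaginary Omega, so the sum converges quickly
    Omega = 1j * np.array([[2.0, 0.3], [0.3, 1.5]])
    z = np.array([0.1 + 0.2j, -0.3 + 0.1j])
    print(riemann_theta(z, Omega))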

Towards hardware acceleration for parton densities estimation

Sep 23, 2019
Stefano Carrazza, Juan Cruz-Martinez, Jesús Urtasun-Elizari, Emilio Villa

In these proceedings we describe the computational challenges associated with the determination of parton distribution functions (PDFs). We compare the performance of the convolution of parton distributions with matrix elements using different hardware instructions. We quantify and identify the most promising data-model configurations for increasing PDF fitting performance when adapting the current code frameworks to hardware accelerators such as graphics processing units (GPUs).

* 6 pages, 2 figures, 3 tables, in proceedings of PHOTON 2019 
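
As a toy picture of the workload being optimized, the sketch below writes a Monte Carlo estimate of a hadronic convolution, sigma ~ integral dx1 dx2 f(x1) f(x2) |M|^2, as a single batched element-wise product and reduction, which is the kind of kernel that maps naturally onto vector instructions or a GPU. The "PDF" and "matrix element" functions are invented stand-ins, not physical parametrizations, and the code is unrelated to the actual fitting frameworks.

    # Vectorized Monte Carlo estimate of a toy hadronic convolution.
    import numpy as np

    rng = np.random.default_rng(0)

    def toy_pdf(x):
        """Stand-in parton distribution: falls steeply at large x."""
        return (1.0 - x) ** 3 / np.maximum(x, 1e-6)

    def toy_matrix_element(x1, x2, s=13000.0 ** 2):
        """Stand-in |M|^2, depending only on the partonic invariant mass."""
        shat = x1 * x2 * s
        return 1.0 / (shat + 1e4)

    n = 1_000_000
    x1, x2 = rng.uniform(1e-3, 1.0, size=(2, n))   # batched sampling of momentum fractions

    # The whole convolution is one element-wise product plus a reduction:
    # exactly the data-parallel pattern that benefits from SIMD units or GPUs.
    integrand = toy_pdf(x1) * toy_pdf(x2) * toy_matrix_element(x1, x2)
    sigma_estimate = (1.0 - 1e-3) ** 2 * integrand.mean()
    print(sigma_estimate)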

  Click for Model/Code and Paper
Machine Learning in High Energy Physics Community White Paper

Jul 08, 2018
Kim Albertsson, Piero Altoe, Dustin Anderson, Michael Andrews, Juan Pedro Araque Espinosa, Adam Aurisano, Laurent Basara, Adrian Bevan, Wahid Bhimji, Daniele Bonacorsi, Paolo Calafiura, Mario Campanelli, Louis Capps, Federico Carminati, Stefano Carrazza, Taylor Childers, Elias Coniavitis, Kyle Cranmer, Claire David, Douglas Davis, Javier Duarte, Martin Erdmann, Jonas Eschle, Amir Farbin, Matthew Feickert, Nuno Filipe Castro, Conor Fitzpatrick, Michele Floris, Alessandra Forti, Jordi Garra-Tico, Jochen Gemmler, Maria Girone, Paul Glaysher, Sergei Gleyzer, Vladimir Gligorov, Tobias Golling, Jonas Graw, Lindsey Gray, Dick Greenwood, Thomas Hacker, John Harvey, Benedikt Hegner, Lukas Heinrich, Ben Hooberman, Johannes Junggeburth, Michael Kagan, Meghan Kane, Konstantin Kanishchev, Przemysław Karpiński, Zahari Kassabov, Gaurav Kaul, Dorian Kcira, Thomas Keck, Alexei Klimentov, Jim Kowalkowski, Luke Kreczko, Alexander Kurepin, Rob Kutschke, Valentin Kuznetsov, Nicolas Köhler, Igor Lakomov, Kevin Lannon, Mario Lassnig, Antonio Limosani, Gilles Louppe, Aashrita Mangu, Pere Mato, Narain Meenakshi, Helge Meinhard, Dario Menasce, Lorenzo Moneta, Seth Moortgat, Mark Neubauer, Harvey Newman, Hans Pabst, Michela Paganini, Manfred Paulini, Gabriel Perdue, Uzziel Perez, Attilio Picazio, Jim Pivarski, Harrison Prosper, Fernanda Psihas, Alexander Radovic, Ryan Reece, Aurelius Rinkevicius, Eduardo Rodrigues, Jamal Rorie, David Rousseau, Aaron Sauers, Steven Schramm, Ariel Schwartzman, Horst Severini, Paul Seyfert, Filip Siroky, Konstantin Skazytkin, Mike Sokoloff, Graeme Stewart, Bob Stienen, Ian Stockdale, Giles Strong, Savannah Thais, Karen Tomko, Eli Upfal, Emanuele Usai, Andrey Ustyuzhanin, Martin Vala, Sofia Vallecorsa, Mauro Verzetti, Xavier Vilasís-Cardona, Jean-Roch Vlimant, Ilija Vukotic, Sean-Jiun Wang, Gordon Watts, Michael Williams, Wenjing Wu, Stefan Wunsch, Omar Zapata

Machine learning is an important research area in particle physics, beginning with applications to high-level physics analysis in the 1990s and 2000s, followed by an explosion of applications in particle and event identification and reconstruction in the 2010s. In this document we discuss promising future research and development areas in machine learning in particle physics with a roadmap for their implementation, software and hardware resource requirements, collaborative initiatives with the data science community, academia and industry, and training the particle physics community in data science. The main objective of the document is to connect and motivate these areas of research and development with the physics drivers of the High-Luminosity Large Hadron Collider and future neutrino experiments and identify the resource needs for their implementation. Additionally we identify areas where collaboration with external communities will be of great benefit.

* Editors: Sergei Gleyzer, Paul Seyfert and Steven Schramm 
