
Michael Murray

Benign overfitting in leaky ReLU networks with moderate input dimension

Mar 11, 2024
Kedar Karhadkar, Erin George, Michael Murray, Guido Montúfar, Deanna Needell

Training shallow ReLU networks on noisy data using hinge loss: when do we overfit and is it benign?

Jun 16, 2023
Erin George, Michael Murray, William Swartworth, Deanna Needell

Mildly Overparameterized ReLU Networks Have a Favorable Loss Landscape

May 31, 2023
Kedar Karhadkar, Michael Murray, Hanna Tseran, Guido Montúfar

Characterizing the Spectrum of the NTK via a Power Series Expansion

Nov 15, 2022
Michael Murray, Hui Jin, Benjamin Bowman, Guido Montúfar

Activation function design for deep networks: linearity and effective initialisation

May 17, 2021
Michael Murray, Vinayak Abrol, Jared Tanner

The Permuted Striped Block Model and its Factorization -- Algorithms with Recovery Guarantees

Apr 10, 2020
Michael Murray, Jared Tanner

Vision-and-Dialog Navigation

Jul 19, 2019
Jesse Thomason, Michael Murray, Maya Cakmak, Luke Zettlemoyer

Towards an understanding of CNNs: analysing the recovery of activation pathways via Deep Convolutional Sparse Coding

Jun 26, 2018
Michael Murray, Jared Tanner