Zohar Ringel

Wilsonian Renormalization of Neural Network Gaussian Processes

May 09, 2024
Jessica N. Howard, Ro Jefferson, Anindita Maiti, Zohar Ringel

Towards Understanding Inductive Bias in Transformers: A View From Infinity

Feb 07, 2024
Itay Lavie, Guy Gur-Ari, Zohar Ringel

Droplets of Good Representations: Grokking as a First Order Phase Transition in Two Layer Networks

Oct 05, 2023
Noa Rubin, Inbar Seroussi, Zohar Ringel

Speed Limits for Deep Learning

Jul 27, 2023
Inbar Seroussi, Alexander A. Alemi, Moritz Helias, Zohar Ringel

Spectral-Bias and Kernel-Task Alignment in Physically Informed Neural Networks

Jul 12, 2023
Inbar Seroussi, Asaf Miron, Zohar Ringel

Separation of scales and a thermodynamic description of feature learning in some CNNs

Dec 31, 2021
Inbar Seroussi, Zohar Ringel

A self consistent theory of Gaussian Processes captures feature learning effects in finite CNNs

Jun 08, 2021
Gadi Naveh, Zohar Ringel

Predicting the outputs of finite networks trained with noisy gradients

Apr 02, 2020
Gadi Naveh, Oded Ben-David, Haim Sompolinsky, Zohar Ringel

Learning Curves for Deep Neural Networks: A Gaussian Field Theory Perspective

Jun 12, 2019
Omry Cohen, Or Malka, Zohar Ringel

The role of a layer in deep neural networks: a Gaussian Process perspective

Feb 06, 2019
Oded Ben-David, Zohar Ringel
