Gil Shomron

Post-Training Sparsity-Aware Quantization
May 23, 2021
Gil Shomron, Freddy Gabbay, Samer Kurzum, Uri Weiser

Post-Training BatchNorm Recalibration
Oct 12, 2020
Gil Shomron, Uri Weiser

Non-Blocking Simultaneous Multithreading: Embracing the Resiliency of Deep Neural Networks
Apr 17, 2020
Gil Shomron, Uri Weiser

Robust Quantization: One Model to Rule Them All
Feb 18, 2020
Moran Shkolnik, Brian Chmiel, Ron Banner, Gil Shomron, Yuri Nahshan, Alex Bronstein, Uri Weiser

Thanks for Nothing: Predicting Zero-Valued Activations with Lightweight Convolutional Neural Networks
Sep 17, 2019
Gil Shomron, Ron Banner, Moran Shkolnik, Uri Weiser

Exploiting Spatial Correlation in Convolutional Neural Networks for Activation Value Prediction
Jul 21, 2018
Gil Shomron, Uri Weiser
