N-fold Superposition: Improving Neural Networks by Reducing the Noise in Feature Maps

May 03, 2018

Yang Liu, Qiang Qu, Chao Gao

* 7 pages, 5 figures, submitted to ICALIP 2018

**Click to Read Paper and Get Code**

Binary output layer of feedforward neural networks for solving multi-class classification problems

Jan 22, 2018

Sibo Yang, Chao Zhang, Wei Wu

**Click to Read Paper and Get Code**

A Lower Bound Analysis of Population-based Evolutionary Algorithms for Pseudo-Boolean Functions

Jun 10, 2016

Chao Qian, Yang Yu, Zhi-Hua Zhou

Evolutionary algorithms (EAs) are population-based general-purpose optimization algorithms, and have been successfully applied in various real-world optimization tasks. However, previous theoretical studies often employ EAs with only a parent or offspring population and focus on specific problems. Furthermore, they often only show upper bounds on the running time, while lower bounds are also necessary to get a complete understanding of an algorithm. In this paper, we analyze the running time of the ($\mu$+$\lambda$)-EA (a general population-based EA with mutation only) on the class of pseudo-Boolean functions with a unique global optimum. By applying the recently proposed switch analysis approach, we prove the lower bound $\Omega(n \ln n+ \mu + \lambda n\ln\ln n/ \ln n)$ for the first time. Particularly on the two widely-studied problems, OneMax and LeadingOnes, the derived lower bound discloses that the ($\mu$+$\lambda$)-EA will be strictly slower than the (1+1)-EA when the population size $\mu$ or $\lambda$ is above a moderate order. Our results imply that the increase of population size, while usually desired in practice, bears the risk of increasing the lower bound of the running time and thus should be carefully considered.

**Click to Read Paper and Get Code**
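The algorithm analyzed above can be sketched in a few lines. The following is a minimal, illustrative Python implementation of a ($\mu$+$\lambda$)-EA with bit-wise mutation only, run on OneMax; the uniform random initialization, uniform parent selection, and truncation survivor selection are simplifying assumptions for illustration, not necessarily the paper's exact setup.

```python
import random

def onemax(x):
    # Fitness: number of one-bits; the optimum is the all-ones string.
    return sum(x)

def mutate(x):
    # Standard bit-wise mutation: flip each bit independently with prob. 1/n.
    n = len(x)
    return [b ^ (random.random() < 1.0 / n) for b in x]

def mu_plus_lambda_ea(n, mu, lam, max_evals=200_000):
    """(mu+lambda)-EA sketch: mu parents, lam offspring per generation,
    mutation only, truncation survivor selection."""
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(mu)]
    evals = mu
    while max(map(onemax, pop)) < n and evals < max_evals:
        offspring = [mutate(random.choice(pop)) for _ in range(lam)]
        pop = sorted(pop + offspring, key=onemax)[-mu:]
        evals += lam
    return max(map(onemax, pop)), evals
```

Comparing the returned evaluation counts for, say, `mu_plus_lambda_ea(100, 1, 1)` against larger $\mu$ or $\lambda$ gives an empirical feel for the slowdown that the derived lower bound formalizes.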

A Parallel Way to Select the Parameters of SVM Based on the Ant Optimization Algorithm

May 20, 2014

Chao Zhang, Hong-cen Mei, Hao Yang

A large body of experimental evidence shows that the Support Vector Machine (SVM) algorithm has clear advantages in text classification, handwriting recognition, image classification, bioinformatics, and other fields. To a large degree, the performance of an SVM depends on its kernel function and slack variable, which are determined by the parameters $\delta$ and $c$ in the classification function. In other words, optimizing these two parameters plays a major role in optimizing the SVM algorithm. Ant Colony Optimization (ACO) is an optimization algorithm that simulates how ants find optimal paths. Building on the available literature, we combine the ACO algorithm with a parallel algorithm to find good parameters.

* 3 pages, 2 figures, 2 tables

**Click to Read Paper and Get Code**
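The idea of parallel, colony-style search over the two SVM parameters can be sketched as below. This is a heavily simplified illustration, not the paper's method: `cv_accuracy` is a hypothetical placeholder standing in for SVM cross-validation accuracy over $(\delta, c)$, and the "pheromone" model is reduced to Gaussian sampling around the current best with an evaporation-style shrinking width, evaluated in parallel with a thread pool.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def cv_accuracy(delta, c):
    # Placeholder objective standing in for SVM cross-validation accuracy;
    # artificially peaked near (delta, c) = (1.0, 10.0) for demonstration.
    return 1.0 / (1.0 + (delta - 1.0) ** 2 + (c - 10.0) ** 2 / 100.0)

def aco_search(n_ants=20, n_iters=15, seed=0):
    """ACO-flavored parameter search: each iteration, a colony of ants
    samples (delta, c) around the best-so-far; candidates are scored in
    parallel, and the sampling width shrinks like pheromone evaporation."""
    rng = random.Random(seed)
    best = (rng.uniform(0.1, 5.0), rng.uniform(0.1, 100.0))
    best_score = cv_accuracy(*best)
    width_d, width_c = 2.0, 40.0          # initial sampling spread
    with ThreadPoolExecutor() as pool:
        for _ in range(n_iters):
            ants = [(max(1e-3, rng.gauss(best[0], width_d)),
                     max(1e-3, rng.gauss(best[1], width_c)))
                    for _ in range(n_ants)]
            scores = list(pool.map(lambda p: cv_accuracy(*p), ants))
            i = max(range(n_ants), key=scores.__getitem__)
            if scores[i] > best_score:
                best, best_score = ants[i], scores[i]
            width_d *= 0.8                 # evaporation: narrow the search
            width_c *= 0.8
    return best, best_score
```

In a real run, `cv_accuracy` would train and cross-validate an SVM (e.g. an RBF kernel with width $\delta$ and penalty $c$) for each candidate pair, which is exactly the expensive step that parallel evaluation amortizes.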

Analyzing Evolutionary Optimization in Noisy Environments

Nov 20, 2013

Chao Qian, Yang Yu, Zhi-Hua Zhou

Many optimization tasks have to be handled in noisy environments, where we cannot obtain the exact evaluation of a solution but only a noisy one. For noisy optimization tasks, evolutionary algorithms (EAs), a kind of stochastic metaheuristic search algorithm, have been widely and successfully applied. Previous work mainly focuses on empirically studying and designing EAs for noisy optimization, while the theoretical counterpart has received little attention. In this paper, we investigate a largely ignored question, i.e., whether an optimization problem always becomes harder for EAs in a noisy environment. We prove that the answer is negative, with respect to the measure of expected running time. The result implies that, for optimization tasks that are already quite hard to solve, the noise may not have a negative effect, while the easier a task is, the more negatively it is affected by noise. On a representative problem where the noise has a strong negative effect, we examine two mechanisms commonly employed in EAs to deal with noise: the re-evaluation and the threshold selection strategies. The analysis discloses that, however, neither strategy is effective, i.e., they do not make the EA more noise-tolerant. We then find that a small modification of threshold selection allows it to be proven an effective strategy for dealing with the noise in this problem.

**Click to Read Paper and Get Code**
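The threshold selection mechanism discussed above is simple to state in code. The sketch below shows a (1+1)-EA on OneMax with additive Gaussian noise, accepting an offspring only when its noisy fitness beats the parent's noisy fitness by at least a threshold `tau`. This is purely illustrative of the mechanism: per the abstract, plain threshold selection is proven not to make the EA more noise-tolerant in general, and the particular noise model and parameters here are assumptions, not the paper's setting.

```python
import random

def noisy_onemax(x, sigma=1.0):
    # Noisy evaluation: the true OneMax value plus Gaussian noise.
    return sum(x) + random.gauss(0.0, sigma)

def one_plus_one_ea_threshold(n, tau=1.0, sigma=1.0, max_evals=50_000):
    """(1+1)-EA with threshold selection: both parent and offspring are
    re-evaluated each generation, and the offspring replaces the parent
    only if its noisy fitness is better by at least tau."""
    x = [random.randint(0, 1) for _ in range(n)]
    evals = 0
    while sum(x) < n and evals < max_evals:
        y = [b ^ (random.random() < 1.0 / n) for b in x]   # bit-wise mutation
        if noisy_onemax(y, sigma) - noisy_onemax(x, sigma) >= tau:
            x = y
        evals += 2  # two noisy evaluations per generation
    return sum(x), evals
```

Setting `tau=0` recovers the plain re-evaluation strategy, so the two mechanisms examined in the paper can be compared within this one sketch.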

Towards Analyzing Crossover Operators in Evolutionary Search via General Markov Chain Switching Theorem

Apr 26, 2012

Yang Yu, Chao Qian, Zhi-Hua Zhou

**Click to Read Paper and Get Code**

Coherent Semantic Attention for Image Inpainting

Jun 12, 2019

Hongyu Liu, Bin Jiang, Yi Xiao, Chao Yang

**Click to Read Paper and Get Code**

Online Learning with Diverse User Preferences

Feb 04, 2019

Chao Gan, Jing Yang, Ruida Zhou, Cong Shen

* 10 pages, 3 figures

**Click to Read Paper and Get Code**

Perspective-Aware CNN For Crowd Counting

Jul 05, 2018

Miaojing Shi, Zhaohui Yang, Chao Xu, Qijun Chen

**Click to Read Paper and Get Code**

Deep Transfer Network with Joint Distribution Adaptation: A New Intelligent Fault Diagnosis Framework for Industry Application

Apr 18, 2018

Te Han, Chao Liu, Wenguang Yang, Dongxiang Jiang

* 10 pages, 10 figures

**Click to Read Paper and Get Code**

Cost-Aware Learning and Optimization for Opportunistic Spectrum Access

Apr 11, 2018

Chao Gan, Ruida Zhou, Jing Yang, Cong Shen

* 12 pages, 6 figures

**Click to Read Paper and Get Code**

Learning a No-Reference Quality Metric for Single-Image Super-Resolution

Dec 18, 2016

Chao Ma, Chih-Yuan Yang, Xiaokang Yang, Ming-Hsuan Yang

Numerous single-image super-resolution algorithms have been proposed in the literature, but few studies address the problem of performance evaluation based on visual perception. While most super-resolution images are evaluated by full-reference metrics, their effectiveness is not clear and the required ground-truth images are not always available in practice. To address these problems, we conduct human subject studies using a large set of super-resolution images and propose a no-reference metric learned from visual perceptual scores. Specifically, we design three types of low-level statistical features in both spatial and frequency domains to quantify super-resolved artifacts, and learn a two-stage regression model to predict the quality scores of super-resolution images without referring to ground-truth images. Extensive experimental results show that the proposed metric is effective and efficient for assessing the quality of super-resolution images based on human perception.

* Accepted by Computer Vision and Image Understanding

**Click to Read Paper and Get Code**
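The two-stage regression idea from the abstract can be illustrated with a toy sketch: stage one fits a separate regressor from each feature group to the perceptual scores, and stage two combines the per-group predictions into a final quality score. Everything here is a simplifying assumption for illustration — the paper's actual feature types and regression models are not reproduced; one-dimensional least squares and a plain average stand in for them.

```python
def fit_linear(xs, ys):
    # One-dimensional least squares: returns (slope, intercept).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) or 1e-12
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return b, my - b * mx

def two_stage_predict(feature_groups, scores, query):
    """Stage 1: regress the perceptual score on each feature group
    separately. Stage 2: combine (here, average) the per-group
    predictions into one no-reference quality score for `query`."""
    preds = []
    for group, q in zip(feature_groups, query):
        b, a = fit_linear(group, scores)
        preds.append(b * q + a)
    return sum(preds) / len(preds)
```

In the paper's setting, each "feature group" would be one of the three types of low-level statistical features (spatial or frequency domain), and the combiner would itself be learned rather than a fixed average.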

Exact Hybrid Covariance Thresholding for Joint Graphical Lasso

Jun 18, 2015

Qingming Tang, Chao Yang, Jian Peng, Jinbo Xu

**Click to Read Paper and Get Code**

Low-rank SIFT: An Affine Invariant Feature for Place Recognition

Aug 07, 2014

Chao Yang, Shengnan Caih, Jingdong Wang, Long Quan

**Click to Read Paper and Get Code**

Apply Local Clustering Method to Improve the Running Speed of Ant Colony Optimization

Jul 07, 2009

Chao-Yang Pang, Wei Hu, Xia Li, Be-Qiong Hu

* 21 pages, 5 figures

**Click to Read Paper and Get Code**

Multi-objective Pruning for CNNs using Genetic Algorithm

Jun 02, 2019

Chuanguang Yang, Zhulin An, Chao Li, Boyu Diao, Yongjun Xu

* 6 pages, 4 figures, under review as a conference paper at ICANN 2019

**Click to Read Paper and Get Code**