N-fold Superposition: Improving Neural Networks by Reducing the Noise in Feature Maps

May 03, 2018

Yang Liu, Qiang Qu, Chao Gao

Binary output layer of feedforward neural networks for solving multi-class classification problems

Jan 22, 2018

Sibo Yang, Chao Zhang, Wei Wu

Considered in this short note is the design of the output-layer nodes of feedforward neural networks for solving multi-class classification problems with $r$ ($\geq 3$) classes of samples. The common and conventional setting of the output layer, called the "one-to-one approach" in this paper, is as follows: the output layer contains $r$ output nodes corresponding to the $r$ classes, and for an input sample of the $i$-th class, the ideal output is 1 for the $i$-th output node and 0 for all the other output nodes. We propose a new "binary approach": suppose $r \in (2^{q-1}, 2^q]$ with $q \geq 2$; then we let the output layer contain $q$ output nodes, with the ideal outputs for the $r$ classes designed in a binary manner. Numerical experiments carried out in this paper show that our binary approach does an equally good job as, but uses fewer output nodes than, the traditional one-to-one approach.
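The binary target encoding the abstract describes can be sketched in a few lines. This is our illustration, not the paper's code; the function name is ours, and `q` is taken as $\lceil \log_2 r \rceil$ (with a floor of 2) so that $r \in (2^{q-1}, 2^q]$:

```python
import math

def binary_targets(r):
    """Ideal output vectors for r classes under the binary approach:
    q = ceil(log2(r)) output nodes, class i encoded as the q binary digits of i."""
    q = max(2, math.ceil(math.log2(r)))
    return [[(i >> b) & 1 for b in reversed(range(q))] for i in range(r)]

# With r = 5 classes, q = 3 output nodes suffice (one-to-one would need 5):
print(binary_targets(5))
# [[0, 0, 0], [0, 0, 1], [0, 1, 0], [0, 1, 1], [1, 0, 0]]
```

For $r = 5 \in (4, 8]$ this yields $q = 3$ nodes instead of the 5 nodes the one-to-one approach would use.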

A Lower Bound Analysis of Population-based Evolutionary Algorithms for Pseudo-Boolean Functions

Jun 10, 2016

Chao Qian, Yang Yu, Zhi-Hua Zhou

Evolutionary algorithms (EAs) are population-based general-purpose optimization algorithms, and have been successfully applied in various real-world optimization tasks. However, previous theoretical studies often employ EAs with only a parent or offspring population and focus on specific problems. Furthermore, they often only show upper bounds on the running time, while lower bounds are also necessary to get a complete understanding of an algorithm. In this paper, we analyze the running time of the ($\mu$+$\lambda$)-EA (a general population-based EA with mutation only) on the class of pseudo-Boolean functions with a unique global optimum. By applying the recently proposed switch analysis approach, we prove the lower bound $\Omega(n \ln n+ \mu + \lambda n\ln\ln n/ \ln n)$ for the first time. Particularly on the two widely-studied problems, OneMax and LeadingOnes, the derived lower bound discloses that the ($\mu$+$\lambda$)-EA will be strictly slower than the (1+1)-EA when the population size $\mu$ or $\lambda$ is above a moderate order. Our results imply that the increase of population size, while usually desired in practice, bears the risk of increasing the lower bound of the running time and thus should be carefully considered.
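The algorithm class analyzed above, a ($\mu$+$\lambda$)-EA with mutation only, can be sketched as follows on OneMax. This is a minimal illustration, not the paper's implementation; the function name and defaults are ours:

```python
import random

def mu_plus_lambda_ea(n, mu, lam, max_gens=2000, seed=0):
    """(mu+lambda)-EA with bitwise mutation (rate 1/n) on OneMax:
    each generation creates lam offspring from uniformly chosen parents,
    then the mu best of parents and offspring survive."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(mu)]
    fitness = sum  # OneMax: number of one-bits
    for gen in range(max_gens):
        offspring = []
        for _ in range(lam):
            child = list(rng.choice(pop))
            for j in range(n):
                if rng.random() < 1.0 / n:
                    child[j] ^= 1  # flip each bit independently with prob. 1/n
            offspring.append(child)
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:mu]
        if fitness(pop[0]) == n:
            return gen + 1  # generations until the unique global optimum
    return None

print(mu_plus_lambda_ea(20, mu=5, lam=5))
```

The lower bound above says that making `mu` or `lam` large increases the total number of fitness evaluations (generations times $\lambda$) needed beyond what the (1+1)-EA requires.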

A Parallel Way to Select the Parameters of SVM Based on the Ant Optimization Algorithm

May 20, 2014

Chao Zhang, Hong-cen Mei, Hao Yang

A large body of experimental data shows that the Support Vector Machine (SVM) algorithm has clear advantages in text classification, handwriting recognition, image classification, bioinformatics, and other fields. To some degree, the performance of an SVM depends on its kernel function and slack variable, which are determined by the parameters $\delta$ and $c$ in the classification function. That is to say, optimizing these two parameters plays a major role in optimizing the SVM algorithm. Ant Colony Optimization (ACO) is an optimization algorithm that simulates ants searching for an optimal path. In this paper, we combine the ACO algorithm with a parallel algorithm to find good values for the two parameters.
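The paper's parallel ACO is not reproduced here; as a loose sequential illustration of pheromone-guided search over a discrete $(c, \delta)$ grid, with a stand-in score function in place of real SVM cross-validation accuracy (all names and constants below are our assumptions):

```python
import random

def aco_param_search(score, c_grid, d_grid, ants=8, iters=20, rho=0.5, seed=1):
    """Toy ant-colony search over a discrete (c, delta) grid.
    score(c, d) stands in for SVM cross-validation accuracy; pheromone on
    each grid value biases where subsequent ants sample."""
    rng = random.Random(seed)
    tau_c = [1.0] * len(c_grid)
    tau_d = [1.0] * len(d_grid)
    best, best_s = None, float("-inf")
    for _ in range(iters):
        for _ in range(ants):
            i = rng.choices(range(len(c_grid)), weights=tau_c)[0]
            j = rng.choices(range(len(d_grid)), weights=tau_d)[0]
            s = score(c_grid[i], d_grid[j])
            if s > best_s:
                best, best_s = (c_grid[i], d_grid[j]), s
            tau_c[i] += s  # deposit pheromone proportional to the score
            tau_d[j] += s
        tau_c = [rho * t for t in tau_c]  # evaporation
        tau_d = [rho * t for t in tau_d]
    return best, best_s

# Stand-in objective peaked at c=10, delta=0.1 (a real run would call SVM CV):
mock = lambda c, d: 1.0 / (1.0 + abs(c - 10) + abs(d - 0.1) * 10)
print(aco_param_search(mock, [0.1, 1, 10, 100], [0.01, 0.1, 1, 10]))
```

Each of the `ants` samples per iteration is independent given the current pheromone, which is what makes the inner loop easy to parallelize.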

Analyzing Evolutionary Optimization in Noisy Environments

Nov 20, 2013

Chao Qian, Yang Yu, Zhi-Hua Zhou

Towards Analyzing Crossover Operators in Evolutionary Search via General Markov Chain Switching Theorem

Apr 26, 2012

Yang Yu, Chao Qian, Zhi-Hua Zhou

Perspective-Aware CNN For Crowd Counting

Jul 05, 2018

Miaojing Shi, Zhaohui Yang, Chao Xu, Qijun Chen

Deep Transfer Network with Joint Distribution Adaptation: A New Intelligent Fault Diagnosis Framework for Industry Application

Apr 18, 2018

Te Han, Chao Liu, Wenguang Yang, Dongxiang Jiang

Cost-Aware Learning and Optimization for Opportunistic Spectrum Access

Apr 11, 2018

Chao Gan, Ruida Zhou, Jing Yang, Cong Shen

Learning a No-Reference Quality Metric for Single-Image Super-Resolution

Dec 18, 2016

Chao Ma, Chih-Yuan Yang, Xiaokang Yang, Ming-Hsuan Yang

Numerous single-image super-resolution algorithms have been proposed in the literature, but few studies address the problem of performance evaluation based on visual perception. While most super-resolution images are evaluated by full-reference metrics, their effectiveness is not clear and the required ground-truth images are not always available in practice. To address these problems, we conduct human subject studies using a large set of super-resolution images and propose a no-reference metric learned from visual perceptual scores. Specifically, we design three types of low-level statistical features in both spatial and frequency domains to quantify super-resolved artifacts, and learn a two-stage regression model to predict the quality scores of super-resolution images without referring to ground-truth images. Extensive experimental results show that the proposed metric is effective and efficient in assessing the quality of super-resolution images based on human perception.
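The two-stage regression structure described above can be sketched in miniature: one regressor per feature group, then a second regressor on the stage-1 predictions. This is a toy scalar-feature sketch under our own assumptions (the paper's features and regressors are richer), with names that are ours:

```python
import statistics

def fit_1d(xs, ys):
    """Ordinary least squares for y ~ a*x + b (one scalar feature)."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    var = sum((x - mx) ** 2 for x in xs) or 1e-12
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / var
    return a, my - a * mx

def two_stage_quality_model(feature_groups, scores):
    """Stage 1: fit one regressor per feature group (e.g. spatial vs.
    frequency statistics). Stage 2: fit a regressor on the fused stage-1
    predictions to produce the final quality score."""
    stage1 = [fit_1d(g, scores) for g in feature_groups]
    partial = [[a * x + b for x in g] for (a, b), g in zip(stage1, feature_groups)]
    fused = [statistics.fmean(col) for col in zip(*partial)]
    a2, b2 = fit_1d(fused, scores)

    def predict(feats):  # feats: one scalar value per feature group
        p = [a * x + b for (a, b), x in zip(stage1, feats)]
        return a2 * statistics.fmean(p) + b2
    return predict
```

No ground-truth image is needed at prediction time; only the statistics computed from the super-resolved image itself are used, which is what makes the metric no-reference.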

Exact Hybrid Covariance Thresholding for Joint Graphical Lasso

Jun 18, 2015

Qingming Tang, Chao Yang, Jian Peng, Jinbo Xu

Low-rank SIFT: An Affine Invariant Feature for Place Recognition

Aug 07, 2014

Chao Yang, Shengnan Caih, Jingdong Wang, Long Quan

Apply Local Clustering Method to Improve the Running Speed of Ant Colony Optimization

Jul 07, 2009

Chao-Yang Pang, Wei Hu, Xia Li, Be-Qiong Hu

Analysis of Noisy Evolutionary Optimization When Sampling Fails

Oct 11, 2018

Chao Qian, Chao Bian, Yang Yu, Ke Tang, Xin Yao

In noisy evolutionary optimization, sampling is a common strategy to deal with noise. By the sampling strategy, the fitness of a solution is evaluated multiple times (called \emph{sample size}) independently, and its true fitness is then approximated by the average of these evaluations. Previous studies on sampling are mainly empirical. In this paper, we first investigate the effect of sample size from a theoretical perspective. By analyzing the (1+1)-EA on the noisy LeadingOnes problem, we show that as the sample size increases, the running time can reduce from exponential to polynomial, but then return to exponential. This suggests that a proper sample size is crucial in practice. Then, we investigate what strategies can work when sampling with any fixed sample size fails. By two illustrative examples, we prove that using parent or offspring populations can be better. Finally, we construct an artificial noisy example to show that when using neither sampling nor populations is effective, adaptive sampling (i.e., sampling with an adaptive sample size) can work. This, for the first time, provides a theoretical support for the use of adaptive sampling.
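The sampling strategy the abstract defines is simple to state in code: evaluate a solution's noisy fitness $k$ times and average. The illustration below (our own, with an additive-Gaussian noise model as a stand-in for the paper's noise settings) shows why a sufficient sample size makes the comparison between two solutions reliable:

```python
import random
import statistics

def sampled_fitness(x, noisy_eval, k, rng):
    """Approximate the true fitness of x by averaging k independent
    noisy evaluations -- the 'sampling' strategy with sample size k."""
    return statistics.fmean(noisy_eval(x, rng) for _ in range(k))

def leading_ones(x):
    """LeadingOnes: length of the prefix of consecutive one-bits."""
    n = 0
    for bit in x:
        if bit != 1:
            break
        n += 1
    return n

def noisy_eval(x, rng, sigma=2.0):
    """True LeadingOnes value corrupted by additive Gaussian noise."""
    return leading_ones(x) + rng.gauss(0, sigma)

rng = random.Random(0)
a, b = [1, 1, 1, 0, 0], [1, 0, 1, 1, 1]  # true fitness 3 vs. 1
print(sampled_fitness(a, noisy_eval, 100, rng), sampled_fitness(b, noisy_eval, 100, rng))
```

Averaging $k$ evaluations shrinks the standard deviation of the estimate by a factor $\sqrt{k}$, but each averaged comparison also costs $k$ evaluations, which is the running-time trade-off the analysis above makes precise.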

Robust Visual Tracking via Hierarchical Convolutional Features

Aug 11, 2018

Chao Ma, Jia-Bin Huang, Xiaokang Yang, Ming-Hsuan Yang

Adaptive Correlation Filters with Long-Term and Short-Term Memory for Object Tracking

Mar 23, 2018

Chao Ma, Jia-Bin Huang, Xiaokang Yang, Ming-Hsuan Yang

Shape Inpainting using 3D Generative Adversarial Network and Recurrent Convolutional Networks

Nov 17, 2017

Weiyue Wang, Qiangui Huang, Suya You, Chao Yang, Ulrich Neumann

A Hybrid Both Filter and Wrapper Feature Selection Method for Microarray Classification

Dec 27, 2016

Li-Yeh Chuang, Chao-Hsuan Ke, Cheng-Hong Yang
