N-fold Superposition: Improving Neural Networks by Reducing the Noise in Feature Maps

May 03, 2018

Yang Liu, Qiang Qu, Chao Gao


* 7 pages, 5 figures, submitted to ICALIP 2018


Binary output layer of feedforward neural networks for solving multi-class classification problems

Jan 22, 2018

Sibo Yang, Chao Zhang, Wei Wu



A Lower Bound Analysis of Population-based Evolutionary Algorithms for Pseudo-Boolean Functions

Jun 10, 2016

Chao Qian, Yang Yu, Zhi-Hua Zhou

Evolutionary algorithms (EAs) are population-based general-purpose optimization algorithms, and have been successfully applied in various real-world optimization tasks. However, previous theoretical studies often employ EAs with only a parent or offspring population and focus on specific problems. Furthermore, they often only show upper bounds on the running time, while lower bounds are also necessary to get a complete understanding of an algorithm. In this paper, we analyze the running time of the ($\mu$+$\lambda$)-EA (a general population-based EA with mutation only) on the class of pseudo-Boolean functions with a unique global optimum. By applying the recently proposed switch analysis approach, we prove the lower bound $\Omega(n \ln n+ \mu + \lambda n\ln\ln n/ \ln n)$ for the first time. Particularly on the two widely-studied problems, OneMax and LeadingOnes, the derived lower bound discloses that the ($\mu$+$\lambda$)-EA will be strictly slower than the (1+1)-EA when the population size $\mu$ or $\lambda$ is above a moderate order. Our results imply that the increase of population size, while usually desired in practice, bears the risk of increasing the lower bound of the running time and thus should be carefully considered.
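The ($\mu$+$\lambda$)-EA analyzed in this abstract can be sketched in a few lines. The function names, the bit-string encoding, and the OneMax stopping criterion below are illustrative choices, not taken from the paper:

```python
import random

def one_max(x):
    """OneMax fitness: the number of 1-bits; the all-ones string is the unique optimum."""
    return sum(x)

def mu_plus_lambda_ea(n, mu, lam, fitness, max_evals=100_000, seed=0):
    """Mutation-only (mu+lambda)-EA: mu parents produce lam offspring per
    generation by bit-wise mutation, and the best mu of parents plus offspring
    survive. Stops when the OneMax optimum (fitness n) is found or the
    evaluation budget runs out."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(mu)]
    evals = mu
    while max(fitness(x) for x in pop) < n and evals < max_evals:
        offspring = []
        for _ in range(lam):
            parent = rng.choice(pop)
            # standard bit-wise mutation with probability 1/n per bit
            offspring.append([1 - b if rng.random() < 1.0 / n else b
                              for b in parent])
        evals += lam
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:mu]
    return evals, max(fitness(x) for x in pop)

# mu = lam = 1 recovers the (1+1)-EA that the derived lower bound is compared against
evals_11, best_11 = mu_plus_lambda_ea(16, 1, 1, one_max)
```

Comparing the evaluation counts for `mu = lam = 1` against larger populations on small instances mirrors the paper's point that increasing the population size can increase the running time.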

A Parallel Way to Select the Parameters of SVM Based on the Ant Optimization Algorithm

May 20, 2014

Chao Zhang, Hong-cen Mei, Hao Yang

A large body of experimental data shows that the Support Vector Machine (SVM) algorithm has clear advantages in text classification, handwriting recognition, image classification, bioinformatics, and other fields. To a large degree, the performance of an SVM depends on its kernel function and slack variable, which are governed by the parameters $\delta$ and $c$ in the classification function. In other words, optimizing the SVM algorithm largely comes down to optimizing these two parameters. Ant Colony Optimization (ACO) is an optimization algorithm that simulates ants searching for the optimal path. In this work, we combine the ACO algorithm with a parallel algorithm to find good values for these two parameters.

* 3 pages, 2 figures, 2 tables

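As a rough illustration of the idea (not the paper's parallel implementation), a pheromone-guided search over a discrete $(c, \delta)$ grid might look like the sketch below, with the SVM's cross-validation accuracy replaced by a mock objective; the grids, parameters, and `mock_cv_accuracy` are all assumptions:

```python
import math
import random

def aco_param_search(score_fn, c_grid, delta_grid, n_ants=10, n_iters=20,
                     evaporation=0.5, seed=0):
    """Pheromone-guided search over a discrete (c, delta) grid.
    score_fn(c, delta) stands in for the SVM's cross-validation accuracy."""
    rng = random.Random(seed)
    tau = {(c, d): 1.0 for c in c_grid for d in delta_grid}  # pheromone per cell
    best, best_score = None, -math.inf
    for _ in range(n_iters):
        cells = list(tau)
        weights = [tau[cell] for cell in cells]
        visited = []
        for _ in range(n_ants):
            cell = rng.choices(cells, weights=weights)[0]
            score = score_fn(*cell)
            visited.append((cell, score))
            if score > best_score:
                best, best_score = cell, score
        for cell in tau:                      # pheromone evaporation
            tau[cell] *= evaporation
        for cell, score in visited:           # deposit proportional to quality
            tau[cell] += score
    return best, best_score

# mock objective peaked at c = 10, delta = 0.1; a real run would train an SVM here
def mock_cv_accuracy(c, delta):
    return 1.0 / (1.0 + (math.log10(c) - 1) ** 2 + (math.log10(delta) + 1) ** 2)

best_params, best_acc = aco_param_search(mock_cv_accuracy,
                                         c_grid=[0.1, 1, 10, 100],
                                         delta_grid=[0.01, 0.1, 1, 10])
```

The parallel aspect of the paper would amount to evaluating the ants of one iteration concurrently, since their scores are independent.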

Analyzing Evolutionary Optimization in Noisy Environments

Nov 20, 2013

Chao Qian, Yang Yu, Zhi-Hua Zhou


Towards Analyzing Crossover Operators in Evolutionary Search via General Markov Chain Switching Theorem

Apr 26, 2012

Yang Yu, Chao Qian, Zhi-Hua Zhou


Online Learning with Diverse User Preferences

Feb 04, 2019

Chao Gan, Jing Yang, Ruida Zhou, Cong Shen


* 10 pages, 3 figures


Perspective-Aware CNN For Crowd Counting

Jul 05, 2018

Miaojing Shi, Zhaohui Yang, Chao Xu, Qijun Chen


Deep Transfer Network with Joint Distribution Adaptation: A New Intelligent Fault Diagnosis Framework for Industry Application

Apr 18, 2018

Te Han, Chao Liu, Wenguang Yang, Dongxiang Jiang


* 10 pages, 10 figures


Cost-Aware Learning and Optimization for Opportunistic Spectrum Access

Apr 11, 2018

Chao Gan, Ruida Zhou, Jing Yang, Cong Shen


* 12 pages, 6 figures


Learning a No-Reference Quality Metric for Single-Image Super-Resolution

Dec 18, 2016

Chao Ma, Chih-Yuan Yang, Xiaokang Yang, Ming-Hsuan Yang

Numerous single-image super-resolution algorithms have been proposed in the literature, but few studies address the problem of performance evaluation based on visual perception. While most super-resolution images are evaluated by full-reference metrics, the effectiveness is not clear and the required ground-truth images are not always available in practice. To address these problems, we conduct human subject studies using a large set of super-resolution images and propose a no-reference metric learned from visual perceptual scores. Specifically, we design three types of low-level statistical features in both spatial and frequency domains to quantify super-resolved artifacts, and learn a two-stage regression model to predict the quality scores of super-resolution images without referring to ground-truth images. Extensive experimental results show that the proposed metric is effective and efficient in assessing the quality of super-resolution images based on human perception.

* Accepted by Computer Vision and Image Understanding

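The pipeline described above (low-level statistics in spatial and frequency domains, then a learned regressor onto perceptual scores) can be caricatured as follows. The specific features and the single linear least-squares fit are stand-ins; the paper's actual feature types and two-stage regression model are richer:

```python
import numpy as np

def sr_quality_features(img):
    """Toy low-level statistics in the spatial and frequency domains."""
    gy, gx = np.gradient(img.astype(float))
    grad_mag = np.hypot(gx, gy)
    spectrum = np.abs(np.fft.fft2(img))
    return np.array([
        grad_mag.mean(),                # spatial: average edge strength
        grad_mag.std(),                 # spatial: edge-strength spread
        img.std(),                      # spatial: global contrast
        np.log1p(spectrum).mean(),      # frequency: overall log-spectral energy
        (spectrum > spectrum.mean()).sum() / img.size,  # high-energy fraction
    ])

def fit_quality_model(images, scores):
    """Stand-in for the paper's two-stage regression: one linear
    least-squares fit from features to perceptual scores."""
    X = np.stack([sr_quality_features(im) for im in images])
    X = np.hstack([X, np.ones((len(images), 1))])      # bias term
    w, *_ = np.linalg.lstsq(X, np.asarray(scores, dtype=float), rcond=None)
    return w

def predict_quality(w, img):
    """No-reference score: no ground-truth image is needed at test time."""
    f = np.append(sr_quality_features(img), 1.0)
    return float(f @ w)
```

The no-reference property is visible in `predict_quality`: only the super-resolved image itself enters the feature computation.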

Exact Hybrid Covariance Thresholding for Joint Graphical Lasso

Jun 18, 2015

Qingming Tang, Chao Yang, Jian Peng, Jinbo Xu


Low-rank SIFT: An Affine Invariant Feature for Place Recognition

Aug 07, 2014

Chao Yang, Shengnan Caih, Jingdong Wang, Long Quan


Apply Local Clustering Method to Improve the Running Speed of Ant Colony Optimization

Jul 07, 2009

Chao-Yang Pang, Wei Hu, Xia Li, Be-Qiong Hu


* 21 pages, 5 figures


Analysis of Noisy Evolutionary Optimization When Sampling Fails

Oct 11, 2018

Chao Qian, Chao Bian, Yang Yu, Ke Tang, Xin Yao

In noisy evolutionary optimization, sampling is a common strategy to deal with noise. By the sampling strategy, the fitness of a solution is evaluated multiple times (called \emph{sample size}) independently, and its true fitness is then approximated by the average of these evaluations. Previous studies on sampling are mainly empirical. In this paper, we first investigate the effect of sample size from a theoretical perspective. By analyzing the (1+1)-EA on the noisy LeadingOnes problem, we show that as the sample size increases, the running time can reduce from exponential to polynomial, but then return to exponential. This suggests that a proper sample size is crucial in practice. Then, we investigate what strategies can work when sampling with any fixed sample size fails. By two illustrative examples, we prove that using parent or offspring populations can be better. Finally, we construct an artificial noisy example to show that when using neither sampling nor populations is effective, adaptive sampling (i.e., sampling with an adaptive sample size) can work. This, for the first time, provides a theoretical support for the use of adaptive sampling.
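A minimal sketch of the sampling strategy described above, applied to a (1+1)-EA on noisy OneMax. The adaptive rule here (double the sample size after a run of rejections) is an ad-hoc illustration of the mechanism, not the scheme analyzed in the paper:

```python
import random

def noisy_fitness(x, rng, sigma=1.0):
    # one noisy evaluation: true OneMax value plus additive Gaussian noise
    return sum(x) + rng.gauss(0.0, sigma)

def sampled_fitness(x, rng, k, sigma=1.0):
    # the sampling strategy: average k independent noisy evaluations
    return sum(noisy_fitness(x, rng, sigma) for _ in range(k)) / k

def adaptive_one_plus_one(n, max_evals=200_000, seed=0):
    """(1+1)-EA with an adaptive sample size: k doubles whenever no offspring
    has been accepted for 2n consecutive iterations."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    k, evals, rejections = 1, 0, 0
    fx = sampled_fitness(x, rng, k)
    while evals < max_evals and sum(x) < n:       # stop at the true optimum
        y = [1 - b if rng.random() < 1.0 / n else b for b in x]
        fy = sampled_fitness(y, rng, k)
        evals += k
        if fy >= fx:
            x, fx, rejections = y, fy, 0
        else:
            rejections += 1
        if rejections > 2 * n:
            k *= 2                                # cuts the noise std by sqrt(2)
            rejections = 0
            fx = sampled_fitness(x, rng, k)       # re-estimate the parent
            evals += k
    return sum(x), k

true_value, final_k = adaptive_one_plus_one(10)
```

The cost accounting (`evals += k`) makes the paper's trade-off concrete: a larger sample size reduces noise per comparison but multiplies the price of every iteration.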

Robust Visual Tracking via Hierarchical Convolutional Features

Aug 11, 2018

Chao Ma, Jia-Bin Huang, Xiaokang Yang, Ming-Hsuan Yang

In this paper, we propose to exploit the rich hierarchical features of deep convolutional neural networks to improve the accuracy and robustness of visual tracking. Deep neural networks trained on object recognition datasets consist of multiple convolutional layers. These layers encode target appearance with different levels of abstraction. For example, the outputs of the last convolutional layers encode the semantic information of targets and such representations are invariant to significant appearance variations. However, their spatial resolutions are too coarse to precisely localize the target. In contrast, features from earlier convolutional layers provide more precise localization but are less invariant to appearance changes. We interpret the hierarchical features of convolutional layers as a nonlinear counterpart of an image pyramid representation and explicitly exploit these multiple levels of abstraction to represent target objects. Specifically, we learn adaptive correlation filters on the outputs from each convolutional layer to encode the target appearance. We infer the maximum response of each layer to locate targets in a coarse-to-fine manner. To further handle the issues with scale estimation and re-detecting target objects from tracking failures caused by heavy occlusion or out-of-the-view movement, we conservatively learn another correlation filter, which maintains a long-term memory of target appearance, as a discriminative classifier. We apply the classifier to two types of object proposals: (1) proposals with a small step size and tightly around the estimated location for scale estimation; and (2) proposals with a large step size and across the whole image for target re-detection. Extensive experimental results on large-scale benchmark datasets show that the proposed algorithm performs favorably against state-of-the-art tracking methods.

* To appear in T-PAMI 2018, project page at https://sites.google.com/site/chaoma99/hcft-tracking

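The correlation filters at the core of this tracker admit a closed-form ridge-regression solution in the Fourier domain. The single-channel sketch below (MOSSE-style, applied to one raw feature map rather than to CNN activations, and without the paper's adaptive updating or long-term memory) is a simplification under those assumptions:

```python
import numpy as np

def learn_filter(feature, target, lam=1e-2):
    """Closed-form correlation filter: elementwise ridge regression in the
    Fourier domain, mapping a feature map to a desired Gaussian response."""
    F = np.fft.fft2(feature)
    G = np.fft.fft2(target)
    return np.conj(F) * G / (np.conj(F) * F + lam)

def detect(H, feature):
    """Correlate a new feature map with the filter; the response peak
    gives the estimated target location."""
    F = np.fft.fft2(feature)
    response = np.real(np.fft.ifft2(H * F))
    return np.unravel_index(np.argmax(response), response.shape)

rng = np.random.default_rng(0)
feat = rng.standard_normal((32, 32))
# desired response: a sharp Gaussian peak at position (10, 20)
yy, xx = np.mgrid[0:32, 0:32]
target = np.exp(-((yy - 10) ** 2 + (xx - 20) ** 2) / 4.0)
H = learn_filter(feat, target)
```

In the paper's coarse-to-fine scheme, one such filter per convolutional layer would produce one response map each, which are then combined across layers to localize the target.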

Adaptive Correlation Filters with Long-Term and Short-Term Memory for Object Tracking

Mar 23, 2018

Chao Ma, Jia-Bin Huang, Xiaokang Yang, Ming-Hsuan Yang


* IJCV 2018, Project page: https://sites.google.com/site/chaoma99/cf-lstm
