Multi-objective Differential Evolution with Helper Functions for Constrained Optimization

Oct 01, 2015

Tao Xu, Jun He

* Accepted by the 15th UK Workshop on Computational Intelligence (UKCI 2015)

Multi-View Stereo with Asymmetric Checkerboard Propagation and Multi-Hypothesis Joint View Selection

May 21, 2018

Qingshan Xu, Wenbing Tao

Frequency Principle: Fourier Analysis Sheds Light on Deep Neural Networks

Jan 26, 2019

Zhi-Qin John Xu, Yaoyu Zhang, Tao Luo, Yanyang Xiao, Zheng Ma

* 7 pages, 4 figures, under review at ICML

GPU Accelerated Cascade Hashing Image Matching for Large Scale 3D Reconstruction

May 23, 2018

Tao Xu, Kun Sun, Wenbing Tao

Image-Question-Answer Synergistic Network for Visual Dialog

Feb 26, 2019

Dalu Guo, Chang Xu, Dacheng Tao

* Accepted by CVPR 2019

Follow Me at the Edge: Mobility-Aware Dynamic Service Placement for Mobile Edge Computing

Sep 14, 2018

Tao Ouyang, Zhi Zhou, Xu Chen

* Accepted by IEEE Journal on Selected Areas in Communications, Aug. 2018

Generative Cooperative Net for Image Generation and Data Augmentation

Feb 08, 2018

Qiangeng Xu, Zengchang Qin, Tao Wan

* 12 pages, 8 figures

Dimensionality-Dependent Generalization Bounds for $k$-Dimensional Coding Schemes

Apr 23, 2016

Tongliang Liu, Dacheng Tao, Dong Xu

The $k$-dimensional coding schemes refer to a collection of methods that attempt to represent data using a set of representative $k$-dimensional vectors, and include non-negative matrix factorization, dictionary learning, sparse coding, $k$-means clustering and vector quantization as special cases. Previous generalization bounds for the reconstruction error of the $k$-dimensional coding schemes are mainly dimensionality independent. A major advantage of these bounds is that they can be used to analyze the generalization error when data is mapped into an infinite- or high-dimensional feature space. However, many applications use finite-dimensional data features. Can we obtain dimensionality-dependent generalization bounds for $k$-dimensional coding schemes that are tighter than dimensionality-independent bounds when data is in a finite-dimensional feature space? The answer is positive. In this paper, we address this problem and derive a dimensionality-dependent generalization bound for $k$-dimensional coding schemes by bounding the covering number of the loss function class induced by the reconstruction error. The bound is of order $\mathcal{O}\left(\left(mk\ln(mkn)/n\right)^{\lambda_n}\right)$, where $m$ is the dimension of features, $k$ is the number of columns in the linear implementation of coding schemes, $n$ is the sample size, $\lambda_n>0.5$ when $n$ is finite and $\lambda_n=0.5$ when $n$ is infinite. We show that our bound can be tighter than previous results, because it avoids inducing the worst-case upper bound on $k$ of the loss function and converges faster. The proposed generalization bound is also applied to some specific coding schemes to demonstrate that the dimensionality-dependent bound is an indispensable complement to these dimensionality-independent generalization bounds.
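
The empirical reconstruction error that the bound controls can be sketched concretely for the $k$-means special case, where each sample is reconstructed by its nearest codeword. The snippet below is an illustration under assumed names (`X`, `D`, `recon_error` are not the paper's notation), not the paper's analysis:

```python
import numpy as np

# Sketch: empirical reconstruction error of a k-dimensional coding scheme.
# The linear implementation is a matrix D in R^{m x k} whose columns are
# codewords; k-means is the special case where each sample's code is a
# standard basis vector, i.e. the sample is replaced by its nearest codeword.

rng = np.random.default_rng(0)
m, k, n = 5, 3, 200          # feature dimension, number of codewords, sample size
X = rng.normal(size=(n, m))  # n samples in R^m
D = rng.normal(size=(m, k))  # k codewords as columns

# k-means-style coding: assign each sample to its nearest codeword.
dists = ((X[:, None, :] - D.T[None, :, :]) ** 2).sum(axis=2)  # (n, k) squared distances
codes = dists.argmin(axis=1)                                  # index of nearest codeword
X_hat = D.T[codes]                                            # reconstructions, shape (n, m)

# Average squared reconstruction error: the empirical quantity whose
# deviation from its expectation the generalization bound controls.
recon_error = ((X - X_hat) ** 2).sum(axis=1).mean()
```

The bound then says how fast this empirical error concentrates around its expectation as $n$ grows, with a rate that depends on $m$ and $k$ rather than only on worst-case properties of the loss.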

Simultaneous Codeword Optimization (SimCO) for Dictionary Update and Learning

Apr 18, 2012

Wei Dai, Tao Xu, Wenwu Wang

* 13 pages

Sentence-wise Smooth Regularization for Sequence to Sequence Learning

Dec 12, 2018

Chengyue Gong, Xu Tan, Di He, Tao Qin

Maximum-likelihood estimation (MLE) is widely used in sequence to sequence tasks for model training. It uniformly treats the generation/prediction of each target token as multi-class classification, and yields non-smooth prediction probabilities: in a target sequence, some tokens are predicted with small probabilities while others are predicted with large probabilities. According to our empirical study, we find that the non-smoothness of the probabilities results in low quality of generated sequences. In this paper, we propose a sentence-wise regularization method which aims to output smooth prediction probabilities for all the tokens in the target sequence. Our proposed method can automatically adjust the weights and gradients of each token in one sentence to ensure that the predictions in a sequence are uniformly good. Experiments on three neural machine translation tasks and one text summarization task show that our method outperforms conventional MLE loss on all these tasks and achieves promising BLEU scores on WMT14 English-German and WMT17 Chinese-English translation tasks.
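
The contrast between token-level MLE and a sentence-wise smoothness term can be sketched as follows. This is an illustrative stand-in, not the authors' exact regularizer: it penalizes the spread of per-token target log-probabilities within one sentence, and the names `logits`, `targets`, and the weight `lam` are assumptions:

```python
import numpy as np

def softmax(z):
    # Numerically stable row-wise softmax.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
T, V = 6, 10                          # sentence length, vocabulary size
logits = rng.normal(size=(T, V))      # model scores for one target sentence
targets = rng.integers(0, V, size=T)  # target token ids

probs = softmax(logits)
tok_logp = np.log(probs[np.arange(T), targets])  # log-prob of each target token

# Conventional MLE: average per-token cross-entropy. It is indifferent to
# how unevenly the probability mass is spread across the sentence's tokens.
mle_loss = -tok_logp.mean()

# Sentence-wise smoothness penalty (illustrative): the variance of the
# per-token log-probabilities, which is large exactly when some tokens
# are predicted confidently and others are not.
smooth_penalty = tok_logp.var()

lam = 0.1                             # assumed trade-off weight
loss = mle_loss + lam * smooth_penalty
```

Minimizing the combined loss pushes the model toward sequences whose tokens are all predicted comparably well, which is the intuition behind the sentence-wise regularization described above.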

* AAAI 2019

Multiobjective Optimization Differential Evolution Enhanced with Principle Component Analysis for Constrained Optimization

May 01, 2018

Wei Huang, Tao Xu, Jun He, Kangshun Li

Perceptual Adversarial Networks for Image-to-Image Transformation

Jun 28, 2017

Chaoyue Wang, Chang Xu, Chaohui Wang, Dacheng Tao

* 20 pages, 9 figures
