Adversarial confidence and smoothness regularizations for scalable unsupervised discriminative learning

Jun 04, 2018

Yi-Qing Wang

* 9 pages

**Click to Read Paper and Get Code**

* 3 pages, ACL94 student session

**Click to Read Paper and Get Code**

PFML-based Semantic BCI Agent for Game of Go Learning and Prediction

Jan 10, 2019

Chang-Shing Lee, Mei-Hui Wang, Li-Wei Ko, Bo-Yu Tsai, Yi-Lin Tsai, Sheng-Chi Yang, Lu-An Lin, Yi-Hsiu Lee, Hirofumi Ohashi, Naoyuki Kubota, Nan Shuo

**Click to Read Paper and Get Code**

Coherent Point Drift Networks: Unsupervised Learning of Non-Rigid Point Set Registration

Jun 11, 2019

Lingjing Wang, Yi Fang

**Click to Read Paper and Get Code**

Elaboration Tolerant Representation of Markov Decision Process via Decision-Theoretic Extension of Probabilistic Action Language pBC+

Apr 01, 2019

Yi Wang, Joohyung Lee

We extend the probabilistic action language pBC+ with the notion of utility, as in decision theory. The semantics of the extended pBC+ can be defined as a shorthand notation for a decision-theoretic extension of the probabilistic answer set programming language LPMLN. Alternatively, the semantics of pBC+ can be defined in terms of Markov Decision Processes (MDPs), which in turn allows MDPs to be represented in a succinct and elaboration tolerant way and an MDP solver to be leveraged to compute pBC+. This idea led to the design of the system pbcplus2mdp, which can find an optimal policy for a pBC+ action description using an MDP solver.
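The MDP-solving step this entry delegates to can be sketched with a toy example. Value iteration below stands in for whichever solver pbcplus2mdp actually invokes, and the two-state machine-maintenance MDP (states, actions, transition probabilities, rewards) is an illustrative assumption, not from the paper.

```python
# Toy value iteration: the kind of MDP computation pbcplus2mdp hands off
# to an external solver. All model parameters here are invented.

def value_iteration(states, actions, P, R, gamma=0.9, eps=1e-8):
    """P[(s, a)] -> list of (next_state, prob); R[(s, a)] -> reward."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            # Bellman optimality backup for state s
            best = max(
                R[(s, a)] + gamma * sum(p * V[t] for t, p in P[(s, a)])
                for a in actions
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            break
    # greedy policy with respect to the converged value function
    policy = {
        s: max(
            actions,
            key=lambda a: R[(s, a)] + gamma * sum(p * V[t] for t, p in P[(s, a)]),
        )
        for s in states
    }
    return V, policy

states = ["healthy", "broken"]
actions = ["run", "repair"]
P = {
    ("healthy", "run"): [("healthy", 0.9), ("broken", 0.1)],
    ("healthy", "repair"): [("healthy", 1.0)],
    ("broken", "run"): [("broken", 1.0)],
    ("broken", "repair"): [("healthy", 1.0)],
}
R = {
    ("healthy", "run"): 10.0,
    ("healthy", "repair"): -1.0,
    ("broken", "run"): 0.0,
    ("broken", "repair"): -5.0,
}
V, policy = value_iteration(states, actions, P, R)
print(policy)  # optimal action per state
```

The appeal of the pBC+ encoding is that the action description stays succinct and elaboration tolerant while a table-based solver like this does the numeric work.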

* 13 pages, to appear in LPNMR 2019

**Click to Read Paper and Get Code**

Weight Learning in a Probabilistic Extension of Answer Set Programs

Oct 09, 2018

Joohyung Lee, Yi Wang

* Technical Report of the paper to appear in 16th International Conference on Principles of Knowledge Representation and Reasoning

**Click to Read Paper and Get Code**

* Paper presented at the 34th International Conference on Logic Programming (ICLP 2018), Oxford, UK, July 14 to July 17, 2018; 18 pages, LaTeX, 1 PDF figure (arXiv:YYMM.NNNNN)

**Click to Read Paper and Get Code**

Unsupervised 3D Reconstruction from a Single Image via Adversarial Learning

Nov 26, 2017

Lingjing Wang, Yi Fang

**Click to Read Paper and Get Code**

On the Semantic Relationship between Probabilistic Soft Logic and Markov Logic

Jun 28, 2016

Joohyung Lee, Yi Wang

* In Working Notes of the 6th International Workshop on Statistical Relational AI

**Click to Read Paper and Get Code**

Simple tree models for articulated objects have prevailed over the last decade. However, it is widely believed that such models cannot capture the large variations that arise in many scenarios, such as human pose estimation. This paper addresses three questions: 1) are simple tree models sufficient? More specifically, 2) how can tree models be used effectively in human pose estimation? And 3) how should combined parts be used together with single parts efficiently? Assume we have a set of single parts and combined parts, and the goal is to estimate a joint distribution of their locations. We find, surprisingly, that no latent variables are introduced on the Leeds Sport Dataset (LSP) when learning latent trees for the deformable model, which aims to approximate the joint distribution of body part locations with a minimal tree structure. This suggests that one can straightforwardly use a mixed representation of single and combined parts to approximate their joint distribution in a simple tree model. As such, one only needs to build Visual Categories of the combined parts and then perform inference on the learned latent tree. Our method outperforms the state of the art on the LSP, both when the training images come from the same dataset and when they come from the PARSE dataset. Experiments on animal images from the VOC challenge further support our findings.
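The idea of approximating a joint distribution over part locations with a minimal tree structure can be illustrated with the classic Chow-Liu algorithm: pick the spanning tree that maximizes the sum of pairwise mutual information. This is a generic sketch of that principle, not the paper's latent-tree learner, and the binary toy samples are invented for illustration.

```python
# Chow-Liu sketch: best tree-structured approximation of a joint
# distribution = maximum spanning tree over pairwise mutual information.
import itertools
import math

def mutual_information(samples, i, j):
    """Empirical mutual information (nats) between variables i and j."""
    n = len(samples)
    joint, pi, pj = {}, {}, {}
    for row in samples:
        joint[(row[i], row[j])] = joint.get((row[i], row[j]), 0) + 1
        pi[row[i]] = pi.get(row[i], 0) + 1
        pj[row[j]] = pj.get(row[j], 0) + 1
    mi = 0.0
    for (a, b), c in joint.items():
        p_ab, p_a, p_b = c / n, pi[a] / n, pj[b] / n
        mi += p_ab * math.log(p_ab / (p_a * p_b))
    return mi

def chow_liu_tree(samples, num_vars):
    """Kruskal's maximum spanning tree over MI scores (union-find)."""
    edges = sorted(
        ((mutual_information(samples, i, j), i, j)
         for i, j in itertools.combinations(range(num_vars), 2)),
        reverse=True,
    )
    parent = list(range(num_vars))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    tree = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:  # keep the edge only if it joins two components
            parent[ri] = rj
            tree.append((i, j))
    return tree

# Toy data: variables 0, 1, 2 are perfectly coupled; 3 is independent noise.
samples = [
    (0, 0, 0, 0), (0, 0, 0, 1), (1, 1, 1, 0), (1, 1, 1, 1),
    (0, 0, 0, 0), (1, 1, 1, 1), (0, 0, 0, 1), (1, 1, 1, 0),
]
tree = chow_liu_tree(samples, 4)
print(tree)  # a spanning tree; the independent variable 3 hangs off one edge
```

Once such a tree is learned, inference over part locations reduces to message passing on the tree, which is what makes the simple-tree representation attractive.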

* CVPR 2013

**Click to Read Paper and Get Code**

* IJCAI 2013

**Click to Read Paper and Get Code**

Fractional-order Backpropagation Neural Networks: Modified Fractional-order Steepest Descent Method for Family of Backpropagation Neural Networks

Jul 10, 2019

Yi-Fei PU, Jian Wang

**Click to Read Paper and Get Code**

* EPTCS 215, 2016, pp. 31-50

* In Proceedings TARK 2015, arXiv:1606.07295

**Click to Read Paper and Get Code**

TrueLabel + Confusions: A Spectrum of Probabilistic Models in Analyzing Multiple Ratings

Jun 18, 2012

Chao Liu, Yi-Min Wang

* ICML2012

**Click to Read Paper and Get Code**

Deep Neural Network for Semantic-based Text Recognition in Images

Aug 15, 2019

Yi Zheng, Qitong Wang, Margrit Betke

**Click to Read Paper and Get Code**