Shengsheng Wang

Training-Free Unsupervised Prompt for Vision-Language Models

Apr 25, 2024
Sifan Long, Linbin Wang, Zhen Zhao, Zichang Tan, Yiming Wu, Shengsheng Wang, Jingdong Wang

Unsupervised Sentence Representation Learning with Frequency-induced Adversarial Tuning and Incomplete Sentence Filtering

May 15, 2023
Bing Wang, Ximing Li, Zhiyao Yang, Yuanyuan Guan, Jiayin Li, Shengsheng Wang

Task-Oriented Multi-Modal Mutual Learning for Vision-Language Models

Mar 30, 2023
Sifan Long, Zhen Zhao, Junkun Yuan, Zichang Tan, Jiangjiang Liu, Luping Zhou, Shengsheng Wang, Jingdong Wang

Beyond Attentive Tokens: Incorporating Token Importance and Diversity for Efficient Vision Transformers

Nov 21, 2022
Sifan Long, Zhen Zhao, Jimin Pi, Shengsheng Wang, Jingdong Wang

Next-item Recommendations in Short Sessions

Jul 20, 2021
Wenzhuo Song, Shoujin Wang, Yan Wang, Shengsheng Wang

Hyperbolic Node Embedding for Signed Networks

Oct 29, 2019
Wenzhuo Song, Shengsheng Wang

Reduced Ordered Binary Decision Diagram with Implied Literals: A New Knowledge Compilation Approach

Mar 24, 2011
Yong Lai, Dayou Liu, Shengsheng Wang
