Zhiyi Fu

TWIN: TWo-stage Interest Network for Lifelong User Behavior Modeling in CTR Prediction at Kuaishou
Feb 05, 2023

CodeEditor: Learning to Edit Source Code with Pre-trained Models
Oct 31, 2022

Contextual Representation Learning beyond Masked Language Modeling
Apr 08, 2022

A Survey on Green Deep Learning
Nov 10, 2021

Structure-aware Pre-training for Table Understanding with Tree-based Transformers
Nov 06, 2020

Code Generation as a Dual Task of Code Summarization
Oct 14, 2019

A Self-Attentional Neural Architecture for Code Completion with Multi-Task Learning
Oct 12, 2019