
Yudai Pan

Logiformer: A Two-Branch Graph Transformer Network for Interpretable Logical Reasoning

May 02, 2022

MoCA: Incorporating Multi-stage Domain Pretraining and Cross-guided Multimodal Attention for Textbook Question Answering

Dec 06, 2021

Learning First-Order Rules with Relational Path Contrast for Inductive Relation Reasoning

Oct 17, 2021