GATE: Graph Attention Transformer Encoder for Cross-lingual Relation and Event Extraction

Wasi Ahmad, Nanyun Peng, and Kai-Wei Chang, in AAAI, 2021.

Code

Download the full text


Abstract

Prevalent approaches in cross-lingual relation and event extraction use graph convolutional networks (GCNs) with universal dependency parses to learn language-agnostic representations, so that models trained on one language can be applied to other languages. However, GCNs struggle to model long-range dependencies and words that are disconnected in the dependency tree. To address this challenge, we propose to utilize the self-attention mechanism, into which we explicitly fuse structural information to learn the dependencies between words at different syntactic distances. We introduce GATE, a Graph Attention Transformer Encoder, and test its cross-lingual transferability on relation and event extraction tasks. We perform rigorous experiments on the widely used ACE05 dataset, which includes three typologically different languages: English, Chinese, and Arabic. The evaluation results show that GATE outperforms three recently proposed methods by a large margin. Our detailed analysis reveals that, owing to its reliance on syntactic dependencies, GATE produces robust representations that facilitate transfer across languages.
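The core idea of fusing syntactic structure into self-attention can be illustrated with a minimal sketch. The code below is purely illustrative and not the paper's exact formulation: it computes pairwise syntactic distances over a dependency tree (here assumed to be given as a list of head indices) and then restricts attention to token pairs within a chosen distance, so that a hypothetical `max_dist` parameter controls how far along the tree each token can attend.

```python
# Illustrative sketch of syntactic-distance-restricted self-attention.
# NOTE: this is an assumption-laden simplification, not GATE's actual model.
import numpy as np

def tree_distances(heads):
    """Pairwise distances between tokens in an undirected dependency tree.
    heads[i] is the index of token i's head; the root token has head -1."""
    n = len(heads)
    adj = [[] for _ in range(n)]
    for i, h in enumerate(heads):
        if h >= 0:
            adj[i].append(h)
            adj[h].append(i)
    dist = np.full((n, n), np.inf)
    for s in range(n):
        # Breadth-first search from each token gives shortest tree distances.
        dist[s, s] = 0
        queue = [s]
        while queue:
            u = queue.pop(0)
            for v in adj[u]:
                if dist[s, v] == np.inf:
                    dist[s, v] = dist[s, u] + 1
                    queue.append(v)
    return dist

def distance_masked_attention(Q, K, V, dist, max_dist):
    """Scaled dot-product attention, masked so each token only attends to
    tokens within max_dist edges in the dependency tree."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    scores = np.where(dist <= max_dist, scores, -1e9)  # mask distant pairs
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V
```

For example, in a three-token chain with heads `[-1, 0, 1]`, the first and third tokens are two edges apart, so with `max_dist=1` they would not attend to each other; raising `max_dist` relaxes the constraint toward ordinary full self-attention.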



Bib Entry

@inproceedings{ahmad2021gate,
  author = {Ahmad, Wasi and Peng, Nanyun and Chang, Kai-Wei},
  title = {GATE: Graph Attention Transformer Encoder for Cross-lingual Relation and Event Extraction},
  booktitle = {AAAI},
  year = {2021}
}

Related Publications

  1. Contextual Label Projection for Cross-Lingual Structured Prediction, NAACL, 2024
  2. Multilingual Generative Language Models for Zero-Shot Cross-Lingual Event Argument Extraction, ACL, 2022
  3. Improving Zero-Shot Cross-Lingual Transfer Learning via Robust Training, EMNLP, 2021
  4. Syntax-augmented Multilingual BERT for Cross-lingual Transfer, ACL, 2021
  5. Evaluating the Values of Sources in Transfer Learning, NAACL, 2021
  6. Cross-Lingual Dependency Parsing by POS-Guided Word Reordering, EMNLP-Finding, 2020
  7. Cross-lingual Dependency Parsing with Unlabeled Auxiliary Languages, CoNLL, 2019
  8. Target Language-Aware Constrained Inference for Cross-lingual Dependency Parsing, EMNLP, 2019
  9. On Difficulties of Cross-Lingual Transfer with Order Differences: A Case Study on Dependency Parsing, NAACL, 2019