Learning from Explicit and Implicit Supervision Jointly For Algebra Word Problems

Shyam Upadhyay, Ming-Wei Chang, Kai-Wei Chang, and Wen-tau Yih, in EMNLP, 2016.


Abstract

Automatically solving algebra word problems has raised considerable interest recently. Existing state-of-the-art approaches mainly rely on learning from human-annotated equations. In this paper, we demonstrate that it is possible to efficiently mine algebra problems and their numerical solutions with little to no manual effort. To leverage the mined dataset, we propose a novel structured-output learning algorithm that learns jointly from both explicit (e.g., equations) and implicit (e.g., solutions) supervision signals. Enabled by this new algorithm, our model gains a 4.6% absolute improvement in accuracy on the ALG-514 benchmark compared to the model without implicit supervision. The final model also outperforms the current state-of-the-art approach by 3%.
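The joint-supervision idea can be illustrated with a toy structured-perceptron sketch: explicitly labeled problems update toward their gold equation, while implicitly labeled problems (answer only) update toward the highest-scoring equation template whose solution matches the answer, treating the equation as a latent variable. All names here (`features`, `solve`, `TEMPLATES`, the two-number templates) are illustrative assumptions, not the paper's actual model or feature set.

```python
# Toy sketch of joint explicit/implicit supervision for equation selection.
# NOT the paper's algorithm: a perceptron over a tiny hand-made template space.

TEMPLATES = ["a+b", "a-b", "a*b"]  # hypothetical candidate equation templates

def features(problem, equation):
    # Toy features: indicators of (problem word, chosen template) pairs.
    return {(w, equation): 1.0 for w in problem.split()}

def score(weights, problem, equation):
    return sum(weights.get(f, 0.0) * v for f, v in features(problem, equation).items())

def solve(equation, numbers):
    # Toy "solver": evaluate a template on the problem's two numbers.
    a, b = numbers
    return {"a+b": a + b, "a-b": a - b, "a*b": a * b}[equation]

def train(data, epochs=5, lr=0.1):
    """Perceptron-style joint training.
    Explicit example: {"problem", "numbers", "equation"} (gold equation given).
    Implicit example: {"problem", "numbers", "solution"} (answer only); any
    template whose solution matches the answer is a latent positive, and we
    update toward the current best-scoring one among them."""
    weights = {}
    for _ in range(epochs):
        for ex in data:
            problem, numbers = ex["problem"], ex["numbers"]
            pred = max(TEMPLATES, key=lambda e: score(weights, problem, e))
            if "equation" in ex:  # explicit supervision
                gold = ex["equation"]
            else:                 # implicit supervision: answer only
                matching = [e for e in TEMPLATES
                            if solve(e, numbers) == ex["solution"]]
                if not matching:
                    continue
                gold = max(matching, key=lambda e: score(weights, problem, e))
            if pred != gold:
                for f, v in features(problem, gold).items():
                    weights[f] = weights.get(f, 0.0) + lr * v
                for f, v in features(problem, pred).items():
                    weights[f] = weights.get(f, 0.0) - lr * v
    return weights
```

The implicit branch is the key point: when only the numerical answer is known, the learner searches for equations consistent with that answer and uses the best one as a pseudo-label, which is how solution-only mined data can supplement equation-annotated data.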



Bib Entry

@inproceedings{BCWS16,
  author = {Upadhyay, Shyam and Chang, Ming-Wei and Chang, Kai-Wei and Yih, Wen-tau},
  title = {Learning from Explicit and Implicit Supervision Jointly For Algebra Word Problems},
  booktitle = {EMNLP},
  year = {2016}
}

Related Publications

  1. Relation-Guided Pre-Training for Open-Domain Question Answering, EMNLP-Findings, 2021
  2. An Integer Linear Programming Framework for Mining Constraints from Data, ICML, 2021
  3. Generating Syntactically Controlled Paraphrases without Using Annotated Parallel Pairs, EACL, 2021
  4. Clinical Temporal Relation Extraction with Probabilistic Soft Logic Regularization and Global Inference, AAAI, 2021
  5. PolicyQA: A Reading Comprehension Dataset for Privacy Policies, EMNLP-Findings (short), 2020
  6. GPT-GNN: Generative Pre-Training of Graph Neural Networks, KDD, 2020
  7. SentiBERT: A Transferable Transformer-Based Architecture for Compositional Sentiment Semantics, ACL, 2020
  8. Building Language Models for Text with Named Entities, ACL, 2018