EventPlus: A Temporal Event Understanding Pipeline

Mingyu Derek Ma, Jiao Sun, Mu Yang, Kung-Hsiang Huang, Nuan Wen, Shikhar Singh, Rujun Han, and Nanyun Peng, in 2021 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL), Demonstrations Track, 2021.



Abstract

We present EventPlus, a temporal event understanding pipeline that integrates various state-of-the-art event understanding components, including event trigger and type detection, event argument detection, event duration detection, and temporal relation extraction. Event information, especially event temporal knowledge, is a type of common-sense knowledge that helps people understand how stories evolve and provides predictive hints for future events. As the first comprehensive temporal event understanding pipeline, EventPlus provides a convenient tool for users to quickly obtain annotations about events and their temporal information for any user-provided document. Furthermore, we show that EventPlus can be easily adapted to other domains (e.g., the biomedical domain). We make EventPlus publicly available to facilitate event-related information extraction and downstream applications.
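
To make the combined output of such a pipeline concrete, here is a minimal, hypothetical Python sketch. All names below (EventPipeline, annotate, Event, Annotation) are invented for illustration and are not the actual EventPlus API; the sketch only shows the shape of the annotations the abstract describes: events with triggers, types, arguments, and durations, plus pairwise temporal relations.

# Hypothetical sketch only: class and method names are invented for
# illustration and are NOT the actual EventPlus API.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Event:
    trigger: str                      # word or phrase evoking the event, e.g. "attacked"
    event_type: str                   # e.g. "Conflict:Attack"
    arguments: List[Tuple[str, str]]  # (role, span) pairs, e.g. ("Attacker", "the rebels")
    duration: str                     # coarse duration label, e.g. "minutes"

@dataclass
class Annotation:
    events: List[Event] = field(default_factory=list)
    # temporal relations as (event_i, event_j, label), e.g. (0, 1, "BEFORE")
    relations: List[Tuple[int, int, str]] = field(default_factory=list)

class EventPipeline:
    """Toy stand-in for a pipeline chaining trigger/type detection, argument
    detection, duration detection, and temporal relation extraction."""

    def annotate(self, document: str) -> Annotation:
        # A real system would run a learned model for each component here;
        # this stub only fixes the shape of the combined output.
        raise NotImplementedError("plug in the component models")

# Intended usage (with real component models plugged in):
#   result = EventPipeline().annotate("The rebels attacked the town before dawn.")
#   for i, j, label in result.relations:
#       print(result.events[i].trigger, label, result.events[j].trigger)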



Bib Entry

@inproceedings{ma2021eventplus,
  title = {EventPlus: A Temporal Event Understanding Pipeline},
  author = {Ma, Mingyu Derek and Sun, Jiao and Yang, Mu and Huang, Kung-Hsiang and Wen, Nuan and Singh, Shikhar and Han, Rujun and Peng, Nanyun},
  booktitle = {2021 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL), Demonstrations Track},
  year = {2021}
}

Related Publications

  • Domain Knowledge Empowered Structured Neural Net for End-to-End Event Temporal Relation Extraction

    Rujun Han, Yichao Zhou, and Nanyun Peng, in the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020.
Extracting event temporal relations is a critical task for information extraction and plays an important role in natural language understanding. Prior systems leverage deep learning and pre-trained language models to improve performance on the task. However, these systems often suffer from two shortcomings: 1) when performing maximum a posteriori (MAP) inference based on neural models, previous systems only used structured knowledge that is assumed to be absolutely correct, i.e., hard constraints; 2) they make biased predictions toward dominant temporal relations when trained with a limited amount of data. To address these issues, we propose a framework that enhances deep neural networks with distributional constraints constructed from probabilistic domain knowledge. We solve the constrained inference problem via Lagrangian Relaxation and apply it to end-to-end event temporal relation extraction tasks. Experimental results show that our framework improves the baseline neural network models with strong statistical significance on two widely used datasets in news and clinical domains. (A rough sketch of the constrained-inference idea appears after this publication list.)
    @inproceedings{han2020knowledge,
      title = {Domain Knowledge Empowered Structured Neural Net for End-to-End Event Temporal Relation Extraction},
      author = {Han, Rujun and Zhou, Yichao and Peng, Nanyun},
      booktitle = {the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
      publisher = {Association for Computational Linguistics},
      pages = {5717--5729},
      slideslive_id = {38939236},
      year = {2020}
    }
    
  • TORQUE: A Reading Comprehension Dataset of Temporal Ordering Questions

    Qiang Ning, Hao Wu, Rujun Han, Nanyun Peng, Matt Gardner, and Dan Roth, in the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020.
    A critical part of reading is being able to understand the temporal relationships between events described in a passage of text, even when those relationships are not explicitly stated. However, current machine reading comprehension benchmarks have practically no questions that test temporal phenomena, so systems trained on these benchmarks have no capacity to answer questions such as "what happened before/after [some event]?" We introduce TORQUE, a new English reading comprehension benchmark built on 3.2k news snippets with 21k human-generated questions querying temporal relationships. Results show that RoBERTa-large achieves an exact-match score of 51% on the test set of TORQUE, about 30% behind human performance.
    @inproceedings{ning2020torque,
      title = {TORQUE: A Reading Comprehension Dataset of Temporal Ordering Questions},
      author = {Ning, Qiang and Wu, Hao and Han, Rujun and Peng, Nanyun and Gardner, Matt and Roth, Dan},
      booktitle = {the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
      publisher = {Association for Computational Linguistics},
      pages = {1158--1172},
      slideslive_id = {38938807},
      year = {2020}
    }
    
  • Joint Event and Temporal Relation Extraction with Shared Representations and Structured Prediction

    Rujun Han, Qiang Ning, and Nanyun Peng, in 2019 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2019.
    @inproceedings{han2019joint,
      title = {Joint Event and Temporal Relation Extraction with Shared Representations and Structured Prediction},
      author = {Han, Rujun and Ning, Qiang and Peng, Nanyun},
      booktitle = {2019 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
      year = {2019}
    }
    
  • Deep Structured Neural Network for Event Temporal Relation Extraction

Rujun Han, I-Hung Hsu, Mu Yang, Aram Galstyan, Ralph Weischedel, and Nanyun Peng, in The 2019 SIGNLL Conference on Computational Natural Language Learning (CoNLL), 2019.
    @inproceedings{han2019deep,
      title = {Deep Structured Neural Network for Event Temporal Relation Extraction},
      author = {Han, Rujun and Hsu, I-Hung and Yang, Mu and Galstyan, Aram and Weischedel, Ralph and Peng, Nanyun},
      booktitle = {The 2019 SIGNLL Conference on Computational Natural Language Learning (CoNLL)},
      year = {2019}
    }
    
  • Contextualized Word Embeddings Enhanced Event Temporal Relation Extraction for Story Understanding

    Rujun Han, Mengyue Liang, Bashar Alhafni, and Nanyun Peng, in 2019 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL-HLT 2019), Workshop on Narrative Understanding, 2019.
    @inproceedings{han2019contextualized,
      title = {Contextualized Word Embeddings Enhanced Event Temporal Relation Extraction for Story Understanding},
      author = {Han, Rujun and Liang, Mengyue and Alhafni, Bashar and Peng, Nanyun},
      booktitle = {2019 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL-HLT 2019), Workshop on Narrative Understanding},
      year = {2019}
    }
    
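The constrained-inference idea behind "Domain Knowledge Empowered Structured Neural Net for End-to-End Event Temporal Relation Extraction" above can be illustrated in a few lines: a minimal sketch, assuming a single corpus-level proportion constraint, randomly generated stand-in model scores, and a plain dual subgradient update. It is not the paper's implementation, only a small instance of MAP decoding with a distributional constraint handled via Lagrangian relaxation.

# Minimal sketch: MAP decoding with one distributional (soft) constraint,
# handled via Lagrangian relaxation. Scores, the target proportion, and the
# update schedule are invented for illustration; this is not the paper's code.
import numpy as np

LABELS = ["BEFORE", "AFTER", "EQUAL", "VAGUE"]
BEFORE = LABELS.index("BEFORE")

rng = np.random.default_rng(0)
scores = rng.normal(size=(200, len(LABELS)))  # stand-in neural scores for 200 event pairs

target_before = 0.45  # hypothetical domain knowledge: ~45% of pairs are BEFORE
lam, step = 0.0, 0.1  # Lagrange multiplier and dual step size

for _ in range(50):
    # MAP decoding with the constraint folded into the label scores.
    adjusted = scores.copy()
    adjusted[:, BEFORE] += lam
    preds = adjusted.argmax(axis=1)

    # Dual subgradient: move lambda toward satisfying the proportion constraint.
    lam += step * (target_before - float(np.mean(preds == BEFORE)))

print("decoded BEFORE proportion:", float(np.mean(preds == BEFORE)))
print("target:", target_before)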