What is it about?
People write dates and times in a wide variety of ways, so it has been difficult for computers to translate this human language into a computer-interpretable form. We have shown that neural network models, coupled with a linguistic theory for how pieces of a time expression are composed together, can learn to accurately parse the words of English time expressions into intervals on the timeline.
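To give a flavor of what "composing pieces of a time expression" means, here is a minimal sketch in Python. The function name and interface are hypothetical, not the actual TimeNorm API: the idea is that a phrase like "next Thursday" is built from smaller pieces, a repeating interval (Thursdays) and an operator (next) that anchors it to a reference time, yielding an interval on the timeline.

```python
from datetime import date, timedelta

def next_day_of_week(anchor: date, weekday: int) -> tuple:
    """Interval [start, end) of the next given weekday (Monday=0 ... Sunday=6)
    strictly after the anchor date."""
    # Days until the next occurrence of `weekday`; always at least 1,
    # so "next Thursday" uttered on a Thursday means the following week.
    days_ahead = (weekday - anchor.weekday() - 1) % 7 + 1
    start = anchor + timedelta(days=days_ahead)
    return start, start + timedelta(days=1)

# "next Thursday" anchored at a document time of Monday, 2018-01-15
start, end = next_day_of_week(date(2018, 1, 15), weekday=3)
print(start, end)  # the one-day interval covering Thursday, 2018-01-18
```

The full annotation scheme defines many such composable operators (last, this, between, and so on), which is what lets a small inventory of pieces cover the wide variety of ways people write times.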
Why is it important?
This is the first work to successfully train machine learning models that read text character by character and both extract and normalize the time expressions it contains.
Read the Original
This page is a summary of: From Characters to Time Intervals: New Paradigms for Evaluation and Neural Parsing of Time Normalizations, Transactions of the Association for Computational Linguistics, January 2018, The MIT Press. DOI: 10.1162/tacl_a_00025.
Resources
Semantically Compositional Annotation of Time Expressions (SCATE) corpus
A corpus of manually annotated time expressions, first reported in "A Semantically Compositional Annotation Scheme for Time Normalization": http://www.lrec-conf.org/proceedings/lrec2016/pdf/288_Paper.pdf.
TimeNorm Code
Code for the time expression normalization model.