Abstract

Traditional noun phrase coreference resolution systems represent features only of pairs of noun phrases. In this paper, we propose a machine learning method that enables features over sets of noun phrases, resulting in a first-order probabilistic model for coreference. We outline a set of approximations that make this approach practical, and apply our method to the ACE coreference dataset, achieving a 45% error reduction over a comparable method that only considers features of pairs of noun phrases. This result demonstrates how a first-order logic representation can be incorporated into a probabilistic model and scaled efficiently.
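
To illustrate the distinction the abstract draws, the sketch below contrasts features computed over a single pair of mentions with first-order features computed over an entire candidate entity (a set of mentions). The mention attributes and feature names here are illustrative assumptions for exposition only, not the paper's actual feature templates or model.

# Minimal sketch: pairwise vs. first-order (set-level) coreference features.
# Attribute and feature names are hypothetical, not taken from the paper.

def pairwise_features(m1, m2):
    """Features of a single pair of mentions, as in traditional systems."""
    return {
        "head_match": m1["head"] == m2["head"],
        "gender_agree": m1["gender"] == m2["gender"],
    }

def first_order_features(cluster):
    """Features over a whole set of mentions proposed as one entity."""
    heads = {m["head"] for m in cluster}
    genders = {m["gender"] for m in cluster}
    return {
        "all_heads_match": len(heads) == 1,
        "all_genders_agree": len(genders) == 1,
        "cluster_size": len(cluster),
    }

mentions = [
    {"head": "Clinton", "gender": "fem"},
    {"head": "she",     "gender": "fem"},
    {"head": "Clinton", "gender": "fem"},
]
print(pairwise_features(mentions[0], mentions[1]))
print(first_order_features(mentions))

A pairwise model scores each pair in isolation, whereas a first-order model can ask questions of the whole proposed entity (e.g., whether every mention agrees in gender), which is the kind of set-level evidence the paper's approach makes available.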

Citation

@inproceedings{culotta07first,
  author = {Aron Culotta and Michael Wick and Robert Hall and Andrew McCallum},
  title = {First-Order Probabilistic Models for Coreference Resolution},
  booktitle = {Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics (HLT/NAACL)},
  year = {2007},
  address = {Rochester, NY},
  pages = {81--88},
}