English Abstract
Coreference resolution is one of the core tasks in natural language processing, with important applications in areas such as question answering, machine translation, automatic summarization, and named entity extraction. The task is to identify the nouns and pronouns in a text that refer to the same entity; resolving these references is therefore necessary for understanding documents, or even individual sentences. Coreference resolution methods can be divided into two categories: linguistic methods and machine learning methods. Linguistic techniques require more linguistic knowledge, but they tend to be error-prone and time-consuming to implement, whereas machine learning methods require less linguistic information and produce more reliable results. In recent years, corpus-based machine learning approaches to coreference resolution have flourished and achieved acceptable results; such methods, however, require a sufficiently large labeled corpus. In this thesis, we study the coreference resolution process: the corpus underlying the work is annotated, and the proposed algorithm predicts coreferent nominal expressions. In the first step, using the PCAC-2008 corpus, which carries both anaphora and coreference annotations, we present a system that identifies the coreferent expressions in a text. In the next step, we enumerate a set of features and extract positive and negative samples from the corpus by varying these features; the goal of this step is to determine which features have a significant effect on raising or lowering precision, recall, and F-measure. In the last step, we use a neural network as the base learning algorithm to classify the resulting samples.
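As a rough illustration of the pipeline described above, the sketch below builds positive and negative mention-pair samples from a handful of toy mentions, trains a single logistic unit (the smallest possible stand-in for the neural-network learner), and scores it with precision, recall, and F-measure. All mention attributes, feature names, and data here are invented for illustration; they are not taken from PCAC-2008 or from the thesis itself.

```python
import math

# Toy mentions with invented attributes (head word, gender, number,
# sentence index, gold entity id). In practice a corpus such as
# PCAC-2008 would supply such annotations; these values are made up.
mentions = [
    {"head": "ali",    "gender": "m", "number": "sg", "sent": 0, "entity": 1},
    {"head": "he",     "gender": "m", "number": "sg", "sent": 1, "entity": 1},
    {"head": "maryam", "gender": "f", "number": "sg", "sent": 1, "entity": 2},
    {"head": "she",    "gender": "f", "number": "sg", "sent": 2, "entity": 2},
    {"head": "book",   "gender": "n", "number": "sg", "sent": 0, "entity": 3},
]

def pair_features(m1, m2):
    """Simple, hypothetical features for one candidate mention pair."""
    return [
        1.0 if m1["head"] == m2["head"] else 0.0,      # head-word match
        1.0 if m1["gender"] == m2["gender"] else 0.0,  # gender agreement
        1.0 if m1["number"] == m2["number"] else 0.0,  # number agreement
        abs(m1["sent"] - m2["sent"]) / 10.0,           # sentence distance
    ]

# A pair is a positive sample if both mentions belong to the same
# gold entity, and a negative sample otherwise.
data = []
for i in range(len(mentions)):
    for j in range(i + 1, len(mentions)):
        m1, m2 = mentions[i], mentions[j]
        data.append((pair_features(m1, m2), 1 if m1["entity"] == m2["entity"] else 0))

# Train one logistic unit with stochastic gradient descent on log-loss.
w, b = [0.0] * 4, 0.0
for _ in range(200):
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
        g = p - y                                  # gradient of log-loss
        w = [wi - 0.5 * g * xi for wi, xi in zip(w, x)]
        b -= 0.5 * g

def predict(x):
    return sum(wi * xi for wi, xi in zip(w, x)) + b >= 0.0

# Evaluate the classifier with precision, recall, and F-measure.
tp = sum(1 for x, y in data if predict(x) and y == 1)
fp = sum(1 for x, y in data if predict(x) and y == 0)
fn = sum(1 for x, y in data if not predict(x) and y == 1)
precision = tp / (tp + fp) if tp + fp else 0.0
recall = tp / (tp + fn) if tp + fn else 0.0
f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
print("P=%.2f R=%.2f F=%.2f" % (precision, recall, f1))
```

On this tiny, cleanly separable toy set the unit learns to rely on the gender-agreement feature; the same pairwise setup scales to richer syntactic features and a real multi-layer network.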
The results show that, when specific syntactic features are taken into account, the neural network learner outperforms previous work.