Computación y Sistemas

On-line version ISSN 2007-9737, print version ISSN 1405-5546

Abstract

ALI BATITA, Mohamed; AYADI, Rami and ZRIGUI, Mounir. Reasoning over Arabic WordNet Relations with Neural Tensor Network. Comp. y Sist. [online]. 2019, vol.23, n.3, pp.935-942. Epub 09-Aug-2021. ISSN 2007-9737. https://doi.org/10.13053/cys-23-3-3240.

Arabic WordNet is an important resource for many natural language processing tasks. However, it suffers from several problems. In this paper, we address the problem of unseen relationships between words in Arabic WordNet. More precisely, we focus on learning new relationships automatically from the existing ones. We investigate how the Neural Tensor Network can be used to fill the relationship gaps between Arabic WordNet words. With minimal resources, this model delivers meaningful results. The critical component is how the entities of Arabic WordNet are represented. For that, we use AraVec, a set of pre-trained distributed word representations for the Arabic language, and we show how much initializing the model with these vectors helps. We evaluate the model using a number of tests, which reveal that semantically initialized vectors provide considerably greater accuracy than randomly initialized ones.
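For context, the relation-scoring function of the Neural Tensor Network (Socher et al., 2013), on which this approach builds, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name, the 300-dimensional AraVec-style vectors, and all parameter shapes are assumptions.

```python
# Minimal NumPy sketch of the Neural Tensor Network (NTN) scoring function
# g(e1, R, e2) = u_R^T tanh(e1^T W_R^[1:k] e2 + V_R [e1; e2] + b_R).
# All names and dimensions are illustrative assumptions.
import numpy as np

def ntn_score(e1, e2, W, V, b, u):
    """Score the plausibility of a relation between entity vectors e1 and e2.

    e1, e2 : (d,)      entity embeddings (e.g. AraVec vectors for AWN entries)
    W      : (k, d, d) relation-specific tensor (k bilinear slices)
    V      : (k, 2d)   relation-specific standard layer weights
    b      : (k,)      bias
    u      : (k,)      output layer weights
    """
    bilinear = np.einsum('i,kij,j->k', e1, W, e2)    # e1^T W[slice] e2 for each slice
    standard = V @ np.concatenate([e1, e2])          # V [e1; e2]
    return float(u @ np.tanh(bilinear + standard + b))

# Toy usage with random parameters for one relation; in practice e1 and e2
# would be AraVec embeddings and the parameters would be trained.
d, k = 300, 4
rng = np.random.default_rng(0)
e1, e2 = rng.normal(size=d), rng.normal(size=d)
W = rng.normal(scale=0.01, size=(k, d, d))
V = rng.normal(scale=0.01, size=(k, 2 * d))
b = np.zeros(k)
u = rng.normal(size=k)
print(ntn_score(e1, e2, W, V, b, u))
```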

Keywords: Arabic WordNet; natural language processing; neural tensor network; AraVec; word representation; word embedding.

        · full text in English     · English ( pdf )