Computación y Sistemas

Online version ISSN 2007-9737 · Print version ISSN 1405-5546

Abstract

URIBE, Diego and CUAN, Enrique. How Much Deep is Deep Enough?. Comp. y Sist. [online]. 2022, vol. 26, n. 2, pp. 921-938. Epub 10-Mar-2023. ISSN 2007-9737. https://doi.org/10.13053/cys-26-2-4260.

Typical deep learning models defined in terms of multiple layers rest on the assumption that a hierarchical model yields a better representation than a shallow one. Nevertheless, increasing the depth of the model by adding layers can cause the model to get lost or stuck during the optimization process. This paper investigates the impact of linguistic complexity characteristics of text on a deep learning model defined as a stacked architecture. Since the optimal number of stacked recurrent neural layers is specific to each application, we examine the optimal number of stacked recurrent layers corresponding to each linguistic characteristic. Last but not least, we also analyze the computational cost incurred by increasing the depth of a stacked recurrent architecture implemented for a linguistic characteristic.
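To make the notion of "depth of a stacked recurrent architecture" concrete, the following is a minimal sketch, not taken from the paper, of how such an architecture can be built with the number of stacked layers as a parameter so that different depths can be compared. The framework choice (Keras), layer type (LSTM), sizes, and hyperparameters are illustrative assumptions, not the authors' settings.

```python
# Minimal sketch (not from the paper): a text classifier whose depth,
# i.e. the number of stacked recurrent layers, is a parameter.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_stacked_rnn(depth, vocab_size=20000, embed_dim=128,
                      units=64, num_classes=2):
    model = models.Sequential()
    model.add(layers.Embedding(vocab_size, embed_dim))
    # Every recurrent layer except the last must return full sequences
    # so that the next stacked layer receives a sequence as input.
    for i in range(depth):
        model.add(layers.LSTM(units, return_sequences=(i < depth - 1)))
    model.add(layers.Dense(num_classes, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Compare models of increasing depth, e.g. by parameter count or,
# after training on the same data, by validation accuracy and run time.
for depth in (1, 2, 3, 4):
    model = build_stacked_rnn(depth)
    print(depth, model.count_params())
```

Training each depth on the same task and recording accuracy and wall-clock time is one way to expose the trade-off the abstract describes between deeper stacks and their computational cost.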

Keywords: Recurrent neural networks; stacked architectures; linguistic characteristics.

        · text in English     · English (pdf)