Computación y Sistemas

On-line version ISSN 2007-9737 · Print version ISSN 1405-5546

Abstract

URIBE, Diego and CUAN, Enrique. How Much Deep is Deep Enough?. Comp. y Sist. [online]. 2022, vol. 26, n. 2, pp. 921-938. Epub 10-Mar-2023. ISSN 2007-9737. https://doi.org/10.13053/cys-26-2-4260.

Typical deep learning models defined in terms of multiple layers are based on the assumption that a hierarchical model yields a better representation than a shallow one. Nevertheless, increasing the depth of the model by adding layers can cause it to get lost or stuck during the optimization process. This paper investigates the impact of linguistic complexity characteristics of text on a deep learning model defined in terms of a stacked architecture. Since the optimal number of stacked recurrent neural layers is specific to each application, we examine the optimal number of stacked recurrent layers corresponding to each linguistic characteristic. Finally, we analyze the computational cost incurred by increasing the depth of a stacked recurrent architecture implemented for a linguistic characteristic.
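To make the notion of a "stacked recurrent architecture" concrete, the sketch below shows a text classifier whose recurrent depth is a hyperparameter, so that models of depth 1, 2, 3, ... can be compared as in the paper's central question. This is an illustrative PyTorch sketch, not the authors' implementation; the class name, layer sizes, and vocabulary size are assumptions chosen for readability.

```python
# Illustrative sketch (not the authors' code): a classifier whose recurrent
# depth is a hyperparameter, showing what "stacking" recurrent layers means.
import torch
import torch.nn as nn

class StackedRNNClassifier(nn.Module):
    def __init__(self, vocab_size=20000, embed_dim=128,
                 hidden_dim=64, num_layers=2, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # num_layers controls how many recurrent layers are stacked:
        # each layer consumes the full output sequence of the layer below.
        self.rnn = nn.LSTM(embed_dim, hidden_dim,
                           num_layers=num_layers, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)       # (batch, seq, embed_dim)
        _, (last_hidden, _) = self.rnn(embedded)   # last_hidden: (layers, batch, hidden)
        return self.classifier(last_hidden[-1])    # final state of the top layer

# Comparing depths, as in the paper's question of "how deep is deep enough":
for depth in (1, 2, 3, 4):
    model = StackedRNNClassifier(num_layers=depth)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"depth={depth}: {n_params:,} parameters")
```

The parameter count printed for each depth hints at the computational cost the abstract mentions: every additional stacked layer adds parameters and lengthens the gradient path through time and depth, which is precisely why the optimal depth must be found empirically per task.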

Keywords: Recurrent neural networks; stacked architectures; linguistic characteristics.

· text in English · English (pdf)