Convolutional neural network training with dynamic epoch ordering


Abstract:

This paper presents a novel approach to feeding data to a Convolutional Neural Network (CNN) during training. Normally, neural networks are fed shuffled data without any control over which types of examples a mini-batch contains. When data are abundant and there is no imbalance between classes, shuffling the training data is enough to ensure balanced mini-batches. In contrast, most real-world problems involve datasets in which some classes predominate over others, ill-conditioning training so that the network learns those classes while forgetting the rest. For such cases, the most common methods simply discard samples until the data are balanced, whereas this paper proposes an ordered method of feeding data that preserves randomness in mini-batch composition while using all available samples. The method has proven to solve the problem of unbalanced datasets while remaining competitive with other methods. Moreover, the paper focuses on a well-known CNN architecture, Deep Residual Networks.
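
The abstract does not give implementation details, so the following Python sketch is only one plausible reading of the idea it describes: composing each mini-batch with a controlled, roughly equal number of samples per class while keeping the choice of samples random and cycling per-class pools so that no samples are discarded. The function name balanced_batches and all parameters are assumptions for illustration, not the paper's actual algorithm.

import numpy as np

def balanced_batches(labels, batch_size, rng=None):
    # Illustrative sketch (assumed interface): yield mini-batch index arrays
    # in which each class contributes ~batch_size / n_classes samples.
    # Per-class index pools are reshuffled and cycled when exhausted, so
    # every sample is eventually used and batch composition stays random.
    rng = np.random.default_rng() if rng is None else rng
    labels = np.asarray(labels)
    classes = np.unique(labels)
    per_class = max(1, batch_size // len(classes))

    # One shuffled index pool and a read cursor per class.
    pools = {c: rng.permutation(np.where(labels == c)[0]) for c in classes}
    cursors = {c: 0 for c in classes}

    n_batches = int(np.ceil(len(labels) / batch_size))
    for _ in range(n_batches):
        batch = []
        for c in classes:
            take = pools[c][cursors[c]:cursors[c] + per_class]
            if len(take) < per_class:
                # Pool exhausted: reshuffle the class indices and wrap around.
                pools[c] = rng.permutation(np.where(labels == c)[0])
                cursors[c] = per_class - len(take)
                take = np.concatenate([take, pools[c][:cursors[c]]])
            else:
                cursors[c] += per_class
            batch.append(take)
        # Shuffle within the batch so class blocks are not contiguous.
        yield rng.permutation(np.concatenate(batch))

For example, with labels = np.array([0]*90 + [1]*10) and batch_size=10, each batch contains about five samples of each class; the minority class is recycled within the epoch while all majority-class samples are still visited, instead of being discarded to force balance.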

Publication year:

2019

Keywords:

  • Deep learning
  • Data management
  • Supervised learning
  • Convolutional neural networks
  • Deep residual networks

Source:

Scopus

Document type:

Conference Object

Status:

Restricted access

Knowledge areas:

  • Machine learning
  • Computer science

Subject areas:

  • Computer science