Increase attractor capacity using an ensembled neural network
Abstract:
This work presents an ensemble of Attractor Neural Network (ANN) modules that increases pattern storage at a computational cost similar to that of a single-module ANN system. We build the ensemble of ANN components and divide the set of uniform random patterns into disjoint subsets during the learning stage, so that each subset is assigned to a different component. In this way, a larger overall number of patterns can be stored by the ANN ensemble, where each module carries only a moderate pattern load and is able to retrieve its assigned subset with the desired quality. By allowing some noise in the retrieval, we can recall a larger number of patterns while discriminating between the pattern subsets assigned to each component of the ensemble. We show that an ANN ensemble system with N = 10^4 units is able to approximately triple the maximal capacity of a single ANN with similar wiring costs. We tested the modularized ANN ensemble for different levels of component dilution while keeping the wiring costs constant. This approach could be implemented, for instance, with parallel computing in order to tackle computationally costly real-world problems.
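The divide-and-conquer scheme described in the abstract can be illustrated with a minimal sketch: disjoint subsets of random binary patterns are stored in separate Hopfield-type modules via the standard Hebbian rule, and retrieval picks the module whose attractor best matches the probe. The names (HopfieldModule, build_ensemble, ensemble_recall) and all parameter values below are illustrative assumptions, not the authors' implementation.

```python
# Sketch of an ensemble of Hopfield-type modules with disjoint pattern subsets.
# Assumed, simplified setup; not the code from the paper.
import numpy as np

class HopfieldModule:
    def __init__(self, n_units):
        self.n = n_units
        self.W = np.zeros((n_units, n_units))

    def store(self, patterns):
        # Hebbian learning restricted to the subset assigned to this module
        for p in patterns:
            self.W += np.outer(p, p)
        np.fill_diagonal(self.W, 0.0)

    def retrieve(self, probe, steps=20):
        # Iterated sign updates drive the state toward a stored attractor
        s = probe.copy()
        for _ in range(steps):
            s = np.sign(self.W @ s)
            s[s == 0] = 1
        return s

def build_ensemble(patterns, n_modules):
    # Disjoint assignment of the pattern set to the modules
    subsets = np.array_split(patterns, n_modules)
    modules = []
    for subset in subsets:
        m = HopfieldModule(patterns.shape[1])
        m.store(subset)
        modules.append(m)
    return modules

def ensemble_recall(modules, probe):
    # Each module retrieves its candidate; keep the one closest to the probe
    candidates = [m.retrieve(probe) for m in modules]
    overlaps = [abs(c @ probe) / len(probe) for c in candidates]
    return candidates[int(np.argmax(overlaps))]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, P, K = 200, 30, 3                      # toy sizes: units, patterns, modules
    patterns = rng.choice([-1, 1], size=(P, N))
    ensemble = build_ensemble(patterns, K)
    noisy = patterns[0] * np.where(rng.random(N) < 0.1, -1, 1)  # ~10% flipped bits
    recalled = ensemble_recall(ensemble, noisy)
    print("overlap with original:", (recalled @ patterns[0]) / N)
```

Splitting the pattern load this way keeps each module below its critical loading, which is how the ensemble can reach a higher overall capacity at comparable wiring cost; the modules can also be trained and probed in parallel.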
Year of publication:
2017
Keywords:
- Synaptic dilution
- Ensemble of Attractor Neural Networks
- Hopfield-type network
- Network wiring cost
- Divide-and-conquer parallelizing
- Storage capacity
Source:
Document type:
Article
Status:
Restricted access
Knowledge areas:
- Machine learning
- Computer science
Subject areas:
- Computer science