Signature recognition: Human performance analysis vs. automatic system and feature extraction via crowdsourcing


Abstract:

This paper identifies discriminative features for verifying the authenticity of signatures by comparing standardized responses from a group of people with no experience in signature recognition, collected through a manual crowdsourcing-based system, against the performance of an automatic system with two classifiers. To this end, an experimental protocol was implemented through interfaces programmed in HTML and published on the Amazon Mechanical Turk platform, which collected responses from 500 workers on the authenticity of the signatures shown to them. Normalizing these responses yields several general features from which discriminative features for signature recognition are extracted. A comparative analysis in terms of False Acceptance Rate and False Rejection Rate supports the presented features, which will serve in future performance-analysis studies for the implementation of automatic and semi-automatic signature recognition systems supporting financial, legal, and security applications.
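The abstract compares human and automatic performance using False Acceptance Rate (FAR) and False Rejection Rate (FRR). As a minimal illustrative sketch (not taken from the paper; the labels and decisions below are hypothetical), these two rates can be computed from binary genuine/forgery decisions like so:

```python
# Minimal sketch: computing FAR and FRR from verification decisions.
# Convention (assumed): 1 = accepted as genuine, 0 = rejected as forgery.

def far_frr(labels, preds):
    """Return (FAR, FRR) given ground-truth labels and system decisions."""
    false_accepts = sum(1 for y, p in zip(labels, preds) if y == 0 and p == 1)
    false_rejects = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 0)
    forgeries = sum(1 for y in labels if y == 0)
    genuines = sum(1 for y in labels if y == 1)
    far = false_accepts / forgeries if forgeries else 0.0
    frr = false_rejects / genuines if genuines else 0.0
    return far, frr

# Hypothetical toy data: 4 genuine signatures and 4 forgeries.
labels = [1, 1, 1, 0, 0, 0, 1, 0]
preds  = [1, 0, 1, 1, 0, 0, 1, 0]
print(far_frr(labels, preds))  # → (0.25, 0.25)
```

Here one forgery out of four is accepted (FAR = 0.25) and one genuine signature out of four is rejected (FRR = 0.25); the trade-off between the two rates is what such comparisons analyze.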

Publication year:

2016

Keywords:

  • workers
  • FRR
  • Amazon Mechanical Turk
  • crowdsourcing
  • FAR

Source:

Scopus

Document type:

Conference Object

Status:

Restricted access

Knowledge areas:

  • Machine learning
  • Computer science

Subject areas:

  • Computer science
  • Special computer methods
  • Operations of libraries and archives