The Robust Reading Competition Annotation and Evaluation Platform


Abstract:

The ICDAR Robust Reading Competition (RRC), initiated in 2003 and re-established in 2011, has become a de facto evaluation standard for robust reading systems and algorithms. Concurrent with its second incarnation in 2011, an ongoing effort began to develop an online framework to facilitate the hosting and management of competitions. This paper outlines the Robust Reading Competition Annotation and Evaluation Platform, the backbone of the competitions. The RRC Annotation and Evaluation Platform is a modular framework, fully accessible through online interfaces. It comprises a collection of tools and services for managing all processes involved in defining and evaluating a research task, from dataset definition to annotation management, evaluation specification and results analysis. Although the framework has been designed with robust reading research in mind, many of the provided tools are generic by design. All aspects of the RRC Annotation and Evaluation Platform are available for research use.

Year of publication:

2018

Keywords:

  • Online platform
  • Robust reading
  • Performance evaluation
  • Data annotation
  • Ground truthing

Source:

Scopus

Document type:

Conference Object

Status:

Restricted access

Knowledge areas:

    Thematic areas:

    • Operations of libraries and archives
    • Documentary, educational and news media; journalism
    • Rhetoric and collections of literature