Title: The Multistage Approach to Information Extraction in Degraded Document Images
Contributor(s): Chen, Yan (author); Leedham, Graham (author)
Publication Date: 2004
DOI: 10.1109/ICPR.2004.1334154
Abstract: Global and local adaptive thresholding techniques have been shown to be effective on particular types of documents, but none produces consistently good results across all document types. In this paper a novel method, called the multistage approach, is presented and compared against existing single-stage algorithms. The multistage approach recursively breaks an image down into sub-regions using quad-tree decomposition and extracts local features from each sub-region until an appropriate thresholding method can be applied to it. Quantitative analysis using word recall on 300 degraded historical images obtained from the Library of Congress demonstrates that the method is superior to any existing single method.
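The recursive scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: grey-level variance stands in for the paper's local features, Otsu's method stands in for the per-region thresholding, and the `var_limit` and `min_size` parameters are assumptions chosen for the sketch.

```python
import numpy as np

def otsu_threshold(region):
    """Otsu's global threshold for an array of 0-255 pixel values."""
    hist = np.bincount(region.ravel(), minlength=256).astype(float)
    total = hist.sum()
    cum = np.cumsum(hist)                          # cumulative pixel counts
    cum_mean = np.cumsum(hist * np.arange(256))    # cumulative intensity sums
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 = cum[t] / total        # weight of the "background" class
        w1 = 1.0 - w0              # weight of the "foreground" class
        if w0 == 0.0 or w1 == 0.0:
            continue
        m0 = cum_mean[t] / cum[t]
        m1 = (cum_mean[-1] - cum_mean[t]) / (cum[-1] - cum[t])
        between_var = w0 * w1 * (m0 - m1) ** 2
        if between_var > best_var:
            best_var, best_t = between_var, t
    return best_t

def binarize_quadtree(img, out, y0, x0, h, w, var_limit=900.0, min_size=8):
    """Recursively split a region into four quadrants until its grey-level
    variance is low enough (or it is too small to split), then apply a
    local Otsu threshold to that sub-region."""
    region = img[y0:y0 + h, x0:x0 + w]
    if h <= min_size or w <= min_size or region.var() <= var_limit:
        t = otsu_threshold(region)
        out[y0:y0 + h, x0:x0 + w] = (region > t).astype(np.uint8) * 255
        return
    h2, w2 = h // 2, w // 2
    for dy, dh in ((0, h2), (h2, h - h2)):
        for dx, dw in ((0, w2), (w2, w - w2)):
            binarize_quadtree(img, out, y0 + dy, x0 + dx, dh, dw,
                              var_limit, min_size)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic "degraded" page: dark left half, bright right half, plus noise.
    img = np.full((64, 64), 40, dtype=np.uint8)
    img[:, 32:] = 200
    img = np.clip(img.astype(int) + rng.integers(-10, 10, img.shape),
                  0, 255).astype(np.uint8)
    out = np.zeros_like(img)
    binarize_quadtree(img, out, 0, 0, 64, 64)
    print(np.unique(out))  # the output image is binarized to 0/255
```

The uneven background forces one level of splitting, after which each low-variance quadrant is thresholded on its own, which is the essence of deferring the choice of threshold until a sub-region is simple enough to handle.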
Publication Type: Conference Publication
Conference Details: ICPR'04: 17th International Conference on Pattern Recognition, Cambridge, United Kingdom, 23rd - 26th August, 2004
Source of Publication: Proceedings of the 17th International Conference on Pattern Recognition (ICPR'04), v.1, p. 445-449
Publisher: IEEE: Institute of Electrical and Electronics Engineers
Place of Publication: Piscataway, United States of America
ISSN: 1051-4651
Field of Research (FOR): 080106 Image Processing
Socio-Economic Objective (SEO): 810107 National Security; 890299 Computer Software and Services not elsewhere classified
Peer Reviewed: Yes
HERDC Category Description: E1 Refereed Scholarly Conference Publication
Appears in Collections: Conference Publication

Items in Research UNE are protected by copyright, with all rights reserved, unless otherwise indicated.