TY - JOUR
AU - Preethi, Padmaprabha
AU - Mamatha, Hosahalli Ramappa
PY - 2022/09/06
Y2 - 2024/03/29
TI - Region-Based Convolutional Neural Network for Segmenting Text in Epigraphical Images
JF - Artificial Intelligence and Applications
JA - AIA
VL - 1
IS - 2
SE - Research Article
DO - 10.47852/bonviewAIA2202293
UR - https://ojs.bonviewpress.com/index.php/AIA/article/view/293
SP - 119-127
AB - Indian history is derived from ancient writings on inscriptions, palm leaves, copper plates, coins, and many other media. Epigraphers read these inscriptions and produce meaningful interpretations. Automating this reading process is the focus of our study, and in this paper segmentation for detecting text in digitized inscriptional images is treated in detail. Character segmentation from epigraphical images aids the optical character recognizer in the training and recognition of old regional scripts. The epigraphical images are drawn from estampages containing scripts from various periods, from Brahmi in the 3rd century BC to the medieval period of the 15th century AD. The scripts or characters present in digitized epigraphical images are illegible and have complex, noisy background textures. To achieve script/text segmentation, a region-based convolutional neural network (CNN) is employed to detect characters in the images. The proposed method uses selective search to identify candidate text regions and forwards them to trained CNN models to extract feature vectors. These feature vectors are fed to support vector machine classifiers for classification, and text is recognized by drawing a bounding box based on the confidence score. AlexNet, VGG16, ResNet50, and InceptionV3 are used as CNN models for experimentation, with InceptionV3 performing best. A total of 197 images are used for experimentation: 70 printed denoised epigraphical images, 40 denoised estampage images, and 87 noisy estampage images. InceptionV3 records segmentation results of 74.79% for printed denoised epigraphical images, 71.53% for denoised estampage images, and 18.11% for noisy estampage images. The segmented characters are used for epigraphical applications such as period/era prediction and character recognition. Fast and Faster region-based CNN approaches were also tested and illustrated in this paper. Received: 1 July 2022 | Revised: 31 August 2022 | Accepted: 6 September 2022. Conflicts of Interest: The authors declare that they have no conflicts of interest to this work.
ER -