Detection on Cell Cancer Using the Deep Transfer Learning and Histogram Based Image Focus Quality Assessment

Citation

Bhuiyan, Md Roman and Abdullah, Junaidi (2022) Detection on Cell Cancer Using the Deep Transfer Learning and Histogram Based Image Focus Quality Assessment. Sensors, 22 (18). p. 7007. ISSN 1424-8220

Abstract

In recent years, the number of studies using whole-slide images (WSIs) of histopathology slides has expanded significantly. For the development and validation of artificial intelligence (AI) systems, glass slides from retrospective cohorts, including patient follow-up data, have been digitized. It has become crucial to determine whether the quality of such resources meets the minimum requirements for future AI development. The need for automated quality control is one of the obstacles preventing the clinical implementation of digital pathology workflows. Because scanners are imprecise in determining the focus of an image, the resulting visual blur can render a scanned slide useless. Moreover, when scanned at a resolution of 20× or higher, the resulting image of a slide is often enormous. Therefore, for digital pathology to be clinically relevant, computational algorithms must rapidly and reliably measure an image's focus quality and decide whether the image requires re-scanning. We propose a metric for evaluating the quality of digital pathology images that uses a sum of even-derivative filter bases to generate a human-visual-system-like kernel, described as the inverse of the lens's point spread function. This kernel is applied to a digital pathology image to modify the high-frequency image content degraded by the scanner's optics and to assess patch-level focus quality. Through several studies, we demonstrate that our technique correlates with ground-truth z-level data better than previous methods and is computationally efficient. Using deep learning techniques, the proposed system is able to identify positive and negative cancer cells in images. We further extend our technique to create a local, slide-level focus quality heatmap, which can be used for automated slide quality control, and we illustrate the method's value in clinical scan quality control by comparing it with subjective slide quality ratings. The proposed method, GoogLeNet, VGGNet, and ResNet achieved accuracies of 98.50%, 94.50%, 94.00%, and 95.00%, respectively.
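
The abstract describes the focus-quality metric only at a high level: a kernel built as a sum of even-derivative filter bases, convolved with the image to expose the high-frequency content lost to defocus, then aggregated per patch into a slide-level heatmap. The sketch below is one plausible reading of that pipeline in Python (NumPy/SciPy); the Gaussian-derivative bases, orders, weights, patch size, and the function names `even_derivative_kernel` and `patch_focus_scores` are illustrative assumptions, not values or code from the paper.

```python
import numpy as np
from scipy.ndimage import convolve

def even_derivative_kernel(size=9, sigma=1.0, orders=(2, 4, 6), weights=(1.0, 0.5, 0.25)):
    """Build a 1-D kernel as a weighted sum of even-order Gaussian derivatives.
    Illustrative stand-in for the paper's HVS-like kernel; the bases, orders,
    and weights here are assumptions, not the published values."""
    x = np.arange(size) - size // 2
    g = np.exp(-x**2 / (2 * sigma**2))
    g /= g.sum()
    kernel = np.zeros(size)
    for order, w in zip(orders, weights):
        d = g.copy()
        for _ in range(order):
            d = np.gradient(d)          # finite-difference approximation of the n-th derivative
        kernel += w * d
    return kernel

def patch_focus_scores(image, patch=256, size=9, sigma=1.0):
    """Convolve a grayscale tile with the kernel along both axes and return a
    grid of patch-level focus scores (higher = sharper). The grid can be
    rendered directly as a slide-level focus-quality heatmap."""
    img = image.astype(np.float32)
    k = even_derivative_kernel(size, sigma)
    resp_h = convolve(img, k[None, :], mode="reflect")   # horizontal response
    resp_v = convolve(img, k[:, None], mode="reflect")   # vertical response
    response = np.abs(resp_h) + np.abs(resp_v)           # high-frequency energy
    rows, cols = img.shape[0] // patch, img.shape[1] // patch
    scores = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            scores[r, c] = response[r*patch:(r+1)*patch, c*patch:(c+1)*patch].mean()
    return scores
```

In this reading, low-scoring patches would flag regions of the WSI that are out of focus and may require re-scanning.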
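Likewise, the deep-transfer-learning classifier for positive versus negative cancer cells is named only at the backbone level (GoogLeNet, VGGNet, ResNet). A minimal PyTorch/torchvision sketch of fine-tuning a pretrained ResNet head for this two-class patch classification might look like the following; every hyperparameter is assumed, and this is not the authors' implementation.

```python
import torch
import torch.nn as nn
from torchvision import models

def build_cell_classifier(num_classes=2, freeze_backbone=True):
    """Hypothetical transfer-learning setup: reuse an ImageNet-pretrained ResNet
    and replace its final layer with a two-class head (positive vs. negative cell).
    GoogLeNet or VGG backbones could be swapped in analogously."""
    model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
    if freeze_backbone:
        for p in model.parameters():
            p.requires_grad = False      # keep pretrained features fixed
    model.fc = nn.Linear(model.fc.in_features, num_classes)  # new trainable head
    return model

model = build_cell_classifier()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-4)  # assumed learning rate
criterion = nn.CrossEntropyLoss()
# Training loop over a patch-level DataLoader omitted; presumably only patches
# that pass the focus-quality check above would be fed to the classifier.
```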

Item Type: Article
Uncontrolled Keywords: Cancer detection, positive cell, negative cell, deep transfer learning
Subjects: R Medicine > RA Public aspects of medicine > RA421-790.95 Public health. Hygiene. Preventive medicine
Divisions: Faculty of Computing and Informatics (FCI)
Depositing User: Ms Nurul Iqtiani Ahmad
Date Deposited: 31 Oct 2022 07:26
Last Modified: 31 Oct 2022 07:26
URI: http://shdl.mmu.edu.my/id/eprint/10583
