Please use this identifier to cite or link to this item: http://idr.nitk.ac.in/jspui/handle/123456789/14699
Full metadata record
DC Field | Value | Language
dc.contributor.author | Aboobacker S. |
dc.contributor.author | Vijayasenan D. |
dc.contributor.author | Sumam David S. |
dc.contributor.author | Suresh P.K. |
dc.contributor.author | Sreeram S. |
dc.date.accessioned | 2021-05-05T10:15:40Z | -
dc.date.available | 2021-05-05T10:15:40Z | -
dc.date.issued | 2020 |
dc.identifier.citation | ICSPCC 2020 - IEEE International Conference on Signal Processing, Communications and Computing, Proceedings, Vol. , , p. - | en_US
dc.identifier.uri | https://doi.org/10.1109/ICSPCC50002.2020.9259490 |
dc.identifier.uri | http://idr.nitk.ac.in/jspui/handle/123456789/14699 | -
dc.description.abstract | The excessive accumulation of fluid between the layers of the pleura covering the lungs is known as pleural effusion. Pleural effusion may be due to various infections, inflammations, or malignancy. Cytologists visually examine microscopic slides to detect malignant cells. The process is time-consuming, and the interpretation of reactive cells and cells with ambiguous levels of atypia may differ between pathologists. Considerable research is being directed towards automating fluid cytology reporting. We propose an integrated approach based on deep learning, in which the network learns directly to detect malignant cells in effusion cytology images. The U-Net architecture is used to learn the malignant and benign cells from the images and to detect the images that contain malignant cells. The model gives a precision of 0.96, a recall of 0.96, and a specificity of 0.97. The AUC of the ROC curve is 0.97. The model can be used as a screening tool and has a malignant cell detection rate of 0.96 with a low false alarm rate of 0.03. © 2020 IEEE. | en_US
dc.title | A Deep Learning Model for the Automatic Detection of Malignancy in Effusion Cytology | en_US
dc.type | Conference Paper | en_US
Appears in Collections:2. Conference Papers

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
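The screening metrics quoted in the abstract (precision, recall/detection rate, specificity, false alarm rate) are all derived from confusion-matrix counts with malignant as the positive class. The following is a minimal sketch of those definitions; the counts used below are hypothetical values chosen only to illustrate the relationships between the metrics and are not taken from the paper:

```python
def screening_metrics(tp, fp, tn, fn):
    """Compute screening metrics from confusion-matrix counts
    (malignant = positive class)."""
    precision   = tp / (tp + fp)   # fraction of flagged cases that are truly malignant
    recall      = tp / (tp + fn)   # malignant cell detection rate
    specificity = tn / (tn + fp)   # fraction of benign cases correctly cleared
    false_alarm = fp / (fp + tn)   # equals 1 - specificity
    return precision, recall, specificity, false_alarm

# Hypothetical counts (not from the paper) that reproduce the
# reported operating point: precision 0.96, recall 0.96,
# specificity 0.97, false alarm rate 0.03.
p, r, s, fa = screening_metrics(tp=72, fp=3, tn=97, fn=3)
print(round(p, 2), round(r, 2), round(s, 2), round(fa, 2))  # → 0.96 0.96 0.97 0.03
```

Note that the false alarm rate is simply the complement of specificity, which is why the abstract's 0.97 specificity and 0.03 false alarm rate are consistent.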