An automated deep learning based approach for nuclei segmentation of renal digital histopathology image analysis

Author Identifiers

Md Shamim Hossain


Date of Award


Degree Type

Thesis - ECU Access Only

Degree Name

Doctor of Philosophy


School

School of Science

First Advisor

Leisa Armstrong

Second Advisor

David Cook


Renal clear cell carcinoma affects the kidneys through abnormal cell division and can spread to other organs via the bloodstream and lymphatic system. As the number of renal cancer cases grows, rapid and accurate diagnosis is required for early intervention. Biopsies are critical for cancer diagnosis, and pathologists increasingly look beyond manual evaluation to computer-based analysis to develop accurate cancer diagnostics. Pathologists render diagnostic reports to guide treatment, but expert analysis is time consuming and restricts early diagnosis.

Manual expert pathology reporting is prohibitively slow, and lapses in concentration during repetitive work can lead to misdiagnosis. The probability of observational error rises with the average pathologist's workload and the growing demand for histopathology image analysis. Advances in computational and computer-assisted applications can provide accurate and timely analysis of histopathology images, and manual annotation is increasingly giving way to machine learning algorithms. Nuclei segmentation is one such approach: deep learning-based segmentation networks are trained to assist expert pathology, which is otherwise expensive and time-consuming. Segmenting overlapping nuclei remains a challenging problem for automated histopathology image analysis; close inspection of digital images is required to identify overlapping nuclei, and segmentation errors can mislead expert pathologists.
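The abstract does not detail how overlapping nuclei are split; a classic heuristic for this problem is a distance-transform-plus-watershed split, sketched below with SciPy on a toy binary mask of two overlapping circular "nuclei". All shapes, sizes, and thresholds here are hypothetical illustrations, not the thesis's actual algorithm.

```python
import numpy as np
from scipy import ndimage as ndi

# Toy binary mask: two overlapping circular "nuclei" (hypothetical data)
h, w = 64, 64
yy, xx = np.mgrid[:h, :w]
mask = ((yy - 32) ** 2 + (xx - 24) ** 2 <= 12 ** 2) | \
       ((yy - 32) ** 2 + (xx - 40) ** 2 <= 12 ** 2)

# Distance transform: brightest at each nucleus centre
dist = ndi.distance_transform_edt(mask)

# Seed markers: strong local maxima of the distance map
maxima = (ndi.maximum_filter(dist, size=9) == dist) & (dist > 0.6 * dist.max())
markers, n_seeds = ndi.label(maxima)

# Watershed on the inverted distance map splits the clump at its "waist"
elev = ((dist.max() - dist) * 10).astype(np.uint16)
labels = ndi.watershed_ift(elev, markers)
labels[~mask] = 0  # keep labels only inside the nuclei clump
```

After running, `n_seeds` is 2 and `labels` assigns each half of the clump to a separate nucleus, which is the behaviour an automated splitter needs before per-nucleus analysis.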

The aim of this research study was to review existing nuclei segmentation techniques, including overlap-splitting algorithms; identify their limitations and knowledge gaps; and propose a computerised, deep learning-based framework for individual nuclei segmentation and analysis of histopathology images. A mixed-methods study was performed with sequential experiments in data collection; image pre-processing; synthetic image generation; segmentation of nuclei regions; splitting of overlapping nuclei; and validation of the proposed framework. A series of experiments was executed to find the most viable approach.

An improved approach was designed for synthetic image generation using a cycle-consistent GAN (CycleGAN) network. The network created synthetic backgrounds, and a CNN filtering method separated out the initial synthetic backgrounds. Nuclei shapes were collected and transformed, and the transformed shapes were placed on the refined synthetic backgrounds to generate complete synthetic images. The similarity between original and synthetic images established a viable, valid pathway. The nuclei masks of the synthetic images were used to train a modified U-net segmentation network for better segmentation accuracy; training on these synthetic images outperformed training on the original images. Accurately delineating individual nucleus boundaries enabled an automated system to divide nuclei clumps into individual nuclei in histopathology images. Using the nuclei ground truth of the original images, it was possible to validate an application that informs manual expert pathology, processes multiple images, and reduces the time required for histopathology image analysis.
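The composition step described above (transformed nuclei shapes pasted onto generated backgrounds to yield paired images and masks) can be illustrated with a minimal NumPy sketch. The flat background intensity, elliptical shapes, and all parameters below are hypothetical toy stand-ins, not the thesis's CycleGAN pipeline; the point is only that each synthetic image comes with a pixel-perfect nuclei mask for free.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_synthetic(h=128, w=128, n_nuclei=6, radii=(5, 9)):
    """Paste elliptical 'nuclei' onto a plain background, returning an
    image and its binary nuclei mask (toy stand-in for the synthetic
    background + transformed-shape composition step)."""
    image = np.full((h, w), 200, dtype=np.uint8)  # stand-in background tone
    mask = np.zeros((h, w), dtype=np.uint8)
    yy, xx = np.mgrid[:h, :w]
    for _ in range(n_nuclei):
        cy, cx = rng.integers(15, h - 15), rng.integers(15, w - 15)
        ry, rx = rng.integers(*radii), rng.integers(*radii)  # random "transform"
        blob = ((yy - cy) / ry) ** 2 + ((xx - cx) / rx) ** 2 <= 1.0
        image[blob] = 90   # darker stain where a nucleus is placed
        mask[blob] = 1     # ground-truth mask generated alongside the image
    return image, mask

img, msk = make_synthetic()
```

Pairs like `(img, msk)` are exactly the kind of (image, ground truth) training data a segmentation network such as U-net consumes, which is why synthetic generation sidesteps costly manual annotation.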

The novelty of this research is the creation of an automated, deep learning-based individual nuclei segmentation system for renal histopathology images. A modified U-net nuclei segmentation network was trained on the synthetic images and their corresponding nuclei masks, and the trained network provides better nuclei segmentation performance on original images. The research also developed a robust application that allows for the analysis of multiple histopathology images.

Access Note

Access to this thesis is embargoed until 12 December 2027.

Access to this thesis is restricted. Please see the Access Note above for access details.