Please use this identifier to cite or link to this item:
https://hdl.handle.net/1959.11/59294
Title: Shallow U-Net deep learning approach for phase retrieval in propagation-based phase-contrast imaging
Contributor(s): Li, Samuel Z (author); French, Matthew G (author); Pavlov, Konstantin M (author); Li, Heyang Thomas (author)
Publication Date: 2022
DOI: 10.1117/12.2644579
Handle Link: https://hdl.handle.net/1959.11/59294
Abstract: X-ray computed tomography (CT) has revolutionised modern medical imaging. However, CT exposes patients to ionising radiation, which can increase the risk of cancer, so there is a strong incentive to reduce radiation dose without sacrificing image accuracy. This research combines phase retrieval with a convolutional neural network, ShallowU-Net, to pursue that goal. The paper shows that a targeted change to an existing neural-network architecture can improve X-ray phase retrieval in propagation-based phase-contrast imaging. Using ShallowU-Net, a variant of the existing U-Net architecture, it demonstrates that two-distance X-ray phase retrieval can be performed on composite materials by predicting a portion of the required data. ShallowU-Net is faster in both training and deployment. The method also applies data stretching and pre-processing to reduce the numerical instability of the U-Net algorithm, thereby improving the retrieved phase images.
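The abstract names two technical ingredients: a reduced-depth U-Net variant and intensity "stretching" pre-processing. As a rough illustration of what such a network can look like, here is a minimal PyTorch sketch; the depth (two encoder/decoder levels), channel widths, and min-max stretching below are assumptions made for illustration only, since the record does not specify ShallowU-Net's actual configuration.

```python
# Illustrative sketch only: the record does not give ShallowU-Net's exact
# layer counts, so the depth, channel widths, and the min-max "stretching"
# here are assumptions, not the authors' published configuration.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with ReLU, the standard U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )

class ShallowUNet(nn.Module):
    """U-Net variant with a reduced encoder/decoder depth (two levels here
    rather than the original four), which shrinks the parameter count and
    speeds up both training and inference."""
    def __init__(self, in_ch=1, out_ch=1, base=32):
        super().__init__()
        self.enc1 = conv_block(in_ch, base)
        self.enc2 = conv_block(base, base * 2)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(base * 2, base * 4)
        self.up2 = nn.ConvTranspose2d(base * 4, base * 2, 2, stride=2)
        self.dec2 = conv_block(base * 4, base * 2)
        self.up1 = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.dec1 = conv_block(base * 2, base)
        self.head = nn.Conv2d(base, out_ch, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        # Skip connections concatenate encoder features with upsampled maps.
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)

def stretch(img, eps=1e-8):
    """Min-max stretching of intensities to [0, 1]: one plausible form of
    the pre-processing the abstract credits with reducing numerical
    instability."""
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo + eps)

if __name__ == "__main__":
    x = stretch(torch.rand(1, 1, 128, 128))  # e.g. a normalised projection
    print(ShallowUNet()(x).shape)  # -> torch.Size([1, 1, 128, 128])
```

With two levels instead of four, the sketch above has far fewer parameters than a full U-Net, which is consistent with the abstract's claim that ShallowU-Net trains and deploys faster.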
Publication Type: Conference Publication
Source of Publication: Developments in X-Ray Tomography XIV, v.12242
Publisher: SPIE-Int Soc Optical Engineering
Place of Publication: Bellingham
ISSN: 0277-786X
Fields of Research (FoR) 2020: 5105 Medical and biological physics
Socio-Economic Objective (SEO) 2020: tbd
HERDC Category Description: E2 Non-Refereed Scholarly Conference Publication
Appears in Collections: Conference Publication