
    PET attenuation correction using synthetic CT from ultrashort echo-time MR imaging.

    • Snehashis Roy, Wen-Tung Wang, Aaron Carass, Jerry L Prince, John A Butman, and Dzung L Pham.
    • Center for Neuroscience and Regenerative Medicine, Henry Jackson Foundation, Bethesda, Maryland. snehashis.roy@gmail.com.
    • J. Nucl. Med. 2014 Dec 1; 55 (12): 2071-7.

    Integrated PET/MR systems are becoming increasingly popular in clinical and research applications. Quantitative PET reconstruction requires correction for γ-photon attenuation using an attenuation coefficient map (μ map), which is a measure of electron density. One challenge of PET/MR, in contrast to PET/CT, lies in the accurate computation of μ maps, because, unlike CT, MR imaging measures physical properties that are not directly related to electron density. Previous approaches have computed attenuation coefficients using a segmentation of MR images or a deformable registration of atlas CT images to the space of the subject MR images.

    Methods: In this work, we propose a patch-based method to generate whole-head μ maps from ultrashort echo-time (UTE) MR imaging sequences. UTE images are preferred to other MR sequences because of their increased signal from bone. To generate a synthetic CT image, we use patches from a reference dataset consisting of dual-echo UTE images and a coregistered CT scan from the same subject. Matching of patches between the reference and target images allows corresponding patches from the reference CT scan to be combined via a Bayesian framework. No registration or segmentation is required.

    Results: For evaluation, UTE, CT, and PET data acquired from 5 patients under an institutional review board-approved protocol were used. Another patient (with UTE and CT data only) served as the reference for generating synthetic CT images for these 5 patients. PET reconstructions were attenuation-corrected using the original CT, our synthetic CT, Siemens Dixon-based μ maps, Siemens UTE-based μ maps, and a deformable registration-based CT. Our synthetic CT-based PET reconstruction showed higher correlation with the original CT-based PET (average ρ = 0.996, R² = 0.991) than the segmentation- and registration-based methods, and minimal bias (regression slope, 0.990) compared with the segmentation-based methods (regression slope, 0.905). A peak signal-to-noise ratio of 35.98 dB in the reconstructed PET activity was observed, compared with 29.767, 29.34, and 27.43 dB for the Siemens Dixon-, UTE-, and registration-based μ maps, respectively.

    Conclusion: A patch-matching approach to synthesizing CT images from dual-echo UTE images leads to significantly improved accuracy of PET reconstruction, closely matching that achieved with actual CT scans. The PET reconstruction is improved over segmentation-based (Dixon and Siemens UTE) and registration-based methods, even in subjects with pathologic findings.

    © 2014 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
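
    The patch-matching step described in the Methods is the core of the approach. The sketch below illustrates one plausible way to implement it in Python with NumPy and scikit-learn: dual-echo UTE patches from the target subject are matched to patches from the reference subject, and the corresponding reference CT values are blended with Gaussian weights as a simple stand-in for the Bayesian combination used by the authors. The patch radius, number of neighbours k, weighting kernel, intensity normalization, and the psnr helper (the evaluation metric reported in the Results) are illustrative assumptions, not the paper's exact formulation.

    # Hypothetical sketch: patch-based CT synthesis from dual-echo UTE MR images.
    # Assumes all volumes are 3-D NumPy arrays, rigidly aligned, with UTE
    # intensities normalized to roughly [0, 1]; parameter values are illustrative.
    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def extract_patches(vol1, vol2, radius=1):
        """Concatenated dual-echo patches (n_voxels, n_features) and the
        (z, y, x) centers of each patch; border voxels are skipped."""
        r = radius
        feats, centers = [], []
        nz, ny, nx = vol1.shape
        for z in range(r, nz - r):
            for y in range(r, ny - r):
                for x in range(r, nx - r):
                    p1 = vol1[z - r:z + r + 1, y - r:y + r + 1, x - r:x + r + 1]
                    p2 = vol2[z - r:z + r + 1, y - r:y + r + 1, x - r:x + r + 1]
                    feats.append(np.concatenate([p1.ravel(), p2.ravel()]))
                    centers.append((z, y, x))
        return np.asarray(feats, dtype=np.float32), centers

    def synthesize_ct(ref_ute1, ref_ute2, ref_ct, tgt_ute1, tgt_ute2,
                      radius=1, k=5, sigma=0.1):
        """Estimate a synthetic CT for the target UTE pair by k-nearest-neighbour
        patch matching against the reference UTE pair, followed by
        Gaussian-weighted averaging of the matched reference CT values."""
        ref_feats, ref_centers = extract_patches(ref_ute1, ref_ute2, radius)
        tgt_feats, tgt_centers = extract_patches(tgt_ute1, tgt_ute2, radius)
        ref_ct_vals = np.array([ref_ct[c] for c in ref_centers], dtype=np.float32)

        nn = NearestNeighbors(n_neighbors=k).fit(ref_feats)
        dists, idx = nn.kneighbors(tgt_feats)

        # Gaussian weights on patch distance stand in for the Bayesian
        # combination of matched reference CT patches described in the abstract.
        w = np.exp(-(dists ** 2) / (2.0 * sigma ** 2))
        w /= np.maximum(w.sum(axis=1, keepdims=True), 1e-12)
        synth_vals = (w * ref_ct_vals[idx]).sum(axis=1)

        synth_ct = np.zeros_like(tgt_ute1, dtype=np.float32)
        for val, center in zip(synth_vals, tgt_centers):
            synth_ct[center] = val
        return synth_ct

    def psnr(reference, estimate):
        """Peak signal-to-noise ratio in dB, the figure of merit quoted for the
        reconstructed PET activity in the Results."""
        mse = np.mean((reference - estimate) ** 2)
        return 10.0 * np.log10(reference.max() ** 2 / mse)

    In practice the synthesized CT would still have to be converted to 511-keV attenuation coefficients before PET reconstruction, and the brute-force voxel loop above would be replaced by strided or brain-masked patch sampling; it is written this way only to keep the idea readable.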
