
Paul E. Kinahan


Issues and Challenges in Applications of Artificial Intelligence to Nuclear Medicine -- The Bethesda Report (AI Summit 2022)

Nov 07, 2022
Arman Rahmim, Tyler J. Bradshaw, Irène Buvat, Joyita Dutta, Abhinav K. Jha, Paul E. Kinahan, Quanzheng Li, Chi Liu, Melissa D. McCradden, Babak Saboury, Eliot Siegel, John J. Sunderland, Richard L. Wahl

[Figures 1–2]

The SNMMI Artificial Intelligence (SNMMI-AI) Summit, organized by the SNMMI AI Task Force, took place in Bethesda, MD on March 21-22, 2022. It brought together various community members and stakeholders from academia, healthcare, industry, patient representatives, and government (NIH, FDA), and considered various key themes to envision and facilitate a bright future for routine, trustworthy use of AI in nuclear medicine. In what follows, essential issues, challenges, controversies and findings emphasized in the meeting are summarized.


Synthetic PET via Domain Translation of 3D MRI

Jun 11, 2022
Abhejit Rajagopal, Yutaka Natsuaki, Kristen Wangerin, Mahdjoub Hamdi, Hongyu An, John J. Sunderland, Richard Laforest, Paul E. Kinahan, Peder E. Z. Larson, Thomas A. Hope

[Figures 1–4]

Historically, patient datasets have been used to develop and validate various reconstruction algorithms for PET/MRI and PET/CT. To enable such algorithm development without the need for acquiring hundreds of patient exams, in this paper we demonstrate a deep learning technique to generate synthetic but realistic whole-body PET sinograms from abundantly available whole-body MRI. Specifically, we use a dataset of 56 $^{18}$F-FDG-PET/MRI exams to train a 3D residual UNet to predict physiologic PET uptake from whole-body T1-weighted MRI. In training we implemented a balanced loss function to generate realistic uptake across a large dynamic range and computed losses along tomographic lines of response to mimic the PET acquisition. The predicted PET images are forward projected to produce synthetic PET time-of-flight (ToF) sinograms that can be used with vendor-provided PET reconstruction algorithms, including with CT-based attenuation correction (CTAC) and MR-based attenuation correction (MRAC). The resulting synthetic data recapitulates physiologic $^{18}$F-FDG uptake, e.g., high uptake localized to the brain and bladder, as well as uptake in the liver, kidneys, heart, and muscle. To simulate abnormalities with high uptake, we also insert synthetic lesions. We demonstrate that this synthetic PET data can be used interchangeably with real PET data for the PET quantification task of comparing CT- and MR-based attenuation correction methods, achieving $\leq 7.6\%$ error in mean SUV compared to using real data. Together, these results show that the proposed synthetic PET data pipeline can reasonably be used for development, evaluation, and validation of PET/MRI reconstruction methods.
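To make the pipeline concrete, the sketch below illustrates two of the abstract's ingredients in miniature: forward projection of a predicted activity map along lines of response to form a (toy) sinogram, and the mean-SUV percent-error metric used for the CTAC-vs-MRAC comparison. This is a minimal NumPy illustration under simplifying assumptions (a 2D image, only 90-degree view angles, no time-of-flight binning, no attenuation); the function names are hypothetical and this is not the authors' implementation.

```python
import numpy as np

def forward_project(activity, n_angles=4):
    """Toy parallel-beam forward projection: integrate activity along
    lines of response at a few 90-degree view angles. A real ToF PET
    forward model uses many angles plus time-of-flight bins; this is
    only an illustration of the concept."""
    views = []
    for k in range(n_angles):
        rotated = np.rot90(activity, k=k)   # exact rotation in 90-deg steps
        views.append(rotated.sum(axis=0))   # line integral along each LOR
    return np.stack(views)                  # shape: (n_angles, n_bins)

def mean_suv_error(suv_real, suv_synth):
    """Percent error in mean SUV between reconstructions, the metric
    the abstract reports to be <= 7.6% for synthetic vs. real data."""
    return 100.0 * abs(suv_synth.mean() - suv_real.mean()) / suv_real.mean()

# Hypothetical 2D "activity map" with a synthetic high-uptake lesion inserted,
# mimicking the lesion-insertion step described in the abstract.
img = np.ones((8, 8))
img[2:4, 2:4] = 10.0
sino = forward_project(img)
print(sino.shape)  # (4, 8): one projection per view angle
```

Because each view sums every pixel exactly once, total counts are preserved per view (`sino[k].sum() == img.sum()` for every angle), which is the basic consistency property a forward projector must satisfy.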

* under review 