Abstract
Analysis of high-resolution micro-ultrasound data using deep learning presents a promising avenue for the accurate detection of prostate cancer (PCa). While previous efforts have focused on designing specialized architectures and training them from scratch, they are challenged by limited data availability. Medical foundation models, pre-trained on large and diverse datasets, offer a robust knowledge base that can be adapted to downstream tasks, reducing the need for large task-specific datasets. However, their lack of specialized domain knowledge hinders their success: our initial research indicates that even with extensive fine-tuning, existing foundation models fall short of surpassing specialist models’ performance for PCa detection. To address this gap, we propose ProstNFound, a method that empowers foundation models with domain-specific knowledge pertinent to ultrasound imaging and PCa. In this approach, while ultrasound images are fed to a foundation model, specialized auxiliary networks embed high-resolution textural features and clinical markers, which are then presented to the network as prompts. Using a multi-center micro-ultrasound dataset with 693 patients, we demonstrate significant improvements over the state-of-the-art in PCa detection. ProstNFound achieves 90% sensitivity at 40% specificity, performance that is competitive with that of expert radiologists reading multi-parametric MRI or micro-ultrasound images, suggesting significant promise for clinical application. Our code will be made available at github.com.
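The abstract describes an architecture in which ultrasound images pass through a foundation-model encoder while auxiliary networks embed textural features and clinical markers as prompts. A minimal NumPy sketch of that idea is shown below; all shapes, the toy encoder/decoder, and the attention-based fusion are illustrative assumptions for exposition, not the authors' actual implementation (which is in the linked repository).

```python
import numpy as np

rng = np.random.default_rng(0)
D = 32  # shared embedding dimension (assumed)

# Stand-in for a frozen foundation-model image encoder:
# maps a toy 32x32 "B-mode patch" to 16 D-dim image tokens.
def foundation_encoder(image, w):
    patches = image.reshape(16, -1)      # 16 crude 64-pixel "patches"
    return np.tanh(patches @ w)          # (16, D)

w_enc  = rng.normal(size=(64, D)) * 0.1  # encoder projection
w_tex  = rng.normal(size=(8, D))  * 0.1  # textural-feature embedding
w_clin = rng.normal(size=(2, D))  * 0.1  # clinical-marker embedding
w_head = rng.normal(size=(D,))    * 0.1  # cancer-likelihood head

image    = rng.normal(size=(32, 32))     # toy micro-ultrasound patch
texture  = rng.normal(size=(8,))         # high-resolution textural features
clinical = np.array([6.5, 64.0]) / 100.0 # hypothetical markers (e.g. PSA, age), scaled

img_tokens = foundation_encoder(image, w_enc)            # (16, D)
prompt_tokens = np.stack([np.tanh(texture @ w_tex),
                          np.tanh(clinical @ w_clin)])   # (2, D) prompt embeddings

# Toy decoder: image tokens attend to the prompts, and a pooled,
# prompt-conditioned representation is scored for cancer likelihood.
attn = img_tokens @ prompt_tokens.T                      # (16, 2)
attn = np.exp(attn) / np.exp(attn).sum(axis=1, keepdims=True)
fused = img_tokens + attn @ prompt_tokens                # (16, D)
score = 1.0 / (1.0 + np.exp(-(fused.mean(axis=0) @ w_head)))
```

The design choice this mirrors is that domain knowledge enters as extra tokens conditioning a pre-trained backbone, rather than by retraining the backbone itself.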
Links to Paper and Supplementary Materials
Main Paper (Open Access Version): https://papers.miccai.org/miccai-2024/paper/2644_paper.pdf
SharedIt Link: https://rdcu.be/dV5yn
SpringerLink (DOI): https://doi.org/10.1007/978-3-031-72089-5_47
Supplementary Material: https://papers.miccai.org/miccai-2024/supp/2644_supp.pdf
Link to the Code Repository
https://github.com/pfrwilson/prostNfound
Link to the Dataset(s)
n/a
BibTex
@InProceedings{Wil_ProstNFound_MICCAI2024,
author = { Wilson, Paul F. R. and To, Minh Nguyen Nhat and Jamzad, Amoon and Gilany, Mahdi and Harmanani, Mohamed and Elghareb, Tarek and Fooladgar, Fahimeh and Wodlinger, Brian and Abolmaesumi, Purang and Mousavi, Parvin},
title = { { ProstNFound: Integrating Foundation Models with Ultrasound Domain Knowledge and Clinical Context for Robust Prostate Cancer Detection } },
booktitle = {Proceedings of Medical Image Computing and Computer Assisted Intervention -- MICCAI 2024},
year = {2024},
publisher = {Springer Nature Switzerland},
volume = {LNCS 15006},
month = {October},
pages = {499--509}
}
Reviews
Review #1
- Please describe the contribution of the paper
This work proposes ProstNFound, which empowers foundation models with domain-specific knowledge pertinent to ultrasound imaging and PCa. In this approach, while ultrasound images are fed to a foundation model, specialized auxiliary networks embed high-resolution textural features and clinical markers, which are then presented to the network as prompts.
- Please list the main strengths of the paper; you should write about a novel formulation, an original way to use data, demonstration of clinical feasibility, a novel application, a particularly strong evaluation, or anything else that is a strong aspect of this work. Please provide details, for instance, if a method is novel, explain what aspect is novel and why this is interesting.
This study proposes a novel architecture to integrate local texture features and clinical context with foundation models. In experiments, the method significantly outperforms the previous SOTA in micro-US PCa detection, achieving 90% sensitivity at 40% specificity.
- Please list the main weaknesses of the paper. Please provide details, for instance, if you think a method is not novel, explain why and provide a reference to prior work.
- Have you conducted k-fold cross-validation in prostate cancer detection? Have you repeated the experiment multiple times? The standard deviation should be added to the table.
- A sensitivity analysis should also be added for the selection of hyperparameters and the layers and dimensions of the encoder and decoder modules.
- Please rate the clarity and organization of this paper
Very Good
- Please comment on the reproducibility of the paper. Please be aware that providing code and data is a plus, but not a requirement for acceptance.
The authors claimed to release the source code and/or dataset upon acceptance of the submission.
- Do you have any additional comments regarding the paper’s reproducibility?
N/A
- Please provide detailed and constructive comments for the authors. Please also refer to our Reviewer’s guide on what makes a good review. Pay specific attention to the different assessment criteria for the different paper categories (MIC, CAI, Clinical Translation of Methodology, Health Equity): https://conferences.miccai.org/2024/en/REVIEWER-GUIDELINES.html
see above
- Rate the paper on a scale of 1-6, 6 being the strongest (6-4: accept; 3-1: reject). Please use the entire range of the distribution. Spreading the score helps create a distribution for decision-making
Weak Accept — could be accepted, dependent on rebuttal (4)
- Please justify your recommendation. What were the major factors that led you to your overall score for this paper?
The experiments are promising and demonstrate the performance of the proposed method.
- Reviewer confidence
Confident but not absolutely certain (3)
- [Post rebuttal] After reading the author’s rebuttal, state your overall opinion of the paper if it has been changed
N/A
- [Post rebuttal] Please justify your decision
N/A
Review #2
- Please describe the contribution of the paper
In this paper, the authors proposed ProstNFound which is a method that uses foundation models with domain-specific knowledge of ultrasound and prostate cancer (PCa). The authors noted ProstNFound achieves 90% sensitivity at 40% specificity, performance that is competitive with expert radiologists.
- Please list the main strengths of the paper; you should write about a novel formulation, an original way to use data, demonstration of clinical feasibility, a novel application, a particularly strong evaluation, or anything else that is a strong aspect of this work. Please provide details, for instance, if a method is novel, explain what aspect is novel and why this is interesting.
- The data from this study are impressive, with proprietary data from 693 patients across five medical centers; in particular, the use of cross-medical-center data is important to ensure the results are not biased towards a particular site.
- Table 1 shows strong results compared to other methods.
- Please list the main weaknesses of the paper. Please provide details, for instance, if you think a method is not novel, explain why and provide a reference to prior work.
- While a leave-one-center-out evaluation study is one method here, can the authors comment on other variants that may offer additional evaluation data points, such as holding out a certain percentage from each and all centers?
- It appears adding RF provides some improvements; can the authors comment on the clinical usefulness of this margin, in consideration of having to add RF to the pipeline?
- Please rate the clarity and organization of this paper
Excellent
- Please comment on the reproducibility of the paper. Please be aware that providing code and data is a plus, but not a requirement for acceptance.
The authors claimed to release the source code and/or dataset upon acceptance of the submission.
- Do you have any additional comments regarding the paper’s reproducibility?
The authors stated that code will be made public; data appear to be proprietary.
- Please provide detailed and constructive comments for the authors. Please also refer to our Reviewer’s guide on what makes a good review. Pay specific attention to the different assessment criteria for the different paper categories (MIC, CAI, Clinical Translation of Methodology, Health Equity): https://conferences.miccai.org/2024/en/REVIEWER-GUIDELINES.html
- Can the authors comment on whether the leave-one-center-out setup, currently used for model evaluation, could also be examined to assess the usefulness of federated learning?
- Can the authors comment on additional prompts of interest?
- Rate the paper on a scale of 1-6, 6 being the strongest (6-4: accept; 3-1: reject). Please use the entire range of the distribution. Spreading the score helps create a distribution for decision-making
Accept — should be accepted, independent of rebuttal (5)
- Please justify your recommendation. What were the major factors that led you to your overall score for this paper?
This is a strong paper with competitive results and addressing an important clinical problem using foundation models with ultrasound and disease domain knowledge.
- Reviewer confidence
Confident but not absolutely certain (3)
- [Post rebuttal] After reading the author’s rebuttal, state your overall opinion of the paper if it has been changed
N/A
- [Post rebuttal] Please justify your decision
N/A
Review #3
- Please describe the contribution of the paper
This work integrates a foundation model with task-specific micro-ultrasound data to build a model for detection of prostate cancer.
- Please list the main strengths of the paper; you should write about a novel formulation, an original way to use data, demonstration of clinical feasibility, a novel application, a particularly strong evaluation, or anything else that is a strong aspect of this work. Please provide details, for instance, if a method is novel, explain what aspect is novel and why this is interesting.
- Very interesting work that takes advantage of foundation models for ultrasound and integrates with micro ultrasound data and clinical data to detect prostate cancer.
- Novel architecture to combine foundation models, clinical data and texture features.
- Beyond SOTA results
- Very clearly written and a pleasure to read.
- Please list the main weaknesses of the paper. Please provide details, for instance, if you think a method is not novel, explain why and provide a reference to prior work.
The only weakness I noticed was the organization of Fig. 1. The flow is hard to see/follow. The foundation model box could be placed at the bottom left, and the arrows should be in a different color.
- Please rate the clarity and organization of this paper
Excellent
- Please comment on the reproducibility of the paper. Please be aware that providing code and data is a plus, but not a requirement for acceptance.
The authors claimed to release the source code and/or dataset upon acceptance of the submission.
- Do you have any additional comments regarding the paper’s reproducibility?
The foundation model used is openly available, but the task specific data are proprietary. This limits the exact reproduction of the results.
- Please provide detailed and constructive comments for the authors. Please also refer to our Reviewer’s guide on what makes a good review. Pay specific attention to the different assessment criteria for the different paper categories (MIC, CAI, Clinical Translation of Methodology, Health Equity): https://conferences.miccai.org/2024/en/REVIEWER-GUIDELINES.html
Make a few modifications to Fig. 1 to make it easier to understand, with a flow from left to right.
- Rate the paper on a scale of 1-6, 6 being the strongest (6-4: accept; 3-1: reject). Please use the entire range of the distribution. Spreading the score helps create a distribution for decision-making
Strong Accept — must be accepted due to excellence (6)
- Please justify your recommendation. What were the major factors that led you to your overall score for this paper?
The integration of the foundation model with task specific clinical and imaging data giving beyond SOTA results.
- Reviewer confidence
Confident but not absolutely certain (3)
- [Post rebuttal] After reading the author’s rebuttal, state your overall opinion of the paper if it has been changed
N/A
- [Post rebuttal] Please justify your decision
N/A
Author Feedback
N/A
Meta-Review
Meta-review not available, early accepted paper.