Abstract

Recent advancements in deep learning have reshaped brain imaging analysis. However, several challenges remain, such as heterogeneity, individual variations, and the contradiction between the high dimensionality and small size of brain imaging datasets. These issues complicate the learning process, preventing models from capturing intrinsic, meaningful patterns and potentially leading to suboptimal performance due to biases and overfitting. Curriculum learning (CL) presents a promising solution by organizing training examples from simple to complex, mimicking the human learning process and potentially fostering the development of more robust and accurate models. Despite its potential, the inherent limitations posed by small initial training datasets present significant challenges, including overfitting and poor generalization. In this paper, we introduce the Progressive Self-Paced Distillation (PSPD) framework, which employs an adaptive and progressive pacing and distillation mechanism. This allows for dynamic curriculum adjustments based on the states of both past and present models. The past model serves as a teacher, guiding the current model with gradually refined curriculum knowledge and helping prevent the loss of previously acquired knowledge. We validate PSPD’s efficacy and adaptability across various convolutional neural networks using the Alzheimer’s Disease Neuroimaging Initiative (ADNI) dataset, underscoring its superiority in enhancing model performance and generalization capabilities.
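For readers who want a concrete picture of the mechanism sketched in the abstract, the following is a minimal, hypothetical PyTorch-style sketch, not the authors’ released implementation (see the code repository linked below for that). It combines a soft self-paced weighting term with a distillation term from a past-epoch teacher; the names `pspd_step`, `self_paced_weights`, `lam`, `distill_weight`, and the temperature value are illustrative assumptions.

```python
# Hypothetical sketch of progressive self-paced distillation (NOT the authors' code).
import torch
import torch.nn.functional as F

def self_paced_weights(losses, lam):
    """Soft self-paced weights: easy samples (low loss) get weights near 1;
    samples with loss >= lam are clamped to weight 0."""
    return torch.clamp(1.0 - losses / lam, min=0.0)

def pspd_step(student, teacher, x, y, lam, distill_weight, temperature=2.0):
    logits = student(x)
    per_sample_ce = F.cross_entropy(logits, y, reduction="none")

    # Curriculum term: weight each sample by its current difficulty.
    w = self_paced_weights(per_sample_ce.detach(), lam)
    curriculum_loss = (w * per_sample_ce).mean()

    # Distillation term: a model snapshot from a previous epoch acts as the teacher.
    with torch.no_grad():
        teacher_logits = teacher(x)
    distill_loss = F.kl_div(
        F.log_softmax(logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2

    return curriculum_loss + distill_weight * distill_loss
```

In this sketch, the pace parameter `lam` would be increased over epochs so that progressively harder samples receive non-zero weight, and the teacher would be refreshed from a past checkpoint of the same network.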

Links to Paper and Supplementary Materials

Main Paper (Open Access Version): https://papers.miccai.org/miccai-2024/paper/0173_paper.pdf

SharedIt Link: pending

SpringerLink (DOI): pending

Supplementary Material: https://papers.miccai.org/miccai-2024/supp/0173_supp.pdf

Link to the Code Repository

https://github.com/Hrychen7/PSPD

Link to the Dataset(s)

N/A

BibTex

@InProceedings{Yan_Advancing_MICCAI2024,
        author = { Yang, Yanwu and Chen, Hairui and Hu, Jiesi and Guo, Xutao and Ma, Ting},
        title = { { Advancing Brain Imaging Analysis Step-by-step via Progressive Self-paced Learning } },
        booktitle = {proceedings of Medical Image Computing and Computer Assisted Intervention -- MICCAI 2024},
        year = {2024},
        publisher = {Springer Nature Switzerland},
        volume = {LNCS 15011},
        month = {October},
        page = {pending}
}


Reviews

Review #1

  • Please describe the contribution of the paper

    The paper proposes a curriculum-training scheme for classification problems. The proposed Progressive Self-Paced Distillation (PSPD) framework uses adaptive and progressive pacing along with distillation.

  • Please list the main strengths of the paper; you should write about a novel formulation, an original way to use data, demonstration of clinical feasibility, a novel application, a particularly strong evaluation, or anything else that is a strong aspect of this work. Please provide details, for instance, if a method is novel, explain what aspect is novel and why this is interesting.

    DNN training is plagued by local minima, and the paper proposes schemes to help with the nonlinear optimization.

  • Please list the main weaknesses of the paper. Please provide details, for instance, if you think a method is not novel, explain why and provide a reference to prior work.

    The proposed scheme is mainly heuristic and has several free parameters. Tests of the robustness of the results to combinations of free-parameter values have not been performed. Similarly, when comparing against baselines that also have several free parameters, it is not clear whether the claimed benefits of the method would survive free-parameter tuning of the baselines as well. This is because the performance of the method is, or can be, very sensitive to the specific choice of free parameters, and the same may be true for the baselines.

  • Please rate the clarity and organization of this paper

    Good

  • Please comment on the reproducibility of the paper. Please be aware that providing code and data is a plus, but not a requirement for acceptance.

    The authors claimed to release the source code and/or dataset upon acceptance of the submission.

  • Do you have any additional comments regarding the paper’s reproducibility?

    N/A

  • Please provide detailed and constructive comments for the authors. Please also refer to our Reviewer’s guide on what makes a good review. Pay specific attention to the different assessment criteria for the different paper categories (MIC, CAI, Clinical Translation of Methodology, Health Equity): https://conferences.miccai.org/2024/en/REVIEWER-GUIDELINES.html

    A very rigorous empirical analysis covering free-parameter tuning of the proposed method and all the baselines would need to be performed.

  • Rate the paper on a scale of 1-6, 6 being the strongest (6-4: accept; 3-1: reject). Please use the entire range of the distribution. Spreading the score helps create a distribution for decision-making

    Reject — should be rejected, independent of rebuttal (2)

  • Please justify your recommendation. What were the major factors that led you to your overall score for this paper?

    Please see the comments above

  • Reviewer confidence

    Confident but not absolutely certain (3)

  • [Post rebuttal] After reading the author’s rebuttal, state your overall opinion of the paper if it has been changed

    N/A

  • [Post rebuttal] Please justify your decision

    N/A



Review #2

  • Please describe the contribution of the paper

    This paper proposes a self-paced learning framework that introduces a student-teacher model for self-distillation. The proposed method adaptively selects samples for classification and self-distillation at each epoch. The experiments showed that the proposed method outperforms existing methods, and the ablation study demonstrated the effectiveness of each component of the proposed method.

  • Please list the main strengths of the paper; you should write about a novel formulation, an original way to use data, demonstration of clinical feasibility, a novel application, a particularly strong evaluation, or anything else that is a strong aspect of this work. Please provide details, for instance, if a method is novel, explain what aspect is novel and why this is interesting.

    #1 Although the dataset was limited, the other experimental conditions were diverse. #2 Fig. 1 is easy to understand and is useful for grasping the overview of the proposed method.

  • Please list the main weaknesses of the paper. Please provide details, for instance, if you think a method is not novel, explain why and provide a reference to prior work.

    #3 The explanation of the soft training is unclear. The paper states that “All the samples are fed into the model training”, but in Equation (3) they are masked by a threshold lambda. These descriptions are inconsistent. In addition, Equation (3) may contain a mistake, because an omega appears on both sides of the second equation.

    #4 The proposed method is a simple combination of self-paced learning and a teacher-student model. It is necessary to discuss why the proposed method is suitable for brain imaging tasks.

    #5 Some information regarding the experimental conditions is missing. For example, in Table 1, does the proposed method use soft training or hard training?

  • Please rate the clarity and organization of this paper

    Good

  • Please comment on the reproducibility of the paper. Please be aware that providing code and data is a plus, but not a requirement for acceptance.

    The submission has provided an anonymized link to the source code, dataset, or any other dependencies.

  • Do you have any additional comments regarding the paper’s reproducibility?

    N/A

  • Please provide detailed and constructive comments for the authors. Please also refer to our Reviewer’s guide on what makes a good review. Pay specific attention to the different assessment criteria for the different paper categories (MIC, CAI, Clinical Translation of Methodology, Health Equity): https://conferences.miccai.org/2024/en/REVIEWER-GUIDELINES.html

    #6 In Fig. 1, the number of past samples is larger than the number of current samples, which differs from the expected scenario and may confuse readers. Therefore, Fig. 1 should be revised.

  • Rate the paper on a scale of 1-6, 6 being the strongest (6-4: accept; 3-1: reject). Please use the entire range of the distribution. Spreading the score helps create a distribution for decision-making

    Weak Reject — could be rejected, dependent on rebuttal (3)

  • Please justify your recommendation. What were the major factors that led you to your overall score for this paper?

    The explanation for the soft training of the proposed method is unclear. In addition, the proposed method has little technical novelty.

  • Reviewer confidence

    Confident but not absolutely certain (3)

  • [Post rebuttal] After reading the author’s rebuttal, state your overall opinion of the paper if it has been changed

    Weak Accept — could be accepted, dependent on rebuttal (4)

  • [Post rebuttal] Please justify your decision

    The rebuttal corrected the mistakes in the paper and emphasized the importance of the study. Therefore, I have revised my score.



Review #3

  • Please describe the contribution of the paper

    This work introduces the Progressive Self-Paced Distillation framework by employing an adaptive and progressive pacing and distillation mechanism, which allows for dynamic curriculum adjustments based on the states of both past and present models.

  • Please list the main strengths of the paper; you should write about a novel formulation, an original way to use data, demonstration of clinical feasibility, a novel application, a particularly strong evaluation, or anything else that is a strong aspect of this work. Please provide details, for instance, if a method is novel, explain what aspect is novel and why this is interesting.

    The description is very clear, and the experiments are sufficient to support the claims.

  • Please list the main weaknesses of the paper. Please provide details, for instance, if you think a method is not novel, explain why and provide a reference to prior work.
    1. The visual examples in this work are limited.
    2. A Discussion section should be added before the Conclusion section.
    3. There are many hyper-parameters in Eqs. (7)-(9). How did you choose the values for these parameters?
  • Please rate the clarity and organization of this paper

    Good

  • Please comment on the reproducibility of the paper. Please be aware that providing code and data is a plus, but not a requirement for acceptance.

    The authors claimed to release the source code and/or dataset upon acceptance of the submission.

  • Do you have any additional comments regarding the paper’s reproducibility?

    N/A

  • Please provide detailed and constructive comments for the authors. Please also refer to our Reviewer’s guide on what makes a good review. Pay specific attention to the different assessment criteria for the different paper categories (MIC, CAI, Clinical Translation of Methodology, Health Equity): https://conferences.miccai.org/2024/en/REVIEWER-GUIDELINES.html
    1. The visual examples in this work are limited.
    2. A Discussion section should be added before the Conclusion section.
    3. There are many hyper-parameters in Eqs. (7)-(9). How did you choose the values for these parameters?
  • Rate the paper on a scale of 1-6, 6 being the strongest (6-4: accept; 3-1: reject). Please use the entire range of the distribution. Spreading the score helps create a distribution for decision-making

    Weak Accept — could be accepted, dependent on rebuttal (4)

  • Please justify your recommendation. What were the major factors that led you to your overall score for this paper?

    This paper indeed has some merits, but some limitations cannot be ignored during the review process, and I hope the authors can address them in the rebuttal.

  • Reviewer confidence

    Confident but not absolutely certain (3)

  • [Post rebuttal] After reading the author’s rebuttal, state your overall opinion of the paper if it has been changed

    N/A

  • [Post rebuttal] Please justify your decision

    N/A




Author Feedback

We thank the reviewers for their comments and thoughtful suggestions. Your feedback has provided us with valuable insights to improve the manuscript.

For reviewer 1: Q: The samples are masked by a threshold in Equation (3), which is inconsistent, and the equation contains a mistake. A: Thank you for your comments. You are correct that there is a mistake in the equation. The value w_i was typed incorrectly and should be l_i; thus, w_i = 1 - l_i/lambda. In the hard training paradigm, only a few samples are fed into the training process, while in the soft training paradigm, all samples are included and assigned weights. The primary difference between these paradigms lies in the weights assigned to the samples. As shown in Equation (2), the weight w_i is binarized to 0 or 1. In contrast, in Equation (3), the weight w_i takes a continuous value. Additionally, we apply a threshold in Equation (3) so that the weight does not fall below zero, i.e., we clamp it at zero as a lower bound. Therefore, the equation remains consistent with the principles of the soft training paradigm.
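As a small illustration of the corrected weighting (not taken from the paper’s code; the variable names and example loss values are assumptions), the hard weights of Equation (2) and the soft, clamped weights of Equation (3) could be computed as follows:

```python
# Hypothetical NumPy sketch contrasting hard and soft self-paced weights.
import numpy as np

def hard_weights(losses, lam):
    # Eq. (2): binary selection -- only samples with loss below lambda are used.
    return (losses < lam).astype(float)

def soft_weights(losses, lam):
    # Eq. (3) after the correction: w_i = 1 - l_i / lambda, clamped at zero,
    # so all samples participate but harder ones receive smaller weights.
    return np.maximum(1.0 - losses / lam, 0.0)

losses = np.array([0.1, 0.5, 1.2, 2.0])
print(hard_weights(losses, lam=1.0))  # [1. 1. 0. 0.]
print(soft_weights(losses, lam=1.0))  # [0.9 0.5 0.  0. ]
```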

Q: Why is the proposed method suitable for brain imaging tasks? A: Deep learning methods for brain imaging often face high-dimensional features with limited samples, leading to overfitting and poor generalization. Therefore, an efficient learning paradigm is of practical value. Our study aims to address these challenges by proposing a novel approach that enhances model performance and robustness.

Q: Some experimental settings are missing; Fig. 1 should be revised. A: The important parameters are given in the implementation details (12th line, 6th page), including alpha_w,0, alpha_fi,0, alpha_w, and alpha_fi. The results in Table 1 were obtained using hard PCL and soft PCD training, as indicated in the sensitivity analysis. We apologize for omitting this description and will add it. Regarding Fig. 1, we will modify the number of samples in the figure to avoid potential misunderstanding.

Q: Novelty. A: The integration is not a straightforward combination; it involves a carefully designed and innovative optimization process. Specifically, we introduce a novel self-paced strategy that enables the model to handle samples of varying difficulty more intelligently and efficiently, thereby improving overall learning performance. Furthermore, our approach demonstrates significant performance improvements, particularly in handling complex 3D datasets and enhancing model generalization. The experimental results highlight its effectiveness and practical value.

For reviewers 3 & 4: Thank you for your astute comments. Regarding the parameter settings, there are two parameters for each curriculum setting (PCL and PCD), resulting in a total of four parameters. We believe that optimizing these four parameters is acceptable. Additionally, we validated these parameters using a grid search and found that the two increasing-rate parameters (alpha) do not significantly affect the results. Our method consistently outperforms others when these two parameters are set within a proper range, indicating that it is not highly sensitive to them. For the other two parameters, i.e., the initial values of the pace parameters (lambda_0), all baselines also face this value-selection issue. In this study, we determined these values by a grid search with a step of 0.1, and we unified the selection rule for all baselines using the same grid search. We will clarify this in the experimental settings in the revised version. Developing an automatic and robust initial-value selection scheme remains challenging for the curriculum learning paradigm and is one of our future works.
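A rough sketch of the grid search over the two initial pace values described here, purely illustrative: the search range and the placeholder training function are assumptions, not details from the paper.

```python
# Hypothetical grid search over the initial pace values lambda_0 for PCL and PCD,
# with a step of 0.1 as mentioned in the rebuttal. The scoring function is a
# stand-in for the actual training + validation pipeline.
import itertools
import numpy as np

rng = np.random.default_rng(0)

def train_and_validate(lambda0_pcl, lambda0_pcd):
    # Placeholder: train the model with these initial pace values and return a
    # validation metric. Here it returns a random score so the sketch runs.
    return rng.random()

lambda0_grid = np.round(np.arange(0.1, 1.0, 0.1), 1)  # 0.1, 0.2, ..., 0.9

best_score, best_cfg = -np.inf, None
for lam_pcl, lam_pcd in itertools.product(lambda0_grid, repeat=2):
    score = train_and_validate(lam_pcl, lam_pcd)
    if score > best_score:
        best_score, best_cfg = score, (lam_pcl, lam_pcd)

print("best initial pace values (PCL, PCD):", best_cfg)
```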

Others: We will include clearer visual examples in the revised version for better comparison.




Meta-Review

Meta-review #1

  • After you have reviewed the rebuttal and updated reviews, please provide your recommendation based on all reviews and the authors’ rebuttal.

    Reject

  • Please justify your recommendation. You may optionally write justifications for ‘accepts’, but are expected to write a justification for ‘rejects’

    The concerns from reviewers may not be fully addressed.




Meta-review #2

  • After you have reviewed the rebuttal and updated reviews, please provide your recommendation based on all reviews and the authors’ rebuttal.

    Accept

  • Please justify your recommendation. You may optionally write justifications for ‘accepts’, but are expected to write a justification for ‘rejects’

    The paper addresses the challenge of high-dimensional, low-sample-size brain imaging data by proposing an efficient algorithm. The authors provided convincing explanations in the rebuttal regarding the hyper-parameter selection issue raised by two reviewers, which I believe adequately address their concerns. Despite some missing details, the paper tackles a significant problem and offers a valuable contribution to the field. Therefore, I recommend acceptance.




Meta-review #3

  • After you have reviewed the rebuttal and updated reviews, please provide your recommendation based on all reviews and the authors’ rebuttal.

    Accept

  • Please justify your recommendation. You may optionally write justifications for ‘accepts’, but are expected to write a justification for ‘rejects’

    The main concerns are missing details and novelty, which are well addressed in the rebuttal. I think this paper tackles a challenging task in brain imaging analysis (high-dimensional data but limited samples) using existing cutting-edge technology. Therefore, I recommend accept.



