Learning Underrepresented Classes from Decentralized Partially Labeled Medical Images
Nanqing Dong, Michael Kampffmeyer, Irina Voiculescu
International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2022)
June 30, 2022
Using decentralized data for federated training is a promising and emerging research direction for alleviating data scarcity in the medical domain. However, in contrast to the large-scale, fully labeled data common in general object recognition tasks, local medical datasets are more likely to have images annotated for only a subset of the classes of interest, owing to high annotation costs. In this paper, we consider a practical yet under-explored problem in which underrepresented classes have only a few labeled instances and appear in only a few clients of the federated system. We show that standard federated learning approaches fail to learn robust multi-label classifiers under extreme class imbalance, and we address this by proposing a novel federated learning framework, FedFew. FedFew consists of three stages: the first stage leverages federated self-supervised learning to learn class-agnostic representations; in the second stage, the decentralized partially labeled data are exploited to learn an energy-based multi-label classifier for the common classes; finally, the underrepresented classes are detected based on the energy, and a prototype-based nearest-neighbor model is proposed for few-shot matching. We evaluate FedFew on multi-label thoracic disease classification tasks and demonstrate that it outperforms federated baselines by a large margin.
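As a rough illustration of the third stage described above, the sketch below shows one way an energy score could flag candidate underrepresented-class samples and match them to few-shot class prototypes. It is not the paper's implementation: the -logsumexp energy form, the cosine-similarity matching, the median threshold, and all function and variable names are assumptions made for this example.

```python
# Illustrative sketch only, not the authors' released code.
import numpy as np

def energy_score(logits):
    """Free-energy score -logsumexp(logits); higher values suggest the sample
    is poorly explained by the common-class classifier (assumed formulation)."""
    return -np.log(np.sum(np.exp(logits)))

def build_prototypes(support_features, support_labels):
    """Average the few labeled embeddings available for each underrepresented class."""
    return {int(c): support_features[support_labels == c].mean(axis=0)
            for c in np.unique(support_labels)}

def match_to_prototype(feature, prototypes):
    """Assign a flagged sample to the nearest class prototype by cosine similarity."""
    def cosine(a, b):
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)
    return max(prototypes, key=lambda c: cosine(feature, prototypes[c]))

# Toy usage: random arrays stand in for encoder features and classifier logits.
rng = np.random.default_rng(0)
features = rng.normal(size=(8, 16))            # embeddings of query images
logits = rng.normal(size=(8, 5))               # common-class classifier outputs
support_feats = rng.normal(size=(6, 16))       # few labeled underrepresented samples
support_labels = np.array([0, 0, 0, 1, 1, 1])  # two underrepresented classes

energies = np.array([energy_score(l) for l in logits])
threshold = np.median(energies)                # illustrative cutoff; tuned in practice
prototypes = build_prototypes(support_feats, support_labels)
for feat, e in zip(features, energies):
    if e > threshold:                          # high energy -> candidate underrepresented sample
        print("flagged sample matched to few-shot class", match_to_prototype(feat, prototypes))
```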