On Measures of Uncertainty in Classification
Chlaily, Saloua; Ratha, Debanshu; Lozou, Pigi; Marinoni, Andrea
IEEE Transactions on Signal Processing, 2023, Vol. 71, pp. 3710-3725
October 12, 2023
Uncertainty is unavoidable in classification tasks and may originate from the data (e.g., noise or incorrect labels) or from the model (e.g., erroneous assumptions). Assessing the uncertainty associated with each outcome is of paramount importance for judging the reliability of classification algorithms, especially on unseen data. In this work, we propose two measures of uncertainty in classification. The first is developed from a geometrical perspective and quantifies a classifier's distance from a random guess. The second is homophily-based: it takes the similarity between classes into account and therefore reflects which classes are being confused. The proposed measures are not aggregated, i.e., they assign an uncertainty assessment to each data point, and they do not require label information. Using several datasets, we demonstrate the measures' differences and their merit in assessing uncertainty in classification. The source code is available at github.com/pioui/uncertainty.
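As a rough illustration of the two ideas, the Python sketch below scores per-sample uncertainty in the two ways the abstract describes: a geometric score based on the normalized distance of a predicted probability vector from the uniform "random guess" distribution, and a similarity-weighted score that penalizes probability mass placed on classes dissimilar to the predicted one. This is a minimal sketch under assumed formulations, not the authors' definitions; the function names, the normalization, and the toy similarity matrix are invented for this example, and the paper's actual implementation is at github.com/pioui/uncertainty.

    import numpy as np

    def geometric_uncertainty(probs):
        """Distance-from-random-guess score (illustrative sketch).
        Each row of `probs` is a class-probability vector; the score is
        1 minus its Euclidean distance from the uniform distribution,
        normalized so a one-hot prediction scores 0 and a uniform
        (random-guess) prediction scores 1."""
        probs = np.asarray(probs, dtype=float)
        k = probs.shape[1]
        uniform = np.full(k, 1.0 / k)
        max_dist = np.linalg.norm(np.eye(k)[0] - uniform)  # one-hot vs. uniform
        return 1.0 - np.linalg.norm(probs - uniform, axis=1) / max_dist

    def homophily_uncertainty(probs, similarity):
        """Similarity-aware score (illustrative sketch).
        Weights the probability mass on competing classes by their
        dissimilarity to the predicted class, so confusing two similar
        classes counts less than confusing two dissimilar ones.
        `similarity` is a (k, k) matrix in [0, 1] with a unit diagonal."""
        probs = np.asarray(probs, dtype=float)
        pred = probs.argmax(axis=1)
        dissim = 1.0 - np.asarray(similarity, dtype=float)
        return (probs * dissim[pred]).sum(axis=1)

    # Two predictions with identical confidence profiles: the first
    # spreads its doubt over a similar class, the second over a
    # dissimilar one.
    sim = np.array([[1.0, 0.8, 0.1],
                    [0.8, 1.0, 0.1],
                    [0.1, 0.1, 1.0]])
    probs = np.array([[0.5, 0.4, 0.1],
                      [0.5, 0.1, 0.4]])
    print(geometric_uncertainty(probs))       # ~[0.64, 0.64]: identical scores
    print(homophily_uncertainty(probs, sim))  # ~[0.17, 0.38]: scores differ

In the example, both probability rows receive the same geometric score, but the homophily score is higher when the competing mass sits on a dissimilar class, mirroring the abstract's point that the second measure reflects which classes are mistaken for one another.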