Abstract
In this paper, we introduce coverage vs acceptance-error graphs as a visualization tool for comparing the performance of conformal predictors at a given significance level ε for any k-class classification task with k ≥ 2. We show that by plotting the performance of each predictor over significance levels ε ∈ [0, 1], we obtain a coverage vs acceptance-error curve for that predictor. The area under this curve represents the probability that the p-value of a randomly chosen true class label of any test instance is greater than the p-value of any false class label for the same or any other test instance. This area can be used as a metric of the predictive efficiency of a conformal predictor once validity has been established. The new metric is unique in that it is related to the empirical coverage rate, and extensive experiments confirm its utility and its difference from existing predictive efficiency criteria.
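The area-under-the-curve metric described in the abstract can be estimated directly from a matrix of conformal p-values: it is the fraction of (true-label, false-label) p-value pairs in which the true label's p-value is the larger. The following is a minimal sketch of that pairwise estimate; the function name `acceptance_auc` and the tie-handling convention (ties counted as 1/2) are assumptions, not taken from the paper.

```python
import numpy as np

def acceptance_auc(p_values, y_true):
    """Pairwise estimate of the area under the coverage vs
    acceptance-error curve: the probability that the p-value of a
    randomly chosen true class label exceeds the p-value of a
    randomly chosen false class label.

    Ties are counted as 1/2 (an assumption; the abstract says only
    "greater than").

    p_values : (n, k) array of conformal p-values per test instance and class
    y_true   : (n,) array of true class-label indices
    """
    n, k = p_values.shape
    idx = np.arange(n)
    true_p = p_values[idx, y_true]             # one true-label p-value per instance
    mask = np.ones((n, k), dtype=bool)
    mask[idx, y_true] = False
    false_p = p_values[mask]                   # all false-label p-values, flattened
    diff = true_p[:, None] - false_p[None, :]  # every true/false pair
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

# A predictor whose true-label p-values cleanly dominate scores 1.0:
p = np.array([[0.9, 0.1],
              [0.2, 0.8],
              [0.7, 0.3]])
y = np.array([0, 1, 0])
print(acceptance_auc(p, y))  # → 1.0
```

A perfectly uninformative predictor, which assigns equal p-values to every label, scores 0.5 under this tie convention.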
Original language | English |
---|---|
Title of host publication | Proceedings of Machine Learning Research |
Editors | Neil Lawrence |
Pages | 534-545 |
Number of pages | 12 |
Volume | 204 |
Publication status | Published - 1 Jan 2023 |
Event | 12th Symposium on Conformal and Probabilistic Prediction with Applications, COPA 2023 - Limassol, Cyprus |
Duration | 13 Sept 2023 → 15 Sept 2023 |
Internet addresses | https://cml.rhul.ac.uk/copa2023/ ; https://copa-conference.com/ |
Publication series
Series | Proceedings of Machine Learning Research |
---|---|
ISSN | 2640-3498 |
Conference
Conference | 12th Symposium on Conformal and Probabilistic Prediction with Applications, COPA 2023 |
---|---|
Abbreviated title | COPA 2023 |
Country/Territory | Cyprus |
City | Limassol |
Period | 13/09/23 → 15/09/23 |
Keywords
- Conformal Prediction
- Metrics
- Performance Curves
- Predictive Efficiency