Abstract
By generating prediction intervals (PIs) to quantify the uncertainty of each prediction in deep learning regression, the risk of wrong predictions can be effectively controlled. High-quality PIs need to be as narrow as possible, whilst covering a preset proportion of real labels. Many current approaches to improving PI quality can effectively reduce PI width, but they do not ensure that enough real labels are captured. The Inductive Conformal Predictor (ICP) is an algorithm that generates PIs which are theoretically guaranteed to cover a preset proportion of the data; however, ICP is typically not optimized directly to yield minimal PI width. In this study, we propose Directly Optimized Inductive Conformal Regression (DOICR) for neural networks, which takes only the average PI width as its loss function and improves PI quality through an optimization scheme, subject to the validity condition that a sufficient proportion of real labels is captured by the PIs. Benchmark experiments show that DOICR outperforms current state-of-the-art algorithms for regression problems using underlying deep neural network structures, on both tabular and image data.
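For readers unfamiliar with the ICP baseline the abstract builds on, the following is a minimal sketch of standard split (inductive) conformal regression intervals, not the DOICR training objective itself; the function and variable names (`split_conformal_interval`, `pred_cal`, `alpha`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def split_conformal_interval(pred_cal, y_cal, pred_test, alpha=0.1):
    """Standard split (inductive) conformal prediction intervals for regression.

    pred_cal, y_cal : calibration-set predictions and true labels
    pred_test       : point predictions for new inputs
    alpha           : 1 - target coverage (e.g. 0.1 for 90% coverage)
    """
    scores = np.abs(y_cal - pred_cal)                  # nonconformity scores
    n = len(scores)
    # finite-sample corrected quantile -> marginal coverage >= 1 - alpha
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q = np.quantile(scores, q_level, method="higher")
    return pred_test - q, pred_test + q                # lower / upper PI bounds

# Toy usage with a hypothetical fitted regressor's calibration outputs.
rng = np.random.default_rng(0)
y_cal = rng.normal(size=500)
pred_cal = y_cal + rng.normal(scale=0.3, size=500)     # imperfect predictions
lo, hi = split_conformal_interval(pred_cal, y_cal, np.array([0.0]), alpha=0.1)
print("90% PI for one test point:", lo, hi)
```

The sketch only shows the interval construction whose coverage is guaranteed; DOICR, as described in the abstract, additionally trains the network so that the average width of such intervals is minimized directly while that validity condition is preserved.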
| Original language | English |
| --- | --- |
| Pages (from-to) | 194-205 |
| Number of pages | 12 |
| Journal | Neural Networks |
| Volume | 168 |
| DOIs | |
| Publication status | Published - Nov 2023 |
Keywords
- Conformal prediction
- Deep learning
- Neural networks
- Prediction intervals
- Uncertainty estimation
ASJC Scopus subject areas
- Cognitive Neuroscience
- Artificial Intelligence