Certainty Groups: A Practical Approach to Distinguish Confidence Levels in Neural Networks



Published Jun 29, 2022
Lukas Lodes, Alexander Schiendorfer

Abstract

Machine Learning (ML), in particular classification with deep neural networks, can be applied to a variety of industrial tasks. It can augment established methods for controlling manufacturing processes, such as statistical process control (SPC), by detecting non-obvious patterns in high-dimensional input data. However, due to the widespread issue of model miscalibration in neural networks, the predictive uncertainty of these models needs to be estimated. Many established approaches for uncertainty estimation output scores that are difficult to translate into actionable insights. We therefore introduce the concept of certainty groups, which partition the predictions of a neural network into a normal group and a certainty group. The certainty group contains only predictions with very high accuracy, which can be set as high as 100%. We present an approach to compute these certainty groups and demonstrate it on two datasets from a PHM setting.
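As a rough illustration of the idea (a sketch of one plausible realization, not the authors' published algorithm), the following Python snippet chooses a softmax-confidence threshold on held-out validation data so that the resulting certainty group reaches a target accuracy; all function and variable names here are hypothetical.

```python
# Hypothetical sketch: partition predictions into a "certainty group" and a
# "normal group" by picking a confidence threshold on validation data such
# that the certainty group reaches a target accuracy (e.g., 100%).
import numpy as np

def find_certainty_threshold(val_probs, val_labels, target_accuracy=1.0):
    """Return the smallest top-class-confidence threshold whose predictions
    reach `target_accuracy` on the validation set, or None if none does."""
    confidences = val_probs.max(axis=1)     # top-class probability per sample
    predictions = val_probs.argmax(axis=1)  # predicted class index
    correct = predictions == val_labels
    # Try candidate thresholds from lowest to highest observed confidence,
    # so the certainty group stays as large as possible.
    for t in np.unique(confidences):
        mask = confidences >= t
        if mask.any() and correct[mask].mean() >= target_accuracy:
            return t
    return None

def split_into_groups(probs, threshold):
    """Boolean mask over predictions: True = certainty group, False = normal group."""
    return probs.max(axis=1) >= threshold
```

At test time, predictions whose confidence clears the validation-derived threshold would be assigned to the certainty group, while the remainder stay in the normal group and may warrant manual inspection.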

How to Cite

Lodes, L., & Schiendorfer, A. (2022). Certainty Groups: A Practical Approach to Distinguish Confidence Levels in Neural Networks. PHM Society European Conference, 7(1), 294–305. https://doi.org/10.36001/phme.2022.v7i1.3331


Keywords

Machine Learning, classification, uncertainty estimation, neural network

Section
Technical Papers