Publication: Uncertainty propagation based MINLP approach for artificial neural network structure reduction
KU-Authors
Co-Authors
Şıldır, Hasan
Sarrafi, Şahin
Language
English
Abstract
The performance of artificial neural networks (ANNs) is highly influenced by the selection of input variables and by the architecture defined by hyperparameters such as the number of neurons in the hidden layer and the connections between network variables. Although there are some black-box and trial-and-error-based studies in the literature addressing these issues, it is fair to state that a rigorous and systematic method providing a global and unique solution is still missing. Accordingly, in this study, a mixed integer nonlinear programming (MINLP) formulation is proposed to detect the best features and connections among the neural network elements while propagating parameter and output uncertainties for regression problems. The objective of the formulation is to minimize the covariance of the estimated parameters while (i) detecting the ideal number of neurons, (ii) synthesizing the connection configuration between those neurons, inputs, and outputs, and (iii) selecting optimum input variables in a multivariable data set to design and ensure identifiable ANN architectures. As a result, the suggested approach provides a robust and optimal ANN architecture with tighter prediction bounds, obtained from the propagation of parameter uncertainty, and higher prediction accuracy compared to the traditional fully connected approach and other benchmarks. Furthermore, this performance is obtained after elimination of approximately 85% and 90% of the connections in the two case studies, respectively, compared to the traditional ANN, in addition to a significant reduction in the input subset.
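The objective described in the abstract can be illustrated with a small sketch. The code below is not the paper's MINLP formulation; it is a hedged toy example (network size, data, parameter values, and the pruning mask are all made-up assumptions) showing the kind of uncertainty score such a formulation would minimize: the trace of the approximate parameter covariance σ²(JᵀJ)⁻¹, where J is the Jacobian of the network predictions with respect to the *active* (unpruned) parameters, and binary mask entries play the role of the integer connection variables.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic regression data set: 2 inputs, 1 output (assumption for illustration).
N = 40
X = rng.normal(size=(N, 2))
y = np.tanh(X @ np.array([1.0, -0.5])) + 0.05 * rng.normal(size=N)

def predict(theta, X, mask):
    """One-hidden-layer tanh ANN (3 hidden neurons).
    `mask` holds 6 binaries that prune individual input-to-hidden weights,
    mimicking the integer connection variables of an MINLP formulation."""
    W1 = theta[:6].reshape(3, 2) * mask.reshape(3, 2)
    b1, w2, b2 = theta[6:9], theta[9:12], theta[12]
    return np.tanh(X @ W1.T + b1) @ w2 + b2

def jacobian(theta, X, mask, eps=1e-6):
    """Finite-difference Jacobian of predictions w.r.t. the active parameters only."""
    active = np.concatenate([mask.astype(bool), np.ones(7, dtype=bool)])
    cols = np.flatnonzero(active)
    J = np.empty((len(X), len(cols)))
    base = predict(theta, X, mask)
    for j, k in enumerate(cols):
        t = theta.copy()
        t[k] += eps
        J[:, j] = (predict(t, X, mask) - base) / eps
    return J

theta = rng.normal(scale=0.5, size=13)  # stand-in for estimated parameters

def uncertainty_score(mask, sigma2=0.05**2):
    """Trace of the approximate parameter covariance sigma^2 * (J^T J)^-1,
    the quantity a structure-selection objective would drive down."""
    J = jacobian(theta, X, mask)
    return sigma2 * np.trace(np.linalg.pinv(J.T @ J))

full_mask = np.ones(6)
pruned_mask = np.array([0.0, 1, 1, 1, 1, 1])  # drop one input-to-hidden connection

s_full = uncertainty_score(full_mask)
s_pruned = uncertainty_score(pruned_mask)
print(f"score(full)={s_full:.4g}  score(pruned)={s_pruned:.4g}")
```

An actual MINLP solver would search over all feasible masks (and input subsets) jointly with the continuous parameters; the sketch only evaluates the score for two fixed masks to make the objective concrete.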
Source:
Processes
Publisher:
Multidisciplinary Digital Publishing Institute (MDPI)
Subject
Engineering, Chemical engineering