An advanced interpretable Fuzzy Neural Network model based on uni-nullneuron constructed from n-uninorms
de Campos Souza, P. V.
2022-01-01
Abstract
This paper formulates a fuzzy logic neuron that uses n-uninorms to construct uni-nullneurons. A fuzzy neural network (FNN) composed of these neurons can operate with nullnorms and uninorms at different times, which results in high accuracy of the model outputs and increases the flexibility in connecting the rule antecedents (enabling AND and OR connections within one rule). This, in turn, may allow experts/operators to extract more knowledge from data. The FNN uses a one-versus-rest classifier learning scheme for multi-class classification problems, where neuron activation levels construct the (indicator) regression matrix; this results in a non-linear regression by indicator, which can resolve the inherent class-masking problem apparent in the linear case. We propose a specific neuron-selection strategy in the learning stage that applies Lasso to bootstrap replications in order to ensure that the rule base is as compact as possible and induced by a low number of neurons. To evaluate the new neuron acting in FNNs, we performed pattern classification and regression tests. Compared with traditional FNNs in the literature, our variant showed improved model accuracy on several high-dimensional real-world datasets in binary and multi-class classification and regression problems. Combined with the ability to generate human-readable rules, this enables parsimonious responses with a high degree of confidence.
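To give a feel for the kind of aggregation such a logic neuron performs, the sketch below is a minimal illustration, not the paper's exact n-uninorm construction: it assumes a representable uninorm built from the product t-norm and the probabilistic sum, weights inputs by simple multiplication, and uses placeholder names such as uninorm and uninorm_neuron. Depending on the identity element e, the same neuron leans AND-like (large e, t-norm region dominates) or OR-like (small e, t-conorm region dominates), which is the flexibility the abstract refers to.

```python
# Illustrative sketch only (assumed construction, not the paper's uni-nullneuron):
# a representable uninorm with identity element e, acting as a scaled product
# t-norm below e and a scaled probabilistic-sum t-conorm above e.
def uninorm(x, y, e=0.5):
    """Conjunctive representable uninorm U with identity e: U(e, x) = x."""
    if x <= e and y <= e:
        return (x * y) / e                          # scaled product t-norm
    if x >= e and y >= e:
        xs, ys = (x - e) / (1 - e), (y - e) / (1 - e)
        return e + (1 - e) * (xs + ys - xs * ys)    # scaled probabilistic sum
    return min(x, y)                                # mixed region (conjunctive choice)

def uninorm_neuron(memberships, weights, e=0.5):
    """Aggregate weighted membership degrees with the uninorm.

    Weighting by plain multiplication is a simplification; the identity
    element e controls whether the behaviour is more AND-like or OR-like.
    """
    z = e                                           # start at the identity element
    for a, w in zip(memberships, weights):
        z = uninorm(z, w * a, e)
    return z

# toy usage: three fuzzified inputs aggregated in a disjunctive-leaning mode
print(uninorm_neuron([0.8, 0.6, 0.9], [1.0, 0.7, 0.9], e=0.3))
```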
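The neuron-selection step can be pictured as follows. This is a hedged sketch under stated assumptions, not the exact procedure from the paper: it assumes a neuron-activation matrix H (samples by neurons) and targets y, applies scikit-learn's Lasso to bootstrap replications, and keeps only neurons whose coefficients are consistently non-zero; the retention threshold keep_ratio and the helper name select_neurons are placeholders.

```python
# Minimal sketch (assumptions labeled above): Lasso on bootstrap replications
# to retain a small set of neurons and keep the rule base compact.
import numpy as np
from sklearn.linear_model import Lasso

def select_neurons(H, y, n_boot=50, alpha=0.01, keep_ratio=0.8, seed=0):
    rng = np.random.default_rng(seed)
    n_samples, n_neurons = H.shape
    hits = np.zeros(n_neurons)
    for _ in range(n_boot):
        idx = rng.integers(0, n_samples, n_samples)            # bootstrap replication
        model = Lasso(alpha=alpha, max_iter=10_000).fit(H[idx], y[idx])
        hits += (np.abs(model.coef_) > 1e-8)                   # count non-zero coefficients
    return np.where(hits / n_boot >= keep_ratio)[0]            # indices of retained neurons

# toy usage: only two activation columns actually drive the target
H = np.random.rand(200, 30)
y = H[:, [2, 7]].sum(axis=1) + 0.05 * np.random.randn(200)
print(select_neurons(H, y))
```

In a one-versus-rest setting, the same activation matrix would be regressed against one indicator column per class, with the retained neurons defining the final rule base for each binary sub-problem.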