Combining thresholded real values for designing an artificial neuron in a neural network
Abstract
Among the available non-linear aggregation functions, we focus here on aggregations based on weighted minimum and weighted maximum operations [8]. Because these operators were originally developed in the framework of possibility theory and fuzzy rules, where the values considered usually belong to [0, 1], they cannot easily be integrated into a neural network. For gradient descent based learning, a neuron must be an aggregation function that is differentiable with respect to its inputs and synaptic weights, and whose variables (synaptic weights, inputs and outputs) must all be signed real values. We therefore propose an extension of weighted maximum based aggregation that enables this learning process. We show that such an aggregation can be seen as a combination of four Sugeno integrals. Finally, we compare this approach with the classical one.
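For reference, the weighted minimum and weighted maximum mentioned above are usually given, in their classical possibilistic form with weights and inputs in [0, 1], by the following definitions; this is a reminder of the standard operators, not the signed-real extension proposed in the paper.

```latex
% Classical possibilistic weighted maximum and weighted minimum,
% with weights w_i and inputs x_i in [0, 1].
% The paper's extension to signed real values is not reproduced here.
\[
  \mathrm{wmax}_{w}(x) \;=\; \max_{i=1}^{n} \min\bigl(w_i,\; x_i\bigr),
  \qquad
  \mathrm{wmin}_{w}(x) \;=\; \min_{i=1}^{n} \max\bigl(1 - w_i,\; x_i\bigr).
\]
```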