Simulation of verbal and mathematical learning by means of simple neural networks / Belfiore, Nicola Pio; Rudas, Imre J.; Matrisciano, Apollonia. - ELECTRONIC. - (2010), pp. 52-59. (Paper presented at the 2010 9th International Conference on Information Technology Based Higher Education and Training, ITHET 2010, held in Cappadocia, Turkey, 29 April 2010 through 1 May 2010) [10.1109/ithet.2010.5480067].
Simulation of verbal and mathematical learning by means of simple neural networks
Belfiore, Nicola Pio; Matrisciano, Apollonia
2010
Abstract
In this paper a new tool is proposed as a possible aid to studying differences and similarities between human learning and artificial neural network (NN) learning of some elementary verbal and mathematical abilities. For this purpose, simple NNs of the multilayer type (MLNN) have been built. These MLNNs are able to recognize some graphemes and/or to perform additions of integers up to 1000. An algorithm based on dynamic character recognition has made it possible to significantly limit the data size, easing the optimization phase of NN training. The adopted method of grapheme encoding has made it possible to automatically generate the large training sets on which the MLNNs have been trained. A test set has then been generated to evaluate the prediction capacity of the MLNNs. The analysis of the results has shown some interesting characteristics of the trained nets, such as the possible appearance of very rudimentary symptoms analogous to dyslexia. The functional specialization of some groups of neurons in the neural system has also been investigated by inflicting artificial damage on the MLNN (in one or more neurons) and by evaluating the MLNN response. ©2010 IEEE.
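The abstract's "artificial damage" study can be illustrated with a minimal sketch: silence one hidden neuron at a time in a small multilayer network and measure how much the output shifts, which is one way to probe whether particular neurons have specialized roles. The architecture, layer sizes, weights, and the lesioning function below are illustrative assumptions for this sketch, not the authors' implementation.

```python
# Hypothetical sketch of a neuron-lesioning probe on a small multilayer network.
# Sizes and random weights are placeholders; the paper's actual MLNNs differ.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes: e.g. a binarized grapheme bitmap in, class scores out.
n_in, n_hidden, n_out = 35, 12, 10
W1 = rng.normal(scale=0.5, size=(n_hidden, n_in))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_out, n_hidden))
b2 = np.zeros(n_out)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, damaged=()):
    """Forward pass; hidden units listed in `damaged` are forced to zero,
    mimicking a lesion to one or more neurons."""
    h = sigmoid(W1 @ x + b1)
    h[list(damaged)] = 0.0
    return sigmoid(W2 @ h + b2)

x = rng.integers(0, 2, size=n_in).astype(float)   # a dummy grapheme pattern
healthy = forward(x)
for unit in range(n_hidden):
    lesioned = forward(x, damaged=(unit,))
    print(f"hidden unit {unit:2d}: output shift = {np.abs(healthy - lesioned).sum():.3f}")
```

In a trained network, large output shifts for a particular unit would suggest that the damaged neuron carried a specialized part of the learned function, which is the kind of evidence the abstract refers to.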