A network of artificial neurons learns to speak with humans without knowing the rules of language
Researchers at the University of Sassari (Italy) and the University of Plymouth (United Kingdom) have developed a cognitive model made up of two million interconnected artificial neurons that is able to learn how to communicate with humans.
A group of researchers at the University of Sassari (Italy) and the University of Plymouth (United Kingdom) has developed a cognitive model made up of two million interconnected artificial neurons. This network is able to learn how to communicate with humans without knowing any rules of language, simply through conversation with a human interlocutor.
The model is called Annabell (Artificial Neural Network with Adaptive Behavior Exploited for Language Learning), according to the press release from the University of Sassari, picked up by EurekAlert! and Tendencias 21.
Many researchers today believe that our brain is able to develop higher cognitive skills simply by interacting with the environment, starting from very little innate knowledge. The Annabell model appears to confirm this view.
Annabell has no pre-coded knowledge of language; it learns simply by interacting with a human interlocutor, thanks to two essential mechanisms that are also present in the brain: synaptic plasticity and neural gating.
Synaptic plasticity is the ability of a connection between two neurons to increase in efficiency when the two neurons are often active at the same time, or nearly so. This mechanism is essential for learning and for long-term memory.
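As an illustration only, and not the actual Annabell implementation, this kind of strengthening can be sketched with a simple Hebbian-style update in Python; the learning rate and activity values are arbitrary choices made for the example.

def hebbian_update(weight, pre_activity, post_activity, learning_rate=0.01):
    # Strengthen the connection when the two neurons are active together.
    return weight + learning_rate * pre_activity * post_activity

w = 0.1
for _ in range(100):
    pre, post = 1.0, 1.0      # both neurons fire at (almost) the same time
    w = hebbian_update(w, pre, post)
print(w)                      # the connection has become more efficient (w is now about 1.1)

Repeated co-activation drives the weight up, which is the basic intuition behind the plasticity mechanism the researchers describe.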
Neural gating mechanisms are based on the ability of certain neurons (called bistable neurons) to behave as switches that can be turned "on" or "off" by a control signal coming from other neurons.
When turned on, bistable neurons transmit the signal from one part of the brain to another; when turned off, they block it. Thanks to synaptic plasticity, the model learns to control the signals that open and close these neural gates, and thus to control the flow of information among different areas.
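A minimal, hypothetical sketch of the gating idea (again, not taken from the Annabell code): a bistable unit acts as a switch that either lets activity pass between two areas or blocks it, depending on a control signal.

def gated_transfer(signal, gate_open):
    # Pass the signal on only when the bistable gate neuron is switched on.
    return signal if gate_open else [0.0] * len(signal)

area_a_output = [0.8, 0.2, 0.5]   # activity leaving one area of the network
control = 1.0                     # control signal arriving from other neurons
gate_open = control > 0.5         # the bistable neuron flips to its "on" state
area_b_input = gated_transfer(area_a_output, gate_open)
print(area_b_input)               # [0.8, 0.2, 0.5]; with the gate off it would be all zeros

Learning which gates to open at which moment is what, according to the article, lets the model route information between its different areas.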
The cognitive model was validated using a database of about 1,500 sentences, based on the scientific literature on early language development, and it produced a total of 500 sentences containing nouns, verbs, adjectives, pronouns and other classes of words, demonstrating its capacity for processing human language.
An international research project involving the Department of Artificial Intelligence of the Faculty of Informatics at the Polytechnic University of Madrid concluded, four years ago, that collaboration and interaction are decisive in the evolution of language.