Meta is building an AI system that can respond to spoken and written language the way humans do. The effort is part of a new long-term research project to build AI systems that comprehend complex texts by learning from human responses to speech and text. Meta, formerly known as Facebook, has announced this artificial intelligence research initiative to enhance understanding of the human brain. The specific purpose of the initiative is to understand how the human brain processes speech and text, and to build AI systems that learn the way people do.
The company is planning to do this in collaboration with the neuroimaging center Neurospin (CEA) and Inria. It says the researchers are comparing how AI language models and the brain respond to the same spoken or written sentences.
Meta building AI that processes speech & text like humans
The company stated that it will use insights from this work to guide the development of AI that processes speech and text as effectively as people do. For the past two years, Meta has successfully applied deep-learning techniques to public neuroimaging data, analyzing how the human brain responds when it encounters words and sentences.
Children can easily learn that “orange” can refer to both a fruit and a color, but for an AI system, recognizing that distinction is not easy. Current AI systems cannot make it as efficiently as humans do.
Researchers at Meta have found that the language models most closely resembling brain activity are those that predict the next word from its context (for instance, completing “once upon a…” with “time”). The brain, however, anticipates words and ideas far ahead in time, not just the very next word. Unlocking this long-range prediction capability could help AI models function more like the human brain. That prediction range still challenges today’s language models, which cannot anticipate complex ideas, narratives, and plots the way human beings do.
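The next-word objective mentioned above can be illustrated with a toy model. The sketch below is only for illustration and is not Meta’s method: it uses an invented three-word corpus and simple counting, whereas real language models are neural networks trained on vastly larger data.

```python
from collections import Counter, defaultdict

# Invented toy corpus; real models train on billions of words.
corpus = (
    "once upon a time there was a king . "
    "once upon a time there lived a fox . "
    "once upon a midnight dreary"
).split()

# Count which word follows each three-word context (a simple n-gram model).
n = 3
context_counts = defaultdict(Counter)
for i in range(len(corpus) - n):
    context = tuple(corpus[i:i + n])
    context_counts[context][corpus[i + n]] += 1

def predict_next(context):
    """Return the most frequent next word for a three-word context, or None."""
    counts = context_counts.get(tuple(context))
    return counts.most_common(1)[0][0] if counts else None

print(predict_next(["once", "upon", "a"]))  # prints "time"
```

Even this crude counter captures the short-range pattern (“once upon a” is usually followed by “time”), but it has no notion of the long-range narrative structure that, per Meta’s findings, the brain anticipates.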
In collaboration with Inria, Meta compared a variety of language models with the brain responses of 345 volunteers. The process was simple: the volunteers listened to complex narratives while their brain activity was recorded. The results showed that specific parts of the brain, the regions that help us comprehend complex sentences and ideas, are best accounted for by such language models.