A team of Chinese scientists has established a novel brain-inspired network model based on internal complexity to address challenges faced by traditional models, such as the high consumption of computing resources, the Institute of Automation under the Chinese Academy of Sciences said on Friday.
One of the key objectives in the current development of artificial intelligence (AI) is to build more general-purpose AI, enabling models to possess broader, more general cognitive abilities.
The popular approach taken by today's large models is to build larger, deeper, and wider neural networks following scaling laws, which can be described as a method for achieving general intelligence based on "external complexity," said Li Guoqi, a researcher at the Institute of Automation.
However, this approach faces challenges such as unsustainable consumption of computing resources and energy, as well as a lack of interpretability.
The human brain, by contrast, has about 100 billion neurons and nearly 1,000 trillion synaptic connections, with each neuron having a rich and diverse internal structure, yet it consumes only around 20 watts of power.
Inspired by the dynamics of brain neurons, the scientists from the Institute of Automation, together with collaborators at research institutions including Tsinghua University and Peking University, pursued an "internal complexity" approach to achieving general intelligence.
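As a rough illustration of the distinction (a minimal sketch, not the authors' published code), the following Python snippet contrasts a single leaky integrate-and-fire (LIF) unit, the kind of simple neuron that "external complexity" approaches stack in ever larger numbers, with a Hodgkin-Huxley (HH) neuron, whose voltage-gated channel variables give one unit far richer internal dynamics. All parameter values are standard textbook choices assumed here purely for demonstration.

import numpy as np

DT = 0.01          # integration step (ms)
T = 50.0           # simulated time (ms)
STEPS = int(T / DT)
I_EXT = 10.0       # constant input current (uA/cm^2 for HH; arbitrary units for LIF)


def simulate_lif(i_ext=I_EXT, tau=10.0, v_rest=-65.0, v_th=-50.0, v_reset=-65.0, r=2.0):
    """Simple LIF neuron: one state variable, hard reset whenever it crosses threshold."""
    v = v_rest
    spikes = 0
    for _ in range(STEPS):
        v += DT * (-(v - v_rest) + r * i_ext) / tau
        if v >= v_th:
            v = v_reset
            spikes += 1
    return spikes


def simulate_hh(i_ext=I_EXT):
    """Hodgkin-Huxley neuron: four coupled state variables (V and the gates m, h, n)."""
    c_m, g_na, g_k, g_l = 1.0, 120.0, 36.0, 0.3
    e_na, e_k, e_l = 50.0, -77.0, -54.387
    v, m, h, n = -65.0, 0.05, 0.6, 0.32
    spikes, above = 0, False
    for _ in range(STEPS):
        # Voltage-dependent opening/closing rates of the ion-channel gates.
        a_m = 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
        b_m = 4.0 * np.exp(-(v + 65.0) / 18.0)
        a_h = 0.07 * np.exp(-(v + 65.0) / 20.0)
        b_h = 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
        a_n = 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
        b_n = 0.125 * np.exp(-(v + 65.0) / 80.0)
        # Euler updates for the gating variables and the membrane potential.
        m += DT * (a_m * (1.0 - m) - b_m * m)
        h += DT * (a_h * (1.0 - h) - b_h * h)
        n += DT * (a_n * (1.0 - n) - b_n * n)
        i_na = g_na * m ** 3 * h * (v - e_na)
        i_k = g_k * n ** 4 * (v - e_k)
        i_l = g_l * (v - e_l)
        v += DT * (i_ext - i_na - i_k - i_l) / c_m
        # Count upward zero-crossings of the membrane potential as spikes.
        if v > 0.0 and not above:
            spikes += 1
        above = v > 0.0
    return spikes


if __name__ == "__main__":
    print("LIF spikes in 50 ms:", simulate_lif())
    print("HH  spikes in 50 ms:", simulate_hh())

Both units respond to the same steady input, but the LIF neuron is fully described by a single leaky voltage, whereas the HH neuron carries additional internal state that shapes its firing; the internal-complexity idea is to exploit such richer single-unit dynamics rather than simply enlarging the network.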
Their experiments verified the effectiveness and reliability of the internal-complexity model in handling complex tasks. The work provides new methods and theoretical support for integrating the dynamic characteristics of neurons studied in neuroscience into AI, and offers feasible ways to optimize and enhance the practical performance of AI models.
The study was published recently in the journal Nature Computational Science. (Xinhua)