Imperial researchers have found that variability between brain cells might speed up learning and improve the performance of both the brain and future artificial intelligence (AI) systems.
The new study found that by tweaking the electrical properties of individual cells in simulations of brain networks, the networks learned faster than simulations with identical cells. They also found that the networks needed fewer of the tweaked cells to get the same results, and that the method is less energy intensive than models with identical cells.

The authors say that their findings could teach us about why our brains are so good at learning, and might also help us to build better artificially intelligent systems, such as digital assistants that can recognise voices and faces, or self-driving car technology.

First author Nicolas Perez, a PhD student at Imperial College London's Department of Electrical and Electronic Engineering, said: "The brain needs to be energy efficient while still being able to excel at solving complex tasks. Our work suggests that having a diversity of neurons in both brains and AI systems fulfils both these requirements and could boost learning."

The research is published in Nature Communications.
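To make the idea concrete, here is a minimal sketch (in Python with NumPy) of what "tweaking the electrical properties of individual cells" can mean in a simulation. It is not the authors' code: the neuron model (leaky integrate-and-fire), the property being varied (the membrane time constant), and every numerical value below are illustrative assumptions. It compares a population in which all neurons share one time constant with one in which each neuron gets its own.

```python
# Minimal sketch: homogeneous vs. heterogeneous leaky integrate-and-fire
# (LIF) neurons. Everything here (model choice, parameter values) is an
# illustrative assumption, not the published study's code.

import numpy as np

rng = np.random.default_rng(0)

def simulate_lif(input_current, tau_m, dt=1e-3, v_thresh=1.0, v_reset=0.0):
    """Drive a population of LIF neurons with a shared input current.

    input_current: shape (timesteps,)
    tau_m: membrane time constant per neuron, shape (n_neurons,)
    Returns a (timesteps, n_neurons) array of 0/1 spike indicators.
    """
    decay = np.exp(-dt / tau_m)               # per-neuron leak factor
    v = np.zeros_like(tau_m)                  # membrane potentials
    spikes = np.zeros((len(input_current), tau_m.size))
    for t, i_in in enumerate(input_current):
        v = decay * v + (1.0 - decay) * i_in  # leaky integration of input
        fired = v >= v_thresh
        spikes[t] = fired
        v = np.where(fired, v_reset, v)       # reset neurons that spiked
    return spikes

steps = 500
drive = 1.2 + 0.3 * rng.standard_normal(steps)  # noisy constant drive

# Homogeneous population: every neuron integrates with the same 20 ms
# time constant, so all of them respond on a single timescale.
homog = simulate_lif(drive, tau_m=np.full(50, 20e-3))

# Heterogeneous population: time constants spread from 5 ms to 50 ms,
# so the same input is integrated over many different timescales at once.
heterog = simulate_lif(drive, tau_m=rng.uniform(5e-3, 50e-3, size=50))

print("homogeneous spike count: ", int(homog.sum()))
print("heterogeneous spike count:", int(heterog.sum()))
```

This toy comparison only shows that a spread of time constants lets a population respond on many timescales at once. In the study itself, as the article describes it, the per-cell properties were adjusted within full simulated networks trained on learning tasks, which is where the speed and energy advantages were measured.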