Brain-Based Circuitry Just Made Artificial Intelligence A Whole Lot Faster
We take the vast computing power of our brains for granted. But scientists are still trying to get computers to the brain’s level.
That effort is how we ended up with artificial intelligence algorithms that learn through networks of virtual neurons: the neural net.
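To make the idea of "virtual neurons" concrete, here's a minimal sketch of one in Python. Everything in it, including the layer sizes, the random weights, and the tanh activation, is an illustrative assumption about neural nets in general, not a description of the system covered in this story.

```python
import numpy as np

def neuron(inputs, weights, bias):
    """One 'virtual neuron': a weighted sum of its inputs passed through a nonlinearity."""
    z = np.dot(weights, inputs) + bias
    return np.tanh(z)  # squashing activation, loosely inspired by how real neurons fire

# A tiny two-layer "neural net" is just banks of these neurons feeding into each other.
rng = np.random.default_rng(0)
x = rng.normal(size=4)                                              # example input with 4 features
hidden = np.array([neuron(x, rng.normal(size=4), 0.0) for _ in range(3)])  # 3 hidden neurons
output = neuron(hidden, rng.normal(size=3), 0.0)                    # 1 output neuron
print(output)
```

In a real system, learning means nudging those weights until the output matches the answer you want; here they're left random purely for illustration.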
Now a team of engineers has taken another step closer to emulating the computers in our noggins: they’ve built a physical neural network, with circuits that resemble neurons even more closely. When they tested an AI algorithm on the new type of circuitry, they found that it performed as well as conventional neural nets already in use. Better yet, the new integrated neural net system completed the task with roughly one-hundredth the energy of a conventional AI algorithm.
If these new neuron-based circuits take off, artificial intelligence researchers will soon be able to do a lot more computing with a lot less energy. Conventional computer chips and neural net algorithms speak two different languages, like someone holding a tin can up to an actual telephone, and they work slower as a result. In the new system, though, the hardware and software were designed to work together from the start. That’s why the new AI system completed the tasks much faster than a conventional one, without any drop in accuracy.
This is a step up from previous attempts to make silicon-based neural networks. AI systems built on these sorts of neuron-inspired chips usually don’t work as well as conventional artificial intelligence. But the new research modeled two types of neurons: one geared for quick computations and another designed to store long-term memory, the researchers explained to MIT Technology Review.
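The story doesn’t say how those two neuron types actually work in silicon, but one way to picture the contrast in software is as two leaky units with different time constants: a fast one that forgets its state almost immediately and a slow one that holds onto it. The decay values and update rule below are purely illustrative assumptions, not details from the research.

```python
def leaky_update(state, inp, decay):
    """One time step of a leaky 'neuron': the old state fades by `decay`, new input is added."""
    return decay * state + inp

# Illustrative decay constants (assumptions, not values from the research):
FAST_DECAY = 0.1   # forgets almost everything each step -> good for quick computation
SLOW_DECAY = 0.99  # retains most of its state -> acts more like long-term memory

fast_state, slow_state = 0.0, 0.0
inputs = [1.0] + [0.0] * 9   # a single pulse of input, then silence
for x in inputs:
    fast_state = leaky_update(fast_state, x, FAST_DECAY)
    slow_state = leaky_update(slow_state, x, SLOW_DECAY)

print(fast_state, slow_state)  # the fast neuron has all but forgotten the pulse; the slow one hasn't
```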
There’s good reason to be skeptical of any researcher who claims that the answer to truly comprehensive, general artificial intelligence and consciousness is to recreate the human brain. That’s because, fundamentally, we know very little about how the brain works. And chances are, there are lots of things in our brains that a computer would find useless.
But even so, the researchers behind the new artificial neural hardware have been able to glean important lessons from how our brains work and apply them to computer science. In that sense, they have figured out how to advance artificial intelligence by cherry-picking what our brains have to offer without getting weighed down trying to rebuild the whole darn thing.
As technology sucks up more and more power, the hundred-fold improvement in energy efficiency of this AI system means scientists will be able to pursue big questions without leaving such a huge footprint on the environment.
Source: Futurism