Why can’t a computer both play chess and recognize images familiar to most people? It’s a simple question that cuts to the core of one of the biggest challenges in computing today: Despite their immense processing power, today’s computers still fail when confronted with some of the most basic human tasks.
The problem stems from computers’ lack of general intelligence, or the ability to excel at more than just one task. Despite our range of niche obsessions, we humans tend to be pretty good at this — but we can’t say the same for machines. While the Deep Blue supercomputer bested humans at chess over two decades ago, it would have utterly failed at, say, comprehending the meaning of a handshake (so it doesn’t sound too fun to hang out with).
Now, powerful machine learning algorithms are edging closer toward general intelligence by demonstrating their ability to recognize patterns and emulate human speech. But true general intelligence remains difficult for devices to achieve.
To tackle this feat, researchers have recently proposed computer designs inspired by the human brain’s structure, specifically its tens of billions of neurons that are laced together into intricate, interrelated networks.
Take, for instance, the SpiNNaker supercomputer from the University of Manchester in England. The high-tech machine can emulate tens of thousands of neurons to mimic the way a brain works.
But that’s still only a fraction of the number of neurons contained in our powerful heads, and SpiNNaker is a long way from being human. Instead, Axel Hoffmann, a materials scientist at the University of Illinois at Urbana-Champaign, hopes that the solution lies in futuristic quantum materials.
In a paper published in the journal APL Materials last month, Hoffmann and his co-authors explore how these materials would enable computer chips to behave like human neurons. These chips could carry out functions far more efficiently than most computers, and even form networks that behave like regions of the brain.
The Power Problem — It’s difficult to create computers with human-like cognition because we need massive amounts of power to emulate the brain. While our minds only require about 20 watts of power to do their thing, a supercomputer like China’s Tianhe-2 sucks up 17.8 million watts (enough to power a small town) and still hasn’t reached general intelligence.
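Those two power figures put the efficiency gap in rough perspective (a back-of-the-envelope calculation using only the numbers quoted above):

```python
# Quick comparison of the power figures cited above.
brain_watts = 20          # approximate power draw of a human brain
tianhe2_watts = 17.8e6    # reported draw of the Tianhe-2 supercomputer

ratio = tianhe2_watts / brain_watts
print(f"Tianhe-2 draws roughly {ratio:,.0f} times the power of a brain")
```

In other words, the supercomputer burns nearly a million brains’ worth of electricity without matching a single brain’s flexibility.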
Clearly, throwing more processors at the problem isn’t a sustainable solution. That’s why scientists like Hoffmann are rethinking the basic architecture of a computer, which encodes information using long strings of ones and zeros.
Maintaining those ones and zeros takes a lot of energy, partly because computers need to keep them strictly separated, Hoffmann says.
And unlike the mind, traditional computers carry out their processing separately from their memory. That means they use a lot of energy simply carrying information back and forth from memory to processor, which sounds pretty exhausting.
Recreating Neurons — To circumvent this problem, Hoffmann and other researchers want to make computer chips inspired by the basic mechanics of our brains’ neurons and synapses. In their APL Materials paper, Hoffmann and his co-authors lay out an innovative approach that would incorporate circuits made not of silicon, the current standard ingredient, but of quantum materials.
A quantum material may sound far-fetched, but Hoffmann says it’s simply an umbrella term for materials with properties that traditional models of physics can’t quite explain. (Quantum materials are also distinct from quantum computers, which rely on units of information called qubits that hold superpositions of a one and a zero simultaneously.)
Specifically, Hoffmann is most interested in materials that can change state — from, for instance, a zero to a one — with very little energy input, a property known as a “non-linear response.” It’s found in substances such as vanadium dioxide, a dark blue compound that can transition efficiently between conductor and insulator, and do so at nearly room temperature.
Hoffmann likens such responses to what goes down in water when it’s heated or cooled. “When we change the temperature of water, not much happens until suddenly it either freezes or starts boiling,” he says.
Crossing the Threshold — The neurons in our brains rely on similar tipping points, also called thresholds. Mimicking that property within a computer circuit made of vanadium dioxide could unlock super-powerful computing abilities at a fraction of the energy cost. “This could be a big step forward in establishing energy-efficient brain-like systems,” Hoffmann says.
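That tipping-point behavior maps neatly onto the textbook “leaky integrate-and-fire” neuron model. The sketch below is a generic illustration of the idea, not the specific physics from Hoffmann’s paper, and its parameter values are made up for demonstration: input quietly accumulates, nothing much happens, and then the threshold is crossed all at once, like water suddenly boiling.

```python
def run_threshold_neuron(inputs, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire sketch: accumulate leaky input,
    emit a spike (1) when the threshold is crossed, then reset.
    Parameter values are illustrative, not measured properties
    of vanadium dioxide."""
    state = 0.0
    spikes = []
    for x in inputs:
        state = state * leak + x   # below threshold, little happens...
        if state >= threshold:     # ...until the tipping point is crossed
            spikes.append(1)
            state = 0.0            # abrupt transition, then reset
        else:
            spikes.append(0)
    return spikes

# A steady trickle of weak input produces occasional, sudden spikes.
spikes = run_threshold_neuron([0.3] * 10)
print(spikes)
```

Feeding the neuron a constant weak signal shows the non-linearity: most time steps produce nothing, then the unit fires in a single abrupt step.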
To take advantage of these tipping points, researchers can use materials whose magnetization can be switched. In theory, these materials would oscillate between different magnetic states, exhibiting exactly the kind of non-linear response Hoffmann is searching for. While it’s still a developing area of research, scientists have observed these kinds of magnetic oscillations in layered combinations of metals such as iron and rhodium, as well as cobalt and nickel.
“It appears that these magnetic oscillators can resemble a lot of the properties that we know from natural neurons,” he says.
Along with providing computers with brain-inspired efficiency, Hoffmann and his colleagues see additional possibilities for machines concocted from quantum materials. For example, when hooked together, magnetic oscillators seem to influence each other, much the same way that networks of neurons work in sync to perform complex tasks. This behavior could eventually pave the way for general intelligence — and perhaps even consciousness.
“We believe that larger networks of coupled magnetic oscillators may provide similarly complex dynamics as the natural brain,” Hoffmann says.
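The coupling behavior Hoffmann describes can be illustrated with a Kuramoto-style toy model, a standard textbook picture of synchronizing oscillators. To be clear, this is a generic sketch, not the magnetic dynamics from the APL Materials paper: each oscillator has its own natural frequency, but coupling pulls their phases together until they lock.

```python
import math

def simulate(phases, freqs, coupling, dt=0.01, steps=5000):
    """Euler-step a Kuramoto model: each oscillator's phase is nudged
    toward its neighbors' phases in proportion to the coupling strength."""
    n = len(phases)
    phases = list(phases)
    for _ in range(steps):
        new = []
        for i in range(n):
            pull = sum(math.sin(phases[j] - phases[i]) for j in range(n))
            new.append(phases[i] + dt * (freqs[i] + coupling / n * pull))
        phases = new
    return phases

def order_parameter(phases):
    """Magnitude of the mean phase vector: 1.0 means full synchrony."""
    n = len(phases)
    re = sum(math.cos(p) for p in phases) / n
    im = sum(math.sin(p) for p in phases) / n
    return math.hypot(re, im)

# Three oscillators with slightly different natural frequencies,
# started out of phase, lock together under modest coupling.
final = simulate([0.0, 1.0, 2.0], [1.0, 1.1, 0.9], coupling=2.0)
print(f"order parameter after coupling: {order_parameter(final):.2f}")
```

With the coupling switched on, the order parameter climbs toward 1, the collective, networked behavior that makes coupled oscillators a candidate substrate for brain-like dynamics.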
All in all, this development could mark a major step toward forging artificial intelligence that can rival our own minds. And while they’re at it, computers may even become better conversationalists.