This advance, led by Professor J. Joshua Yang and his team, doesn’t just simulate the brain’s behavior digitally; it physically recreates the electrochemical process of biological neurons. The new devices might eventually help reduce the enormous energy consumption of today’s AI systems, while also helping scientists better understand how the brain itself functions.
The technology is based on neuromorphic computing, an approach that mimics the brain’s design and mechanisms to improve hardware. Unlike traditional chips that use only fast but energy-hungry electrons, neuromorphic systems can integrate learning directly into the hardware. This means they have the potential to process information using far less energy, similar to how the human brain functions at only around 20 watts. According to Nature Electronics, where the research was published, these neurons occupy the footprint of a single transistor—compared to the dozens or hundreds usually required.
How a Diffusive Memristor Enables Brain-Like Behavior
At the core of this breakthrough is the diffusive memristor, a device that controls the movement of silver ions within a thin oxide layer. When a voltage is applied, the ions form a conductive channel that allows current to pass, triggering a spike much as a biological neuron fires. Crucially, the channel is volatile: once the voltage is removed, the silver ions diffuse back and the channel dissolves on its own.
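The volatile switching described above can be captured in a toy model: an internal state grows while voltage drives silver ions into a channel, relaxes as the ions diffuse back, and produces a spike when it crosses a conduction threshold. This is an illustrative sketch only; the function name, parameters, and values are assumptions, not taken from the paper.

```python
def simulate_diffusive_memristor(voltage_trace, growth=0.3, decay=0.1, threshold=1.0):
    """Return a list of 0/1 spike flags, one per timestep.

    w stands in for the strength of the silver-ion conduction channel:
    it grows while voltage is applied and decays (ions diffuse back)
    when it is not -- the volatility that makes the device behave like
    a neuron rather than a memory cell.
    """
    w = 0.0
    spikes = []
    for v in voltage_trace:
        w += growth * v   # applied field drives silver ions into a channel
        w -= decay * w    # spontaneous diffusion relaxes the channel
        if w >= threshold:
            spikes.append(1)
            w = 0.0       # channel ruptures after conducting a spike
        else:
            spikes.append(0)
    return spikes
```

Under a sustained voltage this model fires periodically, and with no voltage it never fires, mirroring the threshold-driven, self-resetting behavior the article describes.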
The process mirrors what happens in the human brain. Electrical signals travel along a neuron, turn into chemical signals at the synapse, and then back into electrical signals in the next neuron. This switch is what Yang’s team set out to replicate using ion dynamics. “Silver is easy to diffuse and gives us the dynamics we need to emulate the biosystem so that we can achieve the function of the neurons, with a very simple structure,” Yang explained.

While the device does not use the same ions as the human body, the physics behind the ion movement is remarkably similar. The result is a hardware-based learning system, which avoids the limitations of software-based learning and simulates brain activity more faithfully than current digital systems.
Neurons That Learn, Forget, and Adapt
The artificial neurons successfully demonstrated six distinct properties seen in their biological counterparts: leaky integration (gradual signal decay), threshold firing (spike generation after signal accumulation), cascaded propagation (spikes passed from one neuron to the next), intrinsic plasticity (self-adjustment of excitability based on recent activity), a refractory period (a brief pause after each spike), and stochasticity (random firing behavior).
These features are critical for building AI that behaves more like natural intelligence. For instance, intrinsic plasticity allows the neuron to adapt based on its recent activity, while stochastic behavior prevents it from getting stuck in repetitive patterns. All of this is achieved using just one memristor, one transistor, and one resistor, enabling much smaller and potentially scalable neuromorphic circuits.
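Several of these properties have standard software analogues in a leaky integrate-and-fire model, which may help make the list concrete. The sketch below illustrates leaky integration, threshold firing, a refractory period, and stochastic firing; it is a generic textbook-style model, not the team's device equations, and all parameter values are assumptions.

```python
import random

def lif_neuron(inputs, leak=0.9, threshold=1.0, refractory=2, noise=0.0, seed=0):
    """Leaky integrate-and-fire neuron returning 0/1 spike flags.

    leak < 1 makes accumulated signal decay over time (leaky integration);
    crossing `threshold` emits a spike (threshold firing); `refractory`
    timesteps of input are ignored after each spike; `noise` > 0 adds
    random jitter to the firing decision (stochasticity).
    """
    rng = random.Random(seed)
    v = 0.0            # membrane potential
    cooldown = 0       # refractory countdown
    spikes = []
    for x in inputs:
        if cooldown > 0:                 # refractory period: input ignored
            cooldown -= 1
            spikes.append(0)
            continue
        v = leak * v + x                 # leaky integration of the input
        if v + noise * rng.gauss(0, 1) >= threshold:  # (stochastic) threshold
            spikes.append(1)
            v = 0.0
            cooldown = refractory
        else:
            spikes.append(0)
    return spikes
```

For example, a steady sub-threshold input of 0.6 needs two steps to accumulate past the threshold, after which the neuron fires, resets, and sits out its refractory window before integrating again.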
In simulations, the device-powered network achieved 91.35 percent accuracy on a standard task involving spoken digit recognition. This level of performance underscores the potential of ion-based neurons as building blocks for energy-efficient neural networks.
Toward Sustainable and Scalable AI
As Yang put it, “It’s not that our chips or computers are not powerful enough for whatever they are doing. It’s that they aren’t efficient enough. They use too much energy.” Large AI models today demand massive processing power and electricity, often scaling into megawatts.
These neurons could change that. By operating with a footprint as small as a single transistor, they promise a dramatic reduction in chip size and energy use. The next goal is to integrate thousands of these neurons into a large-scale network and test their performance on broader cognitive tasks.
One remaining challenge is that silver, used for ion motion in this prototype, isn’t easily compatible with existing semiconductor manufacturing. The team plans to explore alternative materials that offer similar properties but better industrial integration.
This work doesn’t just offer a new route for AI hardware; it also opens the door to understanding more about how real neurons function. As Yang said, this innovation brings researchers a step closer to replicating the efficiency and adaptability of the human brain.