Combining compute and storage in a single unit with low-power electronics

Researchers from Graphene Flagship Partners École Polytechnique Fédérale de Lausanne (EPFL, Switzerland) and the University of Pisa (Italy), in collaboration with the University of Messina (Italy), have developed a low-power, brain-inspired device based on ultra-thin semiconducting molybdenum disulfide (MoS2). It includes circuits that behave like artificial neurons (perceptrons) and could potentially outperform their silicon counterparts.

The transfer of data between a computer’s processor and memory accounts for over a third of all the energy consumed by our technological devices. The time and energy wasted shuttling data back and forth is one of the main limitations of the current silicon-based computer architecture, in which logic and memory reside in separate components. Researchers are looking for new materials that could revolutionize our computers by enabling processors that also function as storage devices. Such systems, which mimic how neurons work, could speed up computationally intensive applications like image processing, machine learning, and artificial neural networks.

Building on previous research published in Nature in 2020, the Graphene Flagship team developed a processor incorporating an array of floating-gate memories – devices that can trap electrons for long periods of time – using monolayer MoS2 as the active channel. Floating-gate transistors are already present in the flash memory used in cameras, smartphones and computers. The new MoS2-based transistors can be built much smaller and consume less energy. Because they are so thin, they are also more sensitive than silicon to differences in the number of electrons entering and exiting the floating gate. By adding or removing electrons from the floating gate, the Graphene Flagship team achieved 16 different conductivity levels in the MoS2 channel. This opens up far more possibilities than silicon-based memory devices, in which it is difficult to program more than two levels: conducting or non-conducting.
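To give a sense of what such multi-level cells offer in practice, here is a minimal sketch (not taken from the paper) that maps continuous synaptic weights onto 16 discrete conductance levels, the number of states reported for the MoS2 floating-gate cells; the normalised conductance range and the uniform spacing of the levels are assumptions made purely for illustration.

```python
# Minimal sketch: snapping analog synaptic weights onto 16 discrete
# conductance levels, as a stand-in for a multi-level floating-gate cell.
# The 0-1 conductance range and uniform spacing are illustrative assumptions.
import numpy as np

N_LEVELS = 16                      # 16 programmable states (4 bits per cell)
G_MIN, G_MAX = 0.0, 1.0            # normalised conductance range (assumed)

def quantize_to_levels(weights: np.ndarray) -> np.ndarray:
    """Snap each weight to the nearest of the 16 available conductance levels."""
    levels = np.linspace(G_MIN, G_MAX, N_LEVELS)
    idx = np.abs(weights[..., None] - levels).argmin(axis=-1)
    return levels[idx]

# Example: compare a few random weights with their 16-level representation
w = np.random.uniform(G_MIN, G_MAX, size=5)
print("original :", np.round(w, 3))
print("16 levels:", np.round(quantize_to_levels(w), 3))
```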

The researchers evaluated the device’s performance in recognizing numbers shown on liquid crystal displays (LCDs), such as the digits of a digital clock, and achieved an experimental accuracy of up to 91.5%. To do this, the team used an artificial neural network, a circuit resembling a cluster of interconnected neurons. The ability of the MoS2 memories to store different levels of electrical conductivity was used to mimic the strength of the connections between neurons.
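As a rough software analogue of the experiment described above (a sketch of our own, not the authors’ code), the following toy perceptron classifies seven-segment LCD digits and then snaps its trained weights onto 16 levels, standing in for programming the MoS2 floating-gate array; the training rule, learning rate, epoch count and noise model are all illustrative assumptions.

```python
# Toy perceptron for seven-segment digit recognition, with weights quantised
# to 16 levels to mimic multi-level floating-gate synapses. Purely illustrative.
import numpy as np

# Segments a-g for digits 0-9 (1 = segment lit)
DIGITS = np.array([
    [1,1,1,1,1,1,0],  # 0
    [0,1,1,0,0,0,0],  # 1
    [1,1,0,1,1,0,1],  # 2
    [1,1,1,1,0,0,1],  # 3
    [0,1,1,0,0,1,1],  # 4
    [1,0,1,1,0,1,1],  # 5
    [1,0,1,1,1,1,1],  # 6
    [1,1,1,0,0,0,0],  # 7
    [1,1,1,1,1,1,1],  # 8
    [1,1,1,1,0,1,1],  # 9
], dtype=float)
LABELS = np.arange(10)

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(7, 10))   # synaptic weights, one column per digit
b = np.zeros(10)

# Plain perceptron-style training on the ten ideal patterns (assumed rule)
for _ in range(200):
    for x, y in zip(DIGITS, LABELS):
        pred = (x @ W + b).argmax()
        if pred != y:                      # nudge weights toward the correct class
            W[:, y] += 0.1 * x
            W[:, pred] -= 0.1 * x

# Quantise the trained weights to 16 levels, mimicking the multi-level cells
levels = np.linspace(W.min(), W.max(), 16)
Wq = levels[np.abs(W[..., None] - levels).argmin(axis=-1)]

# Evaluate on noisy patterns (randomly flipped segments) as a rough accuracy proxy
noisy = np.abs(DIGITS - (rng.random(DIGITS.shape) < 0.05))
acc = ((noisy @ Wq + b).argmax(axis=1) == LABELS).mean()
print(f"accuracy on noisy digits: {acc:.0%}")
```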

“We went beyond computer simulation and produced a working circuit using high-quality ultra-thin materials,” says Andras Kis of Graphene Flagship Partner EPFL, who led the study.

Recognizing these digital numbers can be considered one of the simplest forms of AI-based image recognition. The Graphene Flagship team also showed that if these circuits were scaled up and trained to recognize specific objects in ordinary photos, they would use nearly 40 times less power than state-of-the-art silicon-based circuits.

This processor was built with memory devices just 180 nanometers in size, and the team claims they could be shrunk to 50 nanometers without significant loss of performance. Here again, the MoS2-based device could beat silicon: shrinking silicon memory devices is difficult because electrons tend to leak out of the floating gate.

“Layered materials, particularly the polycrystalline or amorphous forms of BN and transition-metal dichalcogenides, are now being seriously considered for the development of innovative memristive devices, with potential implications for low-power information storage and processing. This work is a new milestone in the fabrication of energy-efficient devices that mimic artificial neurons, outperform competing silicon devices, and pave the way to future neuromorphic computing architectures,” said Stephan Roche, Graphene Flagship Leader of the Enabling Science & Materials Division.

Andrea C. Ferrari, Science and Technology Officer of the Graphene Flagship, adds: “Neuromorphic computing devices with low power consumption are at the heart of the Graphene Flagship’s activities, which focus on electronics based on graphene and related materials. This work provides a strong foundation for future efforts to translate these results into innovative devices and for the Horizon Europe phase of the Graphene Flagship.”

References

Migliato Marega, Guilherme, et al. “Low-Power Artificial Neural Network Perceptron Based on Monolayer MoS2.” ACS Nano (2022). https://pubs.acs.org/doi/abs/10.1021/acsnano.1c07065

Migliato Marega, Guilherme, et al. “Logic-in-memory based on an atomically thin semiconductor.” Nature 587.7832 (2020): 72-77. https://www.nature.com/articles/s41586-020-2861-0
