Communication-aware neural networks could advance edge computing

An overview of communication-aware in-memory wireless neural networks. Credit: Yang et al.

Edge computing is an emerging IT architecture that enables the processing of data locally by smartphones, autonomous vehicles, local servers, and other IoT devices instead of sending it to be processed at a centralized large data center. This approach could allow artificial intelligence (AI) models and other computational systems to perform tasks rapidly, while consuming less power.

Despite the potential of this approach, local devices typically have limited battery capacity and restricted computing capabilities. This means they often need to send data to remote cloud servers via the internet to complete complex calculations. Wirelessly transmitting this information can consume significant amounts of energy and slow down overall processing.

Researchers at Nanjing University recently introduced a new approach that could boost the speed of communication between edge devices and cloud servers, while also reducing energy consumption. Their proposed strategy, introduced in a paper published in Nature Electronics, relies on communication-aware in-memory wireless neural networks, newly developed computational tools that combine computing, memory, and wireless communication in a single AI-powered system.

“The inspiration for this paper came from our belief that the future will be filled with a vast number of intelligent terminal devices,” Feng Miao, senior author of the paper, told Tech Xplore. “These devices are expected to possess remarkable levels of intelligence, which will require enormous computational capability while operating under very limited energy resources. This challenge led us to consider how in-memory computing technologies could help support such a future technological landscape.”

Low power AI that adapts to wireless conditions
The main objective of the recent work by Miao and his colleagues was to explore how in-memory computing architectures, hardware systems that both process and store information, could pave the way for smarter systems that work well under strict energy constraints. They specifically examined this in the context of edge computing applications, where data is processed by electronic devices at the edge of a network, also known as intelligent terminal devices.

“The energy consumption of intelligent terminal devices comes from two main sources: the computation required to perform neural network inference, and the wireless communication between the terminal and the cloud server,” explained Miao. “Our approach reduces the energy consumption of intelligent terminals from two perspectives: hardware design and algorithmic optimization.”

From a hardware standpoint, the researchers introduced a computing-in-memory design that optimizes both computing operations and communications. Their proposed hardware system could process, store, and transmit information while consuming significantly less energy.

“On the algorithmic side, we introduced a communication-aware training method,” said Miao. “In this framework, wireless communication is no longer treated as an independent process that aims for lossless data transmission. Instead, it allows lossy transmission and is integrated into the neural network as an optimizable module.”

Essentially, Miao and his colleagues developed a new approach that can be used to train artificial neural networks to perform inference tasks with good accuracy even when wireless signals are weak and transmission is slow. All this is achieved while minimizing the power-related costs of wireless communication, thus boosting energy efficiency.
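The idea of treating the lossy wireless link as part of the network itself can be illustrated with a toy sketch. The paper does not publish its implementation here, so everything below is a hypothetical simplification: a split-inference setup where a "device-side" layer produces features, those features pass through a simulated additive-white-Gaussian-noise (AWGN) channel whose signal-to-noise ratio models the transmit power, and a "cloud-side" layer finishes the computation. The weight matrices and SNR values are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def awgn_channel(features, snr_db):
    """Model a lossy wireless link as additive white Gaussian noise.
    Lower SNR corresponds to a weaker signal / lower transmit power."""
    signal_power = np.mean(features ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    noise = rng.normal(0.0, np.sqrt(noise_power), size=features.shape)
    return features + noise

# Hypothetical toy sub-networks (random weights, for illustration only).
W_device = rng.normal(0, 0.1, size=(16, 8))  # runs on the edge device
W_cloud = rng.normal(0, 0.1, size=(8, 4))    # runs on the cloud server

def split_inference(x, snr_db):
    """Device computes features, transmits them over the noisy channel,
    and the cloud completes the inference on the received features."""
    features = np.maximum(x @ W_device, 0.0)   # device-side layer (ReLU)
    received = awgn_channel(features, snr_db)  # lossy wireless transmission
    logits = received @ W_cloud                # cloud-side layer
    return logits

x = rng.normal(size=(2, 16))
clean = split_inference(x, snr_db=60.0)  # near-lossless link
noisy = split_inference(x, snr_db=0.0)   # weak signal
print(clean.shape)  # (2, 4)
```

In a communication-aware training loop, this same noise injection would be applied to the features at every training step, so the network learns weights that keep inference accuracy high even at low SNR, which is what allows transmit power to be reduced without a matching loss of accuracy.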

Reducing the power required to transmit AI data
The team’s communication-aware in-memory computing architecture was initially tested on an image classification task, and it was found to perform remarkably well. The neural network achieved an accuracy of 93.71%, which was maintained even when transmission conditions were not ideal.

“Our approach breaks down the traditional barrier between the fields of wireless communications and artificial intelligence, offering new perspectives that could inspire researchers in both communities,” said Miao.

“From a practical standpoint, the method can significantly reduce the wireless communication cost of intelligent terminal devices. In the paper, we provide an example based on the ImageNet dataset: when a terminal device and the cloud collaboratively perform an inference task, our method can reduce the required wireless transmission power by up to 95%.”

Notably, the team’s approach could be applied to various scenarios, with varying wireless channel conditions and modulation schemes. Eventually, it could contribute to the advancement of edge computing applications, allowing intelligent terminal devices to perform well even if they rely on mobile communication technology or other highly dynamic wireless communication sources.

“In our future work, we plan to further improve the practicality of this approach and carry out additional engineering optimizations based on our current results,” added Miao. “For example, we plan to extend the method to more commonly used MIMO communication systems and explore on-chip integration to move the technology closer to real-world deployment.”

https://techxplore.com/news/2026-03-communication-aware-neural-networks-advance.html
