
Network AI Evolution, 5G to 6G

While AI has recently been touted as ‘the most over-hyped technology’ in the public consciousness, telecoms has used different forms of automation and machine learning [ML] for years. However, “Network AI” as a term is not necessarily helpful or descriptive when we are talking about the change from the current situation towards something more pervasive.

6GWorld took a masterclass on AI and ML in telecoms networks – where we stand today in 5G and what comes next – with Rohde & Schwarz’s Andreas Roessler. The company has recently joined the AI-RAN Alliance and, unsurprisingly, has a stake in understanding how this might shape the future telecoms ecosystem.

The State of Play

“At first, AI/ML was mainly used in core network operations, starting from Release 15,” Roessler began. In other words, AI was a part of the first 5G release in 2018.

“This involved gathering lots of data to help manage network functions and simplify operations using advanced data analytics. While the 3GPP standard continues to adopt AI/ML methods, how they are implemented is usually up to the network operators.”

The first step in Release 15 was the addition of a Network Data Analytics Function, or NWDAF.

“NWDAF offers analysis of network slices that greatly improves operational intelligence across network functions,” Roessler explained.
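
For a sense of what consuming NWDAF analytics looks like in practice, here is a minimal sketch of a network function requesting slice-load analytics over the Nnwdaf_AnalyticsInfo service defined in 3GPP TS 29.520. The host address, parameter encoding, and response handling below are simplified assumptions, not a conformant implementation.

```python
# Minimal sketch of a consumer NF requesting slice-load analytics from an
# NWDAF over its service-based interface. The Nnwdaf_AnalyticsInfo service
# is defined in 3GPP TS 29.520; the host, parameter encoding, and payload
# handling here are simplified assumptions, not a conformant implementation.
import json
import requests

NWDAF_URL = "https://nwdaf.example.operator.net"  # hypothetical NWDAF address

def get_slice_load(snssai: dict) -> dict:
    """Ask the NWDAF for load-level analytics on one network slice."""
    params = {
        "event-id": "LOAD_LEVEL_INFORMATION",              # analytics type
        "event-filter": json.dumps({"snssais": [snssai]}), # slice of interest
    }
    resp = requests.get(f"{NWDAF_URL}/nnwdaf-analyticsinfo/v1/analytics",
                        params=params, timeout=5)
    resp.raise_for_status()
    return resp.json()  # e.g. current and predicted load level for the slice

if __name__ == "__main__":
    print(get_slice_load({"sst": 1, "sd": "000001"}))
```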

In 2022’s Release 17 the NWDAF gained two additional elements: the Analytics Logical Function [AnLF] and the Model Training Logical Function [MTLF].

“The MTLF is responsible for creating and improving ML models and supporting the launch of new training services that enhance the network’s adaptability. Meanwhile, the AnLF focuses on conducting inference and detailed data analytics, improving network reliability and performance,” Roessler said.
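
The division of labour between the two can be pictured as a training side and an inference side. The sketch below is purely illustrative: the class and method names are inventions rather than 3GPP-specified interfaces, and the “model” is a toy least-squares fit.

```python
# Illustrative sketch of the MTLF/AnLF split inside the NWDAF: the MTLF
# trains and versions ML models, the AnLF fetches a trained model and runs
# inference to produce analytics. Names are inventions for illustration,
# not 3GPP-specified interfaces.
import numpy as np

class ModelTrainingLogicalFunction:
    """MTLF: owns training data and produces versioned models."""
    def __init__(self):
        self.registry = {}   # model name -> list of (version, weights)

    def train(self, name: str, X: np.ndarray, y: np.ndarray):
        # Toy 'model': ordinary least squares fit of load vs. time features.
        w, *_ = np.linalg.lstsq(X, y, rcond=None)
        versions = self.registry.setdefault(name, [])
        versions.append((len(versions) + 1, w))

    def provision(self, name: str):
        """Hand the latest model version to a consumer (e.g. an AnLF)."""
        return self.registry[name][-1]

class AnalyticsLogicalFunction:
    """AnLF: performs inference with a model provisioned by the MTLF."""
    def __init__(self, mtlf: ModelTrainingLogicalFunction):
        self.mtlf = mtlf

    def predict_load(self, name: str, features: np.ndarray) -> float:
        version, w = self.mtlf.provision(name)
        return float(features @ w)   # predicted slice load

# Usage: the MTLF trains on history, the AnLF answers an analytics request.
mtlf = ModelTrainingLogicalFunction()
X = np.array([[1.0, h] for h in range(24)])         # bias + hour of day
y = 40 + 2.0 * np.arange(24) + np.random.randn(24)  # synthetic load history
mtlf.train("slice-load", X, y)
anlf = AnalyticsLogicalFunction(mtlf)
print(anlf.predict_load("slice-load", np.array([1.0, 25.0])))
```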

Release 17 also saw the beginning of work on AI in the RAN. Studies on AI-enabled RAN covered several possible uses including reducing energy consumption, improving load balancing and more.

“Through these efforts, 3GPP paved the way for a smarter, more efficient, and adaptable 5G ecosystem,” Roessler commented.

“These advancements indicate a move towards more autonomous, AI-driven network operations, promising major enhancements in operational efficiency and user experience.”

The State of the Art

While 5G has started introducing some fundamentals for an AI-enabled network, 5G-Advanced takes that further. 3GPP Release 18, the first for 5G-Advanced, was frozen in March 2024, so there are plenty of capabilities which have not yet hit the market. Roessler highlighted a variety of 3GPP working groups that have been developing new specifications related to AI and ML.

  • The Service and System Aspects Working Group 1 (SA1) has explored methods for sharing AI/ML models across the 5G network, paving the way for a more interconnected and intelligent network fabric.
  • SA2 looked at architectural support for AI/ML-based services within the 5G system, laying the groundwork for improved service delivery and operational efficiency.
  • The SA4 group, which focuses on media services, examined potential uses of AI/ML for 5G media codecs, offering significant improvements in media quality and delivery.
  • Meanwhile, SA5 tackled AI/ML management, aiming to harmonise AI/ML functions across 5G systems for easier management, orchestration, and charging.

6G: Going Native

Discussion of future networks tends to raise the spectre of an “AI-native network.” Given that we have had AI functionality built into the specifications of 5G from the start, what is this supposed to mean? Is it just a buzzword?

“The term ‘AI-native’ implies that AI is a fundamental part of the design and operation of the air interface, rather than just an added enhancement,” Roessler commented.

“Unlike 5G, where AI and ML are applied in specific use cases like network optimisation, predictive maintenance, and improved user experience, an AI-native approach for 6G integrates AI as an integral part of the network. This means network protocols, signal processing, and system optimisation in 6G will inherently use AI capabilities. The shift is from using AI as a performance booster to using AI as a key technology component within the future standard.”

This is all very well, but what does that actually translate to in practice? Roessler identified three areas.

“Firstly, the seamless transfer of AI models between network entities (like base stations and user equipment) or across different layers of the network architecture becomes essential. This allows for dynamic updates and optimisations, ensuring that AI models match current operational conditions and requirements,” he said.
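
As a rough illustration of what such a transfer involves, the sketch below packages model weights together with version and operating-condition metadata before sending them, say, from a base station to a device. The container format is invented for illustration; 3GPP has not yet fixed a standard model-transfer container.

```python
# Hedged sketch of transferring an AI model between network entities
# (e.g. gNB -> UE) together with the metadata needed to decide whether it
# matches current operating conditions. The packaging format is invented
# for illustration only.
import io, json, zlib, hashlib
import numpy as np

def package_model(weights: np.ndarray, meta: dict) -> bytes:
    buf = io.BytesIO()
    np.save(buf, weights)                        # raw weights
    blob = zlib.compress(buf.getvalue())         # keep over-the-air size down
    header = json.dumps({**meta, "sha256": hashlib.sha256(blob).hexdigest()})
    return header.encode() + b"\n" + blob

def unpack_model(payload: bytes):
    header, blob = payload.split(b"\n", 1)
    meta = json.loads(header)
    assert hashlib.sha256(blob).hexdigest() == meta["sha256"], "corrupt model"
    weights = np.load(io.BytesIO(zlib.decompress(blob)))
    return weights, meta

# Receiver-side usage: unpack, then check the model was trained for
# conditions resembling the ones currently observed before activating it.
weights = np.random.randn(64, 16).astype(np.float32)
payload = package_model(weights, {"model": "csi-compress", "version": 3,
                                  "trained_for": {"speed_kmh": [0, 60]}})
w, meta = unpack_model(payload)
print(meta["model"], "v", meta["version"], w.shape)
```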

Picking up and expanding upon the work in Release 18, he also highlighted the lifecycle management of AI/ML systems.

Roessler commented, “This includes version control, performance monitoring, and continuous learning mechanisms to adapt to new data and scenarios, ensuring the models remain effective over time.”
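
One concrete piece of that lifecycle is drift detection: monitoring a model’s live performance and flagging when retraining is due. Below is a minimal sketch; the window size, baseline, and tolerance are illustrative assumptions, not values from any standard.

```python
# Minimal sketch of one lifecycle-management piece: continuous performance
# monitoring with a drift trigger. Window size, baseline, and tolerance
# are illustrative assumptions.
import random
from collections import deque

class DriftMonitor:
    def __init__(self, window: int = 200, baseline: float = 0.95,
                 tolerance: float = 0.05):
        self.scores = deque(maxlen=window)  # rolling window of live accuracy
        self.baseline = baseline            # accuracy at deployment time
        self.tolerance = tolerance          # allowed degradation

    def observe(self, correct: bool) -> bool:
        """Record one prediction outcome; return True if retraining is due."""
        self.scores.append(1.0 if correct else 0.0)
        if len(self.scores) < self.scores.maxlen:
            return False                    # not enough evidence yet
        live = sum(self.scores) / len(self.scores)
        return live < self.baseline - self.tolerance

# Simulate a model whose live accuracy slowly degrades as conditions shift.
monitor = DriftMonitor()
for step in range(2000):
    p_correct = 0.97 - step * 0.0002        # simulated drift
    if monitor.observe(random.random() < p_correct):
        print(f"drift detected at step {step}: trigger retraining")
        break
```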

Perhaps not surprisingly for someone with a test and measurement background, the third aspect he highlighted was testing and validation. This matters largely because AI in a complex environment can behave unexpectedly, as highlighted in a recent article on “The Many Faces of Network AI.”

“More than traditional deterministic testing methods may be needed for AI,” Roessler commented.

“We need new strategies that account for the probabilistic nature of AI decisions, the variability of training data, and the potential for model drift over time. This requires a more flexible, adaptive approach to testing and validation, possibly incorporating real-world data in addition to thorough simulations to ensure robustness and reliability.”
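
One way to picture such a strategy is a statistical acceptance test: rather than a single deterministic pass/fail vector, the component under test is exercised over many randomised trials and passes only if a confidence bound on its error rate stays below a limit. The detector, trial count, and limit below are all illustrative assumptions.

```python
# Sketch of a statistical acceptance test for a probabilistic component:
# run many randomised trials and require the upper confidence bound on the
# error rate to sit below a limit. Limit, trial count, and the toy
# detector are illustrative assumptions.
import math, random

def upper_confidence_bound(errors: int, trials: int, z: float = 2.58) -> float:
    """Normal-approximation 99% upper bound on the true error rate."""
    p = errors / trials
    return p + z * math.sqrt(p * (1 - p) / trials)

def run_acceptance_test(detector, trials: int = 10_000,
                        limit: float = 0.01) -> bool:
    errors = sum(0 if detector(random.gauss(1.0, 0.3)) else 1
                 for _ in range(trials))
    return upper_confidence_bound(errors, trials) < limit

# Toy stand-in for a model under test: decide 'signal present' if the
# noisy observation exceeds a threshold.
detector = lambda x: x > 0.0
print("PASS" if run_acceptance_test(detector) else "FAIL")
```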

Testing, Testing

Roessler, and Rohde & Schwarz more generally, have been wrestling with some of this in 2024, showcasing a testbed for “neural receiver architectures” at MWC in partnership with Nvidia.

The idea of an AI-native 6G air interface is leading some researchers to explore what happens if traditional deterministic signal processing blocks are replaced with machine learning concepts, designing dedicated neural networks for these tasks.

“Blocks that logically belong together are trained and replaced by a single machine-learning model. A prominent example is the combination of channel estimation, channel equalization, and demapping into a ‘neural receiver,’” Roessler explained.
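
To make the concept concrete, here is a toy PyTorch sketch of a neural receiver: a single network maps the received resource grid straight to per-bit log-likelihood ratios, folding channel estimation, equalisation, and demapping into one trainable block. The dimensions and architecture are illustrative and bear no relation to the Rohde & Schwarz testbed.

```python
# Toy sketch of the 'neural receiver' idea: one trainable network maps the
# received resource grid (complex IQ samples, split into real/imag planes)
# directly to per-bit log-likelihood ratios (LLRs), replacing separate
# channel estimation, equalisation, and demapping. Purely illustrative.
import torch
import torch.nn as nn

class NeuralReceiver(nn.Module):
    def __init__(self, bits_per_symbol: int = 2):
        super().__init__()
        # Input: [batch, 2, subcarriers, ofdm_symbols] (real & imag planes)
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            # One LLR per transmitted bit on every resource element
            nn.Conv2d(32, bits_per_symbol, kernel_size=1),
        )

    def forward(self, rx_grid: torch.Tensor) -> torch.Tensor:
        return self.net(rx_grid)   # LLRs: [batch, bits, subcarriers, symbols]

rx = torch.randn(8, 2, 72, 14)          # a batch of received slots
llrs = NeuralReceiver()(rx)
# Training would minimise binary cross-entropy between these LLRs and the
# known transmitted bits over many simulated channel realisations.
loss = nn.BCEWithLogitsLoss()(llrs, torch.randint(0, 2, llrs.shape).float())
print(llrs.shape, float(loss))
```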

While this sounds fun to technical ears, Roessler was quick to point out the practical problem it can solve.

“Wireless systems often involve dynamic and rapidly changing environments. Traditional channel estimation, based on a standardised pilot signal pattern, may not always be effective – for example, in high-mobility scenarios where channel estimation quickly becomes outdated.”

The challenge here is accuracy versus the cost of delivering it. It is theoretically possible to use a denser pilot signal scheme and reduce estimation errors, but doing so creates more overhead and reduces spectral efficiency.
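
The trade-off is easy to make concrete with back-of-envelope arithmetic. The grid below loosely resembles one 5G NR physical resource block over a slot; the pilot densities are assumptions chosen for illustration.

```python
# Back-of-envelope arithmetic for the pilot-density trade-off: resource
# elements spent on pilots are pure overhead. Grid sizes loosely resemble
# one 5G NR PRB over one slot; the densities are assumptions.
SUBCARRIERS, SYMBOLS = 12, 14
total_re = SUBCARRIERS * SYMBOLS              # 168 resource elements

for name, pilot_re in [("1 DMRS symbol, every 2nd subcarrier", 6 * 1),
                       ("2 DMRS symbols, every 2nd subcarrier", 6 * 2),
                       ("4 DMRS symbols, every 2nd subcarrier", 6 * 4)]:
    overhead = pilot_re / total_re
    print(f"{name}: {overhead:.1%} overhead, "
          f"{1 - overhead:.1%} of REs left for data")
# Denser pilots track a fast-fading channel better but directly reduce
# spectral efficiency - the gap a learned receiver tries to close.
```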

“Some scientific papers suggest that neural receivers can learn the time behaviour of the wireless channel and handle these situations better,” Roessler commented.

“However, these assessments are usually based on simulations and are often compared with non-optimised implementations of traditional signal processing concepts.”

Roessler has been using the testbed to verify (or otherwise) these ideas and upgrading it on an ongoing basis.

“Integrating the transmitter into the training process was a major advancement. We expect that ML-based signal processing will first be implemented in network infrastructure, like the base station, in the initial version of a future 6G standard. This is due to the complexity and computational effort required, which increases power consumption,” he said.
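
What “integrating the transmitter into the training process” can look like, in miniature, is an end-to-end autoencoder: a learned constellation mapper and a neural demapper optimised jointly through a differentiable channel model. The sketch below is a toy illustration of the principle, not the R&S/Nvidia testbed design.

```python
# Toy end-to-end autoencoder: a learned mapper (tx) and neural demapper
# (rx) trained jointly through a differentiable AWGN channel, so gradients
# flow back into the transmitter. Dimensions and noise level are
# illustrative assumptions.
import torch
import torch.nn as nn

bits_per_symbol = 2
tx = nn.Sequential(nn.Linear(bits_per_symbol, 16), nn.ReLU(),
                   nn.Linear(16, 2))                # bits -> (I, Q)
rx = nn.Sequential(nn.Linear(2, 16), nn.ReLU(),
                   nn.Linear(16, bits_per_symbol))  # (I, Q) -> bit LLRs
opt = torch.optim.Adam(list(tx.parameters()) + list(rx.parameters()), lr=1e-2)

for step in range(500):
    bits = torch.randint(0, 2, (256, bits_per_symbol)).float()
    iq = tx(bits)
    iq = iq / iq.pow(2).sum(dim=1, keepdim=True).sqrt().mean()  # power norm
    y = iq + 0.1 * torch.randn_like(iq)    # differentiable AWGN channel
    loss = nn.BCEWithLogitsLoss()(rx(y), bits)  # gradients reach tx AND rx
    opt.zero_grad(); loss.backward(); opt.step()

print(f"final training loss: {float(loss):.3f}")
```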

If this process can be made much more lightweight and power-efficient, we could see it implemented at the device as well, which would improve performance further.

Perhaps more interestingly, earlier this year Rohde & Schwarz highlighted how a neural receiver can be trained to work well in problematic environments.

“We showed that the neural receiver model must be trained with impaired data to handle analog impairments such as carrier frequency offset (CFO) during inference,” Roessler commented.

“In this latest demonstration, we trained the model to handle a CFO of 0.5 parts-per-million as defined in the 3GPP specification for 5G NR. We showed that the neural receiver still performs well when receiving data with a slight frequency offset, unlike a model untrained to handle such impairments.”
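
Injecting such an impairment into training data is straightforward in principle: a fixed frequency offset appears as a progressive phase rotation across the received samples. The sketch below follows the 0.5 parts-per-million figure quoted above; the carrier frequency and sample rate are assumptions.

```python
# Sketch of injecting a carrier frequency offset (CFO) into training data,
# following the article's 0.5 ppm figure. The carrier frequency and sample
# rate are assumptions; a CFO appears as a progressive phase rotation.
import numpy as np

FC = 3.5e9           # assumed carrier frequency: 3.5 GHz (5G NR band n78)
FS = 30.72e6         # assumed baseband sample rate
PPM = 0.5            # offset from the article / 3GPP requirement

cfo_hz = FC * PPM * 1e-6                       # 0.5 ppm of 3.5 GHz = 1750 Hz
print(f"CFO: {cfo_hz:.0f} Hz")

def apply_cfo(samples: np.ndarray, cfo_hz: float, fs: float) -> np.ndarray:
    """Rotate complex baseband samples by a constant frequency offset."""
    n = np.arange(len(samples))
    return samples * np.exp(2j * np.pi * cfo_hz * n / fs)

clean = (np.random.randn(1024) + 1j * np.random.randn(1024)) / np.sqrt(2)
impaired = apply_cfo(clean, cfo_hz, FS)
# Training the receiver on 'impaired' rather than 'clean' batches is what
# lets it tolerate the same offset at inference time.
print(np.angle(impaired[-1] / clean[-1]))      # accumulated phase rotation
```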

Next: Gen-AI?

We have all heard a great deal about generative AI over the last couple of years, not least as a way to make interfaces between people and computers much more intuitive. Does it have a use within 6G as well? Should we be thinking of 6G as “Gen-AI Native”?

“In the context of a 6G AI-native interface, Generative AI can greatly improve various aspects of wireless systems, including the development of channel models,” Roessler suggested.

In the past, channel models have been developed through extensive measurement campaigns, capturing information on specific environments and frequency bands. These have held up well enough in 4G and 5G, but they only reflect particular situations. Roessler flagged up ways for Gen-AI to improve on this.

He said, “Researchers can use Generative Adversarial Networks (GANs) or other methods to create realistic and dynamic channel models that capture the complex behaviours of wireless radio channels. Unlike traditional models, AI/ML-generated models may provide a more detailed and adaptable representation of channel environments.”
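
In miniature, the GAN recipe looks like the sketch below: a generator learns to produce synthetic channel impulse responses that a discriminator cannot distinguish from measured ones. The dimensions are toys and the “measured” data is simulated here; a real effort would train against measurement-campaign data.

```python
# Toy GAN for channel modelling: a generator produces synthetic channel
# impulse responses (CIRs); a discriminator tries to tell them apart from
# 'measured' ones. Sizes are illustrative; measured data is simulated.
import torch
import torch.nn as nn

TAPS, LATENT = 16, 8
G = nn.Sequential(nn.Linear(LATENT, 64), nn.ReLU(), nn.Linear(64, 2 * TAPS))
D = nn.Sequential(nn.Linear(2 * TAPS, 64), nn.ReLU(), nn.Linear(64, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

def measured_cir(batch: int) -> torch.Tensor:
    """Stand-in for measured CIRs: exponentially decaying random taps."""
    decay = torch.exp(-torch.arange(TAPS) / 4.0)
    taps = torch.randn(batch, 2, TAPS) * decay        # real & imag parts
    return taps.reshape(batch, 2 * TAPS)

for step in range(200):
    real, fake = measured_cir(64), G(torch.randn(64, LATENT))
    # Discriminator: real -> 1, generated -> 0
    loss_d = bce(D(real), torch.ones(64, 1)) + \
             bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator: fool the discriminator
    loss_g = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

print(f"D loss {float(loss_d):.3f}, G loss {float(loss_g):.3f}")
# Sampling G(noise) now yields synthetic CIRs for receiver training.
```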

In fact, work on this is already under way.

“Recent studies suggest that AI-generated channels are superior in training neural receivers. They improve performance in various mobility conditions and adapt dynamically to changing environments, significantly reducing error rates. This improves the neural receiver’s ability to perform channel estimation, equalisation, and demapping effectively,” Roessler commented.

There are drawbacks, of course. These tend to be the same challenges that Gen-AI poses more broadly: the complexity and resource demands of training, and the need for large, high-quality datasets. Roessler is confident that these challenges can be mitigated and overcome.

“The future looks bright.”

