To gain a full view of the day’s activities – and indeed all 16 sessions in the 6GSymposium – visit the 6GSymposium website from next week for the complete set of video recordings, available to anyone registered. (If you haven’t registered yet, don’t worry: registration is staying open, it’s free, and it takes just a couple of minutes.)
Alternatively, join us for our 6GSymposium recap webinar on 27th June, when 6GWorld will spend a hectic hour exploring the event’s major themes and findings with key stakeholders from 6GFlagship, University of Surrey and InterDigital.
The final day of the 6GSymposium focussed on the technology transformation coming up over the 2020s. This is driven by two factors.
First, existing technology is reaching its physical limits. While many of us have heard that Moore’s Law no longer holds in silicon development, a variety of other issues are also at play.
Dr Qi Bi, CTO of China Telecom’s research labs, observed that densification of a traditional network architecture to meet 6G service demands would not be financially viable. “We are close to the end for cellular architecture,” he observed. Instead, he proposed that the industry adapt sideloading and mesh networking protocols to create an automated device-to-device network that would work as an auxiliary to, and sometimes a replacement for, cellular networks – which he described as proximity-RAN, or P-RAN.
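The relaying idea behind such a device-to-device mesh can be sketched in a few lines. The topology, node names and shortest-hop policy below are purely illustrative assumptions, not part of Dr Qi’s actual proposal:

```python
from collections import deque

def relay_path(links: dict, src: str, gateway: str):
    """Breadth-first search for a minimum-hop relay chain from src to a
    gateway device that still has cellular coverage (toy sketch)."""
    queue = deque([[src]])
    visited = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == gateway:
            return path
        for neighbour in links.get(path[-1], []):
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(path + [neighbour])
    return None  # no relay chain available; fall back to the cellular network

# A device outside coverage reaches the network via two nearby devices.
mesh = {"A": ["B"], "B": ["A", "C"], "C": ["B", "GW"], "GW": ["C"]}
print(relay_path(mesh, "A", "GW"))  # ['A', 'B', 'C', 'GW']
```

A real P-RAN would of course need radio resource management, security and the incentive mechanisms discussed below; this only shows the basic relaying concept.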
When questioned about what incentive people would have to share their device’s resources with others, Qi responded that the evolution of blockchain and Web 3.0 techniques offered some clues, but that it would be up to the service providers to innovate business models and incentives to make it work.
Second, trends in technology and the industry today will drive changes in networks at a fundamental level as we transition from 5G through 5G-Advanced to what lies beyond. As these technologies mature we can anticipate new capabilities, but also the need for further enabling technologies that will allow for the scale, sustainability and performance demanded by 6G visions.
Richard Li of Futurewei highlighted the threefold challenges of providing a unified ‘network of networks’ across cloud/telecom/satellite and more, at high-performance KPIs and lower environmental costs – demands which seem to be pulling in different directions. He proposed a new IP packet format including metadata on how the packet should be delivered. This ‘contract’ would let the network respond in the most appropriate way to applications with all kinds of different demands for quality of service at the least energy cost.
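The ‘contract’ idea can be illustrated with a small sketch: each packet carries metadata stating what it needs, and the network picks the cheapest path that still honours it. All field names, units and path attributes here are hypothetical, chosen only to illustrate the concept, not taken from any specification:

```python
from dataclasses import dataclass

@dataclass
class DeliveryContract:
    max_latency_ms: float   # latency bound requested by the application
    min_reliability: float  # e.g. 0.99999 for a critical control flow

@dataclass
class Packet:
    contract: DeliveryContract
    payload: bytes

def choose_path(packet: Packet, paths: list) -> dict:
    """Pick the least-energy path that still satisfies the contract."""
    feasible = [p for p in paths
                if p["latency_ms"] <= packet.contract.max_latency_ms
                and p["reliability"] >= packet.contract.min_reliability]
    return min(feasible, key=lambda p: p["energy_cost"])

paths = [
    {"name": "satellite", "latency_ms": 40, "reliability": 0.999, "energy_cost": 5},
    {"name": "fibre", "latency_ms": 8, "reliability": 0.99999, "energy_cost": 2},
    {"name": "low_power", "latency_ms": 200, "reliability": 0.99, "energy_cost": 1},
]
pkt = Packet(DeliveryContract(max_latency_ms=50, min_reliability=0.999), b"data")
print(choose_path(pkt, paths)["name"])  # fibre: cheapest path meeting the contract
```

The low-power path is excluded because it violates the latency bound; among the remaining options the network serves the contract at the least energy cost.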
Moreover, while researchers dive into specific and individual solutions, these need to be combined into a single piece of equipment that actually works. Antennas, which are the speciality of Wonbin Hong, professor at Pohang University of Science and Technology, are a good example.
“I don’t think having antennas prepared in time will be an issue. The actual challenge will be how we can confirm that this piece of hardware fits with the rest,” he said in the session “How far can hardware technology stretch and scale on the way to 6G?”
Beyond The Limits of Space and Time
One of the important implications of this evolution lies in space. While 5G specifications to date enable the use of satellites and high-altitude platforms for backhauling terrestrial traffic, the expected evolution will be towards complete and seamless integration of non-terrestrial connectivity and computing with terrestrial, enabling simple handovers to provide complete coverage and continuity of services.
While this does not sound like a revolutionary technical breakthrough, actually being able to guarantee that an end-user can access data and voice services wherever they happen to be is unprecedented and opens up markets and use-cases that were previously impossible to service.
However, to accomplish this will demand much better time-synchronisation between terrestrial and non-terrestrial networks. Anas Al Rawi of Ofcom emphasised the difficulty of managing synchronisation and timing when using static ground-based towers and non-terrestrial moving nodes.
“We need to start taking account of Doppler effects causing inter-carrier interference in this environment,” Al Rawi pointed out, highlighting the need for an evolution of synchronisation as a concept, which may rely on disruptive technology advances to achieve.
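A back-of-the-envelope calculation shows why this matters. For a carrier frequency f_c and radial velocity v, the Doppler shift is roughly f_d = (v/c)·f_c; for a low-Earth-orbit satellite this shift can be several times the subcarrier spacing that OFDM relies on to keep subcarriers orthogonal. The numbers below are typical illustrative values, not figures from the talk:

```python
C = 3e8             # speed of light, m/s
v = 7500.0          # typical LEO orbital speed, m/s (worst case: fully radial)
f_c = 2e9           # 2 GHz carrier frequency
f_d = v / C * f_c   # Doppler shift in Hz

subcarrier_spacing = 15e3  # baseline 5G NR subcarrier spacing, Hz

print(f"Doppler shift: {f_d / 1e3:.1f} kHz")
print(f"Relative to subcarrier spacing: {f_d / subcarrier_spacing:.1f}x")
```

A shift several times the subcarrier spacing smears energy across neighbouring subcarriers – the inter-carrier interference Al Rawi describes – which is why synchronisation between fixed towers and fast-moving non-terrestrial nodes needs rethinking.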
The concept of technology evolution picked up on discussions during the previous day’s commercial focus. As network technology moves from being hardware-oriented to software-centric, it becomes possible simply to overlay new capabilities onto existing hardware.
This was the central message of a session hosted by the University of Surrey’s Pei Xiao, whose participants are exploring what waveforms the 6G era may bring. They were at pains to underline that the OFDM used in 4G and 5G would not become “legacy” in favour of a different waveform. Instead, we are likely to find a “DevOps” approach emerging where new kinds of waveforms specialised for specific demands are introduced as and when needed.
For example, Ronny Hadani of Cohere Technologies pointed out that OTFS waveforms were well suited to extreme mobility situations. “The signal doesn’t fade and there’s no inter-carrier interference. Second, it allows for joint sensing and communication, with high resolution of reflectors,” he explained. According to him, other advantages include predictability of input-output relations.
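The core of OTFS modulation is that data symbols are placed on a delay-Doppler grid rather than the time-frequency grid OFDM uses, and mapped between the two with a symplectic finite Fourier transform pair. Axis conventions differ between papers; the sketch below follows one common choice and is in no way Cohere’s implementation:

```python
import numpy as np

def isfft(x_dd: np.ndarray) -> np.ndarray:
    """Inverse symplectic FFT: delay-Doppler grid -> time-frequency grid."""
    return np.fft.fft(np.fft.ifft(x_dd, axis=0), axis=1)

def sfft(x_tf: np.ndarray) -> np.ndarray:
    """Symplectic FFT: time-frequency grid -> delay-Doppler grid."""
    return np.fft.fft(np.fft.ifft(x_tf, axis=1), axis=0)

# Round trip: QPSK symbols on a 16x8 delay-Doppler grid survive the
# transform pair unchanged, confirming the two maps are inverses.
rng = np.random.default_rng(0)
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=(16, 8))
assert np.allclose(sfft(isfft(symbols)), symbols)
```

Because each symbol is spread across the whole time-frequency plane, every symbol experiences a near-constant effective channel – the intuition behind Hadani’s “the signal doesn’t fade” remark.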
Technology With a Purpose
If the observations above seem unusual, insofar as the discussion was dominated by concepts that are simple, inexpensive or cost-efficient to apply, you’re right. While there are a hundred quiet technology revolutions ongoing, the focus is firmly on practical applicability and commercial viability. Research for its own sake, or with a “build it and they will come” mentality, was not on show.
This was exemplified well in a session exploring a capability that has been highlighted as a possible advantage of 6G – joint sensing and communications. While this appears to offer a number of commercial opportunities, such as for managing interactions between machines and their environment in a much more precise way than ever before, speakers were keen to underline the need for efficiency and commercial practicability.
For example, Rui Yang of the University of Surrey outlined a project working with the IEEE to deliver good indoor positioning through future iterations of WiFi from 2024, the aim of which is to be backward compatible with existing WiFi rather than replacing it.
Meanwhile, Merouane Debbah of the UAE’s Technology Innovation Institute and Angeliki Alexiou of the University of Piraeus were tackling the challenge of sensing at long range using high-frequency signals, even into the THz range, where they suffer from extensive path loss. Again, the intent would be to enable high-quality sensing without building extensive new infrastructure.
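The scale of that path-loss problem follows directly from the free-space path loss formula, FSPL(dB) = 20·log10(4πdf/c): loss grows with the square of frequency, so moving from 2 GHz up to 300 GHz costs roughly 43 dB over the same distance before any atmospheric absorption is counted. The distances below are illustrative, not figures from the session:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

loss_2ghz = fspl_db(100, 2e9)      # ~78.5 dB over 100 m at 2 GHz
loss_300ghz = fspl_db(100, 300e9)  # ~122.0 dB over 100 m at 300 GHz

print(f"Extra loss at 300 GHz vs 2 GHz: {loss_300ghz - loss_2ghz:.1f} dB")
```

That extra ~43.5 dB – a factor of more than 20,000 in power – is why long-range THz sensing without dense new infrastructure is such a demanding research target.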
While there were a hundred important ideas, such as how to build native AI into networks; signal optimisation for individual devices; the application of quantum cryptography; and going beyond the theoretical limitations of Shannon’s law, there were two strong undercurrents that stretched across different sessions and speakers.
Volker Ziegler of Nokia Bell Labs summarised the first when he described the 6G technology landscape. “Two years ago it was like fresh snow with no tracks. Today there are lots of tracks, but we don’t yet have a proper piste for skiing on. We need to follow some of the more promising tracks. The framing part is done and now we need to go deeper, and this is where industry and academia can collaborate – for example on energy efficiency, which will cut across many specialisations”.
Secondly, 6G’s form is going to follow its function: in this case, enabling telecoms providers to become more flexible in their service offerings, customers and business models than ever before, while serving more of the world more effectively. If technology research has ever been ‘the tail wagging the dog’, it was clear that this is no longer the case. Today’s discussions were all about technology with a purpose.
Alex Lawrence is Managing Editor at 6GWorld. His mission is to bring together stakeholders from across industries, countries and disciplines to make sure that, as technology evolves in the coming decade, it’s meeting the changing demands of society, government and business.
He has been involved as a professional nosy person in the telecoms sphere since 2004, with short detours through industrial O&M and marketing.
If you’d like to talk to Alex about your ideas or projects he’d love to hear from you. @animalawrence or firstname.lastname@example.org.