New Directions in Cybersecurity Needed

Cybersecurity has often taken a back seat in commercial conversations and product development for many reasons, not least that “security guys” tend to be a different group of specialists from those employed in product creation.

Happily, at the recent 6GSymposium, a group of professionals was able to have a frank conversation about the challenges with current 5G security and the scope to bake security in from the start for 6G.

Challenges

The conversation didn’t shy away from the problems faced by the industry in 5G. Greg Young, VP Cybersecurity at Trend Micro, pointed out the essential challenges in a 40-year-old cybersecurity environment.

“A lot of the models that were used for security have been creaking and creaking under this kind of scale and they haven’t worked well. So this is why the White House Executive Order mentioned things like zero trust, which really comes down to saying ‘Hey, let’s not keep doing what we’ve been doing and let’s break the models a bit.’”

Establishing a zero-trust environment is demanding, to say the least. As networks open up through APIs to support interworking with cloud, private networks and enterprise IT, bilateral mechanisms for establishing trust won’t work.

“We’re talking about tens of thousands of factories and businesses. How are we going to establish that bilateral trust between, say, Cisco and a company in Germany that you’ve never heard of?” asked David Rogers, CEO of Copper Horse. 

Jaya Baloo, CISO at Avast, spelled out the need to develop “trusted mechanisms and processes where we can continuously verify all those partners we need to play with to deliver a product.”
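The difference between one-time bilateral trust and the continuous verification Baloo describes can be made concrete in a short sketch. Everything here is hypothetical for illustration – the partner names, the token lifetime, and the policy rules are invented – but it captures the idea that every request re-checks a short-lived credential rather than trusting a partner once at onboarding:

```python
# Hypothetical sketch of continuous verification: instead of trusting a
# partner once, every request re-validates a short-lived credential.
TOKEN_LIFETIME = 300  # seconds; assumed policy value for illustration

def issue_token(partner_id, now):
    """Issue a short-lived credential for a partner that passed verification."""
    return {"partner": partner_id, "issued": now}

def verify_request(token, now, revoked):
    """Re-verify on every call: expiry and revocation are both checked."""
    if token["partner"] in revoked:
        return False
    return (now - token["issued"]) <= TOKEN_LIFETIME

revoked = set()
t = issue_token("factory-42", now=1000)
print(verify_request(t, now=1100, revoked=revoked))   # within lifetime -> True
print(verify_request(t, now=2000, revoked=revoked))   # expired -> False
revoked.add("factory-42")
print(verify_request(t, now=1100, revoked=revoked))   # revoked -> False
```

The point of the sketch is that trust is never a one-off event: the “company in Germany you’ve never heard of” is re-verified on every interaction, and can be cut off the moment its status changes.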

Bob Everson, Cisco’s Senior Director for 5G Architecture, was able to pick up on this concept and noted that “our teams have been working a lot on trust assurance at the hardware layer, so as to establish trust and ensure that the hardware has not been tampered with as well. As technologies like that become more pervasive, we’ll see partnerships across the supply chain to standardise as much as possible of that.”

One challenge to master is that technology, today and tomorrow, is distributed unevenly. Baloo pointed out that “the world matures at different rates and different countries are on different generations. People are different in terms of their cyber maturity as well.”

“It’s not like generation by generation it’s a magical, hard cutoff and everything before it is gone,” Everson noted. “4G is actually still growing. The last estimates I saw was through 2023, 5G is not even going to overtake 4G.”

As a result, any considerations of cybersecurity in 6G will need to be, at least, backward-compatible with previous generations of technology. Young suggested that this could actually be a benefit.

“I think even recognizing the security posture of the handset or the turbine that’s connected – even if it’s on a legacy 5G network at that time – and adding value to the 5G security with new models […] I think we can accommodate it. In fact, we can maybe make 5G better through good 6G security,” he commented.

Nevertheless, the fact remains that today – and for the foreseeable future – consistent cybersecurity is a struggle due to the uneven nature of technology distribution, including in the hands of nation-state actors.

As Baloo noted, critical infrastructure of different states is continuously being attacked. According to her, the geopolitical nature of the TMT industries has led to state-sponsored attacks and the lines blurring between state actors and cybercriminals. “Kim Jong Un is like the OG gangster of ransomware. He netted something like $2 billion [in 2020], so I don’t think we’ve been that successful.”

New Approaches Needed

One important consideration is what Young called a disappearing line between our software-defined networks, the applications they use, and the users. This will be one of the enablers that allows us to improve 5G security through 6G, but it also has a variety of different ramifications.

The first is the opportunity to leverage open systems, which means better-documented interfaces and more ability to build security around them instead of trusting a black box.

“I know that’s going to help us harden the systems more holistically [by] being able to leverage not just 3GPP-centric security practices for the mobile network, but leverage cloud practices, IP networking practices and more IT security practices as well to help secure the system,” Everson commented.

Young also noted that a richer understanding of the context of individual devices and applications will help to reframe the way that security is managed. During the 6GSymposium, he pointed out that the market must move away from “putting things in buckets and hoping everything in that bucket has the same risk level”, and instead start assessing risk levels individually and making decisions based on that.

On the other side from open systems, questions around the role of cybersecurity in standards development were thornier.

“In the past few generations of standards, security is too often left behind as a ‘To Be Determined’,” Young noted. With concerns now being voiced by governments and end users, however, he believes that security can force itself to be accommodated rather than being left aside to be determined later – without hindering the functional elements of 6G deployment.

Part of this will include baking in new approaches to physical security. As Everson pointed out, “at one point all of this infrastructure was almost at the top of the tower. And now – this is not necessarily 6G-specific – we’ve brought down some of the most intelligent parts of the equipment.”

Rogers highlighted the risks this opens up, using the example that professional cyclists have found ways to subvert sensors on their bikes for gain. “So the stuff that’s on the sensing peripheral – if it’s easy to subvert, what effects does that have as it goes further into the network? How much do we rely on that data and how can we fix that problem?”

“I’m really encouraged by the work that the automobile manufacturers are doing with connected car,” Young replied. “They’re taking this kind of attack very seriously, and it’s a good sort of roadmap for wireless connectivity to devices that may be in very hostile environments.”

That said, Everson acknowledged that the network will need to assess context for the devices as well. “The binary sort of ‘good/bad’ has to go. We have to understand security posture and risk better to say that if the device is suspect, or looking at us with side-eye, that is great information. We can make risk decisions based on what kinds of traffic we’re going to put through it.”
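That shift from a binary good/bad verdict to graded risk decisions can be sketched in a few lines. The posture signals, weights, and thresholds below are invented purely for illustration – real systems would use far richer telemetry – but the structure shows the idea: a suspect device isn’t blocked outright, it simply earns a narrower class of traffic:

```python
# Hypothetical per-device risk scoring: rather than labelling a device
# "good" or "bad", combine posture signals into a score and gate traffic
# classes on it. Signal names and weights are invented for illustration.
def risk_score(device):
    score = 0
    if not device.get("patched", False):
        score += 40
    if device.get("anomalous_traffic", False):
        score += 35
    if device.get("unknown_firmware", False):
        score += 25
    return score

def allowed_traffic(device):
    """Map the score to what the network will carry for this device."""
    score = risk_score(device)
    if score < 25:
        return "all"            # full access
    if score < 60:
        return "restricted"     # suspect: low-risk traffic only
    return "quarantine"         # high risk: management traffic only

turbine = {"patched": True, "anomalous_traffic": True}
print(risk_score(turbine), allowed_traffic(turbine))  # 35 restricted
```

The “looking at us with side-eye” device in Everson’s example would land in the middle tier: its suspicious posture is itself useful information, and the network responds proportionately instead of cutting it off.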

Data and Cryptographic Challenges

However, the experts see difficulties in data that still need to be addressed. While we can access richer data than ever before, the quantity and velocity of the data is also increasing. “We’re talking about terabit or petabit core network speeds. How are we going to list stuff for analysis, and how are we going to process it quickly enough through AI and ML algorithms?” Rogers asked.

Part of the challenge will be in using data in a smarter way. Young observed that “we’re not utilizing a lot of even the mid-sized data lakes now for cybersecurity, telemetry, and information… we need much richer telemetry and information than we’re capturing today.”

According to Baloo, sampling flow traffic is also not straightforward, because it depends heavily on the company’s sampling rate and its ability to process data in near-real-time. Immaturity in this area is part of the problem.

“In a perfect world we’re able to have this production traffic go through one channel. We’re able to pipe off a certain percentage of that – only the smart stuff. Then you want to have intelligence detection with MITRE and all kinds of interesting IOC captures and feeds hooked in to know what is happening. We are so far away from that utopia,” she said.
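In code, the sampling trade-off Baloo describes is easy to see: the lower the sampling rate, the more likely a flow matching a known indicator of compromise (IOC) slips through unexamined. The following sketch is hypothetical – the indicator addresses and flow records are made up, and a real pipeline would consume live feeds rather than a hard-coded set:

```python
import random

# Hypothetical flow sampling checked against an IOC feed. At a low
# sampling rate, malicious flows can pass through without inspection.
IOC_FEED = {"203.0.113.7", "198.51.100.9"}  # made-up indicator addresses

def sample_and_match(flows, rate, rng):
    """Inspect only a fraction of flows; return the hits we actually see."""
    hits = []
    for flow in flows:
        if rng.random() < rate:              # sampled for analysis
            if flow["dst"] in IOC_FEED:      # matched against the feed
                hits.append(flow)
    return hits

rng = random.Random(0)  # fixed seed so the sketch is reproducible
flows = [{"dst": "203.0.113.7"}] * 100 + [{"dst": "192.0.2.1"}] * 900
seen = sample_and_match(flows, rate=0.1, rng=rng)
print(f"{len(seen)} of 100 malicious flows caught at a 10% sampling rate")
```

At terabit or petabit core speeds, raising the sampling rate is exactly the part that becomes hard – which is why Baloo frames the “pipe off only the smart stuff” pipeline as a utopia the industry hasn’t reached.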

However, in some ways immaturity is a positive sign. There are clearly ways forward to be developed and improved, given the appetite. A similar point was made in conversations around the oncoming development of quantum computing and the need to build quantum-resistant algorithms into networks to protect data.

Baloo, also Vice-Chair of the European Quantum Flagship programme, observed that many cryptographic algorithms currently in use would be vulnerable to cracking with quantum computers in the coming decade, and finding alternative solutions is a critical process.

“And this is something that we need to start playing with now, even when there isn’t an initial deployment. Also [early solutions] will be potentially broken or buggy. So the name of the game there is just cryptographic agility. We need to roll with it and do a better job of swapping algorithms than we have done,” she commented.

“All algorithms have a certain ‘bake time’ necessary to make sure that they’re as good as we think they are. And just because we can’t break them now, doesn’t mean that there isn’t a computational attack that doesn’t even need quantum theory a few years down the road. So again, crypto agility assumes that we need multiple layers of security.”
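Cryptographic agility can be made concrete as an indirection layer: callers name a policy, not an algorithm, so a broken algorithm can be swapped in one place without touching any call site. A minimal sketch using Python’s standard `hashlib` (the policy names and the swap scenario are invented for illustration):

```python
import hashlib

# Hypothetical crypto-agility layer: callers ask for a *policy*, and the
# mapping from policy to algorithm lives in one swappable table.
POLICY = {
    "integrity": "sha256",   # today's choice
    # If sha256 were ever broken, only this entry changes, e.g. to "sha3_256".
}

def digest(policy, data: bytes) -> str:
    algo = POLICY[policy]
    return hashlib.new(algo, data).hexdigest()

d1 = digest("integrity", b"6G telemetry record")
POLICY["integrity"] = "sha3_256"          # the swap: no caller changes
d2 = digest("integrity", b"6G telemetry record")
print(d1 != d2)  # True: same call site, new algorithm
```

The same indirection applies to signatures and key exchange: Baloo’s “do a better job of swapping algorithms” amounts to designing systems so that this table – not thousands of call sites – is what changes when an algorithm’s bake time runs out.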

Outcomes

It is clear that there is a long road ahead and few definite solutions yet. Overall, however, the panel gave a sense of cautious optimism about the opportunities for improvement. As Baloo said, there’s nothing like a good crisis to drive change, so maybe this is the perfect storm – and it’s amazing what we can accomplish when no one cares who gets credit.

“I think things are bad enough right now that we don’t need to make the case for security, hopefully,” Young commented. “That should give us some space to do better in security and try to break out of some old models.”

Featured image by Philipp Katzenberger/Unsplash
