Ramanath Mallikarjuna, Chief Strategist at Akamai, is talking with 6GWorld about the fact that he lives in a datacentre.
Before getting bogged down in mental images of a sleeping bag among racks of servers, there is a striking point underlying the security expert’s comment.
“The client-server model and the old datacentre that we’re familiar with have gone. It has completely changed, because it’s no longer a one-way request and a one-way response. Traffic is completely bi-directional,” Mallikarjuna explained. “Clients are now requesting action data from the server saying, ‘Hey, I gave you this data. What do I do now? Do I need to do anything?’”
This constant two-way traffic is only likely to develop further. At the recent 6GSymposium, experts such as China Telecom’s Bi Qi have been proposing increasingly distributed network architectures, including sidelink functions to create device-to-device mesh networks.
As a result, Mallikarjuna contends, it changes the nature of many devices in the home and elsewhere – and, in particular, the approach to their security. Describing the ten connected devices around his home, he says “Now I cannot treat these devices just as clients or simply as endpoints, because these are all servers generating data and they need to be protected like we do a datacentre.
“That’s the biggest, biggest watershed moment that’s happening in terms of the network. It’s a bi-directional communication in terms of these entities that are all datacentres.”
Cracks in the System
That’s an interesting statement, but the notion of securing user devices like datacentres deserves to be challenged. Datacentres tend to be professionally managed and secured, kept away from physical intrusion as much as possible. It’s just not feasible to apply the same kinds of physical security to a device in the home (unless a lot changes!). Is it really possible to secure user devices in anything like the same way?
“I think the hardware and firmware industries have done a lot in the past few years to make that portion of it really hard to crack,” Mallikarjuna commented. That said, “We’ve all seen attacks on the hardware itself using lower-level protocols. There’s spoofing of chips within open devices as well. Those are hard problems to solve, and we have to look at all the layers of hardware, firmware and software and harden them.”
Particularly when it comes to software, there has been an awareness that applications can be especially vulnerable. Application developers are often working to a budget, and in many cases don’t have the skills in security systems to make foolproof applications. Meanwhile, airtight security may simply not be a consideration for some of the people commissioning applications. So we have a situation where a highly distributed system has built-in vulnerabilities.
Mallikarjuna agrees, but highlights what can be done to address that. “Platforms like Akamai, the cloud platforms, or the compute and edge platforms can do a lot in terms of providing a more secure framework,” he pointed out.
“At the same time, there’s a lot of development in DevSecOps and areas where security and operations are included in the app development itself. This is a good trend and platforms like Akamai can support that.”
Even so, Mallikarjuna highlighted that nobody can afford to be complacent, including end users.
“A simple click on a mobile phone when you’re in a hurry, for example while dropping your kids off at school, can result in a major attack on an enterprise. Even the most educated and the most aware professional in the industry could do that,” he said.
“Yes, you need systems to automate and be able to save cost, but at the same time you need education to make people aware so that, with every action they take, there’s a kind of ingrained security mentality within them.”
The growing attack surface that networks face – with more distributed compute and storage capabilities all the way from the device to the core, and interworking with third parties – presents an increasing threat to security. However, there is a chance to change mindsets and view this as an opportunity too.
“I think the answer lies in containing the impact of these attacks to where it happens,” Mallikarjuna argued, pointing out that the content delivery networks Akamai was built upon help with this.
“It’s because of our approach with distributed computing, where we assume that our servers are going to crash sooner or later. When you assume that, you have backup systems that can take over seamlessly, and that’s just the nature of a distributed system.
“If we try to contain attacks to the systems where it’s happening in a distributed manner then that approach not only helps to contain the attacks, but also gives you enough compute power and resource in that local system to deal with the attack.”
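The containment idea Mallikarjuna describes can be sketched in a few lines of code. The following is a purely illustrative Python model (the `Node` and `Cluster` names are hypothetical, not any real Akamai API): when an alert fires, the affected node is quarantined so the attack cannot spread, while the remaining nodes keep serving traffic.

```python
# Illustrative sketch of attack containment in a distributed node pool.
# All class and node names here are hypothetical.

class Node:
    def __init__(self, name):
        self.name = name
        self.compromised = False

class Cluster:
    def __init__(self, nodes):
        self.nodes = nodes          # nodes currently serving traffic
        self.quarantined = []       # nodes isolated for investigation

    def handle_alert(self, node):
        # Isolate the affected node so the attack stays local,
        # leaving its resources available to analyse the incident.
        self.nodes.remove(node)
        self.quarantined.append(node)

    def healthy_nodes(self):
        return [n.name for n in self.nodes]

cluster = Cluster([Node("edge-1"), Node("edge-2"), Node("edge-3")])
bad = cluster.nodes[0]
bad.compromised = True
cluster.handle_alert(bad)
# The remaining nodes take over seamlessly, as in any distributed
# system designed on the assumption that servers will eventually fail.
```

The design choice mirrors the quote: assume failure is inevitable, so isolation plus seamless failover is the default behaviour rather than an exceptional path.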
Dismissing Distributed Legends
Speaking of distributed systems and security, there has been a good deal of enthusiasm about blockchain and similar DLT [Distributed Ledger Technology] systems as a possible method to build resilience. However, so far little has come to market using blockchain-based security. Why would that be?
“The industry so far has not seen many solutions that can address the scale and speed that’s required for the use cases we are talking about in the enterprise context,” Mallikarjuna explained.
That is not necessarily a decisive objection, however; technology has always started out at higher cost and lower efficiency, and then improved from there.
“DLTs definitely have a role to play in validating that a particular transaction happened in a certain way; and if there’s some tampering with that, then you can go back to an earlier state. That’s absolutely right,” Mallikarjuna conceded.
“However, the ecosystems themselves are not inherently secure. You can put bad data into a DLT system and then that stays there. It’s only as good as the input.”
This has been, in fact, part of the problem with using blockchain-based systems to trace supply chains for goods such as organic foods. While the blockchain may prove to be immutable, it does not prevent people from inputting false data or making mistakes, meaning that the tracking is no more accurate than under other systems of record.
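The distinction between immutability and truthfulness can be seen in a minimal hash chain, which is the core property a ledger provides. This is a toy sketch, not a real DLT implementation: tampering with a recorded entry is detectable, but a false claim recorded at the start is preserved faithfully forever.

```python
import hashlib
import json

# Toy append-only hash chain. Illustrative only; real DLTs add
# consensus, signatures and much more.

def block_hash(prev_hash, data):
    payload = prev_hash + json.dumps(data, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

class Ledger:
    def __init__(self):
        self.chain = []  # list of (hash, data) pairs

    def append(self, data):
        prev = self.chain[-1][0] if self.chain else "genesis"
        self.chain.append((block_hash(prev, data), data))

    def verify(self):
        prev = "genesis"
        for h, data in self.chain:
            if h != block_hash(prev, data):
                return False  # tampering detected
            prev = h
        return True

ledger = Ledger()
# A false claim at input time is accepted and immutably recorded:
ledger.append({"batch": 42, "origin": "certified organic"})
assert ledger.verify()  # the chain is intact, the lie included

# Rewriting history after the fact, by contrast, is caught:
ledger.chain[0] = (ledger.chain[0][0], {"batch": 42, "origin": "conventional"})
assert not ledger.verify()
```

In other words, the chain guarantees what was written stays written; it says nothing about whether what was written was ever true.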
“And even so, security is a big risk. Even if you trust the system completely you’ve still got to make sure that it’s secure.”
Tackling the ‘Home Datacentre’ Problem
So, then, what systems are likely to be successful in a distributed environment where traffic is constantly two-way? Where would Mallikarjuna place his bets?
“I think some of the solutions that we already have would still apply in the new paradigm, but of course have to be modified and adapted. For example, defences against DDoS [Distributed Denial of Service] have to be modified to make sure that we work on a bi-directional flow, not uni-directional,” he began.
“Bot management and mitigation now has to address a two-way communication. It has to identify good versus bad entities and offer protection, but now it has to do it at a scale and speed that was previously unseen, because you’re talking about APIs that are linking to other entities in less than a few milliseconds.”
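One common building block of the DDoS and bot mitigation Mallikarjuna mentions is per-client rate limiting. The token bucket below is a standard textbook primitive, sketched here with illustrative parameters; production systems would apply something like this per client, per API, in both directions of the flow.

```python
import time

# A token-bucket rate limiter, a standard primitive in DDoS/bot
# mitigation. Parameters are illustrative.

class TokenBucket:
    def __init__(self, rate, capacity):
        self.rate = rate            # tokens refilled per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=5, capacity=10)
results = [bucket.allow() for _ in range(15)]
# A burst of 10 requests passes; the excess is throttled until
# tokens refill at the configured rate.
```

The same shape applies bi-directionally: the bucket can meter requests arriving at a device-as-server just as easily as requests it sends out.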
All of this may seem like an evolutionary approach – and to a great degree that is true in terms of processes. The real revolution lies in how they are applied, and will require intense work to bring to fruition. The same could be said of Mallikarjuna’s next set of key topics. What are these?
“Zero-trust solutions, where we provide passwordless authentication, multifactor authentication and remote access to systems along with micro-segmentation within enterprises. These are applied to enterprise datacentres today, but there’s huge applicability in the new ‘datacentre everywhere’ world because the micro-segmentation can be that specific,” Mallikarjuna argued.
This micro-segmentation can prevent intruders from being able to move laterally within a device or organisation to uncover fresh data or applications that deliver value to the hacker.
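At its core, micro-segmentation is a default-deny policy between workloads: nothing talks to anything unless an explicit rule allows it. The sketch below uses hypothetical workload names to show why a compromised entry point cannot roam laterally.

```python
# Illustrative default-deny segmentation policy. Workload names
# and flows are hypothetical.

ALLOWED_FLOWS = {
    ("web-frontend", "order-api"),
    ("order-api", "orders-db"),
}

def is_allowed(src, dst):
    # Default deny: only explicitly permitted (src, dst) pairs pass.
    return (src, dst) in ALLOWED_FLOWS

# Legitimate hop along the approved path:
assert is_allowed("web-frontend", "order-api")

# A compromised frontend cannot reach the database directly,
# because no rule permits that lateral movement:
assert not is_allowed("web-frontend", "orders-db")
```

The finer the segments, the smaller the blast radius when any single device or workload is breached, which is exactly why the approach scales down to a “datacentre everywhere” world.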
Overall, Mallikarjuna is cautiously optimistic.
“There’s a lot of hard work to be done. People like me are pretty sceptical about how all of these things work, but looking at everything coming ahead, it could be brilliant if it all works together.”
Alex Lawrence is Managing Editor at 6GWorld. His mission is to bring together stakeholders from across industries, countries and disciplines to make sure that, as technology evolves in the coming decade, it’s meeting the changing demands of society, government and business.
He has been involved as a professional nosy person in the telecoms sphere since 2004, with short detours through industrial O&M and marketing.
If you’d like to talk to Alex about your ideas or projects he’d love to hear from you. @animalawrence or email@example.com.