Confidential computing is still a relatively new concept, but security requirements, combined with technological progress, will make it the rule rather than the exception within a few years.
In short, confidential computing is the next step in end-to-end encryption. Belgium, too, is tinkering with tools that keep our data encrypted right up to the moment it is loaded into working memory for processing.
The principle of encrypting data while it is stored or sent has long been familiar. With confidential computing, however, that data is only decrypted at the very last moment, inside a special enclave (a ‘trusted execution environment’ or TEE). The system relies on special chips that both Intel and AMD are currently focusing on. This should prevent malware, for example, from stealing unencrypted data from the RAM or internal memory.
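In practice, such an application is split into an untrusted host program and the enclave itself. Here is a minimal sketch of the host side, assuming the Intel SGX SDK is installed; the enclave binary enclave.signed.so and the ECALL ecall_process() are hypothetical names used for illustration, not code from the whitepaper:

```c
/* Minimal sketch of the untrusted host side of an SGX application.
 * Assumes the Intel SGX SDK; "enclave.signed.so" is a hypothetical
 * enclave built from an EDL declaring:
 *   public int ecall_process([in, size=len] const uint8_t *ciphertext, size_t len);
 */
#include <stdio.h>
#include <stdint.h>
#include "sgx_urts.h"
#include "enclave_u.h"  /* proxy functions generated by sgx_edger8r */

int main(void) {
    sgx_enclave_id_t eid = 0;

    /* Load the enclave; the CPU measures and verifies its signed code. */
    sgx_status_t ret = sgx_create_enclave("enclave.signed.so", 1 /* debug */,
                                          NULL, NULL, &eid, NULL);
    if (ret != SGX_SUCCESS) {
        fprintf(stderr, "enclave creation failed: 0x%x\n", ret);
        return 1;
    }

    /* Ciphertext as read from disk or the network: never decrypted here. */
    uint8_t ciphertext[] = { 0x8f, 0x3a, 0x21 };
    int result = 0;

    /* ECALL: control transfers into the enclave, where the data is
     * decrypted and processed inside the encrypted memory region (EPC). */
    ret = ecall_process(eid, &result, ciphertext, sizeof ciphertext);

    sgx_destroy_enclave(eid);
    return (ret == SGX_SUCCESS) ? 0 : 1;
}
```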
That’s it for the brief explanation. Accenture Belgium, which recently released a whitepaper together with Intel, believes that the technology is gradually becoming mature enough to be applied on a large scale. Read: in large data centers. “In that paper we explain how you as a hyperscaler can move to confidential computing, with the complete stack reaching right down to the chip,” says Frederik De Ryck, Strategic Cyber Security at Accenture. The authors of the white paper do this from three points of view: that of Intel, that of Scone (a company that creates an intermediate layer with containers) and that of Accenture, which wants to form the bridge between the hardware player and the end customer or user.
“With confidential computing, Intel is building a CPU with two keys: a known one and an unknown one that is built in at the factory. They use these to generate encryption keys for in-memory encryption, so everything in that memory block can only be read by that specific CPU,” says Frederik De Ryck. If that data is dumped from memory, for example in the event of a crash, it remains encrypted and therefore unreadable. That is the concept behind Intel SGX (Software Guard Extensions).
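Those same CPU-bound keys also let an enclave ‘seal’ data for storage outside the enclave. A sketch of the trusted (in-enclave) side, again assuming the Intel SGX SDK; seal_secret() is a hypothetical helper, not an SDK function:

```c
/* In-enclave sketch of SGX "sealing" (Intel SGX SDK, sgx_tseal.h).
 * seal_secret() is a hypothetical helper for illustration only. */
#include <stdint.h>
#include "sgx_tseal.h"

sgx_status_t seal_secret(const uint8_t *secret, uint32_t secret_len,
                         sgx_sealed_data_t *out, uint32_t out_capacity) {
    /* Size of the sealed blob: ciphertext plus MAC and metadata. */
    uint32_t sealed_size = sgx_calc_sealed_data_size(0, secret_len);
    if (sealed_size == UINT32_MAX || sealed_size > out_capacity)
        return SGX_ERROR_INVALID_PARAMETER;

    /* The sealing key is derived inside the CPU from its factory-fused
     * root key plus the enclave's identity; it never exists in software.
     * A sealed blob dumped from RAM or disk is useless on any other chip. */
    return sgx_seal_data(0, NULL, secret_len, secret, sealed_size, out);
}
```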
Accenture has developed a number of concepts on top of this, including using the system in a hypervisor. One of the possible functions of confidential computing is ‘attestation’, whereby the CPU can prove that the data is being processed in the right place. “That is also signed, so you can trace processing back to a specific container and chip,” says De Ryck. The aim is to make the entire chain traceable.
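A sketch of how an enclave might feed that chain, assuming the Intel SGX SDK; make_report() is a hypothetical wrapper around the SDK's report API, whose output a quoting enclave can turn into a remotely verifiable, signed quote:

```c
/* In-enclave sketch of attestation with the Intel SGX SDK (sgx_utils.h).
 * make_report() is a hypothetical wrapper for illustration only. */
#include <string.h>
#include <stdint.h>
#include "sgx_utils.h"

sgx_status_t make_report(const sgx_target_info_t *qe_target, /* quoting enclave */
                         const uint8_t result_hash[32],      /* hash of our output */
                         sgx_report_t *report) {
    sgx_report_data_t report_data;
    memset(&report_data, 0, sizeof report_data);

    /* Bind a hash of the processed data into the report, so the
     * attestation covers this specific computation. */
    memcpy(report_data.d, result_hash, 32);

    /* The CPU MACs the report over the enclave's code measurement
     * (MRENCLAVE): verifiably traceable to a specific chip and binary. */
    return sgx_create_report(qe_target, &report_data, report);
}
```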
“The concept of keeping data ‘confidential’ for as long as possible has been around for some time. But we are looking for something that remains scalable and is not slowed down too much by encryption. This approach allows you to put things together while not only guaranteeing security, but also scalability and speed,” says Valerie Tanghe, managing director at Accenture.
Basic hygiene remains necessary
With the white paper, Accenture mainly wants to indicate what exists and is possible today. “This is not something we can do alone, but we do want to show what is currently possible,” says De Ryck. At the same time, he also warns that confidential computing is not a replacement for what exists today. “If you allow ransomware into your enclave because the hash says it is good, then you are not safe. Your various other security layers are also not suddenly redundant. You have to maintain your basic hygiene.”
What is new is that confidential computing is gradually taking the steps needed to be deployed on a large scale. “The SGX chipsets from three years ago could load 128 megabytes into memory. If you have to process twenty terabytes of data for an algorithm, that can take months. The chipset that Intel released this year can handle 500 gigabytes per chip platform, so you can do a lot more with it.”
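To put those figures in perspective, a rough back-of-the-envelope calculation using only the quoted capacities (it ignores the paging and re-encryption overhead that dominates in practice):

\[
\frac{20\ \text{TB}}{128\ \text{MB}} = \frac{20 \times 1024 \times 1024\ \text{MB}}{128\ \text{MB}} = 163{,}840\ \text{enclave fills}
\qquad\text{versus}\qquad
\frac{20\ \text{TB}}{500\ \text{GB}} \approx 41
\]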
De Ryck points out that companies want to combine different types of data, so there is already a need to make confidential computing scalable: “But you don’t want to do that with ten thousand chip platforms; with a hundred, many use cases come within reach.” The ecosystem is starting to take shape, which means that confidential computing as a concept is on the verge of a breakthrough. Exactly what that will look like, no one yet knows.
“Now we see parties such as Scone, Decentriq or Fortanix with containers, but there is a chance that the layers above and below them become so transparent that the intermediate layer can disappear,” De Ryck adds. At the same time, there remains a certain need to involve different parties, a bit like a website’s certificates being signed by an external player.
“Suppose you want to do something on Azure, but you don’t want Microsoft to have access to that data. Then such an intermediate layer is useful, because otherwise everything would run directly on Microsoft hardware. At Amazon it is a different story: they also make their own chips, but that makes it more difficult, because a large part of the components is entirely in their hands.”
Intel, for its part, is already transparent: its design for confidential computing is public. “You can literally ask whether the chip you are using comes from a certain factory. Is that trust inviolable? No, but it grows because you verify it with different parties.”
The philosophy behind it resembles that of blockchain, although technically the two are completely different. “In a blockchain it is important to know what the last message was, because in a decentralized system there can be disagreement about what exactly that was. Scone uses this concept in its software too. At the RAM level, for example: if the CPU needs pieces of data in RAM, it loads them in small chunks. What Scone does there is check the hashes of all that data, to make sure you have the latest version at all times. There are various systems around versioning, just as with blockchain, that ensure you can trust something.”
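The underlying idea is a hash chain over versions. This is a conceptual sketch, not Scone’s actual implementation, written in C against OpenSSL’s (legacy but still available) SHA-256 API: each new version’s hash covers the data plus the previous hash, so a stale or replayed chunk is detected because the chain head no longer matches.

```c
/* Conceptual sketch of hash-chained versioning (not Scone's code).
 * Requires OpenSSL's legacy SHA-256 API (link with -lcrypto). */
#include <string.h>
#include <openssl/sha.h>

typedef struct {
    unsigned char head[SHA256_DIGEST_LENGTH]; /* latest trusted version hash */
} chain_t;

/* Fold a new version of the data into the chain. */
void chain_update(chain_t *c, const unsigned char *data, size_t len) {
    SHA256_CTX ctx;
    SHA256_Init(&ctx);
    SHA256_Update(&ctx, c->head, sizeof c->head); /* bind to previous version */
    SHA256_Update(&ctx, data, len);
    SHA256_Final(c->head, &ctx);
}

/* Freshness check: does a candidate hash match the trusted chain head? */
int chain_is_latest(const chain_t *c, const unsigned char *candidate) {
    return memcmp(c->head, candidate, SHA256_DIGEST_LENGTH) == 0;
}
```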
Standard
The fact that the technology is now reaching cruising speed is mainly due to the increase in computing power. CPUs are getting better and faster, memory is getting larger, caches are growing, and so today’s bottlenecks are gradually being resolved. “That really ensures scalability,” says De Ryck. “Certainly with a hyperscaler, you are not always going to need that computing power, so in a data center it is useful to bring it together and offer it on a large scale.”
According to De Ryck, these technological advances also create opportunities to raise the bar for sensitive data that today is stored and processed securely, but not as securely as with confidential computing. “Why wouldn’t we put all our health data there? Of course, your basic hygiene must first be in order security-wise. But as far as I am concerned, confidential computing should become part of that basic hygiene as a government standard.” De Ryck points to Germany, among other examples, where citizens can only request their own data from the systems using a key on their identity card.
Will confidential computing become something that companies will use for very specific scenarios, or can it become a standard in about five years, like encryption already is in many apps? “Two years ago, the hyperscalers and Gartner started saying that everything would become confidential and that within five years everything would be running confidentially in their cloud. We are now halfway through and things are getting going, but technically we are not there yet,” says De Ryck. “The large chips will be on the market at the end of this year.”
Although he also qualifies that skepticism: “I am convinced that we are heading in that direction. The standard may not yet be available everywhere, but there is a need for it. If you previously had a storage account that was not locked down, it made little difference as long as no one knew about it. Today it has to be locked down, because attackers scan for such open environments every day. That is why there is extra security in service provision. Big companies are going to expect it, the EU is going to expect it, and the resources will be there.”