Monday, September 2, 2024

3 Golden Use Cases For Confidential Computing

 


Happy Labor Day everybody!!!  As we now loaf our way toward the 4th quarter of this year, Cybersecurity is going to be gaining more attention.  The primary fuel for this will be the Presidential Election that is coming up in just two months.  There is widespread fear of voter fraud and concern over the proper identification of voters, and the biggest worry now is the impact that Generative AI will have.  It has evolved very quickly since the last election, and some of the biggest fears are as follows:

*Widespread use of Deepfakes

*A huge uptick in Phishing-based emails

*Spoofed and phony websites asking for campaign donations

Apart from the other ways of mitigating these risks that I have written about before, I came across a new concept today that I had never heard of before.  It is called “Confidential Computing”.  A technical definition of it is as follows:

“Confidential computing technology isolates sensitive data in a protected CPU enclave during processing. The contents of the enclave, which include the data being processed and the techniques that are used to process it, are accessible only to authorized programming codes. They are invisible and unknowable to anything or anyone else, including the cloud provider.”

(SOURCE:  https://www.ibm.com/topics/confidential-computing).

Put another way, it uses specialized parts of the Central Processing Unit (CPU) to protect your most sensitive datasets.  But the trick here is that only the datasets currently being processed are shielded from prying eyes, such as those of the Cyberattacker.  More details on it can also be found at this link:

https://www.darkreading.com/cyber-risk/how-confidential-computing-can-change-cybersecurity
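To make the idea a little more concrete, here is a minimal Python sketch of the workflow.  It is purely conceptual: the ToyEnclave class is hypothetical and only simulates what hardware TEEs such as Intel SGX, AMD SEV, or AWS Nitro Enclaves enforce in silicon, namely that the data is only ever decrypted inside the protected boundary.

# Toy illustration of the Confidential Computing idea: data stays encrypted
# everywhere except inside a protected "enclave", where it is decrypted,
# processed, and re-encrypted before it leaves.  Real TEEs enforce this
# boundary in hardware; this class only simulates the workflow.
from cryptography.fernet import Fernet

class ToyEnclave:
    """Hypothetical stand-in for a hardware-protected CPU enclave."""

    def __init__(self) -> None:
        # In a real TEE the key never leaves the protected memory region.
        self._cipher = Fernet(Fernet.generate_key())

    def seal(self, plaintext: bytes) -> bytes:
        """Encrypt data so it is unreadable outside the enclave."""
        return self._cipher.encrypt(plaintext)

    def process(self, sealed_blob: bytes) -> bytes:
        """Decrypt, run the sensitive computation, and re-seal the result."""
        plaintext = self._cipher.decrypt(sealed_blob)   # only happens "in-enclave"
        result = plaintext.upper()                      # the sensitive processing step
        return self._cipher.encrypt(result)             # sealed again before it exits

if __name__ == "__main__":
    enclave = ToyEnclave()
    sealed = enclave.seal(b"customer SSN: 123-45-6789")
    print("What the host / cloud provider sees:", sealed[:40], b"...")
    print("Sealed result of the processing:", enclave.process(sealed)[:40], b"...")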

So, why should you consider making use of this technique for your business?  Here are three compelling reasons:

1)     Compliance:

The fuel that feeds Generative AI is datasets.  It needs a lot of them not only to start learning, but also on an ongoing basis in order to create the most robust set of outputs possible.  Because of this, data theft and data leakage have become much more prevalent, and the Cyberattacker is taking full advantage of it.  As a result, the major data privacy laws, such as the GDPR, the CCPA, HIPAA, etc., have now included the use of datasets in Generative AI models in their tenets and provisions of compliance.  This is still a rather murky area, but by using Confidential Computing you will have some reasonable assurance of reaching a degree of compliance with these laws.  This is especially advantageous to those businesses who conduct a lot of e-commerce-based transactions, or who process a lot of financial information and data.

2)     Cloud:

Whether you make use of AWS or Microsoft Azure, data leakages are a common threat, and ultimately, you will be held responsible for anything that occurs.  Not the Cloud Provider, as many people believe!!!  While these two give you out-of-the-box tools to protect your datasets, you are responsible for their proper configuration.  But whatever you make use of, ensure that even in this kind of environment you have deployed Confidential Computing.  To do this, make sure that you have implemented what is known as the “Trusted Execution Environment” (TEE).  This is the secure area of your CPU, whether it is physical or virtual.  It makes use of both public and private keys, and mechanisms are established within it to mitigate the risk of a malicious party intercepting them.
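To illustrate the role those public and private keys play, here is a small, hedged Python sketch using the open-source cryptography library.  It assumes the enclave generates an RSA keypair and that the client already trusts the public key; in a real deployment that trust would come from remote attestation, which is omitted here.

# The enclave holds a private key that never leaves it and publishes the
# public key (normally inside a signed attestation report).  The client
# encrypts its sensitive record to that public key, so neither the cloud
# provider nor anyone else on the host can read it.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Inside the TEE: generate the keypair; the private half stays put.
enclave_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
enclave_public_key = enclave_private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# On the client: encrypt the sensitive record to the enclave's public key.
record = b"card=4111111111111111;amount=25.00"
ciphertext = enclave_public_key.encrypt(record, oaep)

# Back inside the TEE: only the enclave's private key can recover it.
assert enclave_private_key.decrypt(ciphertext, oaep) == record
print("The host only ever sees ciphertext:", ciphertext[:32].hex(), "...")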

3)     AI:

As was mentioned earlier in this blog, Generative AI models need tons of datasets to train on so that they can learn effectively.  But once again, you are responsible for the safekeeping of them!!!  Yes, another way to make this happen, at least to some extent, is once again to use Confidential Computing.  It also helps to provide assurances that the datasets you feed into the model are authentic, and not fake.  This is something that you must address now if you make use of AI, or any subset of it, in your business.  The downside is that in a recent survey conducted by Code42, 89% of the respondents believed that using new AI methodologies is actually making their datasets much more vulnerable.
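As one small, hedged example of checking dataset authenticity before training, the Python sketch below verifies an HMAC-SHA256 tag that the data owner would publish alongside the file.  The key name, the file handling, and the out-of-band key provisioning are my own assumptions, not part of any particular product.

# The data owner publishes an HMAC-SHA256 tag for the dataset, and the
# training pipeline refuses to load any file whose tag does not match.
import hashlib
import hmac
from pathlib import Path

SIGNING_KEY = b"replace-with-a-key-provisioned-out-of-band"

def dataset_tag(path: Path) -> str:
    """Compute an HMAC-SHA256 tag over the dataset file, streaming in chunks."""
    mac = hmac.new(SIGNING_KEY, digestmod=hashlib.sha256)
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(64 * 1024), b""):
            mac.update(chunk)
    return mac.hexdigest()

def verify_before_training(path: Path, expected_tag: str) -> None:
    """Raise if the dataset does not match the tag published by its owner."""
    if not hmac.compare_digest(dataset_tag(path), expected_tag):
        raise ValueError(f"{path} failed its integrity check -- refusing to train on it")
    print(f"{path} verified, safe to load into the training pipeline")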

My Thoughts On This:

As you can glean from this blog, the protection of your datasets should be one of the top priorities for the CISO and their IT Security team.  It’s not just compliance that you have to look out for, it’s also the reputational damage that your company will suffer if you are hit with a Data Exfiltration attack.  After all, it can take months to win a new customer, but only mere minutes to lose them.

By making use of Confidential Computing, you can provide one very strong layer of assurance to your customers and prospects that you are taking a proactive approach to safeguarding the data that they entrust you with.

Finally, this blog has mostly talked about data that is being processed.  There are two other types of datasets that need to have careful attention paid to them as well, and they are:

*Data At Rest:  These are the datasets that are simply residing in a database and are not being used for any particular purpose.  They are just “archived”.

*Data In Motion:  These are the datasets that are being transmitted from one system to another, such as going from a server in one location to another server in a different location.  (A short sketch covering both of these follows below.)
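For completeness, here is a short, illustrative Python sketch of how these other two states are commonly protected: symmetric encryption before the data is archived, and TLS for the transfer between systems.  The host name and file names are placeholders only.

# Data At Rest: encrypt before archiving, so only ciphertext touches disk.
import socket
import ssl
from cryptography.fernet import Fernet

storage_key = Fernet.generate_key()          # keep this in a key vault, not in code
archived_blob = Fernet(storage_key).encrypt(b"customer list: ...")
with open("customers.enc", "wb") as fh:
    fh.write(archived_blob)

# Data In Motion: only talk to the other system over TLS.
context = ssl.create_default_context()       # verifies the server certificate
with socket.create_connection(("example.com", 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname="example.com") as tls_sock:
        print("Negotiated", tls_sock.version(), "to protect the data in transit")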

