LLMs & the Future of Privacy – Homomorphic Encryption

Today, most corporations are exploring how they can leverage services like OpenAI. When considering these services, privacy and cybersecurity are at the forefront of leadership’s minds.


One up-and-coming cryptographic technique that could solve some of these challenges is known as homomorphic encryption. The idea is not new: it was first introduced in the late 1970s by Rivest, Adleman, and Dertouzos, but challenges ranging from computational complexity to a lack of standardization kept it from becoming mainstream. Thanks to increasing computing power, improved algorithms and techniques, and standardization efforts, advances over the last decade have collectively made homomorphic encryption more practical and viable for real-world applications.

Traditionally, when we think about encryption, we think of techniques like AES and DES, which require data to be decrypted before it can be processed or analyzed. The need to decrypt the data exposes it to potential security risks.

What if you didn’t have to decrypt the data? It sounds impossible, but that’s where homomorphic encryption comes into play. It addresses this problem by enabling computations to be performed directly on encrypted data, eliminating the need for decryption and reducing the risk of exposing sensitive information during processing.

Homomorphic encryption also enables secure collaboration and outsourced computation: multiple parties can perform calculations on encrypted data without the underlying data ever being revealed to them.

This technology can transform various domains, such as healthcare, finance, and data analytics, by enabling secure computations of sensitive information. It allows corporations to leverage the power of cloud computing while preserving privacy and keeping data secure.

An example where homomorphic encryption is valuable is the secure outsourcing of computations. Consider a financial institution that wants to outsource complex calculations to a third-party service provider. Instead of sharing the raw data, the institution can encrypt the sensitive information and send it to the provider. The provider can perform computations on the encrypted data using homomorphic encryption techniques. The results are then returned in encrypted form and can be decrypted by the institution, ensuring the privacy and security of the data throughout the process.
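To make that flow concrete, here is a minimal sketch of the outsourcing pattern using the open-source python-paillier (phe) library, an additively homomorphic scheme. The roles, variable names, and sample figures are illustrative assumptions, not a production design.

```python
# pip install phe  -- python-paillier, an additively homomorphic encryption scheme
from phe import paillier

# --- Financial institution (data owner): encrypts before sharing ---
public_key, private_key = paillier.generate_paillier_keypair()
transactions = [1200.50, -340.25, 987.00]           # illustrative figures
encrypted_txns = [public_key.encrypt(t) for t in transactions]

# --- Third-party provider: sees only ciphertexts, never the raw values ---
def provider_total(ciphertexts):
    total = ciphertexts[0]
    for c in ciphertexts[1:]:
        total = total + c                            # homomorphic addition
    return total

encrypted_total = provider_total(encrypted_txns)

# --- Institution: decrypts the returned result locally ---
print(private_key.decrypt(encrypted_total))          # 1847.25
```

Note that Paillier supports only addition (and multiplication by plaintext constants); more complex analytics would require a fully homomorphic scheme, as described below.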

There are different types of homomorphic encryption schemes, including partially homomorphic encryption and fully homomorphic encryption:

  • Partially Homomorphic Encryption: A partially homomorphic scheme can perform only certain computations on encrypted data. For example, a scheme may support either addition or multiplication on encrypted data, but not both (see the sketch after this list).
  • Fully Homomorphic Encryption: Fully homomorphic encryption (FHE) allows for arbitrary computations on encrypted data. You can apply a series of mathematical operations, such as addition and multiplication, on encrypted values without decryption. The result of these computations remains encrypted, and when decrypted, it yields the same result as if the operations were performed on the original unencrypted data.
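To show what partial homomorphism looks like under the hood, here is a deliberately insecure toy version of the Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts adds the underlying plaintexts, but there is no corresponding operation for multiplying them. The tiny hard-coded primes are purely illustrative; real deployments use large random primes and a vetted library.

```python
# Toy Paillier cryptosystem (additively homomorphic) -- illustration only, NOT secure.
from math import gcd

# Toy key material: two small primes (real keys use primes of 1024+ bits).
p, q = 1789, 1861
n = p * q
n_sq = n * n
g = n + 1                                        # standard choice of generator
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)     # lambda = lcm(p-1, q-1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n_sq)), -1, n)            # modular inverse (Python 3.8+)

def encrypt(m, r):
    # c = g^m * r^n mod n^2, with r random and coprime to n
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c):
    return (L(pow(c, lam, n_sq)) * mu) % n

c1 = encrypt(42, 123)
c2 = encrypt(58, 456)

# Multiplying ciphertexts corresponds to ADDING the plaintexts...
c_sum = (c1 * c2) % n_sq
print(decrypt(c_sum))                            # 100, obtained without decrypting c1 or c2
```

A fully homomorphic scheme, by contrast, supports both addition and multiplication on ciphertexts, which is enough to evaluate arbitrary functions.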

FHE is particularly challenging to implement due to the need for complex mathematical operations that maintain the integrity of the encrypted data. However, advancements in the field have led to the development of practical FHE schemes, albeit with some performance and efficiency limitations.

In 2009, Craig Gentry achieved a significant breakthrough: the first fully homomorphic encryption scheme. Before then, encryption schemes supported either addition or multiplication on encrypted data, but not both. This limitation meant they were only partially homomorphic and could not be used to compute arbitrary functions on encrypted data.

The new scheme was based on ideal lattices and allowed addition and multiplication operations to be performed on encrypted data, making it possible to compute arbitrary functions without needing to decrypt the data first.

This breakthrough opened up new possibilities for the secure computation of encrypted data, including privacy-preserving data analysis, secure cloud computing, and more.

That said, the initial versions of FHE were impractical due to their high computational complexity. Since then, much research has been done to optimize FHE schemes, and while they are still computationally intensive compared to non-encrypted computation, their efficiency has improved significantly, making them more practical for certain applications.

FHE enables computations on encrypted data, allowing for broader privacy-preserving applications. While performance has improved significantly, running large language models (LLMs) with FHE remains cost-prohibitive.

Estimates show that generating one encrypted LLM token would require up to 1 billion fully homomorphic encryption operations, known as bootstrapping operations. On today’s CPUs, each bootstrapping operation costs only a tiny fraction of a cent, but a billion of them per token adds up quickly: at that rate, each token would cost around $5,000.
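The back-of-the-envelope arithmetic behind those figures, using the estimates above as assumed inputs:

```python
# Rough token-cost arithmetic, using the estimates quoted above as assumptions.
bootstraps_per_token = 1_000_000_000      # ~1 billion bootstrapping operations per token
cost_per_token_today = 5_000.0            # dollars per token on current CPUs

cost_per_bootstrap = cost_per_token_today / bootstraps_per_token
print(f"${cost_per_bootstrap:.6f} per bootstrap")     # $0.000005 -- well under a cent

target_cost_per_token = 0.01              # the goal: one-cent tokens
speedup_needed = cost_per_token_today / target_cost_per_token
print(f"{speedup_needed:,.0f}x improvement needed")   # 500,000x
```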

However, reaching the 500,000x improvement needed for $0.01 tokens is achievable through three key trends:

  • LLM optimization, delivering at least 2x faster operation.
  • FHE cryptography advances, expected to deliver a 5x speedup within five years.
  • The biggest speedup will come from dedicated hardware acceleration for FHE. Companies like Duality, CryptoNext Security, and IBM are developing custom ASICs and FPGAs optimized for bootstrapping and homomorphic operations.
    • Duality is targeting a 1,000x speedup with their CipherCore FHE accelerator launching in 2025. CryptoNext also plans to deliver over 1,000x faster bootstrapping with their FN-FHE chip in 2025.
    • IBM aims even higher with 10,000x acceleration using their FutureHomomorphic hardware architecture in their second-generation system expected after 2025.
    • With these improvements, running an encrypted LLM at reasonable cost would require only a small number of ASIC or FPGA accelerators, on par with the number of GPUs required today for non-encrypted LLMs.

With most of these challenges largely solved, we should see end-to-end encrypted artificial intelligence processing within the next five years.

When this happens, privacy will no longer be an issue. Not because it’s unimportant, but because it will be guaranteed by design.
