Who owns the data? Who can read what data? At the center of some of the thorniest issues facing the internet is a set of encryption algorithms that hold everything together. The routines are mathematically complex and often difficult to understand even for experts, but stopping fraud, protecting privacy and ensuring accuracy implicitly depend on how well these algorithms work.
Their central role in securing cyberspace attracts many researchers, some working to improve the algorithms and others probing for weaknesses by trying to break them. Some of the newer approaches offer new ways to protect everyone with more sophisticated protocols and more robust algorithms. These newer tools deliver better privacy and more flexible applications that resist attack, including attacks that could one day be launched by still largely hypothetical quantum computers.
The booming world of cryptocurrencies is also creating opportunities to secure not only money and transactions, but all stages of the digital workflow. The exploration and innovation devoted to building blockchains to immortalize all interactions is among the most creative and intense areas of computing today.
The good news is that amid all this exciting innovation, the basic foundation remains remarkably stable, strong and secure, as long as it is deployed with care. Standards last for decades, letting businesses rely on them without frequently recoding or redesigning their protocols.
Standard algorithms such as the Secure Hash Algorithm (SHA) and the Advanced Encryption Standard (AES) were chosen through careful public competitions run by the National Institute of Standards and Technology (NIST), and the results have withstood years of public attack remarkably well. While some have weakened a bit as technology advances – SHA-1, for example, is now deprecated and being replaced by SHA-256 – there has been no catastrophic security collapse to report.
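As a point of reference, both hash families ship in standard libraries today. This minimal Python sketch simply contrasts the digest sizes of deprecated SHA-1 and its SHA-2 replacement; the message string is made up for illustration.

```python
import hashlib

message = b"standard algorithms age gracefully"

# SHA-1 produces a 160-bit digest; practical collisions have been
# demonstrated, so it is being retired.
sha1_digest = hashlib.sha1(message).hexdigest()

# SHA-256, a member of the SHA-2 family, produces a 256-bit digest
# and remains considered secure for general use.
sha256_digest = hashlib.sha256(message).hexdigest()

print(len(sha1_digest) * 4)    # 160 bits
print(len(sha256_digest) * 4)  # 256 bits
```

Swapping in the longer hash is usually a one-line change, which is part of why these transitions have been smooth rather than catastrophic.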
Quantum-Resistant Encryption
Fear that something will break protocols and algorithms on a large scale, however, is pushing researchers to harden algorithms against attacks that quantum computers might mount in the future. A big push from NIST aims to create a new collection of “quantum-resistant” or “post-quantum” algorithms through a competition currently under way.
Last summer, NIST announced the start of the third round of a competition that began in late 2016. Sixty-nine algorithms entered the process; the list was first winnowed to 26 and now stands at 15. Of those 15, seven are designated “finalists” and the other eight are alternates that target niche applications or remain under consideration because, according to the announcement, they “may need more time to mature.”
The screening process is difficult because researchers must imagine attacks from machines that don’t yet exist, at least at a useful scale. The RSA digital signature algorithm, for example, can be broken by factoring a large number. In 2012, researchers reported using a quantum computer to factor 21 into 7 and 3, two numbers that aren’t particularly large. Many assume it will take a long time to develop enough precision to factor far larger numbers, and for now standards like RSA appear more threatened by easily accessible cloud computing than by hypothetical quantum machines.
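To see why factoring breaks RSA, here is a minimal sketch using the same tiny modulus from that 2012 experiment. The message value and exponent are made up for illustration; real keys use moduli of 2048 bits or more, where trial-division factoring is hopeless.

```python
# Toy RSA with a deliberately tiny modulus (n = 7 * 3 = 21, the
# number factored in the 2012 quantum experiment).
p, q = 7, 3
n = p * q                  # public modulus
phi = (p - 1) * (q - 1)    # Euler's totient, kept secret
e = 5                      # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent (Python 3.8+)

msg = 4
cipher = pow(msg, e, n)          # encrypt: c = m^e mod n
assert pow(cipher, d, n) == msg  # decrypt with the private key

# An attacker who can factor n recovers the private key directly.
def factor(n):
    for f in range(2, n):
        if n % f == 0:
            return f, n // f

fp, fq = factor(n)
recovered_d = pow(e, -1, (fp - 1) * (fq - 1))
assert pow(cipher, recovered_d, n) == msg  # key fully recovered
```

The entire security of the scheme collapses once `factor` succeeds, which is why the size of numbers quantum machines can factor is watched so closely.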
Much of the competition focuses on algorithms that are resistant to Shor’s algorithm, the most commonly described way for quantum computers to attack algorithms like RSA. Publicly announced quantum machines take many different forms, and no one knows if other algorithms or designs may appear.
Yet despite the uncertainty, researchers find that some quantum-resistant designs can be useful even if quantum attacks never materialize. Paul Kocher, a cryptographer, said in an interview that digital signatures based on hash functions can be easily deployed in dedicated hardware and in software environments that must run on underpowered processors. “Verification only requires a tiny state machine and a hash function, which makes them ideally suited for hardware implementation,” he said, adding that “the robustness of the approach against quantum computers is simply based on the hash function, rather than any other quantum-safe algorithms that involve new mathematics.”
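Kocher’s point can be illustrated with a Lamport one-time signature, the simplest hash-based scheme: verification is nothing more than a series of hash comparisons. This is a sketch using only Python’s standard library; it signs a single message per key and omits the Merkle-tree machinery that practical hash-based schemes layer on top.

```python
import hashlib
import secrets

def H(b):
    return hashlib.sha256(b).digest()

# Private key: 256 pairs of random values, one pair per digest bit.
sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
# Public key: the hash of every private value.
pk = [(H(a), H(b)) for a, b in sk]

def bits_of(message):
    digest = H(message)
    return [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(message, sk):
    # Reveal one value from each pair, chosen by the message-digest bit.
    return [sk[i][bit] for i, bit in enumerate(bits_of(message))]

def verify(message, signature, pk):
    # Verification is just 256 hash comparisons -- a tiny state machine.
    return all(H(sig) == pk[i][bit]
               for i, (sig, bit) in enumerate(zip(signature, bits_of(message))))

sig = sign(b"post-quantum hello", sk)
assert verify(b"post-quantum hello", sig, pk)
assert not verify(b"tampered message", sig, pk)
```

Notice that the only cryptographic primitive anywhere in the scheme is the hash function itself, exactly the property Kocher describes.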
NIST has said the final round may take a little longer because of pandemic-related delays, but it hopes to announce new standard algorithms for encryption and digital signatures in 2022.
Homomorphic Encryption
Another big push from researchers is to work directly with encrypted data without ever needing the key. More and more information resides on cloud machines, which may not be as trustworthy as those on premises. If the data is never decrypted while the algorithms run, secrets are preserved even though the work is distributed to untrusted machines.
It has been possible to perform a limited number of operations on encrypted information for some time. Chapter 14 of my book Translucent Databases, for example, describes very basic systems that support addition but not multiplication, or vice versa.
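A minimal sketch of one such addition-only system is the Paillier cryptosystem: multiplying two ciphertexts adds the hidden plaintexts, but multiplying plaintexts is not supported. The primes below are absurdly small, chosen only to make the algebra visible; real deployments use primes hundreds of digits long.

```python
import secrets
from math import gcd, lcm

# Toy Paillier parameters -- far too small for real use.
p, q = 11, 13
n = p * q                   # public modulus
n2 = n * n
g = n + 1                   # standard generator choice
lam = lcm(p - 1, q - 1)     # private key
mu = pow(lam, -1, n)        # precomputed decryption constant

def encrypt(m):
    while True:
        r = secrets.randbelow(n - 1) + 1
        if gcd(r, n) == 1:  # the random blinding factor must be invertible
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # (c^lam mod n^2 - 1) // n recovers m * lam; mu cancels lam.
    return (pow(c, lam, n2) - 1) // n * mu % n

c1, c2 = encrypt(20), encrypt(22)
# Multiplying ciphertexts adds the plaintexts underneath:
assert decrypt((c1 * c2) % n2) == 42
```

A server holding only `c1` and `c2` can compute the encrypted sum without ever learning 20, 22, or the key, but it has no way to multiply the hidden values, which is exactly the limitation fully homomorphic schemes remove.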
Interest has exploded over the past decade with the announcement of algorithms that can apply a much wider range of operations. The first wave of what some call “functional encryption” or “fully homomorphic encryption,” however, was so computationally expensive that it was impractical for everyday work. Basic calculations could take days, weeks or months.
The effort is paying off, however, and practical implementations are now appearing. IBM, for example, released its Fully Homomorphic Encryption Toolkit for macOS, iOS, Android and Linux this summer. The code includes examples of searching confidential bank records to prevent fraud.
Microsoft has released its own library built on a different approach, one best suited to mixing addition and multiplication operations but not to searches. It can be used in accounting applications, but not in those that require matching data.
Differential Privacy
Another approach, called differential privacy, is often lumped together with encryption because it shares the goal of protecting personal information. The underlying math, however, is different: the tool offers only statistical guarantees of privacy, adding just enough noise to the data to make it difficult to connect data items with their owners. The data is not locked in a safe but lost in a sea of noise. Users can be content knowing their information is protected within statistical bounds.
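The core mechanism can be sketched in a few lines: a counting query is released with Laplace noise calibrated to a privacy parameter, epsilon. The dataset and threshold below are made up for illustration; real toolkits add careful budget accounting on top of this basic idea.

```python
import random

def dp_count(true_count, epsilon, sensitivity=1.0):
    """Release a count with epsilon-differential privacy via the
    Laplace mechanism. A counting query changes by at most 1 when one
    person is added or removed, so the sensitivity is 1. Smaller
    epsilon means more noise and stronger privacy."""
    scale = sensitivity / epsilon
    # The difference of two exponentials is Laplace-distributed.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise

ages = [34, 29, 51, 42, 38, 45, 27, 60]
true_over_40 = sum(1 for a in ages if a > 40)   # 4 people

# Each query returns the true count plus calibrated noise, so no
# single individual's presence can be inferred from the answer.
print(dp_count(true_over_40, epsilon=1.0))
```

Averaged over many hypothetical releases the answers cluster around the true count of 4, which is why aggregate statistics stay useful even as individual records stay hidden.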
Microsoft and Google have both released open-source toolkits for anyone looking to experiment with the algorithms. Microsoft’s basic tools include examples of generating privacy-protected reports from SQL data sources, and the company has begun building these features into data stored and analyzed in Azure.
Google’s libraries can deliver basic statistical results from a data source, counting items and calculating averages and standard deviations. The most feature-rich version is implemented in C++, with ports of the various functions to Java and Go under way.
One of the most publicized applications of differential privacy is managed by the US Census Bureau, which plans to release statistical summaries of the nation once the full tally is complete. The bureau must balance a tradition of protecting citizen privacy against the desire of communities and businesses to use the data for planning. It was one of the first organizations to build production applications, and it aims to use the algorithms on the 2020 census results.
“In 2008, we were the first organization in the world to take the concept of differential privacy from theory to practice for one of our data products,” said John M. Abowd, chief scientist at the Census Bureau. “Since that time, it has become more evident that old privacy protection systems do not measure up to today’s data-rich digital world. This is why tech giants like Microsoft, Apple and Google use differential privacy, which is designed to protect against these kinds of threats. And that’s why more and more companies with identifiable information to protect are turning to this solution every day.”
Blockchains
One of the hottest areas of crypto research involves virtual currencies like Bitcoin and Ethereum and the blockchains that govern them. These naturally rely heavily on cryptographic algorithms, and many companies developing currencies or governance mechanisms are looking for new ways to push the algorithms further. Some want to build casinos and others want to create venture capital funds. All are looking for the best ways to harness the mathematical power of the algorithms to create business systems everyone can trust.
One of the most active goals is to add layers of privacy by mixing zero-knowledge proofs into the blockchain. Early protocols used basic digital signatures to authenticate transactions, a feature that linked together all transactions signed with the same key. Lately, much more efficient forms of zero-knowledge proof, with names like zk-SNARKs, have been developed that allow a transaction to be confirmed without revealing any identifying information. Tools like ZoKrates are just one example of how developers are building additional privacy and authentication into blockchains.
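The flavor of a zero-knowledge proof can be sketched with the classic Schnorr protocol, made non-interactive here with the Fiat-Shamir trick of deriving the challenge from a hash; zk-SNARKs generalize this idea to proofs about arbitrary computations. The parameters below are toy-sized and the prime is chosen only for readability.

```python
import hashlib
import secrets

# Toy Schnorr proof: prove knowledge of x with y = g^x mod p,
# without revealing x. Parameters far too small for real use.
p = 2_147_483_647      # the Mersenne prime 2^31 - 1
g = 7

x = secrets.randbelow(p - 2) + 1   # the prover's secret
y = pow(g, x, p)                   # the public statement

# Prover: random commitment, hash-derived challenge, response.
r = secrets.randbelow(p - 2) + 1
t = pow(g, r, p)
c = int.from_bytes(hashlib.sha256(f"{t}:{y}".encode()).digest(), "big")
s = (r + c * x) % (p - 1)

# Verifier checks one equation: g^s == t * y^c (mod p).
# The transcript (t, c, s) reveals nothing about x because r
# masks it, yet the equation only balances if the prover knew x.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```

A blockchain validator running the verifier’s single check can confirm the transaction is authorized while learning nothing that links it to other transactions by the same key holder.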
Developers want to design a new generation of products that don’t just sit there. The first blockchains simply tracked ownership. The most recent ones add layers of software that create elaborate contracts, enabling sophisticated workflows that follow modern supply chains. Some coins or tokens can pay interest or track real-world assets.
David Chaum, one of the early developers of anonymous digital money, thinks we are just starting to understand what we can do with the math. Algorithms will take over more and more aspects of life while increasing the level of trust and security. “Secrets have long been the key to power,” he says. “This kind of crypto infrastructure isn’t just a better old thing, it’s a really new thing. A new world, by and for us, in which to flourish.”
Copyright © 2020 IDG Communications, Inc.