After spending decades focused on the intersection of consumer privacy, cryptography, and digital currencies, I find the Libra hearings and today’s regulatory hearings before the U.S. Senate an exciting moment for the blockchain industry. This newfound Congressional attention to consumer privacy is a step in the right direction, and Facebook’s past transgressions with consumer data have rightly heightened public sensitivity to the privacy issues inherent in large, centralized platforms.
Legislative hearings can be a watershed moment of public attention for a movement, a sign that an industry has moved from the shadows into the spotlight. But as exciting as these hearings have been, they also feel very familiar, because, believe it or not, meetings like these have happened before.
Twenty-four years ago, in 1995, Congress held a similar round of hearings focused on “The Future of Money.” Held before the House Subcommittee on Domestic and International Monetary Policy, they examined the emergence of digital currencies (led by my company DigiCash and its ecash system), the privacy implications of the coming e-commerce boom, and the potential adoption of privacy-preserving technologies. I was among those who testified, alongside representatives of companies like Visa and Mastercard that can be seen popping up today in the Libra Association.
During the hearings, I pushed for the adoption of technologies, not just legislation, as the most effective way to preserve privacy in the coming age of the internet. As a technologist, I believe that supporting creative new technologies is the most effective way to protect privacy, and that they can do so in a way special-interest-backed legislation rarely can.
It’s a message I’ve been pushing for decades, dating back to graduate school at Berkeley, where my interest in technology, and encryption in particular, as a means of protecting personal privacy really began. At the time, the “privacy community” was very legislatively focused. Largely non-technical, it consisted mostly of appointed government officials, consultants, advocacy groups, talking heads, and conference organizers, who almost uniformly believed that the privacy problem was a legal one.
While at Berkeley I had just begun my own research into technical solutions to the privacy problem, and had come up with a few ideas, such as mix networks and blind signatures, that I thought deserved a place in the privacy conversation. With that in mind, I set out to convince other leaders in the privacy space that the right new technology, not just legislation, was a viable answer to the problem we shared.
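For the curious, the core blind-signature idea can be sketched in a few lines. What follows is an illustrative toy built on textbook RSA, with deliberately tiny, insecure parameters and function names invented for this sketch; a real system would use large random primes and proper padding. The point is simply that the signer produces a valid signature on a message it never sees:

```python
# Toy sketch of an RSA-based blind signature (insecure parameters, for exposition only).

def make_rsa_keypair():
    # Fixed small primes for the sketch; a real system uses large random primes.
    p, q = 61, 53
    n = p * q
    phi = (p - 1) * (q - 1)
    e = 17                         # public exponent
    d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)
    return n, e, d

def blind(m, r, n, e):
    # User hides message m with a random factor r (coprime to n) before sending it.
    return (m * pow(r, e, n)) % n

def sign(blinded, n, d):
    # Signer signs the blinded value without ever learning m.
    return pow(blinded, d, n)

def unblind(blind_sig, r, n):
    # User strips the blinding factor, leaving an ordinary RSA signature on m.
    return (blind_sig * pow(r, -1, n)) % n

def verify(m, sig, n, e):
    return pow(sig, e, n) == m

n, e, d = make_rsa_keypair()
m, r = 42, 99                      # message and blinding factor, gcd(r, n) == 1
sig = unblind(sign(blind(m, r, n, e), n, d), r, n)
assert verify(m, sig, n, e)        # valid signature on m, yet the signer never saw m
```

The unblinding works because (m · r^e)^d = m^d · r mod n, so multiplying by r^-1 leaves m^d, the ordinary RSA signature on m.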
I traveled to Norway, Sweden, and Germany, and to various places around the U.S., in search of like-minded experts who would see technology as a solution. But time and time again the answer was the same: privacy was inherently a legal issue, to be solved by policy, and technology had no role in its solution. It was a deeply rooted belief. Institutions with a strong focus on preserving free speech and privacy were not merely uninterested in technical approaches but actively opposed to them. The Director of the ACLU’s Privacy Technology Project went so far as to send a response letter stating in no uncertain terms that privacy was a matter for legal experts, that there was no role whatsoever for technology in protecting privacy, and that my ideas represented “a form of societal paranoia.”
This was in response to my 1985 paper, “Security Without Identification”, which mapped out the potential for a massive, centralized company in the coming digital age to leverage metadata and marketing to consolidate power, monopolize media distribution, and ultimately threaten election integrity by dangerously mishandling that data. It was an outcome that clearly seemed outlandish to many at the time. In hindsight, though, that “societal paranoia” reads as eerily descriptive of a certain platform currently sitting through meetings on the Hill.
“The inadequate security and the accumulation of personally identifiable records, moreover, pose national vulnerabilities. Additionally, the same sophisticated data acquisition and analysis techniques used in marketing are being applied to manipulating public opinion and elections as well.”
-David Chaum, 1982
At that point it was clear to me that those in charge of stewarding privacy were interested only in legislative solutions. So I set out on my own to improve the technologies I’d invented and to develop a strategy for rolling them out to the world. That strategy ultimately resulted in DigiCash, new voting technologies, Elixxir, and more, which I’ll discuss in detail in future posts. But sitting here in 2019, as the dust settles after another round of hearings on the future of digital privacy, it seems we have a chance to learn from the previous three decades’ mistakes and empower technologies, not just legislation, to usher in the age of privacy and digital sovereignty we deserve. I’m optimistic about what we can accomplish.