When I contemplate the return of the crypto wars—attempts to block citizens’ use of encryption by officials who want unfettered spying powers—I look back with dread on the late Middle Ages. I wasn’t alive back then, but one feature of those times lingers in my consciousness. Starting around 1337 and all the way until 1453, England and France fought a series of bloody battles. The conflict went on so long it was immortalized by its centenarian length: We know it as the Hundred Years’ War.
The crypto wars haven’t yet reached that mark. (In this column I will be reclaiming the term “crypto” from its more recent and debased usage by blockchain enthusiasts, too many of whom haven’t read my 2001 book called, um, Crypto.) Dating from the publication of the groundbreaking 1976 paper that introduced public key cryptography—a means of widening access to encryption that was developed just in time for the internet—the skirmish between encryption advocates and their foes in officialdom is only just approaching 50 years.
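That 1976 paper was Whitfield Diffie and Martin Hellman’s “New Directions in Cryptography,” which showed that two parties could agree on a secret key over a channel anyone might be listening to. Here’s a minimal sketch of the idea in Python, with toy numbers chosen purely for illustration — real systems use primes thousands of bits long and vetted libraries, never hand-rolled code:

```python
# Toy Diffie-Hellman key exchange, the core idea of the 1976 paper.
# The numbers here are illustrative only; do not use this for anything real.

P = 23  # public prime modulus (toy-sized)
G = 5   # public generator

alice_secret = 6   # Alice's private value, never transmitted
bob_secret = 15    # Bob's private value, never transmitted

# Each side sends only G^secret mod P over the open channel.
alice_public = pow(G, alice_secret, P)
bob_public = pow(G, bob_secret, P)

# Each side combines the other's public value with its own secret.
alice_shared = pow(bob_public, alice_secret, P)
bob_shared = pow(alice_public, bob_secret, P)

assert alice_shared == bob_shared  # identical key, never sent in the clear
```

An eavesdropper sees only P, G, and the two public values; recovering the shared key from those requires solving a discrete logarithm, which is believed infeasible at real-world key sizes. That property — secure keys without a prior secret meeting — is what made encryption practical for the internet, and what set off the fight that followed.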
From the start, government efforts to constrain or outlaw secure encrypted communications were vigorous and persistent. But by the turn of the millennium it appeared the fight was over. Encryption was so obviously critical to the internet that it was built into every browser and increasingly included in messaging systems. Government snooping didn’t end—check out Edward Snowden’s revelations—but certain government elements around the world never got comfortable with the idea that citizens, including the most rotten among us, could share secrets safe from the eyes of surveillants. Every few years, there’s a flareup with proposed new regulations, accompanied by scary scenarios from the likes of FBI directors about “going dark.”
The arguments of the anti-crypto faction are always the same. If we allow encryption to flourish, they plead, we’re protecting terrorists, child pornographers, and drug dealers. But the more compelling counterarguments haven’t changed, either. If we don’t have encryption, no one can communicate securely. Everyone becomes vulnerable to blackmail, theft, and corporate espionage. And the last vestiges of privacy are gone. Building a “back door” to allow authorities to peek into our secrets will only make those secrets more accessible to dark-side hackers, thieves, and government agencies operating off the books. And even if you try to outlaw encryption, nefarious people will use it anyway, since the technology is well known. Crypto is toothpaste that can’t go back in the tube.
The good news is that so far encryption is winning. After a long period when crypto was too hard for most of us to use, some extremely popular services and tools have end-to-end encryption built in as a default. Apple is the most notable adopter, but there’s also Meta’s WhatsApp and the well-respected standalone system Signal.
Still, the foes of encryption keep fighting. In 2023, new battlefronts have emerged. The UK is proposing to amend its Investigatory Powers Act with a provision demanding that companies provide the government with plaintext versions of communications on demand. That’s impossible without disabling end-to-end encryption. Apple has already threatened to pull iMessage and FaceTime out of the UK if the regulation passes, and other end-to-end providers may well follow, or find an alternative means to keep going. “I’m never going to willingly abandon the people in the UK who deserve privacy,” says Signal president Meredith Whittaker. “If the government blocks Signal, then we will set up proxy servers, like we did in Iran.”
And now the US Senate has opened its own front, with a proposed law called the Cooper Davis Act. It renews the crypto war under the guise of the perpetually failing war on drugs. In its own words, the legislation “require(s) electronic communication service providers and remote computing services to report to the Attorney General certain controlled substances violations.”
In other words, if someone is doing drug deals using your phone service or private messaging app, you must drop a dime on them. This, of course, requires knowing what people are saying in their personal messages, which is impossible if you provide them end-to-end encryption. But one draft of the bill sternly says that providing privacy to users is no excuse. A company’s deniability, reads the provision, applies only “as long as the provider does not deliberately blind itself to those violations.” No cryptanalysis is required to decode the real message: Put a back door into your system—or else.
Let me linger on the term “deliberate blindness.” Congress-critters, don’t you get that “deliberate blindness” is the essence of encryption? The makers and implementers of end-to-end privacy tools make a conscious effort—a deliberate one, if you will—to ensure that no one, including the system’s creators, can view the content that private citizens and businesses share among themselves. Signal’s Whittaker zeros in on the insidious irony of the nomenclature. “For many years, we've been talking about tech accountability and the need to enhance privacy and rein in the surveillance capacities of these companies,” she says. “But here you have a willful turn of phrase that frames the unwillingness to surveil as something negative.”
Right now, we need more end-to-end encryption. There’s little evidence that weakening encryption will make much of a dent in the fentanyl trafficking on our streets. But after the US Supreme Court’s Dobbs decision, end-to-end encryption is now a critical means of thwarting attempts to prosecute women who seek abortions in states where politicians lay claim to their major life choices. Last year, Meta turned over private messages from a Facebook user to Nebraska police that led to felony charges against a mother who aided her daughter in ending a pregnancy by abortion pills. If those messages had been protected by end-to-end encryption—as WhatsApp and Signal messages are—authorities would not have been able to read them. If “deliberate blindness” is banned, watch out for widespread snooping to find out who might be seeking abortions.
And there’s another new reason why encryption is more important than ever. With the rise of generative AI we are now expected to share every thought, including some pretty intimate ones, with AI chatbots. Sorry, OpenAI, Inflection, Microsoft, and Google—I don’t want you to read those conversations. They’re between me and my bot. I’ll let you see those prompts only when you pry them from my cold dead fingers.
Will the crypto wars continue until they reach the century mark? Hard to say. Climate disaster or superintelligent systems may well decimate our species and make encryption protocols the least of our problems. If such catastrophes don’t happen (fingers crossed), my bet is that official scare tactics and demands for back doors will still be around in 2076. Happy anniversary, public key!
In my 2001 book Crypto, I described the first iteration of the crypto wars. The invention of public key cryptography, along with a commercial implementation, created a path for secure electronic commerce. But the US National Security Agency and law enforcement tried to thwart that movement. Their instrument was export regulations, which got tweaked to make it illegal to sell software overseas that used strong encryption. It was critical to change the laws standing in the way of secure online communication and commerce, but government agencies had cowed legislators with doomy scenarios of what would happen if bad actors encrypted their emails and files. By 1993, however, some legislators started resisting the fear tactics—the same fear tactics still being used three decades later.
In October 1993, [US representatives] Gejdenson and Cantwell held a subcommittee hearing to draw attention to the [crypto] problem. “This hearing is about the well-intentioned attempts of the National Security Agency to control that which is uncontrollable,” said Gejdenson … While the majority of legislators accepted the NSA’s contentions at face value, a cognitive dissonance was emerging between its arguments and what appeared to be a compelling view of reality. On one hand was a mind-set so locked into Cold War posturing that it ignored the inevitable. On the other were the techno-visionaries who powered our future, eager to fortify American ascendancy in a global marketplace …
After the public session, security experts swept the room for bugs before the inevitable follow-up hearing involving the interests of the National Security Agency: The Briefing. NSA briefings were notorious in Congress. They involved a dramatic presentation by the NSA on why our international eavesdropping abilities were so vital, typically involving a litany of victories achieved by clandestine snooping, and perilous international situations that required continued vigilance and support. Perfected by Bobby Ray Inman in his days as NSA director, they initiated legislators into the society of Top Secret, implicitly shifting their allegiance from the citizenry to the intelligence agencies. A newly cleared congressperson would get a presumably unvarnished and reportedly terrifying dose of global reality … Representatives and senators had been known to emerge grim-faced, stunning their go-go staffers by remarking, “Well, maybe we should reconsider.”
Not Maria Cantwell. She was among a growing number of legislators who found The Briefing impressive but not persuasive. The issue for those skeptics wasn’t what successes we had breaking codes, but whether maintaining export rules was actually productive. If the genie was out of the bottle, so what if American companies couldn’t export? Crooks would get crypto elsewhere!
Wolfgang asks, “Why is building a remote work culture so hard, and what can be done about it?”
Thanks, Wolfgang. The first part of your question is easy to answer. It’s hard to create a great work culture when everyone’s working remotely because people are working remotely. In person, solid work relationships emerge spontaneously, conversations happen anytime, and sometimes a colleague takes a break from parental leave and brings in a baby and everyone goes nuts.
Making those culture-affirming moments happen remotely is tough. If I had the solution, I’d make a fortune hiring myself out as a consultant. I do have a few suggestions, gleaned from my experience as a worker for a big employer and a reporter who observes what’s happening in a lot of companies.
First, every week, do an all-hands where everybody gathers. In addition to company-wide announcements, choose one team to share what they’re doing. Leave time for Q and A. If it’s a big company, add another meeting for each suborganization. Second, when new employees join, greet them with a series of one-on-ones, not only with their managers and close coworkers but also with experienced colleagues who can help them navigate the org.
A related question is how to get people back to the office. One idea is to make the office more attractive. If employees return to workplaces where their customized spaces are gone and they have to scramble for desks, they will dread coming in. In the tech world in particular, many companies are urging employees back to the office at the same time they are cutting back perks. That makes financial sense, but it gives people a disincentive to come back. If they lose their micro kitchens, they’ll yearn for their home kitchens.
Also, during the pandemic, lots of executives moved away from their headquarters to luxury homes around the globe. In many cases, they still spend a lot of time there, even though their underlings are taking the corporate buses to HQ. Those employees might reasonably ask themselves, “They want me back at my workstation so I can Zoom with my boss in Hawaii, Reno, or London?” Hey, you millionaires, set an example! At the very least, don’t share pictures of your aquatic adventures or—not naming names or anything—your MMA training. Instead put out a shingle that says “Office hours—the latte is on me.”
You can submit questions to email@example.com. Write ASK LEVY in the subject line.
Why generative AI won’t disrupt books. By ChatGPT. Just kidding.
Despite the war, Ukraine’s once-thriving tech sector is hanging on.
As Oppenheimer hits theaters (I loved it!), here’s a warning that we should monitor AI the way we do The Bomb.
OMG, there’s a supersize AeroPress coffee maker!
Credit belongs to: www.wired.com