Despite growing consensus about the magnitude of cyber security threats, a clear strategy for securing the United States’ critical digital infrastructure has yet to emerge. This is partially due to the complexity of cyber security issues, which intersect computer science, law, policy, economics, public opinion, and ethics. In recent years, however, the Hoover Institution has advanced scholarship and dialogue on cyber security by channeling the expertise of Hoover fellows, Stanford University, and Silicon Valley, and by extending these resources to policy makers and the media.
Hoover’s Cyber Security Boot Camps, led by Hoover fellows Amy Zegart and Herbert Lin in partnership with Stanford University’s Cyber Policy Program and the Center for International Security and Cooperation (CISAC), are key components of these efforts. Past boot camps have assembled senior congressional staff from both sides of the aisle for expert briefings and discussions about the law, policy, and technology pertaining to cyber security. This year, Zegart and Lin shifted the program’s focus toward national media, partnering with Hoover’s public affairs team to host a cyber security-themed Media Roundtable.
Following the format of previous Media Roundtables, Hoover brought dozens of reporters from leading outlets such as the Wall Street Journal, Washington Post, and New York Times together with cyber policy and technology experts on May 16, 2016. The program featured presentations, interactive discussion, and thought-provoking exercises designed to aid reporters in understanding and communicating cyber security news and debates. The interactive atmosphere also helped strengthen lines of communication between the reporters, technology experts, and strategists tasked with making sense of the changing cyber security landscape.
Amy Zegart, Davies Family Senior Fellow at Hoover, introduced attendees to the unique challenges of crafting cyber security policy. Zegart discussed the exceptional vulnerability of powerful countries to cyber threats, consumer driven connectivity as a factor that increases cyber risks, and the obstacles to protecting privately held cyber infrastructure at a time of acute mistrust of government.
John Villasenor, a professor of electrical engineering, public policy, and management; visiting professor of law at UCLA; and a national fellow at the Hoover Institution, introduced the technical challenges associated with cyber security. Villasenor discussed the irreversible growth of cyberspace as mobile connectivity proliferates and data storage costs plummet, the overwhelming complexity of cyber systems, and the startling capabilities of hackers in identifying and exploiting security weaknesses.
Herbert Lin, Hoover research fellow and senior research scholar for cyber security and policy at CISAC, applied his expertise to an often-overlooked topic in cyber security: the role of offensive cyber tactics. Where passive defenses such as network security or law enforcement fail, offensive measures can prove critical in disrupting or identifying the source of cyber security breaches. Lin also discussed the potential use of offensive cyber tactics against our adversaries without waiting for incoming attacks, which he likened to “punching” in cyberspace, rather than “punching back.”
Carey Nachenberg, a vice president and fellow at Symantec Corporation and prolific developer of cyber security technology, delivered a technical primer on cyber exploitation. Nachenberg described ways that design flaws, human error, and the sheer complexity of cyber systems create potential vulnerabilities. He also provided a step-by-step walkthrough of various tactics hackers use to exploit these weaknesses, including denial of service attacks, computer worms, and manipulating human agents.
Jack Goldsmith, senior fellow at Hoover and the Henry L. Shattuck Professor of Law at Harvard, discussed the complications of applying international law designed to address traditional uses of force to cyber hostilities. Goldsmith highlighted the problematic distinction between cyber attacks, which constitute illegal acts of international aggression, and exploitations, which constitute legal acts of espionage.
Elaine Korzak, a W. Glenn Campbell and Rita Ricardo-Campbell National Fellow at Hoover, reported on the evolving UN response to cyber security concerns. After decades of review, UN action on cyber law gained traction in 2014 with a milestone report recognizing the applicability of international law to cyberspace. A subsequent 2015 report recommended several cooperative steps on cyber security, although the proposed rules and norms rely on voluntary implementation.
The roundtable also featured interactive exercises to expand media perspectives on cyber issues, including a detailed simulation of a cyber security breach at a major web services company. Participants formed groups to address technical, legal, public relations, and other concerns related to the breach and presented their strategies to real-world private-sector cyber security experts. Hoover invited four other cyber security leaders to discuss what the media is getting right and wrong on cyber coverage and how reporters can develop stronger relationships with private sector sources.
The 2016 Cyber Media Roundtable covered a wide range of complex topics, and the engagement of participants signaled strong interest in internalizing the material. Discussion periods spilled into breaks, and participants asked penetrating questions characteristic of good reporting.
Reflecting on the outcomes of the event, Amy Zegart stated:
The media cyber boot camp was a great success—giving some of the nation’s top national security reporters a fast and deep dive into key cyber issues, developing broader networks of experts to help inform the public debate, and enabling candid conversation with industry leaders about what the press can do to improve coverage of cyber issues. Our vision is to hold a boot camp every year to educate a wide range of key policymakers and influencers—including congressional staff, federal judges, and the press.
Moving cyber policy forward will require continued attention to issues raised in the Media Roundtable. How can tensions between government and the private sector be eased to allow for greater cooperation? Can current international rules and norms be applied to cyber issues? To what extent do legal and ethical considerations permit “hacking back” or even hacking first? Where should reasonable expectations for cyber security be set in light of the overwhelming complexity of cyber systems?
As the larger policy community expands its focus on these and other key cyber security questions, Hoover’s ongoing research and outreach will help inform the answers.
CISAC co-director Amy Zegart (center) speaks during a simulated cybersecurity breach group exercise at the 2016 Cyber Media Roundtable at Stanford.
Abstract: The NERC-CIP standards are the only federally mandated cybersecurity standards for critical infrastructure in the United States. Targeting the electric system, the standards have been developed to ensure the reliability and the resilience of the electric grid and prevent catastrophic failures. Although the standards have been around for almost a decade, their role in building the resilience of the electric grid is fiercely contested, with critics claiming the standards represent little more than a ‘check box’ exercise that directs attention and resources away from achieving real security. This talk will present evidence on the effectiveness of the standards in addressing risk and offer suggestions as to how the standards might be improved to enhance resilience.
About the Speaker: Aaron Clark-Ginsberg is a U.S. Department of Homeland Security Cybersecurity Postdoctoral Scholar at CISAC. His research interests center on the theory and practice of disaster risk governance, particularly resilience and disaster risk reduction approaches. He is currently researching how government regulations designed to improve the resilience of the power grid to cyber-threats are affecting utility companies.
Aaron holds a PhD and MSc in Humanitarian Action from University College Dublin and a BA in American Studies with a Concentration in Environmental Studies from Kenyon College. Aaron's doctoral research examined how international NGOs interacted with national stakeholders to reduce disaster risk in developing countries. As part of this, Aaron traveled to ten countries in Asia, Africa, and the Caribbean to review risk reduction and resilience building approaches addressing a variety of hazards including flooding, drought, price shocks, cyclones, landslides, erosion, disease, and conflict.
Aaron has extensive experience in real world application of risk management principles. Aaron’s PhD was in conjunction with Concern Worldwide, an international Irish humanitarian organization. While at Concern, Aaron produced a series of reports on risk management in different countries and contexts designed to improve the effectiveness of Concern’s approach to risk reduction. He has also conducted policy-focused research on humanitarian reform for the World Humanitarian Summit Irish Consultative Process, the results of which were used to help develop the Irish position on humanitarian action. Aaron also spent four seasons working as a wildland firefighter for various governmental and private sector organizations across the western United States.
Cybersecurity Regulations and Power Grid Resilience (preliminary findings)
It’s a quintessential Silicon Valley scene. A group of tech-savvy Stanford students are delivering a passionate pitch about a product they hope is going to change the world, while a room full of venture capitalists, angel investors and entrepreneurs peppers them with questions.
But there’s a twist. This Stanford classroom is also packed with decorated military veterans and active duty officers. And a group of analysts from the U.S. intelligence community is monitoring the proceedings live via an iPad propped up on a nearby desk.
These Stanford students aren’t just working on the latest “Uber for X” app. They’re searching for solutions to some of the toughest technological problems facing America’s military and intelligence agencies, as part of a new class called Hacking for Defense.
A student team briefs the class on a wearable sensor they're developing for an elite unit of U.S. Navy SEALs – a product they're pitching as "fitbit for America's divers."
“There’s no problems quite like the kind of problems that the defense establishment faces, so from an engineering standpoint, it has the most powerful ‘cool factor’ of anything in the world,” said Nitish Kulkarni, a senior in mechanical engineering.
Kulkarni’s team is working with an organization within the US Department of Defense to devise a system that will provide virtual assistance to Afghan and Iraqi coalition forces as they defuse deadly improvised explosive devices.
“At Stanford there’s a lot of opportunities for you to build things and go out and learn new stuff, but this was one of the first few opportunities I’ve seen where as a Stanford student and as an engineer, I can go and work on problems that will actually make a difference and save lives,” said Kulkarni.
A 21st century tech ROTC
That’s exactly the kind of “21st century tech ROTC” model of national service that Steve Blank, a consulting associate professor at Stanford’s Department of Management Science and Engineering, said he had in mind when he developed the class.
“The nation is facing a set of national security threats it’s never faced before, and Silicon Valley has not only the technology resources to help, but knows how to move at the speed that these threats are moving at,” said Blank.
MBA student Rachel Moore presents for Team Sentinel, which is working with the U.S. 7th Fleet to find better ways to analyze drone and satellite imagery.
The students’ primary mission will be to produce products that can help keep Americans and our allies safe, at home and abroad, according to Blank.
Former U.S. Army Special Forces Colonel Joe Felter, who helped create the class and co-teaches it with Blank, said the American military needs to find new ways to maintain its technological advantage on the battlefield.
“Groups like ISIS, al-Qaeda and other adversaries have access to cutting edge technologies and are aggressively using them to do us harm around the world,” said Felter, who served in Iraq and Afghanistan and is currently a senior research scholar at Stanford’s Center for International Security and Cooperation (CISAC) and research fellow at the Hoover Institution.
“The stakes are high – this is literally life and death for our young men and women deployed in harm’s way. We’re in a great position here at Stanford and in Silicon Valley to help make the connections and develop the common language needed to bring innovation into the process, in support of the Department of Defense and other government agencies’ missions.”
Startup guru Steve Blank shares a light moment with a group of students.
The class is an interdisciplinary mix of undergraduate and graduate students, ranging from freshmen to fifth-year PhD students.
“It’s like a smorgasbord of all these people coming together from different parts and different schools of Stanford, and so I think that’s just a really cool environment to be in,” said Rachel Moore, a first-year MBA student.
Moore’s team includes electrical and mechanical engineering students, and they’re working together to develop a system to enable the Navy’s Pacific Fleet to automatically identify enemy ships using images from drones and satellites.
Tough technological challenges
Months before the course start date, class organizers asked U.S. military and intelligence organizations to identify some of their toughest technological challenges.
Class co-teacher Pete Newell throws his hands up to celebrate a student breakthrough.
U.S. Army Cyber Command wanted to know if emerging data mining, machine learning and data science capabilities could be used to understand, disrupt and counter adversaries' use of social media.
The Navy Special Warfare Group asked students to design wearable sensors for Navy SEALs, so that their physiological condition could be monitored in real time during underwater missions.
Intelligence and law enforcement agencies were interested in software that could help identify accounts tied to malicious “catfishing” attempts from hackers trying to steal confidential information.
And those were just a few of the 24 problems submitted by 14 government agencies.
Developing Solutions
The class gives eight teams of four students 10 weeks to learn about their problem directly from the stakeholders and end users who know it best, and to iteratively develop possible solutions, or a “minimum viable product.” The teams use a modified version of Steve Blank’s “lean launchpad” methodology, which has become a revered how-to guide in the Silicon Valley startup community.
Rachel Olney, a graduate student in mechanical engineering, tries on a military-grade dry suit on a visit to the 129th Rescue Wing at Moffett Field.
A key tenet of Blank’s methodology is what he calls the “customer discovery process.”
“If you’re not crawling in the dirt with these guys, then you don’t understand their problem,” Blank told the class.
One student team, which was working on real-time biofeedback sensors and geo-location devices for an elite team of Navy SEALs (a project they initially pitched as “fitbit for America’s divers”), earned a round of applause from the class when they showed a slide featuring photos from a field trip to the 129th Rescue Wing at Moffett Field, where they found out what it felt like to wear a military-grade dry suit.
Rachel Olney, a graduate student in mechanical engineering, said the experience of squeezing into the tight suit and wearing the heavy dive gear gave her a better appreciation for the physical demands that Navy SEALs have to deal with during a mission.
“They’re diving down to like 200 feet for up to six to eight hours…and during that time they can’t eat, they can’t hydrate, they’re physically exerting a lot, because they’re swimming miles and miles and miles at depth and they can’t see and they can’t talk to each other,” Olney said.
“It’s probably one of the most extreme things that humans do right now.”
Another group came in for some heavy criticism from the teaching team for failing to identify and interview enough end users.
But the next week, they were back in front of the class showing a video from a team member’s visit to an Air Force base in Fresno, where he logged some time inside the 90-pound bomb suit that explosive ordnance disposal units wear in the field.
“You can’t address a customer issue unless and until you really step into the shoes of the customer,” said Gaurav Sharma, who’s a student at Stanford's Graduate School of Business.
“That was the exact reason why I went to Fresno and wore the bomb suit, to get into the shoes of the end customer.”
Navigating the defense bureaucracy
Active duty military officers from CISAC’s Senior Military Fellows program and the Hoover Institution’s National Security Affairs Fellows program act as military liaisons for the class and help students navigate the complex defense bureaucracy.
“[The students] have really just jumped in with both feet and immersed themselves in this Department of Defense world that for so many civilians is just very foreign to them,” said U.S. Army Colonel John Cogbill, who has spent the last year as a senior military fellow at CISAC.
“I think they will come away from this experience with a much better appreciation of what we do inside the Department of Defense and Intelligence community, and where there are opportunities for helping us do our jobs better.”
Cogbill said he hoped that some of the inventions from the class, like an autonomous drone designed to improve situational awareness for Special Forces teams, could help the troops on his next combat deployment, where he will serve as the Deputy Commanding Officer of the U.S. Army’s elite 75th Ranger Regiment.
“It’s not just about making them more lethal, it’s also about how to keep them alive on the battlefield,” said Cogbill.
Students also get support from their project sponsors and personnel at the newly established Defense Innovation Unit Experimental (DIUx) stationed at Moffett Field.
Tech saves lives on the battlefield
Another key member of the teaching team is Pete Newell, who was awarded the Silver Star Medal (America’s third-highest military combat decoration), for leading a U.S. Army battalion into the Battle of Fallujah, where he survived an ambush and left the protection of his armored vehicle in an attempt to save a mortally wounded officer.
Class co-teacher and Silver Star Medal recipient Pete Newell explains some of the classic reasons why military products fail in the field.
Newell said he saw first-hand the difference that technology can make on the battlefield in his next job, when he served as director of the U.S. Army’s Rapid Equipping Force, which was tasked with creating technological solutions for the troops fighting in Afghanistan.
“What I realized is that the guys on the front edge of the battlefield who were actually fighting the fight, don’t have time to figure out what the problem is that they have to solve,” Newell said.
“They’re so involved in just surviving day to day, that they really don’t have time to step back from it and see those problems coming, and what they needed was somebody to look over their shoulder and look a little deeper and anticipate their needs.”
One of the first and most urgent problems Newell faced on the job was responding to the sudden spike in IED attacks on dismounted infantry.
The Army was still using metal detector technology from the ‘50s to find mines, but the new breed of IEDs, which were often hidden inside buried milk jugs, were virtually undetectable to the outdated technology.
Former U.S. Army Colonel Pete Newell demystifies some military jargon for the class.
“They could create an improvised explosive device and a pressure plate trigger…by using almost zero metal content,” Newell said. “It was almost impossible to find.”
Newell’s solution was a handheld gradiometer, the kind of technology used to find small wires in your backyard during a construction project, paired with a ground penetrating radar that can see objects underground.
But by the time the new technology reached troops in the field last summer, more than 4,000 had been wounded or killed in IED attacks.
Newell said he hoped the class would help get life-saving technology deployed throughout the military faster.
“I think it’s important to enable this younger generation of technologists to actually connect with some of the national security issues we face and give them an opportunity to take part in making the world a safer place,” Newell said.
Tom Byers, an entrepreneurship professor in Management Science and Engineering and faculty director of the Stanford Technology Ventures Program, rounds out the teaching team and brings his experience in innovation education and entrepreneurship to the classroom.
Inspiring the next generation
Students said the opportunity to find solutions to consequential problems was their primary inspiration for joining the class.
“When I first came to Stanford, the hype around entrepreneurship was very much around, ‘go out, make an app, do something really fun and cool, and get rich’,” said Darren Hau, a junior in electrical engineering.
Students share a laugh during a class break.
“In Hacking for Defense, I think you’re seeing a lot of people bring that same entrepreneurial mindset into a problem statement that seems a lot more impactful.”
Felter said he was humbled that so many students were willing to serve in this way.
“It’s encouraging to find out that students at one of our top universities are very interested and highly motivated to work very hard and use their skills and expertise and talent and focus it on these pressing national security problems,” said Felter.
The teaching team said they planned on expanding their class to other universities across the country in the coming years, to create a kind of open source network for solving unclassified national security problems.
For military officers like Cogbill, who will likely soon be leading U.S. soldiers into combat, that’s welcome news.
“Every time you run a course, that’s eight more problems,” Cogbill said.
“If this scales across 10, 20, 30, 40 more universities, you can imagine how many more problems can be solved, and how many more lives can potentially be saved.”
Influential startup educator Steve Blank (center) gives advice to Stanford students working on tough national security problems, while retired U.S. Army Colonels and class co-teachers Joe Felter (right) and Pete Newell (left) listen in.
Abstract: The disclosure of software vulnerabilities has stirred controversy for decades among security researchers and software vendors, and more recently governments. Despite increasing interdependency of software and systems (e.g., the Internet of Things) and resulting complexity in vulnerability disclosure and coordination, no unified norms have yet emerged.
This talk addresses the development of norms that (attempt to) govern the disclosure of software security flaws in relation to structural changes of the software industry and the Internet. This includes new forms of private, but monetarily rewarded disclosure on markets and through bug bounty programs, as well as government efforts to prohibit proliferation of knowledge and technology through export controls. Recently, governments acknowledged the withholding of vulnerability information on the grounds of national security and law enforcement needs, trading off against the need for defensive security of civilian computers and networks.
The talk outlines pressing policy issues and connects them to recent developments (e.g., Apple vs. FBI). It concludes by making the case for why norms on vulnerability disclosure are an essential component in shaping cybersecurity governance.
About the Speaker: Andreas Kuehn is a Ph.D. Candidate in Information Science and Technology at Syracuse University. He joined CISAC as a Zukerman Cybersecurity Predoctoral Fellow in October 2014. Previously, he was a visiting graduate student at Cornell University’s Department of Science & Technology Studies. He holds a M.Sc. in Information Systems from the University of Zurich, Switzerland.
In his dissertation, Andreas examined the historical, organizational, and institutional developments of software vulnerability and exploit markets as they are shaped by the perennial controversy on vulnerability disclosure. His qualitative, empirical research on emerging technologies and governance is informed by Science and Technology Studies and Institutional Theory.
Cybersecurity Predoctoral Fellow
CISAC, Stanford University
Abstract: For four years running now, the Director of National Intelligence’s Worldwide Threat Assessment to Congress has led with cyber threats to national and international security. Under statute, the several National Intelligence Officers constitute the most senior advisors of the US Intelligence Community in their areas of expertise. This discussion with the National Intelligence Officer for Cyber Issues will begin by highlighting the technology trends that are driving transformational change in cyber security and the future of intelligence. It will then assess strategic developments in international relations and their implications for deterring malicious activity in cyberspace. The analysis will focus on the (in)applicability of existing arms control mechanisms and deterrence principles to modern information and communication technologies.
About the Speaker: Sean Kanuck was appointed as the first National Intelligence Officer for Cyber Issues in May 2011. Mr. Kanuck came to the National Intelligence Council after a decade of experience in the Central Intelligence Agency’s Information Operations Center, including both analytic and field assignments. In his Senior Analytic Service role, he was a contributing author for the 2009 White House Cyberspace Policy Review, an Intelligence Fellow with the Directorates for Cybersecurity and Combating Terrorism at the National Security Council, and a member of the United States delegation to the United Nations Group of Governmental Experts on international information security.
Prior to government service, Mr. Kanuck practiced law with Skadden, Arps, Slate, Meagher & Flom in New York, where he specialized in mergers and acquisitions, corporate finance, and banking matters. He is a member of the International Institute for Strategic Studies, and his academic publications focus on information warfare and international law. Mr. Kanuck holds degrees from Harvard University (A.B., J.D.), the London School of Economics (M.Sc.), and the University of Oslo (LL.M.).
Sean P. Kanuck
National Intelligence Officer for Cyber Issues (until April 2016)
Office of the Director of National Intelligence
The Critical Infrastructure Initiative builds the cyber-resilience of critical infrastructure through methodologically diverse, outputs-oriented research and engagement with end users and homeland security practitioners. The initiative was launched in 2016 in recognition of the growing threat that cyber incidents pose to the functioning of the basic infrastructure that societies depend upon. For this initiative, Stanford has partnered with 11 other institutions to found the Critical Infrastructure Resilience Institute (CIRI), an institute focused on research and education designed to enhance the resiliency of the nation’s critical infrastructures. CIRI is led by the University of Illinois at Urbana-Champaign and funded by the Department of Homeland Security.
Stanford cybersecurity innovators Whitfield Diffie and Martin Hellman, who brought cryptography out of the shadowy realm of classified espionage and into the public sphere, a breakthrough that enabled modern e-commerce and secure communications over the Internet, are being honored with the Association for Computing Machinery's 2015 A.M. Turing Award.
The award is often referred to as the "Nobel Prize of computing" and comes with a $1 million prize funded by Google.
The Association for Computing Machinery (ACM) made the official announcement this morning at the RSA conference in San Francisco – one of the largest gatherings of cryptographers working on Internet security.
Martin Hellman (left) and Whitfield Diffie (right), winners of the 2015 Association for Computing Machinery's A.M. Turing Award, are shown in this 1977 photo. (Chuck Painter / Stanford News Service)
Diffie and Hellman's 1976 paper "New Directions in Cryptography" stunned the academic and intelligence communities by providing a blueprint for a revolutionary new technique that would allow people to communicate over an open channel, with no prearrangement, but keep their information secret from any potential eavesdroppers.
They called it public-key cryptography.
They also showed how, by reversing the order of operations, it was possible to create a "digital signature." Like a written signature, this has to be easy for the legitimate signer to create and for everyone else to verify. But it has to be difficult – preferably impossible – for anyone else to sign new messages. Unlike a written signature, which looks the same even if it's taken from a $1 check and forged onto a $1,000,000 check, a digital signature can only be used with the specific message that was signed.
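The sign-then-verify asymmetry described above can be illustrated with a toy textbook-RSA sketch in Python. The tiny primes and the bare-hash scheme are illustrative assumptions only; real digital signatures use keys of 2048+ bits and padding schemes such as RSA-PSS.

```python
# Toy textbook-RSA signature sketch (illustration only, not secure).
import hashlib

p, q = 61, 53
n = p * q                # public modulus (tiny toy value)
e = 17                   # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)      # private exponent (modular inverse, Python 3.8+)

def digest(msg: bytes) -> int:
    # Hash the message, reduced mod n so it fits the toy key size.
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def sign(msg: bytes) -> int:
    # Easy only for the holder of the private exponent d.
    return pow(digest(msg), d, n)

def verify(msg: bytes, sig: int) -> bool:
    # Anyone can check using the public exponent e.
    return pow(sig, e, n) == digest(msg)

sig = sign(b"pay $1")
assert verify(b"pay $1", sig)              # legitimate signature passes
assert not verify(b"pay $1", (sig + 1) % n)  # a tampered signature fails
```

Because the signature is computed over a hash of the message, it verifies only against that exact message, which captures the contrast with a written signature drawn above: a signature lifted from a $1 check cannot be reattached to a $1,000,000 check.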
Digital signatures and the "digital certificates" or "certs" they can produce are critical components in the modern security architecture. They allow your browser to know that your bank is really who it claims to be, and they allow iPhones to only run software signed by Apple.
"Their 1976 invention is widely viewed as the birth of modern cryptography," said Dan Boneh, Stanford professor of computer science and electrical engineering and co-director of the Stanford Cyber Initiative.
"Simply put, without their work, the Internet could not have become what it is today," Boneh said. "Billions of people all over the planet use the Diffie-Hellman protocol on a daily basis to establish secure connections to their banks, e-commerce sites, e-mail servers, and the cloud."
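The key-establishment idea behind the Diffie-Hellman protocol Boneh mentions can be sketched in a few lines of Python. The prime and generator here are toy values chosen for illustration; real deployments use primes of 2048+ bits or elliptic-curve groups.

```python
# Toy Diffie-Hellman key exchange over a small prime (illustration only).
import random

p = 2087  # small public prime (insecure toy parameter)
g = 5     # public generator

# Each party picks a private exponent and publishes g^x mod p.
a = random.randrange(2, p - 1)   # Alice's secret
b = random.randrange(2, p - 1)   # Bob's secret
A = pow(g, a, p)                 # Alice sends A over the open channel
B = pow(g, b, p)                 # Bob sends B over the open channel

# Each side combines the other's public value with its own secret,
# so both arrive at g^(ab) mod p without ever transmitting it.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)

assert shared_alice == shared_bob  # both derive the same shared key
```

An eavesdropper who sees p, g, A, and B would need to recover a or b from the public values, which is the discrete logarithm problem; at realistic key sizes this is believed to be computationally infeasible, which is what lets strangers agree on a secret over an open channel with no prearrangement.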
Threat of jail time
It was a feat made even more impressive by the fact that little serious academic scholarship on cryptography existed at the time of their invention outside the realm of classified research conducted under the purview of secretive government agencies such as the National Security Agency. Hellman said academic colleagues had tried to discourage him from pursuing his interest in cryptography early in his career because of the NSA's virtual monopoly on the subject.
Martin Hellman explains the principles of encryption in a Stanford classroom in this photo taken in the late '70s.
"They said, 'You're wasting your time working on cryptography because the NSA has such a huge budget and a several-decades head start," said Hellman, Stanford professor emeritus of electrical engineering. "How are you going to come up with something they don't already know? And if you come up with something good, they'll classify it.'"
Diffie and Hellman clashed with the NSA over their publications, including one that claimed that the agency had pressured IBM to weaken the National Bureau of Standards' Data Encryption Standard (DES) by limiting the key size to 56 bits instead of a stronger option of 64 bits or higher.
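The stakes of that key-size decision are easy to quantify: each additional key bit doubles the work of a brute-force search, so a 64-bit keyspace is 2**8 = 256 times larger than a 56-bit one.

```python
# Each additional key bit doubles a brute-force search, so trimming the
# key from 64 bits to 56 bits shrank the keyspace by a factor of 2**8.
keyspace_56 = 2 ** 56
keyspace_64 = 2 ** 64
print(keyspace_64 // keyspace_56)  # prints 256
```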
After the publication of "New Directions in Cryptography" and another paper on the DES key size, the conflict intensified as the NSA waged a concerted campaign to limit the distribution of Diffie and Hellman's research.
An NSA employee even sent a letter to the publishers warning that the authors could be subject to prison time for violating U.S. laws restricting export of military weapons.
These skirmishes became known as the first of the "crypto wars."
Ultimately, the NSA failed to limit the spread of their ideas, and public key cryptography became the backbone of modern Internet security.
"Cryptography is the one indispensable security technique," said Diffie, who was a part-time researcher at Stanford at the time he and Hellman invented public-key cryptography. "There are lots of other things needed, but if the government had succeeded in blocking people from having strong cryptographic systems … it would have meant you could not have had security on the Internet."
Cryptography's starring role
Diffie and Hellman said the U.S. government's recent demands that Silicon Valley companies build so-called back doors into their products so law enforcement and intelligence agencies could access encrypted messages reminded them of the first crypto war. Now, as then, the government has not offered a workable proposal for creating such back doors without undermining the security of those products.
Diffie and Hellman both said they sided with Apple in the current legal standoff over the FBI's request that Apple provide access to an iPhone belonging to one of the San Bernardino terrorists by writing software to bypass some of its security features.
"All the computer security experts that I talk with – I don't think there's been one who believes that we should do what the government wants," Hellman said. "While in this one case it might not do much harm, it establishes a dangerous precedent where Apple is then likely to be inundated with thousands upon thousands of requests that they'll have to either fight or comply with at great risk to the security of the iPhone system."
Diffie said giving in to the FBI's request would also make it harder for Apple to resist similar requests from foreign governments who want to spy on their citizens and crush internal dissent.
Whitfield Diffie (right) listens to former U.S. Secretary of State George Shultz (left) during an event at Stanford's Center for International Security and Cooperation.
"We do not wish to support the ability of totalitarian regimes to do this kind of thing when they are persecuting people for their free speech," Diffie said.
Diffie and Hellman are both currently affiliated with Stanford's Center for International Security and Cooperation (CISAC), where they regularly attend seminars on a diverse range of national security issues and mentor young pre- and postdoctoral fellows on issues of cyber security.
"What's great about both Whit Diffie and Marty Hellman is the way in which they contribute to the ongoing intellectual discourse of the Center," said CISAC co-director David Relman. "Both of them think broadly and deeply far outside the bounds of their formal training and the areas of accomplishment for which they are now being recognized by this prize."
Persis Drell, dean of Stanford's School of Engineering, said the award, and the work behind it, exemplified the caliber and tone of research for which the school's faculty are noted.
"Engineers want to have a positive impact on our world, and we are enormously proud to have Marty Hellman as an emeritus member of the Stanford Engineering faculty," Drell said.
Boneh, whose main area of research is applied cryptography, said Diffie and Hellman's work continued to inspire a new generation of cryptographers.
"Beyond the practical implications of the work, their groundbreaking 1976 paper 'New Directions in Cryptography' introduced new concepts and opened up new directions that were previously thought to be impossible," Boneh said.
"It introduced number theory into the realm of cryptography and launched an entire academic discipline to further develop the area of public-key cryptography. By now there are thousands of researchers and tens of thousands of research papers building on their work. The field of cryptography would be a pale image of what it is today without the work of Diffie and Hellman."
Every day, security engineers cope with a flow of cyber security incidents. While most incidents trigger routine reactions, others require orders of magnitude more effort to investigate and resolve. How security operation teams in organizations should tune their response to tame extreme events remains unclear. Analyzing the statistical properties of sixty thousand security events collected over six years at a large organization, we find that the distribution of costs induced by security incidents is in general highly skewed, following a power law tail distribution. However, this distribution of incident severity becomes less skewed over time, suggesting that the organization under scrutiny has managed to reduce the impact of large events. We illustrate this result with a case study focused on the empirical effects of full disk encryption on the severity of incidents involving lost or stolen devices.
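A power-law tail can be summarized by its tail index: the smaller the index, the heavier the tail and the more extreme the rare events. A minimal sketch of estimating it with the Hill estimator on synthetic Pareto-distributed data (hypothetical numbers; the study's actual dataset and methodology are not reproduced here):

```python
import math
import random

# Sketch: estimate a power-law tail index with the Hill estimator on
# synthetic Pareto-distributed "incident costs" (hypothetical data).
random.seed(0)
true_alpha = 1.5
costs = [random.paretovariate(true_alpha) for _ in range(60_000)]

def hill_estimator(samples, k):
    """Tail-index estimate from the k largest of the samples."""
    top = sorted(samples, reverse=True)[: k + 1]
    logs = [math.log(x) for x in top]
    # Average log-excess over the (k+1)-th largest value, inverted.
    return k / sum(logs[i] - logs[k] for i in range(k))

alpha_hat = hill_estimator(costs, k=2_000)
# For genuinely heavy-tailed data, alpha_hat stays small (here near 1.5);
# a lighter-tailed distribution would yield a much larger estimate.
```

A rising tail-index estimate over time would be one way to detect the reduced skewness the abstract describes.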
Despite the tempting similarities, the analogy between nuclear and cyber weapons is presently flawed. High-ranking officials who are using it as the basis for policies of deterrence in cyberspace are making a potentially dangerous misjudgment. Given the wide-open future of cyber warfare, it would make sense to expand the analogy to include other revolutionary military technologies to provide the conceptual flexibility necessary to confront the presently unforeseeable challenges that lie ahead in cyberspace.