Information Technology

In a March 2021 memo, Secretary of Defense Lloyd Austin outlined new mandates for the Department of Defense to modernize, encourage innovation and “invest smartly for the future” in order to address the dynamic threat landscape of the modern world. In the same memo, he acknowledged that this goal cannot be met without the cooperation of stakeholders across the board, including private industry and academic institutions.

In keeping with that priority, on April 5, 2022, Deputy Secretary of Defense Kathleen Hicks and her team joined a cross-departmental roundtable of faculty and students to hear more about Stanford's efforts to bring Silicon Valley-style innovation to projects at the Department of Defense and its interagency partners.

These students are working under the umbrella of the Gordian Knot Center for National Security Innovation (GKC), a new program at the Center for International Security and Cooperation (CISAC) at the Freeman Spogli Institute for International Studies (FSI). GKC aims to coordinate resources at Stanford, peer universities, and across Silicon Valley’s innovation ecosystem in order to provide cutting-edge national security education and train national security innovators.


This is a great place to be doing this. Here in Silicon Valley, there’s a huge amount of opportunity and ecosystem available across both Stanford and the broader research community and commercial sector.
Kathleen Hicks
Deputy Secretary of Defense

At the core of GKC is a series of classes and initiatives that combine STEM skills with policy know-how, designed to push students to leverage entrepreneurship and innovation to develop rapid, scalable solutions to national security issues. Students from both undergraduate and graduate programs, regardless of their prior experience in national defense, are encouraged to participate.

“We’re really trying to empower students to pursue national security-relevant work while they’re here at Stanford,” explains Joe Felter, GKC’s director, co-founder, and senior research scholar at CISAC. FSI and CISAC have deep roots in the kind of innovative, interdisciplinary approach to policy solutions that GKC is working to implement. Michael McFaul, FSI’s director, is a founding faculty member and principal investigator for GKC, and David Hoyt, the assistant director of GKC, is an alumnus of the CISAC honors program.

Results from GKC’s classes have been very encouraging so far. Working through "Hacking for Defense," a GKC-affiliated class taught out of the MS&E department, Jeff Jang, a new Defense Innovation Scholar and MBA student, showed how a rapid interview process and a focus on problem and customer discovery allowed his team to create enterprise software for United States Air Force (USAF) fleet management that has vastly improved efficiency, reduced errors and built better planning capabilities into the workflow. The project has earned numerous grants and awards, and the team has received signed letters of interest from 29 different USAF bases across the world.

In another GKC class, "Technology, Innovation, and Great Power Competition,” Abeer Dahiya and Youngjun Kwak, along with Mikk Raud, Dave Sprague and Miku Yamada — three students from FSI’s Ford Dorsey Master’s in International Policy program (MIP) — have been tackling the challenges involved in developing a domestic U.S. semiconductor strategy. They were among the student teams asked to present the results of their work to Dep. Sec. Hicks during her visit.

“Attending this class has been one of the highlights of my time at Stanford,” says Mikk Raud (MIP ‘22). “It’s been a great example of how important it is to run interdisciplinary courses and bring people from different fields together.”

He continues, “As a policy student, it was very insightful for me to learn from my peers from different programs, as well as make numerous visits to the engineering quad to speak to technical professors whom I otherwise would have never met. After meeting with and presenting to Deputy Secretary Hicks and hearing about the work other students are doing, it really hit home to me that the government does listen to students, and it really is possible that a small Stanford group project can eventually lead into significant changes and improvements of the highest levels of policy making.”

This kind of renewed interest in national security and defense tech among students is precisely what the Gordian Knot Center hopes to foster. Building an interconnected innovation workforce that can “think deeply, [and] act quickly” (the center’s motto) is a driving priority for GKC and its supporters.


We’re really trying to empower students to pursue national security-relevant work while they’re here at Stanford.
Joe Felter
GKC Director

The Department of Defense recognizes the value of this approach. In her remarks, Dep. Sec. Kathleen Hicks acknowledged that reshaping the culture and methodologies by which the DoD runs is as imperative as it is difficult.

“My life is a Gordian knot, day in and day out at the Defense Department,” she quipped. Speaking seriously, she reminded the audience of the tremendous driving power DoD has had in creating future-looking national security defenses.  “Because of its sophistication, diversity, and capacity to innovate, the U.S. Defense Industrial Base and vibrant innovation ecosystem remains the envy of the world,” Hicks emphasized. “Every day, people like you are designing, building, and producing the critical materials and technologies that ensure our armed forces have what they need.”

But she also recognized that the challenges facing the DoD are real and complex. “There are many barriers in front of the Department of Defense in terms of what it takes to operate in government and to make the kinds of shifts we need in order to have the agility to take advantage of opportunities and partner effectively.” She reiterated that one of her key priorities is to accelerate innovation adoption across DoD, including organizational structure, processes, culture, and people.

Partnerships with groups like the Gordian Knot Center are a key part of breaking down the barriers to innovation facing our national institutions and rebuilding them into new, more adaptable bridges forward. While the challenges facing the Department of Defense remain significant, the work of the students in GKC’s classes so far shows that progress is not only possible but can come quickly.


A visit from the Department of Defense’s deputy secretary gave the Gordian Knot Center a prime opportunity to showcase how its faculty and students are working to build an innovative workforce that can help solve the nation’s most pressing national security challenges.


This interview with CISAC Affiliate Christopher Painter was originally produced by Jen Kirby. The complete article is available at Vox.

Ransomware attacks against public and private systems are accelerating in frequency, scope and scale. In the latest incident, the ransomware group REvil has demanded $70 million to unlock the systems of the software company Kaseya, an attack that affects not only Kaseya but also many of the company’s clients simultaneously.

The REvil, JBS meatpacking and Colonial Pipeline attacks have abruptly raised the profile of ransomware from a malicious strand of criminality to a national security priority. These are issues that Christopher Painter, an affiliate at the Center for International Security and Cooperation (CISAC), has worked on at length during his tenures as a senior official at the Department of Justice, the FBI, the National Security Council and as the world's first top cyber diplomat at the State Department.

Jen Kirby, a reporter for Vox, interviewed Painter to discuss how cybercrimes are evolving and what governments should do to keep ransomware attacks from escalating geopolitical tensions online and off.



Jen Kirby:
I think a good place to start would be: What are “ransomware attacks”?

Christopher Painter:
It is largely criminal groups who are getting into computers through any number of potential vulnerabilities, and then they essentially lock the systems — they encrypt the data in a way that makes it impossible for you to see your files. And they demand ransom, they demand payment. In exchange for that payment, they will give you — or they claim, they don’t always do it — they claim they’ll give you the decryption keys, or the codes, that allow you to unlock your own files and have access to them again.

That is what traditionally we say is “ransomware.” That’s been going on for some time, but it’s gotten much more acute recently.

There is another half of that, which is that groups don’t just hold your files for ransom, they either leak or threaten to leak or expose your files and your information — your secrets and your emails, whatever you have — publicly, either in an attempt to embarrass you or to extort more money out of you, because you don’t want those things to happen. So it’s split now into two tracks, but they’re a combined method of getting money.

Jen Kirby:
We’ve recently had some high-profile ransomware attacks, including this recent REvil incident. Is it that we’re seeing a lot more of them, or they’re just bigger and bolder? How do you assess that ransomware attacks are becoming more acute?

Christopher Painter:
We’ve seen this going on for some time. I was one of the co-chairs of this Ransomware Task Force that issued a report recently. One of the reasons we did this report was we’re trying to call greater attention to this issue. Although governments and law enforcement were taking it seriously, it wasn’t being given the kind of national-level priority it deserved.

It was being treated as more of an ordinary cybercrime issue. Most governments’ attention is focused on big nation-state activity — like the SolarWinds hack [where suspected Russian government hackers breached US government departments], which are important, and we need to care about those. But we’re very worried about this, too.

It’s especially become more of an issue during the pandemic, when some of the ransomware actors were going after health care systems and health care providers. That combined with these big infrastructure attacks — the Colonial Pipeline clearly was one of them. Another one was the meat processing plants. Another one was hospital systems in Ireland. You also had the DC Police Department being victimized by ransomware. These things are very high-profile. When you’re lining up for gas because of a ransomware attack, and you can’t get your food because of a ransomware attack, that brings it home as a priority. And then, of course, you have what happened this past weekend. So ransomware has not abated, and it continues to get more serious and hit more organizations.



Christopher Painter explains why the emerging pattern of ransomware attacks needs to be addressed at a political level – both domestically and internationally – and not be treated solely as a criminal issue.

By Mary Duan

Does a tracking system that makes laws more enforceable actually improve society? Ahmed examines how technology firms and the Chinese government build databases and information-sharing procedures that monitor the behavior of individuals, corporations, legal institutions, and government representatives, with the end goal of building a society where those individuals and corporations follow the law.

Read the rest at Stanford HAI


This event is co-sponsored with the Cyber Policy Center and the Center for a New American Security.

* Please note all CISAC events are scheduled using the Pacific Time Zone

 

Seminar Recording: https://youtu.be/KaydMdIVtGc

 

About the Event: The United States is steadily losing ground in the race against China to pioneer the most important technologies of the 21st century. With technology a critical determinant of future military advantage, a key driver of economic prosperity, and a potent tool for the promotion of different models of governance, the stakes could not be higher. To compete, China is leveraging its formidable scale—whether measured in terms of research and development expenditures, data sets, scientists and engineers, venture capital, or the reach of its leading technology companies. The only way for the United States to tip the scales back in its favor is to deepen cooperation with allies. The global diffusion of innovation also places a premium on aligning U.S. and ally efforts to protect technology. Unless coordinated with allies, tougher U.S. investment screening and export control policies will feature major seams that Beijing can exploit.

In early June, join Stanford's Center for International Security and Cooperation (CISAC) and the Center for a New American Security (CNAS) for a unique virtual event that will feature three policy experts advancing concrete ideas for how the United States can enhance cooperation with allies around technology innovation and protection.

This webinar will be on the record and will include time for audience Q&A.

 

About the Speakers: 

Anja Manuel, Stanford Research Affiliate, CNAS Adjunct Senior Fellow, Partner at Rice, Hadley, Gates & Manuel LLC, and author with Pav Singh of Compete, Contest and Collaborate: How to Win the Technology Race with China.

 

Daniel Kliman, Senior Fellow and Director, CNAS Asia-Pacific Security Program, and co-author of a recent report, Forging an Alliance Innovation Base.

 

Martijn Rasser, Senior Fellow, CNAS Technology and National Security Program, and lead researcher on the Technology Alliance Project

By Herbert Lin

In a Lawfare post earlier this year, I questioned the wisdom of referring to cyber operations as psychological operations. These campaigns are the bread and butter of U.S. Cyber Command’s operational activities. My interest in this question stemmed from two recent articles, one on NPR and one in the Washington Post. The former discussed past activities of U.S. Cyber Command and the latter discussed possible future activities. Taken together, both articles used terms such as “information warfare,” “information operations,” “psychological operations” and “influence operations” to describe these activities.

I closed that post with a promise to comment on the doctrinal and conceptual confusions within Defense Department policy regarding all of these concepts. This post makes good on that promise.

 

Read the rest at Lawfare Blog


Seminar Recording: https://youtu.be/wKXawdBrCEs

 

About this Event: Are we still in the Nuclear Age? Is this the Age of AI? Are we entering the Age of Synthetic Biology? Technologies such as nuclear power, artificial intelligence, and synthetic biology are “epochal,” as in epoch-making: They redefine the world in which we live, introducing new uncertainties and risks, as well as new responsibilities—but for whom? World-changing technologies are inextricably political entities, affecting the distribution of power and resources within and between societies. However, despite decades of academic and practical experience with the political dimensions of technology, contemporary societies appear to be inadequately prepared to cope skillfully with the new worlds that their scientists and technologists are creating. Why? What lessons can be learned from existing epochal technologies that might help societies understand, evaluate, and direct their technical potentials and trajectories into the future?

Within the context of growing concern about national security threats that may emerge from germline genetic engineering, Greene will consider the cultivation of a “culture of responsibility” in synthetic biology labs. Polleri will examine a set of public controversies surrounding the role of nuclear power and the threat of radioactive contamination in post-Fukushima Japan. Garvey will map out the risk landscape surrounding AI systems and discuss strategic approaches to coping with uncertainty and disagreement in protecting against catastrophic technological risk.

 

About the Speakers:

Colin Garvey is a Postdoctoral Fellow at CISAC and the Stanford Institute for Human-Centered Artificial Intelligence. He studies the history and political economy of artificial intelligence (AI), among other things, with a comparative focus on Japan. He completed his PhD as a Humanities, Arts, and Social Sciences Fellow in the Science and Technology Studies Department at Rensselaer Polytechnic Institute (RPI). His dissertation, “Averting AI Catastrophe, Together: On the Democratic Governance of Epochal Technologies,” challenges utopian/dystopian thinking about AI by explaining how more democratic governance of the technology is necessary not only to avert catastrophe but also to steer AI R&D more safely, fairly, and wisely. He won Best Early Career Paper at the 2017 meeting of the Society for the History of Technology for “Broken Promises & Empty Threats: The Evolution of AI in America, 1956-1996.” His research article on the history and political economy of Japanese AI, “An Alternative to Neoliberal Modernity: The ‘Threat’ of the Japanese Fifth Generation Computer Systems Project,” will be published in a forthcoming special issue of Pacific Historical Review. His work has been supported by the National Science Foundation (NSF). In addition to an MS in STS from RPI, Colin double-majored in Japanese and Media Studies at Vassar College. Before starting graduate school, Colin spent several years teaching in Japan, where he became a Zen Buddhist monk. Colin is fluent in Japanese and freelances as a translator of Japanese books and scientific articles.

 

Daniel Greene is a Postdoctoral Fellow at CISAC, where he works with Dr. Megan Palmer on strategies for risk governance in biotechnology. He uses computational social science methods to identify factors that influence the decisions of biology labs to engage in potentially risky research. Daniel completed a PhD at the Stanford University Graduate School of Education, where he worked with Prof. Carol Dweck to develop and test social-psychological interventions to improve student motivation at scale. His dissertation identified and influenced novel psychological constructs for motivating unemployed and underemployed adults to pursue job-skill training. Outside of academia, Daniel worked for five years as a data scientist and product developer at the Project for Education Research That Scales, a nonprofit that develops resources and infrastructure for disseminating best practices from education research. He also holds a BA in Cognitive Science (Honors) from Rutgers University. Daniel's work has been supported by the Open Philanthropy Project, the Carnegie Foundation for the Advancement of Teaching, the Gates Foundation, the Stanford Digital Learning Forum, and an Amir Lopatin Fellowship.

 

Dr. Maxime Polleri is a MacArthur Nuclear Security Postdoctoral Fellow at the Center for International Security and Cooperation. An anthropologist of science and technology, he examines the governance of risk in the aftermath of technological disasters involving environmental contamination. His current research focuses on Japanese public and state responses to the release of radioactive contamination after the 2011 Fukushima nuclear disaster. He has published articles and op-eds in Social Studies of Science, American Ethnologist, Anthropology Today, Anthropology Now, Medical Anthropology Quarterly, Second Spear, Somatosphere, Bulletin of the Atomic Scientists, and The Diplomat.


From genome editing to “hacking” the microbiome, advances in the life sciences and the associated technological revolution have already altered the biosecurity landscape, and will continue to do so. What does this new landscape look like, and how can policymakers and other stakeholders navigate this space? A new report by Stanford scholars David Relman and Megan Palmer, along with George Mason University’s Jesse Kirkpatrick and Greg Koblentz, assesses this emerging biosecurity landscape to help answer these questions and illustrates gaps in governance and regulation through the use of scenarios.

The report—the product of two years of workshops, issue briefs, and white papers authored by different participants—involved people from different organizations and backgrounds ranging from life sciences and medicine to social science and ethics. “The project process was just as important as the product,” said Palmer. “It was a truly interdisciplinary effort.”

Genome editing, including CRISPR, is disruptive to the biosecurity landscape, and it serves as an illustration of more general trends in the evolving landscape, the authors write. CRISPR technology does not exist in a vacuum; rather, it is enabled by, represents, and gives rise to a suite of technologies that offer potential benefits but also require new approaches to adaptive policymaking and governance.

Scenarios illustrating governance gaps in the report include:

  • A reckless CRISPR user who develops and markets a probiotic created with genome editing that has serious unanticipated effects for consumers;
  • An agricultural biotechnology firm conducting dual-use genome editing research that lies outside current oversight but nonetheless could have negative consequences for human health;
  • An intentional release of a gene drive organism from a lab that, while causing limited physical harm, feeds a state-based misinformation campaign with large economic impacts;
  • An accidental release of a gene drive organism due to lack of awareness and uncertainty about the risk classifications and protocols for handling new technologies;
  • A terrorist group exploiting commercial firms that lack strong customer and order screening in order to weaponize a nonpathogenic bacterium with genome editing;
  • A state-sponsored program to develop biological weapons for new strategic uses, including covert assassination, using largely publicly available research.

In each of these examples, the researchers play out a hypothetical situation exposing a number of security and governance gaps for policymakers and other stakeholders to address.

In the report, the authors conclude that genome editing has tremendous potential benefits and economic impacts. The authors note that the market for genome editing is expected to exceed $3.5 billion by 2019, but a security incident, safety lapse, reckless misadventure, or significant regulatory uncertainty could hurt growth. Increased reliance on the “bio-economy,” they write, means biosecurity is increasingly critical to economic security as well as human health.

Other key takeaways:

Genome editing has the potential to improve the human condition. Genome editing is poised to make major beneficial contributions to basic research, medicine, public health, agriculture, and manufacturing that could reduce suffering, strengthen food security, and protect the environment.

Genome editing is disruptive to the biosecurity landscape. The threat landscape has expanded, and continues to expand, to include new means of disrupting or manipulating biological systems and processes in humans, plants, and animals. Genome editing could be used to create new types of biological weapons. Further, technical advances will make misuse easier and more widespread.

CRISPR illuminates broader trends and the challenges of an evolving security landscape. An approach to biosecurity that accounts for these trends, and encompasses risks posed by deliberate, accidental, and reckless misuse, can help address the complex and evolving security landscape.

Technology must be taken seriously.  A thorough, informed, and accessible analysis of any emerging technology is crucial to considering the impact that it may have on the security landscape.

Key stakeholders must be engaged. Stakeholders in the genome editing field encompass a more diverse array of actors than those that have been involved so far in biosecurity discussions. These stakeholders range from international organizations to government agencies to universities, companies, lay communities writ large, and scientists.

Applied research is needed to create and implement innovative and effective policies. Applied research is necessary to continue the process of modifying existing governance measures, and testing and adapting new ones, as new genome editing technologies and applications are developed, new stakeholders emerge, and new pathways for misuse are identified.

Download the executive summary and full report at editingbiosecurity.org.

 


Nations around the world recognize cybersecurity as a critical issue for public policy. They are concerned that their adversaries could conduct cyberattacks against their interests—damaging their military forces, their economies, and their political processes. Thus, their cybersecurity efforts have been devoted largely to protecting important information technology systems and networks against such attacks. Recognizing this point, Oxford Dictionaries added a new word to its lexicon in 2013, defining cybersecurity as “the state of being protected against the criminal or unauthorized use of electronic data, or the measures taken to achieve this.” Read more.

Publication Type: Journal Articles
Journal Publisher: Oxford Academic
Authors: Amy Zegart and Herbert Lin

Former Facebook chief security officer Alex Stamos joins the Hoover Institution and the Center for International Security and Cooperation at the Freeman Spogli Institute for International Studies, bringing a rich real-world perspective on cybersecurity and technology policy.

 

Stanford University’s Freeman Spogli Institute for International Studies and the Hoover Institution announced today the appointment of Alex Stamos as a William J. Perry Fellow at the Center for International Security and Cooperation (CISAC), Cyber Initiative fellow, and Hoover visiting scholar.

Stamos, a computer security expert and the outgoing chief security officer at Facebook, will engage in teaching, research and policy engagement through CISAC and the Hoover Institution's Cyber Policy Program as well as the Stanford Cyber Initiative. Drawing on his considerable experience in the private sector, he will teach a graduate level course about the basics of cyber offense and defense to students without technical backgrounds as part of the Ford Dorsey Master’s in International Policy program at the Freeman Spogli Institute, which houses CISAC.

"With our country facing unprecedented challenges in digital interference with the democratic process and numerous other cybersecurity issues, Alex’s experience and perspective are a welcome addition to our group of fellows,” said Freeman Spogli Institute Director Michael McFaul.

In his role, Stamos will also engage in research projects aimed at public policy initiatives as a member of the Faculty Working Group on Information Warfare. The working group will develop, discuss and test concepts and theories about information warfare, as well as conduct applied research on countermeasures to identify and combat information warfare. It will also conduct policy outreach through briefings to government officials, public seminars and workshops, Congressional testimony, online and traditional media appearances, op-eds and other forms of public education on combating information warfare.

“We are thrilled that Alex is devoting even more energy to our cyber efforts,” said CISAC Co-Director Amy Zegart. “He's been a vital partner to the Stanford cyber policy program for several years, and his Stanford ‘hack lab,’ which he piloted in Spring 2018, is a cutting-edge class to train students in our new master’s cyber policy track. He brings extraordinary skills and a unique perspective that will enrich our classes, research, and policy programs.”

Over the past three years, the Hoover Institution and CISAC have jointly developed the Stanford Cyber Policy Program. Its mission is to solve the most important international cyber policy challenges by conducting policy-driven research across disciplines, serving as a trusted convener across sectors, and teaching the next generation. The program is led by Dr. Amy Zegart and Dr. Herbert Lin. Stamos has served on the program's advisory board since its inception.

“We look forward to working with Alex on some of the key cyber issues facing our world today," said Tom Gilligan, director of the Hoover Institution. "He brings tremendous experience and perspective that will contribute to Hoover’s important research addressing our nation’s cyber security issues.”

“I am excited to join Stanford and for the opportunity to share my knowledge and expertise with a new generation of students--and for the opportunity to learn from colleagues and students across many disciplines at the university,” said Stamos.

A graduate of the University of California, Berkeley, Stamos studied electrical engineering and computer science. He later co-founded a successful security consultancy, iSEC Partners, and in 2014 he joined Yahoo as its chief information security officer. Stamos joined Facebook as chief security officer in June 2015, where he led Facebook’s internal investigation into targeted election-related influence campaigns via the social media platform.

###

About CISAC: Founded in 1983, CISAC has built on its research strengths to better understand an increasingly complex international environment. It is part of Stanford's Freeman Spogli Institute for International Studies (FSI). CISAC’s mission is to generate knowledge to build a safer world through teaching and inspiring the next generation of security specialists, conducting innovative research on security issues across the social and natural sciences, and communicating our findings and recommendations to policymakers and the broader public. 

About the Hoover Institution: The Hoover Institution, Stanford University, is a public policy research center devoted to the advanced study of economics, politics, history, and political economy—both domestic and foreign—as well as international affairs. With its eminent scholars and world-renowned Library & Archives, the Hoover Institution seeks to improve the human condition by advancing ideas that promote economic opportunity and prosperity and secure and safeguard peace for America and all mankind.

About the Stanford Cyber Initiative:  Working across disciplines, the Stanford Cyber Initiative aims to understand how technology affects security, governance, and the future of work.

Media contact: Katy Gabel, Center for International Security and Cooperation: 650-725-6488, kgabel@stanford.edu

 


In a world complicated by terrorism, cyber threats and political instability, the private sector has to prepare for the unexpected. Amy Zegart, CISAC co-director, the Hoover Institution’s Davies Family Senior Fellow, and co-author (along with Condoleezza Rice) of Political Risk: How Businesses and Organizations Can Anticipate Global Insecurity, explains lessons learned in keeping cargo planes moving, hotel guests protected – and possibly coffee customers better served.
