Uncomfortable ground truths: Predictive analytics and national security

Executive summary

Individuals, societies, and governments all seek to reduce uncertainty in their affairs. In national security policy, such forewarning takes on even greater importance because of the high stakes and the lives on the line. It is no surprise, then, that forecasting is a longstanding tradition within both the intelligence community and the Department of Defense. More recently, the Department of State has also begun to seek its own oracle by establishing the Center for Analytics, its “first enterprise-level data and analytics hub,” which will use big data and the attendant analytic tools to “evaluate and refine foreign policy.”

This paper does not focus on predictive analytics systems that attempt to forecast naturally occurring phenomena. Rather, it draws attention to a potentially troublesome area in which AI systems attempt to predict social phenomena and behavior, particularly in the national security space. This is where policymakers should exercise caution. The paper examines human behavior in complex, dynamic, and highly uncertain systems. Using AI to predict ever more complex social phenomena, and then using those predictions as grounds for recommendations to senior leaders in national security, will become increasingly risky if we do not take stock of how these systems are built and the “knowledge” they produce. Senior leaders and decisionmakers may rely too heavily on these estimates without understanding their limitations.