Category: Research

  • Failure Studies

    Every once in a while, an established social scientist proclaims the need for a new interdisciplinary approach to solving the world’s most pressing problems. This week’s entry comes from Tyler Cowen, writing with Patrick Collison. Their objective? Progress Studies.

    By “progress,” we mean the combination of economic, technological, scientific, cultural, and organizational advancement that has transformed our lives and raised standards of living over the past couple of centuries. For a number of reasons, there is no broad-based intellectual movement focused on understanding the dynamics of progress, or targeting the deeper goal of speeding it up. We believe that it deserves a dedicated field of study. We suggest inaugurating the discipline of “Progress Studies.”

    There is a lot going on here. The simple response is the tired “yes, Tyler” reaction that I suppose most people currently employed in disciplines like political science, economics, public policy, sociology, operations research, business, history, classics, etc. will have upon first reading this Progress Studies manifesto. The big questions they have identified—why did great civilizations emerge when they did and where they did? Why did the Industrial Revolution start in northwest England? Why is Silicon Valley in California? How do you train brilliant people? What incentives are appropriate for joint effort?—have been asked and answered across the disciplines, literally for centuries. We do not lack for theories or evidence on these questions.

    Maybe a slightly deeper response would be to focus on their call for a discipline of Progress Studies (if not departments of Progress Studies). What would a discipline do that the eclectic mix of interdisciplinary approaches—which is, of course, the status quo—does not? The answer is not clear, because Collison and Cowen probably haven’t thought seriously about what it means to be a discipline. Here is a clue: the word discipline ought to be taken rather literally, as a way of thinking that “disciplines” inquiry and exploration. One does not generate a discipline like economics by saying “somebody should study how markets work! Our discipline will study how markets work.” One creates a discipline by specifying a set of tools, methods, or procedures through which to study markets. Samuelson, not Smith, created the modern discipline of economics as we know it. (My guess is that Collison and Cowen don’t really mean a discipline, just something more like interdisciplinary centers or programs.*)

    But a third response might be to question the very premise that we need to study progress. My only-slightly tongue-in-cheek response is that the most pressing task is not how to create progress, but rather how to prevent failure. By “failure,” I mean economic, social, or political forces that destroy the social bases of human flourishing. The question of why Rome fell is at least as interesting as how Rome rose; the problem of how to stop global warming is more important than the problem of generating another Silicon Valley in Singapore. I would suggest we inaugurate instead the discipline of Failure Studies.**

    There is, after all, a school of thought that believes that predicting, organizing, or incentivizing radical innovations that transform the human condition is impossible. That school does believe, of course, that we can try to set up rules to prevent us from stagnating or destroying what we’ve created.

    NOTES

    * Read: “fiefdoms.”
    ** Or: Centers of Failure Studies.

  • Imbens on DAGs, and the Pedagogy of Causal Inference

    Guido Imbens has an interesting new essay on the graphical causal modeling approach pioneered by Judea Pearl, which uses directed acyclic graphs (DAGs) to understand how to infer causal relationships from data. Imbens is a pioneer in applying the potential outcomes (POs) framework in economics to study causal questions. Although authors such as Morgan and Winship see DAGs and POs as complementary, many proponents of DAGs hold that DAGs are superior to POs, on grounds such as expository clarity and generality. This debate can get pretty heated, with allegations that economists (in particular) are stupid, arrogant, or somehow afraid of DAGs.

    Imbens’s new essay presents an alternative view, one that is charitable to DAGs but does not concede the point that they are generally superior to POs. His discussion of questions of manipulability (essential in POs, not so in DAGs) raises familiar questions from the philosophy of causality that go back to Lewis (1973) on counterfactuals and causation. So too with simultaneous causation. But his discussion of techniques such as instrumental variables from the DAG-vs-PO perspectives is, to my knowledge, novel. He writes:

    some of the key assumptions in instrumental variables settings are not naturally captured in DAGs, whereas they are easily articulated in the PO framework. This extends to other shape restrictions that play an important role in economic theory. Second, one of the modern results in instrumental variables settings, the identification of the Local Average Treatment Effect (LATE, Imbens and Angrist [1994], Angrist et al. [1996]) is not easily derived in a DAG approach

    This is important because it directly challenges the argument that “DAGs are better at whatever POs do” (the DAG > PO argument). It also threatens the more general argument that “DAGs can do whatever POs can do” (the DAG == PO argument).
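    The LATE result quoted above is easy to see in a simulation. The sketch below is my own toy illustration, not from Imbens’s essay: with a binary instrument, heterogeneous treatment effects across principal strata (always-takers, never-takers, compliers), and no defiers, the Wald ratio recovers the average effect among compliers only—not the population average effect—a result usually derived in the PO framework (Imbens and Angrist 1994). All variable names and numbers are illustrative.

    ```python
    # LATE sketch: Wald ratio identifies the complier-average effect.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 500_000

    z = rng.integers(0, 2, size=n)  # binary instrument
    # Principal strata; monotonicity holds (no defiers).
    stratum = rng.choice(["always", "never", "complier"],
                         size=n, p=[0.2, 0.3, 0.5])
    # Treatment uptake D(z): always-takers take it, never-takers don't,
    # compliers follow the instrument.
    d = np.where(stratum == "always", 1,
                 np.where(stratum == "never", 0, z))

    # Heterogeneous effects: 3.0 for always-takers, 1.0 for never-takers,
    # 2.0 for compliers (population mean effect is 1.9).
    effect = np.select([stratum == "always", stratum == "never"],
                       [3.0, 1.0], default=2.0)
    y = effect * d + rng.normal(size=n)

    # Wald estimator: reduced-form difference over first-stage difference.
    wald = (y[z == 1].mean() - y[z == 0].mean()) / \
           (d[z == 1].mean() - d[z == 0].mean())

    print(f"Wald estimate ≈ {wald:.2f}")  # ≈ 2.0, the complier effect
    ```

    Note that the Wald ratio lands on 2.0 (the complier effect), not 1.9 (the population mean effect)—exactly the distinction that the principal-strata machinery of POs makes visible and that a bare DAG does not.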

    Imbens’s conclusions resonate with my own from my experiences teaching causal inference to beginners. I insist that DAGs are wonderfully helpful for teaching causal inference in a context that focuses on identification rather than estimation and inference. They are, in fact, absolutely essential for illustrating concepts like collider bias, which I cannot even express properly without a DAG. The benefit of DAGs is that they allow students to understand the logic of causality without requiring statistical foundations.
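    Collider bias itself is also easy to demonstrate numerically, even if the concept is hard to express without a DAG. A minimal sketch (my own example, with illustrative variable names): X and Y are independent causes of a common effect C, so the graph is X → C ← Y; conditioning on the collider C induces a spurious association between X and Y.

    ```python
    # Collider bias: selecting on a common effect induces a spurious
    # association between its independent causes.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    x = rng.normal(size=n)           # cause 1
    y = rng.normal(size=n)           # cause 2, independent of x
    c = x + y + rng.normal(size=n)   # collider: X -> C <- Y

    # Unconditionally, X and Y are uncorrelated.
    r_marginal = np.corrcoef(x, y)[0, 1]

    # Condition on the collider by selecting a stratum of C.
    sel = c > 1.0
    r_conditional = np.corrcoef(x[sel], y[sel])[0, 1]

    print(f"corr(X, Y)         = {r_marginal:+.3f}")     # ~ 0
    print(f"corr(X, Y | C > 1) = {r_conditional:+.3f}")  # clearly negative
    ```

    Within the selected stratum, a high X “explains away” the need for a high Y, producing the negative conditional correlation—the explaining-away logic the DAG makes immediately visible.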

    But in my experience, and again consistent with Imbens’s argument, this does not extend to instrumental variables and related identification strategies, largely developed in the field of applied economics. Here, I find DAGs often unhelpful. Take Imbens’s Figure 3:

    I remain unable to articulate to my own satisfaction why this DAG is a solution to the problem of identifying the effect of X on Y without either invoking a range of additional concepts drawn from the PO framework or making other assumptions, as in Morgan and Winship, pp. 297-299.* And certainly I cannot explain how instrumental variables work or what effect they identify without such concepts drawn from POs. Even Pearl cites Imbens when presenting what one can learn from instrumental variables in his canonical text on DAGs (see Causality, Second edition, p. 90), or cites economists who managed to derive the IV formula without the assistance of DAGs back in the 1920s (see pp. 152-153), although a more typical expression of the DAG > PO argument is on pp. 247-248, especially note 29.

    Imbens notes that others working in the graphical causal modeling tradition have recently proposed DAGs for instrumental variables designs, e.g. Steiner et al. (2017).

    Imbens “do[es] not find these DAGs particularly illuminating” (challenging the DAG > PO argument). I will take a stronger position: I suspect that Steiner et al. could not have derived their Figure 5 without knowledge of the PO framework. This challenges the DAG == PO argument. Similar arguments about the expository limits of DAGs are probably also valid for designs such as regression discontinuity and others, although the instrumental variable case is the one that springs to mind given personal experience.

    The conclusion I draw from this reflection on Imbens on DAGs is that students need many tools. DAGs are a remarkable tool for nonparametric identification, and for some pedagogical purposes they are irreplaceable. But not for all, and at least some of the conceptual insights derived from POs are equally indispensable.

    NOTE

    * I will note that once students have a firm command of instrumental variables, DAGs are helpful for representing the assumptions that are implied by such designs. They are also very useful for explicating how “conditional” instrumental variables designs work, such as Morgan and Winship’s Figure 9.2(b).