The increasing prevalence of ever-sophisticated technology permits machines to stand in for or augment humans in a growing number of contexts. The questions of whether, when, and how the so-called actions of machines can and should result in legal liability thus will also become more practically pressing. One important set of questions that the law will inevitably need to confront is whether machines can have mental states, or—at least—something sufficiently like mental states for the purposes of the law. This is because a number of areas of law have explicit or implicit mental state requirements for the incurrence of legal liability. Thus, in these contexts, whether machines can incur legal liability turns on whether a machine can operate with the requisite mental state. Consider the example of copyright law. Given the long history of mechanical copying, courts have already faced the question of whether a machine making a copy can have the mental states required for liability. They have often answered with a resounding, unconditional “no.” But this Essay seeks to challenge any generalization that machines cannot operate with a mental state in the eyes of the law. Taking lessons from philosophical thinking about minds and machines—in particular, the conceptual distinction between “conscious” and “functional” properties of the mind—this Essay uses copyright’s volitional act requirement as a case study to demonstrate that certain legal mental state requirements might seek to track only the functional properties of the states in question, properties that machines can, in principle, possess. This Essay concludes by considering how to move toward a more general framework for evaluating the question of machine mental states for legal purposes.
With the increasing prevalence of ever more sophisticated technology—which permits machines to stand in for or augment humans in a growing number of contexts—the questions of whether, when, and how the so-called actions of machines can and should result in legal liability will become more practically pressing. 1 See infra section I.B. Although the law has yet to fully grapple with questions such as whether machines are (or can be) sufficiently humanlike to be the subjects of law, philosophers have long contemplated the nature of machines. 2 See infra Part III. Philosophers have considered, for instance, whether human cognition is fundamentally computation—such that it is in principle possible for future artificial intelligences (AI) to possess the properties of human minds, including consciousness, semantic understanding, intention, and even moral responsibility—or whether humans and machines are instead fundamentally different, no matter how sophisticated AI becomes. 3 See infra Part III. It is thus unsurprising that, in thinking through how the law should accommodate and govern an increasingly AI-filled world, the lessons and frameworks to be gleaned from these philosophical discussions will have undeniable relevance.
One important set of questions that the law will inevitably need to confront is whether machines can have mental states, or—at least—something sufficiently like mental states for the purposes of the law. This is because a wide range of areas of law have explicit or implicit mental state requirements for the incurrence of legal liability. 4 See infra section I.A. Consider, for example, questions of intent and recklessness versus negligence in tort law; mens rea and actus reus in criminal law; offer and acceptance in contract law; and, as we will see, infringement and authorship in copyright law. In each of these contexts, the law either implicitly or explicitly asks for the presence of some particular mental state on the part of the actors in question. Whether the operations of machines can incur legal liability—and what kind of liability they can incur—would thus often seem to turn on whether a machine is regarded as operating with the mental state required.
In some contexts, the decision already seems to have been made that machines can never possess the mental states required for liability. Consider copyright law’s volitional act requirement for infringement. Copyright law has generally claimed that machines making copies of protected material lack the requisite volition for this conduct to give rise to legal liability on the part of those responsible for the machine, even when the machine has been designed to make copies, often of copyrighted works. 5 See infra Part II. In other contexts, such as criminal and tort law, the question of machines’ capacity for mental states remains open and underexplored. 6 See generally Mark A. Geistfeld, A Roadmap for Autonomous Vehicles: State Tort Liability, Automobile Insurance, and Federal Safety Regulation, 105 Calif. L. Rev. 1611 (2017) (tort liability); Gabriel Hallevy, “I, Robot—I, Criminal”—When Science Fiction Becomes Reality: Legal Liability of AI Robots Committing Criminal Offenses, 22 Syracuse Sci. & Tech. L. Rep. 1 (2010) (criminal law); Ignatius Michael Ingles, Note, Regulating Religious Robots: Free Exercise and RFRA in the Time of Superintelligent Artificial Intelligence, 105 Geo. L.J. 507, 516 n.67 (2017) (criminal law).
This Essay aims to challenge any hasty and blanket generalization that machines cannot have mental states as a legal matter, drawing on philosophical thinking surrounding mental states and using copyright’s volitional act requirement as a case study. In so doing, this Essay concludes that—as a matter of copyright doctrine—a copying technology might be sufficiently “volitional” for the technology provider to be held directly liable for the technology’s so-called actions in producing copies; and—as a matter of general legal theory—machines in some contexts might be capable of being sufficiently “mental” to count as agents of the humans behind them, depending on the aims of the area of law in question. 7 See infra Parts IV–V. This conclusion is thus not merely of philosophical interest but one with practical implications for determinations of legal liability. In the context of copyright law, this Essay’s chosen case study, this conclusion has implications for who is and is not directly accountable for the copying of protected material and for the law’s ability to effectuate its goals of encouraging the creation and dissemination of expressive works.
To mount this Essay’s challenge, after giving an overview of mental states in the law and the puzzle raised by technological advancement in Part I, as well as the specific challenges posed by copyright law in Part II, Part III of the Essay recounts two of the most influential philosophical discussions on minds and machines, and the resulting theoretical distinction between the conscious and functional properties of mental states. Using this distinction as a framework, this Essay argues that it is an open question whether the law’s mental state requirements seek to track the conscious or merely functional properties of the particular mental state in question, 8 Note that there are arguably nonconscious mental states aside from functional mental states, such as intentional and computational states. In this way, the distinction on which we focus—consciousness versus functionality—is not exhaustive, as one could similarly ask whether the law cares about intentionality, computation, and so forth. Nonetheless, the Essay focuses on consciousness versus functionality not only for the sake of simplicity, but also because this distinction is plausibly the most important one for legal purposes. The Essay otherwise leaves the question of whether intentionality (or other nonconscious, nonfunctional properties of mental states) should ever matter to the law for exploration in future work. an analysis that depends on the ultimate aims of the relevant area of law. Part IV then defends the view that copyright law’s volitional act requirement might be interested in merely functional properties, which could—in principle—be replicated by machines. Next, Part V considers which functional properties copyright law might seek to track and what a machine might have to look like to be “functionally volitional” under copyright law, to count as the technology provider’s agent, and thereby to give rise to direct liability.
These relevant functional properties include the ability to pause and analyze the nature of the work in question before “choosing” to undertake an act of copying, an act that might expose the technology provider to liability. On the basis of this framework, this Essay concludes that machines with the appropriate functionality might satisfy copyright law’s volitional act requirement, thus forming the basis for holding technology providers directly liable for infringement. Finally, generalizing this Essay’s framework, Part VI offers preliminary thoughts on machines and mental state requirements in the contrasting contexts of criminal law and copyright authorship doctrine, as well as a general hypothesis regarding when the law is interested in conscious versus merely functional properties of the mental states in question.