Advances in artificial intelligence, machine learning, computing capacity, and big data analytics are creating exciting new possibilities for legal automation. At the same time, these changes pose serious risks to civil liberties and other societal interests. Yet existing scholarship is narrow, and a range of issues remain unsettled, most glaringly the lack of systematic empirical work on how legal automation affects people’s privacy and freedom. This article addresses that gap with an original empirical analysis of the Digital Millennium Copyright Act (DMCA), which today sits at the forefront of algorithmic law owing to its automated enforcement of copyright through DMCA notices at mass scale. With millions of such notices sent daily, this automation has been criticized for causing large-scale chilling effects online, yet few empirical studies have examined the issue in depth. This article does so through a mixed-method empirical study that synthesizes survey-based findings with an analysis of 500 Google Blogs and 500 Twitter accounts that have received DMCA notices. The findings offer several new insights: evidence of DMCA notice chilling effects across a range of activities; support for a privacy theory of automated-law chilling effects; evidence of differential impacts, including that women are disproportionately chilled and that legal information can mitigate chilling effects; and evidence on the effectiveness of automated DMCA notices compared with non-automated ones. The article also explores the implications of these findings for future forms of automated law and lays the foundation for a new theory of governance for personal legal automation.