Abstract
The GDPR’s “right to explanation” has sparked a global debate of profound social and economic significance. But from a practical perspective, the debate’s participants have gotten ahead of themselves. In their search for a revolutionary new data protection right within the provisions of a single chapter of the GDPR, many prominent contributors to the debate have lost sight of the most revolutionary change ushered in by the Regulation: the sweeping new enforcement powers given to European data protection authorities (“DPAs”) by Chapters 6 and 8. Unlike the 1995 Data Protection Directive that it will replace, the GDPR grants DPAs potent new investigatory, advisory, corrective, and punitive powers that will render them de facto interpretive authorities of the Regulation’s controversial “right to explanation.” Now that the DPAs responsible for enforcing the right have officially weighed in, this Article argues that at least one matter of fierce public debate can be laid to rest: the GDPR provides an unambiguous “right to explanation” with sweeping legal implications for the design, prototyping, field testing, and deployment of automated data processing systems. While the protections enshrined in the right may not mandate transparency in the form of a complete individualized explanation, a holistic understanding of the DPAs’ interpretation of the Regulation reveals that the right’s true power derives from its synergistic effects when combined with the algorithmic auditing and “data protection by design” methodologies codified in the Regulation’s subsequent chapters. Accordingly, this Article predicts that algorithmic auditing and “data protection by design” practices will likely become the new gold standard for enterprises deploying machine learning systems both inside and outside the EU.