Volume 28 - Symposium Issue

Privacy Paradox in Digital Service Taxation

Authors: 
Zhaoyi Li
Volume: 
28
Issue: 
Spring
Starting Page Number: 
181
Year: 
2026
Preview: 
As the digital economy expands, tax jurisdictions face mounting challenges, as taxable activities like online shopping and advertising frequently extend beyond national borders. This shift has led to the emergence of the European Union’s Digital Services Tax (“DST”). While current discussions on this topic focus on the optimal methods and equitable distribution of taxing rights among countries, they overlook the user privacy issues inherent in taxes like the DST. In light of the ongoing debate over whether the U.S. should tax digital transactions, this Article examines the legal framework of the DST and explores its implications from a data privacy perspective. By analyzing the consequences of taxing the collection, use, and security of consumer data in the digital economy, this Article illustrates the broader effects of digital taxes on privacy rights and compliance. While the DST offers fiscal benefits, it simultaneously raises significant privacy concerns that must be addressed to safeguard consumer interests in an increasingly data-driven marketplace. To resolve this tension, this Article advances a privacy-centric model for the DST, integrating privacy protection measures directly into the DST’s structure and objectives. This comprehensive approach underscores the need for a harmonized framework that balances the economic goals of taxation with the protection of individual privacy, fostering a fairer and more equitable digital ecosystem for all stakeholders.

Disciplining Mechanisms: Governing Data Markets with Competition and Regulation

Authors: 
Peter Ormerod
Volume: 
28
Issue: 
Spring
Starting Page Number: 
308
Year: 
2026
Preview: 
The past decade has witnessed conceptual renewals in both competition law and information privacy law. These regulatory movements—Neo-Brandeis antitrust and structural data governance—share the objective of recalibrating the balance of power between individuals and the massive data-processing firms that now dominate modern life. Despite their common ends, policy interventions drawn from these schools of thought can work at cross purposes: competitive pressure can induce data exploitation, and privacy rules tend to benefit the largest firms. This Essay exposes the friction in their relationship and offers guidance on how to mediate their tension. Competition policy alone will prove ineffective at indirectly disciplining most data activities, so policymakers should largely favor the structural data-governance approach to address the information economy’s pathologies. But pro-competition policies will nevertheless be essential to reining in firms that are too big to meaningfully regulate and may also prove helpful in solving certain discrete data-processing problems. Policymakers today have two distinct mechanisms for disciplining firms’ data-driven activities. This Essay describes them, exposes their contours, and offers those policymakers guidance on how best to deploy them.

Governing Toxic Data

Authors: 
Diane Lourdes Dick, Joseph W. Yockey
Volume: 
28
Issue: 
Spring
Starting Page Number: 
279
Year: 
2026
Preview: 
Companies increasingly boast to the public markets about their massive digital transformations and the value of their extraordinary customer insights. In this way, data is emerging as a crown jewel asset with unique corporate-governance implications under state and federal laws. For those firms touting data and other digital resources as among their most valuable assets, compliance with evolving cybersecurity and privacy laws, regulations, customer expectations, digital norms, and best practices will be the key to unlocking this value. By the same token, when compliance and policy gaps become pronounced, data and other digital assets can become toxic; not only will they fail to serve as drivers of corporate value, but they may generate significant liabilities. This category of “toxic” data can cause firms to incur massive litigation costs and regulatory fines and penalties, as well as major reputational damage that can destroy brand equity and erode market share. In light of recent signals by the U.S. Securities and Exchange Commission that it intends to focus on these risks, companies and their advisors must now anticipate that well-funded teams of regulators will aggressively monitor corporate disclosures and investigate compliance in an effort to carry out their mission to protect investors and maintain fair, orderly, and efficient markets. In response to this evolutionary enforcement moment, this Article provides the first comprehensive review of the corporate governance of data and other digital assets under state business-entities laws and the federal securities laws, paying special attention to evolving fiduciary responsibilities to monitor, oversee, and report on the risks associated with what we call toxic data.

Information About Data

Authors: 
Mihailis E. Diamantis, Chen Sun, Rishab Nithyanand
Volume: 
28
Issue: 
Spring
Starting Page Number: 
238
Year: 
2026
Preview: 
Deterrence-based approaches to privacy enforcement rely on an overlooked and often false premise—that firms know what their own data practices are. There is good reason for skepticism because operational information tends to become siloed within firm subunits. Information about data management is no different. Firms may neglect to memorialize relevant information in reports for internal distribution. And even if such reports are generated, they may not be presented in a manner that is intelligible across firm constituencies. This paper looks outside of privacy law for a solution. Recent scholarship on securities disclosures has highlighted the variety of goals that disclosures serve. While the traditional purpose of financial disclosures is to inform outside investors, the process of preparing disclosures has beneficial internal effects too. It forces firms to study their own financial health and ensures that relevant corporate units are apprised of the results. Mandatory disclosures about corporate data practices could have similarly beneficial effects. While some states already require firms to publish generic information about data practices to consumers, these disclosures lack basic attributes that make financial disclosures effective—they lack detail, no human signs them, and they are not filed with any state authority. Securities-style disclosures hold more promise. By carefully tailoring the content, format, and required signatories of data practice disclosures, authorities could force firms to generate, translate, and internally propagate important information about data. Firms that actually know what they are doing with data are more susceptible to efforts to deter data misuse.

The Physicist and The Sheep Farmer

Authors: 
Ari Ezra Waldman
Volume: 
28
Issue: 
Spring
Starting Page Number: 
213
Year: 
2026
Preview: 
This Essay explores two historical events—the exposure of the Daigo Fukuryū Maru (Lucky Dragon #5) to nuclear fallout from a U.S. thermonuclear bomb test in the Pacific Ocean and the contamination of the Cumbrian Fells in the United Kingdom by fallout from the Chernobyl nuclear disaster—to better understand what, if anything, the history of technoscientific advising in policymaking contexts can teach scholars about technical expertise in policymaking today. The Essay then teases out three lessons. First, expertise in political contexts is never unmediated, meaning that technical expertise should be understood as filtered through social, political, economic, and other kinds of biases. Second, informational technologies are multifaceted sociotechnical systems, such that privileging one form of expertise in decision-making is a recipe for skewed policymaking. Third, sociotechnical systems operating in the physical world are subject to acute and irresolvable indeterminacies that make the kind of reduction to numbers preferred by technical expertise inappropriate. Sociolegal scholars working in law and technology should consider these lessons in context.

Public Utility for What? Governing AI Datastructures

Authors: 
Julie E. Cohen
Volume: 
28
Issue: 
Spring
Starting Page Number: 
135
Year: 
2026
Preview: 
Both in the U.S. and in Europe, initiatives for AI governance have focused principally on identifying and mitigating the risks created by AI models and their downstream uses rather than on those created by the datasets on which the models are trained. However, some of the most intractable dysfunctions of generative AI systems involve datasets. In particular, the very large datasets amassed by dominant providers of generative AI and related services are rapidly taking on infrastructural characteristics and importance. Effective AI governance therefore requires an infrastructural turn in thinking about data. First, the Article explains the significance of the infrastructure lens and sketches some of the distinctive implications of data infrastructures, in particular, for governance of networked digital processes and the social and economic activities that they facilitate. Next, it explores two interrelated problems manifesting within generative AI systems—simulation and sociopathy—that illustrate the extent to which the project of AI governance is, unavoidably, a data governance project. In brief, generative AI models trained on mass content from the open internet are also trained on data infrastructures that have been developed for behaviorist, extractive purposes and that encourage the production and spread of particular kinds of content and particular styles of communication. Last, the Article considers whether the concept of public utility, now the subject of growing interest among legal scholars who study regulated industries, might supply a possible foundation for tackling the data governance problems associated with generative AI systems. The public utility model, however, addresses only some of the considerations that the infrastructure lens highlights. It is highly attuned to questions about access to infrastructures and their outputs but relatively insensitive to questions about infrastructure configuration and input sourcing. The problems of simulation and sociopathy belong in the latter category.

Online Age Gating: An Interdisciplinary Evaluation

Authors: 
Noah Apthorpe, Brett Frischmann, Yan Shvartzshnaider
Volume: 
28
Issue: 
Spring
Starting Page Number: 
66
Year: 
2026
Preview: 
The recent surge in regulation seeking to establish age-based governance online is part of a decades-long effort to create online zoning. It is driven by active development of technologies to estimate or verify user age based on various characteristics of users, their credentials, or their activities. However, these developments have heightened prevailing concerns that online age gating technology will inevitably be abused and misused to cause a variety of privacy harms and rights infringements. This paper examines this ongoing debate by bridging technical and legal scholarship to explore the current state of online age-based governance. We discuss the current legal and policy landscape and the status of online age gating technologies, and we provide recommendations to guide legal and technological scholarship and practice. Our interdisciplinary assessment is particularly important and timely, given the recent flurry of state and federal laws that aim to implement age gating online and ongoing litigation challenging such laws.

AI Evaluation and the Standards Metaphor

Authors: 
Amina A. Abdu, Abigail Z. Jacobs
Volume: 
28
Issue: 
Spring
Starting Page Number: 
37
Year: 
2026
Preview: 
Significant attention has been devoted to the question of how best to govern artificial intelligence (AI). In addition to legislation, many policy proposals focus on extra-legal regulatory instruments. Notably, AI evaluations provide a particularly attractive solution, imposing seemingly neutral measurements across the widespread contexts in which AI operates. Because AI evaluations are driven by a wide range of actors, their adoption as a governance tool is shifting power in AI policymaking. In particular, the companies that create AI are also key players in designing and marketing AI evaluations. This Essay examines how large technology companies and government actors conceptualize self-regulation by technology companies as a legitimate policy intervention. We note that AI evaluations are often described using the language of standards, another more established soft law regulatory instrument. Drawing on the history of standards, we discuss how AI companies leverage the metaphor of standards to describe benchmarks and evaluations in order to legitimate corporate expertise. We then examine the implications of this metaphor, describing where it is useful in the context of AI and where it obscures important policy decisions.

Governing Data: The Role of State Privacy Law

Authors: 
Jennifer M. Urban
Volume: 
28
Issue: 
Spring
Starting Page Number: 
1
Year: 
2026
Preview: 
This essay, built on keynote remarks, makes two claims. First, privacy, as a fundamental right, should be considered a principal component of data governance. In today’s world, the concerns about computer processing that arose in the 1960s and 1970s have accelerated even as policy remains behind. And where those concerns, and protections, in the United States have focused most closely on governmental collection and use of data, we now know that the porosity between commercial and government collection and use necessitates attention to both. Data governance thus must take into account, take seriously—and indeed, center—individual privacy. Second, in the U.S., states have a key role to play in these efforts. Where federal efforts have fallen short, California and other states have picked up the privacy baton. Using as an illustrative example recent implementations of the California Consumer Privacy Act by the California Privacy Protection Agency, the essay shows how fundamental rights concepts like autonomy are embedded within California’s updates to the “notice and choice” model. States have always had an important role in privacy and data governance; today this role is crucial. The 2025 inauguration was quickly followed by actions that threaten to upend entirely the foundation of privacy and data protection that, at the federal level, has been in place since the 1970s and, in some cases, for nearly a century. Indeed, federal activities today precisely echo the federal surveillance and harassment of Americans uncovered by the Church Committee in the 1970s. State-level protections are vital to privacy, and to the democratic participation it enables.
