Over the past decade, the rapidly decreasing cost of computer storage and the increasing prevalence of high-speed Internet connections have fundamentally altered the way in which scientific research is conducted. Led by scientists in disciplines such as genomics, the rapid sharing of data sets and cross-institutional collaboration promise to increase scientific efficiency and output dramatically. As a result, an increasing number of public “commons” of scientific data are being created: aggregations intended to be used and accessed by researchers worldwide. Yet the sharing of scientific data presents legal, ethical and practical challenges that must be overcome before such science commons can be deployed and utilized to their greatest potential. These challenges include determining the appropriate level of intellectual property protection for data within the commons, balancing the publication priority interests of data generators and data users, ensuring a viable economic model for publishers and other intermediaries, and achieving the public benefits sought by funding agencies. In this paper, I analyze scientific data sharing within the framework offered by organizational theory, expanding existing analytical approaches with a new tool termed “latency analysis.” I place latency analysis within the larger Institutional Analysis and Development (IAD) framework, as well as more recent variations of that framework. Latency analysis exploits two key variables that characterize all information commons: the rate at which information enters the commons (its knowledge latency) and the rate at which the knowledge in the commons becomes freely utilizable (its rights latency).
With these two variables in mind, the analysis proceeds in three steps: (1) determining the stakeholder communities relevant to the information commons, (2) determining the policy objectives relevant to each stakeholder group, and (3) mediating among the differing positions of the stakeholder groups through adjustments to the latency variables of the commons. I apply latency analysis to two well-known narratives of commons formation in the sciences: the field of genomics, which developed unique modes of rapid data sharing during the Human Genome Project and continues to influence data sharing practices in the biological sciences today; and the more generalized case of open access publishing requirements imposed on publishers by the U.S. National Institutes of Health and various research universities. In each of these cases, policy designers have used timing mechanisms to achieve policy outcomes. That is, by regulating the speed at which information is released into a commons, and then by imposing time-based restrictions on its use, policy designers have addressed the concerns of multiple stakeholders and established information commons that operate effectively and equitably. I conclude that the use of latency variables in commons policy design can, in general, reduce negotiation transaction costs, achieve efficient and equitable results for all stakeholders, and thereby facilitate the formation of socially valuable commons of scientific information.
Available at: http://works.bepress.com/jorge_contreras/5/