Big Data, Little Data, No Data – Who is in Charge of Data Quality?
World Data Systems Webinar #9 (2016)
The more value that is placed on research data as a commodity to be shared, sustained, and reused, the greater the need to assure the quality of those data. Data repositories—whether domain-specific or generic across domains—are essential gatekeepers of data sustainability. Data quality is a consideration throughout the research process, and the relevant considerations vary across the stages of the data lifecycle. To what extent should responsibility for assuring data quality rest with the investigators; with publishers, editors, and peer reviewers; with data repositories; with data librarians or data scientists; or with later reusers of those data? These questions have neither simple nor generic answers. In this Webinar, Prof Christine Borgman (UCLA), author of 'Big Data, Little Data, No Data: Scholarship in the Networked World' (MIT Press, 2015), will explore these issues of responsibility for data quality in conversation with Dr Andrea Scharnhorst, head of the research and innovation group at DANS, an institute of the Royal Netherlands Academy of Arts and Sciences.
Publication Date: May 9, 2016
Citation Information: Christine L Borgman and Andrea Scharnhorst. "Big Data, Little Data, No Data – Who is in Charge of Data Quality?" World Data Systems Webinar #9 (2016).
Available at: http://works.bepress.com/borgman/384/