Masked by Trust: Bias in Library Discovery (2019)
The rise of Google and its integration into nearly every aspect of our lives has pushed libraries to adopt similar "Google-like" search tools, called discovery systems. Because these tools are provided by libraries and search scholarly materials rather than the open web, we often assume they are more "accurate" or "reliable" than their general-purpose peers like Google or Bing. But discovery systems are still software written by people with prejudices and biases. Library software vendors are subject to strong commercial pressures that are often hidden behind diffuse collection-development contracts and layers of administration, and they struggle to integrate content from thousands of different vendors, whose collective disregard for consistent metadata compounds the problem.
Library discovery systems struggle with accuracy, relevance, and human biases, and these shortcomings have the potential to shape the academic research and worldviews of the students and faculty who rely on them. While human bias, commercial interests, and problematic metadata have long affected researchers' access to information, algorithms in library discovery systems increase the scale of the negative effects on users, while libraries continue to promote their "objective" and "neutral" search tools.
Publication Date: June 1, 2019
Publisher: Library Juice Press
Citation Information: Matthew Reidsma. Masked by Trust: Bias in Library Discovery. Sacramento, CA: Library Juice Press, 2019.
Available at: http://works.bepress.com/mreidsma/5/
Creative Commons license
This work is licensed under a Creative Commons CC BY International License.