For the past few days, this article from the Electronic Frontier Foundation has been making its way through the library sphere:
The Cost of Censorship in Libraries: 10 Years Under the Children’s Internet Protection Act by Rainey Reitman (posted on September 4, 2013)
There’s much excellent material to go over in this piece, and I have many reactions to it. The first and most important is this:
It’s not a library’s job to police people.
It’s not actually our job to act in loco parentis. This is one of the big differences between public libraries and public schools, and it’s something many library patrons misunderstand. It’s not a library’s job to judge any patron’s information needs; it isn’t even our business why a patron wants the information in the first place.
It is our job to provide access to information and to help people learn how to handle it in useful and healthy ways.
My favorite passage from the EFF article:
“When websites such as social networking sites, political advocacy sites, and LGBTQ-themed sites are censored from the Internet experience of young adults, we are failing to empower our children with the skills they need to use good judgment, common sense, and basic precautions when browsing the web. Rather than employing overly stringent filters to censor the Web, libraries and schools should educate students to protect themselves online.”
The most pernicious thing about censorship is that the impulse to censor so frequently begins as a genuine impulse to help (setting aside political and institutional censorship, which are acts of power). When people truly believe that something is dangerous and harmful, they feel obligated to protect others from it, especially children. The real problem is that their definition of “protection” is shallow and useless. As I argued in my previous post about censorship, and as the EFF article argues:
The best protection isn’t to shelter and isolate people from obscene things; it’s to teach them how to handle those things when they encounter them.
One of the many other issues I have with CIPA (an issue I have with most systematic attempts to censor, aside from the fact that it’s censorship) is that it doesn’t do enough to set explicit limits: it never makes clear at what point a library should stop censoring internet content. As the EFF article points out, many libraries go too far in their filtering, and CIPA imposes no consequences for doing so.
The law presumes that a patron complaining about content is correct; it doesn’t realistically allow the library to exercise professional judgment on the issue. Some libraries go too far in their filtering because over-blocking is safer than facing the legal consequences of not filtering enough, and CIPA never makes that line clear. Without clear legal boundaries, libraries are left terrified of anything that could potentially offend patrons.
Either allow libraries to judge the issue for themselves, or give us clear guidelines and limits. CIPA does neither.
Of course, here we run into a technological limitation: most internet filtering software simply isn’t agile or adaptable enough to set reasonable limits, even when sincere attempts are made to do so. Frequently, the only way to catch all the bad stuff is to over-filter everything.
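To make the over-filtering problem concrete, here is a minimal sketch of the kind of substring blocklist that a lot of filtering software boils down to. The blocklist terms and page titles are hypothetical, and real products are more sophisticated, but the failure mode is the same: a net wide enough to catch everything objectionable also catches health resources, educational material, and anything with an unlucky substring.

```python
# Minimal illustration of substring-based blocklist filtering.
# The blocklist and the page titles below are hypothetical examples,
# not taken from any real filtering product.

BLOCKLIST = {"sex", "breast", "gay"}

def is_blocked(page_title: str) -> bool:
    """Return True if any blocklisted term appears anywhere in the title."""
    title = page_title.lower()
    return any(term in title for term in BLOCKLIST)

pages = [
    "Breast cancer screening guidelines",   # health information -> blocked
    "Sex education resources for teens",    # educational content -> blocked
    "Middlesex County Library hours",       # blocked purely for a substring
    "LGBTQ support groups in your area",    # slips past this particular list
]

for page in pages:
    status = "BLOCKED" if is_blocked(page) else "allowed"
    print(f"{status}: {page}")
```

Tightening the list to avoid those false positives lets real targets through; broadening it blocks even more legitimate pages. That is the dilemma the articles cited here describe playing out in actual libraries.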
Consider the data presented in this article from the Librarian in Black:
Why internet filters don’t work and why libraries who filter are wrong by Sarah Houghton (posted on May 7, 2010)
She makes it abundantly clear: internet filters don’t work. The data is conclusive.
To close, I offer this quote from the LiB article linked above. For me, this is simply the best encapsulation of the issue I’ve read:
“Just because someone is using a library computer, does that mean that he or she automatically has less access to information? It shouldn’t, and libraries are fighting for information access rights every day.”