This is the sixth and final piece in a series of posts that offer insights and calls to action based on the results of three recent surveys conducted by Book Riot and the EveryLibrary Institute. The surveys explored parental perceptions of public libraries, parental perceptions of librarians, and parental perceptions of school libraries. The first post in the series emphasized how data overwhelmingly supports libraries and library workers. The second looked at how what’s happening in school libraries foreshadows the future of public libraries. The third explored why library workers need to be their own advocates for the library, and the fourth, the erosion of trust in professional library workers despite parental trust in those same professionals. The fifth piece explored how deeply ingrained intolerance is in America and how that manifests in book banning.
Somewhere between 2016 and 2018, a major shift happened in our internet experiences. Algorithms began to take over what were once chronological timelines across social media.* With that shift, users no longer saw only posts from friends, family, or groups to which they belonged. Instead, algorithms predicted user behavior based on prior engagement with content. If you clicked on and liked or shared a particular news story, the algorithm would learn that and serve up more stories like it to your feed. Access to Facebook or Twitter or Instagram was free to you as a user, but only insofar as you did not pay for it with money. You paid for it with your data, which those social media companies used to attract advertisers.
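To make the mechanism concrete, here is a minimal, hypothetical sketch of the kind of engagement-driven ranking described above. Every name, topic label, and data point here is invented for illustration; real platform ranking systems are vastly more complex, but the feedback loop is the same: past engagement boosts similar content.

```python
from collections import Counter

def rank_feed(posts, engagement_history):
    """Toy engagement-based ranking: posts on topics the user has
    previously engaged with score higher and are shown first."""
    affinity = Counter(engagement_history)  # topic -> count of past clicks/likes/shares
    # Topics with more past engagement float to the top; a Counter
    # returns 0 for topics the user has never engaged with.
    return sorted(posts, key=lambda post: affinity[post["topic"]], reverse=True)

# A user whose history is dominated by outrage-driven stories...
history = ["outrage", "outrage", "outrage", "local-news"]
feed = [
    {"title": "Library wins state grant", "topic": "local-news"},
    {"title": "School board meeting erupts", "topic": "outrage"},
    {"title": "Activists protect trans rights", "topic": "activism"},
]
# ...sees outrage first, while never-engaged topics sink to the bottom.
print([p["topic"] for p in rank_feed(feed, history)])
# → ['outrage', 'local-news', 'activism']
```

Each round of engagement with the top-ranked content further skews the history, which is the feedback loop behind the echo chambers discussed later in this piece.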
This shift has had several outcomes, and two of them are especially relevant to our moment in history around book banning. The first is that algorithms have impacted local news. The downfall of local news since 2000 has been well-documented — somewhere around one in four local newspapers shuttered between that year and 2020. These closures have meant local stories, including reports on happenings at local school and library board meetings, have gone untold or, if they are told, are locked behind a paywall. The papers that remain, whether through good luck or through absorption into larger media conglomerates, have had to play the game online to get their news in front of readers. Stories and headlines alone no longer do the job of catching attention. They need to compel readers to engage with the content via likes, shares, and comments in order for those stories to show up in more feeds. You might “Like” your local paper on Facebook or follow it on Twitter, but unless you’re doing something with its stories, you are probably not seeing them show up in your feed. Thus, a shift toward covering the most outlandish stories has been crucial — not because those stories are important or impact the lives of a community, but because they solicit the engagement those outlets need in order to get their work out there at all.
The second big outcome of the shift to algorithms is that echo chambers online have gotten even bigger. Because engagement is what fuels the algorithm, anything you comment on, share, or click feeds data back into the system, which then serves up more content like what you’ve already seen. This is why it can be shocking for folks who are otherwise smart and well-informed to learn about something that has really been all over the news. A lot of times, when people online say, “Why has no one been talking about this?” the reality is that people have. That work just has not crossed into every feed.
Echo chambers born of the algorithm create tunnel vision for people. If you’ve clicked on and been compelled to share how angry you are about a new anti-trans law being proposed in your state legislature, the algorithm is going to serve up more news that is similar, both within your state and beyond. You will likely not see the stories of the activists on the ground fighting those bills, or of those who have successfully codified trans rights in other states, unless you are also engaging with that work.
© Book Riot