Friday, October 31, 2025

CRIT Redux: Evaluating AI, Google, and Databases for research

Since I began my career (about the time I first heard the phrase "alternative facts"), my guiding question as a high school librarian has been "How do I teach students to think critically about the media and information they encounter?"

That's arguably more important than ever, but there's a new hurdle to clear before we get there:  

 How do I convince high school students not to rely on AI for challenging school work?  

Every semester I get the opportunity to work with our Advanced Composition classes (a mix of juniors & seniors) on research for their argumentative essays, and I've used the same lesson for the last couple of years (see previous CRIT post), inserting a little "ChatGPT is not a search engine" spiel that I'm sure went in one ear and out the other. This year, with this new hurdle in mind, I redesigned the lesson in a way that integrates CRIT while asking the students to consider the benefits and pitfalls of using AI, Google, and databases. 

Lesson Slides 

Set-up: 

  • AI/Google/Databases scorecard on a whiteboard; dry erase marker nearby
  • cards with numbers 1-5 on each table (recommend different colors for each number so it's easy to see at a glance)-- see slide #1
  • slides & screen/projector

Background: At various checkpoints throughout the lesson, students will collaborate with their tablemates to assign a score to AI, Google, or Databases for each criterion. Ask each table to hold up their number, then roughly average the scores offered. If you're feeling confident, ask tables whose scores are outliers to justify their choices. 

<Slide 2> Introduce the lesson

<Slide 3> I start every lesson with this pseudo-existential question to make sure students understand the point of our time together. By reviewing the upcoming assignment, we can begin with the end in mind. 

<Slide 4> I took some liberties here, assuming that students want to get their assignments done well… so we talk about this, and I mention that even if they don't have that goal, their teacher (and adults) have that goal for them. It's my job, then, to help them bridge the gap between fast and well. 

<Slide 5> "research" vs "look stuff up"... This is an opportunity to understand that research is an in-depth, iterative process. 

Checkpoint: have students score AI, then Google, then databases according to how fast and easy they are to use. Record the rough average (I used whole and half numbers) on the whiteboard; for example, if the tables hold up 3, 4, 4, 5, and 5, that averages to 4.2, so I'd record a 4. 

Reflect as a group: if the only goal were to get the upcoming assignment done quickly, it's easy to see why students might be tempted to use AI. However, turning in AI-generated work is plagiarism, which is both unethical and a violation of academic integrity policies (this is especially crucial for students in dual credit classes who could face college-level consequences). It also doesn't help them learn anything new, so they lose out. 

Before moving to slide 6, you can also initiate a discussion about why citations are important (to give credit where it's due, to prove you're not plagiarizing, to give your reader the opportunity to further their own research).    

<Slide 6> Walk through the information on the slide regarding citations and AI. I include a personal anecdote about how ChatGPT has repeatedly given me lists of books attributed to incorrect authors when I ask for things to put in a themed library display, but you could also refer to the "Summer Reading List" that was published in a number of newspapers in May 2025 for a similar example. 

Checkpoint: have students score AI with regard to citations.

<Slide 7> At our school, we encourage students to use Zbib.org to generate citations and edit them as needed. Our students are familiar with this already, so this doesn't require much additional explanation.

Checkpoint:  have students score Google with regard to citations.

<Slide 8> Our students are already familiar with the fact that our databases have built-in citation tools, so we don't spend much time on this point either. 

Checkpoint: have students score databases with regard to citations. 

<Slide 9> Introduce CRIT, an acronym to help students think CRIT-ically about the information and media they encounter (I usually pause for a pity laugh). (For more details, see the CRIT post.) 

<Slide 10> Define Credibility, then walk through each piece of info on the slide. My usual examples/notes:

  1. On social media, how often do you encounter someone telling you how you should do something or what you should believe about a given topic? How often do you leave the app you're in to investigate whether that content creator has the education or experience to have expertise about what they're telling you? ... I've been a school librarian for 10 years, and I have a Master's in Library & Information Science. These are the experience and education that make me a credible source to talk to students about information literacy & research. I am not a credible source on other topics, like ice fishing. That's why it's important to know a little bit about the author/creator's background. 
  2. Why would it matter in terms of credibility whether something has been reviewed by someone other than the creator/author? 
  3. Citing sources or explaining methodology is how people justify the claims they're making, just like you'll need to do in your paper. 
  4. Cross-checking claims, especially those that might have any political implications, is crucial. Try to look up the same information or event in sources that are known to have opposite biases. Typically the things that appear in both reports, like the middle of a Venn Diagram, are credible.   

<Slide 11> Some of this was covered in the AI/citations slide, but it bears repeating. The most important thing, though, is point #2 (that's why it has 3 citations). Currently, ALL generative AI tools, even the ones designed for research, even the pro/premium models, have been found to hallucinate (provide false or made-up information) anywhere between 10% and 50% of the time. That means for some AI tools, it's a literal coin toss whether they're providing you with credible information or not. WHAT?? 

Checkpoint: have students score AI for credibility.

<Slide 12> Here, I like to begin the Google:YouTube/TikTok::Databases:Streaming Services analogy (shout out to Dr. Kristen Mattson for the idea; I still think this is one of the most relatable ways to help students understand what a database is). When students use Google (or any other search engine), it's imperative that they run through that Credibility checklist from slide 10. 

Checkpoint: have students score Google for credibility.

<Slide 13> This is where the analogy continues. Database providers have quality control gatekeepers who check credibility so students don't have to. 

Checkpoint: have students score databases for credibility (you can see where this is going).

<Slide 14> First, define relevance. Then I explain that the part in the purple is pretty automatic: if I do an ineffective search for "train" and I want information on training a puppy, I'm going to scroll past results about bullet trains in Japan without even thinking about it, because they have nothing to do with the topic I'm searching for. Next, reading through a source and determining which piece of information is the most relevant evidence to support a claim is something they'll work on with their classroom teachers (and likely already have). My angle is about the algorithms operating behind the scenes that serve up results in all these tools. 

<Slide 15> Background: In the spring of 2025, I created this AI lesson video; this slide is an evolution of that. 

AI is programmed to make it feel like we're having a conversation. But just like I can have a conversation in person with someone and have a misunderstanding, AI can also misunderstand and, by extension, give a response that isn't relevant to the request. I used the prompts shown in Canva's AI image generator. 

Starting with the bottom series of images: I asked for a hamburger with no cheese-- there's no cheese on any of these, so that's accurate-- and pickles-- two images clearly have pickles; two are a little harder to discern. And all the images include other things I didn't ask for (lettuce, tomatoes) because the AI was making assumptions about what I wanted. 

So I tried another prompt, and that's where things went really sideways. There's no hamburger patty in any of those 4 images, but my intention when I used that prompt was to get a hamburger patty with just pickles- no bun, no tomatoes, nothing else. Rather than giving me even a single fully relevant result, AI generated images that were tangential at best. We can see that because we know what hamburgers and pickles look like, but imagine if we were using AI to write us a report about a topic we're not very familiar with. Would we be able to tell, in a wall of text, that the hamburger patty is missing? 

Checkpoint: have students score AI on relevance. 

Prior to moving on to slide 16, I ask my students how long they've had their Google accounts. Our school uses Google for Education, so most of them have had their accounts for close to a decade now. I've had my work account for just as long.

<Slide 16> Every time we Google something while logged in to those accounts, every result we click on, and every result we scroll past, Google is collecting that data to give us increasingly personalized search results. Sometimes this is great-- this is also what populates our social media feeds, so it means we don't have to spend a ton of time searching for content we're interested in. But sometimes, like when you need information or perspectives outside your filter bubble, it becomes an issue. 

The screenshot included is from my work computer: both windows were open on the same device at the same time. On the right, I was logged in to my work account in Chrome; on the left, I was in Firefox and not logged in to anything. On the Google News homepage, most of the information/sources are the same, with one exception-- the first additional source. To get outside this bubble and find other sources/perspectives, I recommend doing the same search in various search engines, or logged in and logged out, or with a friend who has a different worldview, and comparing results. 

Checkpoint: have students score Google on relevance. 

<Slide 17> Basically just walk through the points on the slide, emphasizing that databases don't collect data for personalized results. Students will get out what they put in, so it's important to know how to use search strategies and limiters effectively. 

Checkpoint: have students score databases on relevance. 

<Slide 18> What are the possible intentions of media/information? Some of my students can come up with Persuade, Inform, and Entertain (PIE), and I expand those based on a lesson from the News Literacy Project's Checkology Classroom. We also talk about Selling (which is a form of Persuasion), Provoking (an extreme form of Persuasion), and Documenting (the step before Informing; Informing provides context to Documentation). 

Why does it matter? Well, if you have to write a history report about the sinking of the Titanic, are you going to use the 1997 blockbuster film starring Leonardo DiCaprio and Kate Winslet as a historical source? No? Why not? (it's made to entertain and sell tickets, not to inform). Or if you're researching the health effects of sugary beverages, and you find a study that says soda has health benefits... and then you see that the study was sponsored by Pepsi, how does that change your opinion? Makes it less credible, because their intention is to sell you things, right? 

So what do you think the main intention of the AI companies is?  

<Slide 19> I think the AI companies' goal is to profit. Most of them would like you to buy the pro or premium version, but even if you don't, they (like social media) want you to keep using their product, because every interaction with AI gives them data and helps train the tool. 

Also mentioned in the OpenAI report cited on this slide: the desire to keep users interacting with the AI tool may be contributing to the frequency of hallucinations. As the report explains, "language models hallucinate because standard training and evaluation procedures reward guessing over acknowledging uncertainty" (Why Language Models Hallucinate). 

Checkpoint: have students score AI for how useful the intention of AI is in performing quality research.

<Slide 20> Very similar to AI companies now, although Google's original mission was "to organise the world’s information and make it universally accessible and useful" (Google had outgrown its 14-year-old mission statement).

<Slide 21> Furthermore, when students select an open-web resource from their Google results, they'll have to investigate the Intention behind it.  

Checkpoint: have students score Google for how useful its intention is in performing quality research.

<Slide 22> The Intention of the database providers is pretty self-explanatory.  

Checkpoint: have students score databases for how useful their intention is in performing quality research.

<Slide 23> Calling this element "Timeliness" is a little clunky, but it's still important, and this was the best thing we could think of to make the acronym work. We need to make sure that the resources we're using are timely based on our information need. If we're researching history or literature, we'll want primary sources, which are usually a bit older. If we're researching something related to science, technology, politics, sports scores, etc., then we need the most updated information we can find. 

<Slide 24> How do you determine the publication date of the answers you get from AI? How easy is it to get older results from Google if that's what you need?  Databases have publication date limiters that make it easy to dial in to the exact time frame that is appropriate for your topic. 

Checkpoint: have students score AI, then Google, then databases for ease of locating timely resources. 

<Slide 25> I've had a student volunteer in each class come total the scores for me. In this example, AI scored significantly lower than it has in a few other classes, but so far, databases have always come out on top!


 <Slide 26> We then talk about how to use databases efficiently:

<Slide 27> How important search strategies are, and how to find potential synonyms or related terms for their keywords. 

<Slide 28> Similarly, they need to know what kind of tools are available in every database and have an idea of how to find them so they can become more database proficient. On this slide I included screenshots of the tools from the databases their teachers and I recommended for this particular project.

<Slide 29> Finally, we use slide 29 to transition to our Canvas course, where we house our database access links, and discuss how they're organized by type, which one to use for what kind of information, etc. 

PHEW! This one is a lot, but I'm pretty happy with it for now. We'll see how I have to change it again in the future as the tools continue to change....  

 

Tuesday, April 23, 2024

Adapting Dewey: The Ups and Downs

 If you're reading this blog post, you're likely already aware of some of the downsides of the Dewey Decimal System, like how often people bring it up when they find out that you're a librarian. 

Other downsides include, but are not limited to:

  • Eurocentrism
  • Bigotry
  • Confusion
  • Lack of forethought (cramming the whole internet into the 000s)
  • Significant portions of US history shelved in the 300s instead of the 900s because they relate to "social issues"

The list goes on. If you're looking for more in-depth information on this topic, I highly recommend Kelsey Bogan's Ditching Dewey blog posts (post 1, post 2, post 3). 

Aside from those larger Dewey issues, I have also struggled as a librarian trying to help students find things organized by the traditional DDC. I lost track of the number of times I would get asked for World War II books and say, "Well, you'll check here, and here, and here... " Or the times I sent students to the 600s for books on dogs and cats, but then told them to head back to the 500s for other pets like fish. And as someone who studied comparative religions in college, I was driven absolutely batty by the 200s. The 000s made me so crazy I made a TikTok about it. (I actually made a handful of Dewey-related TikToks.)


 Our library has tall shelves that line the walls and form a bit of a frame around the main space of the library, and couched within them is a section of short shelves. Previously, the tall shelves held non-fiction while fiction was crammed on the short shelves. 


During the lulls of the 2020-2021 school year, my colleagues and I slowly undertook the process of genrefying our fiction collection. As part of that reorganization process, we decided to move fiction to the tall shelves (and I am so glad we did; we get so many compliments on how inviting our color-coded genre labels make the library look) and non-fiction to the short shelves. 

And, being the masochist I am, I thought, "Well, if I'm touching every non-fic book to move it anyway, why not just reorganize Dewey while I'm at it!" I am glad I did it. It makes so much more sense to me, and I think it's easier for my students to navigate. But the way I did it was arduous, especially with a non-fiction collection that hovers around 8,000 items.

I knew I wanted to keep a decimal system, so I started with a spreadsheet. My primary goals were to make it make sense, to keep the hundreds-level categories as close to the original Dewey ranges as possible, and to keep to an absolute maximum of three digits after the decimal. I took every book in a hundreds-level category off the shelf and put them all on tables, tops of bookshelves, the floor-- I was doing most of this work while our students were taking finals, so they weren't using the library-- and then tried to put them in groups that made sense (if they weren't already sensical; Dewey's system wasn't all bad). As you may imagine, some areas were easier to work with than others.

My overarching categories ended up as such:

  • 000s Information & Computer Science 
    • I devoted the entire 000s to information technology, leaving plenty of room for growth. And thank goodness, because hello AI!
  • 100s Traditions, Folklore, and the Unexplained
    • I kicked philosophy to the 200s (ways of understanding existence) and psychology to the 600s (health and wellness), then scooped the unexplained from the 000s and combined it with traditions & folklore out of the 300s.
  • 200s Mythology, Religion, & Philosophy 
    • This section was where I really got to put my BA to good use. I tried to give each belief system equal weight and respect. 
  • 300s Society & Social Issues
    • I hated the 300s. I still kind of dislike them, honestly, but I did my best. I moved military to the 900s (with war, so that makes sense), and I tried to contextualize "social issues" based on their historical and current arguments in a way that (hopefully) doesn't reinforce marginalization, because a person's identity is not, in and of itself, a social problem.     
  • 400s Languages      
    • I basically left the 400s alone. As they exist, they're incredibly Eurocentric, but my collection really only contains books in the languages our school offers (currently German, French, Japanese, and Spanish), and they're not in high demand. 
  • 500s Math & Science
    • A lot of the 500s stayed the same. I did add a section (my 560s) specifically for climate and environment because our Bio classes have a climate research project. I also put all the animal books in the 590s regardless of domestication status. 
  • 600s Health, Wellness, & Self Help
    • I look at our revamped 600s as the "take care of your physical and mental self" section. Medical information flows into psychology flows into mental health flows into self-help flows into life after high school. I also love that physically, this section wraps around a common hangout area in our library, so students can see these resources without having to dive into shelves and dig around.        
  • 700s Arts, Hobbies, & Recreation  
    • Our 700s became the "what you might do in your spare time" section, including the usual 700s things like art, music, and sports, and adding in cooking, gardening, pop culture, and fandoms. Sidenote: our graphic novels had long since been broken out of their 741.5 DDC category into a GN section.
  • 800s Literature
    • This was another section that I attempted to make a little less Eurocentric. Rather than subdividing by geographical origin, I subdivided by literature type or topic. Now all our poetry is in the 830s, plays are 840s, etc. 
  • 900s History & Geography
    • I spent so much time deliberating over how to organize the 900s. I knew I wanted all the WWII books together, so I gave the 910s to military, weapons, and major wars (defining "major" as wars my students would be aware of, which is inherently US-centric but also best serves my population). But beyond that, should I turn it into a global timeline? Or keep but tweak the geographical categories and subdivide those chronologically? I ultimately went with the latter option, making sure to leave room for the 21st century and beyond.                                                                                                                                    

Upsides: 

  • it's easier for students to find what they're looking for without having to hunt in multiple sections
  • the spine labels are easier to read and reshelving is easier because the decimals only go out 3 places max
  • there's less bias in the system
  • there's more room in this system for future-facing topics 
  • my brain feels better when I think about the system

Downsides: 

  • it's wholly my own system, so whoever replaces me (in the distant, distant future) is going to hate me (but at least there's a spreadsheet!)
  • I can't rely on others' cataloging for my non-fic
  • my bias is built into the system, but hopefully I kept it to a minimum by trying to best serve my population
  • we had to replace nearly every non-fiction spine and barcode label in our library (the 400s and some of the 500s didn't change)
  • it took a loooong time, including nearly every day in the summer of 2021
  • trying to figure out how to redo the 300s and the 900s broke my brain a little bit

If you are considering adapting or ditching Dewey, I would definitely recommend it! I would only caution you to research a few different ways of reorganizing your non-fiction and really ponder what will work best for you and your library population. And finally, make sure you have plenty of time! 

Monday, February 12, 2024

CRIT: An updated approach to Information Evaluation for Research

 I have always been passionate about information literacy and getting the facts straight. I used to get in trouble as a kid for talking back, which was usually just me trying to correct an adult's factual error. In high school, I wrote a strongly-worded letter to one of our local journalists because he misidentified turtles as amphibians in a piece about the Teenage Mutant Ninja Turtles. I screamed myself hoarse when people in powerful positions started using phrases like "alternative facts". In an ideal world, I would teach a compulsory course at my high school that is solely devoted to information evaluation. So when, in the fall of 2022, I learned about the shortcomings of that long-touted librarian tool, the CRAAP test, I went straight to work with my colleague Nadia to come up with something that would be more effective, but still suit our students' needs.

Cut the CRAAP: Explore a new acronym to get students thinking CRITically about information by Nadia Komp & Emily Wilt 

We combed through a number of alternative methods for information evaluation, including Mike Caulfield's SIFT (the four moves), the book Developing Digital Detectives, the book Fact vs. Fiction, and lessons from The News Literacy Project and the Digital Inquiry Group's (formerly Stanford History Education Group) Civic Online Reasoning (COR) course. These are all excellent resources, but none of them independently satisfied everything we were looking for. We liked the acronym aspect of CRAAP, and felt that SIFT was good for general info evaluation but didn't quite tick all the boxes we wanted for school-related research. So we invented... 
 
CRIT: Credibility, Relevance, Intention, Timeliness

4 letters that pack a punch. I always start CRIT lessons with a terrible joke-- "this will help you think CRIT...ically about the information and media you encounter"-- that will hopefully help my students remember to apply this lens to all the media they come across, whether it's for school or not. Then we dive in and break down each letter. 
 
Full disclosure: I usually only get 90 minutes to do what I'm about to describe (we have block scheduling). Ideally, we would break this lesson down and take at least one class per letter so we would have ample time to practice each skill, but c'est la vie. Maybe next year. 

Credibility. 

What is it? How do you determine whether a source is credible? This is a crucial skill for both academic research and personal knowledge building. Here we have conversations about:
  • web domains (.orgs have been available for anyone to purchase since 2019, y'all, and top-level domains [TLDs] are an ever-changing beast)
  • the open web versus databases (Check out Dr. Kristen Mattson's blog post "Academic Databases are the Netflix for Nerds!" if you struggle to teach this concept to your students) 
  • how human psychology predisposes us to trust information shared by people we know, and how social media blurs the lines between people we actually know and people we feel like we know.

This is also where we learn about and practice Lateral Reading. If nothing else, learn about and teach lateral reading. This is a totally transferable skill that is used by professional fact-checkers and will serve our students (and everyone, really) well in our increasingly online worlds, even as we ramp up interactions with generative AI.

One of my favorite lateral reading thought experiments with our students is talking to them about how to lateral read a social media post. People speak with such confidence on TikTok, YouTube, etc... how do you know you can trust what they say on a topic? How do you investigate the user behind the username? We've gotten to put this into action in a really cool assignment for World Literature I (the brainchild of my colleague, Kirsten Reed) wherein students are actually required to use some type of social media as a source. The feedback from the students has been overwhelmingly positive, and it has the twin benefits of transferring this skill to their real online lives and forcing them to understand how to construct a properly formatted citation by hand (I have yet to find an automation for generating a citation from a TikTok URL). 

Relevance. 

What is it? (I always start with that question to make sure we're operating from the same base). How do we determine what is relevant? This question is definitely geared more toward school-based research (some of what we felt was missing from the SIFT approach), but it also offers an opportunity to talk about how the current iteration of the internet works. 

I saw a demonstration of Google searching at a professional workshop a few years back that Blew. My. Mind. So of course I replicate it for my students whenever I get a chance! I usually do this in Google News, but a regular old Google search works too.
  1. Perform a search while logged into my work account. 
  2. Open a window in another browser and perform the same search while logged into my personal Google account. 
  3. Open an Incognito window and perform the same search without being logged in to a Google account. 
  4. Show all 3 windows side-by-side.  The results are always different, though sometimes the differences are subtler than others.

This is a segue into a discussion about the algorithms that personalize everything we experience online, from our Google searches to our social media feeds to our streaming media recommendations. We talk about the dangers of echo chambers, how important it is to recognize our personal biases, and some strategies for doing good research in spite of this (using library resources, naturally, but also using Incognito windows and comparing search results with friends).

That's just level one of Relevance. 

Level two is mostly subconscious: if I'm writing about high-speed rail, do a search for "trains", and there's a result about training animals, my brain filters that out for me automatically.

Level three is more intentional: I'm writing about high-speed rail, do a search for "trains," and have to choose between an article about existing train systems in the US and existing train systems in Japan-- I'll probably need to look into those sources to decide which one is more relevant to my topic. This is a great place to dive into search strategies that can help dial in those investigations to return a limited number of results that are most relevant to your topic. (This is really a whole lesson in and of itself.)

Level four is what students seem to struggle with the most: they have a good source, but they need to pull out the most relevant quote to support their argument. As this is a little more English teacher territory, I usually do a short practice with my students so they know where it fits, then move on to the next letter. 

Intention.

What do we mean by Intention? How do we determine the Intention of whoever created this information/media? And why does Intention matter? 

Most of my upper-level students have encountered the media intention (or purpose) acronym PIE- Persuade, Inform, Entertain- but I like to use the News Literacy Project's breakdown of six intentions to dig a little deeper. (You will need to create a free account to access the link above.) NLP proposes that the six primary intentions of media are
  • to Document (raw footage or audio with little to no commentary; primary source)
  • to Inform ("news" in the traditional sense; straight-forward reporting with some added context)
  • to Entertain (self-explanatory; sometimes overlaps with media in other categories)
  • to Persuade (to change someone's mind)
  • to Sell (to persuade you to part with your money)
  • to Provoke (to persuade you to react strongly; to persuade you to take a certain action)

This is a great place to talk about misinformation, disinformation, and malinformation (and to once again plug the benefits of using vetted library resources) and tie back to any classroom discussions students may have had about propaganda. If the intention behind media or information is difficult to determine, this is also a good place to call back to and practice lateral reading. 

Additionally, we can talk about what types of sources are appropriate for school research and when, covering the differences between Scholarly vs. Trade vs. Popular sources, and when it might make sense to use social media in school projects-- for example, you would not cite an Instagram post, even one meant to inform, as factual evidence on its own, but you could cite it as an example of the cultural opinion on a topic. 

Finally, we come to 

Timeliness.

What do we mean by Timeliness? How do we determine publication date, and when/why does it matter?
 
Nadia and I felt it was important to include a piece related to publication date (the C in the ol' CRAAP test) because our students are assigned a variety of projects-- some are historical, in which case students need to be cognizant of whether their sources are primary sources from the time in question or secondary (or even tertiary) reports and analyses, and some are focused on current events, in which case our students need information that may change literally minute-to-minute depending on their topics.
 
This explicit reminder to check the publication date seemed too important to let go of. How many times have you encountered an article shared by a friend on Facebook accompanied by an outraged caption and a call to action, only to realize the article was published four years ago? Surely I can't be alone in that. This is the easiest piece of evaluation to accomplish.

Conclusion

I have been covering CRIT with students for a year and a half now, and they seem receptive to it. My English teachers love it, and we are working on making it an acronym that travels outside of library lessons to infiltrate classroom vocabulary as well. We also have a handy CRIT graphic organizer (I LOVE a good graphic organizer) for students to use as they are researching (one page covers a single source). I am always looking to update and improve, and I'm sure this concept will expand as we use more and more generative AI, but this is working for us for now. 

Hopefully this was helpful! Feel free to copy & modify any of the resources linked below as needed, and leave suggestions in the comments!

 

Thursday, February 1, 2024

Revamping Historical Fiction: A Meditation on Genrefying, Dynamic Shelving, and Representation

 We genrefied our fiction and adapted Dewey in our non-fiction back in 2021 (another blog post about that process is forthcoming), and by and large we've been happy with the switch. But Historical Fiction, in spite of its bright yellow spine labels, seemed to be generally overlooked. So after kicking the idea around for a few months and seeing no major drawbacks, we decided to give HF a facelift and further subdivide by era! 

before (well, actually, mid-process, hence the empty shelves and table piles)

Step One: Analyze collection to determine optimal era breakdown

What this actually looked like: piling HF books on tables in a very rough timeline so I could see how narrow or broad my categories needed to be. I weeded about 20 books, switched the genres of a few, and ended up with roughly 600 books subdivided into 8 categories. 

Step Two: Design era signs & determine how to indicate each book's era


The section I struggled with the most was 1600s - 1860s, which includes the Salem Witch Trials, American Revolution, the French Revolution, Regency novels, most of Dickens, the American West, and the American Civil War. In our collection, though, all of that only spans 6 shelves, and I didn't want to have a new era on every shelf, so... I reserve the right to recategorize at a later date if necessary. 

A note to anyone who (like me) feels a little nauseous seeing the 2000s included in Historical Fiction: As of right now, most of the 2000s novels that we include here are based around a historically significant event like the September 11th attacks. Since 2001 was well before my high school students were born, it only makes sense that they would look for those books in the Historical Fiction section. We do still have some early 2000s novels in our Realistic & Relationships section as well, but only if they don't have time-specific references. 

Era indicator signs were designed by me in Google Slides and are sized 5x5" to fit our acrylic magnetic picture frames. Those have been around longer than I have, so I have no product link.

To update the organization, we used these prelaminated 1/4" dots from Demco and matched the dots for each era to the font & frame color on the era signs. As of right now, we are not planning to further update Sublocation or Copy Category in Destiny-- these will remain Historical Fiction unless a need arises to get more specific. 

 

Step Three: Reshelve, but make it dynamic

Since we were taking every book off the shelf anyway, we also figured: why not employ dynamic shelving when we put them back? Let's breathe ALL the new life into this section! 

We've dabbled in dynamic shelving before, but this time it feels like we "got" it. We'll continue to adjust with book stands and book ends to keep things upright, but the difference in vibe is palpable. We ended up adding two more shelves at the very bottom to make room for everything because I couldn't bring myself to weed too much. I am in love with the way this looks!

One super important thing to keep in mind when switching to dynamic shelving (and displays, and posters, and collection development) is balanced representation. I weighed a few factors as I was choosing which books to face out in this initial process: 

  • Books with multiple copies: Often (but not always) my faceouts have multiple copies. This means that if one is checked out, there will be another behind it to maintain the structural integrity of the display.

  • Books with interesting covers: The purpose of dynamic shelving is to draw the eye, so I tried to select faceouts whose designs would appeal to my students.

  • Books in a series: I like to stack books #2+ in the series horizontally and display #1 as a faceout in front of the series stack.

  • How many faceouts I can fit on a shelf without it looking overwhelming: 2-3 seems to be a good balance on my shelves.

  • What messages the covers were sending: I tried to maintain a balance of gender, skin tone, and couple types across the whole section, and we'll keep that in mind as books circulate. 

We have already had more students browsing the Historical Fiction section and it's only been finished for a day. I'm looking forward to seeing our end-of-year circ stats! 


refreshed!

 

Thursday, January 25, 2024

The Evolution of our Library Cafe


2019-2022: starting out, stopping for Covid, and starting again

Equipment: 

  • (2) 45-cup coffee urns 

  • Bulk Coffee (started with donations from a local coffee place; switched to Folgers Classic Roast)

  • Bulk Swiss Miss hot cocoa

  • ¼ cup measuring scoop (for hot cocoa)

  • Assorted tea bags, packaged individually (recommend surveying your population for tea flavors)

  • 10 oz paper hot cups with lids & sleeves (recommend buying sets so your cups and lids always match)

  • Individual coffee creamers 

    • Classic, French Vanilla, and occasional seasonal flavors

  • Individual sweetener packets

    • Sugar and Stevia have been most popular in our building

  • Stirrer sticks

  • Napkins

  • Cash box & coin wrappers

  • Hand sanitizer

  • Lysol wipes


Logistics:

  • available for students before school only; available for staff throughout the day

    • our library is open for 40 minutes before the school day begins

  • located at the Circ desk

    • benefits: allows for library staff oversight & "water cooler" chat with staff

  • handled by library staff (1 FT librarian & 2 FT library assistants)

    • one staff member rinses urns & resets everything at the end of the day

    • urns are plugged into a power strip, so the first person in each morning just has to turn it on


Pricing, etc:

  • $1.00 per cup (hot)

    • easy to remember, easy to make change

  • Cash only

    • Funds deposited regularly to an ECA account associated with our Library Club

    • money used to replenish makerspace supplies, purchase props for library escape rooms, and to provide prizes for reading challenges

  • Pre-paid punch cards available for staff

    • up to $20; library staff keeps the cards and marks them off as credits are used

  • Reusable mug reward cards available for staff

    • library staff keeps the cards and marks them off

    • after 5 uses of a reusable mug, staff earns a card redeemable for a free beverage

      • free bev. cards can be used by the staff member or given to a student as an incentive

      • library staff use free bev. cards as prizes for reading challenges, etc.

  • All cards were designed by me and printed on different colored index cards by our in-house printer, but you could easily print them on cardstock and cut them. Just make sure there's something distinct enough about them to discourage counterfeits!



2023: added iced coffee & student volunteers

We were so busy in the morning with library & IT-related responsibilities that we decided to turn the cafe over to some student volunteers. Everything was still set up next to our Circ desk so we could oversee operations, but we rearranged some tables so our volunteers were not inside the Circ desk with us. Our volunteers earn service hours.

Then in the spring we wanted to try selling iced coffee with minimal investment to see how our students responded. They LOVED it. Here's how we approached it.


Additional equipment:

  • 2 ice cube trays (I'm not linking the ones we started with because they were terrible silicone hexagon trays that were very cheap and took forever to freeze)

    • we borrowed freezer space from a faculty break room freezer and replenished ice every day

  • a large plastic bowl that we already had 

    • we stored excess ice cubes in gallon ziploc bags that sat in this bowl

  • (2) empty gallon water jugs

  • an ice scoop

  • 12 oz plastic cold cups with sipper lids

    • we opted for sipper lids rather than straws to help reduce the amount of plastic we were using

  • Torani flavored syrups: French Vanilla and Caramel

  • French Vanilla Concentrated Creamer

  • Measuring cups

  • Funnel


Logistics: 

  • Make bulk batches of iced coffee concentrate 2-3 times a week (depending on demand)

    • this gets poured over a full 12oz cup of ice

    • recipes linked; try them and adjust to your taste (we had several student taste-testers)

  • Store unused iced coffee in refrigerator; discard after 3 days


Pricing, etc.: 

  • $1.00 for hot drinks

  • $2.00 for iced coffee

  • All staff cards still in operation

  • Free beverage cards can be exchanged for a hot or iced drink

Fall 2023: changed location & staffing

With the addition of iced coffee in the spring of 2023, we had too many students crowded at the Circ desk to manage successfully, so we borrowed some furniture from other places and created a Cafe nook that was still close enough to keep an eye on. Our student volunteers run the mornings entirely, and library staff continues to provide the behind-the-scenes support.


Additional / Updated Equipment:



Notes and other thoughts:

We started with the coffee urns, bulk coffee, and bulk hot cocoa rather than K-cups because we wanted to reduce the amount of waste generated in the library. It has meant more oversight in the process–our volunteers have to scoop the hot cocoa and fill the water rather than just letting a student grab a K-cup, for instance–but it meant a lower starting cost. And now when we need to make our iced coffee, we just make a larger amount in the urn and let it cool rather than requiring an additional coffee pot or using a ton of K-cups. 


We have done additional fundraisers with knit and crocheted coffee cozies ($2.00) with some success. 


It has been over a semester since we moved the cafe away from the Circ desk. I love that our students have largely taken ownership of the morning operations, but I do miss those opportunities to chat with colleagues while they fill their coffee cups.