Rising to the challenge: library leaders share their top strategies for AI literacy education
How academic libraries are leveraging strategic thinking, technical experimentation and principles of knowledge integrity to build institutional capacity in AI education.
by Susan Jenkins

As students, academics, and major industries face the new frontier of generative AI (GenAI) technologies, many higher education institutions find themselves struggling to catch up and integrate AI literacy skills into their curricula and educational missions.
Recent surveys [1, 2, 3] find that many current university students feel unprepared for a future in an AI-influenced world. Some are questioning the value of higher education itself as the springboard to a stable and worthwhile career. At the same time, educators and researchers are concerned about the loss of valuable cognitive skills through misapplication of AI’s capabilities.
This points to the delicate balancing act higher education institutions need to manage to achieve a key mission: beneficial outcomes for society.
Josh Sendall, Director of Library Services at the University of Leeds in the UK, puts it this way: “If a university’s purpose is to make a positive difference for human flourishing - and we can apply AI in ways that enable them - the AI-augmented graduate may be a promising proposition for that future.”
Through decades of information and digital literacy initiatives, academic libraries have evolved significant capacities to meet this moment. Many now see these challenges as opportunities.
“In December of 2022, we had students coming into our libraries asking us to find books that didn’t exist—hallucinations that could have been teachable moments if we’d been AI literate,” explains Leo S. Lo, Dean of the College of University Libraries and Learning Sciences at the University of New Mexico (UNM).
The rapid growth of GenAI has, at times, even overwhelmed faculty within its home disciplines. Haoyong Lan, STEM Librarian and liaison for the Department of Engineering and Computer Science at Carnegie Mellon University (CMU), recalls, “When I initially talked with the faculty about the tools we were providing and prompt engineering techniques, they were astounded. I thought they would be familiar, but many were not.”
These libraries have taken multifaceted approaches to build an understanding of AI’s impacts and capabilities, define the essential elements of AI literacy, and integrate tools in their education initiatives. Simultaneously they are positioning themselves as strategic collaborators in developing AI literacy education at their institutions.
In this article, they share their approaches and discuss the issues they hope to address through establishing responsible AI use, including thoughts on how AI tool developers could support their efforts as the technology and social consequences of generative AI continue to unfold.
Gaining expertise in AI
Many libraries’ first serious encounters with AI followed ChatGPT’s introduction in late 2022. To support students and researchers in understanding the responsible use of AI, libraries first have to become literate themselves.
Student Focus Groups
Leeds takes a student-centered approach to designing educational initiatives. Curious about students’ experiences with GenAI tools, the library’s Learning Development team, one of several complementary teams that serve student and researcher needs across the university, organized a series of student focus groups. During those sessions, students expressed a desire for a structured approach to learning about AI, especially around the ethical use and implications of AI in academic and professional contexts. Josh recalls, “The real sentiment we perceived was nervousness - they want to do the right things.”
These focus groups gave library staff a clear direction for exploring the tools that students were using and a scope for addressing the anxieties students themselves were raising. They also provided the basis for the Learning Development team to create dedicated AI pages in the university website’s Academic Skills guide, “a comprehensive suite of resources aimed at demystifying GenAI and embedding critical engagement,” explains Josh. “Wherever we can, we need to provide clarity, so they (and we) feel we’re on the right track.”

Surveying the academic landscape for priority training needs
In the wake of his library’s experience with requests for hallucinated books, Leo took a research-oriented approach, conducting two national surveys of academic libraries’ GenAI knowledge and skills. He used the findings as a basis for developing a 12-week training program at UNM.
Leo explains, “The first priority was increasing librarians’ skill in using GenAI, specifically ChatGPT.” After the first cohort of librarians completed the course, he offered it to faculty and, more recently, to academic advisors. It’s also the basis for a new course first offered earlier this year at his college, one that is open to any student in UNM’s system.
For Leo, increasing librarians’ skills with AI tools is essential to building AI literacy awareness: “In order to support students, faculty and librarians have to be at a level to guide them. When they start using it, they think about all the other things essential to literacy: ethical awareness, critical thinking, how it impacts their work and information literacy.”
Forming communities of practice
Creating opportunities for different types of users to learn together and share knowledge characterizes CMU’s approach. Although Haoyong is based in the Engineering college, his workshops on AI literacy skills are geared towards the entire campus. More recently, the libraries began hosting a monthly “AI Sandbox” where faculty and staff can try out different AI tools and resources via demonstrations, hands-on learning, and group conversations.
The AI Sandbox concept is also integral at Leeds – “it enables all of our staff, irrespective of the team they sit in, to gain the awareness, understanding and insight to refer students to the right support when they have enquiries about AI,” explains Josh.
A result of all these approaches is that libraries have become the binding factor in their institutions’ response to GenAI. Leo notes, “we coordinate the AI resources hub for the university - this is a strategic opportunity for the libraries to bring everybody together.”
This also reflects the experience at Leeds, according to Josh. “The library was one of the first professional service areas in the university where educators wanted to come together and ask, ‘what do we do here?’”
Strategic collaborations
Leveraging knowledge sharing opportunities outside their institutions also helps these libraries build their expertise and contribute to the wider conversation about AI in education. Josh observes, “I would say there’s a real, critical discourse now around AI - in the academic library and higher-ed spaces; there isn’t a conference of late that isn’t dedicating airtime to it.”
Josh is active in the UK’s UKSG and SCONUL organizations for academic libraries, and Leo has held the presidency of the Association of College and Research Libraries (ACRL) for the past year - all settings that provide opportunities for advocacy in addition to knowledge exchange.
Haoyong gains insights he can apply to CMU’s initiatives through collaborating with library colleagues in the International Federation of Library Associations and Institutions (IFLA) Artificial Intelligence Special Interest Group and as a member of the POEM research project, where he curates open educational resources (OER) about generative AI literacy concepts and frameworks.
By participating in these forums, these libraries expand AI literacy knowledge across their teams and build synergies that help all institutions. As Leo notes, “It's hard for just a small group of people to come up with the solutions. The best way is to get everybody trying new things. A promising direction will evolve, but it only works if we have a lot of people doing it.”

Frameworks for teaching AI literacy
One of those synergies is the emergence of frameworks that inform education practices and create a standard of AI literacy for institutions to rally around. Many librarians see aspects of AI literacy as mirroring the guiding principles behind information literacy. Haoyong explains, “Information literacy is about critically assessing the information from academic research databases and other scholarly resources, and AI literacy is also about critically understanding and assessing the AI-generated responses.”
But there are key differences that distinguish AI literacy requirements. Leo suggests, “Societal impact, job displacement, environmental impact – these are not part of information literacy. One thing that distinguishes AI from information or even digital literacy is the possibility – that’s a big if – that it gets to AGI or ASI [4], where it surpasses humans. If we get close to that, it’s more than just information and digital literacy.”
Leo’s research recently led him to publish a framework, AI Literacy: A Guide for Academic Libraries, which offers a set of key components and strategic advice for libraries.
The key components in his framework are:
Technical Knowledge – grasping essential concepts
Ethical Awareness – considering potential biases, accountability gaps, and privacy concerns
Critical Thinking – leveraging information literacy skills to critically evaluate model outputs
Practical Skills – being able to use AI tools effectively
Societal Impact – understanding how AI reshapes the world culturally, economically, and environmentally
Many international organizations are also publishing guidelines for AI literacy, but Leo says having many frameworks isn’t a problem. “I like that people propose a new framework - but honestly, they are all very similar, so whichever you support will be OK. The point is to have something to work from instead of just ad hoc approaches.”
Haoyong has focused on practical skills as the basis for educating students and staff in other aspects of AI literacy. “All students need to grasp prompt engineering, because that's essentially the art and science of interacting with, and also efficiently and ethically using, the generative AI tools.” CMU’s workshops include specific techniques such as one-shot, few-shot, and chain-of-thought prompting, which help students develop critical evaluative skills as well as efficiency.
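For readers unfamiliar with the techniques named above, the patterns reduce to how the prompt text itself is assembled. The sketch below illustrates few-shot and chain-of-thought prompting as plain string construction; the example task and wording are illustrative stand-ins, not CMU's actual workshop materials.

```python
# Illustrative sketch of two prompting patterns: few-shot (worked examples
# precede the new query) and chain-of-thought (the model is asked to reason
# step by step). The task shown is a made-up library example.

def few_shot_prompt(examples, query):
    """Build a few-shot prompt: Q/A example pairs followed by the new query."""
    shots = "\n".join(f"Q: {q}\nA: {a}" for q, a in examples)
    return f"{shots}\nQ: {query}\nA:"

def chain_of_thought_prompt(query):
    """Ask the model to show its reasoning before the final answer."""
    return f"Q: {query}\nLet's think step by step before giving the final answer."

examples = [
    ("Is 'doi.org/10.1000/xyz' shaped like a resolvable DOI link?", "Yes"),
    ("Is 'ISBN 0-00' a complete ISBN?", "No"),
]
print(few_shot_prompt(examples, "Is '10.1016/j.example' shaped like a DOI?"))
print(chain_of_thought_prompt("Which of these two sources is peer-reviewed?"))
```

The few-shot examples double as an evaluative exercise: students can inspect how changing or removing the worked examples shifts a model's answers, which is exactly the kind of critical comparison the workshops aim to build.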
At Leeds, the Learning Development team develops workshops on GenAI for undergraduates with input from the Student Success team, while the Research Services team supports postgraduate learning. Josh characterizes their multi-team approach as integral to the university’s broader mission: supporting “digitally literate, responsible global citizens - equipping students with the skills to critically assess AI tools, understand their limitations, and use them ethically and effectively.”
Choosing and integrating tools in AI literacy education
Just as information literacy education is dependent on having high-quality bibliographic databases in the learning environment, each library's approach to integrating AI tools into their programs reflects the need to balance the penetration of dominant technologies with academic rigor.
Haoyong uses Gemini and ChatGPT to teach his prompting workshops but notes that “a distinct feature of using academic-specific tools is it helps teach students specific research skills. We use Scopus AI to teach students faster research discovery and brainstorming of research topics, but we also find more niche tools useful to teach citation analysis or demonstrate how to leverage an article’s keywords to find related literature in our university collections.”
Some libraries are hesitant to commit budget to subscriptions for academic-purpose tools before there are clear standouts to invest in. In UNM’s courses, “we are providing support for people to use general-purpose AI. During one upskilling cohort, we used one tool for literature reviews to find out whether we want to subscribe to more niche programs. Our faculty also brought in their favorite tools to share, which is very helpful for us. But I’m honestly not committing to any that are very purpose-built. For us, it’s a little too early to commit to these programs.”
Josh, however, identifies advantages to using AI tools integrated with existing research databases: “the thing that is different about a GenAI layer rendered on top of a research collection is that it draws on that peer-reviewed corpus. Deploying retrieval augmented generation (RAG) solutions is eminently preferable to something that is much more opaque – like early conversational AI chatbots built on large language models (LLMs), which may have ingested the whole darn internet before regurgitating a deceptively appealing ready meal (minus the ingredient list, nutritional information and expiry date!).”
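The advantage Josh describes comes from the shape of the RAG pipeline itself: answers are grounded in a curated, citable corpus rather than whatever the model absorbed in training. The sketch below shows that shape with a deliberately naive keyword-overlap retriever; the corpus, scoring, and prompt wording are illustrative assumptions, not any vendor's implementation.

```python
# Minimal sketch of the retrieval-augmented generation (RAG) pattern:
# retrieve passages from a known corpus, then build a prompt that instructs
# the model to answer only from those passages, cited by number.
# Real systems use vector embeddings; word overlap stands in here.

def retrieve(corpus, query, k=2):
    """Rank documents by word overlap with the query; return the top k."""
    q_words = set(query.lower().split())
    ranked = sorted(
        corpus,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return ranked[:k]

def build_grounded_prompt(corpus, query):
    """Assemble a prompt whose answer must cite the retrieved passages."""
    passages = retrieve(corpus, query)
    context = "\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        "Answer using only the sources below, citing them by number.\n"
        f"{context}\nQuestion: {query}"
    )

corpus = [
    "Peer-reviewed 2024 study on AI literacy frameworks in academic libraries.",
    "Campus cafeteria menu for the spring semester.",
]
print(build_grounded_prompt(corpus, "AI literacy frameworks"))
```

Because every passage in the prompt comes from the collection, the provenance Josh mentions - the "ingredient list" - is recoverable by construction, which is what makes the RAG layer more teachable than an opaque open-web chatbot.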

How developers of AI tools can support AI literacy education
While various big GenAI companies are moving heavily into the academic market with campaigns aimed at students, there are already more than 100 products available for academic purposes, according to the Ithaka S+R Generative AI product tracker.
All agree that AI developers could take a more inclusive approach. Haoyong suggests that “developers could work with librarians to co-design AI literacy learning modules to enhance students' AI literacy skills while ensuring the AI tool design aligns with responsible AI principles.”
Leo would go a step further: “A great partnership would be for librarians to work with AI developers to incorporate information literacy principles into their products - they could then say ‘library-approved.’ That’s something that I am looking for – how we can be an active voice in the development.”
Josh also sees developers as having a unique opportunity to collaborate with academic librarians, especially now with the benefit of insight gained from the process of creating resources and teaching responsible AI approaches to staff and students. “It presents significant potential for information professionals and AI tool developers alike.”
He also emphasizes the need for more transparency and explainability, noting “if developers would create explainers alongside their development workflows, it would meet the librarian’s professional imperative to ensure our users understand how outputs are generated, what data is used, and where biases may lie. This would empower libraries in teaching how to interrogate and use AI responsibly."
Challenges for AI literacy
Despite strong frameworks and communities of practice, it’s not yet clear how to address some of the complex, cross-disciplinary issues, such as questions around GenAI usage undermining critical thinking skills.
This is especially concerning as a few of the publicly available GenAI tools now offer research functions that use agentic AI processes, which perform in-depth, multi-step research tasks with a higher degree of autonomy than previous releases of GenAI models. Unless the technology is transparent and can leave room for creativity and critical human intelligence, there is a risk of undermining the integrity of research or blurring the line between “tool” and “collaborator.” Josh explains, “technology has forever been about amplifying, extending, leveraging human abilities…but we have to be careful that the human in the loop is the one controlling the loop.”
Leo adds, “studies show it can both hurt and benefit us in critical thinking. Good students can use it to enhance their learning, but students who just want to take shortcuts absolutely will use it for that.”
The other side of this issue is resistance to engaging with AI technology. Josh notes that “the student focus groups revealed a diversity of perspectives, from enthusiastic to resistant – those in the humanities, particularly, were critical of potentially outsourcing analytical thinking.”
This also concerns Leo, who observes that, despite good enrollment in the courses, “a lot of people are refusing to engage with it – so how do we reach that population?”
Perhaps the largest concern, however, is entrenched social inequities that one way or another limit access on individual and institutional levels. Leo points out that “elite universities are able to invest in AI, but that’s a handful of places. Most don’t have that level of resources, especially in this time of funding uncertainties.”
Josh adds, “we know from our experience of information literacy and digital literacy as well that for different student groups - mature students, ethnic minority students, neurodivergent students – you need a ‘universal design principles’ approach to ensure equitable access.”

Despite these concerns, all are looking optimistically towards the future. Leo is setting up a review committee to adjust the AI literacy guide as GenAI evolves. For him, the focus on ethics is essential to building resilience for the unknowns. “If we compare AI proliferation to the internet, we are at the dial-up stage now. There are going to be so many ethical dilemmas that will come up because of AI Agents and other things we can’t yet imagine.”
For Haoyong, investing in prompt engineering skills will continue to be important, because “even though Generative AI tools will get smarter and require less input from humans, it’s always better for the human users to tell them exactly what they intend.”
Josh observes the knowledge integrity aspect will entwine academic libraries even beyond AI literacy. “What’s fundamentally different about this technology is that we’re moving to a future environment where libraries are not only accommodating human readers but potentially facilitating machines as readers.”
But he adds, “I’m inclined towards optimism on AI’s role for education and research, because I know that there are so many good people who are leaning into this space.”

Empowering skills at under-resourced institutions
The Elsevier Foundation and ACRL have teamed up for a new initiative, “Bridging the Gap: an AI Community of Practice,” to bring AI literacy skills to under-resourced institutions, led by Leo Lo. He notes, “AI can both narrow and widen the digital divide - I want under-resourced libraries and universities to have channels to do something and not just get left behind.” Launched in June of 2025, the first pilot includes 11 institutions and will address the growing need for AI literacy training among early-career librarians, who are uniquely positioned as future leaders and change-makers.
About our interviewees:

Haoyong Lan, STEM Librarian and liaison for the Department of Engineering and Computer Science, Carnegie Mellon University

Leo S. Lo, Dean of the College of University Libraries and Learning Sciences, University of New Mexico

Josh Sendall, Director of Library Services, University of Leeds
References:
1. Cengage, US survey, “2024 Graduate Employability Report.” https://cengage.widen.net/s/bmjxxjx9mm/cg-2024-employability-survey-report
2. Digital Education Council, “AI or Not AI: What Students Want,” Global AI Student Survey 2024. https://26556596.fs1.hubspotusercontent-eu1.net/hubfs/26556596/Digital%20Education%20Council%20Global%20AI%20Student%20Survey%202024.pdf?utm_medium=email&_hsmi=92199303&utm_content=92199303&utm_source=hs_automation
3. Higher Education Policy Institute (HEPI) & Kortext, “Student Generative AI Survey 2025.” https://www.hepi.ac.uk/2025/02/26/hepi-kortext-ai-survey-shows-explosive-increase-in-the-use-of-generative-ai-tools-by-students/
4. AGI (artificial general intelligence) and ASI (artificial superintelligence) are two terms frequently seen in media coverage of artificial intelligence; their meanings are still evolving. AGI is the more concrete but still hypothetical of the two: the ability of a programmed system to function at the same capacity as human intelligence. ASI is a speculative concept referring to a programmed system that exhibits intelligence surpassing humans.