Keeping the Human in the Loop: Lessons on AI, Ethics, and Learning Across Three Conferences

Reflections on Generative AI Across Three Conferences

In the last fortnight, I’ve been fortunate to be invited to present at three very different yet equally inspiring conferences – the International Staff Training Week at the Silesian University of Technology (SUT) in Gliwice, Poland, the Research Application in Information and Library Studies (RAILS) conference in North Sydney, and the South West School Libraries Network (SWSSLN) Conference in Campbelltown, NSW.

Photos from the most recent ISTW conference in Poland

The groups at these conferences – university academics, library and information practitioners and researchers, and school library professionals – may seem disparate at first. Yet they were united by several key themes: a drive to serve those they work with, a shared passion for learning and professional development, and a focus on understanding the impact of Generative AI on their work.

Depending on your professional networks, you may feel that GenAI has dominated conversations over the past 18 months. Many people laugh and say, “I’m so sick of AI!”—and I understand that fatigue. Still, as overwhelming as it can feel, this evolution cannot be ignored. So, I offer here a reflection on the common GenAI themes that emerged across these events. The fact that these diverse communities share so much common ground reveals not only the cross-disciplinary impact of this technology but also the enduring importance of the human at the centre of all this disruption.

Across the three conferences, three interconnected themes emerged strongly: the importance of retaining human insight and critical judgment, the need for ethical use and trust, and the recognition that AI literacy is a shared responsibility. Considering these themes helps educators, school library professionals, and researchers find practical ways to navigate a professional environment that shifts almost daily.

Human Insights and Critical Judgment

Humans need to be in the loop. Photo by Constanze Marie: https://www.pexels.com/photo/a-grayscale-the-cloud-gate-in-illinois-5913629/

A recurring question across the presentations I attended was how we can preserve human distinctiveness in an age of automation. Carlo Iacono, Executive Director of the University Library at Charles Sturt University and keynote speaker at the RAILS conference, described this as avoiding “abdication” in favour of “augmentation.” He explained that we need to use AI to extend—not replace—human thinking. Professors Lyenov and Kuzior of SUT warned that overreliance on AI can dull independent thought, leading to what they called “infantilisation.” For educators, this means ensuring that learners—whether in primary school or postgraduate studies—experience cognitive friction: pausing, questioning, and wrestling with ideas rather than accepting ready-made answers.

It is difficult to take the hard road of learning when an easier path lies open. To help students understand the value of effort, we need to shift focus from product to process. Renewed attention to project-based and inquiry learning, which ask students to document their learning journey, aligns well with this. Though GenAI brings challenges, it also offers a chance to redesign assessment and pedagogy to emphasise creativity, innovation, and critical reflection. This moves us away from the factory model of education and towards the kind of networked, inquiry-based, and critically digital pedagogies I have written about before on this blog: exploring the pedagogical potential of learning networks, inquiry learning, and why we need to design critical digital pedagogy.

When designing learning and assessment, a strong strategy is to ask students to show their thinking both with and without AI. This helps them build their skills and reminds them that human creativity and insight still matter, encouraging thoughtful engagement rather than dependency.

Ethical Use and the Importance of Trust

Across all events, speakers highlighted that the value we gain from AI depends on what they termed trust infrastructure. This means having systems that are open and transparent, with users understanding how decisions are made; strong data protection; and a shared level of ethical literacy so that educators and students alike can recognise bias and use AI responsibly. At the SWSSLN Conference, Nick Coucouvinis, Leader of Learning Tools and Publications for the NSW Department of Education, described how EduChat sets clear boundaries around student data, cultural sensitivity, and community alignment—a practical example of ethical AI use in action.

Across the sector, everyone has a role in ensuring AI is used responsibly. Academics can embed ethical principles into their teaching and policy work. Teacher librarians can guide colleagues and students to recognise bias and understand the cultural messages AI-generated content might send. Information and library researchers can develop ways to evaluate and monitor bias, strengthen data governance, and study how people and AI systems collaborate. A simple, effective action across all settings is to create an AI use charter outlining shared expectations, reviewed regularly to keep pace with emerging tools and risks.

AI Literacy as a Shared Responsibility

The final common theme was the recognition that AI literacy is not a purely technical skill but a social and ethical capacity that must be co-taught, co-designed, and continually re-evaluated. As Hubertus Weyer noted at the SUT conference, collaboration—not competition—should be the foundation of learning in an AI world.

Building AI literacy cannot rest with one profession or role. It must be a shared responsibility across education. When educators collaborate across contexts, consistent messages about responsible AI use become embedded in the culture and practice of learning. Planning for AI literacy means building everyday opportunities for reflection and discussion, not treating it as an add-on. Educators can prompt students to question how GenAI tools produce results, whose voices are represented, and what values are reflected. By modelling curiosity, transparency, and ethical care, teachers help students understand not just how to use AI, but how to think about its implications for learning and fairness.

It’s too big to go it alone on this. Photo by Brett Sayles: https://www.pexels.com/photo/you-are-not-alone-quote-board-on-brown-wooden-frame-2821220/

Reframing Ethical Scholarship

In my keynote at the ISTW conference in Poland, I explored how generative AI challenges traditional ideas about originality, authorship, and attribution. When tools can produce text or images instantly, we must move beyond compliance and toward a culture of ethical scholarship. Library and teaching staff can lead this shift by explaining how they use GenAI, demonstrating appropriate citation practices, and setting clear boundaries for acceptable use. Assessment design should encourage reflective commentaries, declarations of AI use, and tasks where students analyse or critique AI-generated outputs. Many students want to use GenAI ethically but are unsure what that looks like. Institutions need coordinated approaches that embed AI literacy throughout curriculum and pedagogy.

Ultimately, AI-literate students are not simply competent users of technology; they are critical thinkers who act ethically and adapt with integrity. All educational institutions—from primary schools to universities—must nurture cultures that support responsible, creative, and inclusive engagement with AI. There is enormous potential to begin this even with the youngest learners, and the urgency of doing so grows daily as GenAI’s influence expands.

Looking Forward: Human Capacity at the Core

Together, these themes remind us that the future of education with AI will not be measured by how quickly we adopt new tools, but by how intentionally we strengthen human capacity to use them with wisdom and care. The opportunity lies in helping learners think critically, act ethically, and stay curious in a changing digital world. When we do this, technology becomes a partner in learning rather than its driver, and education remains rooted in what makes it powerful—our shared ability to question, connect, and create meaning.
