Collaborating with AI: Beyond Assessments

The rise of generative AI in higher education has sparked intense debate, much of it focused on cheating and the erosion of academic integrity. However, the impact of AI extends well beyond assessments.

Universities can harness the capabilities of generative AI to enhance teaching and expand the ways students learn. This ultimately involves human–AI collaboration, from co-creating lesson materials to coding interactive simulations, and from AI-powered teaching assistants to the deployment of local large language models (LLMs).

This post identifies five key areas that I consider suitable and ready for AI integration.


1. Human–AI Collaboration in Teaching Workflows

AI can serve as a personal teaching assistant, helping with lesson planning, content creation and administrative tasks. Instead of replacing educators, AI tools can assist with preparation, drafting and other time-sensitive activities. By offloading such preparatory work to AI, educators free up more time to spend where it’s needed most – with students.

This human–AI partnership can make teaching more efficient and inclusive. For example, AI is highly proficient at summarising lesson content and tailoring it to different student needs. Generative AI tools can swiftly produce accessible formats (like simplified summaries or multilingual resources), helping educators support students of varying abilities. In these ways, AI can act as a catalyst for educators to quickly adapt content.
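
To make this concrete, the sketch below shows one way to ask a hosted model for a simplified version of a lecture excerpt, using the OpenAI Python client. It is a minimal sketch, not a recommended implementation: the model name, prompts and excerpt are illustrative assumptions, and the same pattern extends to multilingual or other accessible formats.

# Minimal sketch: asking a hosted LLM to simplify lecture content.
# Assumes the openai package is installed and OPENAI_API_KEY is set;
# the model name and prompts are illustrative, not prescriptive.
from openai import OpenAI

client = OpenAI()

lecture_excerpt = (
    "Osmosis is the net movement of water molecules across a selectively "
    "permeable membrane from a region of lower solute concentration to a "
    "region of higher solute concentration."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[
        {"role": "system", "content": (
            "You simplify university lecture content for first-year "
            "students. Keep it accurate, concise and jargon-free."
        )},
        {"role": "user", "content": (
            "Rewrite this at a simpler reading level:\n" + lecture_excerpt
        )},
    ],
)

print(response.choices[0].message.content)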

Beyond the potential time-saving benefits of generative AI, any collaborative use of these tools should begin with a simple question: How will AI meaningfully enhance my teaching? As with any new digital innovation, AI should serve a clear pedagogical purpose, while the human relationship between teacher and student remains central to the learning experience.

Furthermore, generative AI is prone to “hallucinations” and inaccuracies, which require human oversight and judgement. Educators must double-check AI-generated content for accuracy, relevance and bias before using it in class. When teachers and AI co-create materials iteratively, with the human providing context, judgement and ethics, the result is a more efficient workflow that still upholds academic quality.


2. AI-Generated Interactive Simulations and Coding Support

Generative AI models can now produce and debug code, acting as co-pilots for programmers. Educators can utilise these LLM capabilities to create simulations or interactive visualisations for their courses. The beauty of this approach is that no coding experience is required.

For example, an instructor might prompt an AI using natural language to generate a Python script that simulates a scientific phenomenon or an economic model and then use that simulation to engage students with concepts in class.
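
To give a flavour of what such a prompt can return, here is a minimal, self-contained sketch of the kind of script an LLM might produce for a predator–prey model; the equations are the classic Lotka–Volterra system and the parameter values are illustrative assumptions.

# Minimal predator-prey (Lotka-Volterra) simulation of the kind an LLM
# might generate from a natural-language prompt. Parameter values are
# illustrative. Requires matplotlib (pip install matplotlib).
import matplotlib.pyplot as plt

alpha, beta = 1.0, 0.1     # prey growth rate; predation rate
delta, gamma = 0.075, 1.5  # predator growth per prey eaten; predator death rate
prey, predators = 40.0, 9.0
dt, steps = 0.01, 10_000

prey_history, predator_history = [], []
for _ in range(steps):
    # Simple Euler integration of the coupled equations.
    d_prey = (alpha * prey - beta * prey * predators) * dt
    d_pred = (delta * prey * predators - gamma * predators) * dt
    prey += d_prey
    predators += d_pred
    prey_history.append(prey)
    predator_history.append(predators)

plt.plot(prey_history, label="Prey")
plt.plot(predator_history, label="Predators")
plt.xlabel("Time step")
plt.ylabel("Population")
plt.legend()
plt.title("Predator-prey dynamics (Euler integration)")
plt.show()

In class, students can change the parameters and re-run the script to see how the population cycles respond.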

This lowers the barrier to introducing coding-based simulations across disciplines. Early adopters have collaborated with AI to design custom educational games and problem scenarios. In this way, AI acts like a coding partner by translating a teacher’s ideas into executable programs or digital content that enriches learning.

However, there are some caveats. Designing effective simulations with LLMs requires thoughtful prompt engineering and a considered, iterative approach. Prompts should contain the necessary information, including parameters and expected behaviour, so the LLM can carry out the task appropriately. LLMs also have a propensity for code regression or loss of context over long conversation chains, which sometimes necessitates recalibrating and starting a new workflow.
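
For instance, a workable opening prompt might read something like this (the specifics are hypothetical): “Create a single-file HTML/JavaScript simulation of a three-species ecosystem (plants, herbivores, predators). Include sliders for reproduction and predation rates, start/pause/reset controls, and a live population graph. It must run entirely in the browser with no external dependencies.” Specifying the output format, parameters and behaviour up front reduces the number of corrective iterations needed later.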

Nevertheless, the iterative approach is relatively straightforward and time efficient. For example, I generated an evolution simulation in a day, albeit with a second workflow that required 29 iteration points. The output was a self-contained HTML file that opens easily in a web browser. The result was a fully functioning simulation that enables students to explore evolutionary dynamics between three species across three trophic levels (plants, herbivores, predators).

I’ve developed a second model that simulates innate and adaptive immune responses to a pathogen, including pathogen load, mortality risk, T cells, B cells and antibody responses. Students can adjust various parameters and observe the immune response in real time, helping them conceptualise complex immune system dynamics.


3. Co-Developing Teaching Materials with Generative AI

Generative AI is proving to be a powerful collaborator for developing other instructional materials, such as lecture slides, reading guides and discussion prompts. Using tools like ChatGPT, educators can brainstorm and generate content much faster, then use their own judgement and expertise to refine it.

An iterative workflow is also key here: the instructor uses AI to generate a first draft, then curates, corrects and adapts the output, with follow-up prompts used to refine it further. This human–AI collaboration can save time and provide creative inspiration. Generative AI therefore becomes a creative partner, helping teachers diversify their pedagogy.
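
As a concrete illustration, a refinement sequence might run as follows (the topic and wording are hypothetical): first prompt, “Draft five discussion questions on enzyme kinetics for a first-year biochemistry seminar”; follow-up prompt, “Make questions three and five more open-ended, and add one question linking kinetics to drug design.” Each round narrows the output towards the educator’s intent.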

However, quality control and academic rigour remain firmly in human hands. As noted earlier, AI content can contain errors, outdated information or cultural biases. A seemingly polished summary might omit important nuances. Therefore, educators must critically evaluate AI-generated material.


4. Retrieval-Augmented Generative Agents for Course Support

One of the most promising advances in educational AI is the use of Retrieval-Augmented Generation (RAG) agents. These are AI assistants that are augmented with specific knowledge bases (such as course materials, institutional databases or library resources).

Instead of relying solely on a base language model, these agents pull in relevant information from approved sources to generate answers or guidance. In practice, this means educators could deploy an AI tutor that is instructed to base its answers primarily on course material.

The aim is to reduce the risk of hallucinations and the generation of misinformation. Custom GPTs with built-in RAG support can also be instructed to generate specific answers, reply in certain styles (e.g. as a Socratic tutor) and ground their responses in course material. Carefully designed system prompts are important here; these are the instructions that tell the model how to behave.
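
To make the retrieval step concrete, here is a toy sketch that ranks course snippets against a student question and assembles a grounded prompt. It is deliberately simplified: production RAG systems typically use neural embeddings and a vector database rather than TF-IDF, and the course snippets, question and system prompt below are invented for illustration.

# Toy RAG sketch: retrieve the most relevant course snippet, then build
# a grounded prompt for an LLM. Requires scikit-learn (pip install scikit-learn).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

course_chunks = [  # stand-ins for chunks of approved course material
    "Lecture 3: Natural selection acts on heritable variation in a population.",
    "Lecture 5: Genetic drift has its strongest effects in small populations.",
    "Lecture 7: The innate immune response is rapid but non-specific.",
]
question = "Why does population size matter for genetic drift?"

# Rank the chunks by lexical similarity to the student's question.
vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(course_chunks + [question])
scores = cosine_similarity(matrix[-1], matrix[:-1]).flatten()
best_chunk = course_chunks[scores.argmax()]

# The system prompt constrains the tutor's behaviour and grounding.
system_prompt = (
    "You are a Socratic tutor. Base your answers primarily on the course "
    "material provided. If the material does not cover the question, say "
    "so rather than guessing."
)
user_prompt = f"Course material:\n{best_chunk}\n\nStudent question: {question}"
print(system_prompt, user_prompt, sep="\n\n")  # pass these to an LLM of choice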

When introduced properly (integrated into the class and clearly explained), students perceive these AI tutors as a valuable supplement to core teaching activities. Custom GPTs provide a way to get quick help that is consistent with the instructor’s expectations, rather than from a generic chatbot. These agents also encourage students to engage more deeply with course content.

However, data security should always be considered when deploying custom GPTs. Institutional licences that ‘guarantee’ data privacy are one solution, and paid-for GPT services also offer higher levels of data security. Care should also be taken to avoid using copyrighted or sensitive information when deploying RAG agents.


5. Local LLMs: Data Security, Customisation, and Compliance

As universities embrace AI, data privacy and compliance remain critical considerations. Popular AI services (e.g. OpenAI’s ChatGPT) process queries on external servers, which raises concerns about sending sensitive data into the cloud. AI companies can also use that data to train their models, raising further concerns about data privacy.

This has prompted interest in deploying local LLMs, hosted on personal computers or institutional machines. The major benefit of hosting LLMs locally is that your prompts and data never leave your local environment. Sensitive data is therefore never sent to a third-party server, reducing exposure risk and compliance concerns.

Local LLM deployments also offer customisation benefits over cloud-based services. Local models can be fine-tuned on bespoke material and tailored to use a preferred style or to prioritise certain values and guidelines in their responses. Local deployment also allows integration with internal systems (learning management systems, library databases) in a more seamless and secure way. Open access to the model’s parameters is another valued feature of local LLMs.

That said, going local is not a panacea. Running advanced LLMs on a local computer requires significant computing power and technical expertise, which can present a barrier. Even with hardware in place, open models may be less capable than cutting-edge commercial models.

Hosting local LLMs is made easier by dedicated desktop applications such as LM Studio and Ollama. For example, LM Studio downloads models (e.g. via Hugging Face) and runs them directly on your machine using optimised inference backends (e.g. GGUF models via llama.cpp). These applications offer a clean user interface (UI) similar to cloud-based chatbots like ChatGPT and Gemini.
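
As an example of how simple local inference can be, the sketch below queries a model served by Ollama via its local REST API. It assumes Ollama is installed and running on its default port, and that a model has already been pulled (the model name here is illustrative).

# Minimal sketch: querying a locally hosted model through Ollama's REST API.
# Assumes Ollama is running on its default port (11434) and a model has been
# pulled beforehand, e.g. `ollama pull llama3`. Requires the requests package.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",  # illustrative; use whichever model you pulled
        "prompt": "Explain osmosis to a first-year biology student.",
        "stream": False,    # return a single complete JSON response
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])  # the prompt and data never left this machine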


Opportunities and Limitations

The examples above paint an exciting picture of AI-augmented higher education. The opportunities are indeed compelling. AI potentially gives educators more bandwidth for creativity and student mentorship. It can help create multiple learning pathways that support diverse learners and provide rapid feedback and tutoring at scale.

Instead of framing AI purely as a threat to academic integrity, teaching students how to collaborate with AI tools for research and study will help break down barriers and taboos surrounding AI use at university. Working with students so that they treat AI as a trusted tool, rather than a way to cheat, empowers them to become partners in deploying AI successfully.

Conversely, concerns remain about the reliability of AI. For example, students might accept AI outputs uncritically or over-rely on AI-generated content, which may lead to a “cognitive debt”. Maintaining human oversight, as well as teaching students how to verify AI-assisted work, is vital to avoid such pitfalls.

Another concern centres on the technical and AI literacy of teaching staff. Academics have busy schedules, conflicting deadlines and numerous responsibilities. Finding the time and energy to upskill in a rapidly moving space is a significant challenge, and one that will hinder the successful deployment of AI across institutions unless addressed.

Integrating AI beyond assessments is about thoughtfully enhancing the core elements of education. Here are some strategies for success:

  • Invest in AI literacy and training: Ensure faculty and staff have opportunities to learn about AI tools and their pedagogical applications.
  • Start with pilots: Rather than a top-down mandate, encourage small-scale experimentation and gradual roll-out of AI applications.
  • Focus on collaborative uses, not prohibitive policies: Shift the narrative by highlighting AI-for-learning initiatives and normalise AI use as a beneficial tool to be used responsibly.
  • Partner with students: Deploying AI tools successfully needs the support of students, who are best placed to evaluate applications and offer creative solutions, based on end-user needs.
  • Update policies and encourage reflection: Revise academic policies to clarify AI’s acceptable use in coursework and involve students in that conversation.

In moving beyond assessments, the goal is to integrate AI in a way that strengthens the human elements of education, such as creativity, critical inquiry, problem-solving and the exchange of knowledge. As more sophisticated AI models become available, a clear strategy for how we learn with these tools will become increasingly important. By staying informed, universities and educators can navigate this transformative era with confidence. In doing so, AI can become a valuable and positive tool that supplements core teaching, rather than a threat to pedagogy and academic integrity.


References:

Students want to see generative AI integrated throughout curriculum, despite increased concerns around ethics and equity – Jisc. https://www.jisc.ac.uk/news/all/students-want-to-see-generative-ai-integrated-throughout-curriculum-despite-increased-concerns-around-ethics-and-equity

Artificial intelligence (AI) can make learning more inclusive for all – Jisc. https://www.jisc.ac.uk/blog/artificial-intelligence-ai-can-make-learning-more-inclusive-for-all

AI and the Future of Universities – HEPI. https://www.hepi.ac.uk/reports/right-here-right-now-new-report-on-how-ai-is-transforming-higher-education/

Williams, A. (2025). Critical evaluation of the potential of large language models in bioscience higher education: A conversation with ChatGPT. International Journal of Research in Education. https://doi.org/10.46328/ijres.1302

