I used AILA to create a lesson, and here's what I thought.

Monday 24th March 2025



Background

A little background. In January, during one of our Teach First training days, we were introduced to, and given a demonstration of, the AILA Lesson Generator: a tool developed by Oak National Academy that uses artificial intelligence to draw from the large corpus of lesson plans on the Academy's website. We were given the opportunity to try out the tool during the training day, and I noticed a number of intriguing quirks that made me want to explore it further.

I waited for a suitable point in my own lesson planning to put the tool to use. That opportunity presented itself when revising for the Year 11 mock exams in February. After identifying a gap following the previous round of mocks, a revision lesson was needed that covered searching and sorting algorithms. The initial teaching of these topics earlier in the GCSE course normally spans many lessons, so AILA seemed like the perfect tool to bring them together into one lesson, while still ensuring the lesson had enough depth to be worthwhile.

After all, I believe AI is at its strongest when used to take incongruent ideas and bring them together into a cohesive concept. Although searching and sorting algorithms are not truly incongruent, AI may be effective at blending a multi-lesson topic into a single revision lesson. Given my experience experimenting with and analysing AI tools, I thought it would be valuable to summarise my initial findings before exploring further experiments.

The prompt I used to generate the initial idea was as follows:

KS4 GCSE Computer Science OCR. Revision. Searching and Sorting Algorithms.

Although not a detailed prompt, the AILA model was able to identify its key parts and find the associated resources from its dataset. After a few back-and-forth exchanges with the model (providing small amounts of feedback to refine the output), I was left with a list of resources generated by the model.

All the resources generated by AILA can be found here.



Advantages

The most useful result from the AILA model was the lesson plan. The structure of the generated material was based on the corpus of Oak National Academy resources, which typically use a learning cycles structure that splits the lesson into three parts: Searching Algorithms, Sorting Algorithms, and Comparison and Efficiency. These learning cycles created a clear frame for the lesson that I was able to build upon in my own style. The detailed breakdown of the learning cycles could be adapted into effective slide notes and reminders of important talking and learning points. Also included in the learning cycles were useful check-for-understanding questions that I could use or skip depending on how the class progressed through the content.

Learning cycles
Recall and describe key searching algorithms.
Practice the application of sorting algorithms on datasets.
Evaluate the efficiency and differences between various algorithms.
~ Extract from the worksheet generated by AILA ~

The additional resources and worksheets generated were substantial. The starter quiz neatly assessed useful prior knowledge, while the exit quiz provided a plenary assessment at the end of the lesson. I was unable to detect any factual mistakes within the lesson plan or resources. This seems to be down to the training data being drawn from existing, fact-checked Academy resources. I wonder if, presented with a topic with no reference in the sample data, it would hallucinate facts and data to try to fill out the lesson resources.

Disadvantages

However, the generated questions were limited to assessing recall and did not challenge pupils with higher-order skills such as analysis and evaluation.

What is the main advantage of using binary search over linear search?
A. It is faster for small datasets
B. It reduces the number of comparisons needed ✓
C. It works on any type of list
~ Extract from the worksheet generated by AILA ~
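The efficiency claim behind the quiz's correct answer can be made concrete by counting comparisons. The sketch below (function names and the iteration-counting approach are my own, for illustration) searches a sorted list of 100 values both ways: linear search inspects items one by one, while binary search halves the remaining range each step.

```python
def linear_search(data, target):
    """Return (index, comparisons); index is -1 if the target is absent."""
    comparisons = 0
    for i, value in enumerate(data):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

def binary_search(data, target):
    """Return (index, comparisons) on a sorted list; -1 index if absent."""
    low, high, comparisons = 0, len(data) - 1, 0
    while low <= high:
        mid = (low + high) // 2
        comparisons += 1  # count each midpoint inspection
        if data[mid] == target:
            return mid, comparisons
        elif data[mid] < target:
            low = mid + 1   # discard the lower half
        else:
            high = mid - 1  # discard the upper half
    return -1, comparisons

data = list(range(1, 101))  # sorted values 1..100
print(linear_search(data, 97))  # → (96, 97): nearly every item inspected
print(binary_search(data, 97))  # → (96, 5): only five midpoints inspected
```

On sorted data of size n, binary search needs at most about log2(n) inspections versus up to n for linear search, which is the distinction the question is testing.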

The least useful aspect of the generated lesson was the lesson slides. It was clear that the generation was following a series of template slides and pushing the generated material into them. I felt that they were unsatisfactory for what I needed for the lesson and instead opted to create my own slides, using a combination of the generated lesson plan, generated slides and my own thoughts and ideas. I restructured the slides by adding more visuals and breaking down the complex ideas and explanations into smaller chunks.

However, the generated slides would be acceptable if pushed for time, and having them export to PDF, PowerPoint and Google Slides would be useful to teachers using a variety of different slide deck solutions. One of the most lacking features of the slides was the images, as AILA currently can't generate images to go with the slides. While teachers will likely want to add their own, it would be cool to see it generate some stock images for the slides, similar to the way we can use Magic Media in Canva to add AI-generated images into our slides.

As with other AI-generated media, oddities were present in the resources generated by AILA. The most obvious was in the lesson plan, where the learning points in the first column of learning cycles 2 and 3 were identical: an oversight, as learning cycle 3 was meant to shift focus towards efficiency and a comparison of the different algorithms. When creating my own resources inspired by the AILA generation, I wrote my own notes and learning points for the third learning cycle to fill this gap. It really emphasises the importance of checking through AI-generated content rather than assuming it is accurate. A warning to this effect is present on the generated materials: "Created with Aila, Oak’s AI lesson assistant. Check content carefully before use."

Throughout the prompting process, I asked the model to incorporate some practical programming practice (using Python) into the independent activity and worksheet, and its response felt tacked on. It was clear that the model had taken an existing task and appended my request as an extra sentence. Combined with the already packed and loosely structured activity ("Perform..." and "Then, explain..." feel like they could be two separate questions), this meant the whole activity needed some tweaking before it could be presented to any pupils. Perhaps AILA could be enhanced with a better understanding of how to structure scaffolded coding tasks and integrate them into the learning cycles.

Perform a bubble sort on the following list: [34, 12, 25, 16, 22]. Write the list after each pass. Then, explain why merge sort might be preferable for larger datasets. Additionally, implement a simple bubble sort algorithm in Python and test it with a list of numbers.
~ Extract from the worksheet generated by AILA ~
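The worksheet's "write the list after each pass" instruction can be checked with a short bubble sort that records the intermediate states. This is my own illustrative sketch, not the scaffolded task I eventually gave pupils; the function name and pass-recording idea are assumptions.

```python
def bubble_sort_with_passes(values):
    """Bubble sort that records the list's state after each full pass."""
    data = list(values)  # work on a copy
    passes = []
    for end in range(len(data) - 1, 0, -1):
        swapped = False
        for i in range(end):  # largest unsorted value bubbles to position `end`
            if data[i] > data[i + 1]:
                data[i], data[i + 1] = data[i + 1], data[i]
                swapped = True
        passes.append(list(data))
        if not swapped:
            break  # already sorted; no further passes needed
    return data, passes

sorted_list, passes = bubble_sort_with_passes([34, 12, 25, 16, 22])
for n, state in enumerate(passes, start=1):
    print(f"After pass {n}: {state}")
```

For the worksheet's list this prints [12, 25, 16, 22, 34] after pass 1 and [12, 16, 22, 25, 34] after pass 2, with a final no-swap pass confirming the list is sorted, which is exactly the working pupils would be expected to show by hand.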

Conclusions

Despite the weaker aspects of the result generated by the tool, the lesson it described was definitely teachable and, after some modifications and additions, proved an incredibly well-structured recap, review and practice of prior learning. This tool seems best used to generate a frame or outline of the lesson that can then be built upon by a teacher. I would be interested to try this out on two (or more) completely incongruent lesson topics and see how the model blends them together.

I think there is definitely scope for improvements and features to be added to this tool. I've already mentioned image generation, but I think its biggest missed opportunity is generating long-term and medium-term plans based on the resources found on the Academy's site. With the plethora of resources available, it could really help map out a curriculum that remains engaging and varied for the pupils. For medium-term plans, it would be really effective for it to plan a term's lessons rather than focusing solely on a single lesson. Forming links between lessons is something AI could excel at.