Every classroom has students with different backgrounds, strengths, and learning speeds. As a teacher, you want to make sure each student is challenged but not overwhelmed. This is where differentiation comes in — adapting your instruction to meet the needs of all learners.
LLMs can help you quickly create different versions of assignments or assessments. This saves you time and helps you support every student more effectively. In this lesson, you will learn how to use prompt engineering to ask LLMs for differentiated materials.
Differentiation means changing your teaching or assignments to match the needs of different students. For example, you might give some students simpler problems and others more complex ones, but both groups practice the same skills.
Here are a few simple examples:
- For a reading assignment, you might give one group a text with simpler vocabulary and another group a more advanced article on the same topic.
- In math, you might ask some students to solve basic equations, while others work on word problems that use the same math skills.
The goal is to make sure everyone is learning, just at the right level for them.
Let’s see how you can use LLMs to help with differentiation. The key is to write clear prompts that tell the model exactly what you want. Suppose you have a trigonometry assessment and want easier and harder versions. You might start with a basic prompt:
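A first attempt might look something like this (an illustrative sketch; `[paste assessment here]` is a placeholder for your own assessment text):

```text
Here is my trigonometry assessment:

[paste assessment here]

Make an easier version and a harder version of it.
```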
However, this prompt is too general. The model might change the questions too much or not focus on the right skills. To get better results, you need to be more specific. For example, you can say:
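A more specific version might look like this (again a sketch, with a placeholder for your assessment):

```text
Here is my trigonometry assessment:

[paste assessment here]

Create two versions: an easier one and a harder one.
Keep the same number of questions, and make sure both versions
test the same trigonometry skills as the original.
```

Notice how this prompt spells out what should stay the same, not just what should change.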
This is better, but sometimes the model still does not follow your instructions closely. That is where strict instructions come in. When you want the model to follow a rule exactly, write the word MUST in capital letters. This makes the instruction stand out and signals to the model that it cannot be ignored.
For example:
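One way to phrase this (an illustrative sketch, not the only correct wording):

```text
Create an easier version and a harder version of the assessment below.
Each version MUST test exactly the same trigonometry skills as the original.
Each version MUST have the same number of questions.

[paste assessment here]
```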
By using MUST, you make it clear that the model cannot change the skills being tested. This usually leads to more accurate and useful results.
Let’s put it all together in a complete prompt:
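A full prompt combining context, strict constraints, and a request for an explanation might look like this. The class details and topic (right-triangle ratios) are assumed for illustration; replace them and the placeholder with your own material:

```text
Context: I teach a high-school trigonometry class. The assessment below
covers right-triangle ratios (sine, cosine, and tangent).

[paste assessment here]

Task: Create two new versions of this assessment, one easier and one harder.

Constraints:
- Each version MUST test the same trigonometry skills as the original.
- Each version MUST have the same number of questions.

Finally, briefly explain what you changed in each version and why.
```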
Explanation:
- The context gives the model all the information it needs.
- The MUST in the constraints makes it clear that the skills being tested cannot change.
- Asking for an explanation helps you understand the model’s thinking and makes the changes easier to review. It does not usually change the quality of the questions the model generates, but it lets you see why the model made certain changes and decide whether they fit your needs.
In this lesson, you learned how to use LLMs to help differentiate instruction by creating easier and harder versions of assessments. You saw how to write clear prompts, use strict instructions with MUST, and ask for explanations to make reviewing easier.
Next, you will get a chance to practice writing your own prompts for differentiation. You will use the CodeSignal IDE to chat with an LLM and see how your prompts work in real time. This hands-on practice will help you become more confident in using LLMs to support all your students.
