Introduction: The Need for Assessment Variations

As an educator, you know that giving the same assessment to every student or class can lead to issues like cheating. Creating multiple versions of an assessment helps keep things fair and encourages genuine understanding. However, making these variations by hand is time-consuming and repetitive.

This is where LLMs can help. Once you have crafted a well-thought-out assessment, you can quickly generate new versions of it that test the same skills and knowledge. In this lesson, we will show you how to use LLMs to create assessment variations efficiently and effectively.

Key Elements to Keep Consistent Across Variations

Before we start building prompts, let’s quickly review what needs to stay the same when making assessment variations. This is important to ensure fairness and consistency for all students.

  • Question Types: If your original assessment has multiple-choice, true/false, and short-answer questions, each variation should include the same mix of question types.
  • Number of Questions: Each version should have the same number of questions.
  • Skills/Knowledge Tested: Each question in the new version should test the same skill or concept as the original.
  • Difficulty Level: The complexity of each question should be similar across all versions.

For example, if your original assessment asks about the musical alphabet, a variation should also ask about the musical alphabet, but in a slightly different way.

Crafting Effective Prompts for Generating Variations

Now, let’s build a prompt step by step to instruct the LLM to generate assessment variations.

Start by asking the LLM to create new versions of your assessment. Be specific about how many variations you want.
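For example, a request along these lines works well (a sketch; the bracketed placeholder stands for your own material):

    Here is my original assessment:
    [paste your original assessment here]

    Please create 2 new versions of this assessment.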

Add Constraints for Consistency

To make sure the variations are fair and aligned, add clear constraints:
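For instance, constraints like the following (a sketch; tailor the wording to your own assessment) make the requirements explicit:

    Each variation MUST have the same number of questions as the original.
    Each question MUST be of the same type (multiple-choice, true/false, or short-answer) as the corresponding original question.
    Each question MUST test the same skill or concept as the corresponding original question.
    Each question MUST be of similar difficulty to the original.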

These constraints help the LLM avoid making questions that are too easy, too hard, or off-topic. Note how we use strict wording such as MUST.

Combine Everything into a Full Prompt

Now, let’s put it all together:
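A combined prompt might look roughly like this (the placeholders and exact wording are illustrative):

    Here is my original assessment:
    [paste your original assessment here]

    Please create 2 new versions of this assessment.

    Constraints:
    - Each variation MUST have the same number of questions as the original.
    - Each question MUST be of the same type as the corresponding original question.
    - Each question MUST test the same skill or concept as the corresponding original question.
    - Each question MUST be of similar difficulty to the original.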

This prompt gives the LLM everything it needs: the original assessment, a clear request, and specific rules to follow.

Pitfalls

This specific example was chosen because it involves questions that are difficult to create variations for. It is easy to make a variation for the second question: simply replace the sharp sign with a flat or natural one. The LLM handles that perfectly. For questions 3 and 4, however, creating variations is not easy, and the LLM may get confused. It cannot come up with a genuinely different variant of such a question, but since it is asked to change something, it rephrases the question instead! For example, the initial question is:

And the variation model suggested is:

It is the same question, phrased differently!

Solution

There are multiple ways to address this problem.

  • If you are okay with keeping some questions identical across variations, simply instruct the model to do so in the constraints:
  • As we already learned, the best way to explain the desired output to the model is to provide it with an example. For instance, you can include the following:
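For the first approach, a constraint such as the following (a sketch) does the job:

    If you cannot create a meaningfully different question that tests the same skill, keep the original question unchanged instead of rephrasing it.

For the second approach, an example with an explanation might look like this (the musical-alphabet question is illustrative):

    Example: if the original question is "What note comes after G in the musical alphabet?", a good variation is "What note comes after B in the musical alphabet?". It tests the same concept (the order of the musical alphabet) but changes the specific note, rather than merely rephrasing the question.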

The example helps the model grasp the possible ways to vary a question. Note that we have also explained the example; this is a useful trick that gives the model more context and helps it generalize its understanding of the task.

Summary and What’s Next

In this lesson, you learned how to use LLMs to generate multiple variations of an assessment. We covered:

  • How to build a clear and effective prompt, step by step
  • Which pitfalls you might encounter in this task
  • How to provide the LLM with additional context using an example with an explanation

Next, you’ll get a chance to practice writing your own prompts and reviewing LLM-generated assessment variations. This hands-on practice will help you become more confident and efficient in using LLMs for your everyday teaching needs.
