Section 1 - Instruction

We've trained our network, but is it actually smart? A network can learn too little (underfitting) or memorize its training data too closely (overfitting).

Engagement Message

It's like preparing for an exam. What are the different ways a student can fail an exam?

Section 2 - Instruction

First, let's talk about underfitting. This is like the student who didn't study enough. The model is too simple to even understand the training data.

It performs poorly on everything—both the practice questions and the real exam.
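To make this concrete, here is a minimal sketch in plain Python (using an invented toy dataset, not any real-world data): the underfit "model" is a single constant, far too simple to follow even a straight-line pattern, so its error is high on the training data and on unseen data alike.

```python
# Toy data with a clear pattern: y = 2x (a hypothetical example).
train = [(x, 2 * x) for x in range(10)]       # "practice questions"
test = [(x, 2 * x) for x in range(10, 15)]    # "the real exam"

# An underfit "model": it predicts one constant (the mean training label)
# no matter the input. Far too simple to capture the trend.
mean_y = sum(y for _, y in train) / len(train)

def mse(data, predict):
    # Mean squared error of a predictor over a dataset.
    return sum((predict(x) - y) ** 2 for x, y in data) / len(data)

train_err = mse(train, lambda x: mean_y)   # large
test_err = mse(test, lambda x: mean_y)     # also large
# Both errors are high: the model fails the practice questions
# AND the exam -- the signature of underfitting.
```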

Engagement Message

What might cause a model to be "too simple"?

Section 3 - Instruction

Now for overfitting. This is like the student who memorized every single practice question but didn't learn the underlying concepts.

The model learns the training data perfectly, including all its noise and random quirks.

Engagement Message

What happens when this student sees a new question on the exam?

Section 4 - Instruction

An overfit model fails on new, unseen data because it didn't learn the general pattern. It's brittle and not useful in the real world.
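A minimal sketch of that brittleness, again in plain Python with an invented noisy dataset: here the "model" is a lookup table that memorizes every training answer, noise included, so its training error is exactly zero while its error on unseen inputs is huge.

```python
import random

random.seed(0)  # reproducible noise
# The true pattern is y = 2x; the training labels carry random noise.
train = [(x, 2 * x + random.gauss(0, 1)) for x in range(10)]
test = [(x, 2 * x) for x in range(10, 15)]  # unseen, noise-free

lookup = dict(train)  # memorize every practice question verbatim
fallback = sum(y for _, y in train) / len(train)

def predict(x):
    # Perfect recall on seen inputs; a crude guess on everything else,
    # because no general pattern was ever learned.
    return lookup.get(x, fallback)

def mse(data):
    return sum((predict(x) - y) ** 2 for x, y in data) / len(data)

train_err = mse(train)  # 0.0: every answer memorized, noise and all
test_err = mse(test)    # large: memorization does not transfer
```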

Our goal is a model that "generalizes"—one that performs well on data it has never seen before.

Engagement Message

In your own words, what does it mean for a model to generalize?

Section 5 - Instruction

The ideal model has a "good fit." It's like the student who truly understood the material.

It learns the underlying patterns from the training data without memorizing the noise. It performs well on both the practice questions and the final exam.

Engagement Message

This is always our goal in machine learning. What is one clear sign that your model has a good fit?

Section 6 - Instruction

So, to recap the three scenarios:

  • Underfitting: Fails everywhere.
  • Overfitting: Aces training, fails on new data.
  • Good Fit: Succeeds everywhere.
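The three scenarios above can be sketched side by side in plain Python (invented toy data; a least-squares line stands in for a well-chosen model):

```python
import random

random.seed(1)
# True pattern y = 2x, with noisy training labels.
train = [(x, 2 * x + random.gauss(0, 0.5)) for x in range(10)]
test = [(x, 2 * x) for x in range(10, 15)]

def mse(data, predict):
    return sum((predict(x) - y) ** 2 for x, y in data) / len(data)

# Underfitting: predict one constant everywhere.
mean_y = sum(y for _, y in train) / len(train)
underfit = lambda x: mean_y

# Overfitting: memorize training answers; guess elsewhere.
lookup = dict(train)
overfit = lambda x: lookup.get(x, mean_y)

# Good fit: a least-squares line, which captures the y = 2x trend.
n = len(train)
sx = sum(x for x, _ in train)
sy = sum(y for _, y in train)
sxx = sum(x * x for x, _ in train)
sxy = sum(x * y for x, y in train)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n
good = lambda x: slope * x + intercept

# Underfitting fails everywhere; overfitting aces training but fails
# the exam; the good fit does well on both.
```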

Engagement Message

Which of these three scenarios do you think is the most deceptive when training a model?

Section 7 - Instruction

Overfitting happens when a model is too complex for the amount of training data—it memorizes the data instead of learning general patterns.

For example, a large neural network trained on a small dataset may overfit. In contrast, a model that's too simple can't capture important patterns and underfits.

Deeper networks can learn complex relationships but are more likely to overfit, while simpler models may miss key details.

Choosing an architecture and depth that balance these two risks is what lets your model generalize well.
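One way to see this balance numerically without any framework is a k-nearest-neighbours sketch (an assumed stand-in for model capacity, in plain Python on invented data): k = 1 is a maximally complex model that memorizes every point, while a very large k is so simple it averages everything.

```python
import random

random.seed(2)
# True pattern y = x**2, with noisy training labels.
train = [(x / 10, (x / 10) ** 2 + random.gauss(0, 0.05)) for x in range(20)]
test = [(x / 10 + 0.05, (x / 10 + 0.05) ** 2) for x in range(20)]

def knn_predict(k, q):
    # Average the labels of the k training points nearest to q.
    nearest = sorted(train, key=lambda p: abs(p[0] - q))[:k]
    return sum(y for _, y in nearest) / k

def mse(data, k):
    return sum((knn_predict(k, x) - y) ** 2 for x, y in data) / len(data)

# k = 1:  too complex -> memorizes (zero training error, noisy predictions).
# k = 20: too simple  -> every prediction is the global mean (underfits).
# k = 3:  in between  -> smooths the noise but keeps the local pattern.
```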

Section 8 - Practice

Type

Swipe Left or Right

Practice Question

Swipe each statement left or right to match the correct category.

Labels

  • Left Label: Underfitting
  • Right Label: Overfitting

Left Label Items

  • Model is too simple
  • High training error
  • Fails to capture pattern

Right Label Items

  • Model is too complex
  • Memorizes noise
  • Low training error, high new-data error