Section 1 - Instruction

You've learned how a neuron calculates its weighted sum and how an activation function transforms that result. Now, let's put it all together to see how a neuron produces its final output.
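If it helps to see the two steps as code, here is a minimal Python sketch of a single neuron's forward pass. It is illustrative only (the function names and numbers are made up for this example, not part of the exercise):

```python
# Minimal sketch of one neuron's forward pass, in plain Python.

def weighted_sum(inputs, weights, bias):
    """Step 1: multiply each input by its weight, sum them, then add the bias."""
    return sum(x * w for x, w in zip(inputs, weights)) + bias

def relu(z):
    """Step 2: ReLU activation — negative values become 0, positives pass through."""
    return max(0.0, z)

# Example with two inputs feeding one neuron
z = weighted_sum([2.0, 3.0], [0.5, -1.0], bias=1.0)  # (2*0.5) + (3*-1.0) + 1.0 = -1.0
output = relu(z)                                      # ReLU(-1.0) = 0.0
print(z, output)                                      # -1.0 0.0
```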

Engagement Message

Ready to practice the full two-step process?

Section 2 - Practice

Type

Fill In The Blanks

Markdown With Blanks

A neuron receives an input of 4, with a weight of 0.5 and a bias of -3. It uses the ReLU activation function. What is the final output?

Fill in the blanks to solve it:

Weighted Sum = (4 * 0.5) + (-3) = [[blank:-1]]

Final Output = ReLU(-1) = [[blank:0]]

Suggested Answers

  • -1
  • 0
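If you want to double-check the answer, this short Python snippet mirrors the two blanks above:

```python
# Input 4, weight 0.5, bias -3, ReLU activation.
z = (4 * 0.5) + (-3)   # weighted sum = -1.0
output = max(0, z)     # ReLU(-1.0) = 0
print(z, output)       # -1.0 0
```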
Section 3 - Practice

Type

Multiple Choice

Practice Question

Which of these values could be an output from a neuron using a Sigmoid activation function?

A. 1.5
B. -0.7
C. 0.85
D. 10

Suggested Answers

  • A
  • B
  • C - Correct
  • D
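For a quick check, here is a small Python sketch (illustrative, not part of the exercise) showing that Sigmoid outputs always fall strictly between 0 and 1, which is why only option C is possible:

```python
import math

def sigmoid(z):
    """Sigmoid squashes any real number into the open interval (0, 1)."""
    return 1 / (1 + math.exp(-z))

# Whatever the weighted sum is, the output stays strictly between 0 and 1.
for z in [-10, -2, 0, 2, 10]:
    print(z, round(sigmoid(z), 4))
# Rounded outputs: 0.0, 0.1192, 0.5, 0.8808, 1.0 — the true values never reach 0 or 1.
```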
Section 4 - Practice

Type

Swipe Left or Right

Practice Question

Let's practice identifying activation function outputs. Swipe each value to the function that's most likely to have produced it.

Labels

  • Left Label: ReLU
  • Right Label: Tanh

Left Label Items

  • 12.5
  • 5
  • 100

Right Label Items

  • -0.9
  • 0.5
  • -1
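If it helps, this brief Python sketch (illustrative only) shows why the values can be told apart: Tanh outputs stay between -1 and 1, while ReLU passes large positive values through unchanged:

```python
import math

def relu(z):
    return max(0.0, z)

# ReLU outputs can be arbitrarily large (but never negative);
# Tanh outputs are always between -1 and 1.
for z in [-3.0, -0.5, 0.5, 5.0, 12.5, 100.0]:
    print(z, relu(z), round(math.tanh(z), 3))
```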
Section 5 - Practice

Type

Multiple Choice

Practice Question

A neuron's weighted sum calculation results in -5. Which activation function would produce an output of 0?

A. Sigmoid
B. ReLU
C. Tanh
D. None of the above

Suggested Answers

  • A
  • B - Correct
  • C
  • D
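To see why ReLU is the right choice, this small Python sketch (illustrative, not part of the exercise) evaluates all three functions at -5:

```python
import math

z = -5  # the neuron's weighted sum

sigmoid_out = 1 / (1 + math.exp(-z))  # ~0.0067  — small, but never exactly 0
relu_out    = max(0, z)               #  0       — any negative value maps straight to 0
tanh_out    = math.tanh(z)            # ~-0.9999 — close to -1, not 0
print(sigmoid_out, relu_out, tanh_out)
```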
Section 6 - Practice

Type

Sort Into Boxes

Practice Question

Sort these concepts into the correct stage of a neuron's two-step process.

Labels

  • First Box Label: Weighted Sum
  • Second Box Label: Activation

First Box Items

  • Inputs
  • Weights
  • Bias

Second Box Items

  • ReLU
  • Sigmoid
  • Non-linearity
Section 7 - Practice

Type

Fill In The Blanks

Markdown With Blanks

Fill in the blanks to describe the roles of different activation functions.

For a binary classification problem where the output must be between 0 and 1, the [[blank:Sigmoid]] function works well. For cases where we want negative inputs to become zero, [[blank:ReLU]] can be used.

Suggested Answers

  • ReLU
  • Sigmoid
  • Tanh
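As a final illustration (not part of the exercise, with made-up example values), this short Python sketch shows the two roles side by side:

```python
import math

# Sigmoid turns a score into a value between 0 and 1 (useful as a probability
# in binary classification); ReLU zeroes out negatives and keeps positives.
score = 2.3
probability = 1 / (1 + math.exp(-score))   # ~0.909, always between 0 and 1
print(probability)

values = [-2.0, -0.5, 0.0, 1.5]
print([max(0.0, v) for v in values])       # [0.0, 0.0, 0.0, 1.5]
```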