Introduction

Welcome to the final lesson of "Building and Applying Your Neural Network Library"! Congratulations on making it this far — you've accomplished something truly remarkable. Over the course of this path, you've built a complete, modular neural network library from scratch, learning the inner workings of layers, activations, optimizers, loss functions, and the orchestration that brings them all together. You've also mastered the essential data preparation techniques needed for real-world machine learning applications.

Today, we're going to experience the incredible satisfaction of seeing all your hard work come together. We'll use our custom-built neural network library to tackle a real regression problem: predicting California housing prices. You'll see how the modular architecture you've carefully constructed makes it surprisingly straightforward to define complex neural networks, train them efficiently, and evaluate their performance on real data.

This lesson represents the culmination of your journey — the moment when theory meets practice, and your carefully crafted code proves its worth on a meaningful problem. Let's put your neural network library to the ultimate test!

Setting Up the Data

Let's start by importing our components and setting up the data preprocessing pipeline. Since you mastered data preparation in the previous lesson, we'll move through it quickly:
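As a refresher, the whole pipeline can be sketched in a few lines of NumPy. The data below is a synthetic stand-in for the California housing table (8 features, prices in units of $100,000) so the sketch runs anywhere; in practice you would load the real dataset, for example with scikit-learn's fetch_california_housing. Standardizing both features and target mirrors the scaled losses reported later in this lesson:

```python
import numpy as np

# Synthetic stand-in for the California housing data: 8 features,
# a continuous price target. The real lesson loads the actual dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))
y = X @ rng.normal(size=8) + rng.normal(scale=0.5, size=1000)

# Standardize features and target to zero mean and unit variance.
X = (X - X.mean(axis=0)) / X.std(axis=0)
y_mean, y_std = y.mean(), y.std()
y_scaled = (y - y_mean) / y_std

# Shuffle, then take an 80/20 train/test split.
idx = rng.permutation(len(X))
split = int(0.8 * len(X))
train_idx, test_idx = idx[:split], idx[split:]
X_train, X_test = X[train_idx], X[test_idx]
y_train = y_scaled[train_idx].reshape(-1, 1)
y_test = y_scaled[test_idx].reshape(-1, 1)
```

Keeping `y_mean` and `y_std` around matters: we will need them at the end to convert predictions back into dollar amounts.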

Defining the Neural Network Architecture

Now comes the exciting part — defining our neural network architecture. We'll create a multi-layer perceptron (MLP) with two hidden layers, perfectly suited for this regression task:
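The exact class names depend on how you built your library, so the sketch below implements the same 8 → 64 → 32 → 1 computation directly in NumPy. All names here are illustrative, not your library's API; the point is the shape of the forward pass:

```python
import numpy as np

rng = np.random.default_rng(1)

def relu(z):
    """Rectified linear unit: max(0, z), applied elementwise."""
    return np.maximum(0.0, z)

# He-style initialization for the three layers: 8 -> 64 -> 32 -> 1.
W1 = rng.normal(scale=np.sqrt(2 / 8), size=(8, 64));   b1 = np.zeros(64)
W2 = rng.normal(scale=np.sqrt(2 / 64), size=(64, 32)); b2 = np.zeros(32)
W3 = rng.normal(scale=np.sqrt(2 / 32), size=(32, 1));  b3 = np.zeros(1)

def forward(X):
    """Two ReLU hidden layers, then a linear output for regression."""
    h1 = relu(X @ W1 + b1)
    h2 = relu(h1 @ W2 + b2)
    return h2 @ W3 + b3

preds = forward(rng.normal(size=(5, 8)))  # 5 samples, 8 features each
```

Note that the output layer has no activation: for regression we want the network free to produce any real-valued price.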

This architecture is a classic design for tabular regression. The first hidden layer takes our 8 input features and expands them to 64 neurons, allowing the network to learn rich feature combinations. The ReLU activation introduces nonlinearity, enabling the network to model complex relationships between housing features and prices.

The second hidden layer reduces the dimensionality from 64 to 32 neurons, creating a funnel-like architecture that distills the learned features into more refined representations. Finally, the output layer uses a linear activation to produce a single continuous value: the predicted house price.

Compiling the Model

Now we'll compile our model with appropriate training configurations so that it's ready to begin the learning process:
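The two ingredients behind that compile step, MSE loss and plain SGD, are simple enough to sketch in full. The class and function names below are illustrative stand-ins for whatever your library calls them, using the lesson's learning rate of 0.005:

```python
import numpy as np

def mse(y_pred, y_true):
    """Mean squared error over a batch."""
    return np.mean((y_pred - y_true) ** 2)

def mse_grad(y_pred, y_true):
    """Gradient of MSE with respect to the predictions, batch-averaged."""
    return 2.0 * (y_pred - y_true) / y_true.shape[0]

class SGD:
    """Plain stochastic gradient descent: param -= lr * grad."""
    def __init__(self, learning_rate=0.005):
        self.lr = learning_rate

    def update(self, param, grad):
        param -= self.lr * grad
        return param

# One-parameter illustration: minimize (w - 3)^2, whose gradient is 2(w - 3).
opt = SGD(learning_rate=0.005)
w = np.array([0.0])
for _ in range(1000):
    w = opt.update(w, 2.0 * (w - 3.0))
```

After 1000 small steps, `w` has converged very close to the minimizer at 3, which is exactly the slow-but-steady behavior a conservative learning rate buys you.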

The compilation step configures our training setup. We're using SGD (Stochastic Gradient Descent) with a learning rate of 0.005, which is slightly smaller than in our previous examples. Real-world datasets often benefit from more conservative learning rates that allow for stable, consistent learning across the diverse feature landscape.

When you run this code, you'll see:

Training the Network

Now for the moment of truth — training our neural network on real data:
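Under the hood, a fit method runs a nested loop: shuffle the data each epoch, carve it into mini-batches, and apply one gradient update per batch. To keep the sketch short, the model here is a single linear layer rather than the full MLP, and the data is synthetic; the epoch and mini-batch mechanics are the same:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy standardized regression data standing in for the housing set.
X = rng.normal(size=(640, 8))
true_w = rng.normal(size=(8, 1))
y = X @ true_w + 0.1 * rng.normal(size=(640, 1))

W = np.zeros((8, 1))                  # single linear layer, for brevity
lr, epochs, batch_size = 0.005, 200, 32

for epoch in range(epochs):
    order = rng.permutation(len(X))   # reshuffle every epoch
    for start in range(0, len(X), batch_size):
        batch = order[start:start + batch_size]
        Xb, yb = X[batch], y[batch]
        grad = 2.0 * Xb.T @ (Xb @ W - yb) / len(Xb)  # d(MSE)/dW
        W -= lr * grad                               # SGD update

loss = np.mean((X @ W - y) ** 2)
```

With 200 epochs of 20 batches each, that is 4,000 parameter updates, and the final loss settles near the irreducible noise level of the synthetic data. Your library's fit method does the same thing, except the gradient comes from backpropagation through every layer.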

This single method call triggers the full training process. Your model will train for 200 epochs, with each epoch processing the training set in mini-batches of 32 samples. The verbose output shows the learning progress:

The decreasing loss values demonstrate successful learning! Your network started with a loss of 1.000 (about what you'd expect for a standardized target: always predicting the mean yields an MSE equal to the target's variance, which is 1) and steadily improved to 0.201, a clear sign that it's learning meaningful patterns in the housing data. The gradual, consistent decrease indicates stable training without the erratic behavior that can plague poorly configured networks.

Making Predictions and Evaluating Performance

With our model trained, we can now make predictions on the test set and evaluate how well it generalizes to unseen data:
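Evaluation is just the forward pass plus the loss on held-out data. A minimal sketch, with a stand-in "model" so the example is self-contained (the helper name and values are illustrative):

```python
import numpy as np

def evaluate_mse(predict_fn, X_test, y_test):
    """Mean squared error of a model's predictions on held-out data."""
    preds = predict_fn(X_test)
    return float(np.mean((preds - y_test) ** 2))

# Tiny illustration with a fixed linear "model" as the predictor.
X_test = np.array([[1.0, 2.0], [3.0, 4.0]])
y_test = np.array([[5.0], [11.0]])
test_mse = evaluate_mse(lambda X: X @ np.array([[1.0], [2.0]]), X_test, y_test)
```

Crucially, the test set never influenced a single weight update, so this number measures generalization rather than memorization.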

This produces our first evaluation metric:

The test MSE of 0.2083 is quite close to our final training loss of 0.201, which is excellent news! This similarity indicates that our model is generalizing well rather than overfitting to the training data. When test performance closely matches training performance, it suggests we've found genuine patterns rather than memorizing specific training examples.

However, this scaled MSE, while useful for training, doesn't give us an intuitive sense of prediction accuracy. Housing prices measured in standardized units don't mean much to us humans who think in dollars!

Interpreting Results in Real-World Terms

To truly understand our model's performance, we need to transform our predictions back to the original price scale:
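If the target was standardized as y_scaled = (y − y_mean) / y_std, undoing that scaling is one line, and there is a handy sanity check: the original-scale MSE should equal the scaled MSE times y_std². The California housing target's standard deviation is roughly 1.15, and 0.2083 × 1.15² ≈ 0.28, consistent with the 0.2785 reported below. The statistics and predictions in this sketch are illustrative values, not the lesson's actual arrays:

```python
import numpy as np

# Illustrative target statistics (units of $100,000); in the lesson these
# come from the training data before standardization.
y_mean, y_std = 2.07, 1.15

def unscale(y_scaled):
    """Map standardized predictions back to the original price scale."""
    return y_scaled * y_std + y_mean

preds_scaled = np.array([-1.38, -0.83, 0.62])   # illustrative model outputs
y_true = np.array([0.48, 0.46, 2.78])           # prices in $100,000 units

preds_dollars = unscale(preds_scaled)            # back to $100,000 units
mse_orig = np.mean((preds_dollars - y_true) ** 2)
rmse_dollars = np.sqrt(mse_orig) * 100_000       # typical error in dollars
```

Taking the square root of the MSE before converting to dollars is what turns an abstract loss into a "typical prediction error" you can reason about.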

This gives us a more interpretable performance metric:

Remember that our target values are in units of $100,000, so an MSE of 0.2785 corresponds to a root-mean-square error of √0.2785 ≈ 0.528, i.e., a typical prediction error of approximately $52,800. For California housing prices, this represents reasonably accurate predictions!

Looking at Sample Predictions

Let's examine some specific predictions to get a concrete sense of performance:
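A side-by-side comparison takes only a formatting loop. The prediction and target values below are hypothetical stand-ins shaped like the lesson's sample output (units of $100,000), not real model outputs:

```python
import numpy as np

# Hypothetical unscaled predictions vs. actual prices, in $100,000 units.
preds = np.array([0.48, 1.11, 1.85, 2.30, 2.78])
actuals = np.array([0.48, 0.46, 1.61, 2.19, 2.78])

# Format each pair as a dollar amount for easy reading.
lines = [
    f"Predicted: ${p * 100_000:,.0f}   Actual: ${a * 100_000:,.0f}"
    for p, a in zip(preds, actuals)
]
print("\n".join(lines))
```

Printing the first handful of test-set rows like this is a quick gut check that no aggregate metric replaces: it surfaces both the near-perfect predictions and the occasional large miss.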

The output reveals individual prediction quality:

These sample predictions show both the strengths and limitations of our model. We see excellent predictions like the first case (0.48 vs 0.48 — perfect!) and the last case (2.78 vs 2.78). We also see some challenges, like the second prediction where we predicted $111,000 for a house actually worth $46,000. However, most predictions are quite reasonable, especially considering the complexity of real estate valuation.

Conclusion

Congratulations! You've successfully completed the entire journey of building and applying your own neural network library. What you've accomplished in this final lesson represents the true power of the modular, well-designed system you've constructed over the past five lessons. Your achievement is remarkable — you've taken raw California housing data and successfully trained a multi-layer neural network to predict house prices with meaningful accuracy.

The elegance of your solution demonstrates the value of good software design, with clean, readable code that seamlessly integrates data preprocessing, model definition, training, and evaluation. In the upcoming practice exercises, you'll have the opportunity to apply these skills hands-on, building confidence in your ability to tackle real-world machine learning problems with your custom-built neural network library.
