Introduction

Welcome to the first lesson of "Building and Applying Your Neural Network Library", the fourth and final course in our Neural Networks from Scratch using JavaScript path!

So far, we've built a strong foundation in neural network concepts and algorithms. We've explored the theory behind neural networks, implemented forward propagation and activation functions, and learned about backpropagation and optimization techniques. Now, it's time to take the next step: transforming our code into a reusable, modular neural network library in JavaScript.

In this course, we'll take the code we've developed in previous lessons and restructure it into a well-organized, maintainable framework — similar in spirit to popular libraries, but built from scratch in JavaScript! Our first task is to modularize the core components we've already built: dense layers and activation functions. By the end of this lesson, you'll have created a clean JavaScript project structure that separates concerns and makes your neural network code more maintainable and extensible.

The Importance of Software Engineering in ML

Before we dive into implementation details, let's talk about why we're restructuring our code. So far, we've focused on understanding the algorithms that power neural networks — the math, the theory, and the implementation of key concepts. While this understanding is crucial, there's another dimension to building effective machine learning systems: software engineering.

Software engineering principles are vital when building machine learning systems for several key reasons:

  • Maintainability: As models grow in complexity, well-structured code becomes easier to debug and update.
  • Reusability: Modular components can be reused across different projects.
  • Testability: Isolated components with clear interfaces are easier to test.
  • Collaboration: Well-organized code enables multiple people to work on different parts simultaneously.
  • Extensibility: Adding new features becomes simpler when code is properly modularized.

In the industry, machine learning practitioners rarely write monolithic scripts. Instead, they organize code into folders and modules with clearly defined responsibilities. This is the approach we'll take as we build our neural network library in JavaScript.

Our project will be called neuralnets, and we'll structure it with subfolders for different components. This structure separates concerns: activation functions live in their own module, layer implementations in another, and so on. As we continue through this course, we'll expand this structure to include losses, optimizers, and model classes.

Project Directory Structure Overview

Let's take a look at the complete directory structure we'll be building throughout this course, using JavaScript conventions.
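
Here is a sketch of the layout we're aiming for. The placement of package.json and main.js at the project root, and the exact file names inside each folder, are one reasonable arrangement rather than a strict requirement:

```text
project-root/
├── package.json        # enables ES modules for the project
├── main.js             # script we'll use to test the library
└── neuralnets/
    ├── index.js        # the library's public entry point
    ├── activations/
    │   ├── index.js
    │   └── functions.js
    ├── layers/
    │   ├── index.js
    │   └── dense.js
    ├── losses/
    ├── models/
    │   ├── model.js
    │   ├── sequential.js
    │   └── mlp.js
    ├── optimizers/
    └── normalization/
```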

This structure follows JavaScript project conventions and separates different concerns into their own modules:

  • activations/: Contains activation functions and their derivatives.
  • layers/: Houses different layer types (starting with our dense layer).
  • losses/: Will contain loss functions for training (coming in later lessons).
  • models/: Will include high-level model classes, such as model.js, sequential.js, and mlp.js, for building and managing neural network architectures.
  • optimizers/: Will contain optimization algorithms (coming in later lessons).
  • normalization/: Will contain weight initialization strategies.

In this lesson, we'll focus on implementing the activations/ and layers/ modules, along with the main project structure. The other modules will be added as we progress through the course, building up our complete neural network library step by step.

Creating a Modular JavaScript Project Structure

Let's begin by setting up the basic structure of our project. In JavaScript, a module is simply a file that exports code (functions, classes, variables) using the export keyword. Other files can then import these exports using the import keyword.

First, we need to configure our project to use ES modules by creating a package.json file:
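
A minimal package.json only needs the "type" field for our purposes; the name and version shown here are placeholders:

```json
{
  "name": "neuralnets",
  "version": "1.0.0",
  "type": "module"
}
```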

The "type": "module" field tells Node.js to treat .js files as ES modules, allowing us to use import and export statements.

Next, let's create our package's main entry point in neuralnets/index.js:
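
A simple version of this entry point just re-exports the named exports from each submodule (the list will grow as we add more modules later in the course):

```javascript
// neuralnets/index.js — the library's public entry point.
// Re-export key components so users can import everything from one place.
export {
  sigmoid,
  sigmoidDerivative,
  relu,
  reluDerivative,
  linear,
  linearDerivative,
} from './activations/index.js';
export { DenseLayer } from './layers/index.js';
```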

This file serves as the main entry point for our library. It re-exports key components from submodules, making them directly accessible from the package. For example, a user can write:
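
Assuming the neuralnets/ folder sits next to the user's script, that might look like:

```javascript
// Everything comes from the package's single entry point.
import { DenseLayer, sigmoid } from './neuralnets/index.js';
```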

Next, we need to create the subdirectories and their respective index.js files. For the activations submodule (neuralnets/activations/index.js):
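
This index file simply aggregates and re-exports the functions we'll define in functions.js shortly:

```javascript
// neuralnets/activations/index.js — aggregate and re-export the activation functions.
export {
  sigmoid,
  sigmoidDerivative,
  relu,
  reluDerivative,
  linear,
  linearDerivative,
} from './functions.js';
```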

And for the layers submodule (neuralnets/layers/index.js):
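
Likewise, the layers index re-exports the layer classes (here assuming the dense layer implementation lives in a file called dense.js):

```javascript
// neuralnets/layers/index.js — re-export the layer classes.
export { DenseLayer } from './dense.js';
```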

These index.js files create a clean API for our package, making it easy for users to import what they need. With this structure, a user could write import { DenseLayer } from './index.js' rather than having to know the exact module path.

Understanding JavaScript Module Mechanics

Now that we've set up our project structure, let's take a moment to understand some important JavaScript module concepts that will help you work more effectively with modular code and avoid common pitfalls.

  • Named and default exports: In JavaScript, you can export multiple named values from a module using the export keyword. For example, export function sigmoid(x) { ... }. You can also export a single default value using export default. In this project, we'll use named exports for clarity and flexibility; see the short sketch after this list for a side-by-side comparison.

  • Module resolution: When you import a module like import { DenseLayer } from './layers/index.js', JavaScript resolves the relative path to that exact file. Note that Node's ES module loader does not automatically map a bare folder import such as ./layers to ./layers/index.js (that convenience belongs to CommonJS and many bundlers), which is why we spell out index.js in our import paths. The index.js files still give each folder one convenient place that re-exports its components.

  • Best practices for organizing code: It's good practice to keep related code in separate modules and use index.js files to aggregate and re-export key components. This makes your codebase easier to navigate and your API easier to use. Avoid deeply nested imports and keep your public API clean by only exporting what users need.

  • Running your code: You can run your JavaScript code using Node.js by executing node main.js from your project directory. With ES modules enabled via package.json, your files can use import/export syntax seamlessly.
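
To make the first point concrete, here is a small sketch contrasting named and default exports; the file and function names are purely illustrative and not part of our library:

```javascript
// math-utils.js — a module can have many named exports.
export function square(x) {
  return x * x;
}
export const EPSILON = 1e-8;

// logger.js — a module can have at most one default export.
export default function log(message) {
  console.log(`[neuralnets] ${message}`);
}

// consumer.js — named imports use braces; the default import does not.
import { square, EPSILON } from './math-utils.js';
import log from './logger.js';

log(`square(3) = ${square(3)}, epsilon = ${EPSILON}`);
```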

Implementing Activation Functions

As you may recall from our previous courses, our activations module will provide three common activation functions and their derivatives:

  • Sigmoid: A smooth, S-shaped function that maps any input to a value between 0 and 1.
  • ReLU (Rectified Linear Unit): Returns the input if positive; otherwise, returns 0.
  • Linear: Simply returns the input unchanged (used in regression tasks).

Each activation function has a corresponding derivative function used during backpropagation. Recall that our derivative functions expect the output of the activation function rather than the original input, which is a common optimization.

By isolating these functions in their own module, we make them easier to test, document, and extend with new activation functions in the future.

We'll create a file in the activations subfolder (neuralnets/activations/functions.js) and implement these three activation functions:
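
Here is a sketch of what functions.js might contain; following the convention noted above, each derivative takes the activation's output rather than its original input:

```javascript
// neuralnets/activations/functions.js

// Sigmoid squashes any real input into the range (0, 1).
export function sigmoid(x) {
  return 1 / (1 + Math.exp(-x));
}

// Sigmoid derivative, expressed in terms of the sigmoid's output.
export function sigmoidDerivative(output) {
  return output * (1 - output);
}

// ReLU passes positive inputs through and clips negatives to 0.
export function relu(x) {
  return Math.max(0, x);
}

// ReLU derivative, expressed in terms of the ReLU's output.
export function reluDerivative(output) {
  return output > 0 ? 1 : 0;
}

// Linear (identity) activation, handy for regression outputs.
export function linear(x) {
  return x;
}

// The identity function's derivative is always 1.
export function linearDerivative() {
  return 1;
}
```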

For now, we'll use these three implemented activation functions (sigmoid, relu, and linear) in our library.

Implementing the Dense Layer

Now let's implement our DenseLayer class in its own module. Since we've already built and understood the implementation details of this fully connected layer in previous courses, we'll focus on how it fits into our new modular structure and what interface it provides.

The key change in our modularized version is how we import the activation functions. Instead of defining them in the same file, our DenseLayer now imports from our organized package structure (neuralnets/activations/index.js):
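
In a file such as neuralnets/layers/dense.js, that import might look like this:

```javascript
// neuralnets/layers/dense.js
import {
  sigmoid,
  sigmoidDerivative,
  relu,
  reluDerivative,
  linear,
  linearDerivative,
} from '../activations/index.js';
```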

This import statement demonstrates the power of our modular approach—the layer can now cleanly access activation functions without needing to know their implementation details.

By organizing our layer implementation this way, we've created a self-contained component with a clear interface. Users of our library don't need to understand the internal mathematics—they simply create layer instances and call forward and backward methods. This encapsulation is a fundamental principle of good software design and makes our neural network library much more user-friendly.
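
To make that interface concrete, here is a minimal, single-sample sketch of what such a class might look like. The constructor arguments, string-based activation lookup, and initialization scheme are illustrative choices for this sketch, not the course's exact implementation:

```javascript
// neuralnets/layers/dense.js — a minimal sketch of the DenseLayer interface.
import {
  sigmoid, sigmoidDerivative,
  relu, reluDerivative,
  linear, linearDerivative,
} from '../activations/index.js';

// Map activation names to the imported functions and their derivatives.
const ACTIVATIONS = {
  sigmoid: { fn: sigmoid, dfn: sigmoidDerivative },
  relu: { fn: relu, dfn: reluDerivative },
  linear: { fn: linear, dfn: linearDerivative },
};

export class DenseLayer {
  constructor(inputSize, outputSize, activation = 'linear') {
    const { fn, dfn } = ACTIVATIONS[activation];
    this.activation = fn;
    this.activationDerivative = dfn;
    // Small random weights and zero biases; smarter initialization comes later.
    this.weights = Array.from({ length: outputSize }, () =>
      Array.from({ length: inputSize }, () => Math.random() * 0.2 - 0.1)
    );
    this.biases = new Array(outputSize).fill(0);
  }

  // Forward pass for one input vector: output_i = activation(sum_j w_ij * x_j + b_i).
  forward(input) {
    this.input = input;
    this.output = this.weights.map((row, i) =>
      this.activation(row.reduce((sum, w, j) => sum + w * input[j], 0) + this.biases[i])
    );
    return this.output;
  }

  // Backward pass: given dLoss/dOutput, update parameters and return dLoss/dInput.
  // (This sketch updates weights in place; later lessons delegate updates to optimizers.)
  backward(outputGradient, learningRate = 0.1) {
    // Gradient through the activation, using the cached output (see the derivative convention above).
    const delta = outputGradient.map((g, i) => g * this.activationDerivative(this.output[i]));
    // Gradient with respect to the layer's input, needed by the previous layer.
    const inputGradient = this.input.map((_, j) =>
      delta.reduce((sum, d, i) => sum + d * this.weights[i][j], 0)
    );
    // Gradient descent update for weights and biases.
    this.weights = this.weights.map((row, i) =>
      row.map((w, j) => w - learningRate * delta[i] * this.input[j])
    );
    this.biases = this.biases.map((b, i) => b - learningRate * delta[i]);
    return inputGradient;
  }
}
```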

The modular structure also makes our code more maintainable. If we want to add a new activation function, we define it in the activations module; to use it in a layer like DenseLayer, we simply import it there and wire it into the layer's activation handling, keeping the logic clean and modular. This separation of concerns is exactly what makes professional ML libraries so powerful and extensible.

Testing Our Modular Structure: Network Setup

Now that we have our core components in place, let's test everything with a main script. In main.js, we import our modules and set up a simple two-layer network:
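
A sketch of that script might look like the following; the layer sizes and activation names assume the hypothetical DenseLayer constructor sketched earlier:

```javascript
// main.js — build a tiny two-layer network from our library.
import { DenseLayer } from './neuralnets/index.js';

// 3 inputs -> 4 hidden units (ReLU) -> 1 output (sigmoid).
const hiddenLayer = new DenseLayer(3, 4, 'relu');
const outputLayer = new DenseLayer(4, 1, 'sigmoid');

console.log('Created a network with 2 dense layers.');
```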

Testing Our Modular Structure: Forward Pass

Now let's extend our script to perform a forward pass through the network and examine the results:
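
Continuing the same main.js, a forward pass simply chains the layers' forward methods and prints the intermediate and final outputs:

```javascript
// main.js (continued) — run one forward pass through the network.
const input = [0.5, -0.2, 0.1]; // an arbitrary example input vector

const hiddenOutput = hiddenLayer.forward(input);
const prediction = outputLayer.forward(hiddenOutput);

console.log('Hidden layer output:', hiddenOutput);
console.log('Network prediction:', prediction);
```

Running node main.js should print a four-element hidden vector and a single prediction between 0 and 1; the exact numbers will vary because the weights are randomly initialized.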

Conclusion and Next Steps

You've successfully transformed our neural network code into a well-structured JavaScript project using modern ES modules. By separating our code into distinct modules with clear responsibilities, we've taken a big step toward building a robust, reusable neural network library. This modular design provides the foundation for the rest of this course, where we'll continue expanding our library by adding modules for loss functions, optimizers, and a high-level model class.

The journey from understanding neural network principles to building a complete, well-structured library mirrors the path many practitioners take in the field. As you progress through this course, you'll not only deepen your understanding of neural networks but also develop valuable software engineering skills that are essential for real-world machine learning applications in JavaScript. Now, it's time to get ready for some practice — happy coding!
