Introduction

Welcome back! In the previous lesson, you established your project-level constitution with CODEX.md, which defines the rules and standards that govern your entire project. Now we're going to zoom in one level deeper to feature-level specifications - the detailed blueprints that describe individual features within your project.

This lesson will teach you how to break down any feature into a complete specification that leaves no room for ambiguity. You'll learn what sections a specification needs, how to use AI to help generate them, and how to evaluate whether a specification is truly complete. A complete specification is especially critical when working with APIs, where clear communication between different parts of a system can make or break your project.

Key Elements Review

Before we dive into the details, let's refresh our understanding of what a specification actually is. A specification, often shortened to "spec," is a document that describes what a feature or component should do. Notice the emphasis on "what" rather than "how" - a good specification focuses on behavior and outcomes, not implementation details.

In the previous lesson, you encountered a simple specification for an API endpoint. Here's that example again:
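The original block is not reproduced here, but it looked roughly like this (a minimal sketch; the exact endpoint and field names are assumptions):

```
## Spec: GET /status

Purpose: Report whether the service is currently running.

Input: none
Output: { "status": string }   # "ok" when the service is healthy

Behavior: Returns the current service status.
```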

This example is deliberately simple, containing just the bare minimum. Real-world features are more complex, which means their specifications need more depth and detail. The rest of this lesson will show you how to expand from this simple structure into complete, production-ready specifications.

We'll use a running example throughout this lesson: a user profile feature that includes an avatar, bio, and location. This is complex enough to demonstrate all the key concepts while remaining easy to understand.

The Purpose Section

Every specification begins with a purpose statement. This is your one-sentence explanation of what the feature does and why it exists. Think of it as the elevator pitch for your feature.

Here's the purpose statement for our user profile feature:
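A plausible purpose statement for this feature (the exact wording is an assumption) might read:

```
Purpose: Allow users to view and update their public profile,
consisting of an avatar image, a short bio, and a location.
```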

A good purpose statement is clear, concise, and gives immediate context to anyone reading the specification. It answers the fundamental question: "What is this feature for?" Anyone on your team should be able to read the purpose and immediately understand the feature's role in the larger system.

The purpose section serves as the anchor point for the rest of the specification. Every other section should support and elaborate on this central purpose. If you find yourself writing details that don't relate back to the stated purpose, you may need to either revise the purpose or reconsider whether those details belong in this specification.

The Interface Section

The interface section defines the contract between your feature and the rest of the system. This is where you specify exactly what data goes in and what data comes out, along with their types and any constraints.

For our user profile feature, the interface looks like this:
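A sketch of such an interface section is shown below. The 160-character bio limit comes from this lesson; the field names and the other limits are assumptions for illustration:

```
## Interface

Inputs:
- userId (string, required): identifier of the profile to update
- avatarUrl (string, optional): must be a valid HTTPS URL
- bio (string, optional): max 160 characters
- location (string, optional): max 100 characters

Outputs:
- profile (object): the updated profile with all current field values
- updated (boolean): true if any field changed
```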

Notice how each input and output includes several pieces of information. First, we have the name of the field, which should be descriptive and follow your project's naming conventions. Next comes the type - string, object, boolean, etc. Finally, we include constraints like "optional" or "max 160 characters" that further restrict what's valid.

These constraints are crucial. They prevent errors before they happen by establishing clear boundaries. When you specify "max 160 characters" for the bio, you're not just documenting a limit - you're creating a contract that both the sender and receiver must honor. This clarity eliminates ambiguity and makes validation straightforward.

The interface section should be complete enough that someone could implement the feature without asking you any questions about what data is needed or provided. If you find yourself repeatedly answering questions about inputs or outputs, your interface section needs more detail.

The Behavior Section

While the interface tells us what data flows in and out, the behavior section describes what happens to that data. This is where you explain the feature's actions without diving into implementation specifics.

Here's the behavior description for our user profile feature:
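A behavior section along these lines (a sketch consistent with the discussion that follows) might read:

```
## Behavior

- When a profile update is received, the system validates each
  provided field against its constraints.
- Valid new values are saved and returned in the response.
- Fields omitted from the request keep their previous values.
- The response always reflects the complete, current profile.
```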

The key to writing good behavior descriptions is focusing on the "what" rather than the "how." We don't say "the system calls the database update function" or "the controller processes the request" - those are implementation details. Instead, we describe the observable behavior: "the system saves the new values" and "previous values are kept if fields are missing."

This approach has a powerful benefit: it gives implementers flexibility in how they achieve the behavior while ensuring the outcome is consistent. Different developers might implement the same behavior using different techniques, but as long as the observable behavior matches the specification, the feature works correctly.

Think of the behavior section as describing a black box. You can see what goes in, you can see what comes out, and you can describe what transformations occur, but you don't need to see inside the box to understand what it does.

The Constraints Section

Constraints are the rules that must always be true about your feature. They represent invariants - conditions that the system must maintain at all times. While the interface section includes some constraints alongside the data definitions, this section captures broader rules and relationships.

For our user profile feature, the constraints are:
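A constraints section for this feature might look like the following sketch (the bio limit is from this lesson; the other rules are illustrative assumptions):

```
## Constraints

- bio must not exceed 160 characters.
- avatarUrl, when provided, must be a well-formed HTTPS URL.
- A profile always belongs to exactly one existing user.
- Updating a profile never changes the userId.
```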

Each constraint should be testable and verifiable. You should be able to write a check that determines whether the constraint is satisfied. These aren't suggestions or preferences - they're hard rules that the system must enforce.

Constraints often come from business requirements, technical limitations, or data integrity needs. The 160-character limit on the bio might come from a product decision, while the URL validation for the avatar ensures data quality and security.

When writing constraints, be specific and complete. Instead of "bio should be short," write "bio must not exceed 160 characters." The more precise your constraints, the more useful they are for implementation and testing.
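Because each constraint should be testable, it can be turned directly into a check. Here is a minimal sketch in Python; the function name, field names, and the HTTPS-only rule are assumptions based on the running example:

```python
from urllib.parse import urlparse

BIO_MAX_LENGTH = 160  # the limit stated in the specification

def validate_profile_update(bio=None, avatar_url=None):
    """Return a list of constraint violations for a profile update."""
    errors = []
    # Constraint: bio must not exceed 160 characters.
    if bio is not None and len(bio) > BIO_MAX_LENGTH:
        errors.append(f"bio must not exceed {BIO_MAX_LENGTH} characters")
    # Constraint: avatarUrl, when provided, must be a well-formed HTTPS URL.
    if avatar_url is not None:
        parsed = urlparse(avatar_url)
        if parsed.scheme != "https" or not parsed.netloc:
            errors.append("avatarUrl must be a valid HTTPS URL")
    return errors

# A valid update produces no violations; an over-long bio produces one.
print(validate_profile_update(bio="Hello!", avatar_url="https://cdn.example.com/a.png"))
print(validate_profile_update(bio="x" * 161))
```

Because the constraint is phrased precisely, the check is a one-line translation of the specification rather than a guess.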

The Edge Cases Section

Edge cases are the unusual or boundary situations that might not be obvious from the normal behavior description. They represent scenarios that are technically valid but push against the limits of what your specification covers.

Here are the edge cases for our user profile feature:
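An edge cases section for this feature might look like the following sketch (each decision shown here is an illustrative assumption, not a fixed rule):

```
## Edge Cases

- Bio of exactly 160 characters: accepted (the limit is inclusive).
- Empty string for bio: accepted; clears the existing bio.
- Update request with no optional fields: succeeds without changes.
- avatarUrl that is well-formed but unreachable: accepted; the URL
  is validated for format only, not reachability.
```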

Think of edge cases as the scenarios that make you pause and ask, "What should happen here?" They often involve extremes - empty values, maximum values, unusual combinations of inputs, or borderline-invalid data.

Identifying edge cases requires thinking creatively about how your feature might be used in unexpected ways. What happens when a user provides the minimum possible input? What about the maximum? What if they mix optional and required fields in unusual ways?

Good edge case documentation prevents bugs and confusion. When an implementer encounters one of these scenarios, they can refer back to the specification instead of making assumptions. Even better, edge cases often translate directly into test cases, making your testing strategy more comprehensive.

The Error Conditions Section

While edge cases describe unusual but valid scenarios, error conditions describe situations where the feature should explicitly fail. This section defines when things go wrong and how the system should respond.

For our user profile feature, the error conditions are:
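An error conditions section along these lines (a sketch consistent with the constraints discussed earlier) might read:

```
## Error Conditions

- bio longer than 160 characters: return an error.
- avatarUrl that is not a well-formed HTTPS URL: return an error.
- userId that does not match an existing user: return an error.
```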

Each error condition should specify two things: the condition that triggers the error and what happens as a result. "Return an error" is somewhat generic here - in a real specification for an API, you'd also specify the exact error code and message format.

Error conditions are about defensive programming. They define the boundaries of acceptable input and behavior. When you specify an error condition, you're saying "the system must reject this situation and communicate the problem clearly."

Good error condition documentation makes debugging easier and helps create better user experiences. When something goes wrong, clear error specifications ensure that the error messages are helpful and consistent across your entire system.

The Examples Section

Examples transform abstract specifications into concrete reality. They show exactly what inputs produce what outputs, making the specification tangible and testable.

Here are examples for our user profile feature:
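An examples section for this feature might look like the following sketch (the concrete values are illustrative assumptions):

```
## Examples

Example 1 (success):
Input:  { "bio": "Coffee lover in Lisbon", "location": "Lisbon" }
Output: { "profile": { "bio": "Coffee lover in Lisbon",
          "location": "Lisbon", "avatarUrl": null },
          "updated": true }

Example 2 (error):
Input:  { "bio": "<a 200-character string>" }
Output: { "error": "bio must not exceed 160 characters" }
```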

Notice how we provide both success and failure examples. The first example shows the happy path - everything works as expected. The second example demonstrates an error condition with its specific error message.

Good examples are specific and complete. They don't say "some string" - they show actual values. They don't just describe the output - they show the exact format. Examples serve as both documentation and test cases, making them one of the most valuable parts of any specification.

When writing examples, aim to cover the most common scenarios plus at least one example for each edge case and error condition. This gives implementers a comprehensive picture of how the feature should behave across different situations.

Complete Template Reference

Now that you understand each section in detail, here's the complete template you can use for any feature specification. This template brings together all the elements we've discussed into a single, reusable structure.
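Assembled from the sections covered above, the template looks like this:

```
## Feature Specification: <feature name>

### Purpose
One sentence describing what the feature does and why it exists.

### Interface
Inputs and outputs, each with a name, type, and constraints.

### Behavior
What the feature does with its inputs, described as observable
outcomes rather than implementation steps.

### Constraints
Rules that must always hold; each one testable.

### Edge Cases
Unusual but valid scenarios and what should happen in each.

### Error Conditions
Situations that must fail, and how the failure is communicated.

### Examples
Concrete input/output pairs covering success and failure.
```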

For API endpoints specifically, you'll want to add two additional sections after the examples. These sections address HTTP-specific concerns:
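The original names of these two sections are not preserved here; a plausible pair, given the HTTP focus, is status codes and authentication:

```
### Status Codes
Every HTTP status the endpoint can return, with the condition
that triggers each one (e.g., 200, 201, 400, 401, 404, 409).

### Authentication
Whether the endpoint requires authentication, what credentials
or tokens it accepts, and how authorization failures are reported.
```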

Use this template whenever you're designing a new feature, documenting an API endpoint, or clarifying requirements before implementation. The template's structure ensures you don't forget critical details and provides consistency across all your specifications.

Using AI To Generate Specifications

Creating specifications from scratch can be time-consuming, especially when you're starting out. This is where AI can accelerate your workflow significantly. You can use a structured prompt pattern to have AI generate a complete specification based on a brief feature description.

Here's the prompt pattern that works well for specification generation:
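The pattern is a brief feature description followed by the template as explicit structure. A sketch of such a prompt (the exact wording is an assumption):

```
Generate a complete specification for the following feature:

<brief feature description>

Use exactly these sections: Purpose, Interface, Behavior,
Constraints, Edge Cases, Error Conditions, Examples.
For the Interface, give a name, type, and constraints for every
input and output. For Examples, include at least one success
case and one error case with exact values.
```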

Let's see this in action with our user profile feature. Here's how you'd prompt the AI:
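Filling in the feature description, the prompt might read (a sketch, not the lesson's exact wording):

```
Generate a complete specification for the following feature:

A user profile that includes an avatar, a bio (max 160
characters), and a location.

Use exactly these sections: Purpose, Interface, Behavior,
Constraints, Edge Cases, Error Conditions, Examples.
For the Interface, give a name, type, and constraints for every
input and output. For Examples, include at least one success
case and one error case with exact values.
```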

The AI would respond with something like this:
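An abridged sketch of such a response (field names and details are illustrative assumptions):

```
## Feature Specification: User Profile

### Purpose
Allow users to view and update their public profile, consisting
of an avatar, a bio, and a location.

### Interface
Inputs:
- avatarUrl (string, optional): must be a valid HTTPS URL
- bio (string, optional): max 160 characters
- location (string, optional)
Outputs:
- profile (object): the complete, updated profile

### Behavior
- Validates and saves the provided fields; omitted fields keep
  their previous values.

(remaining sections follow the same template)
```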

The key to using AI effectively is providing clear structure in your prompt. The template portion tells the AI exactly what sections to include and what level of detail to provide. You can then take the AI's output, review it, and refine it based on your specific needs.

Remember that AI-generated specifications are starting points, not final products. You should always review them for completeness, accuracy, and alignment with your project's standards. The AI might miss domain-specific constraints or edge cases that you know about from experience.

Real World Examples

Understanding the template is one thing, but seeing it applied to real features brings the concepts to life. Let's examine two specifications from the TaskMaster project: the User Model and the Authentication API. These demonstrate how the template works in practice and show you what production-quality specifications look like.

The User Model Specification

The User Model specification describes the core data structure for authenticated users in the system. Here's the complete specification:
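The original TaskMaster specification is not reproduced here. The following sketch is reconstructed from the qualities discussed below, so the field names, types, and method names are assumptions:

```
## Spec: User Model

### Purpose
Represents an authenticated user of the system, storing their
identity and securely hashed credentials.

### Fields
- id (integer, primary key, auto-generated)
- email (string, required, unique, max 255 characters)
- password_hash (string, required, never exposed in responses)
- created_at (timestamp, defaults to the current time)

### Behavior
- set_password(plain_text): stores a secure hash of the password.
- verify_password(plain_text): returns true if the given password
  matches the stored hash.

### Constraints
- email must be unique across all users.
- Passwords are never stored or logged in plain text.
- password_hash is never included in serialized output.
```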

This specification demonstrates several strong qualities. The purpose is crystal clear - in one sentence, you know this is about authenticated users with secure credentials. The interface (represented here as Fields) is detailed and specific, including types, constraints, and defaults. The behavior section describes methods without revealing implementation, and the constraints are explicit and testable.

However, there are areas where this specification could be improved. It lacks concrete examples showing how to create a user or verify a password. The edge cases section is missing entirely - what happens when someone tries to register with an email that differs only in case from an existing user? These gaps don't make the specification bad, but they do leave room for interpretation.

The Authentication API Specification

The Authentication API specification describes how users register and log in to the system. Here's a key portion of that specification:
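The original excerpt is not reproduced here. The following sketch is reconstructed from the qualities discussed below, so the endpoint paths, payloads, and token format are assumptions:

```
## Spec: Authentication API (excerpt)

### POST /auth/register
Request:
  { "email": "ada@example.com", "password": "correct-horse" }
Response (201 Created):
  { "id": 1, "email": "ada@example.com" }

### POST /auth/login
Request:
  { "email": "ada@example.com", "password": "correct-horse" }
Response (200 OK):
  { "token": "<jwt>", "expires_in": 3600 }

### Security
- Passwords are hashed before storage; plain text is never kept.
- Tokens expire after a fixed interval, after which the user
  must log in again.
```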

This API specification excels at concrete examples. Every endpoint shows exact JSON structures for both requests and responses. The HTTP status codes are included with the responses, making it clear what "success" means for each endpoint. The security section adds important context about token expiration and password hashing.

Where this specification could improve is in error handling. While it shows successful responses, it doesn't provide examples of error responses. What does a 409 Conflict look like when someone tries to register with a duplicate email? How are validation errors formatted? Adding an error response format section and examples of common error cases would make this specification more complete.

These real-world examples show that even production specifications have room for improvement. The goal isn't perfection - it's clarity and completeness. When evaluating a specification, ask yourself: "Could someone implement this feature without asking me any questions?" If the answer is yes, you have a good specification. If not, identify the gaps and fill them.

Summary And Next Steps

You've now learned the complete anatomy of a feature specification, from the single-sentence purpose to the detailed examples. Each section serves a specific role in creating a clear, implementable blueprint for your features. The template provides structure, but the quality of a specification comes from the thought you put into each section.

Good specifications eliminate ambiguity and answer questions before they're asked. They guide implementation, facilitate communication, enable testing, and serve as documentation. The specifications you write will be used by developers, AI, and future maintainers to understand exactly how features should work.

In the next practice session, you'll apply everything you've learned by using AI to generate a specification for a user profile feature, then reviewing and validating its completeness. This hands-on experience will cement your understanding and help you develop the critical eye needed to write and review specifications effectively.
