Introduction

Enabling LLM continuation rails made our chatbot more adaptable by letting the language model handle anything not covered by explicit rules. However, this approach has limits: you can’t directly control the prompts sent to the LLM or how it gathers and formats information. For more advanced, context-aware bots, you need greater control. That’s where variables and Natural Language Description (NLD) come in: they let you store and manipulate data, craft precise prompts, and guide the LLM’s output to fit your needs.

In this lesson, we’ll cover how to use variables in Colang 2.0 to store and manage data, and how to leverage Natural Language Description (NLD) to craft precise prompts and control LLM output. You’ll learn about variable assignment, types, and mutability, as well as best practices for extracting and validating structured information from the LLM.

Variables in Colang 2.0: Assignment, Types, and Mutability

Colang 2.0 allows you to use variables much like Python. Variables start with a $ and are assigned values using =. The variable’s type is inferred from the assigned value.

  • Immutable types (strings, integers, floats) are copied when assigned to another variable.
  • Mutable types (lists, sets, dictionaries) are referenced, so changes to one variable affect all references.

Examples:
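
For instance (all values are illustrative):

    flow main
      $name = "Alice"                       # string, type inferred from the value
      $count = 3                            # integer
      $price = 4.99                         # float
      $topics = ["billing", "shipping"]     # list
      $profile = {"name": "Alice", "tier": "gold"}   # dictionary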

Mutability in Practice:
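
A minimal sketch of copy versus reference behavior (the values are illustrative):

    flow main
      # Immutable type: assignment copies the value
      $original = "hello"
      $copy = $original
      $original = "changed"       # $copy still holds "hello"

      # Mutable type: assignment copies the reference
      $items = ["apples", "tea"]
      $alias = $items
      # $items and $alias now refer to the same list object, so any in-place
      # change made through one of them (for example, appending an item) is
      # visible through the other.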

Variables help you capture user input, maintain state, and transfer data between flows. Understanding mutability helps prevent unexpected changes.

Flow Parameters

Flows in Colang 2.0 can accept parameters, allowing you to pass data into a flow when you activate it. Parameters are defined in the flow header using the $ prefix. This makes flows reusable and flexible.

Example:
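
A minimal sketch (the user name is illustrative):

    import core   # provides the built-in "bot say" flow

    flow greet_user $name
      bot say "Hello, {$name}! Nice to meet you."

    flow main
      $user_name = "Alice"
      greet_user $user_name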

In this example, the greet_user flow takes a $name parameter. When you call the flow from main, you pass in the value of $user_name. Inside the flow, you can use $name just like any other variable.

Key Points:

  • Parameters are only accessible within the flow where they are defined.
  • You can pass multiple parameters by listing them in the flow header: flow example $param1 $param2.
  • When activating a flow, provide arguments in the same order as the parameters, as shown in the sketch below.
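
For instance, a flow with two parameters might look like this (the flow name and values are illustrative):

    import core

    flow recommend product $category $budget
      bot say "For {$category} under {$budget} dollars, I have a suggestion for you."

    flow main
      recommend product "headphones" 100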

Using parameters helps you write modular, maintainable Colang code by avoiding unnecessary global variables and making flows more general-purpose.

Global Variables

By default, variables defined inside a flow have local scope and are not accessible from other flows. To share information between flows, declare a variable as global using the global keyword at the top of each flow that needs access.

Example:
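
A minimal sketch (the flow and variable names are illustrative):

    import core

    flow remember favorite color
      global $favorite_color
      $favorite_color = "blue"

    flow greet with favorite color
      global $favorite_color
      bot say "Your favorite color is {$favorite_color}."

    flow main
      remember favorite color
      greet with favorite color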

You must declare the variable as global in every flow that needs to access or modify it. Otherwise, a local variable with the same name will hide the global one. Use global variables sparingly, as overuse can indicate a non-optimal Colang design.

Natural Language Description (NLD) and the Generation Operator

NLD lets you use everyday language to tell the LLM what you want. The generation operator (...) sends your prompt to the LLM and stores the result in a variable. You don’t need to import anything to use the generation operator in Colang 2.0. It’s built into the language and available by default.

For example, you can ask:
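
One illustrative way to phrase the prompt:

    flow main
      $capital_city = ..."What is the capital of France? Respond with only the city name."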

Here, the prompt inside the quotes is sent to the LLM, and its response (e.g., Paris) is stored in the $capital_city variable.

You can ask the LLM to format and extract structured data:
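
For instance (the prompt wording and variable name are illustrative):

    import core   # needed for "user said"; the ... operator itself needs no import

    flow main
      user said something
      $grocery_items = ..."Extract the grocery items mentioned in the user's last message. Return only a list of strings, e.g. ['milk', 'eggs']."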

The LLM returns a list you can use in your logic.

You can also reference variables in your prompts with curly braces:
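
For instance (the list contents are illustrative):

    import core

    flow main
      $grocery_items = ["milk", "eggs", "bread"]
      $summary = ..."Summarize this shopping list in one friendly sentence: {$grocery_items}"
      bot say $summary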

The LLM sees the actual list and generates a summary.

Best Practices:

  • Specify the format you want (e.g., “Return a list of items in JSON array format”).
  • Use {$variable} to reference variables in prompts.
  • Check the output with helper functions like is_str() or is_list(), as shown in the sketch below.
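
A sketch that combines these practices, using the helper functions mentioned above (the prompt wording is illustrative):

    import core

    flow main
      user said something
      $items = ..."Extract the items mentioned in the user's last message. Return only a list of strings, e.g. ['milk', 'eggs']."
      if is_list($items)
        bot say "I noted down: {$items}"
      else
        bot say "Sorry, I could not read that as a list. Could you rephrase?"
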
Defining Flows with NLD Docstrings

The magic doesn’t stop here. You can use the generation operator and NLD to define not only variables, but entire flows. You can describe a flow’s intent and expected behavior using a docstring at the top. When you use a standalone ... operator inside the flow, Colang uses the docstring as the LLM prompt.

Example:
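
A minimal sketch (the flow name and docstring wording are illustrative):

    import core

    flow respond as travel assistant
      """You are a friendly travel assistant.
      Keep your answers short and polite."""
      user said something
      ...

    flow main
      activate respond as travel assistant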

The docstring sets the assistant’s role and response style. The ... operator triggers the LLM to generate the next step using the docstring as guidance.

When using NLD with the generation operator inside a flow, the LLM’s output is interpreted directly as Colang script. This means you must instruct the LLM to format its response exactly as Colang expects. For example, if you want the bot to reply to the user, your prompt or docstring should specify:
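
An instruction along these lines (the exact wording is up to you):

    Respond only with a single line of the form: bot say "<your reply to the user>"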

This ensures the LLM generates output that Colang can execute directly. If you don’t specify the format, the LLM might return plain text or an unexpected structure, which could cause errors or unintended behavior.

Having a conversation with the LLM through this setup could turn out like this:
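
An illustrative exchange (the actual output depends on your model and prompt):

    User: What is a good time of year to visit Lisbon?
    Bot: Late spring is a great time to visit Lisbon, with mild weather and smaller crowds.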

Important Notes and Troubleshooting NLD

  • NLD may not always yield the format you want. If the LLM’s output isn’t as expected, clarify your prompt (e.g., “Return a string in single quotes”).
  • Always validate outputs with type-checking functions to ensure correct data types.
  • Be precise in your instructions, especially when you need structured data or a specific response format.
  • Use {$var} inside prompts sent to the LLM with the generation operator (...). This inserts the value of the variable directly into the prompt string.
  • Use {{ var }} inside docstrings at the top of a flow. When the docstring is used as an LLM prompt (e.g., with a standalone ...), {{ var }} is replaced with the value of the variable. This is useful for templating the docstring prompt with dynamic values. See the sketch after this list.
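
A compact sketch contrasting the two forms (the flow, variable names, and prompts are illustrative; the {{ var }} usage follows the docstring convention described above):

    import core

    flow explain topic $topic
      """You are a helpful assistant. Briefly explain {{ topic }} to the user.
      Respond only with a single line of the form: bot say "<your explanation>"."""
      ...

    flow main
      $topic = "refund policies"
      # {$topic} is inserted into the prompt string sent with the ... operator
      $one_liner = ..."Write one short sentence about {$topic}."
      bot say $one_liner
      # {{ topic }} in the docstring above is filled in when that docstring is
      # used as the LLM prompt
      explain topic $topic
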
Summary

Variables and NLD in Colang 2.0 give you fine-grained control over data and LLM-driven actions. You can store and manipulate information, extract structured responses, and guide the LLM to generate the exact output your application needs. This enables you to build more capable, adaptable bots. In the next practice, you’ll apply these features to create advanced conversational flows.
