In this lesson, you'll learn how to make your Colang 2.0 bots more conversational and adaptive by integrating Large Language Model (LLM) capabilities using the `llm` module. We'll see how LLM flows can help your bot handle a wider range of user messages and generate more natural replies. By the end, you'll understand the difference between strict, rule-based flows and the dynamic flexibility that LLMs bring to Colang.
Without LLMs, your bot can only respond to the exact phrases you define. If a user says something unexpected, the bot won’t know how to reply.
For example, consider a bot that only defines strict flows for the phrases "help" and "support":
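A minimal sketch of such a bot (the flow names and reply texts here are illustrative, not taken from the lesson):

```colang
import core

flow main
  activate handling help
  activate handling support

flow handling help
  user said "help"
  bot say "How can I help you?"

flow handling support
  user said "support"
  bot say "Let me connect you to support."
```

If the user says "help" or "support", the matching flow replies; any other input ("hi", "what can you do?") simply gets no response.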
Here, the bot only responds to "help" or "support". Any other input is ignored. This isn't much different from using traditional string-comparison `if` statements. To move beyond these limitations and make your bot truly conversational, you can bring in the power of LLMs.
To enable LLM-powered features in your Colang project, import the `llm` module along with the `core` module at the top of your `.co` file:

- `import core`: Loads the essential flows and actions from the Colang Standard Library.
- `import llm`: Adds LLM-driven flows and actions to your project.
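As a sketch, the top of the file would look like this:

```colang
# top of your .co file
import core  # essential flows and actions from the Colang Standard Library
import llm   # LLM-driven flows and actions
```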
By importing the `llm` module and activating the `llm continuation` flow, your bot can handle a much wider variety of user inputs. The `llm continuation` flow lets the LLM generate responses for anything not covered by your defined flows.
Sample conversation:
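As an illustration, here is a bot that defines only a help flow but also activates `llm continuation` (the flow name `handling help` and the reply text are illustrative):

```colang
import core
import llm

flow main
  activate llm continuation
  activate handling help

flow handling help
  user said "help"
  bot say "How can I help you?"
```

A hypothetical exchange: the user asks "Tell me a joke" (answered by the LLM), then says "help" (matched by `handling help`), then asks "What else can you do?" (answered by the LLM again).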
- The first and third inputs are handled by the LLM.
- The second input matches the help flow.
The `llm` module also enables more conversational and varied responses using the `bot say something like $text` action. This action prompts the LLM to generate a response similar to your text, but phrased in more natural language.
For example:
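A sketch of a greeting flow using this action (the trigger phrase and the prompt text are illustrative):

```colang
import core
import llm

flow main
  activate llm greeting

flow llm greeting
  user said "hi"
  bot say something like "Welcome! Nice to see you."
```

Instead of repeating the literal text, the bot might answer with a variation such as "Hey there, great to have you here!" — the exact wording varies between runs because the LLM generates it.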
Here, the LLM creates a friendly, context-aware greeting, rather than repeating the exact phrase.
By importing and activating the `llm` module in Colang 2.0, you can transform your bot from a rigid, rule-based system into a flexible, intelligent conversational agent. The `llm continuation` flow and the `bot say something like` action allow your bot to handle a wide range of user inputs and respond in a more natural, engaging way.
In the next practice, you’ll get to experiment with LLM flows and see how they can improve your own Colang projects.
