
2. Adding Tools

Create your first custom tool that provides user data to the LLM

Overview

Tools allow your agent to perform actions beyond chatting. This guide shows how to create a tool that returns registered users from a service, providing context for the LLM.

This builds on the Getting Started guide.

Step 1: Create the Tool Specification

Create specs/get_user_tool.yaml:

specs/get_user_tool.yaml
kind: ToolNode
name: get_user_by_name
description: Find a specific user by their name
func: main.get_user_by_name
arguments:
  - name: user_name
    type: string
    description: The name of the user to find
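
For intuition, a ToolNode spec like this generally corresponds to a function-calling schema that gets handed to the model. The sketch below is illustrative only (it uses an OpenAI-style schema; Liman's actual internal representation may differ), showing how the spec's fields map onto such a schema:

```python
# Illustrative only: how a parsed ToolNode spec could map onto an
# OpenAI-style function-calling schema. Liman's real format may differ.
spec = {
    "kind": "ToolNode",
    "name": "get_user_by_name",
    "description": "Find a specific user by their name",
    "func": "main.get_user_by_name",
    "arguments": [
        {
            "name": "user_name",
            "type": "string",
            "description": "The name of the user to find",
        }
    ],
}

def to_function_schema(spec: dict) -> dict:
    """Build an OpenAI-style tool schema from a parsed ToolNode spec."""
    return {
        "type": "function",
        "function": {
            "name": spec["name"],
            "description": spec["description"],
            "parameters": {
                "type": "object",
                "properties": {
                    arg["name"]: {
                        "type": arg["type"],
                        "description": arg["description"],
                    }
                    for arg in spec["arguments"]
                },
                "required": [arg["name"] for arg in spec["arguments"]],
            },
        },
    }

print(to_function_schema(spec)["function"]["name"])  # get_user_by_name
```

The `name`, `description`, and argument descriptions are what the LLM actually sees, so write them as if explaining the tool to a colleague.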

Step 2: Implement the Function

Add the get_user_by_name function to your main.py:

main.py
def get_user_by_name(user_name: str) -> str:
    users = [
        {"id": "1", "name": "Alice", "email": "alice@example.com"},
        {"id": "2", "name": "Bob", "email": "bob@example.com"}
    ]

    for user in users:
        if user["name"].lower() == user_name.lower():
            return f"Found user: {user['name']} (ID: {user['id']}, Email: {user['email']})"

    return f"User '{user_name}' not found. Available users: Alice, Bob"
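
Before wiring the tool into the agent, you can sanity-check the function on its own (the function body is repeated here so the snippet runs standalone):

```python
def get_user_by_name(user_name: str) -> str:
    users = [
        {"id": "1", "name": "Alice", "email": "alice@example.com"},
        {"id": "2", "name": "Bob", "email": "bob@example.com"},
    ]
    for user in users:
        if user["name"].lower() == user_name.lower():
            return f"Found user: {user['name']} (ID: {user['id']}, Email: {user['email']})"
    return f"User '{user_name}' not found. Available users: Alice, Bob"

# Matching is case-insensitive, so "alice" works as well as "Alice".
print(get_user_by_name("alice"))  # Found user: Alice (ID: 1, Email: alice@example.com)
print(get_user_by_name("Pavel"))  # User 'Pavel' not found. Available users: Alice, Bob
```

Returning a plain string (including the "not found" case) keeps the output easy for the LLM to relay back to the user.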

Step 3: Update Your LLM Node

Add the tool's name to your LLM spec:

specs/chat.yaml
kind: LLMNode
name: chat
prompts:
  system:
    en: |
      You are a helpful assistant.
      Always be polite and provide clear answers.
tools: 
  - get_user_by_name

Step 4: Test the Tool

Run your agent:

bash
python main.py

# for detailed output, you can enable debug mode:
# LIMAN_DEBUG=1 python main.py

Try these examples:

You: Who is Alice?
Agent: Alice is a user with the email address alice@example.com and an ID of 1.

You: What emails do Bob and Pavel have?
Agent: I found the details for Bob: his email is bob@example.com. Unfortunately, I couldn't find any user named Pavel. The available users are Alice and Bob.

How It Works

  1. User asks about a specific user by name
  2. LLM extracts the name from the user's request (e.g., "Alice" from "Who is Alice?")
  3. Tool is called with the extracted name as parameter
  4. Function searches the user database and returns result or "not found"
  5. LLM formats the response naturally for the user

This demonstrates how tools can accept parameters and how LLMs can extract relevant information from natural language to call functions properly.

Next Steps

Now you have a custom tool that provides data to your LLM! But creating multiple tools can be tedious.
In the next guide, we will explore how to automatically generate tools from OpenAPI specifications.
