When building AI agents, you often need to connect them to existing APIs. This usually means writing custom functions for each endpoint.
What if you could skip that work entirely?
With Liman's OpenAPI integration, you can generate agent tools automatically from the API specification.
Picture this: your business has a CMS with dozens of endpoints. Instead of building admin panels, your team just talks to it: "Make mike@example.com an admin", "Archive all drafts from last month", "Show me Alice's orders".
You can wire it up with just a few lines of code.
Here's how it works.
How Liman Generates Tools
Liman takes a declarative YAML approach. It reads your OpenAPI specification and creates a tool for each endpoint automatically.
1. Specification Analysis
When you call load_openapi(OPENAPI_SPEC_URL), Liman:
- Fetches and parses the OpenAPI document
- Extracts endpoint paths and HTTP methods
- Maps parameter definitions (path, query, body parameters)
- Converts operation descriptions into tool descriptions
2. Function Creation
Liman generates Python functions that make HTTP requests. These functions handle:
- URL construction with path parameters
- Request formatting for different content types
- Error handling and response parsing
The generated functions are stored in a dynamically created module.
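To make that concrete, here is a minimal sketch of what one generated wrapper might look like conceptually. This is an illustration only, not the code Liman actually emits; the httpx usage and the base_url value are assumptions for the example:

import httpx

# Illustrative sketch only - not Liman's actual generated code
async def get_user(user_id: str) -> dict:
    """Get a user by their ID (taken from the operation description)."""
    async with httpx.AsyncClient(base_url="http://localhost:8000") as client:
        response = await client.get(f"/users/{user_id}")  # path parameter substituted into the URL
        response.raise_for_status()                        # basic error handling
        return response.json()                             # parsed response body

Writing a wrapper like this by hand for every endpoint is exactly the boilerplate Liman removes.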
Real Example: User Management API
I built a user management API to demonstrate this. Watch how it works:
Try it yourself: Clone the example repo and follow the README instructions.
Liman Auto-Generated Agent Architecture
Here's the complete agent that Liman generated from the OpenAPI specification:
What Liman created:
- 1 LLMNode that orchestrates conversations
- 6 OpenAPI tools auto-generated from API endpoints
- 1 custom tool for user confirmations
- Automatic connections between all components
Step-by-Step Implementation
Want to build this yourself? Here's how to create an AI agent for an OpenAPI-compliant service, or just clone the sample repo.
Step 1. Project Setup
Create a new directory for your project and navigate into it:
uv init liman-openapi-agent
cd liman-openapi-agent
uv venv
source .venv/bin/activate
Step 2. Create the API Server
First, install FastAPI:
uv add "fastapi[standard]"
Create a simple user management API in server.py (copy-paste this code):
from typing import Any

from fastapi import FastAPI, HTTPException
from fastapi.routing import APIRoute
from pydantic import BaseModel

app = FastAPI(
    title="User Management API Liman Demo",
    description="FastAPI implementation based on the Liman Simple OpenAPI sample",
    version="1.0.0",
)
class User(BaseModel):
    id: str
    name: str
    email: str
    region: str
    is_admin: bool


class UpdateUserRequest(BaseModel):
    name: str | None = None
    email: str | None = None
    is_admin: bool | None = None
users = {
    "12345": {
        "id": "12345",
        "name": "John",
        "email": "john@example.com",
        "region": "US",
        "is_admin": True,
    },
    "67890": {
        "id": "67890",
        "name": "Max",
        "email": "max@example.com",
        "region": "EU",
        "is_admin": True,
    },
    "54321": {
        "id": "54321",
        "name": "Alice",
        "email": "alice@example.com",
        "region": "US",
        "is_admin": False,
    },
    "17821": {
        "id": "17821",
        "name": "Sofia",
        "email": "sofia@example.com",
        "region": "EU",
        "is_admin": False,
    },
    "99821": {
        "id": "99821",
        "name": "Bob",
        "email": "bob@example.com",
        "region": "EU",
        "is_admin": False,
    },
}
@app.get("/users/{user_id}", response_model=User)
def get_user(user_id: str) -> User:
    """Get a user by their ID"""
    user = users.get(user_id)
    if not user:
        raise HTTPException(status_code=404, detail=f"User with ID {user_id} not found")
    return User.model_validate(user)


@app.get("/users", response_model=list[User])
def list_users() -> list[User]:
    """List all users in the system"""
    return [User.model_validate(user) for user in users.values()]


@app.get("/users/search/name/{name}", response_model=User)
def search_user(name: str) -> User:
    """Search a user by name"""
    for user in users.values():
        if user["name"].lower() == name.lower():
            return User.model_validate(user)
    raise HTTPException(status_code=404, detail=f"No user found with name {name}")


@app.put("/users/{user_id}", response_model=dict[str, Any])
def update_user(user_id: str, request: UpdateUserRequest) -> dict[str, Any]:
    """Update a user by ID"""
    if user_id not in users:
        raise HTTPException(status_code=404, detail=f"User with ID {user_id} not found")
    user = users[user_id]
    changed = False
    for field in ["name", "email", "is_admin"]:
        value = getattr(request, field, None)
        if value is not None and value != user.get(field):
            user[field] = value
            changed = True
    return {"was_changed": changed, "user": user}


@app.get("/users/search/region/{region}", response_model=list[str])
def search_by_region(region: str) -> list[str]:
    """Find user IDs by region"""
    user_ids = [
        user["id"]
        for user in users.values()
        if user["region"].lower() == region.lower()
    ]
    if not user_ids:
        raise HTTPException(
            status_code=404, detail=f"No users found in region {region}"
        )
    return user_ids


@app.delete("/users/{user_id}")
def delete_user(user_id: str) -> dict[str, str]:
    """Delete a user by ID"""
    if user_id in users:
        del users[user_id]
    else:
        raise HTTPException(status_code=404, detail=f"User with ID {user_id} not found")
    return {"message": f"User with ID {user_id} has been deleted"}


def use_route_names_as_operation_ids(app: FastAPI) -> None:
    """
    Simplify operation IDs so that generated API clients have simpler function
    names.

    Should be called only after all routes have been added.
    """
    for route in app.routes:
        if isinstance(route, APIRoute):
            route.operation_id = route.name


use_route_names_as_operation_ids(app)
Start the server:
fastapi run server.py
Check the auto-generated OpenAPI docs at http://localhost:8000/docs.
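Before wiring up the agent, you can also sanity-check the API from Python. This is an optional smoke test and assumes the server from the previous step is running on localhost:8000:

import json
from urllib.request import urlopen

# Fetch one of the seeded users directly from the running server
print(json.load(urlopen("http://localhost:8000/users/12345")))
# Should print something like:
# {'id': '12345', 'name': 'John', 'email': 'john@example.com', 'region': 'US', 'is_admin': True}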
Step 3. Agent Configuration
Liman uses declarative YAML specifications. Create an LLMNode configuration in specs/chat.yaml:
kind: LLMNode
name: chat
prompts:
  system:
    en: |
      You are a helpful assistant that works through chat.
      You can execute tools in parallel or sequentially as needed.
      Always reconsider your approach if the initial solution doesn't work.
Add the auto-generated tools. They follow the pattern {prefix}__{operationId}, where prefix defaults to "OpenAPI":
kind: LLMNode
name: chat
prompts:
  system:
    en: |
      You are a helpful assistant that works through chat.
      You can execute tools in parallel or sequentially as needed.
      Always reconsider your approach if the initial solution doesn't work.
tools:
  - OpenAPI__get_user
  - OpenAPI__search_user
  - OpenAPI__search_by_region
  - OpenAPI__list_users
The tools with the OpenAPI__ prefix are automatically generated from the API specification.
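Because the server sets operation IDs from the FastAPI route names (via use_route_names_as_operation_ids), the mapping from endpoint to tool name is easy to predict:

# Tool names follow {prefix}__{operationId}; the operation IDs are the route function names
operation_ids = ["get_user", "search_user", "search_by_region", "list_users"]
print([f"OpenAPI__{op}" for op in operation_ids])
# ['OpenAPI__get_user', 'OpenAPI__search_user', 'OpenAPI__search_by_region', 'OpenAPI__list_users']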
With this declaration, you get the following node graph:
Step 4. Install Dependencies
Install Liman and LangChain:
uv add liman liman-openapi langchain langchain-openai langchain-google-genai
Create the agent (minimal code required):
import asyncio

from liman.agent import Agent
from liman_openapi import create_tool_nodes, load_openapi
from langchain_openai import ChatOpenAI


async def main():
    llm = ChatOpenAI(
        api_key="",  # Replace with your OpenAI API key
        model="gpt-4o",
    )
    agent = Agent("./specs", start_node="LLMNode/chat", llm=llm)

    # Generate tools from OpenAPI
    openapi = load_openapi("http://localhost:8000/openapi.json")
    create_tool_nodes(openapi, agent.registry, base_url="http://localhost:8000")

    # Agent is ready to use API endpoints as tools
    while True:
        user_input = input("Input: ")
        response = await agent.step(user_input)
        print(f"Agent: {response}")
        print("-" * 20)


if __name__ == "__main__":
    asyncio.run(main())
The create_tool_nodes function generates and registers all tools from the OpenAPI specification.
Step 5. Test Your Agent
Start the agent:
python main.py
Input: What can you do?
Agent: I can interact with various tools to help you manage user data. Here are some of the things I can do:
1. **Get a User by ID**: Retrieve detailed information about a user by their unique ID.
2. **Search Users by Name**: Find users based on their name.
3. **Find User IDs by Region**: Search for user IDs in a specific region.
4. **List All Users**: Provide a list of all users in the system.
Let me know if you need help with any of these tasks!
--------------------
Input: Find the users Alice and Pavel
Agent: I found the user Alice in the system:
- **Alice**
- ID: 54321
- Email: alice@example.com
- Region: US
- Is Admin: No
However, I couldn't find any user with the name Pavel in the system. If you have more details or another query, let me know!
--------------------
Input:
Step 6. Add POST/PUT/DELETE Requests
Liman parses all HTTP methods, including POST/PUT with request bodies and DELETE operations. It also handles OpenAPI $ref schema definitions for complex input validation.
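If you want to see the $ref that gets resolved for the update endpoint, you can peek at the generated spec. This is a quick optional check, assuming the server from Step 2 is still running:

import json
from urllib.request import urlopen

spec = json.load(urlopen("http://localhost:8000/openapi.json"))
put_schema = spec["paths"]["/users/{user_id}"]["put"]["requestBody"]["content"]["application/json"]["schema"]
print(put_schema)  # should print a reference like {'$ref': '#/components/schemas/UpdateUserRequest'}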
Add the new tools to your agent:
kind: LLMNode
name: chat
prompts:
  system:
    en: |
      You are a helpful assistant that works through chat.
      You can execute tools in parallel or sequentially as needed.
      Always reconsider your approach if the initial solution doesn't work.
tools:
  - OpenAPI__get_user
  - OpenAPI__search_user
  - OpenAPI__search_by_region
  - OpenAPI__list_users
  - OpenAPI__update_user
  - OpenAPI__delete_user
Step 7. Add Confirmation for Safety
For destructive operations like deleting or updating users, add a confirmation step.
Create a custom ToolNode in specs/confirm_tool.yaml:
kind: ToolNode
name: confirm_action
description: |
  Call this function before any destructive action, such as deleting or updating a user.
  It asks the user for confirmation before proceeding with the action.
func: main.confirm_action
arguments:
  - name: user_id
    type: string
    description: ID of the user to confirm the action for
  - name: action
    type: string
    description: The action to confirm, e.g., "delete user"
  - name: title
    type: string
    description: The title of the confirmation dialog with all needed info
Add the function referenced in func: main.confirm_action to your main.py:
def confirm_action(title: str, action: str, user_id: str) -> str:
    print(f"{title}? Y/N", end=": ")
    res = input()
    if res.lower() == "y":
        return f"{action} confirmed. User ID: {user_id}"
    else:
        return f"{action} cancelled."
Now we need to add this tool to the LLMNode configuration:
kind: LLMNode
name: chat
prompts:
  system:
    en: |
      You are a helpful assistant that works through chat.
      You can execute tools in parallel or sequentially as needed.
      Always reconsider your approach if the initial solution doesn't work.
tools:
  - OpenAPI__get_user
  - OpenAPI__search_user
  - OpenAPI__search_by_region
  - OpenAPI__list_users
  - OpenAPI__update_user
  - main.confirm_action
With this setup, the agent will now ask for confirmation before updating users:
The flow now includes a confirmation step where the LLM first calls the confirm tool, waits for user input, and only then proceeds with the update operation if confirmed.
Step 8. Run the Agent
Let's try something more complex:
python main.py
Input: Revoke all admins rights for users in EU region, and explain the decision steps you did 1 by 1.
Revoke admin rights for Max (User ID: 67890) in the EU region? Y/N: y
Agent: The admin rights for Max (User ID: 67890) have been successfully revoked. Here's a summary of the action taken:
1. **Identified Users in EU Region:** Retrieved users with IDs 67890, 17821, and 99821 in the EU region.
2. **Checked Admin Status:** Confirmed that only Max had admin rights.
3. **Confirmed Action:** Secured confirmation to proceed with revoking admin rights for Max.
4. **Updated User:** Successfully revoked admin rights for Max, changing his admin status to `false`.
If you need further actions or assistance, feel free to ask!
--------------------
Input: Delete all users that name starts from A
Delete user Alice (User ID: 54321)? Y/N: y
Agent: Alice (User ID: 54321) has been successfully deleted from the system. If you need any additional changes or have further queries, feel free to ask!
Multiple APIs?
You can connect multiple APIs to the same agent:
# User management API
user_api = load_openapi("http://localhost:8000/openapi.json")
create_tool_nodes(user_api, agent.registry, base_url="http://localhost:8000", prefix="Users")
# Payment API
payment_api = load_openapi("http://localhost:8001/openapi.json")
create_tool_nodes(payment_api, agent.registry, base_url="http://localhost:8001", prefix="Payments")
Why This Matters
Less Boilerplate
Instead of writing custom wrapper functions for each API endpoint, you get them automatically. No more requests.get() calls with manual error handling for every single endpoint.
No More API Documentation Drift
Your agent tools stay in sync with your API automatically. When you update an endpoint, the tool updates too.
Skip the UI Development
Building admin panels takes weeks. Building forms for every endpoint is tedious. With Liman, your team gets a chat interface that understands all your APIs instantly.
Reduce Onboarding Time
New team members don't need to learn your internal UI. They just ask: "Show me all users from last month" or "Archive inactive projects". The learning curve decreases.
Eliminate Integration Maintenance
API changes? No problem. Schema updates? Handled automatically. You restart the agent and everything just works. No more hunting down broken integration code across multiple repositories.
What's Next?
You may ask about authentication and state management - they are coming in future articles. For now, check out the example repo.
Do you like what you see?
Give a ⭐ on GitHub.