Anam’s tool calling system enables AI personas to perform actions beyond conversation. During a session, the LLM can decide when to invoke tools based on user intent, making your personas capable of:
Searching knowledge bases using semantic search (RAG)
Calling external APIs via webhooks
Executing custom business logic
Tools make your AI personas agentic, capable of taking actions to help users accomplish their goals.
Beta Feature: Tool calling is currently in beta. You may encounter some
issues as we continue to improve the feature. Please report any feedback or
issues to help us make it better.
The system executes the tool based on its type:
- Client tools: an event is sent to your client application
- Knowledge tools: a semantic search is performed on your documents
- Webhook tools: an HTTP request is sent to your API endpoint
5. Response integration
The tool result is returned to the LLM, which incorporates it into a natural language response.
The user hears a complete, informed answer without knowing the technical orchestration behind the scenes.
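The full loop, from tool decision to final answer, can be sketched as follows. This is a conceptual illustration only: `decide` and `execute` are stubbed stand-ins, not Anam internals, and the return shapes are assumptions.

```typescript
// Sketch of the tool-calling loop. All interfaces here are illustrative
// stand-ins, not the real Anam implementation.
type ToolCall = { name: string; arguments: Record<string, string> };
type Decision = { toolCall?: ToolCall; reply?: string };

async function respond(
  userMessage: string,
  decide: (msg: string, toolResult?: string) => Promise<Decision>,
  execute: (call: ToolCall) => Promise<string>
): Promise<string> {
  // 1. The LLM decides whether a tool is needed.
  const first = await decide(userMessage);
  if (!first.toolCall) return first.reply ?? '';
  // 2-4. The matching tool runs (client event, knowledge search, or webhook).
  const result = await execute(first.toolCall);
  // 5. The result is fed back to the LLM for a natural-language answer.
  const second = await decide(userMessage, result);
  return second.reply ?? '';
}
```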
Client tools trigger events in your client application, enabling the persona to control your user interface.

Common use cases:
Opening product pages or checkout flows
Displaying modals or notifications
Navigating to specific sections of your app
Updating UI state based on conversation
```typescript
{
  type: 'client',
  name: 'open_checkout',
  description: 'Opens the checkout page when user wants to purchase',
  parameters: {
    type: 'object',
    properties: {
      productId: {
        type: 'string',
        description: 'The ID of the product to checkout'
      }
    },
    required: ['productId']
  }
}
```
Client tools work well for creating voice-driven user experiences
where the AI can guide users through your application.
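As a sketch, a client-side handler for an `open_checkout` tool might map the incoming tool call to a route in your app. The payload shape below is an assumption for illustration, not the exact SDK type.

```typescript
// Hypothetical payload shape for a client tool call event.
type ClientToolCall = { name: string; arguments: { productId?: string } };

// Map a tool call to the route the app should navigate to.
function routeForToolCall(call: ClientToolCall): string {
  if (call.name === 'open_checkout' && call.arguments.productId) {
    // In a real app: router.push(...) or window.location.assign(...)
    return `/checkout/${call.arguments.productId}`;
  }
  return '/'; // unknown tool or missing parameter: stay put
}
```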
Knowledge tools enable semantic search across your uploaded documents using Retrieval-Augmented Generation (RAG).

Common use cases:
Answering questions from product documentation
Searching company policies or FAQs
Retrieving information from manuals or guides
Providing accurate, source-based responses
```typescript
{
  type: 'server',
  subtype: 'knowledge',
  name: 'search_documentation',
  description: 'Search product documentation when user asks questions',
  documentFolderIds: [
    '550e8400-e29b-41d4-a716-446655440000',
    '6ba7b810-9dad-11d1-80b4-00c04fd430c8'
  ]
}
```
Folder IDs are UUIDs. You can find them in the Anam Lab UI at /knowledge or
via the API when creating folders.
Knowledge tools handle search automatically:
They understand the user’s intent, not just keywords.
They find the most relevant snippets from your documents.
They provide this information to the AI to form an accurate answer.
Knowledge tools require you to upload and organize documents in knowledge
folders before they can be used. Learn more in the Knowledge Base
documentation.
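Conceptually, the retrieval step in RAG ranks document snippets by embedding similarity to the user's question. A generic illustration with toy vectors (not Anam's actual implementation):

```typescript
// Toy cosine-similarity ranking over pre-embedded snippets.
function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

// Return the snippet whose embedding is closest to the query embedding.
function topSnippet(
  query: number[],
  snippets: { text: string; vec: number[] }[]
): string {
  return snippets.reduce((best, s) =>
    cosine(query, s.vec) > cosine(query, best.vec) ? s : best
  ).text;
}
```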
Webhook tools call external HTTP endpoints, allowing your persona to integrate with any API and take actions in external systems.

Common use cases:
Checking order or shipment status
Creating support tickets
Updating CRM records
Fetching real-time data (weather, stock prices, etc.)
Triggering workflows in external systems
```typescript
{
  type: 'server',
  subtype: 'webhook',
  name: 'check_order_status',
  description: 'Check the status of a customer order',
  url: 'https://api.example.com/orders',
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${process.env.API_SECRET}`,
    'X-Organization-ID': 'org-uuid'
  },
  parameters: {
    type: 'object',
    properties: {
      orderId: {
        type: 'string',
        description: 'The order ID to check'
      }
    },
    required: ['orderId']
  },
  awaitResponse: true
}
```
Set awaitResponse: false for fire-and-forget webhooks like logging or
notifications where you don’t need the response data.
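On the receiving side, your endpoint gets the extracted parameters. A minimal sketch of the handler logic, assuming the request body carries the tool's parameters as JSON — the exact payload shape Anam sends is an assumption here:

```typescript
// Hypothetical request body: the tool's parameters, as defined in its schema.
interface OrderStatusRequest { orderId: string }
interface OrderStatusResponse {
  status: number;
  body: { orderId: string; state: string } | { error: string };
}

// Pure handler logic; wire this into Express, Fastify, etc. as needed.
function handleOrderStatus(
  req: OrderStatusRequest,
  orders: Map<string, string>
): OrderStatusResponse {
  const state = orders.get(req.orderId);
  if (!state) return { status: 404, body: { error: `Unknown order ${req.orderId}` } };
  // With awaitResponse: true, this body is what the LLM sees.
  return { status: 200, body: { orderId: req.orderId, state } };
}
```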
For client tools, use registerToolCallHandler to define per-tool handlers; the SDK automatically emits completed or failed events when a handler finishes.
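That lifecycle can be sketched with a toy registry — note this is an illustration, not the real SDK, and the actual registerToolCallHandler signature may differ:

```typescript
type Handler = (args: Record<string, unknown>) => Promise<string>;
type Outcome = { status: 'completed' | 'failed'; result?: string };

// Toy stand-in for per-tool handler registration and result reporting.
class ToolHandlers {
  private handlers = new Map<string, Handler>();

  registerToolCallHandler(name: string, handler: Handler): void {
    this.handlers.set(name, handler);
  }

  // Runs the matching handler and reports completed/failed, mimicking
  // what the SDK does automatically for client tools.
  async dispatch(name: string, args: Record<string, unknown>): Promise<Outcome> {
    const handler = this.handlers.get(name);
    if (!handler) return { status: 'failed' };
    try {
      return { status: 'completed', result: await handler(args) };
    } catch {
      return { status: 'failed' };
    }
  }
}
```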
The tool description helps the LLM understand when to use the tool. Be specific and include context.
```typescript
{
  name: 'check_order_status',
  description: 'Check the status of a customer order when they ask about delivery, tracking, or order updates. Use the order ID from the conversation.'
}
```
Use JSON Schema to define parameters with detailed descriptions:
```typescript
parameters: {
  type: 'object',
  properties: {
    orderId: {
      type: 'string',
      description: 'The order ID mentioned by the user, typically in format ORD-12345'
    },
    includeTracking: {
      type: 'boolean',
      description: 'Whether to include detailed tracking information'
    }
  },
  required: ['orderId']
}
```
The LLM extracts parameter values from the conversation. If a required
parameter isn’t available, the LLM may ask the user for clarification or skip
the tool call.
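The required-parameter behavior can be illustrated with a small validation sketch (a hypothetical helper, not part of the SDK):

```typescript
// Check extracted arguments against a JSON-Schema-style required list.
function missingParams(
  schema: { required?: string[] },
  args: Record<string, unknown>
): string[] {
  return (schema.required ?? []).filter((key) => args[key] === undefined);
}
```

If the list is non-empty, the LLM would typically ask a follow-up question rather than call the tool with incomplete arguments.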