Custom Tools
Integrate your custom API endpoints into your LLMs
What are Custom Tools?
Custom Tools enable AI agents to execute custom actions by integrating with your API systems and services.
Tool Provider
Before creating a custom tool, it’s important to understand how tools are organized in Stack AI. Tools are grouped under “Providers” - these are the main services or systems that contain related functionality. Think of a Provider as a container for multiple related tools.
For example:
- Salesforce (Provider)
- Create Lead (Tool)
- Update Contact (Tool)
- Search Records (Tool)
This organization makes it easy to find and use related tools. Additionally, providers share authentication headers and common access methods, allowing tools within the same provider to seamlessly utilize the same authentication and connection details when performing their actions.
Custom Tool
A custom tool represents a specific API endpoint and its functionality. Each tool has several key components:
- **Name**: A unique identifier for the tool that can be referenced in LLM prompts. For example, if you name a tool `addPet`, you would reference it as “addPet” when instructing the LLM to use it.
- **Description**: A clear explanation of what the tool does. This helps the LLM understand when and how to use the tool appropriately.
- **Path**: The API endpoint path that the tool will call (e.g., `/api/v1/pets`)
- **Method**: The HTTP method to use (GET, POST, PUT, DELETE, etc.)
When the LLM needs to create a new pet, it can reference the `addPet` tool by name and provide the necessary parameters based on the tool’s description and requirements.
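The components above can be sketched as a single tool definition. This is purely illustrative: the `addPet` tool, its path, and the field names are the hypothetical example from this page, not Stack AI’s internal representation:

```python
# Hypothetical definition of the "addPet" custom tool described above.
# The keys mirror the four components a tool needs; the exact structure
# Stack AI stores internally may differ.
add_pet_tool = {
    "name": "addPet",                                        # unique identifier referenced in prompts
    "description": "Add a new pet to the store database.",   # tells the LLM when to use the tool
    "path": "/api/v1/pets",                                  # API endpoint path the tool calls
    "method": "POST",                                        # HTTP method
}

# The LLM references the tool by its name:
print(add_pet_tool["name"])  # addPet
```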
How Tools Are Called
When you define API endpoints in your custom tools, each endpoint becomes a distinct tool that the LLM can utilize. The LLM intelligently determines when and how to call these tools based on the context of the conversation and user inputs.
Here’s how it works:
- The LLM analyzes the user’s request or query to understand what action needs to be taken
- It identifies which tool (API endpoint) is most appropriate for fulfilling that request
- It automatically constructs the API request by filling in:
  - Query parameters
  - Body parameters
  - Path parameters
  - Headers
  - Any other required request data
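The request-construction step above can be sketched in plain Python. Everything here is an assumption for illustration: the `build_request` helper, the `updatePet` tool, and the parameter values are hypothetical, not part of Stack AI’s actual implementation:

```python
def build_request(base_url, tool, path_params=None, query_params=None,
                  body=None, headers=None):
    """Assemble the pieces the LLM fills in into a concrete HTTP request."""
    path = tool["path"]
    # Substitute path parameters, e.g. /pets/{petId} -> /pets/42
    for key, value in (path_params or {}).items():
        path = path.replace("{" + key + "}", str(value))
    return {
        "method": tool["method"],
        "url": base_url + path,
        "params": query_params or {},   # query string parameters
        "json": body or {},             # request body (POST/PUT)
        "headers": headers or {},       # per-request headers
    }

# Hypothetical tool: update an existing pet by id.
update_pet = {"name": "updatePet", "path": "/api/v1/pets/{petId}", "method": "PUT"}

request = build_request(
    "https://api.example.com",
    update_pet,
    path_params={"petId": 42},
    body={"status": "sold"},
    headers={"Authorization": "Bearer <token>"},
)
print(request["url"])  # https://api.example.com/api/v1/pets/42
```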
Create a Custom Tool
Custom tools are defined through API services, allowing you to integrate external functionality into your LLM. When you create a custom tool, you’ll describe your API endpoints and their capabilities.
Each API endpoint becomes a distinct tool that represents a specific action or operation in your system. The LLM will automatically understand how to use these endpoints and fill in the required parameters (like body and query parameters) based on the context and user input.
For example, if you have an e-commerce API:
- The `/products` endpoint becomes a tool for retrieving product information, where the LLM can fill in search parameters
- The `/orders/create` endpoint becomes a tool for placing new orders, with the LLM providing order details in the request body
- The `/inventory/update` endpoint becomes a tool for managing stock levels, where the LLM determines the updated quantities
This approach lets you transform your existing APIs into reusable tools that can be easily incorporated into any LLM, making your external services and systems accessible to AI agents. The LLM handles the complexity of constructing proper API requests by intelligently filling parameters based on the conversation context.
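The e-commerce example might look like the following as a set of tool definitions. The endpoint paths come from the text above; the tool names, descriptions, and dictionary structure are illustrative assumptions:

```python
# Illustrative tool definitions derived from the e-commerce endpoints above.
ecommerce_tools = [
    {
        "name": "searchProducts",
        "description": "Retrieve product information; the LLM fills in search parameters.",
        "path": "/products",
        "method": "GET",
    },
    {
        "name": "createOrder",
        "description": "Place a new order; the LLM provides order details in the request body.",
        "path": "/orders/create",
        "method": "POST",
    },
    {
        "name": "updateInventory",
        "description": "Manage stock levels; the LLM determines the updated quantities.",
        "path": "/inventory/update",
        "method": "POST",
    },
]

# A simple lookup by name, the way the LLM references a tool:
tools_by_name = {tool["name"]: tool for tool in ecommerce_tools}
print(tools_by_name["createOrder"]["path"])  # /orders/create
```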
To create a custom tool:
- Navigate to an LLM Node that supports Tools (like GPT-4 or Claude)
- Click the “Tools” button in the Tools section
- Select the “Custom tools” tab where your custom tools will appear
- Click the “Add Custom Tool” button
This will open the custom tool creation interface where you can define your tool’s functionality.
Adding Tool Information
To create a custom tool, you need to include:
- **Tool Provider Name**: Give your tool provider a descriptive name that represents the service or system
- **OpenAPI Schema**: Provide the OpenAPI specification that defines your API endpoints. The schema must include:
  - Server URLs for the API endpoints
  - Complete endpoint definitions with:
    - **Important!** Clear descriptions explaining what each endpoint does and its purpose, to help the LLM understand how to use them correctly
    - HTTP methods (GET, POST, PUT, etc.)
    - Path parameters
    - Query parameters for GET requests
    - Detailed request body schemas for POST/PUT requests
    - Response schemas
    - Required headers specific to endpoints
- **Common Headers (Optional)**: Define headers that should be applied across all endpoints, such as:
  - Authentication headers (e.g., API keys)
  - Custom headers required by your API
Each API endpoint defined in your OpenAPI schema will be automatically transformed into an individual tool that you can use in your LLMs. Taking time to properly configure these settings will make your tools more user-friendly and reliable.
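A minimal OpenAPI schema containing the elements listed above might look like this. It is expressed here as a Python dictionary for brevity (you would normally paste JSON or YAML into the schema field), and the pet-store endpoint is the hypothetical example used throughout this page:

```python
# Minimal, hypothetical OpenAPI 3.0 schema: server URL, one endpoint with a
# clear description, method, request body schema, and response schema.
openapi_schema = {
    "openapi": "3.0.0",
    "info": {"title": "Pet Store API", "version": "1.0.0"},
    "servers": [{"url": "https://api.example.com"}],  # server URL is required
    "paths": {
        "/api/v1/pets": {
            "post": {
                "operationId": "addPet",
                # A clear description helps the LLM pick the right tool.
                "description": "Add a new pet to the store database.",
                "requestBody": {
                    "required": True,
                    "content": {
                        "application/json": {
                            "schema": {
                                "type": "object",
                                "required": ["name", "category", "status"],
                                "properties": {
                                    "name": {"type": "string"},
                                    "category": {"type": "string"},
                                    "status": {"type": "string"},
                                },
                            }
                        }
                    },
                },
                "responses": {"200": {"description": "Pet created successfully."}},
            }
        }
    },
}

print(openapi_schema["paths"]["/api/v1/pets"]["post"]["operationId"])  # addPet
```

Each operation in `paths` would surface as its own tool; the `description` fields do most of the work of telling the LLM when to call it.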
Your custom tool will now appear in the tools panel and can be dragged into any LLM.
Add your custom tool to the LLM Node
Once you’ve created your custom tool, you can add it to any LLM node that supports tools:
- Click on the “Tools” button in the LLM node settings
- Navigate to the “Custom tools” tab
- Select your custom tool from the list
The LLM will now be able to use your custom tool’s functionality when processing requests. You can add multiple custom tools to the same LLM node to combine different capabilities.
Custom tools help you build more maintainable and scalable flows by promoting code reuse and modular design.
Optimizing Tool Usage with Prompting
When using custom tools with an LLM node, it’s important to provide clear prompting to help the LLM understand how and when to use your tools effectively:
- **Describe the Tool’s Purpose**: Include a clear description of what the tool does and when it should be used in your system prompt. For example: “Use the addPet tool to add a new pet to the store database.”
- **Provide Usage Examples**: Give examples of proper tool usage in your prompts to demonstrate the expected input/output patterns. For example: `addPet(name='Max', category='dog', status='available')`
- **Set Clear Instructions**: Specify any requirements or constraints for using the tool in your prompts. For example: “When using addPet, ensure all required fields (name, category, status) are provided.”
- **Handle Errors**: Include guidance on how to handle potential errors or edge cases when using the tool. For example: “If addPet returns an error, verify the input data and try again with corrected values.”
Example system prompt:
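One way to combine the four practices above into a single system prompt (using the hypothetical `addPet` tool from this page; adapt the wording to your own tools):

```
You are a pet store assistant with access to the addPet tool.

Use the addPet tool to add a new pet to the store database.
When using addPet, ensure all required fields (name, category, status) are provided.
Example usage: addPet(name='Max', category='dog', status='available')
If addPet returns an error, verify the input data and try again with corrected values.
```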