# Tools

Tools extend agents with callable functions that LLMs can trigger during generation. ActiveAgent provides a unified interface across providers while highlighting provider-specific capabilities.

## Quick Start

Define a method in your agent and register it as a tool:

```ruby
class WeatherAgent < ApplicationAgent
  generate_with :openai, model: "gpt-4o"

  def weather_update
    prompt(
      input: "What's the weather in Boston?",
      tools: [ {
        type: "function",
        name: "get_weather",
        description: "Get current weather for a location",
        parameters: {
          type: "object",
          properties: {
            location: { type: "string", description: "City and state" }
          },
          required: [ "location" ]
        }
      } ]
    )
  end

  def get_weather(location:)
    { location: location, temperature: "72°F", conditions: "sunny" }
  end
end
```
```ruby
response = WeatherAgent.weather_update.generate_now
```

The LLM calls `get_weather` automatically when it needs weather data and uses the result to generate its response.

## Provider Support Matrix

| Provider | Functions | Server-Side Tools | MCP Support | Notes |
| --- | --- | --- | --- | --- |
| OpenAI | 🟩 | 🟩 | 🟩 | Server-side tools and MCP require Responses API |
| Anthropic | 🟩 | 🟩 | 🟨 | MCP in beta |
| OpenRouter | 🟩 | — | 🟦 | MCP via converted tool definitions; model-dependent capabilities |
| Ollama | 🟩 | — | — | Model-dependent capabilities |
| Mock | 🟦 | — | — | Accepted but not enforced |

## Functions (Universal Support)

Functions are the core tool capability supported by all providers. Define methods in your agent that the LLM can call with appropriate parameters.

### Basic Function Registration

Register functions by passing tool definitions to the `tools` parameter. Note that the definition format varies by provider: Anthropic uses a top-level `input_schema`, OpenAI's Responses API uses a flat `name`/`parameters` shape, and Ollama and OpenRouter nest the definition under a `function` key:

**Anthropic**

```ruby
class WeatherAgent < ApplicationAgent
  generate_with :anthropic, model: "claude-sonnet-4-20250514"

  def weather_update
    prompt(
      message: "What's the weather in San Francisco?",
      max_tokens: 1024,
      tools: [ {
        name: "get_weather",
        description: "Get the current weather in a given location",
        input_schema: {
          type: "object",
          properties: {
            location: {
              type: "string",
              description: "The city and state, e.g. San Francisco, CA"
            }
          },
          required: [ "location" ]
        }
      } ]
    )
  end

  def get_weather(location:)
    { location: location, temperature: "72°F", conditions: "sunny" }
  end
end
```
**Ollama**

```ruby
class WeatherAgent < ApplicationAgent
  generate_with :ollama, model: "qwen3:latest"

  def weather_update
    prompt(
      message: "What's the weather in Boston?",
      tools: [ {
        type: "function",
        function: {
          name: "get_current_weather",
          description: "Get the current weather in a given location",
          parameters: {
            type: "object",
            properties: {
              location: {
                type: "string",
                description: "The city and state, e.g. San Francisco, CA"
              },
              unit: {
                type: "string",
                enum: [ "celsius", "fahrenheit" ]
              }
            },
            required: [ "location" ]
          }
        }
      } ]
    )
  end

  def get_current_weather(location:, unit: "fahrenheit")
    { location: location, unit: unit, temperature: "22" }
  end
end
```
**OpenAI**

```ruby
class WeatherAgent < ApplicationAgent
  generate_with :openai, model: "gpt-4o"

  def weather_update
    prompt(
      input: "What's the weather in Boston?",
      tools: [ {
        type: "function",
        name: "get_current_weather",
        description: "Get the current weather in a given location",
        parameters: {
          type: "object",
          properties: {
            location: {
              type: "string",
              description: "The city and state, e.g. San Francisco, CA"
            },
            unit: {
              type: "string",
              enum: [ "celsius", "fahrenheit" ]
            }
          },
          required: [ "location" ]
        }
      } ]
    )
  end

  def get_current_weather(location:, unit: "fahrenheit")
    { location: location, unit: unit, temperature: "22", conditions: "sunny" }
  end
end
```
**OpenRouter**

```ruby
class WeatherAgent < ApplicationAgent
  generate_with :openrouter, model: "google/gemini-2.0-flash-001"

  def weather_update
    prompt(
      message: "What's the weather in Boston?",
      tools: [ {
        type: "function",
        function: {
          name: "get_current_weather",
          description: "Get the current weather in a given location",
          parameters: {
            type: "object",
            properties: {
              location: {
                type: "string",
                description: "The city and state, e.g. San Francisco, CA"
              },
              unit: {
                type: "string",
                enum: [ "celsius", "fahrenheit" ]
              }
            },
            required: [ "location" ]
          }
        }
      } ]
    )
  end

  def get_current_weather(location:, unit: "fahrenheit")
    { location: location, unit: unit, temperature: "22" }
  end
end
```

When the LLM decides to call a tool, ActiveAgent routes the call to your agent method and returns the result automatically.

### Tool Choice Control

Control which tools the LLM can use:

```ruby
# Let the model decide (default)
prompt(message: "...", tools: tools, tool_choice: "auto")

# Force the model to use a tool
prompt(message: "...", tools: tools, tool_choice: "required")

# Prevent tool usage
prompt(message: "...", tools: tools, tool_choice: "none")

# Force a specific tool (provider-dependent)
prompt(message: "...", tools: tools, tool_choice: { type: "function", name: "get_weather" })
```
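The shape of a forced `tool_choice` differs across providers. A sketch of the common hash layouts, using the `get_weather` function from earlier (verify the exact shapes against each provider's current API documentation):

```ruby
# OpenAI's Responses API uses a flat function reference.
openai_choice = { type: "function", name: "get_weather" }

# Anthropic's Messages API uses type "tool" with the tool's name.
anthropic_choice = { type: "tool", name: "get_weather" }

# Chat Completions-style providers (e.g. OpenRouter) nest the name
# under a :function key, mirroring their tool definition format.
openrouter_choice = { type: "function", function: { name: "get_weather" } }
```

Whichever shape your provider expects is passed as the `tool_choice:` value in `prompt`.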

## Server-Side Tools (Provider-Specific)

Some providers offer built-in tools that run on their servers, providing capabilities like web search and code execution without custom implementation.

### OpenAI Built-in Tools

OpenAI's Responses API provides several built-in tools (requires a Responses-capable model such as GPT-5, GPT-4.1, or o3), including Web Search for current information, File Search for querying vector stores, and others such as image generation, code interpreter, and computer use. For complete details and examples, see OpenAI's tools documentation and the OpenAI Provider documentation.
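Built-in tools are registered as entries in the same `tools:` array as your functions. A sketch of two such entries (the tool type strings reflect OpenAI's current naming, and the vector store ID is a placeholder; check OpenAI's docs for the identifiers your model supports):

```ruby
# Server-side web search: no parameters schema needed, since the
# tool runs on OpenAI's infrastructure.
web_search_tool = { type: "web_search" }

# File search queries vector stores you have already created;
# "vs_example_123" is a placeholder ID.
file_search_tool = {
  type: "file_search",
  vector_store_ids: [ "vs_example_123" ]
}
```

These hashes can be mixed freely with function definitions in the same `tools:` array.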

### Anthropic Built-in Tools

Anthropic provides web access and specialized capabilities including Web Search for real-time information, Web Fetch (Beta) for specific URLs, Extended Thinking to show reasoning processes, and Computer Use (Beta) for interface interaction. For complete details and examples, see Anthropic's tool use documentation.
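Anthropic's server-side tools use versioned type strings. A sketch of a web search entry (the version suffix below is taken from Anthropic's published docs; verify the current version before use):

```ruby
# Anthropic server-side web search tool; max_uses optionally caps
# the number of searches per request.
web_search_tool = {
  type: "web_search_20250305",
  name: "web_search",
  max_uses: 5
}
```

As with function tools, this hash goes in the `tools:` array passed to `prompt`.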

## Model Context Protocol (MCP)

MCP (Model Context Protocol) enables agents to connect to external services and APIs. Think of it as a universal adapter for integrating tools and data sources.

### OpenAI MCP Integration

OpenAI supports MCP through their Responses API in two ways: pre-built connectors for popular services (Dropbox, Google Drive, GitHub, Slack, and more) and custom MCP servers. For complete details on OpenAI's MCP support, connector IDs, and configuration options, see OpenAI's MCP documentation.
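A custom MCP server is registered as a tool entry in the Responses API. A sketch, with a placeholder label and URL standing in for your own server (see OpenAI's MCP documentation for connector IDs and the full set of options):

```ruby
# MCP tool entry for the Responses API; server_label and server_url
# are placeholders for your own MCP server.
mcp_tool = {
  type: "mcp",
  server_label: "my_docs_server",
  server_url: "https://example.com/mcp",
  require_approval: "never"  # or "always" to gate each remote tool call
}
```

Setting `require_approval` to `"always"` makes the model pause for confirmation before invoking remote tools, which is safer for servers that perform writes.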

### Anthropic MCP Integration

Anthropic supports MCP servers via the `mcp_servers` parameter (beta feature). You can connect up to 20 MCP servers per request. For the latest on Anthropic's MCP implementation and configuration, see Anthropic's MCP documentation.
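Unlike OpenAI, Anthropic takes MCP servers as a separate request parameter rather than as tool entries. A sketch of the beta `mcp_servers` value (the URL and name are placeholders):

```ruby
# Beta mcp_servers parameter: an array of up to 20 server entries.
mcp_servers = [
  {
    type: "url",
    url: "https://example.com/mcp",
    name: "my-docs-server"
  }
]
```

This array would be passed alongside `message:` and `tools:` in the `prompt` call.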

### OpenRouter MCP Integration

**Coming soon:** MCP support for OpenRouter is currently under development and will be available in a future release.

## Troubleshooting

### Tool Not Being Called

If the LLM doesn't call your function when expected, improve the tool description or set `tool_choice: "required"` to force tool usage.

### Invalid Parameters

If the LLM passes unexpected parameters, add detailed parameter descriptions, use `enum` to restrict choices, and mark `required` parameters explicitly.
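A tightly specified schema leaves the model little room to improvise. A sketch of the weather schema from earlier with these constraints applied:

```ruby
# Descriptions guide the model, enum restricts the value set, and
# required marks fields the model must always supply.
parameters = {
  type: "object",
  properties: {
    location: {
      type: "string",
      description: "The city and state, e.g. Boston, MA"
    },
    unit: {
      type: "string",
      enum: [ "celsius", "fahrenheit" ],
      description: "Temperature unit to report"
    }
  },
  required: [ "location" ]
}
```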