
Tools

Tools extend agents with callable functions that LLMs can trigger during generation. ActiveAgent provides a unified interface across providers while highlighting provider-specific capabilities.

Quick Start

Define a method in your agent and register it as a tool:

```ruby
class WeatherAgent < ApplicationAgent
  generate_with :openai, model: "gpt-4o"

  def weather_update
    prompt(
      input: "What's the weather in Boston?",
      tools: [ {
        name: "get_weather",
        description: "Get current weather for a location",
        parameters: {
          type: "object",
          properties: {
            location: { type: "string", description: "City and state" }
          },
          required: [ "location" ]
        }
      } ]
    )
  end

  def get_weather(location:)
    { location: location, temperature: "72°F", conditions: "sunny" }
  end
end
```

```ruby
response = WeatherAgent.weather_update.generate_now
```

The LLM calls get_weather automatically when it needs weather data, and uses the result to generate its response.

Provider Support Matrix

| Provider   | Functions | Server-side Tools | Notes                                  |
|------------|-----------|-------------------|----------------------------------------|
| OpenAI     | 🟩        | 🟩                | Server-side tools require Responses API |
| Anthropic  | 🟩        | 🟩                | Full support for built-in tools         |
| OpenRouter | 🟩        |                   | Model-dependent capabilities            |
| Ollama     | 🟩        |                   | Model-dependent capabilities            |
| Mock       | 🟦        |                   | Accepted but not enforced               |

For MCP (Model Context Protocol) support, see the MCP documentation.

Functions

Functions are callable methods in your agent that LLMs can trigger with appropriate parameters. All providers support the common format described above.

Basic Function Registration

Using the common format, register functions by passing tool definitions to the tools parameter:

```ruby
class WeatherAgent < ApplicationAgent
  generate_with :anthropic, model: "claude-sonnet-4-20250514"

  def weather_update
    prompt(
      message: "What's the weather in San Francisco?",
      max_tokens: 1024,
      tools: [ {
        name: "get_weather",
        description: "Get the current weather in a given location",
        parameters: {
          type: "object",
          properties: {
            location: {
              type: "string",
              description: "The city and state, e.g. San Francisco, CA"
            }
          },
          required: [ "location" ]
        }
      } ]
    )
  end

  def get_weather(location:)
    { location: location, temperature: "72°F", conditions: "sunny" }
  end
end
```

```ruby
class WeatherAgent < ApplicationAgent
  generate_with :ollama, model: "qwen3:latest"

  def weather_update
    prompt(
      message: "What's the weather in Boston?",
      tools: [ {
        name: "get_current_weather",
        description: "Get the current weather in a given location",
        parameters: {
          type: "object",
          properties: {
            location: {
              type: "string",
              description: "The city and state, e.g. San Francisco, CA"
            },
            unit: {
              type: "string",
              enum: [ "celsius", "fahrenheit" ]
            }
          },
          required: [ "location" ]
        }
      } ]
    )
  end

  def get_current_weather(location:, unit: "fahrenheit")
    { location: location, unit: unit, temperature: "22" }
  end
end
```

```ruby
class WeatherAgent < ApplicationAgent
  generate_with :openai, model: "gpt-4o"

  def weather_update
    prompt(
      input: "What's the weather in Boston?",
      tools: [ {
        name: "get_current_weather",
        description: "Get the current weather in a given location",
        parameters: {
          type: "object",
          properties: {
            location: {
              type: "string",
              description: "The city and state, e.g. San Francisco, CA"
            },
            unit: {
              type: "string",
              enum: [ "celsius", "fahrenheit" ]
            }
          },
          required: [ "location" ]
        }
      } ]
    )
  end

  def get_current_weather(location:, unit: "fahrenheit")
    { location: location, unit: unit, temperature: "22", conditions: "sunny" }
  end
end
```

```ruby
class WeatherAgent < ApplicationAgent
  generate_with :openrouter, model: "google/gemini-2.0-flash-001"

  def weather_update
    prompt(
      message: "What's the weather in Boston?",
      tools: [ {
        name: "get_current_weather",
        description: "Get the current weather in a given location",
        parameters: {
          type: "object",
          properties: {
            location: {
              type: "string",
              description: "The city and state, e.g. San Francisco, CA"
            },
            unit: {
              type: "string",
              enum: [ "celsius", "fahrenheit" ]
            }
          },
          required: [ "location" ]
        }
      } ]
    )
  end

  def get_current_weather(location:, unit: "fahrenheit")
    { location: location, unit: unit, temperature: "22" }
  end
end
```

When the LLM decides to call a tool, ActiveAgent routes the call to your agent method and returns the result automatically.
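This routing step can be pictured as plain Ruby dispatch: the tool call's name selects an agent method, and its JSON arguments become keyword arguments. The sketch below is illustrative only, not ActiveAgent's actual dispatcher, which adds validation and safety checks:

```ruby
require "json"

# Simplified sketch of tool-call routing: the provider returns a tool
# name plus JSON-encoded arguments, and the matching agent method is
# invoked with keyword arguments.
class WeatherAgent
  def get_weather(location:)
    { location: location, temperature: "72°F", conditions: "sunny" }
  end
end

def route_tool_call(agent, name, json_args)
  args = JSON.parse(json_args, symbolize_names: true)
  agent.public_send(name, **args)
end

result = route_tool_call(WeatherAgent.new, "get_weather", '{"location":"Boston, MA"}')
# result[:location] => "Boston, MA"
```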

ActiveAgent supports a universal common format for tool definitions that works seamlessly across all providers. This format eliminates the need to learn provider-specific syntax and makes your code portable.

Format Specification

```ruby
{
  name: "function_name",              # Required: function name to call
  description: "What it does",        # Required: clear description for LLM
  parameters: {                       # Required: JSON Schema for parameters
    type: "object",
    properties: {
      param_name: {
        type: "string",
        description: "Parameter description"
      }
    },
    required: ["param_name"]
  }
}
```
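Because the format is plain Ruby hashes, a small sanity check can catch missing keys before a prompt is ever sent. This validator is a sketch for illustration, not something ActiveAgent provides:

```ruby
# Minimal sanity check for a common-format tool definition.
# Illustrative only -- not part of ActiveAgent's API.
def valid_tool?(tool)
  return false unless tool[:name] && tool[:description]
  schema = tool[:parameters] || tool[:input_schema]
  return false unless schema.is_a?(Hash) && schema[:type] == "object"
  # Every required parameter must be declared under :properties
  required = schema[:required] || []
  required.all? { |key| schema.fetch(:properties, {}).key?(key.to_sym) }
end

weather_tool = {
  name: "get_weather",
  description: "Get current weather for a location",
  parameters: {
    type: "object",
    properties: { location: { type: "string" } },
    required: ["location"]
  }
}

valid_tool?(weather_tool) # => true
```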

Cross-Provider Example

The same tool definition works everywhere:

```ruby
# Define once, use with any provider
module WeatherTool
  extend ActiveSupport::Concern

  WEATHER_TOOL = {
    name: "get_weather",
    description: "Get current weather for a location",
    parameters: {
      type: "object",
      properties: {
        location: { type: "string", description: "City and state" },
        unit: { type: "string", enum: [ "celsius", "fahrenheit" ] }
      },
      required: [ "location" ]
    }
  }

  # Method name must match the registered tool name so the call can be routed
  def get_weather(location:, unit: "fahrenheit")
    { location: location, unit: unit, temperature: "22" }
  end
end
```
```ruby
class AnthropicAgent < ApplicationAgent
  include WeatherTool
  generate_with :anthropic, model: "claude-sonnet-4-20250514"

  def check_weather
    prompt(message: "What's the weather?", tools: [ WEATHER_TOOL ])
  end
end
```

```ruby
class OllamaAgent < ApplicationAgent
  include WeatherTool
  generate_with :ollama, model: "qwen3:latest"

  def check_weather
    prompt(message: "What's the weather?", tools: [ WEATHER_TOOL ])
  end
end
```

```ruby
class OpenAIAgent < ApplicationAgent
  include WeatherTool
  generate_with :openai, model: "gpt-4o"

  def check_weather
    prompt(input: "What's the weather?", tools: [ WEATHER_TOOL ])
  end
end
```

```ruby
class OpenRouterAgent < ApplicationAgent
  include WeatherTool
  generate_with :openrouter, model: "google/gemini-2.0-flash-001"

  def check_weather
    prompt(message: "What's the weather?", tools: [ WEATHER_TOOL ])
  end
end
```

Alternative: input_schema Key

You can also use input_schema instead of parameters; the two keys work identically:

```ruby
{
  name: "get_weather",
  description: "Get current weather",
  input_schema: {  # Alternative to 'parameters'
    type: "object",
    properties: { ... }
  }
}
```

ActiveAgent automatically converts between common format and each provider's native format behind the scenes.
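To make the conversion concrete, the sketch below shows roughly what a common-format definition looks like in each provider's native shape (Chat Completions-style for OpenAI). This is illustrative, not ActiveAgent's internal code; the real adapters handle more cases:

```ruby
# Illustrative conversion of a common-format tool into two native shapes.
def to_openai_tool(tool)
  {
    type: "function",
    function: {
      name: tool[:name],
      description: tool[:description],
      parameters: tool[:parameters] || tool[:input_schema]
    }
  }
end

def to_anthropic_tool(tool)
  {
    name: tool[:name],
    description: tool[:description],
    input_schema: tool[:parameters] || tool[:input_schema]
  }
end

common = { name: "get_weather", description: "Get current weather",
           parameters: { type: "object", properties: {} } }
to_openai_tool(common)[:type]            # => "function"
to_anthropic_tool(common)[:input_schema] # => { type: "object", properties: {} }
```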

Tool Choice Control

Control when and which tools the LLM uses with the tool_choice parameter:

```ruby
# Auto (default) - Let the model decide whether to use tools
prompt(message: "...", tools: tools, tool_choice: "auto")

# Required - Force the model to use at least one tool
prompt(message: "...", tools: tools, tool_choice: "required")

# None - Prevent tool usage entirely
prompt(message: "...", tools: tools, tool_choice: "none")

# Specific tool - Force a particular tool (common format)
prompt(message: "...", tools: tools, tool_choice: { name: "get_weather" })
```

ActiveAgent automatically maps these common values to provider-specific formats:

  • OpenAI: "auto", "required", "none", or {type: "function", function: {name: "..."}}
  • Anthropic: {type: :auto}, {type: :any}, {type: :tool, name: "..."}
  • OpenRouter: "auto", "any" (equivalent to "required")
  • Ollama: Model-dependent tool choice support
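The mapping above can be sketched as a plain case analysis. This is an illustration of the table, not ActiveAgent's adapter code, and it covers only the values listed:

```ruby
# Illustrative mapping of common tool_choice values to provider formats.
def map_tool_choice(provider, choice)
  if choice.is_a?(Hash) # { name: "get_weather" } forces a specific tool
    case provider
    when :openai    then { type: "function", function: { name: choice[:name] } }
    when :anthropic then { type: :tool, name: choice[:name] }
    end
  else
    case provider
    when :openai     then choice # "auto" / "required" / "none" pass through
    when :anthropic  then { type: choice == "required" ? :any : choice.to_sym }
    when :openrouter then choice == "required" ? "any" : choice
    end
  end
end
```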

Server-Side Tools (Provider-Specific)

Some providers offer built-in tools that run on their servers, providing capabilities like web search and code execution without custom implementation.

OpenAI Built-in Tools

OpenAI's Responses API provides several built-in tools (requires GPT-5, GPT-4.1, o3, etc.) including Web Search for current information, File Search for querying vector stores, and other tools like image generation, code interpreter, and computer use. For complete details and examples, see OpenAI's tools documentation and the OpenAI Provider documentation.

Anthropic Built-in Tools

Anthropic provides web access and specialized capabilities including Web Search for real-time information, Web Fetch (Beta) for specific URLs, Extended Thinking to show reasoning processes, and Computer Use (Beta) for interface interaction. For complete details and examples, see Anthropic's tool use documentation.

Troubleshooting

Tool Not Being Called

If the LLM doesn't call your function when expected, improve the tool description or use tool_choice: "required" to force tool usage.

Invalid Parameters

If the LLM passes unexpected parameters, add detailed parameter descriptions with enum for restricted choices and mark required parameters explicitly.
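For example, a schema along these lines (an illustrative definition, reusing names from the examples above) constrains what the model can pass by pairing each property with a description and restricting unit to an enum:

```ruby
# A tightly specified parameter schema: per-property descriptions plus
# an enum leave the model little room for bad arguments.
STRICT_WEATHER_TOOL = {
  name: "get_current_weather",
  description: "Get the current weather in a given location",
  parameters: {
    type: "object",
    properties: {
      location: {
        type: "string",
        description: "City and state, e.g. 'Boston, MA' -- never a country alone"
      },
      unit: {
        type: "string",
        enum: [ "celsius", "fahrenheit" ],
        description: "Temperature unit; defaults to fahrenheit"
      }
    },
    required: [ "location" ]
  }
}
```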