Tools

This page adapts the original AI SDK documentation: Tools.

While large language models (LLMs) have incredible generation capabilities, they struggle with discrete tasks (e.g., mathematics) and interacting with the outside world (e.g., getting the weather).

Tools are actions that an LLM can invoke. The results of these actions can be reported back to the LLM to be considered in the next response.

For example, when you ask an LLM for the “weather in London”, and there is a weather tool available, it could call a tool with London as the argument. The tool would then fetch the weather data and return it to the LLM. The LLM can then use this information in its response.

A tool is an object that can be called by the model to perform a specific task. You can use tools with generateText and streamText by passing one or more tools to the tools parameter.

A tool consists of three main properties:

  • description: Optional description influencing when the tool is chosen.
  • inputSchema: typically a Codable type (the SDK derives the JSON Schema automatically). Fall back to FlexibleSchema<MyType>.jsonSchema(...) for custom validators or non-Codable shapes.
  • execute: Optional async function that receives the arguments and returns a result.

Note: Swift has no Zod equivalent (the schema library the TypeScript SDK uses). Most of the time you can lean on tool(inputSchema: YourCodable.self); when you need custom validators, reach for FlexibleSchema<YourType>.jsonSchema(...) (which wraps Schema.codable) or other helpers exposed via SwiftAISDK.

If the LLM decides to use a tool, it emits a tool call. Tools with execute run automatically; results come back as tool‑result parts and can be fed into the next step (multi‑step calls).

Schemas define tool parameters and validate tool calls. In Swift you usually just pass a Codable type to tool(inputSchema: ...), and the SDK derives the JSON Schema automatically. If you need custom validation you can still build one manually with FlexibleSchema<MyType>.jsonSchema(...) or Schema.codable(...).

Basic tool returning an immediate (.value) result:

import SwiftAISDK
import OpenAIProvider

struct WeatherQuery: Codable, Sendable {
    let city: String
    let unit: String
}

struct WeatherReport: Codable, Sendable {
    let city: String
    let temperature: Double
    let unit: String
}

let getWeather = tool(
    description: "Fetch current weather by city",
    inputSchema: WeatherQuery.self
) { query, _ in
    // ... fetch weather for `query.city` (omitted)
    WeatherReport(city: query.city, temperature: 21, unit: query.unit)
}

Use .tool when you need the erased representation for [String: Tool] dictionaries (or call .eraseToTool() if you prefer a method).
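For example, both forms produce the same type‑erased value (a minimal sketch reusing getWeather from above):

// Either spelling yields a Tool suitable for a [String: Tool] dictionary.
let toolsByProperty: [String: Tool] = ["getWeather": getWeather.tool]
let toolsByMethod: [String: Tool] = ["getWeather": getWeather.eraseToTool()]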

Dynamic tool (type .dynamic) via dynamicTool(...):

struct EchoPayload: Codable, Sendable { let message: String }

let echo = dynamicTool(
    description: "Echo input back",
    inputSchema: EchoPayload.self
) { payload, _ in
    payload
}

A tool can stream results via ToolExecutionResult.stream:

struct EmptyInput: Codable, Sendable {}
struct CounterStep: Codable, Sendable { let value: Int }

let countToThree = tool(
    description: "Stream numbers 1..3",
    inputSchema: EmptyInput.self
) { _, _ in
    // Stream each step as it is produced instead of returning a single value.
    .stream(AsyncThrowingStream<CounterStep, Error> { continuation in
        Task {
            for i in 1...3 {
                continuation.yield(CounterStep(value: i))
                try? await Task.sleep(nanoseconds: 200_000_000) // 0.2 s between steps
            }
            continuation.finish()
        }
    })
}

Note: To normalize streaming outputs, use executeTool(...), which yields a .preliminary part for each streamed element and a single .final part.

Built‑in approval support lets a tool require approval before it executes; needsApproval can be made conditional on the input:

struct SensitivePayload: Codable, Sendable {
    let sensitive: Bool
    let message: String
}

let sensitiveTool = tool(
    description: "Access sensitive resource",
    inputSchema: SensitivePayload.self,
    needsApproval: .conditional { payload, _ in payload.sensitive }
) { payload, _ in
    payload
}
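If a tool should always require approval, one option is to make the predicate unconditionally return true (a minimal sketch reusing the .conditional case shown above; the SDK may offer a more direct always‑on option):

// Always require approval by having the predicate return true.
let alwaysAsk = tool(
    description: "Delete a record",
    inputSchema: SensitivePayload.self,
    needsApproval: .conditional { _, _ in true }
) { payload, _ in
    payload
}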

You can automatically pass tool results back to the model in multi‑step calls:

import SwiftAISDK
import OpenAIProvider

let result = try await generateText(
    model: openai("gpt-4o"),
    tools: ["getWeather": getWeather.tool],
    prompt: "What's the weather in London?"
)
print(result.text)
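Beyond the final text, you may also want to inspect which tools the model invoked. The sketch below assumes the result exposes toolCalls and toolResults collections mirroring the TypeScript SDK; the exact property names in the Swift port may differ:

// Hypothetical accessors (assumed to mirror the TS SDK) for inspecting tool activity.
for call in result.toolCalls {
    print("tool call:", call)
}
for toolResult in result.toolResults {
    print("tool result:", toolResult)
}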

Or with streaming:

let stream = try streamText(
    model: openai("gpt-4o"),
    tools: [
        "getWeather": getWeather.eraseToTool(),
        "countToThree": countToThree.tool
    ],
    messages: [.user(UserModelMessage(content: .text("Count and tell weather")))]
)
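To actually receive output you iterate the stream. The sketch below assumes a textStream accessor mirroring the TypeScript SDK; the Swift port's property name may differ:

// `textStream` is an assumption based on the TS SDK; adjust to the port's actual API.
for try await chunk in stream.textStream {
    print(chunk, terminator: "")
}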

The TypeScript ecosystem offers many third‑party toolkits, but they primarily target JS/TS. The Swift port already exposes several provider‑defined tools (e.g., Google/Groq/Anthropic web and code tools) inside their provider packages (Sources/*Provider/Tool/*).

Note: The third‑party toolkit list in the original article is TS‑specific and may not match Swift yet.

  • More on tool calling: Tools and Tool Calling (AI SDK Core)
  • Agents overview: Agents