Tools
This page adapts the original AI SDK documentation: Tools.
While large language models (LLMs) have incredible generation capabilities, they struggle with discrete tasks (e.g., mathematics) and interacting with the outside world (e.g., getting the weather).
Tools are actions that an LLM can invoke. The results of these actions can be reported back to the LLM to be considered in the next response.
For example, when you ask an LLM for the “weather in London”, and there is a weather tool available, it could call a tool with London as the argument. The tool would then fetch the weather data and return it to the LLM. The LLM can then use this information in its response.
What is a tool?
A tool is an object that can be called by the model to perform a specific task. You can use tools with `generateText` and `streamText` by passing one or more tools to the `tools` parameter.
A tool consists of three main properties:
- `description`: Optional description influencing when the tool is chosen.
- `inputSchema`: Typically a `Codable` type (the SDK derives the JSON Schema automatically). Fall back to `FlexibleSchema<MyType>.jsonSchema(...)` for custom validators or non-Codable shapes.
- `execute`: Optional async function that receives the arguments and returns a result.
Note: Swift has no Zod. Most of the time you can lean on `tool(inputSchema: YourCodable.self)`; when you need custom validators, reach for `FlexibleSchema<YourType>.jsonSchema(...)` (which wraps `Schema.codable`) or other helpers exposed via `SwiftAISDK`.
If the LLM decides to use a tool, it emits a tool call. Tools with `execute` run automatically; results come back as tool-result parts and can be fed into the next step (multi-step calls).
Schemas
Schemas define tool parameters and validate tool calls. In Swift you usually just pass a `Codable` type to `tool(inputSchema: ...)`, and the SDK derives the JSON Schema automatically. If you need custom validation you can still build one manually with `FlexibleSchema<MyType>.jsonSchema(...)` or `Schema.codable(...)`.
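To make the Codable-to-schema relationship concrete, here is a small self-contained sketch. `ForecastQuery` is an illustrative name, not part of the SDK; the point is that the tool-call arguments the model produces arrive as JSON matching the Codable shape:

```swift
import Foundation

// Illustrative input type; the SDK would derive a JSON Schema from it
// (roughly: an object with a required string "city" and integer "days").
struct ForecastQuery: Codable, Sendable {
    let city: String
    let days: Int
}

// A tool call's arguments arrive as JSON matching this shape:
let json = #"{"city":"London","days":3}"#.data(using: .utf8)!
let query = try! JSONDecoder().decode(ForecastQuery.self, from: json)
print(query.city) // London
```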
Creating a tool
A basic tool returning an immediate (`.value`) result:
```swift
import SwiftAISDK
import OpenAIProvider

struct WeatherQuery: Codable, Sendable {
    let city: String
    let unit: String
}

struct WeatherReport: Codable, Sendable {
    let city: String
    let temperature: Double
    let unit: String
}

let getWeather = tool(
    description: "Fetch current weather by city",
    inputSchema: WeatherQuery.self
) { query, _ in
    // ... fetch weather for `query.city` (omitted)
    WeatherReport(city: query.city, temperature: 21, unit: query.unit)
}
```

Use `.tool` when you need the erased representation for `[String: Tool]` dictionaries (or call `.eraseToTool()` if you prefer a method).
Dynamic tool
A dynamic tool (type `.dynamic`) is created via `dynamicTool(...)`:
```swift
struct EchoPayload: Codable, Sendable { let message: String }

let echo = dynamicTool(
    description: "Echo input back",
    inputSchema: EchoPayload.self
) { payload, _ in
    payload
}
```

Streaming tool output
A tool can stream results via `ToolExecutionResult.stream`:
```swift
struct EmptyInput: Codable, Sendable {}
struct CounterStep: Codable, Sendable { let value: Int }

let countToThree = tool(
    description: "Stream numbers 1..3",
    inputSchema: EmptyInput.self
) { _, _ in
    .stream(AsyncThrowingStream<CounterStep, Error> { continuation in
        Task {
            for i in 1...3 {
                continuation.yield(CounterStep(value: i))
                try? await Task.sleep(nanoseconds: 200_000_000)
            }
            continuation.finish()
        }
    })
}
```

Note: To normalize streaming outputs, use `executeTool(...)`, which yields `.preliminary` for each element and one `.final`.
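The preliminary/final shape that normalization produces can be illustrated in plain Swift. This is a sketch of the pattern only; `ToolOutput` and `normalize` are illustrative names, not the SDK's actual `executeTool(...)` implementation:

```swift
// Plain-Swift sketch of "preliminary elements, then one final" normalization.
enum ToolOutput<T: Equatable>: Equatable {
    case preliminary(T)
    case final(T)
}

// Wrap every element but the last as .preliminary; the last becomes .final.
func normalize<T: Equatable>(_ values: [T]) -> [ToolOutput<T>] {
    guard let last = values.last else { return [] }
    return values.dropLast().map { .preliminary($0) } + [.final(last)]
}

let events = normalize([1, 2, 3])
// events == [.preliminary(1), .preliminary(2), .final(3)]
```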
Approval (needsApproval)
Built-in approval support:
```swift
struct SensitivePayload: Codable, Sendable {
    let sensitive: Bool
    let message: String
}

let sensitiveTool = tool(
    description: "Access sensitive resource",
    inputSchema: SensitivePayload.self,
    needsApproval: .conditional { payload, _ in payload.sensitive }
) { payload, _ in
    payload
}
```

Using tools with generateText/streamText
You can automatically pass tool results back to the model in multi-step calls:
```swift
import SwiftAISDK
import OpenAIProvider

let result = try await generateText(
    model: openai("gpt-4o"),
    tools: ["getWeather": getWeather.tool],
    prompt: "What's the weather in London?"
)
print(result.text)
```

Or with streaming:

```swift
let stream = try streamText(
    model: openai("gpt-4o"),
    tools: [
        "getWeather": getWeather.eraseToTool(),
        "countToThree": countToThree.tool
    ],
    messages: [.user(UserModelMessage(content: .text("Count and tell weather")))]
)
```

Toolkits
In the TypeScript ecosystem there are many toolkits, but they primarily target JS/TS. The Swift port already exposes several provider-defined tools (e.g., Google/Groq/Anthropic web/code tools) inside their packages (`Sources/*Provider/Tool/*`).
Note: The third‑party toolkit list in the original article is TS‑specific and may not match Swift yet.
Learn more
- More on tool calling: Tools and Tool Calling (AI SDK Core)
- Agents overview: Agents