Anthropic
The Anthropic provider contains language model support for the Anthropic Messages API.
The Anthropic provider is available in the AnthropicProvider module. Add it to your Swift package:
```swift
dependencies: [
  .package(url: "https://github.com/teunlao/swift-ai-sdk", from: "1.0.0")
],
targets: [
  .target(
    name: "YourTarget",
    dependencies: [
      .product(name: "SwiftAISDK", package: "swift-ai-sdk"),
      .product(name: "AnthropicProvider", package: "swift-ai-sdk")
    ]
  )
]
```
Provider Instance
You can import the default provider instance anthropic from AnthropicProvider:
```swift
import AnthropicProvider

// Use the global anthropic instance
let model = anthropic("claude-3-haiku-20240307")
```
If you need a customized setup, you can use createAnthropicProvider to create a provider instance with your settings:
```swift
import AnthropicProvider

let customAnthropic = createAnthropicProvider(
  settings: AnthropicProviderSettings(
    baseURL: "https://custom.api.com/v1",
    apiKey: "your-api-key"
  )
)
```
You can use the following optional settings to customize the Anthropic provider instance:
- `baseURL` (string): Use a different URL prefix for API calls, e.g. to use proxy servers. The default prefix is `https://api.anthropic.com/v1`.
- `apiKey` (string): API key that is sent using the `x-api-key` header. It defaults to the `ANTHROPIC_API_KEY` environment variable.
- `headers` (Record<string,string>): Custom headers to include in the requests.
- `fetch`: Custom fetch implementation with the signature `(input: RequestInfo, init?: RequestInit) => Promise<Response>`. Defaults to the global `fetch` function. You can use it as a middleware to intercept requests, or to provide a custom fetch implementation, e.g. for testing.
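For example, a provider routed through a proxy with extra headers could be configured as in the following sketch. It assumes AnthropicProviderSettings also accepts the headers option described above; verify the initializer before relying on it.

```swift
import AnthropicProvider

// Sketch: route calls through a proxy and attach extra headers.
// The `headers` parameter is assumed from the option list above;
// check the AnthropicProviderSettings initializer for the exact signature.
let proxiedAnthropic = createAnthropicProvider(
  settings: AnthropicProviderSettings(
    baseURL: "https://my-proxy.example.com/anthropic/v1",
    apiKey: "your-api-key",
    headers: ["X-Request-Source": "my-swift-app"]
  )
)

let model = proxiedAnthropic("claude-3-haiku-20240307")
```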
Language Models
You can create models that call the Anthropic Messages API using the provider instance.
The first argument is the model id, e.g. claude-3-haiku-20240307.
Some models have multi-modal capabilities.
```swift
let model = anthropic("claude-3-haiku-20240307")
```
You can use Anthropic language models to generate text with the generateText function:
```swift
import SwiftAISDK
import AnthropicProvider

let result = try await generateText(
  model: anthropic("claude-3-haiku-20240307"),
  prompt: "Write a vegetarian lasagna recipe for 4 people."
)
let text = result.text
```
Anthropic language models can also be used in the streamText, generateObject, and streamObject functions (see AI SDK Core).
The following optional provider options are available for Anthropic models:
- `sendReasoning` (boolean, optional): Include reasoning content in requests sent to the model. Defaults to `true`. If you are experiencing issues with the model handling requests involving reasoning content, you can set this to `false` to omit it from the request (see the sketch after this list).
- `thinking` (object, optional): See the Reasoning section for more details.
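As a sketch, turning off sendReasoning uses the same providerOptions shape as the thinking example in the next section; the exact dictionary layout is assumed to mirror that example.

```swift
// Sketch: omit reasoning content from requests sent to the model.
// Assumes the providerOptions shape shown in the Reasoning example below.
let result = try await generateText(
  model: anthropic("claude-3-7-sonnet-20250219"),
  prompt: "Continue the analysis from the previous turn.",
  providerOptions: [
    "anthropic": ["sendReasoning": false]
  ]
)
```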
Reasoning
Anthropic has reasoning support for claude-opus-4-20250514, claude-sonnet-4-20250514, and claude-3-7-sonnet-20250219 models.
You can enable it using the thinking provider option
and specifying a thinking budget in tokens.
```swift
import SwiftAISDK
import AnthropicProvider

let result = try await generateText(
  model: anthropic("claude-opus-4-20250514"),
  prompt: "How many people will live in the world in 2040?",
  providerOptions: [
    "anthropic": [
      "thinking": ["type": "enabled", "budgetTokens": 12000]
    ]
  ]
)

print(result.reasoning) // reasoning text
print(result.reasoningDetails) // reasoning details including redacted reasoning
print(result.text) // text response
```
See AI SDK UI: Chatbot for more details on how to integrate reasoning into your chatbot.
Cache Control
In the messages and message parts, you can use the providerOptions property to set cache control breakpoints.
You need to set the anthropic entry of the providerOptions dictionary to `["cacheControl": ["type": "ephemeral"]]` to set a cache control breakpoint.
The cache creation input tokens are then returned in the providerMetadata object
for generateText and generateObject, again under the anthropic property.
When you use streamText or streamObject, the metadata is delivered asynchronously
once the response has finished. Alternatively you can receive it in the
onFinish callback.
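A sketch of reading it from a streaming call, assuming the stream result exposes an awaitable providerMetadata property analogous to the generateText result shown below:

```swift
// Sketch: reading cache metadata after a streaming call finishes.
// The awaitable `providerMetadata` property on the stream result is an
// assumed API shape; you can alternatively read it in `onFinish`.
let stream = try streamText(
  model: anthropic("claude-3-5-sonnet-20240620"),
  prompt: "Explain the cached error message."
)

// Consume the stream first.
for try await _ in stream.textStream {}

let metadata = try await stream.providerMetadata
print(metadata?.anthropic as Any)
```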
```swift
import SwiftAISDK
import AnthropicProvider

let errorMessage = "... long error message ..."

let result = try await generateText(
  model: anthropic("claude-3-5-sonnet-20240620"),
  messages: [
    [
      "role": "user",
      "content": [
        ["type": "text", "text": "You are a JavaScript expert."],
        [
          "type": "text",
          "text": "Error message: \(errorMessage)",
          "providerOptions": [
            "anthropic": ["cacheControl": ["type": "ephemeral"]]
          ]
        ],
        ["type": "text", "text": "Explain the error message."]
      ]
    ]
  ]
)

print(result.text)
print(result.providerMetadata?.anthropic as Any)
// e.g. { cacheCreationInputTokens: 2118 }
```
You can also use cache control on system messages by providing multiple system messages at the head of your messages array:
```swift
let result = try await generateText(
  model: anthropic("claude-3-5-sonnet-20240620"),
  messages: [
    [
      "role": "system",
      "content": "Cached system message part",
      "providerOptions": [
        "anthropic": ["cacheControl": ["type": "ephemeral"]]
      ]
    ],
    [
      "role": "system",
      "content": "Uncached system message part"
    ],
    [
      "role": "user",
      "content": "User prompt"
    ]
  ]
)
```
Cache control for tools:
```swift
struct CityQuery: Codable, Sendable { let city: String }

let result = try await generateText(
  model: try anthropic("claude-3-5-haiku-latest"),
  tools: [
    "cityAttractions": tool(
      inputSchema: CityQuery.self,
      providerOptions: [
        "anthropic": [
          "cacheControl": ["type": "ephemeral"]
        ]
      ]
    )
  ],
  messages: [
    [
      "role": "user",
      "content": "User prompt"
    ]
  ]
)
```
Longer cache TTL
Anthropic also supports a longer 1-hour cache duration.
Here’s an example:
```swift
let result = try await generateText(
  model: anthropic("claude-3-5-haiku-latest"),
  messages: [
    [
      "role": "user",
      "content": [
        [
          "type": "text",
          "text": "Long cached message",
          "providerOptions": [
            "anthropic": [
              "cacheControl": ["type": "ephemeral", "ttl": "1h"]
            ]
          ]
        ]
      ]
    ]
  ]
)
```
Limitations
The minimum cacheable prompt length is:
- 1024 tokens for Claude 3.7 Sonnet, Claude 3.5 Sonnet and Claude 3 Opus
- 2048 tokens for Claude 3.5 Haiku and Claude 3 Haiku
Shorter prompts cannot be cached, even if marked with cacheControl. Any requests to cache fewer than this number of tokens will be processed without caching.
For more on prompt caching with Anthropic, see Anthropic’s Cache Control documentation.
Bash Tool
The Bash Tool allows running bash commands. Here’s how to create and use it:
```swift
let bashTool = anthropic.tools.bash_20241022(
  execute: { command, restart in
    // Implement your bash command execution logic here
    // Return the result of the command execution
  }
)
```
Parameters:
- `command` (string): The bash command to run. Required unless the tool is being restarted.
- `restart` (boolean, optional): Specifying true will restart this tool.
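To let a model call it, pass the tool in the tools dictionary like the other examples on this page; the "bash" key below is an assumed tool name, not one confirmed by this page.

```swift
// Sketch: wiring the bash tool into generateText.
// The "bash" tool key is an assumption.
let result = try await generateText(
  model: anthropic("claude-3-5-sonnet-20240620"),
  prompt: "List the files in the current directory.",
  tools: [
    "bash": bashTool
  ]
)
print(result.text)
```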
Text Editor Tool
The Text Editor Tool provides functionality for viewing and editing text files.
```swift
let tools: [String: Tool] = [
  // tool name must be str_replace_based_edit_tool
  "str_replace_based_edit_tool": anthropic.tools.textEditor_20250728(
    maxCharacters: 10000, // optional
    execute: { command, path, old_str, new_str in
      // ...
    }
  )
]
```
Parameters:
- `command` ('view' | 'create' | 'str_replace' | 'insert' | 'undo_edit'): The command to run. Note: `undo_edit` is only available in Claude 3.5 Sonnet and earlier models.
- `path` (string): Absolute path to file or directory, e.g. `/repo/file.py` or `/repo`.
- `file_text` (string, optional): Required for the `create` command, with the content of the file to be created.
- `insert_line` (number, optional): Required for the `insert` command. The line number after which to insert the new string.
- `new_str` (string, optional): New string for the `str_replace` or `insert` commands.
- `old_str` (string, optional): Required for the `str_replace` command, containing the string to replace.
- `view_range` (number[], optional): Optional for the `view` command to specify the line range to show.
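Because the tool is already registered under its required name in the tools dictionary above, using it is just a matter of passing that dictionary to a call. A sketch (the prompt and model choice are illustrative):

```swift
// Sketch: passing the text editor tools dictionary to generateText.
let result = try await generateText(
  model: anthropic("claude-3-5-sonnet-20240620"),
  prompt: "View /repo/file.py and replace the deprecated call in it.",
  tools: tools
)
print(result.text)
```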
Computer Tool
The Computer Tool enables control of keyboard and mouse actions on a computer:
```swift
let computerTool = anthropic.tools.computer_20241022(
  displayWidthPx: 1920,
  displayHeightPx: 1080,
  displayNumber: 0, // Optional, for X11 environments
  execute: { action, coordinate, text in
    // Implement your computer control logic here
    // Return the result of the action

    // Example code:
    switch action {
    case "screenshot":
      // multipart result:
      return [
        "type": "image",
        "data": try Data(contentsOf: URL(fileURLWithPath: "./data/screenshot-editor.png")).base64EncodedString()
      ]
    default:
      print("Action:", action)
      print("Coordinate:", coordinate as Any)
      print("Text:", text as Any)
      return "executed \(action)"
    }
  },
  // map to tool result content for LLM consumption:
  toModelOutput: { result in
    if let stringResult = result as? String {
      return [["type": "text", "text": stringResult]]
    } else if let dictResult = result as? [String: Any], let data = dictResult["data"] {
      return [["type": "image", "data": data, "mediaType": "image/png"]]
    }
    return []
  }
)
```
Parameters:
- `action` ('key' | 'type' | 'mouse_move' | 'left_click' | 'left_click_drag' | 'right_click' | 'middle_click' | 'double_click' | 'screenshot' | 'cursor_position'): The action to perform.
- `coordinate` (number[], optional): Required for the `mouse_move` and `left_click_drag` actions. Specifies the (x, y) coordinates.
- `text` (string, optional): Required for the `type` and `key` actions.
These tools can be used in conjunction with the claude-3-5-sonnet-20240620 model to enable more complex interactions and tasks.
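A usage sketch follows; the "computer" tool key is an assumed name and the single-shot prompt is illustrative (real computer-use flows usually loop over multiple tool calls).

```swift
// Sketch: letting the model drive the computer tool.
// The "computer" tool key is an assumption.
let result = try await generateText(
  model: anthropic("claude-3-5-sonnet-20240620"),
  prompt: "Take a screenshot and describe what is currently on screen.",
  tools: [
    "computer": computerTool
  ]
)
print(result.text)
```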
Web Search Tool
Anthropic provides a provider-defined web search tool that gives Claude direct access to real-time web content, allowing it to answer questions with up-to-date information beyond its knowledge cutoff.
You can enable web search using the provider-defined web search tool:
```swift
import SwiftAISDK
import AnthropicProvider

let webSearchTool = anthropic.tools.webSearch_20250305(
  maxUses: 5
)

let result = try await generateText(
  model: anthropic("claude-opus-4-20250514"),
  prompt: "What are the latest developments in AI?",
  tools: [
    "web_search": webSearchTool
  ]
)
```
Configuration Options
The web search tool supports several configuration options:
- `maxUses` (number): Maximum number of web searches Claude can perform during the conversation.
- `allowedDomains` (string[]): Optional list of domains that Claude is allowed to search. If provided, searches will be restricted to these domains.
- `blockedDomains` (string[]): Optional list of domains that Claude should avoid when searching.
- `userLocation` (object): Optional user location information to provide geographically relevant search results.
```swift
let webSearchTool = anthropic.tools.webSearch_20250305(
  maxUses: 3,
  allowedDomains: ["techcrunch.com", "wired.com"],
  blockedDomains: ["example-spam-site.com"],
  userLocation: [
    "type": "approximate",
    "country": "US",
    "region": "California",
    "city": "San Francisco",
    "timezone": "America/Los_Angeles"
  ]
)

let result = try await generateText(
  model: anthropic("claude-opus-4-20250514"),
  prompt: "Find local news about technology",
  tools: [
    "web_search": webSearchTool
  ]
)
```
Web Fetch Tool
Anthropic provides a provider-defined web fetch tool that allows Claude to retrieve content from specific URLs. This is useful when you want Claude to analyze or reference content from a particular webpage or document.
You can enable web fetch using the provider-defined web fetch tool:
```swift
import SwiftAISDK
import AnthropicProvider

let result = try await generateText(
  model: anthropic("claude-sonnet-4-0"),
  prompt: "What is this page about? https://en.wikipedia.org/wiki/Maglemosian_culture",
  tools: [
    "web_fetch": anthropic.tools.webFetch_20250910(maxUses: 1)
  ]
)
```
Configuration Options
The web fetch tool supports several configuration options:
- `maxUses` (number): Limits the number of web fetches performed.
- `allowedDomains` (string[]): Only fetch from these domains.
- `blockedDomains` (string[]): Never fetch from these domains.
- `citations` (object): Unlike web search, where citations are always enabled, citations are optional for web fetch. Set `"citations": {"enabled": true}` to enable Claude to cite specific passages from fetched documents.
- `maxContentTokens` (number): Limits the amount of content that will be included in the context.
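As a configuration sketch, the options above could be combined as follows. The citations and maxContentTokens parameter names (and the citations dictionary shape) are assumed from the option list; verify them against the package before use.

```swift
// Sketch: web fetch limited to one domain, with citations enabled.
// Parameter names beyond maxUses are assumed from the option list above.
let webFetchTool = anthropic.tools.webFetch_20250910(
  maxUses: 2,
  allowedDomains: ["en.wikipedia.org"],
  citations: ["enabled": true],
  maxContentTokens: 4096
)

let result = try await generateText(
  model: anthropic("claude-sonnet-4-0"),
  prompt: "Summarize https://en.wikipedia.org/wiki/Maglemosian_culture and cite passages.",
  tools: [
    "web_fetch": webFetchTool
  ]
)
```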
Error Handling
Web search errors are handled differently depending on whether you’re using streaming or non-streaming:
Non-streaming (generateText, generateObject):
Web search errors throw exceptions that you can catch:
```swift
do {
  let result = try await generateText(
    model: anthropic("claude-opus-4-20250514"),
    prompt: "Search for something",
    tools: [
      "web_search": webSearchTool
    ]
  )
} catch {
  if error.localizedDescription.contains("Web search failed") {
    print("Search error:", error.localizedDescription)
    // Handle search error appropriately
  }
}
```
Streaming (streamText, streamObject):
Web search errors are delivered as error parts in the stream:
```swift
let result = try streamText(
  model: anthropic("claude-opus-4-20250514"),
  prompt: "Search for something",
  tools: [
    "web_search": webSearchTool
  ]
)

// fullStream is assumed here; textStream yields only text deltas.
for try await part in result.fullStream {
  if part.type == "error" {
    print("Search error:", part.error)
    // Handle search error appropriately
  }
}
```
Code Execution
Anthropic provides a provider-defined code execution tool that gives Claude direct access to a real Python environment, allowing it to execute code to inform its responses.
You can enable code execution using the provider-defined code execution tool:
```swift
import SwiftAISDK
import AnthropicProvider

let codeExecutionTool = anthropic.tools.codeExecution_20250522()

let result = try await generateText(
  model: anthropic("claude-opus-4-20250514"),
  prompt: "Calculate the mean and standard deviation of [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]",
  tools: [
    "code_execution": codeExecutionTool
  ]
)
```
Error Handling
Code execution errors are handled differently depending on whether you’re using streaming or non-streaming:
Non-streaming (generateText, generateObject):
Code execution errors are delivered as tool result parts in the response:
```swift
let result = try await generateText(
  model: anthropic("claude-opus-4-20250514"),
  prompt: "Execute some Python script",
  tools: [
    "code_execution": codeExecutionTool
  ]
)

let toolErrors = result.content?.filter { content in
  content.type == "tool-error"
}

toolErrors?.forEach { error in
  print("Tool execution error:", [
    "toolName": error.toolName,
    "toolCallId": error.toolCallId,
    "error": error.error
  ])
}
```
Streaming (streamText, streamObject):
Code execution errors are delivered as error parts in the stream:
```swift
let result = try streamText(
  model: anthropic("claude-opus-4-20250514"),
  prompt: "Execute some Python script",
  tools: [
    "code_execution": codeExecutionTool
  ]
)

// fullStream is assumed here; textStream yields only text deltas.
for try await part in result.fullStream {
  if part.type == "error" {
    print("Code execution error:", part.error)
    // Handle code execution error appropriately
  }
}
```
PDF support
Anthropic Sonnet claude-3-5-sonnet-20241022 supports reading PDF files.
You can pass PDF files as part of the message content using the file type:
Option 1: URL-based PDF document
```swift
let result = try await generateText(
  model: anthropic("claude-3-5-sonnet-20241022"),
  messages: [
    [
      "role": "user",
      "content": [
        [
          "type": "text",
          "text": "What is an embedding model according to this document?"
        ],
        [
          "type": "file",
          "data": URL(string: "https://github.com/vercel/ai/blob/main/examples/ai-core/data/ai.pdf?raw=true")!,
          "mediaType": "application/pdf"
        ]
      ]
    ]
  ]
)
```
Option 2: Base64-encoded PDF document
```swift
let result = try await generateText(
  model: anthropic("claude-3-5-sonnet-20241022"),
  messages: [
    [
      "role": "user",
      "content": [
        [
          "type": "text",
          "text": "What is an embedding model according to this document?"
        ],
        [
          "type": "file",
          "data": try Data(contentsOf: URL(fileURLWithPath: "./data/ai.pdf")),
          "mediaType": "application/pdf"
        ]
      ]
    ]
  ]
)
```
The model will have access to the contents of the PDF file and
respond to questions about it.
The PDF file should be passed using the data field,
and the mediaType should be set to 'application/pdf'.
Model Capabilities
| Model | Image Input | Object Generation | Tool Usage | Computer Use | Web Search |
|---|---|---|---|---|---|
| claude-sonnet-4-5 | | | | | |
| claude-opus-4-1 | | | | | |
| claude-opus-4-0 | | | | | |
| claude-sonnet-4-0 | | | | | |
| claude-3-7-sonnet-latest | | | | | |
| claude-3-5-haiku-latest | | | | | |