
Providers and Models

This page adapts the original AI SDK documentation: Providers and Models.

Companies such as OpenAI, Anthropic, Google, and Groq offer access to families of large language models (LLMs) through distinct APIs. Each API has its own authentication story, request/response schema, streaming protocol, and feature gates—making it hard to switch providers without rewriting application code.

The Swift AI SDK mirrors the AI SDK language-model specification (AISDKProvider). Every provider module implements the same typed surface area, so calls like generateText, generateObject, and the streaming helpers stay identical even when you swap vendors.
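
As a rough sketch of what that uniformity looks like in practice (the generateText parameter labels, the .text result property, and treating LanguageModelV3 as a protocol are assumptions here, not the SDK's confirmed signature):

```swift
import OpenAIProvider
import AnthropicProvider

// Provider-agnostic calling code: the function never mentions a vendor.
// `model:`, `prompt:`, and `.text` are illustrative; check generateText's
// actual signature in the Swift AI SDK.
func summarize(with model: any LanguageModelV3) async throws -> String {
    let result = try await generateText(
        model: model,
        prompt: "Summarize the plot of Hamlet in two sentences."
    )
    return result.text
}

// Swapping vendors is a one-line change at the call site:
// try await summarize(with: openai("gpt-4o"))
// try await summarize(with: anthropic("claude-3-5-sonnet"))
```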

AI SDK provider architecture diagram (illustration reused from the original AI SDK documentation).

The Swift port currently ships with the following provider packages. Each module exports a lazily configured global shortcut plus a factory for advanced configuration, matching the TypeScript behavior.

  • OpenAI Provider — import OpenAIProvider, then call openai("gpt-4o") or createOpenAIProvider(settings:).
  • Anthropic Provider — import AnthropicProvider, use anthropic("claude-3-5-sonnet") or createAnthropicProvider(settings:).
  • Google Generative AI Provider — import GoogleProvider, use google("gemini-1.5-pro") or createGoogleGenerativeAI(settings:).
  • Groq Provider — import GroqProvider, use groq("llama-3.1-8b-instant") or createGroqProvider(settings:). (Documentation coming soon)
  • OpenAI Compatible Provider — import OpenAICompatibleProvider, call createOpenAICompatibleProvider(settings:) for any OpenAI-compatible gateway. (Documentation coming soon)

Each shortcut performs lazy API-key loading (via environment variables such as OPENAI_API_KEY, ANTHROPIC_API_KEY, GOOGLE_API_KEY, GROQ_API_KEY) and returns a ready-to-use LanguageModelV3 instance for the higher-level Swift AI SDK APIs.
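
For example, a minimal sketch contrasting the environment-driven shortcut with the explicit factory; the settings field names (apiKey, headers) and calling the returned provider like a function are assumptions carried over from the TypeScript behavior, so verify them against Sources/OpenAIProvider:

```swift
import OpenAIProvider

// Shortcut: lazily reads OPENAI_API_KEY from the environment on first use.
let quickModel = openai("gpt-4o")

// Factory: explicit configuration when environment variables are not an
// option. The field names below and the callable-provider style are
// assumptions; check the settings type in Sources/OpenAIProvider.
let provider = createOpenAIProvider(
    settings: .init(
        apiKey: "sk-...",                 // e.g. loaded from the keychain
        headers: ["X-Team": "mobile"]
    )
)
let configuredModel = provider("gpt-4o-mini")
```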

If you are integrating with a custom gateway that implements the OpenAI API surface (for example LM Studio or Together.ai), use the OpenAI Compatible provider. Override the base URL, API key, and optional headers in OpenAICompatibleProviderSettings and the rest of your Swift code stays identical.
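
A minimal sketch of pointing the provider at a local LM Studio server, assuming OpenAICompatibleProviderSettings takes a name, base URL, and API key (verify the exact fields in Sources/OpenAICompatibleProvider):

```swift
import Foundation
import OpenAICompatibleProvider

// Route requests to a local LM Studio endpoint that speaks the OpenAI API.
// The field names (name, baseURL, apiKey) are assumed, not confirmed.
let lmStudio = createOpenAICompatibleProvider(
    settings: OpenAICompatibleProviderSettings(
        name: "lmstudio",
        baseURL: URL(string: "http://localhost:1234/v1")!,
        apiKey: "not-needed-for-local-use"
    )
)

// Downstream code is unchanged: the returned model plugs into generateText,
// generateObject, and the streaming helpers like any first-party model.
let localModel = lmStudio("qwen2.5-7b-instruct")
```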

The TypeScript AI SDK community also maintains dozens of additional providers (Amazon Bedrock, Mistral, Cohere, DeepSeek, and more). These have not yet been ported to Swift; visit the upstream documentation for details and to track parity.

| Provider | Typical model identifiers | Text | Embeddings | Images | Audio |
| --- | --- | --- | --- | --- | --- |
| OpenAI (openai) | gpt-5, gpt-4.1-mini, gpt-4o-audio-preview | ✅ | | | |
| Anthropic (anthropic) | claude-3-5-sonnet, claude-3-opus | ✅ | | | |
| Google Generative AI (google) | gemini-1.5-pro, gemini-2.0-flash-exp | ✅ | | | |
| Groq (groq) | llama-3.1-8b-instant, mixtral-8x7b-32768 | ✅ | | | ✅ (transcription) |
| OpenAI-compatible (createOpenAICompatibleProvider) | Depends on upstream service | | | | |

Model catalogs evolve quickly. Inspect the individual provider modules (Sources/OpenAIProvider, Sources/AnthropicProvider, Sources/GoogleProvider, Sources/GroqProvider, and Sources/OpenAICompatibleProvider) for authoritative ModelId helpers and supported feature sets.

By standardizing provider integrations behind the Swift AI SDK interfaces, you can experiment with multiple vendors, fail over between services, or mix models in the same application without rewriting business logic.