# Moonshot AI
This page adapts the original AI SDK documentation: Moonshot AI.
The Moonshot AI provider offers access to powerful language models through the Moonshot API, including the Kimi series of models with reasoning capabilities.
API keys can be obtained from the Moonshot Platform.
The Moonshot AI provider is available in the `MoonshotAIProvider` module. Add it to your Swift package:

```swift
// Package.swift (excerpt)
dependencies: [
    .package(url: "https://github.com/teunlao/swift-ai-sdk", from: "0.17.5")
],
targets: [
    .target(
        name: "YourTarget",
        dependencies: [
            .product(name: "SwiftAISDK", package: "swift-ai-sdk"),
            .product(name: "MoonshotAIProvider", package: "swift-ai-sdk")
        ]
    )
]
```

## Provider Instance

You can import the default provider instance `moonshotai`:
```swift
import MoonshotAIProvider
```

For custom configuration, use `createMoonshotAI` to create a provider instance with your settings:

```swift
import MoonshotAIProvider

let moonshotai = createMoonshotAI(settings: MoonshotAIProviderSettings(
    apiKey: "your-api-key" // optional, defaults to MOONSHOT_API_KEY
))
```

You can use the following optional settings to customize the Moonshot AI provider instance:
- **baseURL** *String*

  Use a different URL prefix for API calls. The default prefix is `https://api.moonshot.ai/v1`.

- **apiKey** *String*

  API key that is being sent using the `Authorization` header. It defaults to the `MOONSHOT_API_KEY` environment variable.

- **headers** *[String: String]*

  Custom headers to include in the requests.

- **fetch** *FetchFunction*

  Custom fetch implementation.
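As a sketch of how these settings combine, a provider routed through a proxy with an extra header might look like this. The setting names (`baseURL`, `apiKey`, `headers`) are the ones documented above; the proxy URL and header values are placeholders.

```swift
import MoonshotAIProvider

// Hypothetical configuration: send requests through a proxy and attach a
// custom header to every call. Verify the initializer's exact parameter
// labels against the MoonshotAIProviderSettings definition in the SDK.
let customMoonshot = createMoonshotAI(settings: MoonshotAIProviderSettings(
    apiKey: "your-api-key",
    baseURL: "https://my-proxy.example.com/v1",
    headers: ["X-Request-Source": "my-app"]
))

let model = try customMoonshot.chatModel(modelId: "kimi-k2.5")
```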
## Language Models

You can create language models using a provider instance:

```swift
import SwiftAISDK
import MoonshotAIProvider

let result = try await generateText(
    model: try moonshotai.chatModel(modelId: "kimi-k2.5"),
    prompt: "Write a vegetarian lasagna recipe for 4 people."
)

print(result.text)
```

You can also use the `.chatModel()` or `.languageModel()` factory methods:

```swift
let chatModel = try moonshotai.chatModel(modelId: "kimi-k2.5")
let languageModel = try moonshotai.languageModel(modelId: "kimi-k2.5")
```

Moonshot AI language models can be used in the `streamText` function (see AI SDK Core).
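A minimal streaming sketch, assuming the Swift `streamText` mirrors the upstream AI SDK and exposes a `textStream` async sequence of text chunks (check the SDK's actual signature):

```swift
import SwiftAISDK
import MoonshotAIProvider

// Stream the response token-by-token instead of waiting for the full text.
// Assumption: streamText returns a result whose textStream is an
// AsyncSequence of String chunks, as in the upstream AI SDK.
let stream = try streamText(
    model: try moonshotai.chatModel(modelId: "kimi-k2.5"),
    prompt: "Write a vegetarian lasagna recipe for 4 people."
)

for try await chunk in stream.textStream {
    print(chunk, terminator: "")
}
```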
## Reasoning Models

Moonshot AI offers thinking models such as `kimi-k2-thinking` that generate intermediate reasoning tokens before their final response. The reasoning output is streamed through the standard AI SDK reasoning parts.
```swift
import SwiftAISDK
import MoonshotAIProvider

let result = try await generateText(
    model: try moonshotai.chatModel(modelId: "kimi-k2-thinking"),
    providerOptions: [
        "moonshotai": [
            "thinking": [
                "type": "enabled",
                "budgetTokens": 2048
            ],
            "reasoningHistory": "interleaved"
        ]
    ],
    prompt: "How many \"r\"s are in the word \"strawberry\"?"
)

print(result.reasoningText ?? "")
print(result.text)
```

## Provider Options

The following optional provider options are available for Moonshot AI language models:
- **thinking** *object*

  Configuration for thinking/reasoning models like Kimi K2 Thinking.

  - **type** *'enabled' | 'disabled'*

    Whether to enable thinking mode.

  - **budgetTokens** *number*

    Maximum number of tokens for thinking (minimum 1024).

- **reasoningHistory** *'disabled' | 'interleaved' | 'preserved'*

  Controls how reasoning history is handled in multi-turn conversations:

  - `'disabled'`: Remove reasoning from history
  - `'interleaved'`: Include reasoning between tool calls within a single turn
  - `'preserved'`: Keep all reasoning in history
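For example, thinking can be switched off for a latency-sensitive call. This sketch uses the option names listed above and the same dictionary-shaped `providerOptions` as the earlier reasoning example; the prompt is a placeholder.

```swift
import SwiftAISDK
import MoonshotAIProvider

// Disable reasoning output for a quick, direct answer.
// "thinking" / "type" come from the provider options documented above.
let result = try await generateText(
    model: try moonshotai.chatModel(modelId: "kimi-k2-thinking"),
    providerOptions: [
        "moonshotai": [
            "thinking": ["type": "disabled"]
        ]
    ],
    prompt: "Give a one-sentence summary of photosynthesis."
)

print(result.text) // with thinking disabled, the model responds directly
```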
## Model Capabilities

| Model | Image Input | Object Generation | Tool Usage | Tool Streaming |
| --- | --- | --- | --- | --- |
| `moonshot-v1-8k` | No | Yes | Yes | Yes |
| `moonshot-v1-32k` | No | Yes | Yes | Yes |
| `moonshot-v1-128k` | No | Yes | Yes | Yes |
| `kimi-k2` | No | Yes | Yes | Yes |
| `kimi-k2.5` | Yes | Yes | Yes | Yes |
| `kimi-k2-thinking` | No | Yes | Yes | Yes |
| `kimi-k2-thinking-turbo` | No | Yes | Yes | Yes |
| `kimi-k2-turbo` | No | Yes | Yes | Yes |
Note: Please see the Moonshot AI docs for a full list of available models. You can also pass any available provider model ID as a string if needed.