Moonshot AI

This page adapts the original AI SDK documentation: Moonshot AI.

The Moonshot AI provider offers access to powerful language models through the Moonshot API, including the Kimi series of models with reasoning capabilities.

API keys can be obtained from the Moonshot Platform.

The Moonshot AI provider is available in the MoonshotAIProvider module. Add it to your Swift package:

```swift
// Package.swift (excerpt)
dependencies: [
    .package(url: "https://github.com/teunlao/swift-ai-sdk", from: "0.17.5")
],
targets: [
    .target(
        name: "YourTarget",
        dependencies: [
            .product(name: "SwiftAISDK", package: "swift-ai-sdk"),
            .product(name: "MoonshotAIProvider", package: "swift-ai-sdk")
        ]
    )
]
```

You can import the default provider instance moonshotai:

```swift
import MoonshotAIProvider
```

For custom configuration, use createMoonshotAI and create a provider instance with your settings:

```swift
import MoonshotAIProvider

let moonshotai = createMoonshotAI(settings: MoonshotAIProviderSettings(
    apiKey: "your-api-key" // optional, defaults to MOONSHOT_API_KEY
))
```

You can use the following optional settings to customize the Moonshot AI provider instance:

  • baseURL String

    Use a different URL prefix for API calls. The default prefix is https://api.moonshot.ai/v1.

  • apiKey String

    API key that is sent in the Authorization header. It defaults to the MOONSHOT_API_KEY environment variable.

  • headers [String: String]

    Custom headers to include in the requests.

  • fetch FetchFunction

    Custom fetch implementation.
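The settings above can be combined when creating a provider instance. A minimal sketch; the header name and value are illustrative, and only the fields you need have to be supplied:

```swift
import MoonshotAIProvider

// All fields are optional; unset values fall back to the defaults above.
let customMoonshot = createMoonshotAI(settings: MoonshotAIProviderSettings(
    baseURL: "https://api.moonshot.ai/v1",         // the default, shown explicitly
    apiKey: "your-api-key",                        // falls back to MOONSHOT_API_KEY
    headers: ["X-Example-Header": "example-value"] // illustrative custom header
))
```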

You can create language models using a provider instance:

```swift
import SwiftAISDK
import MoonshotAIProvider

let result = try await generateText(
    model: try moonshotai.chatModel(modelId: "kimi-k2.5"),
    prompt: "Write a vegetarian lasagna recipe for 4 people."
)
print(result.text)
```

You can also use the .chatModel() or .languageModel() factory methods:

```swift
let chatModel = try moonshotai.chatModel(modelId: "kimi-k2.5")
let languageModel = try moonshotai.languageModel(modelId: "kimi-k2.5")
```

Moonshot AI language models can be used in the streamText function (see AI SDK Core).
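A minimal streaming sketch, assuming the Swift port mirrors the AI SDK's streamText shape and exposes the response as an async textStream of text deltas:

```swift
import SwiftAISDK
import MoonshotAIProvider

// Stream the completion incrementally instead of waiting for the full text.
let stream = try streamText(
    model: try moonshotai.chatModel(modelId: "kimi-k2.5"),
    prompt: "Write a vegetarian lasagna recipe for 4 people."
)

// `textStream` is assumed to be an AsyncSequence of text deltas.
for try await delta in stream.textStream {
    print(delta, terminator: "")
}
```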

Moonshot AI offers thinking models like kimi-k2-thinking that generate intermediate reasoning tokens before their final response. The reasoning output is streamed through the standard AI SDK reasoning parts.

```swift
import SwiftAISDK
import MoonshotAIProvider

let result = try await generateText(
    model: try moonshotai.chatModel(modelId: "kimi-k2-thinking"),
    providerOptions: [
        "moonshotai": [
            "thinking": [
                "type": "enabled",
                "budgetTokens": 2048
            ],
            "reasoningHistory": "interleaved"
        ]
    ],
    prompt: "How many \"r\"s are in the word \"strawberry\"?"
)
print(result.reasoningText ?? "")
print(result.text)
```

The following optional provider options are available for Moonshot AI language models:

  • thinking object

    Configuration for thinking/reasoning models like Kimi K2 Thinking.

    • type 'enabled' | 'disabled'

      Whether to enable thinking mode

    • budgetTokens number

      Maximum number of tokens for thinking (minimum 1024)

  • reasoningHistory 'disabled' | 'interleaved' | 'preserved'

    Controls how reasoning history is handled in multi-turn conversations:

    • 'disabled': Remove reasoning from history
    • 'interleaved': Include reasoning between tool calls within a single turn
    • 'preserved': Keep all reasoning in history
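For example, thinking can be switched off entirely on a thinking model by setting type to 'disabled'; a sketch that reuses the generateText call shape from above:

```swift
import SwiftAISDK
import MoonshotAIProvider

// Disable intermediate reasoning; only the final answer is generated.
let result = try await generateText(
    model: try moonshotai.chatModel(modelId: "kimi-k2-thinking"),
    providerOptions: [
        "moonshotai": [
            "thinking": ["type": "disabled"]
        ]
    ],
    prompt: "Summarize the plot of Hamlet in one sentence."
)
print(result.text)
```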

Model                    Image Input   Object Generation   Tool Usage   Tool Streaming
moonshot-v1-8k           No            Yes                 Yes          Yes
moonshot-v1-32k          No            Yes                 Yes          Yes
moonshot-v1-128k         No            Yes                 Yes          Yes
kimi-k2                  No            Yes                 Yes          Yes
kimi-k2.5                Yes           Yes                 Yes          Yes
kimi-k2-thinking         No            Yes                 Yes          Yes
kimi-k2-thinking-turbo   No            Yes                 Yes          Yes
kimi-k2-turbo            No            Yes                 Yes          Yes

Note: Please see the Moonshot AI docs for a full list of available models. You can also pass any available provider model ID as a string if needed.