# Compatible Anthropic API

Call MiniMax models using the Anthropic SDK.
To meet developers' needs for the Anthropic API ecosystem, our API now supports the Anthropic API format. With simple configuration, you can integrate MiniMax capabilities into the Anthropic API ecosystem.
## Quick Start

### 1. Install Anthropic SDK

<CodeGroup>
```bash Python theme={null}
pip install anthropic
```

```bash Node.js theme={null}
npm install @anthropic-ai/sdk
```
</CodeGroup>
### 2. Configure Environment Variables

For international users, use `https://api.minimax.io/anthropic`; for users in China, use `https://api.minimaxi.com/anthropic`.

```bash theme={null}
export ANTHROPIC_BASE_URL=https://api.minimax.io/anthropic
export ANTHROPIC_API_KEY=${YOUR_API_KEY}
```
### 3. Call API
```python Python theme={null}
import anthropic

client = anthropic.Anthropic()

message = client.messages.create(
    model="MiniMax-M2.1",
    max_tokens=1000,
    system="You are a helpful assistant.",
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "Hi, how are you?"
                }
            ]
        }
    ]
)

for block in message.content:
    if block.type == "thinking":
        print(f"Thinking:\n{block.thinking}\n")
    elif block.type == "text":
        print(f"Text:\n{block.text}\n")
```

### 4. Important Note
In multi-turn function call conversations, the complete model response (i.e., the assistant message) must be appended to the conversation history to maintain the continuity of the reasoning chain.

* Append the full `response.content` list to the message history (this includes all content blocks: thinking / text / tool\_use)
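As a sketch of this pattern, the history update across a tool-use turn looks like the following. Plain dicts stand in for the SDK's response objects here, and `get_weather`, its input, and its result are hypothetical examples, not part of the API.

```python
# Sketch: carry the full assistant response across a tool-use turn.
# `response_content` stands in for response.content from
# client.messages.create(); get_weather and its result are hypothetical.

messages = [
    {"role": "user", "content": [{"type": "text", "text": "What's the weather in Tokyo?"}]}
]

# Simulated assistant turn: thinking + tool_use blocks, shown as dicts
response_content = [
    {"type": "thinking", "thinking": "The user wants weather data; call the tool."},
    {"type": "tool_use", "id": "call_1", "name": "get_weather", "input": {"city": "Tokyo"}},
]

# Append the FULL content list (thinking / text / tool_use) as one assistant message
messages.append({"role": "assistant", "content": response_content})

# Supply the tool result in a user message, then call the API again with `messages`
messages.append({
    "role": "user",
    "content": [{"type": "tool_result", "tool_use_id": "call_1", "content": "Sunny, 18°C"}],
})
```

Dropping the thinking or tool\_use blocks from the assistant message breaks the reasoning chain, so the whole content list is appended as-is.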
## Supported Models

When using the Anthropic SDK, the following models are supported: MiniMax-M2.1, MiniMax-M2.1-lightning, and MiniMax-M2.
| Model Name | Description |
|---|---|
| MiniMax-M2.1 | Powerful multi-language programming capabilities with a comprehensively enhanced programming experience (output speed approximately 60 tps) |
| MiniMax-M2.1-lightning | Faster and more agile (output speed approximately 100 tps) |
| MiniMax-M2 | Agentic capabilities, advanced reasoning |
<Note> The Anthropic API compatibility interface currently only supports the MiniMax-M2.1, MiniMax-M2.1-lightning, and MiniMax-M2 models. For other models, please use the standard MiniMax API interface. </Note>

## Compatibility

### Supported Parameters
When using the Anthropic SDK, we support the following input parameters:
| Parameter | Support Status | Description |
|---|---|---|
| `model` | Fully supported | Supports the MiniMax-M2.1, MiniMax-M2.1-lightning, and MiniMax-M2 models |
| `messages` | Partially supported | Supports text and tool calls; no image/document input |
| `max_tokens` | Fully supported | Maximum number of tokens to generate |
| `stream` | Fully supported | Streaming response |
| `system` | Fully supported | System prompt |
| `temperature` | Fully supported | Range (0.0, 1.0], controls output randomness; recommended value: 1 |
| `tool_choice` | Fully supported | Tool selection strategy |
| `tools` | Fully supported | Tool definitions |
| `top_p` | Fully supported | Nucleus sampling parameter |
| `metadata` | Fully supported | Metadata |
| `thinking` | Fully supported | Reasoning content |
| `top_k` | Ignored | This parameter will be ignored |
| `stop_sequences` | Ignored | This parameter will be ignored |
| `service_tier` | Ignored | This parameter will be ignored |
| `mcp_servers` | Ignored | This parameter will be ignored |
| `context_management` | Ignored | This parameter will be ignored |
| `container` | Ignored | This parameter will be ignored |
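As an illustration of the table above, a request built only from supported parameters might look like this. The dict holds the keyword arguments you would pass to `client.messages.create`; the specific values are examples, not recommendations beyond the documented temperature range.

```python
# Hypothetical request using only parameters the compatibility layer
# supports; ignored parameters (top_k, stop_sequences, ...) are omitted.
request = {
    "model": "MiniMax-M2.1",
    "max_tokens": 1000,
    "temperature": 1.0,  # documented range: (0.0, 1.0]
    "top_p": 0.95,
    "system": "You are a helpful assistant.",
    "messages": [
        {"role": "user", "content": [{"type": "text", "text": "Hello"}]}
    ],
    "stream": False,
}

# Client-side guard matching the documented temperature range
assert 0.0 < request["temperature"] <= 1.0

# In real code: client.messages.create(**request)
```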
### Messages Field Support
| Field Type | Support Status | Description |
|---|---|---|
| `type="text"` | Fully supported | Text messages |
| `type="tool_use"` | Fully supported | Tool calls |
| `type="tool_result"` | Fully supported | Tool call results |
| `type="thinking"` | Fully supported | Reasoning content |
| `type="image"` | Not supported | Image input not supported yet |
| `type="document"` | Not supported | Document input not supported yet |
## Examples

### Streaming Response
```python Python theme={null}
import anthropic

client = anthropic.Anthropic()

print("Starting stream response...\n")
print("=" * 60)
print("Thinking Process:")
print("=" * 60)

stream = client.messages.create(
    model="MiniMax-M2.1",
    max_tokens=1000,
    system="You are a helpful assistant.",
    messages=[
        {"role": "user", "content": [{"type": "text", "text": "Hi, how are you?"}]}
    ],
    stream=True,
)

reasoning_buffer = ""
text_buffer = ""

for chunk in stream:
    if chunk.type == "content_block_start":
        if hasattr(chunk, "content_block") and chunk.content_block:
            if chunk.content_block.type == "text":
                print("\n" + "=" * 60)
                print("Response Content:")
                print("=" * 60)
    elif chunk.type == "content_block_delta":
        if hasattr(chunk, "delta") and chunk.delta:
            if chunk.delta.type == "thinking_delta":
                # Stream output thinking process
                new_thinking = chunk.delta.thinking
                if new_thinking:
                    print(new_thinking, end="", flush=True)
                    reasoning_buffer += new_thinking
            elif chunk.delta.type == "text_delta":
                # Stream output text content
                new_text = chunk.delta.text
                if new_text:
                    print(new_text, end="", flush=True)
                    text_buffer += new_text

print("\n")
```
### Tool Use & Interleaved Thinking
To learn how to use M2.1 Tool Use and Interleaved Thinking capabilities with the Anthropic SDK, refer to the following documentation.
<Columns cols={1}>
<Card title="M2.1 Tool Use & Interleaved Thinking" icon="book-open" href="/guides/text-m2-function-call#anthropic-sdk" arrow="true" cta="Click here">
Learn how to leverage MiniMax-M2.1 tool calling and interleaved thinking capabilities to enhance performance in complex tasks.
</Card>
</Columns>
## Important Notes
<Warning>
1. The Anthropic API compatibility interface currently only supports the `MiniMax-M2.1`, `MiniMax-M2.1-lightning`, and `MiniMax-M2` models
2. The `temperature` parameter range is (0.0, 1.0], values outside this range will return an error
3. Some Anthropic parameters (such as `top_k`, `stop_sequences`, `service_tier`, `mcp_servers`, `context_management`, `container`) will be ignored
4. Image and document type inputs are not currently supported
</Warning>
## Related Links
* [Anthropic SDK Documentation](https://docs.anthropic.com/en/api/client-sdks)
* [MiniMax Text Generation API](/api-reference/text-intro)
* [M2.1 Tool Use & Interleaved Thinking](/guides/text-m2-function-call)
## Recommended Reading
<Columns cols={2}>
<Card title="Text Generation" icon="book-open" href="/guides/text-generation" arrow="true" cta="Click here">
Supports text generation via compatible Anthropic API and OpenAI API.
</Card>
<Card title="Compatible OpenAI API" icon="book-open" href="/api-reference/text-openai-api" arrow="true" cta="Click here">
Use OpenAI SDK with MiniMax models
</Card>
<Card title="M2.1 for AI Coding Tools" icon="book-open" href="/guides/text-ai-coding-tools" arrow="true" cta="Click here">
MiniMax-M2.1 excels at code understanding, dialogue, and reasoning.
</Card>
<Card title="M2.1 Tool Use & Interleaved Thinking" icon="book-open" href="/guides/text-m2-function-call" arrow="true" cta="Click here">
AI models can call external functions to extend their capabilities.
</Card>
</Columns>
---