python/claude-api/batches.md
# Message Batches API — Python

The Batches API (`POST /v1/messages/batches`) processes Messages API requests asynchronously at 50% of standard prices.

## Key Facts

- Up to 100,000 requests or 256 MB per batch
- Most batches complete within 1 hour; maximum 24 hours
- Results available for 29 days after creation
- 50% cost reduction on all token usage
- All Messages API features supported (vision, tools, caching, etc.)

---

## Create a Batch

```python
import anthropic
from anthropic.types.message_create_params import MessageCreateParamsNonStreaming
from anthropic.types.messages.batch_create_params import Request

client = anthropic.Anthropic()

message_batch = client.messages.batches.create(
    requests=[
        Request(
            custom_id="request-1",
            params=MessageCreateParamsNonStreaming(
                model="claude-opus-4-7",
                max_tokens=16000,
                messages=[{"role": "user", "content": "Summarize climate change impacts"}]
            )
        ),
        Request(
            custom_id="request-2",
            params=MessageCreateParamsNonStreaming(
                model="claude-opus-4-7",
                max_tokens=16000,
                messages=[{"role": "user", "content": "Explain quantum computing basics"}]
            )
        ),
    ]
)

print(f"Batch ID: {message_batch.id}")
print(f"Status: {message_batch.processing_status}")
```

---

## Poll for Completion

```python
import time

while True:
    batch = client.messages.batches.retrieve(message_batch.id)
    if batch.processing_status == "ended":
        break
    print(f"Status: {batch.processing_status}, processing: {batch.request_counts.processing}")
    time.sleep(60)

print("Batch complete!")
print(f"Succeeded: {batch.request_counts.succeeded}")
print(f"Errored: {batch.request_counts.errored}")
```

---

## Retrieve Results

> **Note:** Examples below use `match/case` syntax, requiring Python 3.10+.
> For earlier versions, use `if/elif` chains instead.

```python
for result in client.messages.batches.results(message_batch.id):
    match result.result.type:
        case "succeeded":
            msg = result.result.message
            text = next((b.text for b in msg.content if b.type == "text"), "")
            print(f"[{result.custom_id}] {text[:100]}")
        case "errored":
            if result.result.error.type == "invalid_request":
                print(f"[{result.custom_id}] Validation error - fix request and retry")
            else:
                print(f"[{result.custom_id}] Server error - safe to retry")
        case "canceled":
            print(f"[{result.custom_id}] Canceled")
        case "expired":
            print(f"[{result.custom_id}] Expired - resubmit")
```

---

## Cancel a Batch

```python
cancelled = client.messages.batches.cancel(message_batch.id)
print(f"Status: {cancelled.processing_status}")  # "canceling"
```

---

## Batch with Prompt Caching

```python
shared_system = [
    {"type": "text", "text": "You are a literary analyst."},
    {
        "type": "text",
        "text": large_document_text,  # Shared across all requests
        "cache_control": {"type": "ephemeral"}
    }
]

message_batch = client.messages.batches.create(
    requests=[
        Request(
            custom_id=f"analysis-{i}",
            params=MessageCreateParamsNonStreaming(
                model="claude-opus-4-7",
                max_tokens=16000,
                system=shared_system,
                messages=[{"role": "user", "content": question}]
            )
        )
        for i, question in enumerate(questions)
    ]
)
```

---

## Full End-to-End Example

```python
import anthropic
import time
from anthropic.types.message_create_params import MessageCreateParamsNonStreaming
from anthropic.types.messages.batch_create_params import Request

client = anthropic.Anthropic()

# 1. Prepare requests
items_to_classify = [
    "The product quality is excellent!",
    "Terrible customer service, never again.",
    "It's okay, nothing special.",
]

requests = [
    Request(
        custom_id=f"classify-{i}",
        params=MessageCreateParamsNonStreaming(
            model="claude-haiku-4-5",
            max_tokens=50,
            messages=[{
                "role": "user",
                "content": f"Classify as positive/negative/neutral (one word): {text}"
            }]
        )
    )
    for i, text in enumerate(items_to_classify)
]

# 2. Create batch
batch = client.messages.batches.create(requests=requests)
print(f"Created batch: {batch.id}")

# 3. Wait for completion
while True:
    batch = client.messages.batches.retrieve(batch.id)
    if batch.processing_status == "ended":
        break
    time.sleep(10)

# 4. Collect results
results = {}
for result in client.messages.batches.results(batch.id):
    if result.result.type == "succeeded":
        msg = result.result.message
        results[result.custom_id] = next((b.text for b in msg.content if b.type == "text"), "")

for custom_id, classification in sorted(results.items()):
    print(f"{custom_id}: {classification}")
```
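
---

## Retrying Failed Requests

The result handling above distinguishes errors that must be fixed (`invalid_request`) from results that are safe to resubmit unchanged (server errors and `expired`). That policy can be folded into a small helper. This is a sketch, not part of the SDK: `retryable_ids` is a hypothetical name, and it relies only on the `result.type` and `error.type` fields used in the examples above.

```python
def retryable_ids(results):
    """Collect custom_ids whose requests can be resubmitted unchanged.

    Expired results and server-side errors are safe to retry;
    "invalid_request" errors need the request fixed first, and
    succeeded/canceled results are skipped.
    """
    retry = []
    for result in results:
        kind = result.result.type
        if kind == "expired":
            retry.append(result.custom_id)
        elif kind == "errored" and result.result.error.type != "invalid_request":
            retry.append(result.custom_id)
    return retry

# Usage sketch, with `client`, `batch`, and `requests` as in the
# end-to-end example above:
#
# to_retry = set(retryable_ids(client.messages.batches.results(batch.id)))
# retry_requests = [r for r in requests if r["custom_id"] in to_retry]
# if retry_requests:
#     retry_batch = client.messages.batches.create(requests=retry_requests)
```

Because `Request` is a plain dict-like params object keyed by `custom_id`, filtering the original `requests` list is enough to rebuild a batch containing only the retryable work.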