Anthropic launches cost-effective Message Batches API for efficient large-scale queries

Anthropic has launched a new Message Batches API designed to handle large volumes of queries asynchronously. Developers can now send up to 10,000 queries per batch, with results returned within 24 hours. The API costs 50% less than standard API calls, making it well suited to non-time-sensitive workloads.
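
For illustration, here is a minimal sketch of submitting and polling a batch with the Anthropic Python SDK. The method paths, batch status values, and model identifier shown are assumptions based on the SDK's documented batches interface and may differ from the current release.

```python
# Minimal sketch of the Message Batches workflow (assumed SDK surface:
# client.messages.batches.* and the claude-3-5-sonnet model id may differ).
import time
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Submit a batch: each request carries a custom_id so results can be matched
# back to the originating query once processing finishes.
batch = client.messages.batches.create(
    requests=[
        {
            "custom_id": f"doc-{i}",
            "params": {
                "model": "claude-3-5-sonnet-20240620",
                "max_tokens": 256,
                "messages": [{"role": "user", "content": f"Summarize document {i}."}],
            },
        }
        for i in range(100)  # the API accepts up to 10,000 requests per batch
    ]
)

# Poll until the batch finishes (Anthropic targets completion within 24 hours).
while batch.processing_status != "ended":
    time.sleep(60)
    batch = client.messages.batches.retrieve(batch.id)

# Iterate over the per-request results and match them back by custom_id.
for entry in client.messages.batches.results(batch.id):
    print(entry.custom_id, entry.result.type)
```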

Currently available in public beta, the Batches API supports Claude 3.5 Sonnet, Claude 3 Opus, and Claude 3 Haiku on the Anthropic API. Customers using Claude in Amazon Bedrock can also use batch inference, and support for Google Cloud's Vertex AI is planned soon.

This initiative positions Anthropic to better compete with other AI providers, such as OpenAI, which introduced a similar feature earlier this year.

by Paul


Claude is an AI chatbot built on Anthropic's research to be helpful, honest, and harmless. Offering a chat interface and API access via a developer console, Claude excels at a range of conversational and text-processing tasks. With a rating of 3.2, its AI-powered capabilities position it alongside alternatives like ChatGPT, HuggingChat, and Microsoft Copilot.
