
Providers & Models

VoltAgent leverages the Vercel AI SDK to provide seamless integration with a wide range of AI providers and models. The AI SDK offers a standardized approach to interacting with LLMs through a unified interface that allows you to switch between providers with ease.

We strongly recommend using the Vercel AI SDK providers directly with VoltAgent's @voltagent/vercel-ai provider. This gives you access to:

  • 30+ AI providers with consistent APIs
  • Automatic updates with the latest models
  • Built-in support for streaming, tool calling, and structured outputs
  • Community-driven provider ecosystem

Custom Providers

While the Vercel AI SDK covers most use cases, VoltAgent also supports custom provider implementations for advanced scenarios. You can create your own provider by implementing the LLMProvider interface.

When to use custom providers:

  • Integration with proprietary or internal LLM services
  • Advanced control over request/response handling
  • Special authentication mechanisms
  • Custom retry logic or rate limiting
  • Integration with legacy systems
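
As a rough illustration, a custom provider for an internal service typically wraps that service's HTTP API behind the interface methods. The class below is only a hedged sketch with hypothetical method names and endpoint paths; the exact contract is defined by the LLMProvider type in @voltagent/core.

// Hypothetical sketch of a provider for a proprietary/internal LLM service.
// Method names are illustrative; consult the LLMProvider interface in
// @voltagent/core for the exact methods and signatures to implement.
class InternalLLMProvider {
  constructor(
    private baseUrl: string,
    private apiKey: string
  ) {}

  // Map a chat-style request onto the internal service's (hypothetical) endpoint.
  async generateText(options: { messages: { role: string; content: string }[] }) {
    const response = await fetch(`${this.baseUrl}/v1/generate`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${this.apiKey}`, // custom auth scheme goes here
      },
      body: JSON.stringify({ messages: options.messages }),
    });
    const data = await response.json();
    return { text: data.output };
  }

  // Streaming, structured output, and tool-calling methods would follow the
  // same pattern, translating between VoltAgent's types and your service's API.
}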

For most use cases, the Vercel AI SDK providers are sufficient and recommended as they offer:

  • Battle-tested implementations
  • Regular updates and improvements
  • Wide community support
  • Consistent API across all providers

Tip: Before implementing a custom provider, check if your use case can be covered by the 30+ providers available through Vercel AI SDK, including OpenAI-compatible providers that work with custom endpoints.

Installation

First, install the Vercel AI provider for VoltAgent:

npm install @voltagent/vercel-ai

Then install the specific AI SDK provider you want to use:

# For example, to use OpenAI:
npm install @ai-sdk/openai

# Or Anthropic:
npm install @ai-sdk/anthropic

# Or Google:
npm install @ai-sdk/google

Usage Example

import { Agent } from "@voltagent/core";
import { VercelAIProvider } from "@voltagent/vercel-ai";
import { openai } from "@ai-sdk/openai";

const provider = new VercelAIProvider();

const agent = new Agent({
  name: "my-agent",
  instructions: "You are a helpful assistant",
  llm: provider,
  model: openai("gpt-4-turbo"),
});
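
Once the agent is constructed, you can call it directly. A minimal follow-up, assuming the Agent class's generateText method (see the VoltAgent core documentation for the full Agent API):

// Ask the configured model a question through the agent.
const result = await agent.generateText("What can you help me with?");
console.log(result.text);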

Available Providers

First-Party AI SDK Providers

These providers are maintained by Vercel and offer the highest level of support and integration:

Foundation Models

| Provider | Package | Documentation | Key Models |
| --- | --- | --- | --- |
| xAI Grok | @ai-sdk/xai | Docs | grok-4, grok-3, grok-2-vision |
| OpenAI | @ai-sdk/openai | Docs | gpt-4.1, gpt-4o, o3, o1 |
| Anthropic | @ai-sdk/anthropic | Docs | claude-opus-4, claude-sonnet-4, claude-3.5 |
| Google Generative AI | @ai-sdk/google | Docs | gemini-2.0-flash, gemini-1.5-pro |
| Google Vertex | @ai-sdk/google-vertex | Docs | gemini models, claude models via Vertex |
| Mistral | @ai-sdk/mistral | Docs | mistral-large, pixtral-large, mistral-medium |
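
Because every AI SDK provider follows the same call pattern, switching models is usually a one-line change. For example, a sketch mirroring the usage example above with a Claude model id from the tables on this page:

import { Agent } from "@voltagent/core";
import { VercelAIProvider } from "@voltagent/vercel-ai";
import { anthropic } from "@ai-sdk/anthropic";

// Same agent setup as before; only the model factory changes.
const agent = new Agent({
  name: "my-agent",
  instructions: "You are a helpful assistant",
  llm: new VercelAIProvider(),
  model: anthropic("claude-sonnet-4-20250514"),
});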

Cloud Platforms

| Provider | Package | Documentation | Description |
| --- | --- | --- | --- |
| Amazon Bedrock | @ai-sdk/amazon-bedrock | Docs | Access to various models via AWS |
| Azure OpenAI | @ai-sdk/azure | Docs | OpenAI models via Azure |
| Vercel | @ai-sdk/vercel | Docs | v0 model for code generation |

Specialized Providers

| Provider | Package | Documentation | Specialization |
| --- | --- | --- | --- |
| Groq | @ai-sdk/groq | Docs | Ultra-fast inference |
| Together.ai | @ai-sdk/togetherai | Docs | Open-source models |
| Cohere | @ai-sdk/cohere | Docs | Enterprise search & generation |
| Fireworks | @ai-sdk/fireworks | Docs | Fast open-source models |
| DeepInfra | @ai-sdk/deepinfra | Docs | Affordable inference |
| DeepSeek | @ai-sdk/deepseek | Docs | DeepSeek models including reasoner |
| Cerebras | @ai-sdk/cerebras | Docs | Fast Llama models |
| Perplexity | @ai-sdk/perplexity | Docs | Search-enhanced responses |

Audio & Speech Providers

| Provider | Package | Documentation | Specialization |
| --- | --- | --- | --- |
| ElevenLabs | @ai-sdk/elevenlabs | Docs | Text-to-speech |
| LMNT | @ai-sdk/lmnt | Docs | Voice synthesis |
| Hume | @ai-sdk/hume | Docs | Emotional intelligence |
| Rev.ai | @ai-sdk/revai | Docs | Speech recognition |
| Deepgram | @ai-sdk/deepgram | Docs | Speech-to-text |
| Gladia | @ai-sdk/gladia | Docs | Audio intelligence |
| AssemblyAI | @ai-sdk/assemblyai | Docs | Speech recognition & understanding |

Community Providers

These providers are created and maintained by the open-source community:

| Provider | Package | Documentation | Description |
| --- | --- | --- | --- |
| Ollama | ollama-ai-provider | Docs | Local model execution |
| FriendliAI | @friendliai/ai-provider | Docs | Optimized inference |
| Portkey | @portkey-ai/vercel-provider | Docs | LLM gateway & observability |
| Cloudflare Workers AI | workers-ai-provider | Docs | Edge AI inference |
| OpenRouter | @openrouter/ai-sdk-provider | Docs | Unified API for multiple providers |
| Requesty | @requesty/ai-sdk | Docs | Request management |
| Crosshatch | @crosshatch/ai-provider | Docs | Specialized models |
| Mixedbread | mixedbread-ai-provider | Docs | Embedding models |
| Voyage AI | voyage-ai-provider | Docs | Embedding models |
| Mem0 | @mem0/vercel-ai-provider | Docs | Memory-enhanced AI |
| Letta | @letta-ai/vercel-ai-sdk-provider | Docs | Stateful agents |
| Spark | spark-ai-provider | Docs | Chinese language models |
| AnthropicVertex | anthropic-vertex-ai | Docs | Claude via Vertex AI |
| LangDB | @langdb/vercel-provider | Docs | Database-aware AI |
| Dify | dify-ai-provider | Docs | LLMOps platform |
| Sarvam | sarvam-ai-provider | Docs | Indian language models |
| Claude Code | ai-sdk-provider-claude-code | Docs | Code-optimized Claude |
| Built-in AI | built-in-ai | Docs | Browser-native AI |
| Gemini CLI | ai-sdk-provider-gemini-cli | Docs | CLI-based Gemini |
| A2A | a2a-ai-provider | Docs | Specialized models |
| SAP-AI | @mymediset/sap-ai-provider | Docs | SAP AI Core integration |
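
Community providers plug into VoltAgent the same way as first-party ones. For example, a locally served Ollama model; this is a sketch assuming the ollama-ai-provider package's ollama export and a model tag you have already pulled locally:

import { Agent } from "@voltagent/core";
import { VercelAIProvider } from "@voltagent/vercel-ai";
import { ollama } from "ollama-ai-provider";

// Talks to a local Ollama server (http://localhost:11434 by default).
const agent = new Agent({
  name: "local-agent",
  instructions: "You are a helpful assistant",
  llm: new VercelAIProvider(),
  model: ollama("llama3.1"),
});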

OpenAI-Compatible Providers

For providers that follow the OpenAI API specification:

| Provider | Documentation | Description |
| --- | --- | --- |
| LM Studio | Docs | Local model execution with GUI |
| Baseten | Docs | Model deployment platform |
| Any OpenAI-compatible API | Docs | Custom endpoints |
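
One way to reach such endpoints is the createOpenAI factory from @ai-sdk/openai with a custom baseURL. The URL, key, and model id below are placeholders for whatever your server exposes (LM Studio's default local endpoint is shown as an example):

import { Agent } from "@voltagent/core";
import { VercelAIProvider } from "@voltagent/vercel-ai";
import { createOpenAI } from "@ai-sdk/openai";

// Point the OpenAI provider at an OpenAI-compatible endpoint.
const lmstudio = createOpenAI({
  baseURL: "http://localhost:1234/v1", // LM Studio's default local server
  apiKey: "not-needed-locally", // many local servers accept any non-empty key
});

const agent = new Agent({
  name: "local-agent",
  instructions: "You are a helpful assistant",
  llm: new VercelAIProvider(),
  model: lmstudio("your-local-model"),
});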

Model Capabilities

The AI providers support different language models with varying capabilities such as image input, object generation, tool usage, and tool streaming. Popular models include the following; consult each provider's AI SDK documentation for the per-model capability matrix:

| Provider | Model |
| --- | --- |
| xAI Grok | grok-4 |
| xAI Grok | grok-3 |
| xAI Grok | grok-3-fast |
| xAI Grok | grok-3-mini |
| xAI Grok | grok-3-mini-fast |
| xAI Grok | grok-2-1212 |
| xAI Grok | grok-2-vision-1212 |
| xAI Grok | grok-beta |
| xAI Grok | grok-vision-beta |
| Vercel | v0-1.0-md |
| OpenAI | gpt-4.1 |
| OpenAI | gpt-4.1-mini |
| OpenAI | gpt-4.1-nano |
| OpenAI | gpt-4o |
| OpenAI | gpt-4o-mini |
| OpenAI | gpt-4 |
| OpenAI | o3-mini |
| OpenAI | o3 |
| OpenAI | o4-mini |
| OpenAI | o1 |
| OpenAI | o1-mini |
| OpenAI | o1-preview |
| Anthropic | claude-opus-4-20250514 |
| Anthropic | claude-sonnet-4-20250514 |
| Anthropic | claude-3-7-sonnet-20250219 |
| Anthropic | claude-3-5-sonnet-20241022 |
| Anthropic | claude-3-5-sonnet-20240620 |
| Anthropic | claude-3-5-haiku-20241022 |
| Mistral | pixtral-large-latest |
| Mistral | mistral-large-latest |
| Mistral | mistral-medium-latest |
| Mistral | mistral-medium-2505 |
| Mistral | mistral-small-latest |
| Mistral | pixtral-12b-2409 |
| Google Generative AI | gemini-2.0-flash-exp |
| Google Generative AI | gemini-1.5-flash |
| Google Generative AI | gemini-1.5-pro |
| Google Vertex | gemini-2.0-flash-exp |
| Google Vertex | gemini-1.5-flash |
| Google Vertex | gemini-1.5-pro |
| DeepSeek | deepseek-chat |
| DeepSeek | deepseek-reasoner |
| Cerebras | llama3.1-8b |
| Cerebras | llama3.1-70b |
| Cerebras | llama3.3-70b |
| Groq | meta-llama/llama-4-scout-17b-16e-instruct |
| Groq | llama-3.3-70b-versatile |
| Groq | llama-3.1-8b-instant |
| Groq | mixtral-8x7b-32768 |
| Groq | gemma2-9b-it |

Note: This table is not exhaustive. Additional models can be found in the provider documentation pages and on the provider websites.

Migration from Deprecated Providers

If you're currently using VoltAgent's native providers (@voltagent/anthropic-ai, @voltagent/google-ai, @voltagent/groq-ai), we recommend migrating to the Vercel AI SDK providers:

Before (Deprecated):

import { Agent } from "@voltagent/core";
import { AnthropicProvider } from "@voltagent/anthropic-ai";

const provider = new AnthropicProvider({ apiKey: "..." });

const agent = new Agent({
  llm: provider,
  model: "claude-opus-4-1",
});

After (Recommended):

import { Agent } from "@voltagent/core";
import { VercelAIProvider } from "@voltagent/vercel-ai";
import { anthropic } from "@ai-sdk/anthropic";

const provider = new VercelAIProvider();

const agent = new Agent({
  llm: provider,
  model: anthropic("claude-opus-4-1"),
});

Environment Variables

Most providers use environment variables for API keys:

# OpenAI
OPENAI_API_KEY=your-key

# Anthropic
ANTHROPIC_API_KEY=your-key

# Google
GOOGLE_GENERATIVE_AI_API_KEY=your-key

# Groq
GROQ_API_KEY=your-key

# And so on...
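
If a key lives under a different variable name, or you prefer to pass it explicitly, most AI SDK providers expose a create* factory that accepts an apiKey option. A sketch using @ai-sdk/anthropic (MY_ANTHROPIC_KEY is a placeholder variable name):

import { createAnthropic } from "@ai-sdk/anthropic";

// Configure the provider with an explicit key instead of ANTHROPIC_API_KEY.
const anthropic = createAnthropic({
  apiKey: process.env.MY_ANTHROPIC_KEY,
});

const model = anthropic("claude-opus-4-1");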

Next Steps

  1. Choose a provider based on your needs (performance, cost, capabilities)
  2. Install the corresponding package
  3. Configure your API keys
  4. Start building with VoltAgent!

For detailed information about each provider, visit the Vercel AI SDK documentation.


Acknowledgments

The provider lists and model capabilities in this documentation are sourced from the Vercel AI SDK documentation.

A special thanks to the Vercel AI SDK maintainers and community for creating and maintaining this comprehensive ecosystem of AI providers. Their work enables developers to seamlessly integrate with 30+ AI providers through a unified, well-designed interface.

VoltAgent builds upon this excellent foundation to provide a complete framework for building AI agents and workflows.
