LLM Drivers provide a standardized interface for interacting with different language model providers. Switch between providers without changing your application code.

Available Drivers

OpenAI

Default driver for the OpenAI API. Works with GPT-4, GPT-4o, and other OpenAI models.

Anthropic (Claude)

Native support for Claude models via the Anthropic API.

Google Gemini

Native Gemini driver for Google’s AI models.

Groq

Ultra-fast inference with Groq’s LPU platform.

Ollama

Run local LLMs with Ollama integration.

OpenRouter

Access multiple providers through OpenRouter’s unified API.
All drivers are pre-configured in config/laragent.php. Add your API key and you’re ready to go.

Quick Setup

OpenAI (Default)

.env
OPENAI_API_KEY=your-api-key
app/AiAgents/MyAgent.php
class MyAgent extends Agent
{
    protected $provider = 'default'; // Uses OpenAI
    protected $model = 'gpt-4o-mini';
}
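
Once the key is set, the agent can be called from anywhere in your application. A minimal usage sketch — the chat key 'user-123' and the prompt are illustrative; `for()` and `respond()` are LarAgent's standard agent methods:

```php
use App\AiAgents\MyAgent;

// 'user-123' keys the chat history for this conversation
$reply = MyAgent::for('user-123')->respond('Hello!');

echo $reply;
```

The same call works unchanged after switching the agent to any of the providers below; only `$provider` and `$model` differ.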

Anthropic (Claude)

.env
ANTHROPIC_API_KEY=your-api-key
app/AiAgents/MyAgent.php
class MyAgent extends Agent
{
    protected $provider = 'claude';
    protected $model = 'claude-sonnet-4-20250514';
}

Google Gemini

.env
GEMINI_API_KEY=your-api-key
app/AiAgents/MyAgent.php
class MyAgent extends Agent
{
    protected $provider = 'gemini';
    protected $model = 'gemini-2.5-pro-preview-03-25';
}

Groq

.env
GROQ_API_KEY=your-api-key
app/AiAgents/MyAgent.php
class MyAgent extends Agent
{
    protected $provider = 'groq';
    protected $model = 'llama-3.3-70b-versatile';
}

Ollama (Local)

app/AiAgents/MyAgent.php
class MyAgent extends Agent
{
    protected $provider = 'ollama';
    protected $model = 'llama2'; // Any model installed in Ollama
}
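
Ollama needs no API key, but the Ollama server must be running and the model pulled locally before the agent can use it. For example (model name matches the agent above):

```shell
# Download the model, then make sure the Ollama server is running
ollama pull llama2
ollama serve   # not needed if Ollama already runs as a background service
```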

OpenRouter

.env
OPENROUTER_API_KEY=your-api-key
app/AiAgents/MyAgent.php
class MyAgent extends Agent
{
    protected $provider = 'openrouter';
    protected $model = 'anthropic/claude-3-opus';
}

Configuration

Global Configuration

Configure providers in config/laragent.php:
config/laragent.php
'providers' => [
    'default' => [
        'label' => 'openai',
        'api_key' => env('OPENAI_API_KEY'),
        'driver' => \LarAgent\Drivers\OpenAi\OpenAiDriver::class,
        'default_temperature' => 1,
        'default_max_completion_tokens' => 2048,
    ],
    
    'claude' => [
        'label' => 'anthropic',
        'api_key' => env('ANTHROPIC_API_KEY'),
        'driver' => \LarAgent\Drivers\Anthropic\ClaudeDriver::class,
    ],
],

Per-Agent Configuration

Override the driver directly in your agent:
use LarAgent\Drivers\OpenAi\OpenAiCompatible;

class MyAgent extends Agent
{
    protected $driver = OpenAiCompatible::class;
    protected $provider = 'custom-provider';
}
Agent-level driver configuration overrides the global provider settings.

Fallback Provider

Configure a fallback provider that activates when the primary provider fails:
config/laragent.php
return [
    'providers' => [
        'default' => [
            'label' => 'openai',
            'model' => 'gpt-4o-mini',
            'api_key' => env('OPENAI_API_KEY'),
            'driver' => \LarAgent\Drivers\OpenAi\OpenAiDriver::class,
        ],
        
        'gemini' => [
            'label' => 'gemini',
            'model' => 'gemini-2.0-flash',
            'api_key' => env('GEMINI_API_KEY'),
            'driver' => \LarAgent\Drivers\Gemini\GeminiDriver::class,
        ],
    ],
    
    // Use Gemini as fallback when OpenAI fails
    'fallback_provider' => 'gemini',
];
Always set a model in the fallback provider configuration to ensure it works correctly.

Driver Reference

| Driver | Class | Provider Key |
| --- | --- | --- |
| OpenAI | LarAgent\Drivers\OpenAi\OpenAiDriver | default |
| OpenAI Compatible | LarAgent\Drivers\OpenAi\OpenAiCompatible | (custom) |
| Anthropic | LarAgent\Drivers\Anthropic\ClaudeDriver | claude |
| Gemini | LarAgent\Drivers\Gemini\GeminiDriver | gemini |
| Groq | LarAgent\Drivers\Groq\GroqDriver | groq |
| Ollama | LarAgent\Drivers\OpenAi\OllamaDriver | ollama |
| OpenRouter | LarAgent\Drivers\OpenAi\OpenAiCompatible | openrouter |
The OpenAiCompatible driver works with any API that follows the OpenAI format, making it easy to integrate with custom or self-hosted solutions.
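
As a sketch, a self-hosted OpenAI-compatible endpoint could be registered as an extra provider entry. Everything here — the 'my-llm' key, the env variable names, and the `api_url` option — is illustrative; confirm the exact key names against your published config/laragent.php:

```php
// config/laragent.php — hypothetical entry for a self-hosted endpoint
'my-llm' => [
    'label'   => 'my-llm',
    'api_key' => env('MY_LLM_API_KEY'),
    'api_url' => env('MY_LLM_API_URL'), // e.g. http://localhost:8000/v1
    'driver'  => \LarAgent\Drivers\OpenAi\OpenAiCompatible::class,
    'model'   => 'my-model',
],
```

An agent would then select it with `protected $provider = 'my-llm';`.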

Best Practices

Store API keys in environment variables — never hardcode them.
Configure a fallback provider for production reliability.
Not all providers support the same features. Check provider documentation for streaming, tool calling, and structured output support.

Next Steps