LLM Drivers allow you to switch between different AI providers (like OpenAI,
Ollama, or OpenRouter) without changing your application code, providing
flexibility and vendor independence.
Understanding LLM Drivers
LLM Drivers provide a standardized interface for interacting with different language model providers, so you can switch between providers without changing your application code. The built-in drivers implement this interface and give you a consistent way to work with various AI APIs. Check Creating Custom Drivers for more details about building your own.

Available Drivers
OpenAiDriver
The default driver for the OpenAI API. It works with minimal configuration: just add your OPENAI_API_KEY to your .env file.

OpenAiCompatible
Works with any OpenAI-compatible API, allowing you to use alternative
providers with the same API format.
GeminiDriver
Works with the Google Gemini API via an OpenAI-compatible endpoint.
ClaudeDriver
Works with the Anthropic API, located at LarAgent\Drivers\Anthropic\ClaudeDriver. Add ANTHROPIC_API_KEY to your .env file and use the “claude” provider in your agent.

GroqDriver
Works with the Groq platform API, located at LarAgent\Drivers\Groq\GroqDriver. Simply add GROQ_API_KEY to your .env file and use the “groq” provider in your agents.

OllamaDriver
Works with the Ollama platform API. Use the “ollama” provider in your agents with any models you have installed locally with Ollama.
OpenRouter
Works with the OpenRouter API and supports app attribution by adding referer and title keys to the provider settings. By default, both are set to LarAgent.

Configuring Drivers
You can configure LLM drivers in two ways. All drivers come pre-configured as providers in config/laragent.php; you can use them right away or delete the ones you don't need.

1. Global Configuration
Set drivers in the configuration file inside the provider settings (config/laragent.php):
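As a sketch, here is what a provider entry can look like. The option keys below (label, api_key, model, driver, and the default_* limits) follow the package's published defaults; check your own config/laragent.php for the exact set.

```php
// config/laragent.php (sketch; keys follow the package's published defaults)
return [
    'default_driver' => \LarAgent\Drivers\OpenAi\OpenAiDriver::class,
    'default_provider' => 'default',

    'providers' => [
        'default' => [
            'label' => 'openai',
            'api_key' => env('OPENAI_API_KEY'),
            'model' => 'gpt-4o-mini',
            'driver' => \LarAgent\Drivers\OpenAi\OpenAiDriver::class,
            'default_context_window' => 50000,
            'default_max_completion_tokens' => 1000,
            'default_temperature' => 1,
        ],
    ],
];
```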
2. Per-Agent Configuration
Set the driver directly in your agent class (app/AiAgents/YourAgent.php):
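A minimal sketch, assuming LarAgent's agent conventions (a protected $driver property on the agent class); verify the property name against your installed version:

```php
<?php

namespace App\AiAgents;

use LarAgent\Agent;
use LarAgent\Drivers\OpenAi\OpenAiCompatible;

class YourAgent extends Agent
{
    // Overrides the driver from config/laragent.php for this agent only
    protected $driver = OpenAiCompatible::class;

    protected $model = 'gpt-4o-mini';

    public function instructions()
    {
        return 'You are a helpful assistant.';
    }
}
```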
If you set the driver in the agent class, it will override the global
configuration.
Example Configurations
Ollama (Local LLM)
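A possible provider entry for a local Ollama install, assuming Ollama is serving its OpenAI-compatible endpoint on the default port; the provider key, URL, and model below are illustrative:

```php
// config/laragent.php, inside 'providers' (sketch)
'ollama' => [
    'label' => 'ollama-local',
    'driver' => \LarAgent\Drivers\OpenAi\OpenAiCompatible::class,
    'api_key' => 'ollama', // Ollama ignores the key, but the driver expects one
    'api_url' => 'http://localhost:11434/v1',
    'model' => 'llama3.2', // any model you have pulled locally
],
```

An agent then selects it with protected $provider = 'ollama';.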
OpenRouter
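A sketch using the OpenAI-compatible driver with OpenRouter's endpoint, including the referer and title attribution keys described above; the URL and model are illustrative:

```php
// config/laragent.php, inside 'providers' (sketch)
'openrouter' => [
    'label' => 'openrouter',
    'driver' => \LarAgent\Drivers\OpenAi\OpenAiCompatible::class,
    'api_key' => env('OPENROUTER_API_KEY'),
    'api_url' => 'https://openrouter.ai/api/v1',
    'model' => 'openai/gpt-4o-mini',
    // App attribution; both default to "LarAgent" when omitted
    'referer' => 'https://your-app.example',
    'title' => 'Your App',
],
```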
Gemini
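A sketch for Gemini; the driver class name below is an assumption based on the OpenAI-compatible endpoint noted above, so confirm it against the package source:

```php
// config/laragent.php, inside 'providers' (sketch)
'gemini' => [
    'label' => 'gemini',
    'driver' => \LarAgent\Drivers\OpenAi\GeminiDriver::class, // assumed class name
    'api_key' => env('GEMINI_API_KEY'),
    'model' => 'gemini-2.0-flash', // illustrative model name
],
```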
The Gemini driver doesn’t support streaming yet.
Fallback Provider
There is a fallback provider that is used when the current provider fails to process a request. By default, it's set to null:
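```php
// config/laragent.php
'fallback_provider' => null,
```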
You can set any provider as the fallback by replacing null with your provider name:
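```php
// config/laragent.php; 'ollama' stands in for any key under 'providers'
'fallback_provider' => 'ollama',
```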
Example configuration
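A sketch combining the pieces above; provider keys and option values are illustrative:

```php
// config/laragent.php (sketch)
'fallback_provider' => 'fallback',

'providers' => [
    'default' => [
        'label' => 'openai',
        'api_key' => env('OPENAI_API_KEY'),
        'driver' => \LarAgent\Drivers\OpenAi\OpenAiDriver::class,
        'model' => 'gpt-4.1',
    ],
    'fallback' => [
        'label' => 'openai-fallback',
        'api_key' => env('OPENAI_API_KEY'),
        'driver' => \LarAgent\Drivers\OpenAi\OpenAiDriver::class,
        // Recommended: pin a 'model' here, since this is the model
        // requests fall back to when the primary provider fails
        'model' => 'gpt-4o-mini',
    ],
],
```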
It is recommended to have a ‘model’ set in the provider that is used as a fallback. You can disable the fallback feature by setting fallback_provider to null in the configuration file, or just by removing it.
LLM Drivers Architecture
The LLM Driver architecture handles three key responsibilities, sketched after this list:
- Tool Registration - Register function-calling tools that can be used by the LLM
- Response Schema - Define structured output formats for LLM responses
- Tool Call Formatting - Abstract away provider-specific formats for tool calls and results
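Conceptually, these responsibilities map onto methods of a driver. The sketch below is illustrative only; the method names are assumptions, not LarAgent's actual contract:

```php
<?php

// Conceptual sketch only; method names are assumptions, not the real contract.
interface DriverResponsibilitiesSketch
{
    // Tool Registration: expose a function-calling tool to the LLM
    public function registerTool(object $tool): void;

    // Response Schema: constrain responses to a structured output format
    public function setResponseSchema(array $schema): void;

    // Tool Call Formatting: translate a provider-specific tool-call payload
    // into a provider-agnostic message (and tool results back again)
    public function toolCallToMessage(array $toolCall): array;
}
```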
Creating Custom Drivers
If you need to integrate with an AI provider that doesn't have a built-in driver, you can create your own by implementing the LlmDriver interface:
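A minimal skeleton, assuming the contract lives under the package's core contracts namespace (verify the exact namespace and required methods against your installed version, or see Creating Custom Drivers):

```php
<?php

namespace App\LlmDrivers;

// The contract's namespace here is an assumption; check the package source
use LarAgent\Core\Contracts\LlmDriver;

class MyProviderDriver implements LlmDriver
{
    // Implement the methods required by your LarAgent version:
    // sending messages to the provider's API, registering tools,
    // defining response schemas, and formatting tool calls/results.
}
```

Then point a provider at it in config/laragent.php with 'driver' => \App\LlmDrivers\MyProviderDriver::class.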
Best Practices
Do store API keys in environment variables, never hardcode them
Do set reasonable defaults for context window and token limits
Do consider implementing fallback mechanisms between providers