This document describes the feature introduced in v0.5 and explains how to expose your agents through an OpenAI-compatible endpoint.
`LarAgent\API\Completions` handles OpenAI-compatible chat completion requests. The class expects a valid `Illuminate\Http\Request` and an agent class name.
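For example, assuming a hypothetical `App\AiAgents\MyAgent` agent class and an incoming `$request`:

```php
use App\AiAgents\MyAgent; // hypothetical agent class, used only for illustration
use LarAgent\API\Completions;

// $request is the incoming Illuminate\Http\Request.
$response = Completions::make($request, MyAgent::class);
```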
`$response` is either an array (for non-streaming responses) or a `Generator` yielding chunks (for streaming responses).
On top of the `Completions` class, we provide abstract base controllers so that you can create endpoints quickly by extending them.
Both base controllers include a `completion(Request $request)` method that delegates the work to `Completions::make()` and automatically handles SSE streaming or JSON responses compatible with the OpenAI API. Your routes can then point at the `completion` and `models` methods, as shown below.
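For example, assuming a controller named `MyAgentController` (defined in the steps below), the routes could look like this; the paths are illustrative:

```php
use App\Http\Controllers\MyAgentController; // created by extending a base controller, see below
use Illuminate\Support\Facades\Route;

// OpenAI-compatible endpoints served by the base controller methods.
Route::post('/v1/chat/completions', [MyAgentController::class, 'completion']);
Route::get('/v1/models', [MyAgentController::class, 'models']);
```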
Once you have created your agent, three steps are enough to expose it via the API.
Extend `SingleAgentController` when exposing a single agent:
- Set the `protected ?string $agentClass` property to specify the agent class.
- Set the `protected ?array $models` property to specify the models.
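A sketch of such a controller; the class name, agent class, model names, and the base controller's import path are assumptions to adjust for your app:

```php
namespace App\Http\Controllers;

use App\AiAgents\MyAgent; // hypothetical agent class
use LarAgent\API\SingleAgentController; // adjust to the actual namespace in your installed version

class MyAgentController extends SingleAgentController
{
    // The single agent exposed by this controller.
    protected ?string $agentClass = MyAgent::class;

    // Models to report via the models() method.
    protected ?array $models = ['gpt-4o-mini'];
}
```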
Extend `MultiAgentController` when exposing multiple agents:
- Set the `protected ?array $agents` property to specify the agent classes.
- Set the `protected ?array $models` property to specify the models.
- The model is specified as `AgentName/model` or as `AgentName` (the default model defined in the Agent class or by the provider is then used).
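A sketch of a multi-agent controller; the class names and agent classes are illustrative, the base controller's import path may differ in your install, and the `AgentName/model` shape of `$models` is an assumption based on the naming described above:

```php
namespace App\Http\Controllers;

use App\AiAgents\ChatAgent;    // hypothetical agent classes
use App\AiAgents\SupportAgent;
use LarAgent\API\MultiAgentController; // adjust to the actual namespace in your installed version

class AgentsController extends MultiAgentController
{
    // Agents exposed by this controller.
    protected ?array $agents = [
        ChatAgent::class,
        SupportAgent::class,
    ];

    // Models reported by the models() method; clients may also send just "ChatAgent".
    protected ?array $models = [
        'ChatAgent/gpt-4o-mini',
        'SupportAgent/gpt-4o',
    ];
}
```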
To customize how the session ID is determined, override the `setSessionId` method in your `SingleAgentController` or `MultiAgentController` subclass.
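As an illustration only; the method signature below is assumed, so check the base controller in your installed version, and the user-based key is a hypothetical strategy:

```php
// Inside a controller extending SingleAgentController or MultiAgentController.
// Assumed signature; verify against the base controller before using.
protected function setSessionId(): string
{
    // Derive the session ID from the authenticated user (hypothetical strategy).
    return 'api-chat-' . (auth()->id() ?? 'guest');
}
```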
"stream": true
in request returns a text/event-stream
where each chunk matches the OpenAI format and includes:
usage
data is included only in the last chunk as in OpenAI API.Completions::make()
For custom implementations, you can use `Completions::make()` directly.
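A sketch of what that could look like in your own controller, handling both response shapes; the controller, agent class, route wiring, and SSE details below are illustrative, not the package's required approach:

```php
namespace App\Http\Controllers;

use App\AiAgents\MyAgent; // hypothetical agent class
use Generator;
use Illuminate\Http\Request;
use LarAgent\API\Completions;
use Symfony\Component\HttpFoundation\StreamedResponse;

class CustomCompletionController extends Controller
{
    public function completion(Request $request)
    {
        $response = Completions::make($request, MyAgent::class);

        // Streaming: emit each chunk as an SSE "data:" line, then a [DONE] marker,
        // assuming each chunk is an array shaped like an OpenAI chat.completion.chunk.
        if ($response instanceof Generator) {
            return new StreamedResponse(function () use ($response) {
                foreach ($response as $chunk) {
                    echo 'data: ' . json_encode($chunk) . "\n\n";
                    if (ob_get_level() > 0) {
                        ob_flush();
                    }
                    flush();
                }
                echo "data: [DONE]\n\n";
            }, 200, [
                'Content-Type' => 'text/event-stream',
                'Cache-Control' => 'no-cache',
            ]);
        }

        // Non-streaming: the array already follows the OpenAI response shape.
        return response()->json($response);
    }
}
```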