Use the built-in provider or extend to implement new ones.
We currently offer first-party support for these leading AI providers:
Anthropic
OpenAI
Mistral
Implement new providers
If you want to create a new provider, you have to implement the AIProviderInterface interface:
namespace NeuronAI\Providers;
use NeuronAI\Messages\Message;
use NeuronAI\Tools\ToolInterface;
interface AIProviderInterface
{
/**
* Send predefined instruction to the LLM.
*
* @param ?string $prompt
* @return AIProviderInterface
*/
public function systemPrompt(?string $prompt): AIProviderInterface;
/**
* Set the tools to be exposed to the LLM.
*
* @param array<ToolInterface> $tools
* @return AIProviderInterface
*/
public function setTools(array $tools): AIProviderInterface;
/**
* Send a prompt to the AI agent.
*
* @param Message|array $prompt
* @return Message
*/
public function chat(Message|array $prompt): Message;
}
The chat method should contain the call to the underlying LLM. If the provider doesn't support tools and function calls, you can implement setTools as a placeholder.
This is the basic template for a new AI provider implementation.
namespace App\AI\Providers;
use GuzzleHttp\Client;
use GuzzleHttp\RequestOptions;
use NeuronAI\Messages\AssistantMessage;
use NeuronAI\Messages\Message;
use NeuronAI\Providers\AIProviderInterface;
class MyAIProvider implements AIProviderInterface
{
/**
* The http client.
*
* @var Client
*/
protected Client $client;
/**
* System instructions.
*
* @var ?string
*/
protected ?string $system = null;
/**
* MyAIProvider constructor.
*
* @param string $key
* @param string $model
*/
public function __construct(
protected string $key,
protected string $model
) {
$this->client = new Client([
'base_uri' => 'https://api.llm-provider.com/v1',
'headers' => [
'Content-Type' => 'application/json',
'Authorization' => "Bearer {$this->key}",
]
]);
}
/**
* @inheritDoc
*/
public function systemPrompt(?string $prompt): AIProviderInterface
{
$this->system = $prompt;
return $this;
}
/**
* @inheritDoc
*/
public function chat(Message|array $prompt): Message
{
if ($prompt instanceof Message) {
$prompt = [
$prompt
];
}
$result = $this->client->post('chat', [
RequestOptions::JSON => [
'model' => $this->model,
'messages' => $prompt
]
])->getBody()->getContents();
$result = \json_decode($result, true);
return new AssistantMessage($result['content']);
}
/**
* @inheritDoc
*/
public function setTools(array $tools): AIProviderInterface
{
// This provider doesn't support tools and function calls,
// so this method is just a placeholder.
return $this;
}
}
After creating your own implementation you can use it in the agent:
namespace App\AI\Agents;
use App\AI\Providers\MyAIProvider;
use NeuronAI\Agent;
use NeuronAI\Providers\AIProviderInterface;
class MyAgent extends Agent
{
public function provider(): AIProviderInterface
{
return new MyAIProvider(
key: 'PROVIDER_API_KEY',
model: 'PROVIDER_MODEL',
);
}
}
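As a sketch, the agent above could then be used like this. Note that the static make() factory, the UserMessage argument, and the getContent() accessor are assumptions about the framework's public API; check the Agent class in your installed version, and replace the prompt text with your own.

```php
<?php

use App\AI\Agents\MyAgent;
use NeuronAI\Messages\UserMessage;

// Hypothetical usage: the agent routes the message through
// MyAIProvider::chat() and returns an AssistantMessage.
$response = MyAgent::make()->chat(
    new UserMessage('Hello, who are you?')
);

// Assumed accessor for the message text.
echo $response->getContent();
```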
We strongly recommend submitting new provider implementations via a PR on the official repository, or through other Inspector.dev support channels. A new implementation can receive an important boost in its advancement from the community.