With Neuron you can switch between LLM providers by changing a single line of code, with no impact on the rest of your agent implementation.
Supported providers:
Anthropic
Ollama
OpenAI
Mistral
Deepseek
use NeuronAI\Agent;
use NeuronAI\Chat\Messages\UserMessage;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Anthropic\Anthropic;
class MyAgent extends Agent
{
public function provider(): AIProviderInterface
{
return new Anthropic(
key: 'ANTHROPIC_API_KEY',
model: 'ANTHROPIC_MODEL',
);
}
}
echo MyAgent::make()->chat(new UserMessage("Hi!"));
// Hi, how can I help you today?
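The string arguments in these examples are placeholders. In a real application you would typically load the key and model from the environment or your configuration; here is a minimal sketch using PHP's getenv() (the variable names are illustrative):
return new Anthropic(
    // Assumption: credentials are exposed as environment variables with these names.
    key: getenv('ANTHROPIC_API_KEY'),
    model: getenv('ANTHROPIC_MODEL'),
);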
use NeuronAI\Agent;
use NeuronAI\Chat\Messages\UserMessage;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Ollama\Ollama;
class MyAgent extends Agent
{
public function provider(): AIProviderInterface
{
return new Ollama(
url: 'OLLAMA_URL',
model: 'OLLAMA_MODEL',
);
}
}
echo MyAgent::make()->chat(new UserMessage("Hi!"));
// Hi, how can I help you today?
use NeuronAI\Agent;
use NeuronAI\Chat\Messages\UserMessage;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\OpenAI\OpenAI;
class MyAgent extends Agent
{
public function provider(): AIProviderInterface
{
return new OpenAI(
key: 'OPENAI_API_KEY',
model: 'OPENAI_MODEL',
);
}
}
echo MyAgent::make()->chat(new UserMessage("Hi!"));
// Hi, how can I help you today?
use NeuronAI\Agent;
use NeuronAI\Chat\Messages\UserMessage;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Mistral;
class MyAgent extends Agent
{
public function provider(): AIProviderInterface
{
return new Mistral(
key: 'MISTRAL_API_KEY',
model: 'MISTRAL_MODEL',
);
}
}
echo MyAgent::make()->chat(new UserMessage("Hi!"));
// Hi, how can I help you today?
use NeuronAI\Agent;
use NeuronAI\Chat\Messages\UserMessage;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Deepseek;
class MyAgent extends Agent
{
public function provider(): AIProviderInterface
{
return new Deepseek(
key: 'DEEPSEEK_API_KEY',
model: 'DEEPSEEK_MODEL',
);
}
}
echo MyAgent::make()->chat(new UserMessage("Hi!"));
// Hi, how can I help you today?
Implement OpenAI-compatible providers
If you want to interact with an LLM provider that supports the same API format as OpenAI, you can create a dedicated class in a few lines of code by simply extending our OpenAI provider:
namespace App\AI\Providers;
use NeuronAI\Providers\OpenAI\OpenAI;
class TogetherAI extends OpenAI
{
protected string $baseUri = "https://api.together.xyz";
}
That's it.
You can now use this new provider in your Agent implementation:
use App\AI\Providers\TogetherAI;
use NeuronAI\Agent;
use NeuronAI\Providers\AIProviderInterface;
class MyAgent extends Agent
{
public function provider(): AIProviderInterface
{
return new TogetherAI(
key: 'TOGETHER_API_KEY',
model: 'TOGETHER_MODEL',
);
}
}
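You can then chat with it exactly as with the built-in providers (reusing the UserMessage class from the earlier examples):
echo MyAgent::make()->chat(new UserMessage("Hi!"));
// Hi, how can I help you today?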
Implement a new provider
If you want to create a new provider, you have to implement the AIProviderInterface:
namespace NeuronAI\Providers;
use NeuronAI\Chat\Messages\Message;
use NeuronAI\Tools\ToolInterface;
interface AIProviderInterface
{
/**
* Send predefined instruction to the LLM.
*/
public function systemPrompt(?string $prompt): AIProviderInterface;
/**
* Set the tools to be exposed to the LLM.
*
* @param array<ToolInterface> $tools
*/
public function setTools(array $tools): AIProviderInterface;
/**
* Send a prompt to the AI agent.
*/
public function chat(Message|array $messages): Message;
/**
* Yield the LLM response.
*/
public function stream(Message|array $messages, callable $executeToolsCallback): \Generator;
/**
* Schema validated response.
*/
public function structured(string $class, Message|array $messages, int $maxRetry = 1): mixed;
}
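For reference, this is roughly how an agent drives a provider through this interface; a minimal sketch (the prompt and message content are illustrative):
use NeuronAI\Chat\Messages\UserMessage;
// systemPrompt() and setTools() return the provider instance, so calls can be chained.
$provider->systemPrompt('You are a helpful assistant.')->setTools([]);
// chat() accepts a single Message or an array of messages and returns a Message.
echo $provider->chat(new UserMessage('Hi!'));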
The chat method should contain the call to the underlying LLM. If the provider doesn't support tools and function calling, you can implement the related methods with a placeholder.
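For example, inside a provider without function calling support you could reject any tools explicitly; a minimal sketch (throwing a generic \Exception is an assumption, a dedicated exception class may fit better):
public function setTools(array $tools): AIProviderInterface
{
    // Assumption: failing loudly is safer than silently ignoring the tools.
    if (\count($tools) > 0) {
        throw new \Exception('This provider does not support tool calls.');
    }
    return $this;
}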
This is the basic template for a new AI provider implementation.
namespace App\AI\Providers;
use GuzzleHttp\Client;
use GuzzleHttp\RequestOptions;
use NeuronAI\Chat\Messages\AssistantMessage;
use NeuronAI\Chat\Messages\Message;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\HandleWithTools;
class MyAIProvider implements AIProviderInterface
{
use HandleWithTools;
/**
* The http client.
*
* @var Client
*/
protected Client $client;
/**
* System instructions.
*
* @var ?string
*/
protected ?string $system = null;
public function __construct(
protected string $key,
protected string $model
) {
$this->client = new Client([
// The trailing slash matters: Guzzle resolves relative URIs against
// the base_uri, so 'chat' becomes https://api.provider.com/v1/chat.
'base_uri' => 'https://api.provider.com/v1/',
'headers' => [
'Content-Type' => 'application/json',
'Authorization' => "Bearer {$this->key}",
]
]);
}
/**
* @inheritDoc
*/
public function systemPrompt(?string $prompt): AIProviderInterface
{
$this->system = $prompt;
return $this;
}
/**
* @inheritDoc
*/
public function chat(Message|array $messages): Message
{
// Accept a single message as well as a conversation history.
$messages = \is_array($messages) ? $messages : [$messages];
$result = $this->client->post('chat', [
RequestOptions::JSON => [
'model' => $this->model,
'messages' => \array_map(function (Message $message) {
return $message->jsonSerialize();
}, $messages)
]
])->getBody()->getContents();
$result = \json_decode($result, true);
return new AssistantMessage($result['content']);
}
/**
* @inheritDoc
*/
public function stream(Message|array $messages, callable $executeToolsCallback): \Generator
{
// Implement the streaming call here, or throw if the provider doesn't support it.
throw new \Exception('Streaming is not supported by this provider.');
}
/**
* @inheritDoc
*/
public function structured(string $class, Message|array $messages, int $maxRetry = 1): mixed
{
// Implement structured output here, or throw if the provider doesn't support it.
throw new \Exception('Structured output is not supported by this provider.');
}
}
After creating your own implementation you can use it in the agent:
use App\AI\Providers\MyAIProvider;
use NeuronAI\Agent;
use NeuronAI\Providers\AIProviderInterface;
class MyAgent extends Agent
{
public function provider(): AIProviderInterface
{
return new MyAIProvider(
key: 'PROVIDER_API_KEY',
model: 'PROVIDER_MODEL',
);
}
}
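As with any other provider, you can then run the agent with a simple chat call:
use NeuronAI\Chat\Messages\UserMessage;
echo MyAgent::make()->chat(new UserMessage("Hi!"));
// Hi, how can I help you today?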
We strongly recommend submitting new provider implementations via a PR on the official repository, or through the other support channels. A new implementation can receive an important boost in its advancement from the community.