AI Provider
Interact with LLM providers or extend the framework to implement new ones.
With Neuron you can switch between LLM providers with just one line of code, without any impact on your agent implementation.
Anthropic
namespace App\Neuron;

use NeuronAI\Agent;
use NeuronAI\Chat\Messages\UserMessage;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Anthropic\Anthropic;
use NeuronAI\Providers\HttpClientOptions;

class MyAgent extends Agent
{
    protected function provider(): AIProviderInterface
    {
        return new Anthropic(
            key: 'ANTHROPIC_API_KEY',
            model: 'ANTHROPIC_MODEL',
            parameters: [], // Add custom params (temperature, logprobs, etc.)
            httpOptions: new HttpClientOptions(timeout: 30),
        );
    }
}
$response = MyAgent::make()->chat(new UserMessage("Hi!"));
echo $response->getContent();
// Hi, how can I help you today?

OpenAIResponses
This component uses the newer OpenAI Responses API:
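A minimal configuration sketch, assuming the class is named OpenAIResponses and lives under a NeuronAI\Providers\OpenAI namespace by analogy with the Anthropic example above (verify the exact class path and constructor against the framework source):

```php
use NeuronAI\Agent;
use NeuronAI\Providers\AIProviderInterface;
// Class path assumed by analogy with the Anthropic provider; check the source.
use NeuronAI\Providers\OpenAI\OpenAIResponses;

class MyAgent extends Agent
{
    protected function provider(): AIProviderInterface
    {
        return new OpenAIResponses(
            key: 'OPENAI_API_KEY',
            model: 'OPENAI_MODEL',
        );
    }
}
```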
OpenAI
This component uses the older OpenAI completions API:
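The configuration follows the same pattern as the Anthropic example; the class path NeuronAI\Providers\OpenAI\OpenAI below is an assumption to illustrate the shape:

```php
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\OpenAI\OpenAI; // class path assumed

protected function provider(): AIProviderInterface
{
    return new OpenAI(
        key: 'OPENAI_API_KEY',
        model: 'OPENAI_MODEL',
        parameters: [], // optional custom params (temperature, etc.)
    );
}
```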
AzureOpenAI
This provider lets you connect to OpenAI models hosted on the Azure cloud platform.
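A hedged sketch of the configuration. Azure deployments are addressed by a resource endpoint, a deployment name, and an API version, so the constructor parameters below (endpoint, version) and the class path are assumptions; check them against the actual class:

```php
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\OpenAI\AzureOpenAI; // class path assumed

protected function provider(): AIProviderInterface
{
    // Parameter names endpoint/version are assumptions for illustration.
    return new AzureOpenAI(
        key: 'AZURE_API_KEY',
        endpoint: 'AZURE_RESOURCE_ENDPOINT',
        model: 'AZURE_DEPLOYMENT_NAME',
        version: 'AZURE_API_VERSION',
    );
}
```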
OpenAILike
This class simplifies connecting to providers that expose the same data format as the official OpenAI API.
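Because such providers only differ in their base URL, the sketch below points the client at a custom endpoint. The class path and the baseUri parameter name are assumptions; the endpoint URL is a placeholder:

```php
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\OpenAILike; // class path assumed

protected function provider(): AIProviderInterface
{
    // baseUri is an assumed parameter name: point it at any
    // OpenAI-compatible endpoint offered by your provider.
    return new OpenAILike(
        baseUri: 'https://my-provider.example.com/v1',
        key: 'PROVIDER_API_KEY',
        model: 'MODEL_NAME',
    );
}
```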
Ollama
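Ollama serves models from a local HTTP endpoint, so a server URL typically replaces the API key. The class path, the url parameter name, and the default port below are assumptions to illustrate the idea:

```php
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Ollama\Ollama; // class path assumed

protected function provider(): AIProviderInterface
{
    // url and the default Ollama port are assumptions; adjust for your setup.
    return new Ollama(
        url: 'http://localhost:11434/api',
        model: 'llama3.1',
    );
}
```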
Gemini
Mistral
HuggingFace
Deepseek
Due to Deepseek API limitations, this provider doesn't support document and image attachments.
Grok (X-AI)
AWS Bedrock Runtime
To use the BedrockRuntime provider you need to install the aws/aws-sdk-php package.
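The package named above can be installed with Composer:

```shell
composer require aws/aws-sdk-php
```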
Custom Http Options
Providers use an HTTP client to communicate with the remote service. You can customize the configuration of the HTTP client by passing an instance of \NeuronAI\Providers\HttpClientOptions:
The HttpClientOptions class allows you to customize timeout, connect_timeout, and headers.
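A sketch combining the three options named above. The timeout parameter appears in the Anthropic example earlier; the exact constructor names for the other two are assumptions inferred from the description:

```php
use NeuronAI\Providers\HttpClientOptions;

// connect_timeout and headers parameter names are assumptions;
// check the HttpClientOptions constructor for the real signature.
$options = new HttpClientOptions(
    timeout: 30,          // total request timeout, in seconds
    connect_timeout: 10,  // time allowed to establish the connection
    headers: ['X-Request-Source' => 'my-app'], // extra request headers
);
```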
Implement a custom provider
If you want to create a new provider you have to implement the AIProviderInterface:
The chat method should contain the call to the underlying LLM. If the provider doesn't support tools and function calls, you can implement it with a placeholder.
This is the basic template for a new AI provider implementation.
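A minimal sketch of what such an implementation might look like. The interface methods are not listed on this page, so the signatures below (chat, setTools) are assumptions chosen to mirror the description above; copy the real method list from AIProviderInterface in the framework source:

```php
namespace App\Neuron\Providers;

use NeuronAI\Chat\Messages\Message;
use NeuronAI\Providers\AIProviderInterface;

class MyCustomProvider implements AIProviderInterface
{
    public function __construct(
        protected string $key,
        protected string $model,
    ) {}

    // The chat method contains the call to the underlying LLM.
    public function chat(array $messages): Message
    {
        // Perform the HTTP request to the LLM endpoint here and
        // map the raw response into a framework Message instance.
    }

    // If the provider doesn't support tools and function calls,
    // this can be implemented as a placeholder.
    public function setTools(array $tools): AIProviderInterface
    {
        return $this;
    }
}
```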
After creating your own implementation you can use it in the agent:
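For example, assuming a hypothetical MyCustomProvider class implementing AIProviderInterface, the agent would return it from the provider() method exactly as with the built-in providers:

```php
use NeuronAI\Agent;
use NeuronAI\Providers\AIProviderInterface;
use App\Neuron\Providers\MyCustomProvider; // hypothetical custom provider

class MyAgent extends Agent
{
    protected function provider(): AIProviderInterface
    {
        return new MyCustomProvider(
            key: 'API_KEY',
            model: 'MODEL_NAME',
        );
    }
}
```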
We strongly recommend submitting new provider implementations via a PR on the official repository, or through other Inspector.dev support channels. A new implementation can receive an important boost in its advancement from the community.