Providers

Toolpack SDK supports multiple AI providers out of the box. You use your own API keys directly; there are no subscription plans or middleman services.

Built-in Providers

Provider    Models                               API Key Env Variable
OpenAI      GPT-4o, o1, o3, and more             OPENAI_API_KEY
Anthropic   Claude Sonnet, Haiku, Opus, and more ANTHROPIC_API_KEY
Gemini      Gemini Flash, Pro, and more          GEMINI_API_KEY
Ollama      Any locally installed model          None (local)

Single Provider Setup

The simplest configuration uses one provider:

import { Toolpack } from 'toolpack-sdk';

const toolpack = await Toolpack.init({
  provider: 'openai',
  model: 'gpt-4o', // optional default model
});

The SDK automatically reads API keys from environment variables:

  • OPENAI_API_KEY or TOOLPACK_OPENAI_KEY
  • ANTHROPIC_API_KEY or TOOLPACK_ANTHROPIC_KEY
  • GEMINI_API_KEY or TOOLPACK_GEMINI_KEY
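
Conceptually, the lookup can be sketched like this. Note that `resolveApiKey` is a hypothetical helper for illustration, not an SDK export, and the precedence shown (standard variable before the `TOOLPACK_`-prefixed one) is an assumption:

```typescript
type ProviderName = 'openai' | 'anthropic' | 'gemini';

// Hypothetical helper illustrating the documented env-var lookup.
// Assumption: the standard variable (e.g. OPENAI_API_KEY) is checked
// before the TOOLPACK_-prefixed fallback (e.g. TOOLPACK_OPENAI_KEY).
function resolveApiKey(
  provider: ProviderName,
  env: Record<string, string | undefined>,
): string | undefined {
  const upper = provider.toUpperCase();
  return env[`${upper}_API_KEY`] ?? env[`TOOLPACK_${upper}_KEY`];
}
```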

Or pass the key directly:

const toolpack = await Toolpack.init({
  provider: 'anthropic',
  apiKey: 'sk-ant-...',
});

Multi-Provider Setup

Configure multiple providers and switch between them:

const toolpack = await Toolpack.init({
  providers: {
    openai: {
      /* Optional - apiKey: process.env.OPENAI_API_KEY, */
    },
    anthropic: {
      /* Optional - apiKey: process.env.ANTHROPIC_API_KEY, */
    },
    gemini: {
      /* Optional - apiKey: process.env.GEMINI_API_KEY, */
    },
  },
  defaultProvider: 'openai',
});

// Use default provider (OpenAI)
await toolpack.generate('Hello!');

// Switch to Anthropic
toolpack.setProvider('anthropic');
await toolpack.generate('Hello from Claude!');

// Or specify per-request
await toolpack.generate(
  { messages: [{ role: 'user', content: 'Hello!' }], model: 'gpt-4o' },
  'openai' // provider name
);

Ollama (Local Models)

Ollama runs models locally, so no API key is required:

const toolpack = await Toolpack.init({
  provider: 'ollama',
});

// List available local models
const providers = await toolpack.listProviders();
console.log(providers);

// Use a specific model
const response = await toolpack.generate({
  messages: [{ role: 'user', content: 'Hello!' }],
  model: 'llama3.2',
});

Configure a custom Ollama host:

const toolpack = await Toolpack.init({
  providers: {
    ollama: {
      baseUrl: 'http://192.168.1.100:11434',
    },
  },
  defaultProvider: 'ollama',
});

Custom Base URLs

Use OpenAI-compatible APIs (like Azure OpenAI, local proxies, or other compatible services):

const toolpack = await Toolpack.init({
  providers: {
    openai: {
      apiKey: 'your-key',
      baseUrl: 'https://your-openai-compatible-api.com/v1',
    },
  },
  defaultProvider: 'openai',
});

Listing Available Models

Get all models from all configured providers:

// List all providers and their models
const providers = await toolpack.listProviders();
for (const provider of providers) {
  console.log(`${provider.displayName}:`);
  for (const model of provider.models) {
    console.log(`  - ${model.id} (${model.displayName})`);
  }
}

// Or get a flat list of all models
const models = await toolpack.listModels();
console.log(models);

Custom Providers

You can create and inject your own provider adapters. See Custom Providers for details.
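
As a rough illustration, an adapter could look like the following. The interface name, its shape, and the registration call are all hypothetical; the SDK's real adapter contract is documented in Custom Providers:

```typescript
// Hypothetical adapter shape; the SDK defines the real interface.
interface ProviderAdapter {
  name: string;
  generate(prompt: string, model?: string): Promise<string>;
}

// A trivial adapter that echoes its input, handy for offline testing.
const echoProvider: ProviderAdapter = {
  name: 'echo',
  async generate(prompt) {
    return `echo: ${prompt}`;
  },
};

// Illustrative registration only; check Custom Providers for the real API:
// const toolpack = await Toolpack.init({ providers: { echo: echoProvider } });
```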