Frameworks

Using OpenRouter with Frameworks

You can find examples of using OpenRouter with other frameworks in this GitHub repository. Here are a few:

Using the OpenAI SDK

  • Using pip install openai: example on GitHub.
  • Using npm i openai: example on GitHub.

    You can also use Grit to automatically migrate your code. Simply run `npx @getgrit/launcher openrouter`.

```typescript
import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY,
  defaultHeaders: {
    "HTTP-Referer": "<YOUR_SITE_URL>", // Optional. Site URL for rankings on openrouter.ai.
    "X-Title": "<YOUR_SITE_NAME>", // Optional. Site title for rankings on openrouter.ai.
  },
});

async function main() {
  const completion = await openai.chat.completions.create({
    model: "openai/gpt-4o",
    messages: [{ role: "user", content: "Say this is a test" }],
  });

  console.log(completion.choices[0].message);
}

main();
```

Using LangChain

```typescript
import { ChatOpenAI } from '@langchain/openai';

const chat = new ChatOpenAI(
  {
    modelName: '<model_name>',
    temperature: 0.8,
    streaming: true,
    openAIApiKey: process.env.OPENROUTER_API_KEY,
  },
  {
    basePath: 'https://openrouter.ai/api/v1',
    baseOptions: {
      headers: {
        'HTTP-Referer': '<YOUR_SITE_URL>', // Optional. Site URL for rankings on openrouter.ai.
        'X-Title': '<YOUR_SITE_NAME>', // Optional. Site title for rankings on openrouter.ai.
      },
    },
  },
);
```

Using PydanticAI

PydanticAI provides a high-level interface for working with various LLM providers, including OpenRouter.

Installation

```shell
pip install 'pydantic-ai-slim[openai]'
```

Configuration

You can use OpenRouter with PydanticAI through its OpenAI-compatible interface:

```python
import asyncio

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel

model = OpenAIModel(
    "anthropic/claude-3.5-sonnet",  # or any other OpenRouter model
    base_url="https://openrouter.ai/api/v1",
    api_key="sk-or-...",
)

agent = Agent(model)


async def main():
    result = await agent.run("What is the meaning of life?")
    print(result)


asyncio.run(main())
```

For more details about using PydanticAI with OpenRouter, see the PydanticAI documentation.


Vercel AI SDK

You can use the Vercel AI SDK to integrate OpenRouter with your Next.js app. To get started, install @openrouter/ai-sdk-provider:

```shell
npm install @openrouter/ai-sdk-provider
```

Then you can use the streamText() API to stream text from OpenRouter.

```typescript
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { streamText } from 'ai';
import { z } from 'zod';

export const getLasagnaRecipe = async (modelName: string) => {
  const openrouter = createOpenRouter({
    apiKey: process.env.OPENROUTER_API_KEY,
  });

  const response = streamText({
    model: openrouter(modelName),
    prompt: 'Write a vegetarian lasagna recipe for 4 people.',
  });

  await response.consumeStream();
  return response.text;
};

export const getWeather = async (modelName: string) => {
  const openrouter = createOpenRouter({
    apiKey: process.env.OPENROUTER_API_KEY,
  });

  const response = streamText({
    model: openrouter(modelName),
    prompt: 'What is the weather in San Francisco, CA in Fahrenheit?',
    tools: {
      getCurrentWeather: {
        description: 'Get the current weather in a given location',
        parameters: z.object({
          location: z
            .string()
            .describe('The city and state, e.g. San Francisco, CA'),
          unit: z.enum(['celsius', 'fahrenheit']).optional(),
        }),
        execute: async ({ location, unit = 'celsius' }) => {
          // Mock response for the weather
          const weatherData: Record<
            string,
            { celsius: string; fahrenheit: string }
          > = {
            'Boston, MA': {
              celsius: '15°C',
              fahrenheit: '59°F',
            },
            'San Francisco, CA': {
              celsius: '18°C',
              fahrenheit: '64°F',
            },
          };

          const weather = weatherData[location];
          if (!weather) {
            return `Weather data for ${location} is not available.`;
          }

          return `The current weather in ${location} is ${weather[unit]}.`;
        },
      },
    },
  });

  await response.consumeStream();
  return response.text;
};
```

Mastra

Integrate OpenRouter with Mastra to access a variety of AI models through a unified interface. This guide provides complete examples from basic setup to advanced configurations.

Step 1: Initialize a new Mastra project

The simplest way to start is using the automatic project creation:

```shell
# Create a new project using create-mastra
npx create-mastra@latest
```

You’ll be guided through prompts to set up your project. For this example, select:

  • Name your project: my-mastra-openrouter-app
  • Components: Agents (recommended)
  • For the default provider, select OpenAI (recommended); we’ll configure OpenRouter manually later
  • Optionally include example code

For detailed instructions on setting up a Mastra project manually or adding Mastra to an existing project, refer to the official Mastra documentation.

Step 2: Configure your environment variables

After creating your project with create-mastra, you’ll find a .env.development file in your project root. Since we selected OpenAI during setup but will be using OpenRouter instead:

  1. Open the .env.development file
  2. Remove or comment out the OPENAI_API_KEY line
  3. Add your OpenRouter API key:
```shell
# .env.development
# OPENAI_API_KEY=your-openai-key # Comment out or remove this line
OPENROUTER_API_KEY=sk-or-your-api-key-here
```

You can also remove the @ai-sdk/openai package since we’ll be using OpenRouter instead:

```shell
npm uninstall @ai-sdk/openai
npm install @openrouter/ai-sdk-provider
```

Step 3: Configure your agent to use OpenRouter

After setting up your Mastra project, you’ll need to modify the agent files to use OpenRouter instead of the default OpenAI provider.

If you used create-mastra, you’ll likely have a file at src/mastra/agents/agent.ts or similar. Replace its contents with:

```typescript
import { Agent } from '@mastra/core/agent';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';

// Initialize OpenRouter provider
const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

// Create an agent
export const assistant = new Agent({
  model: openrouter('anthropic/claude-3-opus'),
  name: 'Assistant',
  instructions:
    'You are a helpful assistant with expertise in technology and science.',
});
```

Also make sure to update your Mastra entry point at src/mastra/index.ts to use your renamed agent:

```typescript
import { Mastra } from '@mastra/core';

import { assistant } from './agents/agent'; // Update the import path if you used a different filename

export const mastra = new Mastra({
  agents: { assistant }, // Use the same name here as you exported from your agent file
});
```

Step 4: Running the Application

Once you’ve configured your agent to use OpenRouter, you can run the Mastra development server:

```shell
npm run dev
```

This will start the Mastra development server and make your agent available at:

  • REST API endpoint: http://localhost:4111/api/agents/assistant/generate
  • Interactive playground: http://localhost:4111

The Mastra playground provides a user-friendly interface where you can interact with your agent and test its capabilities without writing any additional code.

You can also test the API endpoint using curl if needed:

```shell
curl -X POST http://localhost:4111/api/agents/assistant/generate \
  -H "Content-Type: application/json" \
  -d '{"messages": ["What are the latest advancements in quantum computing?"]}'
```
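The same endpoint can also be called from code. A minimal Python sketch using only the standard library; it sends the same JSON payload as the curl command above (the response shape and the locally running dev server are assumptions):

```python
import json
import urllib.request

AGENT_URL = "http://localhost:4111/api/agents/assistant/generate"


def build_request(messages: list) -> urllib.request.Request:
    """Build a POST request carrying the same JSON payload as the curl example."""
    body = json.dumps({"messages": messages}).encode("utf-8")
    return urllib.request.Request(
        AGENT_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def ask(question: str) -> str:
    """Send one question to the local Mastra dev server (assumes `npm run dev` is running)."""
    req = build_request([question])
    with urllib.request.urlopen(req) as resp:
        # The "text" field of the response is an assumption about the endpoint's shape.
        return json.loads(resp.read())["text"]
```

Calling `ask("What are the latest advancements in quantum computing?")` mirrors the curl invocation.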

Basic Integration with Mastra

The simplest way to integrate OpenRouter with Mastra is by using the OpenRouter AI provider with Mastra’s Agent system:

```typescript
import { Agent } from '@mastra/core/agent';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';

// Initialize the OpenRouter provider
const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

// Create an agent using OpenRouter
const assistant = new Agent({
  model: openrouter('anthropic/claude-3-opus'),
  name: 'Assistant',
  instructions: 'You are a helpful assistant.',
});

// Generate a response
const response = await assistant.generate([
  {
    role: 'user',
    content: 'Tell me about renewable energy sources.',
  },
]);

console.log(response.text);
```

Advanced Configuration

For more control over your OpenRouter requests, you can pass additional configuration options:

```typescript
import { Agent } from '@mastra/core/agent';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';

// Initialize with advanced options
const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
  extraBody: {
    reasoning: {
      max_tokens: 10,
    },
  },
});

// Create an agent with model-specific options
const chefAgent = new Agent({
  model: openrouter('anthropic/claude-3.7-sonnet:thinking', {
    extraBody: {
      reasoning: {
        max_tokens: 10,
      },
    },
  }),
  name: 'Chef',
  instructions: 'You are a chef assistant specializing in French cuisine.',
});
```

Provider-Specific Options

You can also pass provider-specific options in your requests:

```typescript
// Get a response with provider-specific options
const response = await chefAgent.generate([
  {
    role: 'system',
    content:
      'You are Chef Michel, a culinary expert specializing in ketogenic (keto) diet...',
    providerOptions: {
      // Provider-specific options - key can be 'anthropic' or 'openrouter'
      anthropic: {
        cacheControl: { type: 'ephemeral' },
      },
    },
  },
  {
    role: 'user',
    content: 'Can you suggest a keto breakfast?',
  },
]);
```

Using Multiple Models with OpenRouter

OpenRouter gives you access to various models from different providers. Here’s how to use multiple models:

```typescript
import { Agent } from '@mastra/core/agent';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

// Create agents using different models
const claudeAgent = new Agent({
  model: openrouter('anthropic/claude-3-opus'),
  name: 'ClaudeAssistant',
  instructions: 'You are a helpful assistant powered by Claude.',
});

const gptAgent = new Agent({
  model: openrouter('openai/gpt-4'),
  name: 'GPTAssistant',
  instructions: 'You are a helpful assistant powered by GPT-4.',
});

// Use different agents based on your needs
const claudeResponse = await claudeAgent.generate([
  {
    role: 'user',
    content: 'Explain quantum mechanics simply.',
  },
]);
console.log(claudeResponse.text);

const gptResponse = await gptAgent.generate([
  {
    role: 'user',
    content: 'Explain quantum mechanics simply.',
  },
]);
console.log(gptResponse.text);
```

Resources

For more information and detailed documentation, check out these resources: