Frameworks
Using OpenRouter with Frameworks
You can find examples of using OpenRouter with other frameworks in this GitHub repository. Here are a few:
Using the OpenAI SDK
- Using `pip install openai`: github
- Using `npm i openai`: github

You can also use Grit to automatically migrate your code. Simply run `npx @getgrit/launcher openrouter`.
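For reference, here is a minimal sketch of pointing the official Python SDK at OpenRouter (the model slug is illustrative; any OpenRouter model slug works):

```python
from openai import OpenAI

# Point the standard OpenAI client at OpenRouter's API.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="<OPENROUTER_API_KEY>",
)

completion = client.chat.completions.create(
    model="openai/gpt-4o",  # illustrative model slug
    messages=[{"role": "user", "content": "Hello, world!"}],
)
print(completion.choices[0].message.content)
```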
Using LangChain
- Using LangChain for Python: github
- Using LangChain.js: github
- Using Streamlit: github
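As a quick reference, a minimal LangChain Python sketch might look like the following (assuming the `langchain-openai` package, whose `ChatOpenAI` accepts a custom `base_url`; the model slug is illustrative):

```python
from langchain_openai import ChatOpenAI

# Route ChatOpenAI through OpenRouter's OpenAI-compatible endpoint.
llm = ChatOpenAI(
    model="anthropic/claude-3.5-sonnet",  # illustrative model slug
    base_url="https://openrouter.ai/api/v1",
    api_key="<OPENROUTER_API_KEY>",
)

print(llm.invoke("What is OpenRouter?").content)
```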
Using PydanticAI
PydanticAI provides a high-level interface for working with various LLM providers, including OpenRouter.
Installation
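PydanticAI is available on PyPI:

```bash
pip install pydantic-ai
```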
Configuration
You can use OpenRouter with PydanticAI through its OpenAI-compatible interface:
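```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider

# Sketch for a recent PydanticAI version; older releases pass base_url and
# api_key directly to OpenAIModel. The model slug is illustrative.
model = OpenAIModel(
    'anthropic/claude-3.5-sonnet',
    provider=OpenAIProvider(
        base_url='https://openrouter.ai/api/v1',
        api_key='<OPENROUTER_API_KEY>',
    ),
)
agent = Agent(model)

result = agent.run_sync('What is OpenRouter?')
print(result.output)  # result.data on older PydanticAI versions
```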
For more details about using PydanticAI with OpenRouter, see the PydanticAI documentation.
Vercel AI SDK
You can use the Vercel AI SDK to integrate OpenRouter with your Next.js app. To get started, install `@openrouter/ai-sdk-provider`:
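```bash
npm install @openrouter/ai-sdk-provider
```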
You can then use the `streamText()` API to stream text from OpenRouter.
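Here is a minimal sketch (the model slug is illustrative, and streaming details vary slightly between AI SDK versions):

```typescript
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { streamText } from 'ai';

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

const result = streamText({
  model: openrouter('anthropic/claude-3.5-sonnet'), // illustrative model slug
  prompt: 'What is OpenRouter?',
});

// Print the response as it streams in.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```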
Mastra
Integrate OpenRouter with Mastra to access a variety of AI models through a unified interface. This guide provides complete examples from basic setup to advanced configurations.
Step 1: Initialize a new Mastra project
The simplest way to start is using the automatic project creation:
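```bash
npx create-mastra@latest
```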
You’ll be guided through prompts to set up your project. For this example, select:
- Name your project: my-mastra-openrouter-app
- Components: Agents (recommended)
- For default provider, select OpenAI (recommended) - we’ll configure OpenRouter manually later
- Optionally include example code
For detailed instructions on setting up a Mastra project manually or adding Mastra to an existing project, refer to the official Mastra documentation.
Step 2: Configure your environment variables
After creating your project with `create-mastra`, you’ll find a `.env.development` file in your project root. Since we selected OpenAI during setup but will be using OpenRouter instead:

- Open the `.env.development` file
- Remove or comment out the `OPENAI_API_KEY` line
- Add your OpenRouter API key:
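```
# .env.development
# Variable name assumed from the @openrouter/ai-sdk-provider convention.
OPENROUTER_API_KEY=your-openrouter-api-key
```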
You can also remove the `@ai-sdk/openai` package since we’ll be using OpenRouter instead:
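```bash
npm uninstall @ai-sdk/openai
npm install @openrouter/ai-sdk-provider
```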
Step 3: Configure your agent to use OpenRouter
After setting up your Mastra project, you’ll need to modify the agent files to use OpenRouter instead of the default OpenAI provider.
If you used `create-mastra`, you’ll likely have a file at `src/mastra/agents/agent.ts` or similar. Replace its contents with:
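```typescript
// src/mastra/agents/agent.ts
// Sketch: the instructions and model slug are illustrative; the agent is
// named "assistant" to match the endpoint shown in Step 4.
import { Agent } from '@mastra/core/agent';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

export const assistant = new Agent({
  name: 'assistant',
  instructions: 'You are a helpful assistant.',
  model: openrouter('anthropic/claude-3.5-sonnet'),
});
```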
Also make sure to update your Mastra entry point at `src/mastra/index.ts` to use your renamed agent:
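```typescript
// src/mastra/index.ts
// Sketch: assumes the agent above is exported as `assistant`.
import { Mastra } from '@mastra/core';
import { assistant } from './agents/agent';

export const mastra = new Mastra({
  agents: { assistant },
});
```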
Step 4: Run the application
Once you’ve configured your agent to use OpenRouter, you can run the Mastra development server:
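```bash
# Script name assumed from the create-mastra scaffolding (it runs `mastra dev`).
npm run dev
```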
This will start the Mastra development server and make your agent available at:
- REST API endpoint: http://localhost:4111/api/agents/assistant/generate
- Interactive playground: http://localhost:4111
The Mastra playground provides a user-friendly interface where you can interact with your agent and test its capabilities without writing any additional code.
You can also test the API endpoint using curl if needed:
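```bash
# The exact request body shape may vary with your Mastra version.
curl -X POST http://localhost:4111/api/agents/assistant/generate \
  -H "Content-Type: application/json" \
  -d '{"messages": ["What is OpenRouter?"]}'
```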
Basic Integration with Mastra
The simplest way to integrate OpenRouter with Mastra is by using the OpenRouter AI provider with Mastra’s Agent system:
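```typescript
// Sketch: agent name, instructions, and model slug are illustrative.
import { Agent } from '@mastra/core/agent';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

const assistant = new Agent({
  name: 'assistant',
  instructions: 'You are a helpful assistant.',
  model: openrouter('anthropic/claude-3.5-sonnet'),
});

const response = await assistant.generate('Tell me about OpenRouter.');
console.log(response.text);
```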
Advanced Configuration
For more control over your OpenRouter requests, you can pass additional configuration options:
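```typescript
import { createOpenRouter } from '@openrouter/ai-sdk-provider';

// Sketch: option names (headers, extraBody) are assumptions based on the
// provider's README; verify against your installed version.
const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
  // Optional attribution headers that OpenRouter recognizes.
  headers: {
    'HTTP-Referer': 'https://your-site.example',
    'X-Title': 'My Mastra App',
  },
  // Extra fields merged into every request body, e.g. routing preferences.
  extraBody: {
    provider: { sort: 'throughput' },
  },
});
```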
Provider-Specific Options
You can also pass provider-specific options in your requests:
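```typescript
import { createOpenRouter } from '@openrouter/ai-sdk-provider';

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

// Sketch: assumes the per-model settings accept an extraBody field; treat
// this as an assumption and check the provider's README.
const model = openrouter('anthropic/claude-3.5-sonnet', {
  extraBody: {
    // e.g. prefer specific upstream providers for this model only
    provider: { order: ['Anthropic'] },
  },
});
```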
Using Multiple Models with OpenRouter
OpenRouter gives you access to various models from different providers. Here’s how to use multiple models:
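```typescript
// Sketch: one provider instance can back any number of agents; the model
// slugs and agent details are illustrative.
import { Agent } from '@mastra/core/agent';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

const claude = new Agent({
  name: 'claude',
  instructions: 'You are a thoughtful writing assistant.',
  model: openrouter('anthropic/claude-3.5-sonnet'),
});

const gpt = new Agent({
  name: 'gpt',
  instructions: 'You are a fast general-purpose assistant.',
  model: openrouter('openai/gpt-4o'),
});
```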
Resources
For more information and detailed documentation, check out these resources:
- OpenRouter Documentation - Learn about OpenRouter’s capabilities and available models
- Mastra Documentation - Comprehensive documentation for the Mastra framework
- AI SDK Documentation - Detailed information about the AI SDK that powers Mastra’s model interactions