OpenRouter
OpenRouter provides a unified API to access various AI models from different providers. You can use GenSX with OpenRouter by configuring the OpenAIProvider component with OpenRouter’s API endpoint.
Installation
To use OpenRouter with GenSX, install the GenSX OpenAI package:
npm install @gensx/openai
Configuration
Configure the OpenAIProvider with your OpenRouter API key and the OpenRouter base URL:
import { OpenAIProvider } from "@gensx/openai";
<OpenAIProvider
  apiKey={process.env.OPENROUTER_API_KEY}
  baseURL="https://openrouter.ai/api/v1"
>
  {/* Your components here */}
</OpenAIProvider>;
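OpenRouter also accepts optional HTTP-Referer and X-Title headers to attribute requests to your app. The following is a sketch, assuming OpenAIProvider forwards extra client options such as the OpenAI SDK's defaultHeaders to the underlying client, with placeholder values for both headers:

import { OpenAIProvider } from "@gensx/openai";

<OpenAIProvider
  apiKey={process.env.OPENROUTER_API_KEY}
  baseURL="https://openrouter.ai/api/v1"
  // Assumption: defaultHeaders is passed through to the OpenAI client.
  // Both headers are optional and the values below are placeholders.
  defaultHeaders={{
    "HTTP-Referer": "https://example.com",
    "X-Title": "My GenSX App",
  }}
>
  {/* Your components here */}
</OpenAIProvider>;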
Example Usage
Here’s a complete example of using OpenRouter with GenSX:
import * as gensx from "@gensx/core";
import { ChatCompletion, OpenAIProvider } from "@gensx/openai";
interface RespondProps {
  userInput: string;
}
type RespondOutput = string;

// Component that calls an OpenRouter-hosted model through the OpenAI-compatible API
const GenerateText = gensx.Component<RespondProps, RespondOutput>(
  "GenerateText",
  ({ userInput }) => (
    <ChatCompletion
      model="anthropic/claude-3.7-sonnet"
      messages={[
        {
          role: "system",
          content: "You are a helpful assistant. Respond to the user's input.",
        },
        { role: "user", content: userInput },
      ]}
      // OpenRouter-specific provider options (see Provider Options below)
      provider={{
        ignore: ["Anthropic"],
      }}
    />
  ),
);

// Wrap the component in an OpenAIProvider pointed at OpenRouter's endpoint
const OpenRouterExampleComponent = gensx.Component<
  { userInput: string },
  string
>("OpenRouter", ({ userInput }) => (
  <OpenAIProvider
    apiKey={process.env.OPENROUTER_API_KEY}
    baseURL="https://openrouter.ai/api/v1"
  >
    <GenerateText userInput={userInput} />
  </OpenAIProvider>
));

const workflow = gensx.Workflow(
  "OpenRouterWorkflow",
  OpenRouterExampleComponent,
);

const result = await workflow.run({
  userInput: "Hi there! Write me a short story about a cat that can fly.",
});
Specifying Models
When using OpenRouter, you can specify models using their full identifiers:
anthropic/claude-3.7-sonnet
openai/gpt-4o
google/gemini-1.5-pro
mistral/mistral-large-latest
Check the OpenRouter documentation for a complete list of available models.
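For example, pointing the GenerateText component above at a different provider's model only requires changing the model string; the message shown here is illustrative:

<ChatCompletion
  model="openai/gpt-4o"
  messages={[
    { role: "user", content: "Summarize OpenRouter in one sentence." },
  ]}
/>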
Provider Options
You can use the provider property in the ChatCompletion component to specify OpenRouter-specific options:
<ChatCompletion
  model="anthropic/claude-3.7-sonnet"
  messages={[
    /* your messages */
  ]}
  provider={{
    ignore: ["Anthropic"], // Ignore specific providers
    route: "fallback", // Use fallback routing strategy
  }}
/>
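Beyond ignore and route, OpenRouter's provider routing accepts additional preferences such as order (try providers in a preferred order) and allow_fallbacks. The sketch below assumes the provider object is forwarded to OpenRouter unchanged and uses illustrative provider names:

<ChatCompletion
  model="anthropic/claude-3.7-sonnet"
  messages={[{ role: "user", content: "Hello!" }]}
  provider={{
    order: ["Anthropic", "Amazon Bedrock"], // illustrative provider names; try these first
    allow_fallbacks: false, // don't fall back to other providers if these fail
  }}
/>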
Learn More
For more details on available models, pricing, and provider routing, see the OpenRouter documentation at https://openrouter.ai/docs.