OpenRouter provides a unified API to access various AI models from different providers. You can use GenSX with OpenRouter by configuring the OpenAI client with OpenRouter’s API endpoint.
To use OpenRouter with GenSX, you need to install the @gensx/openai package:
```bash
npm install @gensx/openai
```

Configure the OpenAI client with your OpenRouter API key and the OpenRouter base URL:

```ts
import { OpenAI } from "@gensx/openai";

const client = new OpenAI({
  apiKey: process.env.OPENROUTER_API_KEY,
  baseURL: "https://openrouter.ai/api/v1",
});
```

Here’s a complete example of using OpenRouter with GenSX:
```ts
import * as gensx from "@gensx/core";
import { OpenAI } from "@gensx/openai";

const client = new OpenAI({
  apiKey: process.env.OPENROUTER_API_KEY,
  baseURL: "https://openrouter.ai/api/v1",
});

interface RespondProps {
  userInput: string;
}
type RespondOutput = string;

const GenerateText = gensx.Component<RespondProps, RespondOutput>(
  "GenerateText",
  async ({ userInput }) => {
    const result = await client.chat.completions.create({
      model: "anthropic/claude-sonnet-4",
      messages: [
        {
          role: "system",
          content: "You are a helpful assistant. Respond to the user's input.",
        },
        { role: "user", content: userInput },
      ],
      provider: {
        ignore: ["Anthropic"],
      },
    });
    return result.choices[0].message.content ?? "";
  },
);

const OpenRouterWorkflow = gensx.Component<{ userInput: string }, string>(
  "OpenRouter",
  async ({ userInput }) => {
    const result = await GenerateText({ userInput });
    return result;
  },
);

const result = await OpenRouterWorkflow.run({
  userInput: "Hi there! Write me a short story about a cat that can fly.",
});
```

When using OpenRouter, you can specify models using their full identifiers:
- `anthropic/claude-sonnet-4`
- `openai/gpt-4.1`
- `google/gemini-2.5-pro-preview`
- `meta-llama/llama-3.3-70b-instruct`

Check the OpenRouter documentation for a complete list of available models.
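Because every model is exposed through the same chat-completions interface, switching providers is just a matter of changing the model string. A minimal sketch (the `buildRequest` helper is illustrative, not part of GenSX or OpenRouter):

```ts
// Chat-completion parameters shared by all OpenRouter models.
interface ChatRequest {
  model: string;
  messages: { role: "system" | "user"; content: string }[];
}

// Only the `model` string changes between providers.
function buildRequest(model: string, userInput: string): ChatRequest {
  return {
    model,
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: userInput },
    ],
  };
}

// The same helper works for any provider's model identifier:
const claude = buildRequest("anthropic/claude-sonnet-4", "Hello!");
const llama = buildRequest("meta-llama/llama-3.3-70b-instruct", "Hello!");
```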
You can use the `provider` property in the `chat.completions.create` call to specify OpenRouter-specific routing options:

```ts
client.chat.completions.create({
  model: "anthropic/claude-sonnet-4",
  messages: [
    /* your messages */
  ],
  provider: {
    ignore: ["Anthropic"], // Ignore specific providers
    route: "fallback", // Use fallback routing strategy
  },
});
```
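OpenRouter's provider preferences also include fields such as `order` (a ranked list of preferred providers) and `allow_fallbacks`; the exact set is defined in OpenRouter's provider-routing documentation, which you should consult before relying on any field. A sketch of building such preferences as a typed object, assuming those documented field names:

```ts
// Provider-routing preferences modeled after OpenRouter's provider
// routing docs; field names are assumptions to verify against them.
interface ProviderPreferences {
  order?: string[];          // try providers in this order
  allow_fallbacks?: boolean; // permit falling back to other providers
  ignore?: string[];         // never route to these providers
}

const preferences: ProviderPreferences = {
  order: ["OpenAI", "Together"],
  allow_fallbacks: true,
  ignore: ["Anthropic"],
};
```

Keeping the preferences in a typed object like this lets you reuse the same routing policy across multiple `create` calls.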