# Anthropic

The `@gensx/anthropic` package provides Anthropic API-compatible components for GenSX.
## Installation

To install the package, run the following command:

```bash
npm install @gensx/anthropic
```
## Supported components

| Component | Description |
| --- | --- |
| `AnthropicProvider` | Anthropic provider that handles configuration and authentication for child components |
| `GSXChatCompletion` | Enhanced component with advanced features for Anthropic chat completions |
| `ChatCompletion` | Simplified component for chat completions with a streamlined output interface |
| `AnthropicChatCompletion` | Low-level component that directly matches the Anthropic SDK interface |
## Component Comparison

The package provides three different chat completion components to suit different use cases:

- `AnthropicChatCompletion`: Direct mapping to the Anthropic API with identical inputs and outputs
- `GSXChatCompletion`: Enhanced component with additional features such as structured output and automatic tool calling
- `ChatCompletion`: Simplified interface that returns string responses or simple streams while keeping inputs identical to the Anthropic API
## Reference

### `<AnthropicProvider/>`

The `AnthropicProvider` component initializes and provides an Anthropic client instance to all child components. Any component that uses Anthropic's API must be wrapped in an `AnthropicProvider`.
```tsx
import { AnthropicProvider } from "@gensx/anthropic";

<AnthropicProvider
  apiKey="your-api-key" // Your Anthropic API key
/>;
```
#### Props

The `AnthropicProvider` accepts all configuration options from the Anthropic Node.js client library, including:

- `apiKey` (required): Your Anthropic API key
- Plus all other Anthropic client configuration options
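Since the provider accepts the Anthropic client's configuration options, client-level settings can be passed alongside the key. A hedged sketch — `baseURL` and `maxRetries` are standard Anthropic Node.js client options, but verify against your SDK version:

```tsx
import { AnthropicProvider } from "@gensx/anthropic";

<AnthropicProvider
  apiKey="your-api-key"
  baseURL="https://api.anthropic.com" // optional: override the API endpoint
  maxRetries={3} // optional: retry transient request failures
/>;
```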
### `<GSXChatCompletion/>`

The `GSXChatCompletion` component is an advanced chat completion component that provides enhanced features beyond the standard Anthropic API. It supports structured output, tool calling, and streaming, with automatic handling of tool execution.
```tsx
import { GSXChatCompletion, GSXTool } from "@gensx/anthropic";
import { z } from "zod";

// Example with structured output
const result = await gsx.execute(
  <GSXChatCompletion
    model="claude-3-5-sonnet-latest"
    max_tokens={1024}
    system="You are a helpful assistant."
    messages={[
      {
        role: "user",
        content: "Extract the name and age from: John Doe, 32 years old",
      },
    ]}
    outputSchema={z.object({
      name: z.string(),
      age: z.number(),
    })}
  />,
);
// result is typed as { name: string, age: number }

// Example with tools
const weatherTool = GSXTool.create({
  name: "get_weather",
  description: "Get the weather for a given location",
  schema: z.object({
    location: z.string(),
  }),
  run: async ({ location }) => {
    return { weather: "sunny" };
  },
});

const toolResult = await gsx.execute(
  <GSXChatCompletion
    model="claude-3-5-sonnet-latest"
    max_tokens={1024}
    system="You are a helpful assistant."
    messages={[{ role: "user", content: "What's the weather in Seattle?" }]}
    tools={[weatherTool]}
  />,
);
```
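The tool's `run` function is an ordinary async function, so its logic can be exercised without the GenSX runtime or an API key. A minimal sketch using a hypothetical stand-in function of the same shape (not `GSXTool.create` itself):

```typescript
// Hypothetical stand-in for the weather tool's run function: same
// argument and return shape, but no GenSX runtime required.
type WeatherArgs = { location: string };

async function runWeatherTool({
  location,
}: WeatherArgs): Promise<{ weather: string }> {
  // A real implementation would look up the weather for `location`.
  return { weather: "sunny" };
}

runWeatherTool({ location: "Seattle" }).then((result) => {
  console.log(result.weather); // "sunny"
});
```

Testing the function in isolation like this keeps tool logic verifiable before it is handed to the model for automatic execution.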
#### Props

The `GSXChatCompletion` component accepts all parameters from Anthropic's messages API plus additional options:

- `model` (required): ID of the model to use (e.g., `"claude-3-7-sonnet-latest"`, `"claude-3-5-haiku-latest"`)
- `messages` (required): Array of messages in the conversation
- `max_tokens` (required): Maximum number of tokens to generate
- `system`: System prompt to set the behavior of the assistant
- `stream`: Whether to stream the response (when `true`, returns a `Stream<RawMessageStreamEvent>`)
- `tools`: Array of `GSXTool` instances for function calling
- `outputSchema`: Zod schema for structured output (when provided, returns data matching the schema)
- `temperature`: Sampling temperature
- Plus all standard Anthropic message parameters
#### Return Types

The return type of `GSXChatCompletion` depends on the props:

- With `stream: true`: Returns `Stream<RawMessageStreamEvent>` from the Anthropic SDK
- With `outputSchema`: Returns data matching the provided Zod schema
- Default: Returns `GSXChatCompletionResult` (the Anthropic response with message history)
### `<ChatCompletion/>`

The `ChatCompletion` component provides a simplified interface for chat completions. It returns either a string or a simple stream of string tokens, making it easier to use in UI components.
```tsx
import { ChatCompletion } from "@gensx/anthropic";

// Non-streaming usage (returns a string)
const response = await gsx.execute(
  <ChatCompletion
    model="claude-3-5-sonnet-latest"
    max_tokens={1024}
    system="You are a helpful assistant."
    messages={[{ role: "user", content: "What's a programmable tree?" }]}
    temperature={0.7}
  />,
);

// Streaming usage (returns an AsyncIterableIterator<string>)
const stream = await gsx.execute(
  <ChatCompletion
    model="claude-3-5-sonnet-latest"
    max_tokens={1024}
    system="You are a helpful assistant."
    messages={[{ role: "user", content: "What's a programmable tree?" }]}
    temperature={0.7}
    stream={true}
  />,
);

// Use in a UI component
<StreamingText stream={stream} />;
```
#### Props

The `ChatCompletion` component accepts all parameters from Anthropic's messages API:

- `model` (required): ID of the model to use (e.g., `"claude-3-5-sonnet-latest"`, `"claude-3-haiku-latest"`)
- `messages` (required): Array of messages in the conversation
- `max_tokens` (required): Maximum number of tokens to generate
- `system`: System prompt to set the behavior of the assistant
- `temperature`: Sampling temperature
- `stream`: Whether to stream the response
- `tools`: Array of `GSXTool` instances for function calling (not compatible with streaming)
#### Return Types

- With `stream: false` (default): Returns a string containing the model's response
- With `stream: true`: Returns an `AsyncIterableIterator<string>` that yields tokens as they're generated
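Because the streaming result is a plain `AsyncIterableIterator<string>`, it can be consumed with `for await` anywhere, not just in UI components. A minimal sketch of collecting tokens into the full response text, using a mock token stream in place of a live completion:

```typescript
// Mock token stream standing in for the AsyncIterableIterator<string>
// returned by ChatCompletion with stream={true}.
async function* mockStream(): AsyncIterableIterator<string> {
  for (const token of ["A ", "programmable ", "tree."]) {
    yield token;
  }
}

// Accumulate streamed tokens into the complete response text.
async function collect(
  stream: AsyncIterableIterator<string>,
): Promise<string> {
  let text = "";
  for await (const token of stream) {
    text += token;
  }
  return text;
}

collect(mockStream()).then((text) => console.log(text)); // "A programmable tree."
```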
### `<AnthropicChatCompletion/>`

The `AnthropicChatCompletion` component is a low-level component that directly maps to the Anthropic SDK. It has identical inputs and outputs to the Anthropic API, making it suitable for advanced use cases where you need full control.
```tsx
import { AnthropicChatCompletion } from "@gensx/anthropic";

// Non-streaming usage
const completion = await gsx.execute(
  <AnthropicChatCompletion
    model="claude-3-7-sonnet-latest"
    max_tokens={1024}
    system="You are a helpful assistant."
    messages={[{ role: "user", content: "What's a programmable tree?" }]}
  />,
);
console.log(completion.content);

// Streaming usage
const stream = await gsx.execute(
  <AnthropicChatCompletion
    model="claude-3-7-sonnet-latest"
    max_tokens={1024}
    system="You are a helpful assistant."
    messages={[{ role: "user", content: "What's a programmable tree?" }]}
    stream={true}
  />,
);

for await (const chunk of stream) {
  if (
    chunk.type === "content_block_delta" &&
    chunk.delta.type === "text_delta"
  ) {
    process.stdout.write(chunk.delta.text);
  }
}
```
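The same delta-filtering logic can be factored into a small helper and checked against mock events. A sketch assuming only the `content_block_delta` / `text_delta` event shape used in the loop above:

```typescript
// Minimal shape of the stream events this helper inspects; real
// RawMessageStreamEvent values carry more fields and variants.
interface StreamEvent {
  type: string;
  delta?: { type: string; text?: string };
}

// Join the text of all text_delta events, ignoring other event types.
function textFromEvents(events: StreamEvent[]): string {
  let text = "";
  for (const event of events) {
    if (
      event.type === "content_block_delta" &&
      event.delta?.type === "text_delta"
    ) {
      text += event.delta.text ?? "";
    }
  }
  return text;
}

const events: StreamEvent[] = [
  { type: "message_start" },
  { type: "content_block_delta", delta: { type: "text_delta", text: "Hello" } },
  { type: "content_block_delta", delta: { type: "text_delta", text: ", world" } },
  { type: "message_stop" },
];
console.log(textFromEvents(events)); // "Hello, world"
```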
#### Props

The `AnthropicChatCompletion` component accepts all parameters from the Anthropic SDK's `messages.create` method:

- `model` (required): ID of the model to use
- `messages` (required): Array of messages in the conversation
- `max_tokens` (required): Maximum number of tokens to generate
- `system`: System prompt to set the behavior of the assistant
- `temperature`: Sampling temperature
- `stream`: Whether to stream the response
- `tools`: Array of Anthropic tool definitions for function calling
- Plus all other Anthropic message parameters
#### Return Types

- With `stream: false` (default): Returns the full `Message` object from the Anthropic SDK
- With `stream: true`: Returns a `Stream<RawMessageStreamEvent>` from the Anthropic SDK
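In the non-streaming case, the `Message` object carries its text inside an array of content blocks rather than a single string. A sketch of extracting the text, run against a mock object with the same `content` shape (other block types, such as tool use, are skipped):

```typescript
// Minimal shape of the content blocks on an Anthropic Message.
interface ContentBlock {
  type: string;
  text?: string;
}

// Join the text of all "text" content blocks, skipping other block types.
function messageText(content: ContentBlock[]): string {
  return content
    .filter((block) => block.type === "text" && typeof block.text === "string")
    .map((block) => block.text ?? "")
    .join("");
}

const mockContent: ContentBlock[] = [
  { type: "text", text: "A programmable tree is " },
  { type: "text", text: "a data structure..." },
];
console.log(messageText(mockContent)); // "A programmable tree is a data structure..."
```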