
Vercel AI SDK

The @gensx/vercel-ai-sdk package provides Vercel AI SDK-compatible components for GenSX, letting you use the Vercel AI SDK within GenSX's component model.

Installation

To install the package, run the following command:

npm install @gensx/vercel-ai-sdk

You’ll also need to install the Vercel AI SDK:

npm install ai

Then import the components you need from the package:

import { GenerateText, GenerateObject } from "@gensx/vercel-ai-sdk";

Supported Components

  • StreamText: Stream text responses from language models
  • StreamObject: Stream structured JSON objects from language models
  • GenerateText: Generate complete text responses from language models
  • GenerateObject: Generate complete structured JSON objects from language models
  • Embed: Generate embeddings for a single text input
  • EmbedMany: Generate embeddings for multiple text inputs
  • GenerateImage: Generate images from text prompts

Component Reference

<StreamText/>

The StreamText component streams text responses from language models, making it ideal for chat interfaces and other applications where you want to show responses as they’re generated.

import { StreamText } from "@gensx/vercel-ai-sdk";
import { openai } from "@ai-sdk/openai";

const languageModel = openai("gpt-4o");

// Streams a text response when executed
<StreamText
  prompt="Explain quantum computing in simple terms"
  model={languageModel}
/>
Props

The StreamText component accepts all parameters from the Vercel AI SDK’s streamText function:

  • prompt (required): The text prompt to send to the model
  • model (required): The language model to use (from Vercel AI SDK)
  • Plus all other parameters supported by the Vercel AI SDK
Return Type

Returns a streaming response that can be consumed token by token.
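Token-by-token consumption can be sketched with a plain async iterable. The mock generator below stands in for the model stream (the real stream's shape comes from the Vercel AI SDK's streamText result, not from this sketch):

```typescript
// Hypothetical token stream standing in for a StreamText result.
async function* mockTextStream(): AsyncGenerator<string> {
  for (const token of ["Quantum ", "computing ", "explained."]) {
    yield token;
  }
}

// Consume the stream token by token, accumulating the full response.
async function collectStream(stream: AsyncIterable<string>): Promise<string> {
  let text = "";
  for await (const token of stream) {
    text += token; // append each token as it arrives
  }
  return text;
}
```

In a UI you would typically render each token as it arrives instead of accumulating the whole string.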

<StreamObject/>

The StreamObject component streams structured JSON objects from language models, allowing you to get structured data with type safety.

import * as gensx from "@gensx/core";
import { StreamObject } from "@gensx/vercel-ai-sdk";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

const languageModel = openai("gpt-4o");

// Define a schema for the response
const recipeSchema = z.object({
  recipe: z.object({
    name: z.string(),
    ingredients: z.array(z.string()),
    steps: z.array(z.string()),
  }),
});

// Streams a structured object when executed
<StreamObject
  prompt="Generate a recipe for chocolate chip cookies"
  model={languageModel}
  schema={recipeSchema}
/>
Props

The StreamObject component accepts all parameters from the Vercel AI SDK’s streamObject function:

  • prompt (required): The text prompt to send to the model
  • model (required): The language model to use (from Vercel AI SDK)
  • schema: A Zod schema defining the structure of the response
  • output: The output format (“object”, “array”, or “no-schema”)
  • Plus all other parameters supported by the Vercel AI SDK
Return Type

Returns a structured object matching the provided schema.

<GenerateText/>

The GenerateText component generates complete text responses from language models, waiting for the entire response before returning.

import * as gensx from "@gensx/core";
import { GenerateText } from "@gensx/vercel-ai-sdk";
import { openai } from "@ai-sdk/openai";

const languageModel = openai("gpt-4o");

// Generates a complete text response when executed
<GenerateText
  prompt="Write a short poem about programming"
  model={languageModel}
/>
Props

The GenerateText component accepts all parameters from the Vercel AI SDK’s generateText function:

  • prompt (required): The text prompt to send to the model
  • model (required): The language model to use (from Vercel AI SDK)
  • Plus any other parameters supported by the Vercel AI SDK
Return Type

Returns a complete text string containing the model’s response.

<GenerateObject/>

The GenerateObject component generates complete structured JSON objects from language models, with type safety through Zod schemas.

import * as gensx from "@gensx/core";
import { GenerateObject } from "@gensx/vercel-ai-sdk";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

const languageModel = openai("gpt-4o");

// Define a schema for the response
const userSchema = z.object({
  user: z.object({
    name: z.string(),
    age: z.number(),
    interests: z.array(z.string()),
    contact: z.object({
      email: z.string().email(),
      phone: z.string().optional(),
    }),
  }),
});

// Generates a structured object when executed
<GenerateObject
  prompt="Generate a fictional user profile"
  model={languageModel}
  schema={userSchema}
/>
Props

The GenerateObject component accepts all parameters from the Vercel AI SDK’s generateObject function:

  • prompt (required): The text prompt to send to the model
  • model (required): The language model to use (from Vercel AI SDK)
  • schema: A Zod schema defining the structure of the response
  • output: The output format (“object”, “array”, or “no-schema”)
  • Plus any other optional parameters supported by the Vercel AI SDK
Return Type

Returns a structured object matching the provided schema.
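Because the result matches the schema, downstream code can rely on its shape. As a rough illustration of that shape (a hand-rolled guard used here purely for this sketch; in practice the Zod schema itself performs the validation):

```typescript
// Shape mirroring the userSchema example above.
interface UserProfile {
  user: {
    name: string;
    age: number;
    interests: string[];
    contact: { email: string; phone?: string };
  };
}

// Hand-rolled runtime check standing in for Zod validation.
function isUserProfile(value: unknown): value is UserProfile {
  if (typeof value !== "object" || value === null) return false;
  const user = (value as { user?: unknown }).user;
  if (typeof user !== "object" || user === null) return false;
  const u = user as Record<string, unknown>;
  return (
    typeof u.name === "string" &&
    typeof u.age === "number" &&
    Array.isArray(u.interests) &&
    typeof u.contact === "object" &&
    u.contact !== null &&
    typeof (u.contact as Record<string, unknown>).email === "string"
  );
}
```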

<Embed/>

The Embed component generates embeddings for a single text input, which can be used for semantic search, clustering, and other NLP tasks.

import * as gensx from "@gensx/core";
import { Embed } from "@gensx/vercel-ai-sdk";
import { openai } from "@ai-sdk/openai";

const embeddingModel = openai.embedding("text-embedding-3-small");

// Generates an embedding when executed
<Embed value="This is a sample text to embed" model={embeddingModel} />
Props

The Embed component accepts all parameters from the Vercel AI SDK’s embed function:

  • value (required): The text to generate an embedding for
  • model (required): The embedding model to use (from Vercel AI SDK)
  • Plus any other optional parameters supported by the Vercel AI SDK
Return Type

Returns a vector representation (embedding) of the input text.

<EmbedMany/>

The EmbedMany component generates embeddings for multiple text inputs in a single call, which is more efficient than making separate calls for each text.

import * as gensx from "@gensx/core";
import { EmbedMany } from "@gensx/vercel-ai-sdk";
import { openai } from "@ai-sdk/openai";

const embeddingModel = openai.embedding("text-embedding-3-small");

// Generates embeddings for multiple texts when executed
<EmbedMany
  values={[
    "First text to embed",
    "Second text to embed",
    "Third text to embed",
  ]}
  model={embeddingModel}
/>
Props

The EmbedMany component accepts all parameters from the Vercel AI SDK’s embedMany function:

  • values (required): Array of texts to generate embeddings for
  • model (required): The embedding model to use (from Vercel AI SDK)
  • Plus any other optional parameters supported by the Vercel AI SDK
Return Type

Returns an array of vector representations (embeddings) for the input texts.
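The returned vectors can be compared directly, for example with cosine similarity for semantic search. A generic sketch, not tied to any particular model's vector dimension:

```typescript
// Cosine similarity between two embedding vectors of equal length:
// dot(a, b) / (|a| * |b|), ranging from -1 (opposite) to 1 (identical direction).
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

To rank the embedded texts against a query embedding, compute the similarity of each vector to the query and sort descending.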

<GenerateImage/>

The GenerateImage component generates images from text prompts using image generation models.

import * as gensx from "@gensx/core";
import { GenerateImage } from "@gensx/vercel-ai-sdk";
import { openai } from "@ai-sdk/openai";

const imageModel = openai.image("dall-e-3");

// Generates an image when executed
<GenerateImage
  prompt="A futuristic city with flying cars and neon lights"
  model={imageModel}
/>
Props

The GenerateImage component accepts all parameters from the Vercel AI SDK’s experimental_generateImage function:

  • prompt (required): The text description of the image to generate
  • model (required): The image generation model to use (from Vercel AI SDK)
  • Plus any other optional parameters supported by the Vercel AI SDK
Return Type

Returns an object containing information about the generated image, including its URL.

Usage with Different Models

The Vercel AI SDK supports multiple model providers. Here’s how to use different providers with GenSX components:

// OpenAI
import { openai } from "@ai-sdk/openai";
const openaiModel = openai("gpt-4o");

// Anthropic
import { anthropic } from "@ai-sdk/anthropic";
const anthropicModel = anthropic("claude-3-opus-20240229");

// Cohere
import { cohere } from "@ai-sdk/cohere";
const cohereModel = cohere("command-r-plus");

// Use with GenSX components
import * as gensx from "@gensx/core";
import { GenerateText } from "@gensx/vercel-ai-sdk";

const openaiResponse = await gensx.execute(
  <GenerateText prompt="Explain quantum computing" model={openaiModel} />,
);

const anthropicResponse = await gensx.execute(
  <GenerateText prompt="Explain quantum computing" model={anthropicModel} />,
);

For more information on the Vercel AI SDK, visit the official documentation.
