AWS Bedrock Plugin
This Genkit plugin lets you use AWS Bedrock through the official AWS APIs. AWS Bedrock is a fully managed service that provides access to foundation models from leading AI companies through a single API. The plugin supports text generation, embeddings, and image generation, along with tool calling, streaming, multimodal inputs, and cross-region inference for improved performance and resiliency.
Installation
Install the plugin in your project with npm or pnpm:
```bash
npm install genkitx-aws-bedrock
```

Versions
If you are using Genkit version <v0.9.0, use plugin version v1.9.0. If you are using Genkit >=v0.9.0, use plugin version >=v1.10.0, which targets the new plugins API.
Features
- Text Generation: Support for multiple foundation models (Amazon Nova, Anthropic Claude, Meta Llama, etc.)
- Embeddings: Support for text embedding models from Amazon Titan and Cohere
- Streaming: Full streaming support for real-time responses
- Tool Calling: Complete function calling capabilities
- Multimodal Support: Support for text + image inputs (vision models)
- Cross-Region Inference: Support for inference profiles to improve performance and resiliency
Quick Start
```ts
import { genkit } from 'genkit';
import { awsBedrock, amazonNovaProV1 } from 'genkitx-aws-bedrock';

const ai = genkit({
  plugins: [awsBedrock({ region: 'us-east-1' })],
  model: amazonNovaProV1,
});

// Basic usage
const response = await ai.generate({
  prompt: 'What are the key benefits of using AWS Bedrock for AI applications?',
});

console.log(response.text);
```

Configuration
The plugin supports multiple authentication methods depending on your environment.
Standard Initialization
You can configure the plugin by passing your AWS region and a default model to the genkit function:

```ts
import { genkit } from 'genkit';
import { awsBedrock, amazonNovaProV1 } from 'genkitx-aws-bedrock';

const ai = genkit({
  plugins: [awsBedrock({ region: '<my-region>' })],
  model: amazonNovaProV1,
});
```

Production Environment Authentication
In production environments, you may need an additional library to handle authentication. One approach is to use the @aws-sdk/credential-providers package:
```ts
import { fromEnv } from '@aws-sdk/credential-providers';

const ai = genkit({
  plugins: [
    awsBedrock({
      region: 'us-east-1',
      credentials: fromEnv(),
    }),
  ],
});
```

Ensure you have a .env file with the necessary AWS credentials. Remember that the .env file must be added to your .gitignore to prevent sensitive credentials from being exposed.
```
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
```

Local Environment Authentication
For local development, you can directly supply the credentials:
```ts
const ai = genkit({
  plugins: [
    awsBedrock({
      region: 'us-east-1',
      credentials: {
        accessKeyId: awsAccessKeyId.value(),
        secretAccessKey: awsSecretAccessKey.value(),
      },
    }),
  ],
});
```

Each approach allows you to manage authentication effectively based on your environment needs.
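If you omit `credentials` entirely, the underlying AWS SDK client typically falls back to the default credential provider chain (environment variables, shared config files, IAM roles). Assuming the plugin forwards its client configuration to the SDK unchanged, the simplest setup is:

```ts
import { genkit } from 'genkit';
import { awsBedrock, amazonNovaProV1 } from 'genkitx-aws-bedrock';

// No explicit credentials: the AWS SDK resolves them from the default
// provider chain (env vars, ~/.aws/credentials, instance/role metadata).
const ai = genkit({
  plugins: [awsBedrock({ region: 'us-east-1' })],
  model: amazonNovaProV1,
});
```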
Configuration with Inference Endpoint
If you want to use a model through cross-region inference endpoints, you can specify the geographic prefix in the model configuration. Cross-region inference uses inference profiles to increase throughput and improve resiliency by routing your requests across multiple AWS Regions during peak utilization bursts:
```ts
import { genkit } from 'genkit';
import { awsBedrock, anthropicClaude35SonnetV2 } from 'genkitx-aws-bedrock';

const ai = genkit({
  plugins: [awsBedrock()],
  model: anthropicClaude35SonnetV2('us'),
});
```

You can find more information about the available models in the AWS Bedrock documentation.
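A cross-region model can also be selected per request rather than as the plugin-wide default. This is a sketch under the assumption that the model helpers accept the same geographic prefixes Bedrock uses for its inference profiles ('us', 'eu', 'apac'):

```ts
import { genkit } from 'genkit';
import { awsBedrock, anthropicClaude35SonnetV2 } from 'genkitx-aws-bedrock';

const ai = genkit({ plugins: [awsBedrock()] });

const response = await ai.generate({
  // The geographic prefix selects the inference profile; 'us' routes
  // requests across US Regions during peak utilization.
  model: anthropicClaude35SonnetV2('us'),
  prompt: 'Summarize the benefits of cross-region inference.',
});
console.log(response.text);
```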
Using Custom Models
If you want to use a model that is not exported by this plugin, you can register it using the customModels option when initializing the plugin:
```ts
import { genkit, z } from 'genkit';
import { awsBedrock } from 'genkitx-aws-bedrock';

const ai = genkit({
  plugins: [
    awsBedrock({
      region: 'us-east-1',
      customModels: ['openai.gpt-oss-20b-1:0'], // Register custom models
    }),
  ],
});

// Use the custom model by specifying its name as a string
export const customModelFlow = ai.defineFlow(
  {
    name: 'customModelFlow',
    inputSchema: z.string(),
    outputSchema: z.string(),
  },
  async (subject) => {
    const llmResponse = await ai.generate({
      model: 'aws-bedrock/openai.gpt-oss-20b-1:0', // Use any registered custom model
      prompt: `Tell me about ${subject}`,
    });
    return llmResponse.text;
  },
);
```

Alternatively, you can define a custom model outside of the plugin initialization:
```ts
import { defineAwsBedrockModel } from 'genkitx-aws-bedrock';

const customModel = defineAwsBedrockModel('openai.gpt-oss-20b-1:0', {
  region: 'us-east-1',
});

const response = await ai.generate({
  model: customModel,
  prompt: 'Hello!',
});
```

Supported models
This plugin supports all currently available chat/completion and embeddings models from AWS Bedrock, including vision models that accept image and other multimodal inputs.