AWS Bedrock Plugin

This Genkit plugin allows you to use AWS Bedrock through their official APIs. AWS Bedrock is a fully managed service that provides access to foundation models from leading AI companies through a single API. The plugin enables you to use these models for text generation, embeddings, and image generation. It supports features like tool calling, streaming, multimodal inputs, and cross-region inference for improved performance and resiliency.

Install the plugin in your project with npm or pnpm:

npm install genkitx-aws-bedrock

If you are using Genkit <v0.9.0, use plugin version v1.9.0. If you are using Genkit >=v0.9.0, use plugin version >=v1.10.0, which targets the new plugins API.

  • Text Generation: Support for multiple foundation models (Amazon Nova, Anthropic Claude, Meta Llama, etc.)
  • Embeddings: Support for text embedding models from Amazon Titan and Cohere
  • Streaming: Full streaming support for real-time responses
  • Tool Calling: Complete function calling capabilities
  • Multimodal Support: Support for text + image inputs (vision models)
  • Cross-Region Inference: Support for inference profiles to improve performance and resiliency
import { genkit } from 'genkit';
import { awsBedrock, amazonNovaProV1 } from 'genkitx-aws-bedrock';

const ai = genkit({
  plugins: [awsBedrock({ region: 'us-east-1' })],
  model: amazonNovaProV1,
});

// Basic usage
const response = await ai.generate({
  prompt: 'What are the key benefits of using AWS Bedrock for AI applications?',
});

console.log(response.text);
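Streaming, listed among the plugin's features, goes through Genkit's standard streaming API. A minimal sketch, reusing the `ai` instance configured above (in current Genkit versions `generateStream` returns the stream handle synchronously):

```typescript
// Stream chunks as the model produces them, then read the full response.
const { stream, response } = ai.generateStream({
  prompt: 'Write a short poem about cloud computing.',
});

for await (const chunk of stream) {
  // Each chunk carries an incremental piece of the generated text.
  process.stdout.write(chunk.text);
}

// The aggregated response is still available once the stream ends.
const full = await response;
console.log('\nTotal characters:', full.text.length);
```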

The plugin supports multiple authentication methods depending on your environment.

You can configure the plugin by calling the genkit function with your AWS region and model:

import { genkit } from 'genkit';
import { awsBedrock, amazonNovaProV1 } from 'genkitx-aws-bedrock';

const ai = genkit({
  plugins: [awsBedrock({ region: '<my-region>' })],
  model: amazonNovaProV1,
});

In production environments you may need an additional library to handle authentication. One approach is the @aws-sdk/credential-providers package:

import { fromEnv } from '@aws-sdk/credential-providers';

const ai = genkit({
  plugins: [
    awsBedrock({
      region: 'us-east-1',
      credentials: fromEnv(),
    }),
  ],
});

Ensure you have a .env file with the necessary AWS credentials. Remember that the .env file must be added to your .gitignore to prevent sensitive credentials from being exposed.

AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=

For local development, you can directly supply the credentials:

const ai = genkit({
  plugins: [
    awsBedrock({
      region: 'us-east-1',
      credentials: {
        accessKeyId: awsAccessKeyId.value(),
        secretAccessKey: awsSecretAccessKey.value(),
      },
    }),
  ],
});

Each approach allows you to manage authentication effectively based on your environment needs.

To use a model through a cross-region inference endpoint, specify the region prefix in the model configuration. Cross-region inference uses inference profiles to increase throughput and improve resiliency by routing your requests across multiple AWS Regions during peak utilization bursts:

import { genkit } from 'genkit';
import { awsBedrock, anthropicClaude35SonnetV2 } from 'genkitx-aws-bedrock';

const ai = genkit({
  plugins: [awsBedrock()],
  model: anthropicClaude35SonnetV2('us'),
});

You can find more information about the available models in the AWS Bedrock Plugin documentation.
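The embedding models from the feature list are used through Genkit's `ai.embed` API. A minimal sketch; the embedder export name (`amazonTitanEmbedTextV2`) is an assumption — check the plugin's model reference for the exact symbol:

```typescript
import { genkit } from 'genkit';
// NOTE: the embedder export name below is an assumption; consult the
// plugin's documentation for the exact symbol for your chosen model.
import { awsBedrock, amazonTitanEmbedTextV2 } from 'genkitx-aws-bedrock';

const ai = genkit({
  plugins: [awsBedrock({ region: 'us-east-1' })],
});

const embeddings = await ai.embed({
  embedder: amazonTitanEmbedTextV2,
  content: 'AWS Bedrock provides access to foundation models.',
});

// `ai.embed` returns one embedding per input; each is a numeric vector.
console.log(embeddings[0].embedding.length);
```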

If you want to use a model that is not exported by this plugin, you can register it using the customModels option when initializing the plugin:

import { genkit, z } from 'genkit';
import { awsBedrock } from 'genkitx-aws-bedrock';

const ai = genkit({
  plugins: [
    awsBedrock({
      region: 'us-east-1',
      customModels: ['openai.gpt-oss-20b-1:0'], // Register custom models
    }),
  ],
});

// Use the custom model by specifying its name as a string
export const customModelFlow = ai.defineFlow(
  {
    name: 'customModelFlow',
    inputSchema: z.string(),
    outputSchema: z.string(),
  },
  async (subject) => {
    const llmResponse = await ai.generate({
      model: 'aws-bedrock/openai.gpt-oss-20b-1:0', // Use any registered custom model
      prompt: `Tell me about ${subject}`,
    });
    return llmResponse.text;
  },
);

Alternatively, you can define a custom model outside of the plugin initialization:

import { defineAwsBedrockModel } from 'genkitx-aws-bedrock';

const customModel = defineAwsBedrockModel('openai.gpt-oss-20b-1:0', {
  region: 'us-east-1',
});

const response = await ai.generate({
  model: customModel,
  prompt: 'Hello!',
});
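Tool calling, listed among the plugin's features, uses Genkit's standard `defineTool` API. A minimal sketch with a hypothetical `getWeather` tool whose result is hard-coded purely for illustration:

```typescript
import { genkit, z } from 'genkit';
import { awsBedrock, amazonNovaProV1 } from 'genkitx-aws-bedrock';

const ai = genkit({
  plugins: [awsBedrock({ region: 'us-east-1' })],
  model: amazonNovaProV1,
});

// A hypothetical tool the model can call. In a real application this
// would query a weather service instead of returning a fixed string.
const getWeather = ai.defineTool(
  {
    name: 'getWeather',
    description: 'Returns the current weather for a city',
    inputSchema: z.object({ city: z.string() }),
    outputSchema: z.string(),
  },
  async ({ city }) => `It is sunny and 24°C in ${city}.`,
);

// Genkit runs the tool-calling loop automatically: the model requests the
// tool, Genkit invokes it, and the final answer incorporates the result.
const response = await ai.generate({
  tools: [getWeather],
  prompt: 'What should I wear in Madrid today?',
});

console.log(response.text);
```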

This plugin supports all currently available chat/completion and embedding models from AWS Bedrock, including image input and multimodal models.
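For the image input mentioned above, Genkit accepts multimodal prompts as an array of media and text parts. A minimal sketch, assuming a vision-capable model such as Amazon Nova Pro and a local `photo.jpg` file:

```typescript
import { readFileSync } from 'node:fs';
import { genkit } from 'genkit';
import { awsBedrock, amazonNovaProV1 } from 'genkitx-aws-bedrock';

const ai = genkit({
  plugins: [awsBedrock({ region: 'us-east-1' })],
  model: amazonNovaProV1, // assumed vision-capable; check the model's capabilities
});

// Load a local image and pass it inline as a data URL alongside the text.
const imageBase64 = readFileSync('photo.jpg').toString('base64');

const response = await ai.generate({
  prompt: [
    { media: { url: `data:image/jpeg;base64,${imageBase64}` } },
    { text: 'Describe what you see in this image.' },
  ],
});

console.log(response.text);
```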