OpenAI Plugin

The genkit-plugin-compat-oai package includes a pre-configured plugin for official OpenAI models.

pip install genkit-plugin-compat-oai

To use this plugin, import OpenAI and openai_model, and specify them when you initialize Genkit:

from genkit.ai import Genkit
from genkit.plugins.compat_oai import OpenAI, openai_model
ai = Genkit(plugins=[OpenAI()], model=openai_model('gpt-4o'))

The plugin requires an API key for the OpenAI API. You can get one from the OpenAI Platform.

Configure the plugin to use your API key by doing one of the following:

  • Set the OPENAI_API_KEY environment variable to your API key.

  • Specify the API key when you initialize the plugin:

    OpenAI(api_key=your_key)

    However, don’t embed your API key directly in code! Load it from the environment or a secret manager instead, as sketched after this list.
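If you resolve the key yourself (for example from a secret manager), you can pass it to the plugin explicitly. Here is a minimal sketch that reads it from an environment variable; MY_OPENAI_KEY is only a placeholder for wherever you actually keep the secret:

import os

from genkit.ai import Genkit
from genkit.plugins.compat_oai import OpenAI, openai_model

# Placeholder: read the key from the environment (or your secret manager)
# instead of hard-coding it in source.
api_key = os.environ['MY_OPENAI_KEY']

ai = Genkit(plugins=[OpenAI(api_key=api_key)], model=openai_model('gpt-4o'))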

The plugin provides helpers to reference supported models.

You can reference chat models like gpt-4o and gpt-4-turbo using the openai_model() helper.

import structlog
from pydantic import BaseModel, Field
from genkit.ai import Genkit
from genkit.plugins.compat_oai import OpenAI, openai_model

logger = structlog.get_logger(__name__)

ai = Genkit(plugins=[OpenAI()])


@ai.flow()
async def say_hi(name: str) -> str:
    """Say hi to a name.

    Args:
        name: The name to say hi to.

    Returns:
        The response from the OpenAI API.
    """
    response = await ai.generate(
        model=openai_model('gpt-4'),
        prompt=f'hi {name}',
    )
    return response.message.content[0].root.text

You can also pass model-specific configuration:

response = await ai.generate(
    model=openai_model('gpt-4'),
    config={'temperature': 1},
    prompt=f'hi {name}',
)

You can define and use tools with OpenAI models.

import httpx
from decimal import Decimal
from pydantic import BaseModel
from genkit.ai import Genkit
from genkit.plugins.compat_oai import OpenAI, openai_model

ai = Genkit(plugins=[OpenAI()])


class WeatherRequest(BaseModel):
    """Weather request."""

    latitude: Decimal
    longitude: Decimal


@ai.tool(description='Get current temperature for provided coordinates in celsius')
def get_weather_tool(coordinates: WeatherRequest) -> float:
    """Get the current temperature for provided coordinates in celsius.

    Args:
        coordinates: The coordinates to get the weather for.

    Returns:
        The current temperature for the provided coordinates.
    """
    url = (
        f'https://api.open-meteo.com/v1/forecast?'
        f'latitude={coordinates.latitude}&longitude={coordinates.longitude}'
        f'&current=temperature_2m'
    )
    with httpx.Client() as client:
        response = client.get(url)
        data = response.json()
    return float(data['current']['temperature_2m'])


@ai.flow()
async def get_weather_flow(location: str) -> str:
    """Get the weather for a location.

    Args:
        location: The location to get the weather for.

    Returns:
        The weather for the location.
    """
    response = await ai.generate(
        model=openai_model('gpt-4o-mini'),
        prompt=f"What's the weather like in {location} today?",
        tools=['get_weather_tool'],
    )
    # The response will contain the tool output if the model decided to call it.
    return response.message.content[0].root.text

The plugin supports streaming responses.

@ai.flow()
async def say_hi_stream(name: str) -> str:
    """Say hi to a name and stream the response.

    Args:
        name: The name to say hi to.

    Returns:
        The response from the OpenAI API.
    """
    stream, _ = ai.generate_stream(
        model=openai_model('gpt-4'),
        prompt=f'hi {name}',
    )
    result = ''
    async for data in stream:
        for part in data.content:
            result += part.root.text
    return result

The @genkit-ai/compat-oai package includes a pre-configured plugin for official OpenAI models.

npm install @genkit-ai/compat-oai

To use this plugin, import openAI and specify it when you initialize Genkit:

import { genkit } from 'genkit';
import { openAI } from '@genkit-ai/compat-oai/openai';

export const ai = genkit({
  plugins: [openAI()],
});

The plugin requires an API key for the OpenAI API. You can get one from the OpenAI Platform.

Configure the plugin to use your API key by doing one of the following:

  • Set the OPENAI_API_KEY environment variable to your API key.

  • Specify the API key when you initialize the plugin:

    openAI({ apiKey: yourKey });

    However, don’t embed your API key directly in code! Use this feature only in conjunction with a service like Google Cloud Secret Manager or similar, as sketched after this list.
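For example, you might resolve the key at startup (from a secret manager or the environment) and pass it in explicitly. A minimal sketch, where MY_OPENAI_KEY is only a placeholder for wherever you store the secret:

import { genkit } from 'genkit';
import { openAI } from '@genkit-ai/compat-oai/openai';

// Placeholder: resolve the key at startup instead of hard-coding it in source.
const apiKey = process.env.MY_OPENAI_KEY;

export const ai = genkit({
  plugins: [openAI({ apiKey })],
});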

The plugin provides helpers to reference supported models and embedders.

You can reference chat models like gpt-4o and gpt-4-turbo using the openAI.model() helper.

import { genkit, z } from 'genkit';
import { openAI } from '@genkit-ai/compat-oai/openai';

const ai = genkit({
  plugins: [openAI()],
});

export const jokeFlow = ai.defineFlow(
  {
    name: 'jokeFlow',
    inputSchema: z.object({ subject: z.string() }),
    outputSchema: z.object({ joke: z.string() }),
  },
  async ({ subject }) => {
    const llmResponse = await ai.generate({
      prompt: `tell me a joke about ${subject}`,
      model: openAI.model('gpt-4o'),
    });
    return { joke: llmResponse.text };
  },
);

You can also pass model-specific configuration:

const llmResponse = await ai.generate({
  prompt: `tell me a joke about ${subject}`,
  model: openAI.model('gpt-4o'),
  config: {
    temperature: 0.7,
  },
});

The plugin supports image generation models like DALL-E 3.

import { genkit } from 'genkit';
import { openAI } from '@genkit-ai/compat-oai/openai';

const ai = genkit({
  plugins: [openAI()],
});

// Reference an image generation model
const dalle3 = openAI.model('dall-e-3');

// Use it to generate an image
const imageResponse = await ai.generate({
  model: dalle3,
  prompt: 'A photorealistic image of a cat programming a computer.',
  config: {
    size: '1024x1024',
    style: 'vivid',
  },
});

const imageUrl = imageResponse.media?.url;

You can use text embedding models to create vector embeddings from text.

import { genkit, z } from 'genkit';
import { openAI } from '@genkit-ai/compat-oai/openai';

const ai = genkit({
  plugins: [openAI()],
});

export const embedFlow = ai.defineFlow(
  {
    name: 'embedFlow',
    inputSchema: z.object({ text: z.string() }),
    outputSchema: z.object({ embedding: z.string() }),
  },
  async ({ text }) => {
    const embedding = await ai.embed({
      embedder: openAI.embedder('text-embedding-ada-002'),
      content: text,
    });
    return { embedding: JSON.stringify(embedding) };
  },
);

The OpenAI plugin also supports audio models for transcription (speech-to-text) and speech generation (text-to-speech).

Use models like whisper-1 to transcribe audio files.

import { genkit } from 'genkit';
import { openAI } from '@genkit-ai/compat-oai/openai';
import * as fs from 'fs';

const ai = genkit({
  plugins: [openAI()],
});

const whisper = openAI.model('whisper-1');
const audioFile = fs.readFileSync('path/to/your/audio.mp3');

const transcription = await ai.generate({
  model: whisper,
  prompt: [
    {
      media: {
        contentType: 'audio/mp3',
        url: `data:audio/mp3;base64,${audioFile.toString('base64')}`,
      },
    },
  ],
});

console.log(transcription.text);

Use models like tts-1 to generate speech from text.

import { genkit } from 'genkit';
import { openAI } from '@genkit-ai/compat-oai/openai';
import * as fs from 'fs';

const ai = genkit({
  plugins: [openAI()],
});

const tts = openAI.model('tts-1');

const speechResponse = await ai.generate({
  model: tts,
  prompt: 'Hello, world! This is a test of text-to-speech.',
  config: {
    voice: 'alloy',
  },
});

const audioData = speechResponse.media;
if (audioData) {
  fs.writeFileSync('output.mp3', Buffer.from(audioData.url.split(',')[1], 'base64'));
}

You can pass configuration options that are not defined in the plugin’s custom configuration schema. This permits you to access new models and features without having to update your Genkit version.

import { genkit } from 'genkit';
import { openAI } from '@genkit-ai/compat-oai/openai';

const ai = genkit({
  plugins: [openAI()],
});

const llmResponse = await ai.generate({
  prompt: `Tell me a cool story`,
  model: openAI.model('gpt-4-new'), // hypothetical new model
  config: {
    seed: 123,
    new_feature_parameter: ... // hypothetical config needed for new model
  },
});

Genkit passes this config as-is to the OpenAI API, giving you access to new model features. Note that field names and types are not validated by Genkit and must match the OpenAI API specification to work.

Some OpenAI models support web search. You can enable it in the config block:

import { genkit } from 'genkit';
import { openAI } from '@genkit-ai/compat-oai/openai';

const ai = genkit({
  plugins: [openAI()],
});

const llmResponse = await ai.generate({
  prompt: `What was a positive news story from today?`,
  model: openAI.model('gpt-4o-search-preview'),
  config: {
    web_search_options: {},
  },
});

For Go, the compat_oai/openai package provides access to OpenAI models.

import "github.com/firebase/genkit/go/plugins/compat_oai/openai"

g := genkit.Init(context.Background(), genkit.WithPlugins(&openai.OpenAI{
    APIKey: "YOUR_OPENAI_API_KEY", // or set OPENAI_API_KEY env var
}))
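If you prefer to resolve the key yourself (for example from a secret manager), a minimal sketch is to read it at startup and pass it to the plugin struct explicitly; this is equivalent to letting the plugin pick up OPENAI_API_KEY on its own:

import (
    "context"
    "os"

    "github.com/firebase/genkit/go/genkit"
    "github.com/firebase/genkit/go/plugins/compat_oai/openai"
)

// Read the key from the environment (or a secret manager) instead of
// embedding it in source.
g := genkit.Init(context.Background(), genkit.WithPlugins(&openai.OpenAI{
    APIKey: os.Getenv("OPENAI_API_KEY"),
}))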
The following models and embedders are supported:

  • gpt-4.1 - Latest GPT-4.1 with multimodal support
  • gpt-4.1-mini - Faster, cost-effective GPT-4.1 variant
  • gpt-4.1-nano - Ultra-efficient GPT-4.1 variant
  • gpt-4.5-preview - Preview of GPT-4.5 with advanced capabilities
  • gpt-4o - Advanced GPT-4 with vision and tool support
  • gpt-4o-mini - Fast and cost-effective GPT-4o variant
  • gpt-4-turbo - High-performance GPT-4 with large context window
  • o3-mini - Latest compact reasoning model
  • o1 - Advanced reasoning model for complex problems
  • o1-mini - Compact reasoning model
  • o1-preview - Preview reasoning model
  • gpt-4 - Original GPT-4 model
  • gpt-3.5-turbo - Fast and efficient language model
  • text-embedding-3-large - Most capable embedding model
  • text-embedding-3-small - Fast and efficient embedding model
  • text-embedding-ada-002 - Legacy embedding model
import (
    "github.com/firebase/genkit/go/ai"
    "github.com/firebase/genkit/go/genkit"
    "github.com/firebase/genkit/go/plugins/compat_oai/openai"
)

// Initialize Genkit with the OpenAI plugin
oai := &openai.OpenAI{APIKey: "YOUR_API_KEY"}
g := genkit.Init(ctx, genkit.WithPlugins(oai))

// Use GPT-4o for general tasks
model := oai.Model(g, "gpt-4o")
resp, err := genkit.Generate(ctx, g,
    ai.WithModel(model),
    ai.WithPrompt("Explain quantum computing."),
)

// Use embeddings
embedder := oai.Embedder(g, "text-embedding-3-large")
embeds, err := ai.Embed(ctx, embedder, ai.WithDocs("Hello, world!"))

OpenAI models support tool calling:

// Define a tool
weatherTool := genkit.DefineTool(g, "get_weather", "Get current weather",
    func(ctx *ai.ToolContext, input struct{ City string }) (string, error) {
        return fmt.Sprintf("It's sunny in %s", input.City), nil
    })

// Use with GPT models
model := oai.Model(g, "gpt-4o")
resp, err := genkit.Generate(ctx, g,
    ai.WithModel(model),
    ai.WithPrompt("What's the weather like in San Francisco?"),
    ai.WithTools(weatherTool),
)

OpenAI models support vision capabilities:

// Works with GPT-4o models
resp, err := genkit.Generate(ctx, g,
    ai.WithModel(model),
    ai.WithMessages(
        ai.NewUserMessage(
            ai.NewTextPart("What do you see in this image?"),
            ai.NewMediaPart("image/jpeg", imageData),
        ),
    ),
)

OpenAI models support streaming responses:

resp, err := genkit.Generate(ctx, g,
    ai.WithModel(model),
    ai.WithPrompt("Write a long explanation."),
    ai.WithStreaming(func(ctx context.Context, chunk *ai.ModelResponseChunk) error {
        for _, content := range chunk.Content {
            fmt.Print(content.Text)
        }
        return nil
    }),
)