OpenAI Plugin

The genkit-plugin-compat-oai package includes a pre-configured plugin for official OpenAI models.

uv add genkit-plugin-compat-oai

To use this plugin, import OpenAI and openai_model, and specify a default model when you initialize Genkit:

from genkit import Genkit
from genkit.plugins.compat_oai import OpenAI, openai_model
ai = Genkit(plugins=[OpenAI()], model=openai_model('gpt-4o'))

The plugin requires an API key for the OpenAI API. You can get one from the OpenAI Platform.

Configure the plugin to use your API key by doing one of the following:

  • Set the OPENAI_API_KEY environment variable to your API key.
  • Specify the API key when you initialize the plugin:
OpenAI(api_key='YOUR_API_KEY')

However, don’t embed your API key directly in source code; use an environment variable or a secret manager instead.
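As an illustration of the environment-variable approach (a generic sketch, not part of the plugin's API; the helper name is hypothetical), you can check for the key at startup so a missing key surfaces as a clear error rather than a failed API call later:

```python
import os


def require_api_key(var: str = 'OPENAI_API_KEY') -> str:
    """Read an API key from the environment, failing fast if it is unset."""
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f'{var} is not set; export it before starting the app')
    return key
```

Since the plugin reads OPENAI_API_KEY itself when no api_key argument is given, calling a check like this early simply turns a missing key into an explicit startup error.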

The plugin provides helpers to reference supported models and embedders.

You can reference chat models like gpt-4o and gpt-4-turbo using the openai_model() helper.

import structlog

from genkit import Genkit
from genkit.plugins.compat_oai import OpenAI, openai_model

logger = structlog.get_logger(__name__)

ai = Genkit(plugins=[OpenAI()])


@ai.flow()
async def say_hi(name: str) -> str:
    """Say hi to a name.

    Args:
        name: The name to say hi to.

    Returns:
        The response from the OpenAI API.
    """
    response = await ai.generate(
        model=openai_model('gpt-4'),
        prompt=f'hi {name}',
    )
    return response.text

You can also pass model-specific configuration:

response = await ai.generate(
    model=openai_model('gpt-4'),
    config={'temperature': 1},
    prompt=f'hi {name}',
)

You can use text embedding models to create vector embeddings from text.

from genkit import Genkit
from genkit.plugins.compat_oai import OpenAI

ai = Genkit(plugins=[OpenAI()])


@ai.flow()
async def embed_flow(text: str) -> list[float]:
    """Create embeddings for text.

    Args:
        text: The text to embed.

    Returns:
        The embedding vector.
    """
    embedding = await ai.embed(
        embedder='openai/text-embedding-ada-002',
        content=text,
    )
    return embedding
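Embedding vectors are typically compared with cosine similarity. As a sketch in plain Python (independent of the plugin; the function name is our own), two vectors returned by embed_flow could be compared like this:

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)
```

For example, cosine_similarity(await embed_flow('cat'), await embed_flow('kitten')) should score higher than the same comparison against an unrelated sentence.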

You can define and use tools with OpenAI models.

import httpx
from decimal import Decimal
from pydantic import BaseModel

from genkit import Genkit
from genkit.plugins.compat_oai import OpenAI, openai_model

ai = Genkit(plugins=[OpenAI()])


class WeatherRequest(BaseModel):
    """Weather request."""

    latitude: Decimal
    longitude: Decimal


@ai.tool(description='Get current temperature for provided coordinates in celsius')
def get_weather_tool(coordinates: WeatherRequest) -> float:
    """Get the current temperature for provided coordinates in celsius.

    Args:
        coordinates: The coordinates to get the weather for.

    Returns:
        The current temperature for the provided coordinates.
    """
    url = (
        f'https://api.open-meteo.com/v1/forecast?'
        f'latitude={coordinates.latitude}&longitude={coordinates.longitude}'
        f'&current=temperature_2m'
    )
    with httpx.Client() as client:
        response = client.get(url)
        data = response.json()
    return float(data['current']['temperature_2m'])


@ai.flow()
async def get_weather_flow(location: str) -> str:
    """Get the weather for a location.

    Args:
        location: The location to get the weather for.

    Returns:
        The weather for the location.
    """
    response = await ai.generate(
        model=openai_model('gpt-4o-mini'),
        prompt=f"What's the weather like in {location} today?",
        tools=['get_weather_tool'],
    )
    # The response will contain the tool output if the model decided to call it.
    return response.text

The plugin supports streaming responses.

@ai.flow()
async def say_hi_stream(name: str) -> str:
    """Say hi to a name and stream the response.

    Args:
        name: The name to say hi to.

    Returns:
        The response from the OpenAI API.
    """
    stream, _ = ai.generate_stream(
        model=openai_model('gpt-4'),
        prompt=f'hi {name}',
    )
    result = ''
    async for chunk in stream:
        result += chunk.text
    return result

You can pass configuration options that are not defined in the plugin’s custom configuration schema. This lets you access new models and features without having to update your Genkit version.

from genkit import Genkit
from genkit.plugins.compat_oai import OpenAI, openai_model

ai = Genkit(plugins=[OpenAI()])

response = await ai.generate(
    prompt='Tell me a cool story',
    model=openai_model('gpt-4-new'),  # hypothetical new model
    config={
        'seed': 123,
        'new_feature_parameter': ...,  # hypothetical config needed for new model
    },
)

Genkit passes this config as-is to the OpenAI API, giving you access to new model features. Note that Genkit does not validate these field names or types; they must match the OpenAI API specification to work.