# OpenAI Plugin
The OpenAI plugin provides access to OpenAI's language, reasoning, and embedding models through Genkit's OpenAI-compatible (`compat_oai`) layer.
## Configuration

```go
import "github.com/firebase/genkit/go/plugins/compat_oai/openai"

g := genkit.Init(context.Background(), genkit.WithPlugins(&openai.OpenAI{
	APIKey: "YOUR_OPENAI_API_KEY", // or set OPENAI_API_KEY env var
}))
```
## Supported Models
### Latest Models

- gpt-4.1 - Latest GPT-4.1 with multimodal support
- gpt-4.1-mini - Faster, cost-effective GPT-4.1 variant
- gpt-4.1-nano - Ultra-efficient GPT-4.1 variant
- gpt-4.5-preview - Preview of GPT-4.5 with advanced capabilities
### Production Models

- gpt-4o - Advanced GPT-4 with vision and tool support
- gpt-4o-mini - Fast and cost-effective GPT-4o variant
- gpt-4-turbo - High-performance GPT-4 with large context window
### Reasoning Models

- o3-mini - Latest compact reasoning model
- o1 - Advanced reasoning model for complex problems
- o1-mini - Compact reasoning model
- o1-preview - Preview reasoning model
### Legacy Models

- gpt-4 - Original GPT-4 model
- gpt-3.5-turbo - Fast and efficient language model
### Embedding Models

- text-embedding-3-large - Most capable embedding model
- text-embedding-3-small - Fast and efficient embedding model
- text-embedding-ada-002 - Legacy embedding model
## Usage Example

```go
import (
	"github.com/firebase/genkit/go/plugins/compat_oai/openai"
)

// Initialize Genkit with the OpenAI plugin, keeping a reference to
// the plugin instance so we can look up its models and embedders.
oai := &openai.OpenAI{APIKey: "YOUR_API_KEY"}
g := genkit.Init(ctx, genkit.WithPlugins(oai))

// Use GPT-4o for general tasks
model := oai.Model(g, "gpt-4o")
resp, err := genkit.Generate(ctx, g,
	ai.WithModel(model),
	ai.WithPrompt("Explain quantum computing."),
)

// Use embeddings
embedder := oai.Embedder(g, "text-embedding-3-large")
embeds, err := ai.Embed(ctx, embedder, ai.WithDocs("Hello, world!"))
```
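Once embeddings come back, a common next step is comparing them by cosine similarity, for example to rank documents against a query. A self-contained sketch, assuming the vectors arrive as `[]float32` — the `cosine` helper is ours, not a Genkit API:

```go
package main

import (
	"fmt"
	"math"
)

// cosine returns the cosine similarity of two embedding vectors:
// 1 for identical directions, 0 for orthogonal ones.
func cosine(a, b []float32) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += float64(a[i]) * float64(b[i])
		na += float64(a[i]) * float64(a[i])
		nb += float64(b[i]) * float64(b[i])
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

func main() {
	fmt.Printf("%.2f\n", cosine([]float32{1, 0}, []float32{1, 0})) // 1.00
	fmt.Printf("%.2f\n", cosine([]float32{1, 0}, []float32{0, 1})) // 0.00
}
```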
## Using a custom provider

```go
import (
	oai "github.com/firebase/genkit/go/plugins/compat_oai"
)

// Custom provider plugin parameters
g := genkit.Init(ctx, genkit.WithPlugins(&oai.OpenAICompatible{
	Provider: "custom-provider",
	APIKey:   "api-key",
	BaseURL:  "custom-url",
}), genkit.WithDefaultModel("custom-provider/id"))

resp, err := genkit.Generate(ctx, g, ai.WithPrompt("Tell me a joke"))
```
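The default model string `"custom-provider/id"` follows Genkit's `provider/model-name` reference convention, which is why the `Provider` field must match the prefix. A stdlib sketch of how such a reference decomposes — `splitRef` is illustrative only:

```go
package main

import (
	"fmt"
	"strings"
)

// splitRef breaks a "provider/model-name" reference into its parts.
func splitRef(ref string) (provider, model string) {
	parts := strings.SplitN(ref, "/", 2)
	if len(parts) < 2 {
		return parts[0], ""
	}
	return parts[0], parts[1]
}

func main() {
	p, m := splitRef("custom-provider/id")
	fmt.Println(p) // custom-provider
	fmt.Println(m) // id
}
```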
## Advanced Features

### Tool Calling
Section titled “Tool Calling”OpenAI models support tool calling:
```go
// Define a tool
weatherTool := genkit.DefineTool(g, "get_weather", "Get current weather",
	func(ctx *ai.ToolContext, input struct{ City string }) (string, error) {
		return fmt.Sprintf("It's sunny in %s", input.City), nil
	})

// Use with GPT models
model := oai.Model(g, "gpt-4o")
resp, err := genkit.Generate(ctx, g,
	ai.WithModel(model),
	ai.WithPrompt("What's the weather like in San Francisco?"),
	ai.WithTools(weatherTool),
)
```
### Multimodal Support

OpenAI models support vision capabilities:
```go
// Works with GPT-4o models
resp, err := genkit.Generate(ctx, g,
	ai.WithModel(model),
	ai.WithMessages(
		ai.NewUserMessage(
			ai.NewTextPart("What do you see in this image?"),
			ai.NewMediaPart("image/jpeg", imageData),
		),
	),
)
```
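Image content is commonly supplied as a base64 data URI, though the exact format `NewMediaPart` accepts may differ from this sketch. A stdlib example of building one — `dataURI` is our helper, not a Genkit function:

```go
package main

import (
	"encoding/base64"
	"fmt"
)

// dataURI packages raw bytes as a base64 data URI with the given
// content type.
func dataURI(contentType string, raw []byte) string {
	return "data:" + contentType + ";base64," + base64.StdEncoding.EncodeToString(raw)
}

func main() {
	raw := []byte{0xFF, 0xD8, 0xFF} // stand-in for real JPEG bytes
	fmt.Println(dataURI("image/jpeg", raw)) // data:image/jpeg;base64,/9j/
}
```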
### Streaming

OpenAI models support streaming responses:
```go
resp, err := genkit.Generate(ctx, g,
	ai.WithModel(model),
	ai.WithPrompt("Write a long explanation."),
	ai.WithStreaming(func(ctx context.Context, chunk *ai.ModelResponseChunk) error {
		for _, content := range chunk.Content {
			fmt.Print(content.Text)
		}
		return nil
	}),
)
```
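Instead of printing each chunk, the same callback loop can accumulate the streamed text into a single string. A self-contained sketch using a stand-in `part` type for the text parts carried in each chunk's `Content`:

```go
package main

import (
	"fmt"
	"strings"
)

// part stands in for a chunk's text part: each streamed chunk
// carries one or more of these.
type part struct{ Text string }

// accumulate mirrors the callback's inner loop, collecting every
// chunk's text into one string.
func accumulate(chunks [][]part) string {
	var full strings.Builder
	for _, c := range chunks {
		for _, p := range c {
			full.WriteString(p.Text)
		}
	}
	return full.String()
}

func main() {
	chunks := [][]part{
		{{Text: "Hello, "}},
		{{Text: "streaming "}, {Text: "world!"}},
	}
	fmt.Println(accumulate(chunks)) // Hello, streaming world!
}
```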