
Azure Foundry Plugin

This plugin enables you to use Azure OpenAI APIs with Genkit. Azure AI Foundry provides access to powerful OpenAI models (GPT-5, GPT-4, etc.) through Azure’s infrastructure. The plugin supports text generation, embeddings, image generation, text-to-speech, speech-to-text, streaming, tool calling, and multimodal inputs, all with flexible authentication options including API keys, Managed Identity, and Azure CLI.

Install the plugin in your project with npm or pnpm:

npm install genkitx-azure-openai

This plugin exposes the same model interface as the OpenAI plugin.

You’ll also need an Azure OpenAI instance. You can deploy one from the Azure Portal by following this guide.

Once you have your instance running, make sure you have the endpoint and key. You can find them in the Azure Portal, under the “Keys and Endpoint” section of your instance.

You can then define the following environment variables to use the service:

AZURE_OPENAI_ENDPOINT=<YOUR_ENDPOINT>
AZURE_OPENAI_API_KEY=<YOUR_KEY>
OPENAI_API_VERSION=<YOUR_API_VERSION>

Alternatively, you can pass the values directly to the azureOpenAI constructor:

import { azureOpenAI, gpt5 } from 'genkitx-azure-openai';
import { genkit } from 'genkit';

const apiVersion = '2024-10-21';

const ai = genkit({
  plugins: [
    azureOpenAI({
      apiKey: '<your_key>',
      endpoint: '<your_endpoint>',
      deployment: '<your_deployment_name>',
      apiVersion,
    }),
    // other plugins
  ],
  model: gpt5,
});

If you’re using Azure Managed Identity, you can also pass the credentials directly to the constructor:

import { azureOpenAI, gpt5 } from 'genkitx-azure-openai';
import { genkit } from 'genkit';
import { DefaultAzureCredential, getBearerTokenProvider } from '@azure/identity';

const apiVersion = '2024-10-21';

const credential = new DefaultAzureCredential();
const scope = 'https://cognitiveservices.azure.com/.default';
const azureADTokenProvider = getBearerTokenProvider(credential, scope);

const ai = genkit({
  plugins: [
    azureOpenAI({
      azureADTokenProvider,
      endpoint: '<your_endpoint>',
      deployment: '<your_deployment_name>',
      apiVersion,
    }),
    // other plugins
  ],
  model: gpt5,
});

Key features:

  • Text Generation: Support for GPT models (GPT-5, GPT-4, etc.)
  • Embeddings: Support for text-embedding models
  • Streaming: Full streaming support for real-time responses
  • Tool Calling: Complete function calling capabilities
  • Multimodal Support: Support for text + image inputs
  • Flexible Authentication: Support for API keys, Managed Identity, and Azure CLI

For more Genkit features like embeddings, structured output, and flows, refer to the Genkit documentation.

The Azure AI Foundry plugin for Genkit Go provides text generation and chat capabilities using Azure OpenAI and other models available through Azure AI Foundry.

Install the plugin with:
go get github.com/xavidop/genkit-azure-foundry-go

Key features:

  • Text Generation: Support for GPT models
  • Embeddings: Support for text-embedding models
  • Image Generation: Support for creating images from text prompts
  • Text-to-Speech: Convert text to natural-sounding speech with multiple voices
  • Speech-to-Text: Transcribe audio to text with subtitle support
  • Streaming: Full streaming support for real-time responses
  • Tool Calling: Complete function calling capabilities
  • Multimodal Support: Support for text + image inputs
  • Multi-turn Conversations: Full support for chat history and context management
  • Type Safety: Robust type conversion and schema validation
  • Flexible Authentication: Support for API keys, Azure Default Credential, and custom token credentials
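
The following basic example initializes the plugin, defines a model for your deployment, and generates text: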
package main

import (
    "context"
    "log"
    "os"

    "github.com/firebase/genkit/go/ai"
    "github.com/firebase/genkit/go/genkit"

    azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)

func main() {
    ctx := context.Background()

    // Initialize Azure AI Foundry plugin
    azurePlugin := &azureaifoundry.AzureAIFoundry{
        Endpoint: os.Getenv("AZURE_OPENAI_ENDPOINT"),
        APIKey:   os.Getenv("AZURE_OPENAI_API_KEY"),
    }

    // Initialize Genkit
    g := genkit.Init(ctx,
        genkit.WithPlugins(azurePlugin),
        genkit.WithDefaultModel("azureaifoundry/gpt-5"),
    )

    // Optional: Define common models for easy access
    azureaifoundry.DefineCommonModels(azurePlugin, g)

    log.Println("Starting basic Azure AI Foundry example...")

    // Define a GPT-5 model (use your deployment name)
    gpt5Model := azurePlugin.DefineModel(g, azureaifoundry.ModelDefinition{
        Name:          "gpt-5", // Your deployment name in Azure
        Type:          "chat",
        SupportsMedia: true,
    }, nil)

    // Example: Generate text (basic usage)
    response, err := genkit.Generate(ctx, g,
        ai.WithModel(gpt5Model),
        ai.WithPrompt("What are the key benefits of using Azure AI Foundry?"),
    )
    if err != nil {
        log.Printf("Error: %v", err)
    } else {
        log.Printf("Response: %s", response.Text())
    }
}
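
The feature list above also mentions streaming. Here is a minimal streaming sketch for the same setup; it assumes Genkit Go’s standard ai.WithStreaming callback option rather than anything specific to this plugin:

// Stream the response; the callback is invoked for each partial chunk.
streamed, err := genkit.Generate(ctx, g,
    ai.WithModel(gpt5Model),
    ai.WithPrompt("Summarize the key benefits of Azure AI Foundry in three bullet points."),
    ai.WithStreaming(func(ctx context.Context, chunk *ai.ModelResponseChunk) error {
        // Handle each chunk of text as it arrives
        log.Printf("chunk: %s", chunk.Text())
        return nil
    }),
)
if err != nil {
    log.Printf("Error: %v", err)
} else {
    // The complete response is still available once streaming finishes
    log.Printf("Final response: %s", streamed.Text())
}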

The plugin supports various configuration options:

azurePlugin := &azureaifoundry.AzureAIFoundry{
    Endpoint: "https://your-resource.openai.azure.com/",
    APIKey:   "your-api-key", // Use API key
    // OR use Azure credential
    // Credential: azidentity.NewDefaultAzureCredential(),
    APIVersion: "2024-02-15-preview", // Optional
}

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| Endpoint | string | required | Azure OpenAI endpoint URL |
| APIKey | string | "" | API key for authentication |
| Credential | azcore.TokenCredential | nil | Azure credential (alternative to API key) |
| APIVersion | string | Latest | API version to use |

To find your endpoint and API key:

  1. Go to Azure Portal
  2. Navigate to your Azure OpenAI resource
  3. Go to “Keys and Endpoint” section
  4. Copy your endpoint URL and API key

The plugin supports multiple authentication methods to suit different deployment scenarios:

1. API Key Authentication

Best for: Development, testing, and simple scenarios

Set your endpoint and API key as environment variables:
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"
export AZURE_OPENAI_API_KEY="your-api-key"

Then read them when initializing the plugin:

import (
    "os"

    azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)

azurePlugin := &azureaifoundry.AzureAIFoundry{
    Endpoint: os.Getenv("AZURE_OPENAI_ENDPOINT"),
    APIKey:   os.Getenv("AZURE_OPENAI_API_KEY"),
}
2. Azure Default Credential (Recommended for Production)

Best for: Production deployments, Azure-hosted applications

DefaultAzureCredential automatically tries multiple authentication methods in the following order:

  1. Environment variables (AZURE_CLIENT_ID, AZURE_CLIENT_SECRET, AZURE_TENANT_ID)
  2. Managed Identity (when deployed to Azure)
  3. Azure CLI credentials (for local development)
  4. Azure PowerShell credentials
  5. Interactive browser authentication
Set the required environment variables:
# Required environment variables
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"
export AZURE_TENANT_ID="your-tenant-id"
# Optional: For service principal authentication
export AZURE_CLIENT_ID="your-client-id"
export AZURE_CLIENT_SECRET="your-client-secret"

Then create the credential and pass it to the plugin:

import (
    "fmt"
    "os"

    "github.com/Azure/azure-sdk-for-go/sdk/azidentity"

    azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)

func main() {
    endpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")
    tenantID := os.Getenv("AZURE_TENANT_ID")

    // Create DefaultAzureCredential
    credential, err := azidentity.NewDefaultAzureCredential(&azidentity.DefaultAzureCredentialOptions{
        TenantID: tenantID,
    })
    if err != nil {
        fmt.Fprintf(os.Stderr, "ERROR: %s\n", err)
        return
    }

    // Initialize plugin with credential
    azurePlugin := &azureaifoundry.AzureAIFoundry{
        Endpoint:   endpoint,
        Credential: credential,
    }

    // Use the plugin with Genkit...
}

3. Managed Identity

Best for: Applications deployed to Azure (App Service, Container Apps, VMs, AKS)

When deployed to Azure, Managed Identity provides authentication without storing credentials:

import (
    "os"

    "github.com/Azure/azure-sdk-for-go/sdk/azidentity"

    azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)

func main() {
    endpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")

    // Use Managed Identity
    credential, err := azidentity.NewManagedIdentityCredential(nil)
    if err != nil {
        panic(err)
    }

    azurePlugin := &azureaifoundry.AzureAIFoundry{
        Endpoint:   endpoint,
        Credential: credential,
    }
}

4. Client Secret Credential (Service Principal)

Best for: CI/CD pipelines, automated deployments

Configure the service principal via environment variables:
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"
export AZURE_TENANT_ID="your-tenant-id"
export AZURE_CLIENT_ID="your-client-id"
export AZURE_CLIENT_SECRET="your-client-secret"

Then create the credential in your code:

import (
    "os"

    "github.com/Azure/azure-sdk-for-go/sdk/azidentity"

    azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)

func main() {
    endpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")
    tenantID := os.Getenv("AZURE_TENANT_ID")
    clientID := os.Getenv("AZURE_CLIENT_ID")
    clientSecret := os.Getenv("AZURE_CLIENT_SECRET")

    credential, err := azidentity.NewClientSecretCredential(tenantID, clientID, clientSecret, nil)
    if err != nil {
        panic(err)
    }

    azurePlugin := &azureaifoundry.AzureAIFoundry{
        Endpoint:   endpoint,
        Credential: credential,
    }
}

5. Azure CLI Credential (Local Development)

Best for: Local development with Azure CLI installed

# Login to Azure CLI first
az login
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"

Then use the Azure CLI credential in your code:

import (
    "os"

    "github.com/Azure/azure-sdk-for-go/sdk/azidentity"

    azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)

func main() {
    endpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")

    // Use Azure CLI credentials
    credential, err := azidentity.NewAzureCLICredential(nil)
    if err != nil {
        panic(err)
    }

    azurePlugin := &azureaifoundry.AzureAIFoundry{
        Endpoint:   endpoint,
        Credential: credential,
    }
}

Important: The Name in ModelDefinition should match your deployment name in Azure, not the model name. For example:

  • If you deployed gpt-5 with deployment name my-gpt5-deployment, use "my-gpt5-deployment"
  • If you deployed gpt-4o with deployment name gpt-4o, use "gpt-4o"
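
As a minimal sketch, here is the same DefineModel call from the basic example pointed at a hypothetical deployment named my-gpt5-deployment (replace the name with your own deployment):

// "my-gpt5-deployment" is a hypothetical deployment name; use yours.
customModel := azurePlugin.DefineModel(g, azureaifoundry.ModelDefinition{
    Name:          "my-gpt5-deployment", // Azure deployment name, not the underlying model name
    Type:          "chat",
    SupportsMedia: true,
}, nil)

response, err := genkit.Generate(ctx, g,
    ai.WithModel(customModel),
    ai.WithPrompt("Hello from a custom deployment!"),
)
if err != nil {
    log.Fatal(err)
}
log.Println(response.Text())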

For more Genkit features like embeddings, structured output, and flows, refer to the Genkit documentation.

Generate images with DALL-E models using the standard genkit.Generate() method:

// Define DALL-E model
dallE3 := azurePlugin.DefineModel(g, azureaifoundry.ModelDefinition{
    Name: azureaifoundry.ModelDallE3,
    Type: "chat",
}, nil)

// Generate image
response, err := genkit.Generate(ctx, g,
    ai.WithModel(dallE3),
    ai.WithPrompt("A serene landscape with mountains at sunset"),
    ai.WithConfig(map[string]interface{}{
        "quality": "hd",
        "size":    "1024x1024",
        "style":   "vivid",
    }),
)
if err != nil {
    log.Fatal(err)
}
log.Printf("Image URL: %s", response.Text())

The configuration options for image generation depend on the model used. Common options include:

| Option | Type | Description |
| --- | --- | --- |
| quality | string | Image quality: standard, hd |
| size | string | Image size: 256x256, 512x512, 1024x1024 |
| style | string | Image style: vivid, photorealistic, cartoon |

Convert text to speech using the standard genkit.Generate() method:

import "encoding/base64"
// Define TTS model
ttsModel := azurePlugin.DefineModel(g, azureaifoundry.ModelDefinition{
Name: azureaifoundry.ModelTTS1HD,
Type: "chat",
}, nil)
// Generate speech
response, err := genkit.Generate(ctx, g,
ai.WithModel(ttsModel),
ai.WithPrompt("Hello! Welcome to Azure AI Foundry."),
ai.WithConfig(map[string]interface{}{
"voice": "nova",
"response_format": "mp3",
"speed": 1.5,
}),
)
if err != nil {
log.Fatal(err)
}
// Decode base64 audio and save file
audioData, _ := base64.StdEncoding.DecodeString(response.Text())
os.WriteFile("output.mp3", audioData, 0644)

Transcribe audio to text using the standard genkit.Generate() method:

import "encoding/base64"
// Define Whisper model with media support (required for audio input)
whisperModel := azurePlugin.DefineModel(g, azureaifoundry.ModelDefinition{
Name: azureaifoundry.ModelWhisper1,
Type: "chat",
SupportsMedia: true, // Required for media parts (audio)
}, nil)
// Read and encode audio file
audioData, _ := os.ReadFile("audio.mp3")
base64Audio := base64.StdEncoding.EncodeToString(audioData)
// Transcribe audio
response, err := genkit.Generate(ctx, g,
ai.WithModel(whisperModel),
ai.WithMessages(ai.NewUserMessage(
ai.NewMediaPart("audio/mp3", "data:audio/mp3;base64,"+base64Audio),
)),
ai.WithConfig(map[string]interface{}{
"language": "en",
}),
)
if err != nil {
log.Fatal(err)
}
log.Printf("Transcription: %s", response.Text())