Quickstart for AI configs

Read time: 9 minutes
Last edited: Dec 03, 2024

Overview

This topic explains how to get started with the LaunchDarkly AI configs product.

You can use AI configs to customize, test, and roll out new large language models (LLMs) and prompts within your generative AI applications. AI configs may be right for you if you want to:

  • manage your prompts outside of your application code
  • upgrade your app to the newest model version, then gradually release to your customers
  • start using a new model provider, and progressively move your production traffic to the new provider
  • compare the performance of different prompts and models

With AI configs, both the prompt messages and the model evaluation are specific to each end user at runtime. This means you can update your prompts without redeploying your application. Follow the steps below to incorporate AI configs into your app.

Step 1, in LaunchDarkly: Create an AI config

First, create an AI config:

  1. Click Create and choose AI config.
  2. In the "Create AI config" dialog, give your AI config a human-readable Name.
  3. Click Create.

The empty Versions tab of your new AI config displays:

The Versions tab of a newly-created AI config.

Then, create a version. Every AI config has one or more versions, each of which includes your AI prompt messages and model configuration.

Here's how:

  1. In the create panel in the Versions tab, replace "Untitled version" with a version Name. You'll use this to refer to the versions when you set up targeting rules, below.
  2. Click Select a model and choose the model to use. LaunchDarkly provides a list of common models.
  3. Select a prompt role and enter the message for the prompt. If you'd like to customize the message at runtime, use {{ example_variable }} or {{ ldctx.example_context_attribute }} within the message. The LaunchDarkly AI SDK will substitute the correct values when you customize the AI config from within your app.
  4. Click Save changes.

Here's an example of a completed version:

The Versions tab of an AI config with one version.
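
The prompt message in a version like this uses the template syntax described above. Based on the customized output shown in step 4, the system message for this example version would look something like the following, where {{ ldctx.name }} and {{ example_variable }} are filled in at runtime; the exact attribute names are illustrative:

You are a model designed to tell knock-knock jokes. Each joke should be addressed to {{ ldctx.name }} and include at least one {{ example_variable }}.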

Step 2, in LaunchDarkly: Set up targeting rules

Next, set up targeting rules for your AI config. These rules determine which of your customers receive a particular version of your AI config.

To specify the AI config version to use by default when the AI config is toggled on:

  1. Select the Targeting tab for your AI config.
  2. In the "Default rule" section, click Edit.
  3. Configure the default rule to serve a specific version.

Here is an example of a default rule:

An AI config's default rule.

When you're ready to add more AI config versions, you can come back to this step and set up additional targeting rules. If you are familiar with LaunchDarkly's flag targeting, this process is very similar: with AI configs, you can target individuals or segments, or target contexts with custom rules. To learn how, read Target with AI configs.

Step 3, in your app: Install an AI SDK

First, install one of the LaunchDarkly AI SDKs in your app:

npm install @launchdarkly/server-sdk-ai
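
The example code below also imports from the LaunchDarkly Node.js server-side SDK. If that package is not already part of your project, you will likely need to install it as well; this extra step is an assumption about your setup:

npm install @launchdarkly/node-server-sdk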

Next, import the LaunchDarkly AI client in your app and initialize a single, shared instance of it:

import { init, LDClient, LDContext } from '@launchdarkly/node-server-sdk';
import { initAi, LDAIClient, LDAIConfig } from '@launchdarkly/server-sdk-ai';

const ldClient: LDClient = init('sdk-key-123abc');

try {
  await ldClient.waitForInitialization({ timeout: 10 });
  // Initialization complete
} catch (error) {
  // Timeout or the SDK failed to initialize
}

const aiClient: LDAIClient = initAi(ldClient);

Then, set up the context. Contexts are the people or resources who will encounter generated AI content in your application. The context attributes determine which version of the AI config LaunchDarkly serves to the end user, based on the targeting rules in your AI config. If you are using template variables in the prompt messages in your AI config's versions, the context attributes also fill in values for the template variables.

Here's how:

const context: LDContext = {
  kind: 'user',
  key: 'example-user-key',
  name: 'Sandy',
};
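
If your prompt messages reference context attributes with {{ ldctx.attribute_name }} template variables, include those attributes on the context you construct. For example, a context that supplies a value for the hypothetical {{ ldctx.example_context_attribute }} variable from step 1 might look like this:

const contextWithAttributes: LDContext = {
  kind: 'user',
  key: 'example-user-key',
  name: 'Sandy',
  // Hypothetical attribute referenced by {{ ldctx.example_context_attribute }}
  example_context_attribute: 'example value',
};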

Step 4, in your app: Customize the AI config, track metrics

In your code, use the config function from the LaunchDarkly AI SDK to customize the AI config. This function returns the customized prompt and model along with a tracker instance for recording prompt metrics. Customization means that any template variables you include in the prompt messages when you define the AI config version are filled in with the context attributes and variables you pass to the config function. You can then pass the customized messages directly to your AI provider. You should also set up a fallback value to use in case of an error.

Here's how:

const fallbackConfig = {
  model: {
    id: 'my-default-model',
    parameters: { name: 'My default model' },
  },
  messages: [{ role: 'system', content: '' }],
  provider: { id: 'my-default-provider' },
  enabled: true,
};

const aiConfig: LDAIConfig = await aiClient.config(
  'ai-config-key-123abc',
  context,
  fallbackConfig,
  { 'example_variable': 'puppy' },
);
// Based on the example AI config version shown in step 1,
// aiConfig.messages[0].content will be:
// "You are a model designed to tell knock-knock jokes. Each joke should be addressed to Sandy and include at least one puppy."

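Because the AI config and its fallback value include an enabled flag, a common pattern is to check it before generating content. This is a minimal sketch, not a required usage pattern; how you handle a disabled config is up to your application:

if (aiConfig.enabled) {
  // Pass aiConfig.messages and aiConfig.model to your AI provider here.
} else {
  // The AI config is toggled off or could not be evaluated.
  // Skip generation or fall back to non-AI behavior.
}
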
Then, use one of the track[Model]Metrics functions to record metrics from your AI model generation. LaunchDarkly provides specific functions for several common AI model families, and an option to record this information yourself.

Here's how:

const { tracker } = aiConfig;

const completion = await tracker.trackOpenAIMetrics(
  // Pass in the result of the OpenAI operation.
  // When you call the OpenAI operation, use details from aiConfig.
  // For instance, you can pass aiConfig.messages
  // and aiConfig.model to your specific OpenAI operation.
  //
  // For a complete example, visit https://github.com/launchdarkly/js-core/tree/main/packages/sdk/server-ai/examples/openai.
);
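
As a rough illustration only, a filled-in version of the call above might look something like the sketch below. It assumes the openai npm package and assumes that trackOpenAIMetrics accepts an async callback that performs the completion; refer to the linked example above for the complete, supported pattern.

import OpenAI from 'openai';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Assumption: trackOpenAIMetrics wraps an async callback so the SDK can
// record metrics around the completion call.
const completion = await tracker.trackOpenAIMetrics(async () =>
  openai.chat.completions.create({
    model: aiConfig.model?.id ?? 'my-default-model',
    messages: aiConfig.messages ?? [],
  }),
);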

Step 5, in LaunchDarkly: Enable the AI config

Now that your application is set up, return to the LaunchDarkly user interface and enable your AI config. On the Targeting tab, toggle the AI config On. Then click Review and save.

An AI config, toggled "On."

When an end user opens your application, they'll get the AI config version you've defined, either in the default rule or in a custom targeting rule. Your app's AI content generation will use the model and prompt from the AI config version, and the prompt messages will be customized for each end user.

Step 6, in LaunchDarkly: Monitor the AI config

Select the Monitoring tab for your AI config. As end users use your application, LaunchDarkly monitors the performance of your AI configs. Metrics are updated approximately every minute.

Learn more about AI configs

The following sections provide answers to common questions about working with AI configs.

Integration with AI providers

In the AI configs product, LaunchDarkly does not handle the integration with the AI provider. The LaunchDarkly AI SDKs provide your application with model configuration details, including customized prompt messages and model parameters such as temperature and tokens. It is your application's responsibility to pass this information to the AI provider.

The LaunchDarkly AI SDKs provide methods to help you track how your AI model generation is performing, and in some cases, these methods take the completion from common AI providers as a parameter. However, it is still your application's responsibility to call the AI provider. To learn more, read Tracking AI metrics.

Privacy and personally identifiable information (PII)

LaunchDarkly does not send any of the information you provide to any models, and does not use any of the information to fine-tune any models.

You should follow your own organization's policies regarding if or when it may be acceptable to send end-user data either to LaunchDarkly or to an AI provider. To learn more, read AI configs and information privacy.

Availability of new models

When you create a new AI config version, you must select a model from the provided list. LaunchDarkly updates this list regularly. To request a new model, click the Give feedback option and let us know what models you'd like to have included.

What's next

We're continuing to build out the AI configs product. As the AI configs early access expands, we're committed to adding the following improvements:

  • Additional AI SDKs, including Java and Go
  • Support for approvals, so that account members have the option to require approval requests for changes to an AI config's versions or targeting
  • Support for viewing differences between AI config versions