Customizing AI configs
Read time: 5 minutes
Last edited: Dec 03, 2024
The AI configs product is only available in early access for customers on Foundation and Enterprise plans. To request early access, navigate to AI configs and join the waitlist.
The AI SDKs are designed for use with the AI configs product. The AI SDKs are currently in an alpha version.
Overview
This topic explains how to customize an AI config. This feature is available for AI SDKs only.
About AI configs
The AI config feature customizes an AI config based on the context that you provide. This means both the prompt and the model evaluation in your generative AI application are specific to each end user, at runtime.
This feature requires an AI config key, the context that encounters the AI config in your application, a fallback value, and, optionally, additional variables used in the prompt template.
This feature returns the value of the customized AI config version to use for this context, based on the AI config's targeting rules. This version includes the model and a customized prompt for the context. Then, you can use the value of the AI config version in your AI model generation.
Customizing the prompt means that LaunchDarkly substitutes context attributes and optional variables into the prompt messages associated with the AI config version. If the prompt has multiple messages, for example, different messages for different roles, they are all customized and returned. If the prompt cannot be retrieved, for example because the AI config key does not exist, or because the SDK cannot connect to LaunchDarkly, then the SDK returns the fallback value.
The AI config feature adds a context to the Contexts list, if a context with the same key does not already exist. However, each SDK customizes the AI config based only on the object you provide in the evaluation call. In other words, the SDK does not automatically use the attributes shown on the Contexts list, and attributes are not synchronized across SDK instances. You must provide all relevant attributes for each customization in order for variables in your prompt to be substituted correctly. To learn more, read Context configuration.
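The substitution behavior described above can be sketched in plain Python. This is an illustrative stand-in, not the SDK's implementation: it assumes a Mustache-style `{{variable}}` placeholder syntax and uses plain dictionaries for messages, context attributes, and custom variables.

```python
import re


def customize_messages(messages, context_attrs, variables):
    """Substitute {{placeholders}} in each prompt message.

    Illustrative only: the real SDK performs this step for you.
    Context attributes and caller-supplied variables are merged into
    one namespace, and every message in the prompt is customized.
    """
    namespace = {**context_attrs, **variables}

    def substitute(text):
        # Replace each {{name}} with its value; leave unknown names as-is.
        return re.sub(
            r"\{\{\s*(\w+)\s*\}\}",
            lambda m: str(namespace.get(m.group(1), m.group(0))),
            text,
        )

    return [{**msg, "content": substitute(msg["content"])} for msg in messages]


messages = [
    {"role": "system", "content": "You are a helpful assistant for {{name}}."},
    {"role": "user", "content": "Summarize {{topic}} in one sentence."},
]
customized = customize_messages(
    messages,
    context_attrs={"name": "Sandy"},
    variables={"topic": "feature flags"},
)
# customized[0]["content"] is now "You are a helpful assistant for Sandy."
```

Note that both messages are customized from the same merged namespace, which is why you must pass every attribute the prompt template references on each call.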
AI SDKs
This feature is available for all of the AI SDKs:
.NET (AI)

The `Config` function customizes the AI config. This function returns the customized prompt and model. Customization means that any variables you include in the prompt messages when you define the AI config version have their values set to the context attributes and variables you pass to `Config`. Then, you can pass the customized prompt directly to your AI.
`Config` calls take the AI config key, a `Context`, a fallback value, and optional additional variables to substitute into your prompt.
The fallback value is the value of the AI config version that your application should use in the event of an error, for example, if the AI config key is not valid, or if there is a problem connecting to LaunchDarkly. You can use `LdAiConfig.Disabled` as a fallback value, and then check for this during your application's error handling. Alternatively, you can create a custom `LdAiConfig` object using `LdAiConfig.New()`.
Here is an example of calling the `Config` method:
```csharp
var fallbackConfig = LdAiConfig.New()
    .SetModelId("my-default-model")
    .SetModelParam("name", LdValue.Of("My default model"))
    .AddMessage("", Role.System)
    .SetModelProviderId("my-default-provider")
    .SetEnabled(true)
    .Build();

var tracker = aiClient.Config(
    "ai-config-key-123abc",
    context,
    fallbackConfig,
    new Dictionary<string, object> { { "exampleCustomVariable", "exampleCustomValue" } });
```
After the function call, you can view the context that you provided to it on the Contexts list.
To learn more, read `Config`.
Node.js (AI)
The `config` function customizes the AI config. This function returns the customized prompt and model. Customization means that any variables you include in the prompt messages when you define the AI config version have their values set to the context attributes and variables you pass to `config`. Then, you can pass the customized prompt directly to your AI.
`config` calls take the AI config key, a `Context`, a fallback value, and optional additional variables to substitute into your prompt.
The fallback value is the value of the AI config version that your application should use in the event of an error, for example, if the AI config key is not valid, or if there is a problem connecting to LaunchDarkly. You can use an empty JSON object as a fallback value, and then check for this during your application's error handling. Alternatively, you can construct a JSON object, for example with values from one of the AI config versions you have created in the LaunchDarkly UI.
Here is an example of calling the `config` method:
```typescript
const fallbackConfig = {
  model: {
    id: 'my-default-model',
    parameters: { name: 'My default model' },
  },
  messages: [{ role: 'system', content: '' }],
  provider: { id: 'my-default-provider' },
  enabled: true,
};

const aiConfig: LDAIConfig = aiClient.config(
  'ai-config-key-123abc',
  context,
  fallbackConfig,
  { exampleCustomVariable: 'exampleCustomValue' },
);
```
After the function call, you can view the context that you provided to it on the Contexts list.
To learn more, read `config`.
Python (AI)
The `config` function customizes the AI config. This function returns the customized prompt and model along with a tracker instance for recording prompt metrics. Customization means that any variables you include in the prompt messages when you define the AI config version have their values set to the context attributes and variables you pass to `config`. Then, you can pass the customized prompt directly to your AI.
`config` calls take the AI config key, a `Context`, a fallback value, and optional additional variables to substitute into your prompt.
The fallback value is the value of the AI config version that your application should use in the event of an error, for example, if the AI config key is not valid, or if there is a problem connecting to LaunchDarkly. You can use an empty JSON object as a fallback value, and then check for this during your application's error handling. Alternatively, you can construct a JSON object, for example with values from one of the AI config versions you have created in the LaunchDarkly UI.
Here is an example of calling the `config` method:
```python
key = 'ai-config-key-123abc'
context = Context.builder('context-key-123abc').kind('user').set('name', 'Sandy').build()
fallback_value = AIConfig(
    enabled=True,
    model=ModelConfig(
        id="my-default-model",
        parameters={"name": "My default model"},
    ),
    messages=[LDMessage(role="system", content="")],
    provider=Provider(id="my-default-provider"),
)
variables = {'example_custom_variable': 'example_custom_value'}

config, tracker = aiclient.config(key, context, fallback_value, variables)
```
After the function call, you can view the context that you provided to it on the Contexts list.
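The error-handling pattern described above, checking whether customization fell back to a disabled or empty value before calling your model, can be sketched as follows. This is a minimal illustration using plain dictionaries as hypothetical stand-ins for the SDK's config objects, not the SDK's own API:

```python
def should_call_model(config):
    """Return True only if we received a usable, enabled AI config.

    `config` is a plain dict standing in for the SDK's config object.
    An empty dict (the empty-object fallback) or enabled=False means
    customization fell back, so the application skips the model call.
    """
    return bool(config) and config.get("enabled", False) and bool(config.get("messages"))


# Customization succeeded: the returned value has messages and is enabled.
live = {"enabled": True, "messages": [{"role": "system", "content": "Hi"}]}
# Customization failed and the empty-object fallback came back instead.
fallback = {}

assert should_call_model(live)
assert not should_call_model(fallback)
```

Gating the model call this way keeps your application from sending an empty or disabled prompt to your AI provider when LaunchDarkly is unreachable.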
To learn more, read `config`.