Python (AI) SDK reference
Read time: 5 minutes
Last edited: Dec 19, 2024
The AI configs product is only available in early access for customers on select plans. To request early access, navigate to AI configs and join the waitlist.
The AI SDKs are designed for use with the AI configs product. The Python (AI) SDK is currently in an alpha version.
Overview
This topic documents how to get started with the Python (AI) SDK, and links to reference information on all of the supported features.
LaunchDarkly's SDKs are open source. In addition to this reference guide, we provide source, API reference documentation, and sample applications:
| Resource | Location |
|---|---|
| SDK API documentation | SDK API docs |
| GitHub repository | python-server-sdk-ai |
| Sample application | Using Bedrock, Using OpenAI |
| Published module | PyPI |
Get started
LaunchDarkly AI SDKs interact with AI configs. AI configs are the LaunchDarkly resources that manage model configurations and messages for your generative AI applications.
This reference guide describes working specifically with the Python (AI) SDK. For a complete introduction to LaunchDarkly AI SDKs and how they interact with AI configs, read Quickstart for AI configs.
You can use the Python (AI) SDK to customize your AI config based on the context that you provide. This means both the messages and the model evaluation in your generative AI application are specific to each end user, at runtime. You can also use the AI SDKs to record metrics from your AI model generation, including duration and tokens.
Follow these instructions to start using the Python (AI) SDK in your Python application.
Understand version compatibility
The LaunchDarkly Python (AI) SDK is compatible with Python 3.8.0 and higher.
Install the SDK
First, install the AI SDK as a dependency in your application using your application's dependency manager. If you want to depend on a specific version, refer to the SDK releases page to identify the latest version.
Here's how:
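A minimal sketch using pip; the package name matches the PyPI listing linked above. If you use another dependency manager (Poetry, pipenv), add the same package there instead.

```shell
pip install launchdarkly-server-sdk-ai
```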
Next, import the LaunchDarkly `LDAIClient` into your application code:
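A minimal sketch, assuming the `ldai` module layout that the SDK's published package uses; confirm the import path against the SDK API docs for your installed version.

```python
# Import the AI client from the LaunchDarkly AI SDK package
from ldai.client import LDAIClient
```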
Initialize the client
After you install and import the AI SDK, create a single, shared instance of `LDAIClient`. Specify your SDK key here to authorize your application to connect to a particular environment within LaunchDarkly.
The Python (AI) SDK uses an SDK key. Keys are specific to each project and environment. They are available from the Environments list for each project. To learn more about key types, read Keys.
Here's how:
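A minimal sketch of initialization. It assumes the AI client wraps an initialized instance of the base LaunchDarkly Python server-side SDK (`ldclient`); the SDK key shown is a placeholder.

```python
import ldclient
from ldclient.config import Config
from ldai.client import LDAIClient

# 'sdk-key-123abc' is a placeholder; use the SDK key for your
# LaunchDarkly project and environment.
ldclient.set_config(Config('sdk-key-123abc'))

# Create a single, shared AI client that wraps the base client.
ai_client = LDAIClient(ldclient.get())
```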
Configure the context
Next, configure the context that will use the AI config, that is, the context that will encounter generated AI content in your application. The context attributes determine which variation of the AI config LaunchDarkly serves to the end user, based on the targeting rules in your AI config. If you are using template variables in the messages in your AI config's variations, the context attributes also fill in values for the template variables.
Here's how:
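A sketch of building a context with the base SDK's `Context` builder. The key, kind, and attributes here are examples; use the attributes that your AI config's targeting rules and template variables actually reference.

```python
from ldclient import Context

# Build a context describing the end user who will encounter
# the generated AI content. 'user-key-123abc' is a placeholder.
context = (
    Context.builder('user-key-123abc')
    .kind('user')
    .name('Sandy')
    .build()
)
```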
Customize an AI config
Then, use `config` to customize the AI config. You need to call `config` each time you generate content from your AI model.

This function returns the customized messages and model along with a tracker instance for recording metrics. Customization means that any template variables you include in the messages when you define the AI config variation are filled in with the context attributes and variables you pass to `config`. You can then pass the customized messages directly to your AI.
The customization process within the AI SDK is similar to evaluating flags in one of LaunchDarkly's client-side, server-side, or edge SDKs, in that the SDK completes the customization without a separate network call.
Here's how:
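A sketch of customizing an AI config. The class and parameter names follow the alpha SDK at the time of writing (`AIConfig`, `ModelConfig`, `LDMessage`); the AI config key and the variable name are placeholders, and `ai_client` and `context` are the instances created in the previous steps. Verify the exact signatures against the SDK API docs.

```python
from ldai.client import AIConfig, ModelConfig, LDMessage

# Fallback value, served if the AI config cannot be retrieved.
fallback_value = AIConfig(
    enabled=True,
    model=ModelConfig(name='my-default-model'),
    messages=[LDMessage(role='system', content='')],
)

# 'ai-config-key-123abc' is a placeholder for your AI config's key.
# The final argument supplies values for any template variables in
# your AI config's messages; 'my_variable' is an example.
config_value, tracker = ai_client.config(
    'ai-config-key-123abc',
    context,
    fallback_value,
    {'my_variable': 'My variable value'},
)
```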
To learn more, read Customizing AI configs.
Record metrics from AI model generation
Finally, use one of the `track_[model]_metrics` functions to record metrics from your AI model generation.
Here's how:
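A sketch using the OpenAI convenience helper, which wraps the completion call and records duration, token usage, and generation success or failure. It assumes `openai_client` is an already-initialized OpenAI client and that `config_value` and `tracker` come from the `config` call above; the helper's name follows the alpha SDK, so verify it against the SDK API docs.

```python
# Wrap the model call so the SDK records metrics automatically.
completion = tracker.track_openai_metrics(
    lambda: openai_client.chat.completions.create(
        model=config_value.model.name,
        messages=[
            {'role': m.role, 'content': m.content}
            for m in config_value.messages
        ],
    )
)
```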
Alternatively, you can use the SDK's other `track*` functions to record these metrics manually. You may need to do this if you are using a model for which the SDK does not provide a convenience `track_[model]_metrics` function. The `track_[model]_metrics` functions expect a complete response, so you may also need to record metrics manually if your application requires streaming.

Make sure to call `config` each time you generate content from your AI model:
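A sketch of recording metrics manually with the tracker returned by `config`, for example after assembling a streamed response yourself. The function and class names follow the alpha SDK at the time of writing; confirm them against the SDK API docs before relying on them.

```python
from ldai.tracker import TokenUsage

# Record how long generation took, in milliseconds.
tracker.track_duration(730)

# Record token usage reported by your model provider.
tracker.track_tokens(TokenUsage(total=100, input=50, output=50))

# Record that generation completed successfully.
tracker.track_success()
```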
To learn more, read Tracking AI metrics.
Supported features
This SDK supports the following features: