LaunchDarkly Developer Documentation


Kinesis destination

Kinesis destinations let you export all of the flag evaluation and event data generated by LaunchDarkly SDKs into a Kinesis data stream. This document will help you create and test a Kinesis destination.

Configuring the Kinesis Destination

To set up a Kinesis destination, you'll first need to create the following in your AWS account:

  • A Kinesis stream for the data to go to (this can be a preexisting stream)
  • An IAM policy that allows putting records into that Kinesis stream
  • An IAM role that LaunchDarkly can assume, with that policy attached

If you don't already have a Kinesis stream, create one. Make a note of the region, the stream name, and the stream ARN.

Shard count

As a rule of thumb, provision one shard for every 50 million events you send to LaunchDarkly per month. This is a rough estimate based on typical usage patterns: if you expect spikes in usage, provision more shards. Adjust your shard count based on your actual usage at peak times.
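The rule of thumb above can be sketched as a small calculation. The 50-million figure comes from this document; the `headroom` multiplier is an illustrative knob for spiky traffic, not a LaunchDarkly recommendation:

```python
import math

EVENTS_PER_SHARD_PER_MONTH = 50_000_000  # rule of thumb from this document

def estimate_shards(monthly_events: int, headroom: float = 1.0) -> int:
    """Estimate a Kinesis shard count for a given monthly event volume.

    `headroom` > 1.0 over-provisions for usage spikes; tune it to your
    own peak-to-average traffic ratio.
    """
    return max(1, math.ceil(monthly_events * headroom / EVENTS_PER_SHARD_PER_MONTH))

# 120M events/month with 50% headroom for spikes
print(estimate_shards(120_000_000, headroom=1.5))  # → 4
```

Remember that this only sizes the stream for LaunchDarkly's writes; size for your consumers' read throughput as well.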

Next, create the IAM policy. Sign in to the Identity and Access Management (IAM) console and create an IAM policy that gives LaunchDarkly permission to write to your Kinesis stream. Select the Create Policy from JSON option and use the following template in the Policy Document field. Be sure to replace {region}, {account-id}, and {stream-name} with the applicable values, or copy in the stream ARN.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
          "kinesis:PutRecord",
          "kinesis:PutRecords"
      ],
      "Resource": [
          "arn:aws:kinesis:{region}:{account-id}:stream/{stream-name}"
      ]
    }
  ]
}
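If you script your AWS setup, a helper like the following (a hypothetical sketch, not part of LaunchDarkly's tooling) can fill in the placeholders and render the same policy document:

```python
import json

def kinesis_put_policy(region: str, account_id: str, stream_name: str) -> str:
    """Render the IAM policy template with {region}, {account-id} and
    {stream-name} substituted. Returns a JSON string ready to paste
    into the Policy Document field."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["kinesis:PutRecord", "kinesis:PutRecords"],
                "Resource": [
                    f"arn:aws:kinesis:{region}:{account_id}:stream/{stream_name}"
                ],
            }
        ],
    }
    return json.dumps(policy, indent=2)

# Example values; substitute your own region, account ID, and stream name.
print(kinesis_put_policy("us-east-1", "123456789012", "ld-events"))
```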

Be sure to make a note of the name of this policy.

Now you'll need to create a cross-account IAM role in the IAM console. In the Account ID field, enter 554582317989, as shown in the image below. This allows LaunchDarkly to assume the role.
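For reference, the trust policy on a cross-account role that allows LaunchDarkly's account to assume it typically has the standard AWS shape below. This is the generic cross-account pattern, shown here for illustration; the IAM role-creation wizard generates the actual policy for you when you enter the account ID:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::554582317989:root"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```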

Create a role in AWS IAM for LaunchDarkly to assume.

Now add the policy created previously to this role. Make a note of the role ARN.

With the region, stream name, and role ARN noted, you can now create the destination in LaunchDarkly. Navigate to the Integrations page. If this is your first destination, add a Kinesis destination by clicking the Kinesis icon; otherwise, use the add destination drop-down in the top right.

Add a Kinesis destination by clicking the Kinesis icon or by using the add destination drop-down.

Enter the AWS information you noted, then choose a name and an environment.

A Kinesis destination with AWS information, a name and environment selected.

Once you've saved the destination, you can send a test event to verify that the destination is configured properly.

Send a test event after saving the destination.

Partition Keys

Kinesis records have partition keys. When we create a record, we use the user key sent with the event as the partition key. If the event does not contain a user key, we generate a random partition key.

Consuming the Kinesis Data Stream

Kinesis Data Streams are extremely flexible. You can consume them with a Lambda function, write your own consumer application, or use one of a number of pre-built AWS integrations. A common use case is using the stream to populate a data store (for example, S3 or Redshift) by setting the Kinesis Data Stream as the source for a Kinesis Data Firehose.
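As one sketch of the Lambda approach, a minimal handler can decode the records in a Kinesis event before passing them to your own processing. The event shape (base64-encoded payloads under `Records[].kinesis.data`) is the standard AWS Lambda Kinesis integration; the handler body is illustrative:

```python
import base64
import json

def handler(event, context):
    """Minimal AWS Lambda handler for a Kinesis event source.

    Kinesis delivers record payloads base64-encoded; decode each one
    and parse it as JSON before handing it to your own logic.
    """
    decoded = []
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"])
        decoded.append(json.loads(payload))
    # Replace this with real processing, e.g. writing to S3 or a warehouse.
    return {"processed": len(decoded)}
```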
