Kinesis destination

Overview

You can export all of the flag evaluation and event data generated by LaunchDarkly SDKs to a Kinesis data stream.

This topic explains how to create and test a Kinesis destination for data export.

Prerequisites

To set up a Kinesis destination, you must configure your AWS account with the following information:

  • A Kinesis stream you want to export the data to. You can use an existing stream if you already have one.
  • An IAM policy that can put records into your Kinesis stream
  • An IAM role that LaunchDarkly can assume that provides access to the policy

If you don't already have a Kinesis stream, create one by reading Amazon's documentation.
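If you prefer to script stream creation, the following is a minimal sketch using Python and boto3. The region, stream name, and shard count shown here are placeholders; substitute your own values.

import boto3

# Placeholders: replace the region, stream name, and shard count with your own values
# (see the shard count recommendation below).
kinesis = boto3.client("kinesis", region_name="us-east-1")
kinesis.create_stream(StreamName="launchdarkly-data-export", ShardCount=1)

# Wait for the stream to become active, then capture the details you need to save.
kinesis.get_waiter("stream_exists").wait(StreamName="launchdarkly-data-export")
description = kinesis.describe_stream(StreamName="launchdarkly-data-export")["StreamDescription"]
print("Region:", "us-east-1")
print("Stream name:", description["StreamName"])
print("Stream ARN:", description["StreamARN"])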

Recommended shard count

As a best practice, we recommend that you use 1 shard for every 50 million events you send us per month.

This recommendation is an estimate based on typical usage patterns, not a requirement. To prevent performance degradation, adjust your shard count based on your actual usage at peak times. If you expect usage spikes, provision more shards. If you do not use enough shards for the load, AWS may throttle your data stream. If this happens, you may experience data loss.
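For example, here is a minimal sketch of that rule of thumb in Python; the monthly event volume is a placeholder, and the result is a starting point rather than a hard requirement.

import math

EVENTS_PER_SHARD_PER_MONTH = 50_000_000  # estimate from the recommendation above

monthly_events = 120_000_000  # placeholder: your expected monthly event volume
recommended_shards = max(1, math.ceil(monthly_events / EVENTS_PER_SHARD_PER_MONTH))
print(recommended_shards)  # 3 -- provision more if you expect usage spikes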

Creating the IAM Policy

This section explains how to create an IAM policy to support the Kinesis data export.

Save this information

If you create a new Kinesis stream, write down or save the following information:

  • Region
  • Stream name
  • Stream ARN

You will need this information when you configure your export.

  1. Sign in to the Identity and Access Management (IAM) console.
  2. Follow Amazon's documentation to create an IAM policy that allows LaunchDarkly to write to your Kinesis Stream.
  3. Select the Create Policy from JSON option.
  4. Use the following template policy to configure the Policy Document field.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
          "kinesis:PutRecord",
          "kinesis:PutRecords"
      ],
      "Resource": [
          "arn:aws:kinesis:{region}:{account-id}:stream/{stream-name}"
      ]
    }
  ]
}
  5. If you already have the stream ARN, you can paste the entire ARN into the Resource field. At a minimum, replace the following placeholders with the values you saved for your Kinesis stream:
    • {region}
    • {account-id}
    • {stream-name}
  6. Click Save.
  7. After you create the policy, write down or save the policy's name.
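If you would rather create the policy programmatically, here is a minimal sketch using Python and boto3. The policy name, account ID, region, and stream name are placeholders, and the policy document is the same template shown above with its placeholders filled in.

import json
import boto3

iam = boto3.client("iam")

# The same template as above, with {region}, {account-id}, and {stream-name} filled in.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["kinesis:PutRecord", "kinesis:PutRecords"],
            "Resource": ["arn:aws:kinesis:us-east-1:123456789012:stream/launchdarkly-data-export"],
        }
    ],
}

response = iam.create_policy(
    PolicyName="launchdarkly-data-export-policy",  # placeholder: save this name for the next section
    PolicyDocument=json.dumps(policy_document),
)
print("Policy ARN:", response["Policy"]["Arn"])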

Creating an IAM Role

This section explains how to create a cross-account IAM role for LaunchDarkly to assume so it can access the policy.

  1. Log into AWS and access the IAM section.
  2. In the Account ID section, enter 554582317989 as shown in the image below. This allows LaunchDarkly to assume the role.
Create a role in AWS IAM for LaunchDarkly to assume.

Restricting trust

It's safe and straightforward to set up the role as described above, but your organization may have security requirements limiting what kind of roles you can trust.

If your organization's security policies require you to trust a specific role, we've provided arn:aws:iam::554582317989:role/data-export for you to trust instead.

This role can only be assumed by our Data Exporter services. You can trust our role by following a procedure in Amazon's documentation. When you complete that procedure, set the Principal in the trust relationship to { "Principal": { "AWS": ["arn:aws:iam::554582317989:role/data-export"] } }.

  3. Add the policy you created earlier to this role. Write down or save the role ARN.
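As with the policy, you can script role creation. Here is a minimal sketch using Python and boto3; the role name and policy ARN are placeholders, and the trust policy mirrors what the console generates when you enter the LaunchDarkly account ID.

import json
import boto3

iam = boto3.client("iam")

# Trust policy that lets LaunchDarkly (account 554582317989) assume this role.
# To restrict trust as described above, replace the principal with
# "arn:aws:iam::554582317989:role/data-export".
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::554582317989:root"},
            "Action": "sts:AssumeRole",
        }
    ],
}

role = iam.create_role(
    RoleName="launchdarkly-data-export-role",  # placeholder name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# Attach the policy you created in the previous section (placeholder ARN).
iam.attach_role_policy(
    RoleName="launchdarkly-data-export-role",
    PolicyArn="arn:aws:iam::123456789012:policy/launchdarkly-data-export-policy",
)
print("Role ARN:", role["Role"]["Arn"])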

Creating the Kinesis Destination in LaunchDarkly

With the region, stream name and role ARN from earlier in the procedure, you can now create the Kinesis destination within LaunchDarkly.

  1. Navigate to the integrations page.
  2. Click on the Kinesis icon or, if this is not your first destination, click the Add Destination drop down.
Add a Kinesis destination by clicking the Kinesis icon if this is your first destination, or by using the Add Destination drop down.

  3. Enter a Name for the destination.
  4. Choose an Environment from the drop-down list.
  5. Enter the AWS data you saved earlier:
    • AWS Kinesis stream region
    • Role ARN
    • AWS Kinesis stream name
  6. Click Save.
A Kinesis destination with AWS information, a name and environment selected.

Testing the Kinesis Destination

After you've saved the destination, send a test event to confirm that the destination is configured properly.

  1. In the Send a test event section, click Send Event.
Send a test event after saving the destination.

  2. If you have configured the destination correctly, an event is logged in the Kinesis destination.
  3. Identify a successful test event by searching for the user or flag key LaunchDarklyDataExportTestEvent. When this key appears in the destination, you know the test event arrived.
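If you prefer to check for the test event from a script instead of another consumer, here is a minimal sketch using Python and boto3. The region and stream name are placeholders, and reading from TRIM_HORIZON is practical only for a stream that is new or lightly used.

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")  # placeholder region
stream = "launchdarkly-data-export"  # placeholder stream name

for shard in kinesis.list_shards(StreamName=stream)["Shards"]:
    iterator = kinesis.get_shard_iterator(
        StreamName=stream,
        ShardId=shard["ShardId"],
        ShardIteratorType="TRIM_HORIZON",
    )["ShardIterator"]
    for record in kinesis.get_records(ShardIterator=iterator, Limit=100)["Records"]:
        if b"LaunchDarklyDataExportTestEvent" in record["Data"]:
            print("Test event found; partition key:", record["PartitionKey"])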

Partition Keys

Kinesis records have partition keys. When we create a record, we use the user key that was sent with the event as the partition key. If the event does not contain a user key, we create a random partition key.
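Illustratively, that partition key selection behaves like the sketch below. This is not LaunchDarkly's implementation, and the userKey field name is an assumption used only for the example.

import uuid

def partition_key_for(event: dict) -> str:
    # Use the event's user key when present; otherwise fall back to a random key.
    return event.get("userKey") or uuid.uuid4().hex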

Consuming the Kinesis Data Stream

Kinesis Data Streams are very flexible. You can consume them with a Lambda function, write an application to consume the streams, or use a number of pre-built AWS integrations.

A common use case is to use the Data Stream to populate a data store, such as S3 or Redshift, by setting the Kinesis Data Stream as a source to a Kinesis Data Firehose. To learn how to do this, read Amazon's documentation.
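For example, here is a minimal sketch of a Lambda function that consumes the stream in Python. It assumes the exported events are JSON, and it only decodes and prints each record; what you do with the events is up to you.

import base64
import json

def handler(event, context):
    # Kinesis delivers record payloads to Lambda base64-encoded.
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        partition_key = record["kinesis"]["partitionKey"]
        # Process the exported LaunchDarkly event here, for example by forwarding it to a data store.
        print(partition_key, payload)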

