
Tinybird <> Datadog integration

This data project demonstrates how to integrate Tinybird with Datadog by building endpoints on top of the Tinybird Service Data Sources and using vector.dev.

Note: This example uses vector.dev version 0.22.2 and is not actively maintained. If you want to use a newer version, you will need to update the configuration in vector-ops-log.toml and vector-pipes-stats.toml.

The project is defined by:

  • Data project: The Tinybird data project
  • Vector.dev files: Files describing vector.dev workflows
  • GitHub Actions files: Files describing the GitHub Actions workflows used to call the Tinybird endpoints and execute the vector.dev flows
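
The repository layout is roughly as follows (the contents of .github/workflows are illustrative; check the repository for the exact workflow file names):

datadog-integration/
├── data-project/
│   ├── datasources/
│   │   └── datadog_integration.datasource
│   └── endpoints/
│       ├── ep_datadog_ops_log.pipe
│       └── ep_datadog_pipes_stats.pipe
├── vector-ops-log.toml
├── vector-pipes-stats.toml
└── .github/
    └── workflows/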

Push data project to Tinybird

The data project is defined in the /data-project folder. To push the project to Tinybird, install and configure the Tinybird CLI following these instructions, and then execute:

cd data-project
tb push
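
If you have not installed the CLI yet, a typical setup looks like the following sketch (assuming a Python environment; the linked instructions are the authoritative reference):

pip install tinybird-cli
tb auth   # paste your Workspace admin token when prompted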

Pushing the project will create two endpoints:

  • endpoints/ep_datadog_ops_log.pipe: Used to extract data source operation metrics from tinybird.datasources_ops_log
  • endpoints/ep_datadog_pipes_stats.pipe: Used to extract endpoint metrics from tinybird.pipe_stats_rt

And a datasource:

  • datasources/datadog_integration.datasource: Used to keep track of the last execution to avoid duplicating data in Datadog

The process will also create a token named datadog_integration_token with read access to the endpoints and append access to the datasource.
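
Once pushed, you can query an endpoint directly to check its output (a sketch using the EU host; replace the host and token with your own values):

curl "https://api.tinybird.co/v0/pipes/ep_datadog_pipes_stats.json?token=<datadog_integration_token>"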

Configure GitHub Actions

⚠️ Note: This repository illustrates how to run the integration using GitHub Actions, but you can use whichever job scheduler best fits your architecture.

To configure the GitHub Actions, you'll have to create the following environment variables:

  • TB_TOKEN: Tinybird token named datadog_integration_token
  • TB_HOST: Tinybird host: "api.us-east" for US or "api" for EU
  • DATADOG_API_KEY: Datadog API Key
  • DATADOG_REGION: Datadog region: "us" or "eu"
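
One way to create these values as repository secrets for the workflows is with the GitHub CLI (a sketch assuming the gh CLI is installed and authenticated for this repository; the values are placeholders):

gh secret set TB_TOKEN --body "<datadog_integration_token>"
gh secret set TB_HOST --body "api.us-east"
gh secret set DATADOG_API_KEY --body "<your-datadog-api-key>"
gh secret set DATADOG_REGION --body "us"

The workflows can then expose the secrets as environment variables for the commands they run.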

How it works

The process uses vector.dev to read data from the Tinybird API endpoints, apply basic transformations to generate metrics, and send the data to Datadog. The jobs are scheduled to run every 10 minutes and execute the following commands:

curl "https://${TB_HOST}.tinybird.co/v0/pipes/ep_datadog_pipes_stats.ndjson?token=${TB_TOKEN}" | ~/.vector/bin/vector --config ./vector-pipes-stats.toml
curl "https://${TB_HOST}.tinybird.co/v0/pipes/ep_datadog_ops_log.ndjson?token=${TB_TOKEN}" | ~/.vector/bin/vector --config ./vector-ops-log.toml

Metrics

metric_name              metric_type  interval  unit_name
tb.pipes.count           count        1min      requests
tb.pipes.duration        gauge        1min      seconds
tb.pipes.duration_p99    gauge        1min      seconds
tb.pipes.read_bytes      count        1min      bytes
tb.datasources.count     count        1min      operations
tb.datasources.rows      count        1min      rows
tb.datasources.duration  gauge        1min      seconds
