Prometheus AWS MSK discovery

Heavily based on teralytics/prometheus-ecs-discovery, but exports MSK instead of ECS.

Help

Run prometheus-msk-discovery --help for usage information.

The command line parameters that can be used are:

  • -config.cluster (string) : arn of the MSK cluster to scrape
  • -config.jmx-metrics : add the JMX metrics targets to the output file (default true)
  • -config.node-metrics : add the node metrics targets to the output file (default true)
  • -config.role-arn (string) : arn of the role to assume when scraping the AWS API (optional)
  • -config.scrape-interval (duration) : interval at which to scrape the AWS API for MSK service discovery information (default 1m0s)
  • -config.scrape-times (int) : how many times to scrape before exiting (0 = infinite)
  • -config.write-to (string) : path of file to write MSK service discovery information to (default "msk_file_sd.yml")
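Putting the flags above together, an invocation might look like the following. The cluster ARN and the output path are placeholders; substitute your own values:

```shell
# Discover brokers in one MSK cluster every 5 minutes and write the
# target file somewhere Prometheus can read it.
prometheus-msk-discovery \
  -config.cluster arn:aws:kafka:us-east-1:123456789012:cluster/my-cluster/abcd1234 \
  -config.scrape-interval 5m \
  -config.write-to /etc/prometheus/file_sd/msk_file_sd.yml
```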

Usage

First, build this program using the usual go get mechanism.

Then, run it as follows:

  • Ensure the program can write to a directory readable by your Prometheus master instance(s).
  • Export the usual AWS_REGION, AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY into the program's environment, making sure the keys have access to the MSK API. If the program must assume a different role to obtain access, pass that role's ARN via the -config.role-arn option; this also allows cross-account access, depending on which account the role is defined in.
  • Start the program, using the command line option -config.write-to to point the program to the specific folder that your Prometheus master can read from.
  • Add a file_sd_config to your Prometheus master:

scrape_configs:
- job_name: msk
  file_sd_configs:
    - files:
      - /path/to/msk_file_sd.yml
      refresh_interval: 10m
  # Drop unwanted labels using the labeldrop action
  metric_relabel_configs:
    - regex: cluster_arn
      action: labeldrop
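For reference, the file the program writes follows Prometheus's file_sd format: a list of target groups, each with targets and labels. A sketch of what it might contain for one broker, with both JMX and node metrics enabled, is shown below. The hostname and ARN are placeholders, and the exact label names may differ from this sketch (cluster_arn matches the labeldrop example above); ports 11001 and 11002 are the JMX and node exporter ports that MSK open monitoring exposes:

```yaml
- targets:
    - b-1.my-cluster.abc123.kafka.us-east-1.amazonaws.com:11001
    - b-1.my-cluster.abc123.kafka.us-east-1.amazonaws.com:11002
  labels:
    cluster_name: my-cluster
    cluster_arn: arn:aws:kafka:us-east-1:123456789012:cluster/my-cluster/abc123
```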

That's it. The program will begin scraping the AWS API and rewriting the discovery file (every minute by default), and Prometheus picks up changes to the file shortly after it is written. After reloading your Prometheus master configuration, any new targets appearing in the discovery file will be scraped automatically.
