DSC Grafana Crash Course

These instructions are for Debian 11.

Install dsc

sudo mkdir -p /etc/apt/keyrings
wget -O - https://pkg.dns-oarc.net/dns-oarc.distribution.key.gpg | sudo tee /etc/apt/keyrings/pkg.dns-oarc.net.asc
echo "deb [signed-by=/etc/apt/keyrings/pkg.dns-oarc.net.asc] http://pkg.dns-oarc.net/stable/`lsb_release -c -s` `lsb_release -c -s` main" | sudo tee /etc/apt/sources.list.d/dns-oarc.list
sudo apt-get update
sudo apt-get install -y dsc

Setup dsc

NOTE: Change <<YOUR_INTERFACE_HERE>> to the interface to capture DNS traffic on.
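
If you are not sure which interface to use, you can list the available interfaces first, for example with:

ip -br link show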

sudo tee /etc/dsc/dsc.conf <<EOF
local_address 127.0.0.1;
local_address ::1;
run_dir "/var/lib/dsc";
pid_file "/run/dsc.pid";
interface <<YOUR_INTERFACE_HERE>>;

dataset qtype dns All:null Qtype:qtype queries-only;
dataset rcode dns All:null Rcode:rcode replies-only;
EOF

Create the directory for the XML files and change its group so the current user can easily access the files.

sudo mkdir /var/lib/dsc
sudo chgrp `id -gn` /var/lib/dsc
sudo chmod g+rws /var/lib/dsc

Run dsc

sudo dsc /etc/dsc/dsc.conf

Optional: You can now do something that generates DNS traffic. Note that dsc won't start collecting until the start of the next minute.
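
For example, a few lookups with dig (assuming the dnsutils package is installed and the queries leave via the captured interface) will do:

for name in example.com example.net example.org; do dig "$name" A > /dev/null; done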

watch ls -l /var/lib/dsc

Install influxdb, grafana and dsc-datatool

sudo apt install -y wget gnupg apt-transport-https
wget -q https://repos.influxdata.com/influxdb.key
echo '23a1c8836f0afc5ed24e0486339d7cc8f6790b83886c4c96995b88a061c5bb5d influxdb.key' | sha256sum -c && cat influxdb.key | gpg --dearmor | sudo tee /etc/apt/trusted.gpg.d/influxdb.gpg > /dev/null
echo 'deb [signed-by=/etc/apt/trusted.gpg.d/influxdb.gpg] https://repos.influxdata.com/debian stable main' | sudo tee /etc/apt/sources.list.d/influxdata.list
sudo wget -q -O /usr/share/keyrings/grafana.key https://packages.grafana.com/gpg.key
echo "deb [signed-by=/usr/share/keyrings/grafana.key] https://packages.grafana.com/oss/deb stable main" | sudo tee -a /etc/apt/sources.list.d/grafana.list
sudo apt-get update
sudo apt install influxdb2 grafana dsc-datatool

Start influxdb

sudo systemctl daemon-reload
sudo systemctl start influxdb
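
Optionally, check that InfluxDB is running before continuing, for example:

sudo systemctl status influxdb --no-pager
curl -s http://localhost:8086/health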

Setup grafana

sudo systemctl enable grafana-server
sudo systemctl start grafana-server
sudo ufw allow from any to any port 3000

You can now open a browser to http://<<YOUR_IP_HERE>>:3000/ (replace <<YOUR_IP_HERE>> with the server's IP address), log into Grafana using admin/admin, and change the password.

Setup influxdb

influx setup -u dsc -p some-random-password -o dsc -b dsc -f
influx v1 dbrp create --db dsc --rp autogen --bucket-id `influx bucket list -n dsc --hide-headers|cut -f 1`
influx auth create --org dsc --read-bucket `influx bucket list -n dsc --hide-headers|cut -f 1`

Save the token from the last command; it is needed to set up the data source below.
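
If you no longer have that output, the token can be listed again with the influx CLI (a convenience step, not part of the original setup):

influx auth list --org dsc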

Add datasource

  • In Grafana, go to Configuration -> Data sources and Add data source
  • Select InfluxDB
  • In HTTP, for URL use: http://localhost:8086
  • In Custom HTTP Headers, add the following and change <<token>> to the token from the last influx command above:
    • Header: Authorization
    • Value: Token <<token>>
  • In InfluxDB Details, for Database use: dsc, for HTTP Method select POST and for Min time interval use: 1m
  • Press Save & test

Prepare for dsc-datatool

First create a YAML file with DNS parameter labels.

wget https://github.com/DNS-OARC/dsc-datatool/raw/develop/contrib/iana-dns-params-toyaml.py
python3 iana-dns-params-toyaml.py > $HOME/labler.yaml
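
Optionally, take a quick look at the generated file to confirm it contains the label mappings:

head $HOME/labler.yaml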

Then move the XML files to a processing area.

mkdir $HOME/xmls
find /var/lib/dsc -type f -name '*.xml' | xargs -r mv -t $HOME/xmls/

Once you have verified that it works, you can run it from cron:

* * * * * find /var/lib/dsc -type f -name '*.xml' | xargs -r mv -t $HOME/xmls/
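
One way to install such a job, assuming it should run as the current user, is to edit that user's crontab and paste the line above:

crontab -e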

Setup dsc-datatool

Create a script that processes the XML files with dsc-datatool and imports the result into InfluxDB.

tee $HOME/run-dsc-datatool.sh <<EOF
#!/bin/sh -e
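# Convert each queued XML with dsc-datatool, import the result into InfluxDB,
# then delete the processed XML file.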
find \$HOME/xmls -type f | while read file; do
  dsc-datatool \\
    --server "<<YOUR_SERVER_HERE>>" \\
    --node "<<YOUR_NODE_HERE>>" \\
    --output ";InfluxDB;file=\$HOME/influx.txt;dml=1;database=dsc" \\
    --transform ";Labler;*;yaml=\$HOME/labler.yaml" \\
    --xml "\$file"
  influx -import -path=\$HOME/influx.txt
  rm "\$file"
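  # 'break' stops the loop after the first file; remove it once verified.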
  break
done
EOF
chmod +x $HOME/run-dsc-datatool.sh

Run it once to verify; note the break in the while loop, which makes it stop after the first file.

sh -xe $HOME/run-dsc-datatool.sh

Once everything looks like it is working correctly, remove the break and run the script from cron:

* * * * * $HOME/run-dsc-datatool.sh >$HOME/run-dsc-datatool.log 2>&1

Optional: wait and verify that it runs correctly by looking at the log file.

Create first graph

  • Create a dashboard
  • Add an empty panel
  • Change the title to: QTYPE
  • Change FROM to: default qtype
  • In GROUP BY: add tag(qtype) and change the fill to fill(0)
  • Set ALIAS to: $tag_qtype
  • Change the unit (type) to: ops/min
  • Save

Setup response time dataset and create a graph for it

Configure dsc to store response time statistics and restart it.

echo "dataset response_time dns All:null ResponseTime:response_time;" | sudo tee -a /etc/dsc/dsc.conf
sudo pkill dsc
sudo dsc /etc/dsc/dsc.conf

Now wait a few minutes until the new data is in the XML files and has been populated into the database.

watch grep -R response_time /var/lib/dsc/
watch cat $HOME/influx.txt

Optional: look at the response time statistics stored in the InfluxDB database.

$ influx
> use dsc
> show measurements
> select * from response_time
^D

  • Add a new panel
  • Change the title to: Response Time Buckets
  • Change FROM to: response_time
  • In GROUP BY: add tag(responsetime) and change the fill to fill(0)
  • Set ALIAS to: $tag_responsetime
  • Set Tooltip mode to: All
  • Set Line interpolation to: Smooth
  • Save