Setup for Testing Datasource Plugins

The largest part of the work for reviewing a datasource plugin is setting up the datasource. This is a guide for setting up the different databases needed for testing.

Akumuli Datasource

There is a test server for Akumuli at http://206.189.27.155:8181/ and this is the only setting (the URL field) needed to create a datasource connection.
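
Before creating the datasource it can be worth confirming that the shared test server is still reachable (it is a public test box and may disappear). Any HTTP status code back from the command below means the server is up:

curl -s -o /dev/null -w '%{http_code}\n' http://206.189.27.155:8181/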

Ambari Datasource

The setup involves a lot of steps (it takes about 30 minutes to install everything) and requires Vagrant and VirtualBox. Follow the instructions here.

I had to manually start the Ambari services after they failed to start the first time when using the cluster wizard.

The URL to log in to the Ambari website is http://c6801.ambari.apache.org:8080 (the c68 prefix comes from using CentOS 6.8), but the URL for the datasource config in Grafana should be:

http://c6801.ambari.apache.org:6188 with basic auth, user admin and password admin.
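
To check the metrics endpoint outside Grafana before configuring the datasource, a hedged example (the /ws/v1/timeline/metrics/metadata path is an assumption based on the Ambari Metrics Collector REST API, which is what the plugin queries on port 6188):

curl -u admin:admin http://c6801.ambari.apache.org:6188/ws/v1/timeline/metrics/metadata

A JSON document listing metric names means the collector is up and the credentials work.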

Atlas Datasource

To get Atlas running with some sample data:

curl -LO https://github.com/Netflix/atlas/releases/download/v1.5.3/atlas-1.5.3-standalone.jar
java -jar atlas-1.5.3-standalone.jar
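
Once the jar is running, a quick smoke test is to list the available tags. The port is an assumption: the standalone build listens on 7101 by default, so check the console output if nothing responds there:

curl http://localhost:7101/api/v1/tags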

Consul Datasource

  • Initial agent: docker run -d --name=dev-consul -e CONSUL_BIND_INTERFACE=eth0 consul
  • Add more agents with: docker run -d -e CONSUL_BIND_INTERFACE=eth0 consul agent -dev -join=172.17.0.2
  • Create a test key value (key: test3, value: 40): curl -X PUT -d @- http://172.17.0.2:8500/v1/kv/test3 <<< 40
  • Fill in a dummy value for the token on the Grafana config page.
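
To confirm the test key created above was written, read it back through the KV API; the ?raw flag returns the plain value instead of the base64-encoded JSON wrapper:

curl http://172.17.0.2:8500/v1/kv/test3?raw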

Darksky Datasource

See the docs: https://grafana.com/plugins/andig-darksky-datasource

DeviceHive

  1. https://playground.devicehive.com/
  2. Sign up
  3. Create a device with curl (the curl command is shown on the playground)
  4. Copy the access token for the data source
  5. Use the admin console on the playground to send commands. Name: test, Parameters: {"test": 1} or {"test2": 1}

GLPI

Use the following docker-compose file:

glpi-md:
  image: mariadb
  restart: always
  ports:
    - "32806:3306"
  environment:
    MYSQL_DATABASE: glpi
    MYSQL_ROOT_PASSWORD: password
    MYSQL_USER: glpi
    MYSQL_PASSWORD: password
  volumes:
    - glpi-db:/var/lib/mysql
    - glpi-dblog:/var/log/mysql
    - glpi-dbetc:/etc/mysql

glpi:
  image: fjudith/glpi
  restart: always
  ports:
    - "32706:80"
  volumes:
    - glpi-files:/var/www/html/files
    - glpi-plugins:/var/www/html/plugins
  links:
    - glpi-md:mysql
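
Save this as docker-compose.yml and start both containers before continuing with the numbered steps below:

docker-compose up -d
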
  1. Then browse to http://localhost:32706 to start the installation wizard. Instructions for that are here if you get stuck.

  2. At the step with three fields (database, user and password): the database host should be localhost (or the IP address of the mariadb container started with docker-compose, e.g. 172.17.0.3). User: glpi, Password: password

  3. Then login to the GLPI portal with user: glpi password: glpi

  4. The REST API needs to be enabled in the Setup -> General section.

  5. You also need to enable an API client (the first time, you need to check the regenerate checkbox for the token).

  6. For the config page in Grafana, you will need the App token generated in the previous step; the client token can be generated in the Administration -> Users section. Find the glpi user, navigate to the Settings section, find the API token field, check the regenerate checkbox and save.

  7. Finally, I have only ever got this working using Access mode Browser and a CORS plugin in Chrome to get around the CORS issues. Then create a ticket and it should show up in Grafana.
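
With both tokens in hand, they can be verified outside Grafana before fighting the CORS issues. This is a sketch that assumes the GLPI REST API is served at /apirest.php (the default in GLPI 9.x); a session_token in the response means the App token and API token are both valid:

curl -H 'App-Token: YOUR_APP_TOKEN' -H 'Authorization: user_token YOUR_API_TOKEN' http://localhost:32706/apirest.php/initSession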

Instana

  1. cd data/plugins/instana-datasource
  2. docker-compose up mountebank
  3. Create datasource for Instana in Grafana with url: http://localhost:8010. You don't need an API key so just ignore the validation error.
  4. Create new dashboard with graph panel
  5. Query: "filler" and then select something from the dropdowns.
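
If the graph stays empty, first check that the mountebank mock is actually answering on port 8010; any HTTP response at all (even an error body) means the imposter is up and the problem is in the datasource or query config:

curl -i http://localhost:8010/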

LinkSmart SensorThings Datasource

  1. Use the following docker-compose file to start a SensorThings server (the server implementation is called GOST) backed by a PostgreSQL database with the PostGIS extension installed:

    https://raw.githubusercontent.com/gost/docker-compose/master/docker-compose.yml (From this repo)

  2. Test that it is working. The following command should return a JSON response containing an empty array, not an error: curl http://localhost:8080/v1.0/Things

  3. Create some data using Postman. GOST provides a collection of HTTP requests for Postman that you can use to create sensors, things and data. Import this collection into Postman and run some of the POST requests to create datapoints.

  4. Create a datasource in Grafana and set this value in the url field: http://localhost:8080/v1.0
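
If you prefer curl over Postman for step 3, a Thing can be created directly against the SensorThings API (name and description are the mandatory fields); note that to get chartable values you still need the Datastream and Observation requests from the Postman collection:

curl -X POST -H 'Content-Type: application/json' -d '{"name": "test thing", "description": "created for datasource testing"}' http://localhost:8080/v1.0/Things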

Prometheus Alert Manager Datasource

Grafana has a docker block for prometheus that includes everything needed to test this datasource.

In Grafana source root folder:

  • cd docker
  • ./create_docker_compose.sh prometheus
  • docker-compose up
  • create datasource in Grafana with url: http://127.0.0.1:9093
  • Fill in the severity level fields
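
To have an alert to look at right away, one can be pushed straight into Alertmanager; this assumes the bundled Alertmanager still serves the v1 alerts endpoint (current when this guide was written):

curl -X POST -H 'Content-Type: application/json' -d '[{"labels": {"alertname": "TestAlert", "severity": "critical"}}]' http://127.0.0.1:9093/api/v1/alerts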

Thruk Datasource

Warp10

  • Start the docker container for Warp10: docker run --name warp10 -p 8080:8080 -p 8081:8081 -d -i warp10io/warp10:latest
  • To get tokens to read and write (from the Worf component): docker exec --user warp10 -t -i warp10 warp10-standalone.sh worf test 31536000000
  • It should return something like:
    {"read":{"token":"8rJ46zGxQkTLQsBu39SUUrANJmjHB__qJkHjDM8IsWsf8XAi6P03EI1e6ve5NqbzrC81uIwiB6S9JgI9bNtR2PEwD7qpNR9pYA4U29H3HiER37DNIOyvP.","tokenIdent":"2cfb976558c726f7","ttl":31536000000,"application":"test","applications":["test"],"owners":["932cd87c-bb00-4127-b75e-07fedbd12fa3"],"producer":"932cd87c-bb00-4127-b75e-07fedbd12fa3","producers":[]},"write":{"token":"UYWwdR4S6at_NpxlD_tsN99rAj5H_6yBZ5JhSJ5oTlaoJXmEYPMjPWQuD6Zs6ZENVaRacl2lgfSkI595gSETveyDQAaxPqzeAcQXfWBkKt7","tokenIdent":"f9211941b17d9b81","ttl":31536000000,"application":"test","owner":"932cd87c-bb00-4127-b75e-07fedbd12fa3","producer":"932cd87c-bb00-4127-b75e-07fedbd12fa3"}}
  • Create a file with test data (warp10.txt):
    1538250207762000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 79.16
    1538250237727000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 75.87
    1538250267504000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 74.46
    1538250267504000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 73.55
    1538250297664000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 72.30
    1538250327765000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 70.73
    1538250327765000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 69.50
    1538250357724000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 68.24
    1538250387792000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 66.66
    1538250387792000/51.501988:0.005953/ some.sensor.model.humidity{xbeeId=XBee_40670F0D,moteId=53,area=1} 65.73
  • Take the write token from the json result and use it to create test data:
    curl -H 'X-Warp10-Token: your_write_token' --data-binary @warp10.txt 'http://localhost:8080/api/v0/update' (run from the directory containing warp10.txt; the shell does not expand ~ inside an @ argument)
  • Use the read token in the Warp10 query editor.
  • Example raw query for the Worldmap panel (the JSON string is parsed by the WarpScript JSON-> function on the following line):
    '[ { "key": "amsterdam", "latitude": 52.3702, "longitude": 4.8952, "name": "Amsterdam", "value": 9 }, { "key": "charleroi", "latitude": 50.4108, "longitude": 4.4446, "name": "Charleroi", "value": 6 }, { "key": "frankfurt", "latitude": 50.110924, "longitude": 8.682127, "name": "Frankfurt", "value": 9 }, { "key": "london", "latitude": 51.503399, "longitude": -0.119519, "name": "London", "value": 12 }, { "key": "paris", "latitude": 48.864716, "longitude": 2.349014, "name": "Paris", "value": 15 } ]'
    JSON->
    
  • Example query for test data file:
    • Metric name: some.sensor.model.humidity
    • Label key: moteId
    • Label value: 53
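
Before wiring the datasource into a panel, you can check the written data with a WarpScript FETCH sent to the /api/v0/exec endpoint. This is only a sketch: substitute the read token returned by Worf above, and the trailing -10 asks for the last 10 datapoints:

curl --data-binary "[ 'your_read_token' 'some.sensor.model.humidity' { 'moteId' '53' } NOW -10 ] FETCH" 'http://localhost:8080/api/v0/exec'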