Feeding data into Dashing

As we've already covered, Dashing uses a series of scheduled jobs, written in Ruby, to collect the data we are interested in. Scheduling is handled by a library called rufus-scheduler, which gives us a great deal of flexibility over when and how jobs run: we could have a lightweight job that scrapes data from a public API every five seconds, and another job that runs every 30 minutes to perform a heavy query on a database.
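
To make that concrete, here is a minimal sketch of what a Dashing job file looks like. The SCHEDULER constant is the rufus-scheduler instance that Dashing exposes to every file in the jobs/ directory, and send_event pushes data to a widget; the widget names and the URL below are purely illustrative placeholders, not part of the job we build next:

    require 'net/http'
    require 'json'

    # Lightweight job: every five seconds, fetch a small JSON payload from a
    # (hypothetical) public API and push one value to the 'karma' widget.
    SCHEDULER.every '5s', first_in: 0 do
      uri = URI('https://example.com/api/karma.json')
      data = JSON.parse(Net::HTTP.get(uri))
      send_event('karma', { current: data['karma'] })
    end

    # Heavier job: every 30 minutes, this is where a slow database query
    # would live before its result is sent to another widget.
    SCHEDULER.every '30m' do
      # ... run the query, then call send_event with the result ...
    end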

We're going to create a single job called puppet.rb, and this Ruby code will perform the following actions (a rough sketch of the first two steps follows the list):

  • Gather metrics using PuppetDB's metrics endpoint
  • Gather a list of nodes using PuppetDB's nodes endpoint
  • Use the nodes gathered to gather ...
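
Endpoint paths differ between PuppetDB versions, so treat the following as a hedged sketch of the first two steps only: it assumes the PuppetDB v3 API on a hypothetical host puppetdb.example.com, port 8080, without SSL, and a made-up widget name; adjust all of these to match your own installation.

    require 'net/http'
    require 'json'

    PUPPETDB_HOST = 'puppetdb.example.com'  # assumption: your PuppetDB server
    PUPPETDB_PORT = 8080

    # Issue a GET against PuppetDB and parse the JSON response.
    def puppetdb_get(path)
      uri = URI::HTTP.build(host: PUPPETDB_HOST, port: PUPPETDB_PORT, path: path)
      JSON.parse(Net::HTTP.get(uri))
    end

    SCHEDULER.every '60s', first_in: 0 do
      # Step 1: gather metrics; the JVM memory mbean is just one example.
      memory = puppetdb_get('/v3/metrics/mbean/java.lang:type=Memory')

      # Step 2: gather the list of nodes known to PuppetDB.
      nodes = puppetdb_get('/v3/nodes')

      # Push a simple count to a (hypothetical) widget for now.
      send_event('puppet-nodes', { value: nodes.length })
    end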
