Building custom monitor with Pentaho Kettle

Monitis's open API and its collection of open source monitoring and management scripts offer a good basis for building solutions to monitor your systems. Still, there are many cases when you need a specific monitor and don't have, or don't want to spend, much time on coding. That is why we present a very simple and easy way of building custom monitors with the Pentaho Data Integration suite.

Pentaho Data Integration (PDI) – Kettle is a free, open source ETL (Extraction, Transformation and Loading) tool. Along with powerful data extract, transform and load capabilities, Kettle provides an intuitive and rich graphical design environment – Spoon. Spoon is a fast and easy way to build applications without writing code. Its drag-and-drop interface allows you to graphically construct transformations and jobs.

To get started with Kettle we recommend the following tutorial, which helps with installation and introduces Spoon; also check the PDI user guide for a brief introduction to Kettle components.

In this article we want to present a very simple way of building a custom monitor using Spoon. Moreover, our goal today is monitoring business performance data, as opposed to the usual system or application monitoring. The monitored data can be any information extracted from your database that needs to be shared and/or monitored. We'll build a monitor that, based on an SQL query, traces a test table Orders, randomly populated with data, by order status. The number of orders grouped by current status (In Process, On Hold, Shipped and Cancelled) will serve as the metrics for our custom monitor.
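The metric query itself is just a GROUP BY over the order status column. A minimal sketch, using an in-memory SQLite table as a stand-in for the hypothetical Orders test table (the schema here is illustrative, not from the original article):

```python
import sqlite3

# In-memory stand-in for the Orders test table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Orders (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany(
    "INSERT INTO Orders (status) VALUES (?)",
    [("In Process",), ("Shipped",), ("Shipped",), ("On Hold",), ("Cancelled",)],
)

# The same kind of query a Table Input step would run to produce the metrics.
rows = conn.execute(
    "SELECT status, COUNT(*) FROM Orders GROUP BY status ORDER BY status"
).fetchall()
print(dict(rows))  # one count per status
```

In the transformation, the Table Input step runs this query against your real database and passes the counts downstream as metric values.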

To start, please have a look at the Monitis API documentation. To create a custom monitor we need to implement the steps described below:

1. Authentication – using the Monitis API key and secret key (available from your Monitis account: Tools->API), we need to obtain an authentication token that will be used later for creating the monitor and posting data.

For that, the following transformation was created, using the transformation steps listed below:

- a step providing the API URL, API key, secret key and other request parameters for the API calls
- an HTTP request for the authentication token
- a JSON Input step parsing the result of the authentication token request
- selection of the needed parameter for later use


After testing, we make small changes to convert the created transformation into a sub-transformation: simply add Input and Output Specification steps as the start and end steps, and remove the API key and secret key from the parameters. This information will be provided by the main transformations as input to the Authentication sub-transformation. We have thus created a building block for our next steps which can be reused in other transformations without any changes.
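The request the sub-transformation assembles can be sketched in a few lines. This is illustrative only: the base URL and parameter names follow the general shape of the Monitis authToken call, so verify them against the API documentation before use.

```python
from urllib.parse import urlencode

# Assumed base URL and parameter names; check the Monitis API docs.
API_URL = "http://www.monitis.com/api"
api_key, secret_key = "YOUR_API_KEY", "YOUR_SECRET_KEY"

params = {
    "action": "authToken",
    "apikey": api_key,
    "secretkey": secret_key,
    "output": "json",
}
request_url = API_URL + "?" + urlencode(params)
print(request_url)

# The JSON Input step then extracts the token from a response such as
# {"authToken": "..."} so later API calls can use it.
```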


2. Creating the monitor



Here Data Grid steps are used to provide the necessary input information:

- API key and secret key (User data), as input for the Authentication sub-transformation
- monitor parameters
- metrics description

User Defined Java Expression and Group By steps construct the parameter list for the create-monitor API call.

All the parameters are grouped by the Join Rows "Add Monitor Param" step, and the result serves as input for the Add Monitor HTTP Post request. A Write to Log step provides information on the transformation execution results; its Data field contains the ID of the created monitor, which will be used in the next transformation.
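The parameter list these steps build up can be sketched as follows. The action name, parameter names and the metric-description format ("name:displayName:unitOfMeasure:dataType" entries joined with ";") are assumptions based on the Monitis custom monitor API, so double-check them against the documentation:

```python
from urllib.parse import urlencode

# Metric descriptions: name, display name, unit of measure, data type.
# (Format assumed; see the Monitis custom monitor API docs.)
metrics = [
    ("in_process", "In Process", "orders", "2"),
    ("on_hold",    "On Hold",    "orders", "2"),
    ("shipped",    "Shipped",    "orders", "2"),
    ("cancelled",  "Cancelled",  "orders", "2"),
]
result_params = ";".join(":".join(m) for m in metrics)

add_monitor_params = {
    "action": "addMonitor",        # assumed action name
    "name": "Orders by status",    # example monitor name
    "tag": "business",             # example tag
    "resultParams": result_params,
}
post_body = urlencode(add_monitor_params)
print(post_body)
```

In Spoon, the Join Rows step plays the role of assembling exactly this kind of flat parameter list before the HTTP Post step sends it.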



3. Posting metric results for the custom monitor


As input here, along with the user data (API and secret keys), we have the custom monitor ID – the result of the Create Monitor transformation – and a Table Input step, which retrieves the necessary information from the database.

An HTTP Post step executes the API call for posting monitor data.
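The payload of that call can be sketched like this. Again, the action and parameter names are assumptions patterned on the Monitis custom monitor API (the monitor ID is the value logged by the Create Monitor transformation), so verify them against the docs:

```python
import time
from urllib.parse import urlencode

# Metric values as they would come from the Table Input query.
counts = {"in_process": 12, "on_hold": 3, "shipped": 40, "cancelled": 5}

# "name:value" pairs joined with ";" – results format assumed here.
results = ";".join(f"{name}:{value}" for name, value in counts.items())

post_params = {
    "action": "addResult",                      # assumed action name
    "monitorId": "12345",                       # hypothetical monitor ID
    "checktime": str(int(time.time() * 1000)),  # timestamp in ms (assumed)
    "results": results,
}
print(urlencode(post_params))
```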


4. Creating a job

The only thing left is to create a simple job that runs the transformation for posting metric results.


After testing, you can use any scheduler to run the created job with Pentaho Kitchen, a standalone command line tool for executing jobs.

And here is our custom monitor on the Monitis dashboard.



Using these simple transformations as a basis, you can create monitors by just changing the input parameters and the SQL query in the Table Input step used to retrieve metric data. Moreover, instead of a Table Input step, any other transformation Input, Utility, Lookup or Scripting step can serve as the source for monitored data. That allows you to access relational and NoSQL databases, log files or data input of any format (CSV, JSON, XML, YAML, Excel, plain text …); to base a monitor on script execution, Java classes or shell/process output, or on HTTP, REST and WSDL requests; or to fetch data from a Google Analytics account – just feel free to explore the rich collection of Spoon transformation steps.

