13 February 2017

Proposal for auditing a Cognos BI portal

This article is based on a dashboard implementation for auditing a Cognos portal and monitoring the daily cube run process. The scope of the implementation is to understand BI portal usage and to track the daily cube runs, their performance and their SLAs. For this implementation, we used Tableau Desktop. Below, we detail each development process and application.

BI Portal Usage

This implementation was based on the Audit package, a cross-cutting Cognos package that provides access to a large number of audit models across the Cognos infrastructure. As this dashboard implementation took place within a multinational client context, it was not possible to access the Cognos Content Store databases directly. Due to these restrictions, the solution detailed below includes the creation of a set of reports based on Audit data and a schedule that extracts the last 10 days of report data daily to “.csv” files.

Once the files were extracted to a folder, they could have been used directly as the data source for the dashboard development; however, it was decided to build a consolidated database instead. A project was created with the SSIS ETL tool to import the fact table, using a hash function so that only new data is imported.

The final deliverable is a set of dashboards that quickly answer important questions such as:

> How many users have access to the portal?

> How many users actually use the portal?

> How does portal usage evolve over time?

> How much time do users spend on their analyses?

> At which times is portal usage highest?

> ...

After analysing the Audit model and considering the questions raised above, we focused on the Audit logins model.

Data Source

The image below shows the model used for the dashboard development.

Measures Analysis

Besides the measures that already exist in the fact table, it was necessary to create the additional measures detailed below.

> # Logins

This measure allows us to better understand the number of employee login sessions. The fact table lists all user logins, and the purpose of this measure is to return the aggregate number of logins.
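
A minimal Tableau calculated-field sketch, assuming the Audit logins fact table holds one row per login session and using Tableau's auto-generated [Number of Records] field:

    // # Logins: total login sessions
    // (assumes one row per login in the Audit logins fact table)
    SUM([Number of Records])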

> # Employees

This measure allows us to understand the number of employees. The fact table lists all user logins, and the purpose of this measure is to count the number of distinct users.
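
A hedged sketch, assuming the fact table exposes a user identifier field (the name [User Name] below is hypothetical):

    // # Employees: number of distinct users who logged in
    // ([User Name] is an assumed field name from the Audit logins model)
    COUNTD([User Name])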

> Avg. Employees Logins

This measure was created to calculate the average number of times a user accesses the portal.
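
Given the two measures above, a plausible Tableau formula is simply their ratio:

    // Avg. Employees Logins: average logins per distinct user
    [# Logins] / [# Employees]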

Dashboard: Country Usage

This dashboard displays, for the current year, the users' accesses to the portal, their logins and the 'actual use' of the portal. 'Actual use' means, out of the universe of users with access to the portal, how many of them in fact connect to it.

Dashboard: Date & Time Usage Distribution

This dashboard displays a detailed analysis of portal usage. It shows, for the current year, the periods with the highest number of accesses and the time spent by users on the portal.

Daily Cubes Run - Process, Performance and SLA

The goal of the daily cube execution monitoring process is to understand the percentage of successful cube builds and the main reasons behind cube failures, and also to identify the margin available for changing existing processes given the SLA defined for cube build completion. This SLA value represents the difference between the actual cube delivery time and the time at which the cubes are expected to be available to business users.

The cube execution monitoring process uses a SharePoint portal as data source. On a daily basis, a dedicated team keeps track of the execution of overnight processes (cube builds are part of that monitoring) and enters relevant information such as: cube name, cube build frequency (daily, weekly, monthly), cube build hour, cube limit hour, execution status (delivered on time with/without errors, delivered outside of the SLA), error type, among others.

Through a series of ETL processes, this information was written to SQL Server tables, which were the source for the Tableau dashboards. Note that, since the data loaded into the SharePoint site was entered manually (by the dedicated team monitoring the overnight processes), some data issues were discovered, leading to the need for data cleansing and a shorter analysis period: in the following dashboard, the analysis is limited to January through June 2016.

The end result of this proposal is a dashboard capable of providing quick, visual answers to questions such as:

· How many times was a cube built in a month?

· How many successful cube build executions were there, compared with the number of times the process failed?

· When a cube build failed, what were the most common reasons?

· On average, by how many hours do cubes finish building ahead of the defined SLA?

Data Source

The model used to develop the cube build monitoring dashboard is presented below:

The model is quite simple: the data is obtained through an inner join between a view named VW_F_CUBE_DELIVERIES, where all the details of the cube executions are stored (Cube Name, Date Delivery, DateTime Delivery, Deletion Status, Frequency, Issue Type, Target Time, Time Delivery), and D_CALENDAR, the time dimension.

Measures Analysis

Besides the main measure present in the model (Number of Runs), three more measures were created: Issue Run, No Issue Run and Date Difference.

Below is an explanation of the purpose of each measure in the analysis.

> # Issue Run

This measure counts the number of times a cube build execution ends in error. A calculated field was created in Tableau Desktop with the formula:
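
A plausible reconstruction, assuming failed runs are flagged with a non-null [Issue Type] (one of the fields listed in the view above):

    // # Issue Run: executions that ended in error
    // (assumes failed runs have a non-null [Issue Type])
    SUM(IF NOT ISNULL([Issue Type]) THEN 1 ELSE 0 END)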

> # No Issue Run

This measure counts the number of times a cube build execution ends in success. A calculated field was created in Tableau Desktop with the formula:
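
Again as a sketch, under the same assumption on [Issue Type]:

    // # No Issue Run: executions that completed successfully
    // (assumes successful runs have a null [Issue Type])
    SUM(IF ISNULL([Issue Type]) THEN 1 ELSE 0 END)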

> # Date Difference

This measure returns the average number of hours by which cube build tasks finish ahead of the defined SLA. A calculated field was created in Tableau Desktop with the formula:
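
A plausible reconstruction, assuming [DateTime Delivery] and [Target Time] are both datetime fields from the view; taking the difference in minutes and dividing by 60 yields fractional hours, consistent with the one-decimal formatting mentioned below:

    // # Date Difference: average margin, in hours, between actual delivery
    // and the SLA target (positive when the cube finishes ahead of the SLA)
    AVG(DATEDIFF('minute', [DateTime Delivery], [Target Time]) / 60)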

For proper visualization, this measure was set up to display only one decimal place.

Dashboard: Cube Run Analysis

This dashboard presents cube executions with a monthly distribution, allowing the number of successful cube builds to be compared with the number of executions with failures. When failures occur, it makes it possible to understand the most common causes and, for each cube, the average number of hours between the time it takes to finish and the pre-defined SLA time.

Conclusions

In this article, a set of operational analyses was introduced, developed with the reporting tool Tableau Desktop on top of data sources maintained in SSIS and/or SQL Server. The analyses presented show a model that supports decision making around system usage, as well as the success factors of several system processes, in this case cube builds. Given the above, the expectations on which this article was based were fully met.
Throughout the article, it was shown that any company with a Cognos system in place could develop a set of analyses to monitor several processes, such as the number of users who access the portal, the times at which accesses increase, and the countries where usage is highest, among others. From another perspective, a set of analyses regarding cube build performance was introduced; the same analyses could have been extended to the number of reports sent to users on a daily basis through a schedule, since that information was available in the sources. The solution presented can be seen as a small module that could be implemented as part of a broader solution and deployed in any company concerned with operational topics, or where this kind of information is critical.
The dashboards developed in the scope of this article bring, without question, added value to companies that still depend on legacy systems to obtain the same type of information. Tableau Desktop, being a self-service tool, gives the users of these dashboards/workbooks the possibility of developing their own visualizations and/or analyses, creating a more complete solution than the one demonstrated in this article. Despite the use of Tableau as the reporting tool, there are several other tools on the market that can deliver the same type of analysis; Qlik Sense, TIBCO Spotfire and Microsoft Power BI are just a few examples.

Ana Russo
Consultant

Sérgio Carvalho
Consultant