Business Intelligence with Idera – Update
In October, I discussed how to set up a business intelligence (BI) solution with Idera data using the Microsoft BI stack, which includes Integration Services (SSIS), Analysis Services (SSAS), and Power BI. Today, I would like to share a minor update to that solution that helped me solve a business problem.
When you have multiple SQL Server instances in production, you naturally want to know whether it is possible, and feasible, to consolidate databases from several instances onto one machine. This decision involves a few variables, the most important of which are business function and performance. On the performance side, you need to understand the CPU and memory characteristics of your databases and determine whether putting them together would have a detrimental impact.
One way to approach this problem is to run the Microsoft Assessment and Planning (MAP) Toolkit, an excellent tool from Microsoft that captures information for analysis in migration projects. I used it to capture performance metrics, which were then automatically fed through various statistical computations to produce results (such as the 95th percentile) for analysis. I highly recommend checking it out if you haven't already.
In addition to MAP, I saw an opportunity to use the BI solution I discussed earlier for this project. My goal was to select servers and show an additive figure for their combined CPU, memory, and Page Life Expectancy (PLE) values over time.
Tabular Model
First, I had to set up the Tabular Model to expose the right measures for the analysis. In the main fact table, I created or verified measures for the average, maximum, and minimum counter values. These were simple formulas using the DAX functions AVERAGE, MAX, and MIN, respectively. I also created a calculated column, “UTCDate”, that strips the time portion from the “UTCDateTime” column. I did this to make the model easier to work with, as I didn’t need to slice down to the time level.
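As a sketch, the definitions could look like the following. The measure names are my own invention; the table and column names come from the SUMMARIZE formula below.

```dax
-- Simple aggregation measures on the fact table (measure names are assumed)
AvgCounterValue := AVERAGE ( IderaStatistics[CounterValue] )
MaxCounterValue := MAX ( IderaStatistics[CounterValue] )
MinCounterValue := MIN ( IderaStatistics[CounterValue] )

-- Calculated column that strips the time portion from UTCDateTime,
-- leaving only the date so we never slice below the day level
UTCDate = DATE (
    YEAR ( IderaStatistics[UTCDateTime] ),
    MONTH ( IderaStatistics[UTCDateTime] ),
    DAY ( IderaStatistics[UTCDateTime] )
)
```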
Next, I created a calculated table that summarized (using the DAX function SUMMARIZE) the values created in the previous step by date. In effect, this is similar to a GROUP BY in SQL Server. What we end up with are the average, max, and min values grouped by SQLServerDatabaseID, CounterID, and UTCDate.
Here is the formula used to create the calculated table.
=SUMMARIZE(
    IderaStatistics,
    IderaStatistics[SQLServerDatabaseID],
    IderaStatistics[CounterID],
    IderaStatistics[UTCDate],
    "MaxGroupBy", MAX(IderaStatistics[CounterValue]),
    "AvgGroupBy", AVERAGE(IderaStatistics[CounterValue]),
    "MinGroupBy", MIN(IderaStatistics[CounterValue])
)
Lastly, I created two additional measures for each of our three aggregations (average, max, min). First, I created a measure to sum the grouped values.
Then, I created a measure that evaluates that sum over all selected filters (using the DAX ALLSELECTED function). This allows me to select servers in Power BI and get one value per date representing the summed counter value.
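Concretely, the pair of measures for the maximum could be sketched like this. The calculated table name IderaStatisticsSummary is an assumption on my part; the _AllSelected measure name matches the one used in the report.

```dax
-- Sum the per-database grouped maximums (calculated table name assumed)
MaxGroupBySum := SUM ( IderaStatisticsSummary[MaxGroupBy] )

-- The same sum, evaluated over every server the user has selected in the
-- report filters, so each date yields a single combined value
MaxGroupBySum_AllSelected :=
    CALCULATE (
        [MaxGroupBySum],
        ALLSELECTED ( IderaStatisticsSummary[SQLServerDatabaseID] )
    )
```

The average and minimum versions follow the same pattern over the AvgGroupBy and MinGroupBy columns.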
Power BI
With the Tabular Model completed, deployed, and active, I can move on to the Power BI piece of the puzzle. This part is mostly trivial, as the hard work is done in the Tabular Model.
The first report is for CPU. We are interested in the maximum and average CPU for a server. The first and second graphs on the top display the average and maximum counter values by date (I have a page-level filter for the CPUActivityPercentage counter from Idera). These separate the servers according to the filter values, which sit on the left side of the report. The bottom graph takes the average and maximum values and sums them for all servers in the filter (remember our ALLSELECTED DAX function). By adding a red line at 100, representing 100% CPU usage, as well as a trend line, I can get a good idea of whether the servers I’ve selected in the filter can be consolidated from a CPU standpoint. The real power of this is that the model updates almost instantaneously based on the filter, so I can try out different server combinations very quickly.
The bottom graph is a line chart that uses the measures MaxGroupBySum_AllSelected and AvgGroupBySum_AllSelected with UTCDate on the axis. Below that is a simple time-filter visualization for UTCDate.
In the example above, the average is clearly fine; the maximum, however, is not.
The report for memory usage is very similar. It is less useful than the CPU report but nonetheless shows how much memory a consolidation would need. It uses the exact same measures, save for a different page-level filter, this time the SQLMemoryUsed counter from Idera.
Finally, the page life expectancy report shows the average and minimum values (we are not interested in the maximum here). This is a simple report that required none of the extra Tabular Model changes discussed above.