I am going to start by saying that some of the tactics in here are likely not Microsoft supported, but they work. Before you make changes, write the previous settings down somewhere and back up your data. While I have not encountered any issues with this procedure, something could happen that I have not seen in the environments I manage.
There are also a few other ways of accomplishing the report writing, such as using Visual Studio, but I wanted to use only free, readily available tools for this exercise. The driving piece of this is getting access to the performance data, which has been a challenge for me and many others out there. After this, you will be better prepared to pull a variety of performance metrics from your system in the way you want.
Saban's blog pointed me to the location of the tables used for the performance data; he uses the Visual Studio method for creating the report, and there is a lot of good information there. Ultimately, I chose a different SQL query method that (in my opinion) makes working with the data easier.
- Go ahead and connect to the SQL server running the OperationsManagerDW database using SQL Management Studio. If you don't have SQL Management Studio on your desktop, you may want to remote into your database server and connect from there.
- When connecting, enter the SQL server name housing your OperationsManagerDW database.
- Once in SQL Management Studio, expand the OperationsManagerDW database in the tree, then right-click on "Views" and select "New View".
- A box will appear that allows you to grab specific tables or views to add for the query designer. Go ahead and just close this window.
- You will then be presented with a three-pane window; the fields should be blank. Paste the following SQL statements into the bottom pane (replacing any existing text), after which the view should look like the image following the SQL statements. This view filters the results to show only the logical disk counters. You could leave the filter out so that the view can poll any performance data, but for now, use these statements to look only at LogicalDisk metrics.
- Run the query just to make sure data is getting polled correctly. If not, check to ensure everything has copied and pasted correctly and that the OperationsManagerDW database is selected to run the query against. After the query runs successfully, save the view.
- When the save dialog appears, save the view as vCustomHourlyLogicalDiskPerf, as the remainder of this tutorial is based on that name. Otherwise, remember to change the references to this name to the one you input. The query could also be changed to reflect daily information: simply go back to the select statement above and change the joins to use Perf.vPerfDaily instead of Perf.vPerfHourly.
- At this point, we need to make an adjustment to the reporting service. Open the "Reporting Services Configuration Manager" and select the "Execution Account". Take note of the account used here.
- Now open the web management console for the SQL Reporting Server that runs your SCOM reports and select the detailed view in the report manager screen. Look for the Data Warehouse Main data source. Hovering over the entry should bring up a drop-down menu; select that and click "Manage" to open the properties of the Data Warehouse Main data connection.
- Go to the "Connect String" properties of the data connector, enter the execution account credentials there, and save the changes.
- Now download the Microsoft SQL Server 2008 R2 Report Builder 3.0 software: http://www.microsoft.com/en-us/download/details.aspx?id=6116 (If you do not have SQL 2008 R2 installed, download Microsoft SQL Server Report Builder 2.0 instead; Report Builder 3.0 only works with SQL 2008 R2.)
- Once it is downloaded and installed, open Report Builder and select the "Blank Report" option. You may be prompted for a login to the report server; you should be able to use your local credentials. If those do not work, you may need additional rights to log in to the SQL server with your account.
- Once in the "Blank Report", you will need to add a data source. Browse through the tree view on the left and right-click on the "Data Source" folder to add a Data Source.
- From here you should be able to browse the datasources installed on the report server. The Data Warehouse Main connection should be found by browsing.
- With a data source set up, you now need to add a dataset to pull the data. Browse the tree to "Datasets" and right-click to add a new dataset.
- In the "Dataset Properties", under the "Query" section, input the following code:
- After saving the dataset, you should see the variables for the dataset appear in the left-hand pane.
- In the Report Builder menu at the top, select "Insert" and select the down arrow under "Matrix" to manually insert a matrix table.
- You should be presented with a two-row, two-column matrix table. Start by dragging the [Path] variable to the lower-left box.
- Now click on the box with the [Path] variable, right-click to bring up the context menu, browse down to "Add Group" and select a "Child Group" for the row.
- This will bring up a "Tablix group" dialog box. Select the [InstanceName] variable for this group.
- In the lower-right column, drag the [Total_Disk_Space] variable to the box. Report Builder will default this value to the sum of all values. Right-click the field to bring up the expression option.
- Take out the SUM() calculation to leave just the "Total Disk Space" value, so the expression reads =Fields!Total_Disk_Space.Value rather than =Sum(Fields!Total_Disk_Space.Value).
- In the same column, right-click for the context menu and select "Insert Column" -> "Inside Group - Right". Do this again to add another column.
- Select the top header, right-click for the context menu and select the option to "Split cells". Now you can drag the [Free_Megabytes] variable to the next to last column and the [Used_Space] to the last column.
- Your report should now look something close to this, two rows and five columns, each with unique variables and headers.
- Once you run the report and can see the data, go ahead and save the report directly to the SCOM Report Server. If you select the "My Reports" folder, your report will show on the reports page under "Authored Reports".
The view definition (pasted into the view designer's bottom pane):

SELECT dbo.vManagedEntity.Path, dbo.vPerformanceRule.CounterName, Perf.vPerfHourly.AverageValue,
       dbo.vPerformanceRuleInstance.InstanceName, Perf.vPerfHourly.DateTime
       -- (select list inferred from the columns the report queries pull from this view)
FROM Perf.vPerfHourly
     INNER JOIN dbo.vPerformanceRuleInstance ON Perf.vPerfHourly.PerformanceRuleInstanceRowId = dbo.vPerformanceRuleInstance.PerformanceRuleInstanceRowId
     INNER JOIN dbo.vManagedEntity ON Perf.vPerfHourly.ManagedEntityRowId = dbo.vManagedEntity.ManagedEntityRowId
     INNER JOIN dbo.vPerformanceRule ON dbo.vPerformanceRuleInstance.RuleRowId = dbo.vPerformanceRule.RuleRowId
WHERE (dbo.vPerformanceRule.ObjectName = 'LogicalDisk')
The dataset query for the report:

SELECT Path, InstanceName, DateTime,
       [Total Disk Space],
       [Free Megabytes],
       [Total Disk Space] - [Free Megabytes] AS "Used Space"
FROM (SELECT CounterName, AverageValue, InstanceName, Path, DateTime
      FROM vCustomHourlyLogicalDiskPerf) AS SourceTable
PIVOT (
      AVG(AverageValue) FOR CounterName IN ([Total Disk Space], [Free Megabytes])
) AS PivotTable
WHERE DateTime >= @Start_Date AND
      DateTime <= @End_Date AND
      NOT InstanceName = '_Total' AND
      NOT InstanceName LIKE '\\?\Volume%'
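If the PIVOT step in the dataset query is hard to picture, here is a small Python sketch that emulates it with SQLite. SQLite has no PIVOT operator, so conditional aggregation stands in for it; the table name mirrors the custom view, and all sample values are invented for illustration:

```python
import sqlite3

# Emulate the PIVOT: rows of (CounterName, AverageValue) per disk instance
# become one row with a column per counter name.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE vCustomHourlyLogicalDiskPerf (
    CounterName TEXT, AverageValue REAL, InstanceName TEXT, Path TEXT, DateTime TEXT)""")
rows = [
    ("Total Disk Space", 512000.0, "C:",     "server01", "2012-07-17 01:00"),
    ("Free Megabytes",   128000.0, "C:",     "server01", "2012-07-17 01:00"),
    ("Total Disk Space", 512000.0, "_Total", "server01", "2012-07-17 01:00"),
]
con.executemany("INSERT INTO vCustomHourlyLogicalDiskPerf VALUES (?,?,?,?,?)", rows)

# AVG(CASE WHEN ...) does the same job as PIVOT (AVG(AverageValue) FOR CounterName IN (...))
query = """
SELECT Path, InstanceName,
       AVG(CASE WHEN CounterName = 'Total Disk Space' THEN AverageValue END) AS Total_Disk_Space,
       AVG(CASE WHEN CounterName = 'Free Megabytes'   THEN AverageValue END) AS Free_Megabytes
FROM vCustomHourlyLogicalDiskPerf
WHERE InstanceName <> '_Total'
GROUP BY Path, InstanceName
"""
for path, inst, total, free in con.execute(query):
    used = total - free  # same arithmetic as [Total Disk Space] - [Free Megabytes]
    print(path, inst, total, free, used)  # prints: server01 C: 512000.0 128000.0 384000.0
```

The `InstanceName <> '_Total'` filter plays the same role as the `NOT InstanceName = '_Total'` condition in the report query, dropping the rolled-up total instance so only real volumes appear.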
Alternatively, if you didn't set up (or don't want to set up) the WMI counters for logical disk size, you can extrapolate the information with the following query:
SELECT Path, InstanceName, DateTime,
       [% Free Space],
       [Free Megabytes],
       [Free Megabytes] / ([% Free Space] / 100) AS "Total Disk Space",
       ([Free Megabytes] / ([% Free Space] / 100) - [Free Megabytes]) AS "Used Disk Space"
FROM (SELECT CounterName, AverageValue, InstanceName, Path, DateTime
      FROM vCustomHourlyLogicalDiskPerf) AS SourceTable
PIVOT (
      AVG(AverageValue) FOR CounterName IN ([Free Megabytes], [% Free Space])
) AS PivotTable
WHERE DateTime >= '7/17/2012' AND
      DateTime <= '7/18/2012' AND
      NOT InstanceName = '_Total' AND
      NOT InstanceName LIKE '\\?\Volume%' AND
      [% Free Space] > 0.000001
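The extrapolation in that query is just algebra on the two counters SCOM already collects. A quick Python check of the same formulas, using invented sample numbers:

```python
# Recompute the derived columns from the alternative query.
# Sample inputs are invented; units are megabytes.
free_megabytes = 128000.0   # [Free Megabytes] counter
pct_free = 25.0             # [% Free Space] counter

# [Free Megabytes] / ([% Free Space] / 100) AS "Total Disk Space"
total_disk_space = free_megabytes / (pct_free / 100)
# total minus free AS "Used Disk Space"
used_disk_space = total_disk_space - free_megabytes

print(total_disk_space, used_disk_space)  # prints: 512000.0 384000.0
```

This also shows why the query ends with [% Free Space] > 0.000001: a completely full (or misreporting) volume would make the division blow up, so those rows are filtered out.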
This is a super basic, Excel-like report. In Part-3, we'll take a stab at adding some graphics and formatting the output in gigabytes instead of megabytes, with commas and only two decimal places. By using custom views, you can aggregate a lot of information to query more easily later with much simpler SQL statements. Those views may get deleted during an update, so that is something to keep in mind, since this is a bit of a hack. If you find that you're missing data, you may want to increase the collection frequency for the devices or check for WMI connection errors on the servers you are collecting WMI performance data against from Part-1.