Analyzing PI System Data Workbook
Version 2022
Analyzing PI System Data
OSIsoft, LLC
1600 Alvarado Street
San Leandro, CA 94577
All rights reserved. No part of this publication may be reproduced, stored in a retrieval
system, or transmitted, in any form or by any means, mechanical, photocopying,
recording, or otherwise, without the prior written permission of OSIsoft, LLC.
OSIsoft, the OSIsoft logo and logotype, Managed PI, OSIsoft Advanced Services,
OSIsoft Cloud Services, OSIsoft Connected Services, OSIsoft EDS, PI ACE, PI
Advanced Computing Engine, PI AF SDK, PI API, PI Asset Framework, PI Audit Viewer,
PI Builder, PI Cloud Connect, PI Connectors, PI Data Archive, PI DataLink, PI DataLink
Server, PI Developers Club, PI Integrator for Business Analytics, PI Interfaces, PI JDBC
Driver, PI Manual Logger, PI Notifications, PI ODBC Driver, PI OLEDB Enterprise, PI
OLEDB Provider, PI OPC DA Server, PI OPC HDA Server, PI ProcessBook, PI SDK, PI
Server, PI Square, PI System, PI System Access, PI Vision, PI Visualization Suite, PI
Web API, PI WebParts, PI Web Services, RLINK and RtReports are all trademarks of
OSIsoft, LLC.
All other trademarks or trade names used herein are the property of their respective
owners.
How to Use this Workbook
Each Main Heading describes a high-level valuable learning topic.
User manuals, learning workbooks, and other materials used in class can be downloaded from http://techsupport.osisoft.com. A login to an OSIsoft technical support account is required.
Software Version
PI DataLink 2019 SP1 Patch 1
Microsoft Office 2016
PI ODBC Driver 2016 R2
PI SQL Client 2018 R2
PI Integrator for Business Analytics Advanced Edition 2020 R2 Patch 1
PI OLEDB Enterprise 2019
Microsoft SQL Server 2016
PI Data Archive 2018 SP3 Patch 2
PI Asset Framework 2018 SP3 Patch 3
PI Vision 2021 Patch 1
Contents
1 Welcome .............................................................................................................. 5
1.1 Course Environment .................................................................................. 5
1.2 Review PI System Architecture ................................................................. 6
1.3 Assets and Tags – The Basic Building Blocks in the PI System ............. 8
8 Analyzing Events............................................................................................... 90
8.1 Objectives ................................................................................................ 90
8.2 PI Event Frames in PI System Explorer .................................................. 90
8.3 PI Event Frames in PI DataLink ............................................................... 96
8.4 PI Event Frames in PI Vision ................................................................. 102
8.5 Discussion.............................................................................................. 109
1 Welcome
The PI System collects, stores, and manages data from your plant or process. Products called PI Interfaces and PI Connectors read data from your data sources (control systems, instrumentation, etc.), timestamp it, and write it to tags on the PI Data Archive. The data is organized and given context using PI Asset Framework. Users get data from the Data Archive and Asset Framework and work with it using a variety of client tools, such as PI Vision and PI DataLink.
Sometimes the architecture can be very simple. Some customers have as few as one or two interfaces feeding data to a single Data Archive, and access to data is through that single Data Archive. In a large enterprise, however, the architecture becomes more complex.
[Figure: Typical enterprise PI System architecture, spanning Data Sources, the Process Control Network (PCN), a DMZ, the Corporate LAN, and VPN access. Data sources (DCS/PLC, OPC process controls, relational databases such as SQL Server, text/flat files, network devices via IT Monitor, PI Manual Logger) feed through interfaces and a PI to PI interface in the DMZ to an HQ PI Data Archive and PI AF Server (Asset Analytics, Event Frames, Notifications) on the Corporate LAN, where clients such as PI ProcessBook and PI DataLink consume the data.]
There are often several Data Archives in an organization, aggregating data from
lower levels. Some corporations have Data Archives dedicated to servicing their
clients with restricted company data.
1.3 Assets and Tags – The Basic Building Blocks in the PI System
Objectives:
• Define an AF asset and its components: elements and attributes.
• Define four attribute types: Static (None), PI Point, Formula, and Table
Lookup.
• Define a Data Archive Tag with the attributes Tag Name, Descriptor, and
Point Source.
• Define the different data types that can be stored in Data Archive Tags.
Figure 3: Assets and Tags
1.3.3 Some Basic Properties and Why They Are Important to You
AF attributes and Data Archive points have a set of properties that define them.
Some common properties used in client tools are for display or informational
purposes.
Attribute name
The attribute name is similar in concept to the point description. A detailed name
for the attribute may help the user identify the source of the information.
Tag name
Unique name is used to create points for storage in the Data Archive. Points for
data attributes storage can be built through AF templates using substitution
parameters for local naming convention or can be searched for on the Data
Archive. Creating points through templates, lends consistency in nomenclature
making searches easier for PI Administrators. For example, which might be
easier to locate in a search?
Point: M03_E1P1_MOTDRV1202_RUNSTAT
Attribute: Machine3 Enclosure 1 Panel 1 Motor Drive 1202 Run Status
Substitution parameters are variables placed in attribute templates for PI point and PI point
array data references representing portions of the AF hierarchy.
For example, %Element% is a substitution parameter that represents the element name.
After you create an element based on that template, you tell AF to create the data reference.
When AF creates the reference, it substitutes the current element name wherever
%Element% is present.
Descriptor
This is the human-friendly description of the Data Archive Point, similar to the
attribute. The descriptor is often a search criterion since the point name is not
always intuitive. Often the point name is some sort of abbreviated convention
and the descriptor captures the “full name.”
Point source
Points can be related to their interfaces that collect the data by a point attribute
called pointsource. Grouping by point source allows all of points associated
with a particular device to be identified by searching for all points of a certain
point source. This assumes that the user knows the point sources in use and
that will not be true in most situations.
Point type
The PI point attribute that specifies the data type for the values that a point
stores. The possible point types include int16, int32, float16, float32, float64,
digital, string, BLOB, and timestamp.
2 Business Intelligence
Business intelligence (BI) tools offer solutions to quickly analyze raw, un-
normalized, multidimensional data. Values from the PI Data Archive, external
metadata, and calculations from Asset Framework can be transformed by
business intelligence tools into actionable analysis and interactive reports in order
to gain insight into business and operational processes.
Later on in the course, we will explore the process of preparing the Asset
Framework model to add additional dimensions of information to our AF
database. The next step is extracting desired information (process data,
metadata, and event frame data) from the PI System through PI Data Access
tools. This data will be incorporated into a BI cube and used to develop
interactive reports that allow us to “slice and dice” our data and bring meaning to
our multidimensional data cube.
• Less work than Excel for more complex analysis and visuals
• Can solve problems that are simply too large for Excel and PI DataLink
• Cheap – Free download or $9.99 / month per user for Power BI Pro
• Live reporting and centralized web-based dashboards in Office 365 and Power BI
Server
• Slick visuals including 3rd Party Visuals in Microsoft AppSource
This course will be broken down into three main sets of exercises. In Part 1, we’ll use the PI Integrator for BA asset view to publish data from PI and spend a lot of time configuring Power BI. In Part 2, we’ll get to know the event frame and streaming views in PI Integrator for BA and test some Python code to predict values. In Part 3, we’ll make modifications to a PI AF hierarchy and then use prebuilt SQL tables to create a report in Power BI.
In Part 1, we will be working with a data set for a fleet generation company, which includes different KPI characteristics for 30 units, such as gas and steam turbines. The source data will be published in a data-science-ready format using PI Integrator for BA. Once this is done, we'll configure an array of Power BI visuals and integrate the results with PI Asset Framework and PI Vision.
The Fleet Generation database is not complete; we are going to add more analytics to it later, in Part 3. For the current exercise, this information is enough to get a good understanding of the data.
In this part of the class you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
• Better understand the data set used in the following chapters
We will take a few minutes to understand where the data set came from and relate the
sample Power BI report back to the PI System. We are working with a data set for a fictitious
fleet generation company. They have built a PI AF Hierarchy for their units serving a number
of geographical areas. In this course, we will focus on analyzing these units, which generate
energy using different technologies.
Open PI System Explorer and head to the Fleet Generation AF database. Drill down to the unit level (names starting with GAO_, BCU_, etc.) and inspect the available attributes. We will be using a subset of these attributes for all of our analysis, in addition to leveraging the AF hierarchy.
Browse the hierarchy, which is organized into Region, Station, and Unit.
Most of the child elements are based on the generic Unit template.
Those in the CENTRAL region are based on the Gas Turbine template, which is
derived from the UNIT template and has additional attributes.
Gas Turbines have all the attributes from the Gas Turbine template, but also
inherit those from the UNIT Template:
Data from this PI AF hierarchy will be published for use in a Power BI report in a
later exercise.
Getting the data out of the AF structure and into the client tools requires the use of integration software such as the PI Integrator for Business Analytics, or PI System Access software such as PI SQL Client. Another option is a custom application that uses, for example, PI AF SDK calls to get AF data into a 3rd-party client. This chapter will discuss the first of these methods of extracting the data.
The PI Integrators join your Business Intelligence (BI) infrastructure with OSIsoft’s PI
System, allowing you to combine high-value Operation Technology (OT) data from the PI
System with Information Technology (IT) data for reporting, analytics, and application
integrations. The integration of data from OT systems, such as automation and control
systems and internet-enabled devices, with data from IT systems, such as transactional and
business process systems, increases situational awareness, adds transparency into
industrial operations and business processes, and makes it possible to anticipate problems
and identify opportunities for process improvements.
[Figure: Data flow from the PI Data Archive & PI AF Server (PI AF databases for static data, PI Points for time-series data/tags) through the PI Integrator for BA (web UI, worker node) to SQL Server (relational tables) and on to the client machine (Microsoft Power BI, PI software).]
The following is a breakdown of the My Views page layout, and the different
operations available.
Note: The information regarding the My Views page layout is available within the PI
Integrator for Business Analytics User Guide.
1. All the views to which you have access are listed in the table
2. Click to create an Asset View that is based on Elements and Element Templates
3. Click to create an Event View that is based on Event Frames and Event Frame
Templates
4. Click to create a Streaming View for publish targets that support streaming such as
Apache Kafka, Azure Event Hub, and Azure IoT Hub.
5. To modify a view, select the view in the table and click Modify View.
6. To delete it, click Remove View. Deleting a view removes data from the buffer,
therefore freeing up space. However, this does not free up the available output streams
allowed with your license.
7. For the selected view, the Overview, Log and Security tabs provide additional details
about the view.
8. The red message counter icon at the top right shows that there are warning and error messages recorded by PI Integrator for Business Analytics. Click the icon to open the message list.
9. Click the gear icon at top right to see the version of PI Integrator for Business Analytics
and AF you are using.
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Clear the Asset Name checkbox, change it to filter on the Region template, and click Save:
Drag and drop Albertsville to the Shape configuration, and change it to filter on
the Station Template:
Drag and drop GAO01 and select Unit as the Template; this time, check the box to search derived templates.
Click GAO01, then hold Ctrl and multi-select Demand, Gross Generation, Hourly Capacity, Net Generation, Technology, and Unit Status. Drag and drop these selections to the Shape configuration.
There should be 30 matches in the preview; click Next in the top right corner.
We now see a preview of the data using the default Time Range and interpolation
mode.
We want to publish Hourly data for the time period 01-Jun-19 00:00:00 to 31-Aug-19 23:00:00. Modify the Start Time and End Time and click Apply:
Click Edit Value Mode and change the time step to 1 hour, then Save Changes:
The TimeStamp column should now reflect changes to the Start, End, and Value
Mode:
Now we’ll add some additional time columns that will come in handy later when building the reports. Click Add Column. Select the Time Column tab. Select Month, Month Name, Week of the Year, and Hour, then click the arrow to bump them over to the right:
Now that the time ranges and columns have been specified, click Next.
Now we can choose what target to publish to. This depends on the platform used to support the front-end application, but for our purposes we’ll publish to a SQL Server. Select SQL Server for the Target Configuration, leave Run Once checked, and click Publish:
We will now spend some time configuring a Microsoft Power BI report. The first
step is importing the data.
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Import the Fleet Generation table.
Approach:
Open Microsoft Power BI and start a new report.
There may be a warning that the connection is not encrypted; this can be safely ignored. Click OK:
Expand the PIInt database, select the Fleet Generation table, and click Load.
Note that 66,240 rows have been imported. Power BI can load much more data, whereas Microsoft Excel has a limit of roughly 1 million rows.
In case there were mistakes or problems with the previous steps, a starter .pbix
file has been created with the raw data set already imported with columns that
will match the exercises exactly.
Open C:\Class\Part 1 - PI Integrator for BA\Starter File - Part 1 Fleet
Generation.pbix and use this as a starting point for the remaining exercises.
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objectives:
• Configure a Hierarchy
• Configure a Hierarchy Slicer
• Configure a Measure to calculate service hours
• Configure a Group to create bins for different load ranges which can then be used
for highlighting and filtering
• Configure a Stacked Bar Chart to display the service hours spent in each Load
Range by unit
• Configure a Table to show the top 20 units by average Loading
• Configure a Slicer to filter by Month
In this exercise, we will analyze characteristics of the different assets that generate energy. The goal is to assess the number of service hours spent in various high-load conditions, to better understand which assets need more attention, and to inspect the reasons for high energy generation.
Approach:
Within the fields list, drag and drop the Station field on top of the new Region hierarchy:
The Hierarchy Slicer is a custom visual that can be used to filter reports and mimic the PI AF
hierarchy. This is similar to the PI TreeView from PI WebParts.
Most custom visuals can be found on Microsoft AppSource. We will briefly go through the
procedure of how one would normally obtain a custom visual.
Search for a custom visual on Google or within AppSource and you’ll arrive at a page like
this:
https://appsource.microsoft.com/en-us/product/power-bi-visuals/WA104380820?tab=Overview
At which point you would click Get It Now, sign in using your work or school account, and
download the .pbiviz file.
We should now see the Hierarchy Slicer in the list of available visuals:
We will use a Hierarchy Slicer to leverage the existing PI AF hierarchy for filtering. Add a
Hierarchy Slicer by clicking the icon:
Experiment with the Hierarchy Slicer for a bit by drilling down through the levels. Note that
checking a box for a parent will also include the children. This is a great way to visualize how
filtering works in Power BI.
Change the Title of the Hierarchy Slicer to Location in the formatting options. Change the
color and increase the text size.
Service Hours
Now we’ll configure a Measure to calculate service hours. Each row in the data set
represents 1 hour, so we can simply count the number of rows that have been filtered
through user selection. This should make a bit more sense when it all comes together.
Right click ANY of the fields from the Fields list and select New measure:
Enter the below formula into the configuration box and hit Enter or click the Checkmark:
Measures and calculated columns both use DAX expressions. The difference is the context
of evaluation. A measure is evaluated on the fly using a subset of data, whereas a calculated
column is pre-calculated at the row level within the table to which it belongs. A simple way to
put it is that Measures take into account the filtering that has been set by the end user of the
report (the stuff they’ve clicked on), while calculated columns are computed row by row and
are not influenced by the report filtering.
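A hedged pandas analogy of that distinction (in DAX, a row-count measure is typically a COUNTROWS over the table; the column names below mirror the published Fleet Generation view and are assumptions):

import pandas as pd

df = pd.DataFrame({
    "Unit": ["GAO01", "GAO01", "BCU01"],
    "TimeStamp": pd.to_datetime(["2019-06-01 00:00", "2019-06-01 01:00", "2019-06-01 00:00"]),
})

# Calculated column: computed once per row and stored with the table, ignoring report filters.
df["Hour"] = df["TimeStamp"].dt.hour

# Measure: evaluated on the fly over whatever subset the user has filtered to.
selection = df[df["Unit"] == "GAO01"]   # e.g. the user clicked GAO01 in a slicer
service_hours = len(selection)          # each row represents one hour, so counting rows gives hours
print(service_hours)                    # 2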
Change the name to Net Generation (50 MW) and set the bin size to 50, then click OK.
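What the bin does to the data, shown as a hedged pandas sketch (the Net Generation column name is assumed from the published view):

import pandas as pd

df = pd.DataFrame({"Net Generation": [463.0, 478.2, 512.9, 131.4]})

# Each value is assigned to the lower edge of its 50 MW bin, e.g. 463.0 -> 450.
df["Net Generation (50 MW)"] = (df["Net Generation"] // 50) * 50

# Counting rows per bin is exactly the "service hours by load range" idea used in the chart below.
print(df.groupby("Net Generation (50 MW)").size())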
With the Stacked Bar Chart selected, drag and drop Fields from the data set into the field configuration boxes. Use Unit for the Axis, Net Generation (50 MW) for the Legend, and Service Hours for the Value:
Next we will apply some formatting and filters to make the data set more manageable. We’ll
change the color scheme and only show Net Generation greater than 450MW, since Net
Generation in the normal range is not of interest to us.
Filter for Net Generation greater than 450MW. Be sure to click Apply Filter:
Next go to the Visualization Options and sort by Service Hours (done by default in this
version of Power BI):
Next change the color scheme. With the Visualization selected, click the Format Icon in the
Visualization Pane and adjust the colors to better convey the severity of the Net Generation
levels.
The stacked bar chart should now look something like this:
Click some blank space on the canvas to deselect any visuals, otherwise you will
accidentally convert the Stacked Bar Chart to a Table.
Create a Table:
Drag and drop the Unit, Service Hours, Net Generation, Gross Generation and Demand
Fields into the Values section:
Change to summarize Net Generation as Average, then be sure to click Apply filter.
Go into the formatting options and change the orientation to horizontal to change the look of
the Slicer.
Reposition & Resize the slicer so all months are in a single row. Reposition & Resize the
table and stacked bar chart:
To put the Months in chronological order, we will sort the Month Name column in the data
set by the Month column where the months are numbered. Go to the Data View and click
one of the fields to make the data show up:
Select the Month Name column, open the Column Tools Ribbon, and Sort by Column ->
Month:
Click the bars on the Net Generation by Unit chart and the Month slicer buttons and
note how the service hours and units for that load range update on the table.
We will save formatting until the end in case we need to save time, but feel free to adjust the
formatting and add a title.
Linking to PI Vision
We have a PI Vision display for Units that we can link to from this report. We will utilize PI
Vision URL Parameters to set the same Unit in the PI Vision display that the user clicks on in
the Power BI report. The URL parameters reference guide can be found on docs.osisoft.com.
Take the above URL and append the following string to it in a text editor, then paste the URL
into Chrome:
Note that the ?Asset parameter denotes the path to the Asset in the PI AF hierarchy.
Once that is working, configure a Calculated Column to concatenate the URL with the Unit
asset path.
Right click on the header of ANY column and select New column:
For the DAX formula, enter the following and hit enter or click the checkmark:
PI Vision = "https://pisrv01.pischool.int/pivision/#/Displays/5/FleetGeneration" & "?Asset=\\PISRV01\Fleet Generation\All Turbines\" & 'Fleet Generation'[UNIT]
Next scroll all the way to the right and find the PI Vision column, then select it.
Go to the Column Tools ribbon, and change the Data Category to Web URL.
Now go back to the Report Tab and select the Table, then drag and drop the PI Vision field
as one of the table values
The links are now displayed, and they work, but they are not pretty to look at. Luckily, Power
BI has a feature that addresses this.
Go into the Formatting Options, scroll down to the Values section, and turn on the URL
icon:
Test the links to confirm that the PI Vision display is launched and the correct unit is set.
(Optional) Formatting
Take some time to apply formatting to make the report more visually appealing and easier to
read.
1. Add a Title text box for the report (Insert Ribbon -> Insert Text Box)
2. Add titles for the Stacked Bar Chart and Table, change the font color to black and
bump up the font size
Finally, test the links and experiment with filtering the report. We will come back to the Fleet Generation database in a later chapter and add more analytics and event frames to it, which will let us find more valuable information such as unit utilization, generating efficiency, and more.
The PI Integrator for BA (asset view) saved a lot of time and helped display PI data in a 3rd-party application like Microsoft Power BI. Let’s explore other functionality of PI Integrator for BA, such as the Event View and the Streaming View. The next chapter explains them and is based on a real project that was done in San Leandro. Let’s change our production area a bit and imagine ourselves as data scientists for a moment. All of these approaches can be applied in any industry and help solve many real cases.
6 PI Analysis Service
Expressions:
Expressions allow for multi-lined calculations that utilize mathematical operators and
functions, if-conditions, and PI time-based functions to perform advanced analyses.
Expressions, created for a given asset type (element template), are automatically applied to
all elements of that type.
Rollups:
Rollups allow for the calculation of summary statistics (averages, maximums, minimums) of
values from a set of AF attributes. Current statistical values can be written directly to the PI
Data Archive.
Event Frame Generation:
PI Analysis Service allows for the automatic detection of events that occur. These events are bookmarked, and information for any event type can be retrieved for further analysis.
Scheduling:
Expressions and Rollups can be scheduled to run whenever a new event arrives into the PI
Data Archive or calculated on a periodic basis.
Backfilling:
Results from all three types of analyses can be backfilled into the PI System.
6.2 Expressions
With Expressions, you can implement calculations through a set of built-in functions that take values of attributes in PI Asset Framework as inputs and output the results to other PI AF attributes.
Multi-line calculation dependency allows each expression to be written to a different output attribute, as well as the reuse of calculated results in subsequent expressions.
The following procedure can be used to configure an Expression analysis using a template:
1) In the AF Database Library, create a new analysis template of type Expression.
2) Define expressions for the calculations in the analysis template.
3) Define the scheduling for the analysis template.
4) Define output attribute templates to store results.
5) Create the PI tags used to store the results.
6) Evaluate and preview the data to validate calculations.
7) Backfill the calculation if required.
8) Confirm the backfilled data
9) Backfill the data for other elements sharing the same template.
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Utilization is a percentage that represents the amount of electrical power that a unit produced against its theoretical capacity. Configure, test, run, and validate analyses to calculate the percent utilization of all generating units.
Approach:
• In PI System Explorer, navigate to the Library in the Fleet Generation
database.
• Under Element Templates, select the UNIT element template.
• Select the Analysis Templates tab to configure the multi-lined expression
for Utilization:
Utilization = Total Hourly Gross Generation / Hourly Capacity
• Specify and configure an attribute template to store the results.
• Schedule the calculation to run periodically every hour.
• Backfill unit GAO01 for the past seven days.
Approach
From the Unit Template, found in the Library plug-in of the Fleet Generation
database, select the Analysis Templates tab.
Configure a new analysis. Name the analysis Utilization and set the analysis type
to Expression.
Configure the expressions for the hourly total of Gross Generation and Utilization.
Note: The HourlyTotal must be multiplied by 24 because the Performance Equation function TagTot assumes the units of the input attribute are per day. Conversion factors should not be needed elsewhere in PI Asset Framework, as UOM conversions occur automatically.
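A quick, hedged arithmetic check of that factor of 24, using a made-up constant load:

# Assumed: Gross Generation holds steady at 300 MW for the whole hour.
avg_gross_generation_mw = 300.0
tagtot_result = avg_gross_generation_mw * (1 / 24)    # TagTot time-weights assuming a per-day rate
hourly_total_mwh = tagtot_result * 24                 # 300 MWh, the expected total for that hour
print(hourly_total_mwh)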
Define two new output attribute templates by clicking Map -> New Attribute
Template.
The UOMs can be set to MWh and % in the Attribute Templates tab:
Switch back to the Unit Template Analysis Templates tab to schedule the Analysis
Template to run periodically at the top of each hour.
Set GAO01 as the Example Element and click on the Evaluate button to validate
the expressions.
Prior to backfilling data into the PI Data Archive, it is usually a good idea to preview
the results. Right-click Utilization and select Preview Results. Look at the results
for the past 7 days:
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
At this point, all the analyses for event frame generation have been set up for all
the units of Fleet Generation. In order to calculate past Utilization values and
generate history for analysis, the calculations must be backfilled.
Approach:
From PI System Explorer, select the Management plugin
Right now, the only Analyses that exist are those we just created, so one can simply select
All or Enabled to view the Utilization Analyses that we want to backfill.
Normally there would be several types of calculations, so we’d want to filter them by setting
up a search. Create a new search:
Name the search Utilization, then click Add Criteria -> Name, enter the name of the analyses, and click OK.
Click the checkbox to select all Utilization Analyses. Then Select the Queue operation and
set the start time to “*-7d” and the end time to “*”, select “Permanently delete existing data
and recalculate”, then click Queue:
Objective:
Not all of the electricity produced by our generators will make it out to the grid. Some will be consumed by the internal circuitry in the generator itself. The net generation is defined as the amount of gross generation, or the amount of electricity that a generator produces, less the electricity required to operate the unit. Calculate the generating efficiency, or the ratio of the net generation to the gross generation, expressed as a percentage.
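For reference, the arithmetic behind the requested attribute is just this ratio (sample values are made up):

gross_generation_mwh = 250.0   # electricity produced by the generator
net_generation_mwh = 237.5     # electricity actually delivered to the grid
generating_efficiency = net_generation_mwh / gross_generation_mwh * 100   # in %
print(round(generating_efficiency, 1))   # 95.0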
Approach:
• In the PI System Explorer, navigate to the Library in the Fleet Generation
database.
• Under Element Templates, select the UNIT element template.
• Select the Analysis Templates tab to configure the expression for
generating efficiency, named Generating Efficiency.
• Specify and configure an attribute named Generating Efficiency to store
the results with units of %.
• Schedule the calculation to run periodically every hour.
• Evaluate the calculation using example element GAO01 and preview the
results.
• Backfill all Efficiency analyses for the past seven days.
6.3 Rollups
The second analysis capability of the PI Analysis Service is known as
rollups. Rollups allow for the calculation of summary statistics for a set of attribute
values.
The types of summary statistics that are allowed are:
• Sum
• Average
• Minimum
• Maximum
• Count
• Median
During the configuration of a rollup analysis template, when the source of the attributes to roll up is the child elements, PI System Explorer is not aware of which parent element to retrieve child elements from. As such, when configuring a roll-up analysis template, you will need to specify an example element. Note that when configuring a roll-up at the element level, you will not need to select an example element, as the child elements come from the specific, selected element.
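To make the rollup idea concrete, here is a hedged pandas sketch of what an Average Utilization rollup computes; the element names, column names, and values are assumptions for illustration only:

import pandas as pd

# Utilization of each child element (unit) under one parent element (station); values are made up.
units = pd.DataFrame({
    "Station": ["Albertsville", "Albertsville", "Albertsville"],
    "Unit": ["GAO01", "GAO02", "GAO03"],
    "Utilization": [82.0, 74.5, 91.0],
})

# The rollup summarizes the child attribute values at the parent (station) level.
average_utilization = units.groupby("Station")["Utilization"].mean()
print(average_utilization)   # Albertsville -> 82.5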
The general process to properly configure and backfill an analysis template is:
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective: Management would like to have visibility over the average percent
utilization of all generating units for each substation. Roll up the average
utilization to the substation level.
Approach:
From PI System Explorer, go to the Library. Then select the Station Element
Template. From the Analysis Templates tab, create a new Analysis called
Average Utilization with Analysis Type Rollup.
Specify the rollup attributes from child elements and set the example element to
be Central\Albertsville.
Set the attribute name field to Utilization. This mask will automatically select all
Utilization attributes from the child elements of the Albertsville station. However in
the preview only the Utilization from the Sample Child Element will be shown:
Select Average as the rollup function and create a new Output Attribute called
Average Utilization.
Set the default UOM of this new attribute to % in the Attribute Templates tab:
In the Analysis Templates tab, Click on the Evaluate button to verify the result of the rollup
function.
From the element hierarchy, verify that the PI tag exists for the attribute.
From the Management pane, backfill your Average Utilization rollup analyses for the past 7
days and verify the data has been backfilled by trending the Average Utilization attributes.
6.3.2 Exercise – Calculate Total Hourly Gross Generation for Each Station
Objective:
Management would like to gain more insight into the Total Hourly Gross
Generation at each station. Create a rollup analysis to totalize the Total Hourly
Gross Generation at the station level.
Which station produces the most power?
Approach:
• Open up the Station Element Template from the Fleet Generation
Database Library.
• Add a new analysis called Total Hourly Gross Generation with analysis
type of Rollup.
• Select Central\Albertsville as the example element.
• Specify the criteria to select the attributes used for the rollup calculation.
• Use the Sum function and output Attribute Total Hourly Gross
Generation.
• Specify the output attributes (ensure tags are created).
• Set the UOM to MWhr.
• Schedule the calculation to be event-triggered.
• Verify data using Evaluate and Preview Results.
• Backfill for the past 7 days and verify.
Events are important process or business time periods that represent something
happening that affects your operations. In the PI System, events are known as
event frames. Thanks to PI Event Frames, you can analyze your PI data in the
context of these events rather than by continuous time periods. Instead of
searching by time, PI Event Frames enables users to easily search the PI System
for the events they are trying to analyze or report on.
With PI Event Frames, the PI System helps you capture, store, find, compare and
analyze the important events and their related data.
PI Event frames represent occurrences in your process that you want to know
about, for example:
The following table presents some of the features and advantages of PI Event
Frames:
A ‘Unit Status’ attribute is associated with each generating plant in our hierarchy.
This attribute will be used to monitor the downtime associated with each plant.
'Exhaust Gas Temperature - #1 Probe' and 'Exhaust Gas Temperature - #2 Probe' are associated only with gas turbines, and they will be used to monitor temperature anomalies.
There are three time range retrieval methods, the use of which depends on what data is to
be captured, and how it is to be displayed.
Time Range
This method allows a time range to be supplied by the end user. When any single value query is made, this period of time is used for calculations. If, however, a period of time is supplied from an application, such as a generated Event Frame or a PI Vision display, then the user-specified time range is discarded and the application time period is used.
Time Range Override
The Time Range Override behaves in the same way as the Time Range method during all single value queries, as it uses the user-specified time period. When a period of time is supplied from an application, the application time range is discarded and the user-specified period is used.
Not Supported
Not Supported does not allow for a time range to be supplied by the end user. As such, an
error is returned by any request for a single value. If a period of time is supplied however,
then this range is adopted by the method for the calculation. The result is then the same as
the Time Range method.
There are different use cases for the methods, so care must be taken to ensure the correct
method is used.
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
The gas turbines in the Fleet Generation database each have two temperature
sensors. Create an Event Frame template with appropriate attributes to help
monitor and analyze potential issues with gas turbines. The event frame should
capture the real-time data specific to gas turbines and the current status and
duration of the gas turbine.
Approach:
• Create an Event Frame template.
Select the Attribute Templates tab. Right click in the white space to create an
attribute.
Name the Attribute Unit Status. Select Enumeration Sets => Status as the
value type.
Note: Substitution parameters cannot be used to make a reference to an attribute from the
Element Template that is not a PI Tag.
Upon completing the definition, click OK. The Settings will be completed as
seen below:
Create a second attribute to store the Duration (UOM = second) of the event
frame. The Duration attribute will be populated by the new EventFrame()
function in a later exercise. It’s just a placeholder for now.
Create a third attribute to store the Technology. For the Value Type, select
String and for the Data Reference, select String Builder.
Note: When the event frame attribute’s data reference is set to PI Point, the syntax
.\Elements[.]|Attribute only allows for the reference to PI Point Data Reference attributes.
Element attributes configured as formulas and table lookups cannot be passed to event
frames using a PIPoint Data Reference. Instead, for attributes configured as formulas or
table lookups, select String Builder as the data reference.
Continue to create the following additional attributes. Make sure units are properly set. The fastest way to accomplish this is to copy and paste these attribute templates from the Gas Turbine element template.
Once these 5 attributes have been pasted into the Gas Turbine Temperature Anomaly Event Frame Template, select them all and enter .\Elements[.]|%Attribute%;TimeRangeMethod=StartTime as the configuration string (copy/paste from Unit Status) to set the data references and retrieval method in bulk:
Note: %attribute% will substitute in the name of the event frame attribute template. This will
then point to the corresponding attribute in the referenced element. You can also select
multiple attributes when making modifications to the attribute configuration.
Create a manual Event Frame to test the PI Point Data Reference configurations and
naming pattern. From the Event Frames section, right click on Event Frame Search 1 and
create a New Event Frame.
Check the Attributes tab, check in, refresh, and confirm that there are no errors in
the PI Point DRs (Attributes). Also confirm that the naming pattern resolved
correctly. You will get different values than the screenshot since values are
randomly generated and of course get a different timestamp (Event Frame start
time) than when the screenshot was taken.
Objective:
Generating units sometimes trip or go down. Management would like to
understand these downtimes, and determine how much demand was not
serviced. Event frames can help capture and bookmark these events for future
analysis. Develop an Event Frame template, called Inactivity using the same
Naming Pattern as the previous exercise, with fields required to track the desired
plant information to create reports for management. Specifically, management
would like to know the following:
Hints:
• For metadata, use String Builder as the Data Reference.
• For Total Demand, configure the attribute’s source units as MJ/s, By Time as “Time Range”, Relative Time as “-1s”, and By Time Range as “Total”.
• Verify correct event frame template configuration through the creation of a
test event frame.
Some notable features of Event Frame Generation in the PI Analysis Service include
the following:
Generate events: Easily configure event generation and automatically generate your events
from the trigger tags that are already collecting data in the PI Data Archive.
Handle multiple event types: Generate all your different event types, such as downtime,
excursions, batches, and other events, on the same asset with no restrictions on overlapping
events.
Standardize using event frame templates and populate event attributes: Different event
types have different attributes and information that are important for analysis. Standardize
your events using event frame templates, and use the PI Analysis Service to automatically
populate event’s attributes with data from the PI Data Archive and PI Asset Framework.
Backfill events: PI Analysis Service enables you to define your history backfill time window,
then it backfills the events from previous time periods automatically.
Root Cause: Event frames are great for capturing events that have occurred.
However, often times, the time period prior to the event provides more
information on the cause of the event. PI Analysis Service allows for root cause
analysis and will capture a fixed time period (default five minutes) before the
event start time for further analysis. This will be recorded as a Child Event Frame.
Time True: The trigger condition for event frames could potentially be noisy. PI
Analysis Service allows for the specification of a minimum time true period before
an event frame will generate.
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Each gas turbine has multiple temperature sensors. If any temperature reading deviates more than 20% from the average, then servicing is required. Use the Gas Turbine Temperature Anomaly Event Frame Template to help define these types of events.
Approach:
From the Fleet Generation Library, select the Gas Turbine Element Template and select the Analysis Templates tab. Create a new analysis template called Gas Turbine Temperature Anomaly, set the example element to GAO01, and set the Event Frame Template to Gas Turbine Temperature Anomaly.
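The actual trigger is configured as an expression in the analysis template; purely as an illustration of the 20% deviation rule it implements, a hedged Python equivalent (with made-up probe values) is:

def is_temperature_anomaly(probe1, probe2, threshold=0.20):
    # Flag the event when either probe deviates from the probe average by more than the threshold.
    average = (probe1 + probe2) / 2
    return any(abs(p - average) / average > threshold for p in (probe1, probe2))

print(is_temperature_anomaly(520.0, 515.0))   # False: probes agree
print(is_temperature_anomaly(520.0, 310.0))   # True: one probe deviates well over 20%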
Evaluate and preview the results to confirm there are no syntax errors.
From the Analyses plug-in, backfill event frames for the past seven days for all Gas Turbine
Temperature Anomaly analysis templates.
Objective:
Engineering would like to perform a deeper analysis into events over the past
week in which the generating units are inactive. Configure the event frame
generation to automatically capture new events and detect historical events.
Approach:
• Open up the UNIT Element Template from the Fleet Generation Database
Library.
• Add a new analysis called Inactive Units with analysis type of Event
Frame Generation.
• Specify the event frame template: Inactivity.
• Define the trigger condition to automatically detect inactive events.
• Add an Output Expression using the EventFrame(“Duration”) function.
• Verify data.
• Backfill for the past seven days.
7.3 Discussion
Objective: Brainstorm some real world uses for event frames at your own
company. Event frames can be used to capture duration and summary
information for events such as process excursions or downtime, but how would
this be implemented in your workplace?
Approach
• What kinds of events are of interest in your own process?
• Can you think of reliable trigger conditions?
• Do you have all the required data to identify these events?
8 Analyzing Events
8.1 Objectives
PI Event Frames are stored in PI AF databases. These event frames can be
viewed, filtered, and analyzed using PI tools such as PI System Explorer, PI Vision,
and PI DataLink.
8.2 PI Event Frames in PI System Explorer
From the properties of an Event Frame Search, you can specify the following search parameters for the time of the event frame, and the properties of the event frame:
Search type: Specify how to perform an event frame search. Find all event
frames that are entirely between a start and end time? Starting or ending
between a start and end time?
Search start: Specify the start time for event frame search.
Search end: Specify the end time for event frame search.
Include descendants: Search for all child event frames in addition to parent
event frames.
Event Frame Name: Filter based on the name of an event frame. Can use
wildcards.
Element Name: Filter based on the name of the referenced element. Can use
wildcards.
The resulting search query is combined into a string within the search field. This
allows for direct manipulation of the data fields without using the menu options.
The default search results bring back fields detailing the duration, start time, end
time, description, category, template, and a Gantt chart. Any of these fields can
be hidden by using the settings cog on the top right corner of the search results.
Additionally, values from the event frame attributes can be pulled back into the
search results through this same option list.
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Find all Inactive events for units GAO01 and GAO02 over the past 24 hours.
Examine the technologies that are involved in these inactive events.
Approach:
Click on the event frame plug-in. Right-click on Event Frame Search 1 and select
Properties.
From the Event Frame Search screen, specify the search start to “*-1d”, end to “*”, and
uncheck the “All Descendants” checkbox. For the Element Name textbox, specify GAO0?
and set the Template to Inactivity.
The search will return several inactive event frames. Select all of them and click on OK.
Click on the gear icon to the right of the fields, and remove the description and category
fields. Then click on “Select Attributes.”
Examine the Technology that is leading to the downtime for these Inactive Units.
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Find all temperature anomaly events for the gas turbines over the past 48 hours
that last for more than one hour. Add columns for Fuel Gas Pressure and for each
of the two gas temperature sensors.
Which unit has the highest starting Gas Fuel Pressure during a temperature
anomaly, and when was it?
Approach:
Perform an event frame search and format results for the desired attributes.
8.3 PI Event Frames in PI DataLink
There are two retrieval methods for Event Frames inside of PI DataLink:
Explore: Find Event Frames that meet the specified criteria and display them in a
hierarchical format, which is useful to analyze events sharing the same EF
template.
Compare: Find Event Frames that meet the specified criteria and compare their
attributes in a flat format. This allows a flat list of events with attributes relating to
child events all within a single row.
For either the Compare or Explore Events, you can specify parameters to search
for specific event frames. You can specify the following:
Event Name: Search pattern to search for specifically named event frames.
Search Start: Search for all event frames that occurred after this time.
Search End: Search for all event frames that occurred before this time.
Element Name: Search pattern for the name of the referenced element.
More search options: Search based on attribute values, duration, and category.
Number of child event levels: Only for “Explore Events” and allows for the
hierarchical display of events.
When searching with Explore Events, the results can be displayed hierarchically
based on the relationships between child and parent event frames.
To return more than 1000 event frames in the search preview, go to Settings in
the ribbon. Change the setting to 10,000 Event Frames.
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Temperature deviations could potentially mean damaged machinery. Engineering
is interested in analyzing the Natural Gas units. Find out how many instances of
temperature deviations occurred for gas turbines that lasted for more than 30
minutes.
Approach:
From the PI DataLink tab in Excel, select cell A1 and click Compare in the ribbon.
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Inactivity events can be costly as the generating units are not generating any
power. Using PI DataLink, analyze the total number of Inactivity events as well as the total amount of time the units were in an Inactive state over the past 24 hours.
Which generating unit had the most downtime events? Which generating unit had
the largest total downtime?
Approach:
Use PI DataLink to search for PI Event Frames and specify which attributes to return. Use
Excel to aggregate the events. It’s probably easiest to use a Pivot Table.
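For readers who prefer scripting, an equivalent aggregation in pandas might look like the sketch below; the column names (Unit, Duration in seconds) are assumptions about the DataLink output:

import pandas as pd

events = pd.DataFrame({
    "Unit": ["GAO01", "GAO01", "BCU02"],
    "Duration": [1800, 3600, 900],   # event durations in seconds (made-up values)
})

# Count of events and total downtime per unit, the same result a Pivot Table would give.
summary = events.pivot_table(index="Unit", values="Duration", aggfunc=["count", "sum"])
print(summary)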
8.4 PI Event Frames in PI Vision
To view events, open the Events tab on the left side. Here you will find events related to your
process, the color to the left of each event indicates its severity. By default, the time range of
the display and the context of the symbols in the display determine what events are shown in
the Events list in PI Vision. To discover additional events, modify the time range or choose
Edit Search Criteria. When you edit the search criteria, there are a number of filtering options
to find the Event Frames you are looking for.
You can select an event to find its Data Items (event attributes) and its start and end time.
By right-clicking on an event, you can choose Apply Time Range to apply the event’s time range to the display.
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Visualize Inactivity Events using PI Vision.
Approach:
Create a new PI Vision display. Drill down to asset GAO01 in the Fleet Generation database
Trend the Exhaust Gas Temperature Probes for the past 24 hours.
Click on Events in the top left and check “Automatically refresh the list”. By default, this will
load Event Frames for Assets on the display (in this case Turbine GAO01).
Right-click on one of the Inactivity Events and select Apply Time Range. The time range
will be applied to the temperature trends.
Right-click on one of the Events and select Compare Similar Events by Type. Trends of
the Event Frame trigger attributes for the selected Event Frame and 10 recent event frames
will be shown.
Edit Search Criteria to compare 100 Inactivity Events for All Turbines:
Other attributes from the Event Frames can be trended, but instead we will trend attributes
that are not included in the Event Frame but are included in the Asset. In the Attributes
Pane, drill into the turbine:
Then drag/drop the Fuel Gas Flow onto the trend area to add new trends
Use the scroll wheel on the right to scroll down and see the new trends
8.5 Discussion
Objective: Event frames can be difficult to grasp at first. Let’s repeat the
discussion from the previous chapter now that you’ve seen some examples.
Brainstorm some real world uses for event frames at your own company. Event
frames can be used to capture duration and summary information for events such
as process excursions or downtime, but how would this be implemented in your
workplace?
Approach
• What kinds of events are of interest in your own process?
• Can you think of reliable trigger conditions?
• Do you have all the required data to identify these events?
Another important piece of functionality in PI Integrator for BA is the event view and the streaming view; the streaming view is available in the Advanced Edition of the integrator.
The PI System Access license also includes PI SQL Client and the Real Time Query Provider (RTQP) Engine. PI SQL Client is generally more flexible than PI Integrator for BA. The main drawbacks are the difficulty of writing SQL queries and reduced throughput; for example, PI SQL Client would take longer to directly import several million rows during a report refresh. These two options have their own GUIs and are comfortable to work with. PI System Access also includes programmatic access to PI System data through developer technologies, such as PI AF SDK and PI Web API, which have no user interface and require good programming knowledge. This allows better integration with different types of 3rd-party applications and makes PI System data available for building advanced analytics and applying different data science methods.
In this part we will explore different tools for interacting with PI data as a data scientist. We
will learn how to explore PI data within the system before exporting it to develop a model. At
the end of this part, you will have gone through a small data science problem from data
exploration to deployment using data stored in the PI System.
Business Problem
We need to reduce the energy spent on cooling by optimizing the daily startup of individual cooling units in a commercial building.
Every day, the VAVCO cooling units adapt to changing temperature, relative humidity,
thermostat setpoints, building occupancy level, and other factors. The control system adjusts
several factors to provide the necessary cooling to the rooms to ensure tenant comfort.
• Turn-on at some point in the morning, before 7 AM, and bring the room temperature
to setpoint by 7 AM
• Keep the room temperature at setpoint during occupied hours
• Shut-down at 7 PM when the building becomes unoccupied
The initial startup should finish as close to 7 AM as possible. Reaching the setpoint too early
results in wasted energy cooling an unoccupied room. Conversely, if the setpoint is not
reached by 7 AM, employees will not be comfortable in rooms that have not been cooled.
The business unit believes that the current startup schedule could be improved by examining
the historical data.
Our objective is to predict how long a unit will take to reach the setpoint depending on
current conditions to ensure the unit reaches the setpoint as close to 7 AM as possible.
In this part of the exercise you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
• Better understand the data set used in the following chapters
Approach:
We need to know where to find the data before thinking about developing a model. Unlike
the PI administrators and subject matter experts (SMEs) who know their system intimately,
we don’t know what data is available, where to look to find that data, and how that data is
structured.
Questions:
• How many floors are in the building? How many units in each floor?
We now have a good sense for the data related to the asset in question. However, this may
not tell the whole story. Some information included in other assets could be useful in the final
model. Clear communication with subject matter experts is critical here – some data may be
important even though it’s not directly associated with the asset in the AF structure. What
other data might be valuable when considering air conditioning?
In this part of the exercise you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
• Plot dynamics during times of interest (PI Vision)
Approach:
Since we’re not experts in facilities management, it’s important to start building intuition
about how these data behave. This added context can help inform decisions down the line
as well as communication with SMEs. Visualizing evolution in time is necessary to
understand the time series data stored in the PI System.
Questions:
• Does the interior temperature always reach the setpoint after cooling begins?
• Drag the Room Temperature and Occupied Setpoint attributes to a new trend just
below the % cooling
We can now see that the unit continues to cool even after the room temperature reaches the
setpoint. We will come back to this later. Now, we’re looking at a trend for a specific VAVCO
unit. Did other units run that day? We can leverage AF templates to quickly change contexts
to other units.
• In the Asset dropdown, select VAVCO 6-06
Notice that the trends update to reflect the new asset. This unit reaches the setpoint more
quickly than VAVCO 6-09.
Now we know where to find important data for this system. For the purposes of this exercise,
we will be working with attributes on the Weather element and elements based on the
VAVCO template.
We will now shape the data into something that can be used in a machine learning
algorithm. Since we’re trying to predict the time it takes to cool a room to setpoint at the
beginning of the day, we’ll need to format the data into row-column format where each row is
a different cooling period and columns represent the outcome as well as possible predictors.
Event Frames enable this sort of time-based aggregation.
Questions:
• Compare the temperature profiles for a few event frames. On a given day, how much do they vary between floors? Between units? For a given unit, how much does the profile vary over a few consecutive days?
Solution:
• Open PI System Explorer, click the Library tab, expand Event Frame Templates
and select VAVCO startup
The Attribute Templates tab shows the data that the event frame will aggregate. Note that
it includes both start and end time values as well as the name of the reference element. See
below for a description of each of the attributes.
• % Cooling at VAV start -> The “cooling rate” at the start of the event
• Actual Airflow at VAV Start -> The airflow at the start of the event
• Damper Position at VAV Start -> The position of the damper at the start of the
event
• Outside Air Temperature at VAV Start -> The temperature outside the building at
the start of the event which is measured by a weather station placed at the top of the
building
• Outside Relative Humidity at VAV Start -> The outside Relative Humidity at the
start of the event
• Room Temperature at VAV Start -> The room temperature at the start of the event
• Setpoint at VAV Start -> The occupied setpoint at the start of the event. We have
captured this attribute because the setpoint can be changed manually and it’s not
always constant.
• Setpoint Offset at start time -> Calculated attribute showing how many degrees from the setpoint the room is at the start of the event
• Space Humidity at VAV Start -> The humidity of the room at the start of the event
• Setpoint reached -> Indicates whether the setpoint was reached within the period of the event frame. On some very hot days, the setpoint is never reached. Since we are only interested in the daily startup events, we close the events at 8 AM even if the setpoint hasn't been reached.
The start and end triggers of the event frames must be defined for the system to know which
time periods are of interest. Explore the triggers for the event frames that have already been
created:
This guarantees that the event frames will only capture the first event of the day and will end either when the room temperature is within 0.5 degrees of the setpoint or at 8 AM if the setpoint has not been reached. For the purposes of this exercise, we will only consider cases where the setpoint is reached before 8 AM. Let's compare event frames directly.
• Click the Events tab on the PI Vision display that you just created
• Right click on the event frame and select Apply Time Range
The display now only shows data within the time range of the event frame.
• Right click on the event frame and select Compare Similar Events by Type
• Save the display if you have not done so already
A trend of the attribute value over the course of each event frame that matches the search
criteria appears. This gives some sense of how the values change with time and how those
changes vary from event frame to event frame.
PI DataLink is an Excel add-in that enables users to easily retrieve data from the PI system
into Excel spreadsheets via several different functions. Given Excel’s ubiquity and low
learning curve, PI DataLink and Excel can be a good tool for quick ad-hoc analysis if you’re
less comfortable with Python.
Questions:
Now that relevant pieces of data have been structured by Event Frames, let’s put that data
together where we can do some analysis.
The event frames will be loaded into Excel. You should notice that there are nearly 2000
event frames starting on March 3, 2017 and ending on December 12, 2017. It makes sense
that we wouldn’t have any more recent event frames as air conditioning isn’t needed during
the winter months. The Space Humidity has a value of “No Data” for all event frames before
April 29. In practice, this can mean that the humidity sensor wasn’t running before then, or at
least the data was not stored before that date.
9.2.2 Directed Activity – Load PI Data into data frames with Python and R
Python and R have become very popular for data science. The AVEVA (PI) Technology
Enablement Team developed some libraries for data scientists to interact with PI data using
these familiar tools. They leverage the PI Web API, a REST endpoint for the PI System, to
access and shape data.
For the purposes of this exercise, some helper functions have been added on top of these
libraries to simplify the syntax.
If you’re not comfortable with Python, feel free to analyze the data in Excel. The worked
examples use Jupyter notebooks, but RStudio is also available if you’re more familiar with R.
We are going to use Jupyter notebooks in the following exercises, so let's take a look at how to work with Jupyter.
Jupyter Notebook is an open-source, interactive web application that allows users to create and share documents containing interactive calculations, code, images, and more. Users can combine data, code, and visualizations into a single notebook and create interactive "stories" that they can edit and share. Notebooks are documents that contain both computer code (such as Python) and other elements such as paragraphs, Markdown text, figures, and links.
We are not going to write any code ourselves; the required code is already available. To run a code cell, use the hot keys Shift + Enter or the Run button, which is marked in red on the image above.
To clear variable outputs and restart the code, select the Kernel tab:
Questions:
We’ve installed the Anaconda Python distribution for the purposes of this exercise.
We’ll need to convert the data to appropriate data types. Notice that the Setpoint reached
and Space Humidity at VAV Start have DICT types as values (curly braces {} show
dictionary data type). Those attributes correspond to formula data references in the event
frames. We’ll need to convert those to numerical values before we can proceed.
• Run these commands, which create a new column "Setpoint reached_values" in our data frame
• This column contains the extracted "Setpoint reached" values
• Run these commands, which create a new column “Space Humidity at VAV
Start_values” in our data frame
• This column includes extracted “Space Humidity at VAV Start” values
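The notebook already contains the exact commands; as a reference, here is a minimal sketch of the idea in pandas, assuming the event frames were loaded into a data frame named df and that each dictionary stores its numeric result under a "Value" key (both names are assumptions, not necessarily the lab's actual ones):

# Pull the numeric result out of the dictionary-typed columns.
# df and the "Value" key are assumptions; the lab notebook defines the real names.
df["Setpoint reached_values"] = df["Setpoint reached"].apply(
    lambda d: d.get("Value") if isinstance(d, dict) else d)
df["Space Humidity at VAV Start_values"] = df["Space Humidity at VAV Start"].apply(
    lambda d: d.get("Value") if isinstance(d, dict) else d)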
Note that the time stamps don’t align with what we expect. This is because they have been
translated to UTC time. Make sure to adjust them to Pacific time:
For the purposes of this exercise, we will focus on the event frames that eventually reach
their setpoint. Filter the event frames based on the “Setpoint reached” attribute.
Then convert the end time to a float value; e.g., if the setpoint was reached at 7:30 AM, the end time is 7.5 h.
If the setpoint was reached earlier than 7 AM, we call the difference waste and calculate its value.
If the setpoint was reached later than 7 AM, we call the difference employee discomfort.
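The working code is in the notebook; the following is only a minimal sketch of these steps, assuming the data frame is named df, the timestamps live in "Start Time" and "End Time" columns, and the flag created earlier is "Setpoint reached_values" (all assumed names):

import pandas as pd

# Convert the UTC timestamps to Pacific time (column names are assumptions).
for col in ["Start Time", "End Time"]:
    df[col] = pd.to_datetime(df[col], utc=True).dt.tz_convert("US/Pacific")

# Keep only the event frames that actually reached the setpoint
# (the stored value may be True or 1 depending on how it was extracted).
df = df[df["Setpoint reached_values"] == 1]

# End time as a float number of hours, e.g. 7:30 AM -> 7.5.
end_hour = df["End Time"].dt.hour + df["End Time"].dt.minute / 60.0

# Before 7 AM the extra cooling is waste; after 7 AM it is employee discomfort.
df["waste"] = (7.0 - end_hour).clip(lower=0)
df["discomfort"] = (end_hour - 7.0).clip(lower=0)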
Let’s exclude the features (columns) that don’t affect event frame duration
(reaching setpoint / cooling process):
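As a reference only, here is a minimal sketch of dropping unused columns and checking which attributes correlate with the startup duration; the column names are assumptions based on the attribute list above, and the notebook's actual code will differ:

import matplotlib.pyplot as plt

# Duration of each startup event in minutes, from the converted timestamps.
df["duration_min"] = (df["End Time"] - df["Start Time"]).dt.total_seconds() / 60.0

# Drop columns that are not useful as predictors (names are assumptions).
df = df.drop(columns=["Setpoint reached", "Space Humidity at VAV Start"], errors="ignore")

candidates = ["Outside Air Temperature at VAV Start",
              "Outside Relative Humidity at VAV Start",
              "Setpoint Offset at start time",
              "Space Humidity at VAV Start_values"]

# Numeric correlation of each candidate with the event duration...
print(df[candidates + ["duration_min"]].corr()["duration_min"])

# ...and a quick scatter plot of each one as a visual check.
for col in candidates:
    df.plot.scatter(x=col, y="duration_min")
    plt.show()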
Based on the correlation plots of each of the possible attributes, a few attributes appear
related to the event frame duration:
Note that we do not include % cooling at VAV start because SMEs have informed us that this is not a property of the room; it just reflects how the unit is operating. It's very important to consult your SMEs. We also don't include room temperature because we already have the setpoint and the setpoint offset (the difference between room temperature and setpoint). We could include the room temperature instead of the setpoint, but not both, since they are not independent.
We’ll use Python to build a model to describe the data. Rather than accessing the PI data
each time we run the model, the PI Integrator for Business Analytics will automatically shape
and export the PI data to a flat file that can be read for analysis.
Objective: Export data to a flat file using the PI Integrator for Business Analytics
Questions:
• Follow the instructions below to export the PI data to a text file. Does it have the
format that you expect?
• How is this different from loading data with PI DataLink or the PI Web API libraries?
Solution:
We’ll use Python or R to train a model to describe the data. Rather than accessing the PI
data each time the model is run, we’ll use the PI Integrator for Business Analytics to
automatically shape and export the PI data to a SQL table. The SQL table will then be read
by our Python/R code for analysis.
• Click Save
• The Matches list should populate with all the event frames.
• Click Next
Now is a good opportunity to review the data that the integrator has retrieved. You may
notice that the Duration column has integer values of 0-3, reflecting an integer number of
hours that the event frame was active. This does not give us much information. We’ll want to
use a floating data type with minutes as the unit of measure.
• Click Next
Now we must point to the destination for the formatted data.
Now the data is stored in a format that can easily be loaded into whatever software you like.
Questions:
We can load the data from the output file into our program of choice.
• Open the Jupyter notebook titled 3. Model Training and Evaluation and follow the
instructions
• There may be a compatibility warning – this can be safely ignored
• Load the modules
• Load the data file that PI Integrator for BA generated in the previous exercise
Assign "Event Frame duration" to y (the target we are trying to predict), and assign the features to X (the predictors that help us make the prediction).
The rest of the notebook goes through the process of assigning appropriate data types,
splitting the data into a training and a test set, then fitting the training data using an ordinary
least squares (OLS) model:
y = Σᵢ βᵢ xᵢ
where xᵢ is the predictor value and βᵢ is a fitted coefficient. This model has the advantage of being easily interpreted, as variable importance can be easily compared using the coefficients.
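The lab notebook walks through the real code; as a hedged reference, here is a minimal sketch of the same workflow using scikit-learn's LinearRegression, which fits an ordinary least squares model (the file name, delimiter, and column names below are assumptions):

import pickle
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# File exported by PI Integrator for BA in the previous exercise (name and delimiter assumed).
data = pd.read_csv(r"C:\Class\VAVCO Startup Events.txt", sep="\t")

features = ["Outside Air Temperature at VAV Start",
            "Outside Relative Humidity at VAV Start",
            "Setpoint Offset at start time",
            "Space Humidity at VAV Start"]
X = data[features]
y = data["Event Frame Duration"]            # target column name assumed

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

ols = LinearRegression().fit(X_train, y_train)
print(dict(zip(features, ols.coef_)))        # coefficients indicate variable importance
print("R^2 on the test set:", ols.score(X_test, y_test))

# Persist the model for the streaming step (the lab stores it as C:\Lab\Python\ols.pkl).
with open(r"C:\Lab\Python\ols.pkl", "wb") as f:
    pickle.dump(ols, f)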
A naïve solution to this problem – using the mean startup time in all cases – would result in
significant energy savings but also increase the amount of time that rooms fail to reach the
set point on time. The OLS model would save more energy than the naïve approach without
the added discomfort (note that this only includes data from the test set – actual savings
would be much higher):
Now that we're satisfied with the model, we'll move forward with operationalizing it. There is certainly room for improvement; you can try other prediction methods on your own using this data. The OLS model is saved to C:\Lab\Python\ols.pkl, but feel free to export whatever model you've created.
• Building real-time streaming data pipelines that reliably get data between systems or
applications.
• Building real-time streaming applications that transform or react to the streams of
data.
Kafka is run as a cluster on one or more servers that can span multiple datacenters. The
Kafka cluster stores streams of records in categories called topics. Each record consists of a
key, a value, and a timestamp.
Producer API
Applications can publish a stream of records to one or more Kafka topics.
Consumer API
Applications can subscribe to topics and process the stream of records produced to them.
Streams API
Applications can act as a stream processor, consuming an input stream from one or
more topics and producing an output stream to one or more output topics, effectively
transforming the input streams to output streams.
Connector API
Build and run reusable producers or consumers that connect Kafka topics to existing
applications or data systems. For example, a connector to a relational database
might capture every change to a table.
In Kafka, communication between the clients and the servers is done with a simple, high-performance, language-agnostic TCP protocol. This protocol is versioned and maintains backwards compatibility with older versions. A Java client is provided for Kafka, but clients are available in many languages.
Starting with v2.8, Kafka can be run without ZooKeeper. However, this update isn’t ready for
use in production.
For any distributed system, there needs to be a way to coordinate tasks. Kafka is a
distributed system that was built to use ZooKeeper.
Objective: Configure a local instance of Apache Kafka as a target for the integrator
Questions:
• Start Zookeeper and the Kafka server. Are you able to stream values via the
console?
• Configure the integrator to stream to the server. Do you see the expected objects?
• When would we want to poll vs. send live updates?
Solution:
Before we implement the model, let’s start Zookeeper and Kafka on PISRV01. Run the
following batch commands:
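(On a default Kafka installation for Windows these are typically bin\windows\zookeeper-server-start.bat config\zookeeper.properties followed by bin\windows\kafka-server-start.bat config\server.properties, run from the Kafka install directory; the lab machine may provide wrapper batch files, so use whichever files your instructor points you to.)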
Once both services have started up, we’ll publish a streaming view to the integrator.
• Change the VAVCO and Floor elements in the shape to their templates so all values
will get streamed
• Select Search Derived Templates when editing the VAVCO shape element
• Click Next
• Make sure the Message Trigger is set to every 1 minute
• Make sure that the message does not have multiple copies of the same attribute –
the pre-release build that we’re using does not always handle this elegantly
• If you do find duplicates, use the Message Designer to remove them until only the one "VAVCO" attribute remains
• Click Next
• Set Target Configuration to Kafka and check the Topic Name (it matches the view name by default). Click the Get Topics button. If the topic name has changed to some GUID, select the right topic name from the drop-down list.
• Click Publish
Check the topic publication from the command prompt
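One way to do this is with Kafka's console consumer, for example kafka-console-consumer.bat --bootstrap-server localhost:9092 --topic <your topic name> --from-beginning, run from the Kafka bin\windows directory; once the view is publishing, you should see the integrator's JSON messages scroll by. The exact topic name comes from the streaming view you just published.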
Questions:
• Write a script that consumes the topic created by the integrator. Are you able to see
the predicted values written to the console?
• Is there another way that we could achieve this same goal?
Solution:
• Open the Spyder editor on the desktop – it should open 4-StreamKafka.py in the
editor pane
• Review the different elements of the script
o Section 1 contains configuration information
o Section 2 translates the data stream to a prediction
o Section 3 connects to the PI Server using the PI Web API
o Section 4 maps the column names from the streaming view to the column
names of the event view (these will differ if the event frame attribute has a
different name than the element attribute)
o Section 5 connects to the Kafka topic where the integrator is publishing, then
makes the prediction and writes back to PI
• Update the parameters in the Configuration section of the script
o view_name: the name of the streaming view created by the integrator
o password: Windows password
o vavco_name: name of the key that contains the VAVCO value from the
streaming view – should not need to be changed in most cases
• Click Run > Run in Spyder and watch the new values written to the console
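For orientation only, here is a heavily simplified sketch of what such a script can look like, using the kafka-python package and the PI Web API; the topic name, attribute keys, WebId, and authentication are placeholders, and the real 4-StreamKafka.py handles these details differently:

import json
import pickle
import requests
from kafka import KafkaConsumer              # kafka-python package

TOPIC = "VAVCO Streaming View"               # streaming view / topic name (placeholder)
PIWEBAPI = "https://pisrv01.pischool.int/piwebapi"
TARGET_WEBID = "<WebId of the prediction attribute>"   # placeholder

with open(r"C:\Lab\Python\ols.pkl", "rb") as f:
    model = pickle.load(f)

consumer = KafkaConsumer(TOPIC, bootstrap_servers="localhost:9092")

for message in consumer:
    record = json.loads(message.value)
    # Map the streaming-view keys to the model's feature order (key names assumed).
    row = [[record["Outside Air Temperature"],
            record["Outside Relative Humidity"],
            record["Setpoint Offset"],
            record["Space Humidity"]]]
    prediction = float(model.predict(row)[0])
    # Write the predicted startup duration back to PI via the PI Web API
    # (authentication is environment-specific and omitted here).
    requests.post(f"{PIWEBAPI}/streams/{TARGET_WEBID}/value",
                  json={"Value": prediction}, verify=False)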
We have now taken a data science project from initial understanding to the proof-of-concept
stage built around PI System data. The model would result in measurable cost savings for
the customer. We used several off-the-shelf PI System products to enable this process,
while also leveraging Python and Apache Kafka for model development and implementation.
This exercise is not meant to cover every way a data scientist might interact with PI data. For
example, we used the PI Integrator for Business Analytics to shape and export the raw data.
You could also use the PI OLEDB Enterprise for this purpose, though that would require a bit
more scripting. The Optional section, below, walks through that process.
With all this data, we are now ready to build a report from scratch. We are going to repeat some concepts that we learned earlier in the course and also take a look at another approach: integrating PI data with a 3rd-party application. This approach requires SQL scripting knowledge, but there is no need to worry about that for now; just follow the instructions and analyze the results. Appendix A of this book has all the required information about accessing PI data with SQL queries, so you can review it later and work through the self-paced exercises at your own pace (Chapter 11, Appendix A).
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Modify the existing query to only include static data and exclude unnecessary
columns.
Approach:
• Open a Power BI Report and inspect a pre-existing query
• Modify the SELECT statement to only include static data columns
• Replace the query in the query editor with a new query
Copy the query to a new query in PI SQL Commander and Execute it as a sanity check:
Now the actual modifications. Edit the select statement to only include the Name field and
static attributes: Carbon Emissions, Generation Rate, Hourly Capacity, Operator, Shift
Hours, and Technology.
We're also going to want Station and Region information, which we could get from the ParentName() function; however, PrimaryPath is not available in the UNIT_Snapshot view. We'll have to:
• JOIN to the Element Table using the element ID
• Alias the UNIT_Snapshot view as us and the Element table as e
• Explicitly specify whether each column is from e or us
Paste the resulting query back into the Power BI query editor and click OK
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Start with the predefined query for UNIT_GetSampledValues and restrict the result set to
include relevant time-series data only. UNIT_GetSampledValues contains 1 interpolated
value per interval per element based on start time, end time and interval for the Unit
template. You can find more information on how UNIT_GetSampledValues was created in
Appendix A.
Approach:
• Execute predefined query for UNIT_GetSampledValues
• Modify the SELECT statement to only include Name and time-series data
columns
• Import the data set to Power BI
Modify the select statement to remove the TOP 100 part, change the start time
to *-7d, and change the end time to *. Alias e.Name as Unit. The query is then:
SELECT e.Name as [Unit], s.*
FROM
(
SELECT ID, Name, Template
FROM [Master].[Element].[Element]
)e
CROSS APPLY
[PowerBIReports].[DataModels].[UNIT_GetSampledValues]
(
e.ID, --Element ID
'*-7d', --Start Time
'*', --End Time
'1h' --Time Step
)s
WHERE e.Template = N'UNIT'
Then change the s.* part to only include the following columns: Timestamp,
Demand, Generating Efficiency, Gross Generation, Net Generation, Shift,
Total Hourly Gross Generation, and Utilization. Include UNITs and Gas
Turbines. The query becomes:
SELECT e.Name as [Unit], s.[Timestamp], s.[Demand], s.[Generating Efficiency],
s.[Gross Generation], s.[Net Generation], s.[Shift],
s.[Total Hourly Gross Generation], s.[Utilization]
FROM
(
SELECT ID, Name, Template
FROM [Master].[Element].[Element]
)e
CROSS APPLY
[PowerBIReports].[DataModels].[UNIT_GetSampledValues]
(
e.ID, --Element ID
'*-7d', --Start Time
'*', --End Time
'1h' --Time Step
)s
WHERE e.Template = N'UNIT' or e.Template = 'Gas Turbine'
Execute the query to make sure it still works. Then head back to Power BI.
In Power BI, do Get Data -> More -> Other -> OLE DB. Build the connection
string, enter the query where it says Advanced options, and click OK. Inspect
the preview and Load the data.
We’ve done this before so there isn’t a screenshot for every click this time.
It should import 5070 rows, 30 units x 24 hours x 7 days = 5040, plus 30 rows (1
per unit) for the start time.
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Inspect the automatically created table relationship. Power BI should have
detected two identically named columns exhibiting a one-to-many relationship.
Approach:
• Open the Power BI relationships tab and inspect the existing relationship
In Power BI, Go to the Relationships tab, then move the Unit Performance table
to the right so that the relationship line is clearly visible and click on the line:
We can see that Power BI has already detected the relationship between the two
tables. This can be thought of as a graphical representation of an INNER JOIN
statement. These tables are now joined on the Unit column. For this to work, one
of the tables must only contain unique values in the Unit column (ie. the column
can serve as a key), as is the case here. This is referred to as a one-to-many
relationship in some documentation. Each Unit only appears once in the Unit
Specifications table, whereas each Unit appears many times in the Unit
Performance table.
Next we will add a few calculations to the Unit Performance table that will help
assess the total Emissions produced and the total cost of generation. We will also
add columns for the day of the week and sort the Weekday in Sunday ->
Saturday order.
10.2.1 Directed Activity – Calculate the amount of CO2 produced every hour
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Add a DAX formula to calculate the amount of CO2 produced every hour
Approach:
• Add an additional column to the Unit Performance table with the amount
of carbon emissions produced.
In Power BI, navigate to the Data Tab and select the Unit Performance table.
Right-click any column and add a new column. Enter the following formula:
CO2 = 'Unit Performance'[Total Hourly Gross Generation]*RELATED('Unit
Specifications'[Carbon Emissions])
Note that Total Hourly Gross Generation has units of MWh, and Carbon Emissions has units
of g/kWh. Grams/kWh is the same as Kilograms/MWh, and therefore the result will be in KG.
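As a quick unit check: 1 g/kWh × 1,000 kWh per MWh = 1,000 g/MWh = 1 kg/MWh, so multiplying MWh by g/kWh yields kilograms directly.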
Objective: Add the cost calculation column to your Unit Performance table
Approach:
• Add an additional column named Cost to the Unit Performance table with the dollar cost per hour.
• Hint: Use this formula
Cost = 'Unit Performance'[Total Hourly Gross Generation]*RELATED('Unit
Specifications'[Generation Rate])*1000
• Take note of the input units. Cost should be in dollars.
10.2.3 Exercise – Add Column for Day of the Week and sort
Objective: Add the day of the week to your Unit Performance table, also add a
column with the numerical day of the week and sort by this value
Approach:
• Add an additional column named Weekday which shows the day of the week as a string using the FORMAT() function
Weekday = FORMAT('Unit Performance'[Timestamp],"DDD")
• Add another column named Numday which gives the numerical day of
the week using the WEEKDAY() function
Numday = WEEKDAY('Unit Performance'[Timestamp],1)
• Sort Weekday by Numday
Approach:
• Add a Table showing Average Generating Efficiency and Average
Utilization by Unit
• Add a Pie Chart showing how the CO2 emissions from each generation technology contribute to the whole. Add a Tooltip that shows the Cost when the user hovers over the Pie Chart
• Add a Clustered Column Chart showing the Sum of Total Hourly Gross
Generation with Technology as the Legend and Weekday as the Axis
• Add a Clustered Bar Chart showing the Average Hourly Cost with
Operator as the Legend and Technology as the Axis.
• Optionally improve the look and feel of the report through the use of
formatting. Bump up the font sizes, adjust column names and titles, etc.
Now you are going to create another sheet within the Fleet Generation report, with an interactive view of the carbon footprint on a US map. This part is intended to be more of a solo exercise, with fewer hints and step-by-step instructions.
Objective:
• Determine the carbon footprint of each unit and display on a US map.
Also create a report to analyze downtime (Inactivity) events.
Approach:
• Create a new Sheet in the Fleet Generation Report (the imported tables
will be re-used)
• Geospatial information for all units in Fleet Generation is located in
C:\Class\Final Exercise\Unit Coordinates.xlsx. This data will need to
be imported into the data cube.
• To get the Inactivity Events, you can use either PI SQL Client or PI Integrator for BA (examples and hints for both can be found in the "Hints: Event Frames with PI Integrator for BA and PI SQL Client" section below; the hints are explained using a different event frame template).
o You need a column to form the relationship between the Unit Specifications table and the Inactivity Event Frames; it's probably easiest to join on Unit Name (GAO01, etc.).
o Extract Event frames for the last 7 days
o If using PI Integrator for BA to publish the event frames, it’s
probably easiest to add the Unit Name to the Event Frame
template.
o If using PI SQL Client, Unit Name is the primary referenced
element. Start with the
ft_TransposeEventFrameSnapshot_Inactivity predefined query
and modify it as necessary.
• Import the Inactivity events for the last 7 days using whichever method
you prefer.
• Insert a map within the client to display the location of each of the units and the
associated total hourly carbon emissions.
• Insert a table showing the number of downtime events (Inactivity Event Frames) and
average duration of event frames for each unit. Add the Average Utilization to the
same table.
• Configure the report in such a way that the Table relationships are tested. Use data
from multiple tables in the same Visual.
• Customize the display to make it more user friendly for later use and report
generation. Improve the formatting and add slicers.
Hints:
• If using PI Integrator for BA to publish the Inactivity Event frames, the Data Context
must be set to Second or else it will round to the nearest whole hour (which will
always be zero).
• Use the ordinary map, not the ESRI one. Drag and drop latitude and longitude from
the table that was imported from the Unit Coordinates.xlsx spreadsheet.
10.5 Hints: Event Frames with PI Integrator for BA and PI SQL Client
Objective:
Create an Event View with PI Integrator for BA (first approach)
Approach:
We’ll create an event view for Gas Turbine Temperature Anomaly events. Open
Chrome and navigate to https://pisrv01.pischool.int:444/.
Create an Event View:
Point at the Fleet Generation database, since that’s where the Event Frames are:
You just need to find an event frame of the proper type in order to start building
the shape, but let’s look at the filtering options which will allow you to narrow
down the search.
Click the filter icon:
These settings allow filtering the preview and will help you find the event you’re
looking for. On a production system there could be over a million Event Frames
spanning many different types.
Filter Events by Events allows you to filter by Event Frame name or Event Frame
Template. Select Gas Turbine Temperature Anomaly as the Event Template
and click Apply Filters to filter out the Inactivity Events:
And while we won’t set anything, let’s take a look at More Options. Click the
question mark to see explanations for the Search Mode. Minimum Duration and
Maximum Duration are self-explanatory. All descendants applies to hierarchical
event frames, which we don’t have.
If you can’t find Gas Turbine Temperature Anomaly events, it’s possible that they
weren’t generated in one of the previous chapters. You may have to go back to
the Event Frame Generation chapter and complete the exercises.
Once you see some events, you can start to configure the Shape. Click one of
the Events, then drag and drop all Attributes:
Uncheck the box next to Event Frame Name and match Event Frames by
Template then Save.
Expand the Event Frame and drag and drop the Gas Turbine to the shape
configuration:
Change the Event Frame Duration Data Content to Second otherwise it will be
displayed as a round number of hours.
Be sure to click Apply.
Change the Start Time to *-7d and end time to *, click Apply, then move to the
Next Screen:
Select SQL Server as the target and have it run on an hourly schedule to keep
the Event Frames current, then click Publish.
Use SQL Server Management Studio to confirm that Event Frames were written
to the SQL Server table:
Remember that to import to Power BI from SQL Server, you use the SQL Server
provider:
Using option 2, for example, you could start with the Select Top 1000 Rows query in SQL Server Management Studio and clean it up to the following:
SELECT [Gas Turbine] as [Unit], [Gas Turbine Temperature Anomaly] as [Event Frame],
[Duration], [Exhaust Gas Temperature - #1 Probe], [Exhaust Gas Temperature - #2 Probe],
[Gas Fuel Flow], [Gas Fuel Pressure], [Gas Turbine Speed], [Technology], [Unit Status]
FROM [PIInt].[dbo].[Gas Turbine Temperature Anomalies]
10.5.1 Directed Activity (optional) – Create an Event Frame Query with PI SQL Client
(second approach)
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Approach:
We’ll get the same Gas Turbine Temperature Anomaly events as in the previous
exercise, but using PI SQL Client.
Open PI SQL Commander and Connect to the Fleet Generation database.
Recall that Template-Specific Data Models for both types of Event Frames were
created in a previous exercise. Execute the predefined query for the Gas Turbine
Temperature Anomaly View:
What follows the SELECT command identifies the columns to be selected from the table(s).
SELECT * – retrieves all the columns from the table being referenced.
The FROM command identifies the first (or perhaps only) table being queried.
SELECT column1, column2, column3 FROM tablename – retrieves all data for the 3
columns of tablename.
The WHERE command contains criteria to filter the data being retrieved.
equal (=)
greater than (>)
less than (<)
greater than or equal (>=)
less than or equal (<=)
not equal to (<>)
LIKE (which is a pattern matching operator)
Note: If the conditional clause compares against text, the text value is enclosed in single quotes ('text').
SELECT * FROM tablename WHERE column1 = 5 – retrieves only rows where column1 has a value equal to the number 5.
The LIKE operator is used to search for a specific pattern in a column. In conjunction with the LIKE operator, the % wildcard is used for comparison; % can represent zero, one, or multiple characters. Another wildcard is the underscore (_), which represents exactly one character.
SELECT * FROM tablename WHERE column2 LIKE '%unk' – retrieves rows from tablename where column2 values end with the letters 'unk'
SELECT * FROM tablename WHERE column2 LIKE '%un%' – retrieves rows from tablename where column2 values contain the letters 'un'
SELECT * FROM tablename WHERE column2 LIKE '_un_' – retrieves rows from tablename where column2 values contain only 4 characters and the middle two characters are 'un'
SELECT * FROM tablename WHERE column2 LIKE 'j%' – retrieves rows from tablename where column2 values start with the letter 'j'
To work with column/table names which have special characters, such as a space, use
square brackets:
If you wish to SELECT a column called Product Orders, enclose it in square brackets:
[Product Orders]
If you’re referring to a table whose full path is Fleet Generation, Region, Station, Unit, that
must be written as [Fleet Generation].[SouthEast].[Brick Canyon].[PLT02]
Any name may be wrapped in square brackets, so when in doubt as to what constitutes a
special character, wrap the name in square brackets.
PI SQL Client
The RTQP Engine via PI SQL Client became available in PI Server 2018 SP2
and is the successor to PI OLEDB Enterprise. The functionality is very similar to
PI OLEDB Enterprise in that PI AF and the underlying tags can be queried using
SQL statements. The main benefits over PI OLEDB Enterprise are better
performance, simpler queries, and simplified software architecture.
The following video, which is part of the new PI SQL Online course, explains the
history and landscape of the various PI SQL family of products. Your instructor
will play the video in class.
https://www.youtube.com/watch?v=W2nj29eNseQ
PI SQL Commander
The PI SQL Client installation includes a test environment, which handles the
OLE connection process and allows the user to execute queries and perform
other tasks. This test environment is PI SQL Commander Lite.
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Review predefined queries associated with the tables defined in PI SQL Commander.
Approach:
From within the Object Explorer, right click on PI SQL Client and click Connect:
Enter PISRV01 as the AF Server and Fleet Generation as the AF Database, then click
OK:
After making the connection, drill down to and expand the Master Catalog, then drill
down further to the Element Tables.
This runs a sample query, which returns the top 100 elements from the Fleet Generation AF
Database:
PI SQL Commander includes one sample SQL query for each Table.
Expand the Element Table and examine the available columns. It looks like the Sample
Query doesn’t return all available columns. Modify the query to select everything and then
Execute the query:
SELECT *
FROM [Master].[Element].[Element]
Now all columns are returned; the PrimaryPath column will probably come in handy.
It looks like we don't need most of these columns after all, so pare the query down to just the Name, HasChildren, PrimaryPath, and Template columns and Execute the following:
SELECT Name, HasChildren, PrimaryPath, Template
FROM [Master].[Element].[Element]
Now exclude the Regions and Stations. There are two easy ways to do this:
SELECT Name, HasChildren, PrimaryPath, Template
FROM [Master].[Element].[Element]
WHERE HasChildren = False
OR
SELECT Name, HasChildren, PrimaryPath, Template
FROM [Master].[Element].[Element]
WHERE Template = 'Gas Turbine' OR Template = 'UNIT'
In this part of the class you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Explore the Query Compendium
Approach:
We will now explore the Query Compendium, which is simply a library of example
queries.
Click Query Compendium along the top bar
The PI SQL Query Compendium will open on the right hand side
The first query sample has the structure to retrieve Fuel Gas Flow readings for
Boilers. This won’t run against the Fleet Generation database since there is no
Boiler template. There are also samples for:
• Sampled Data (Interpolated data with a start time, end time, and interval)
• Data sampled at a specific timestamp
• Summaries (Averages, Maximums, Minimums, etc)
All of the samples in the Query Compendium run against the NuGreen database.
XML files to deploy the sample NuGreen database are bundled with every installation of PI
SQL Client and can be found in C:\Program Files\PIPC\SQL\SQL Commander\PI SQL
Query Compendium\PI SQL Client\NuGreen.
For the Query Compendium queries to run as-is, we’ll have to connect to the
NuGreen database in PI SQL Client:
Note that it returns the current value of Fuel Gas Flow for each Boiler:
Highlight the second query (Sampled Boiler attribute values) and execute it.
The text for these queries can be copied to a new query and modified.
In this part of the class you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Use the Query Compendium as a starting point to develop a query.
Approach:
We will now modify one of the queries from the Query Compendium to run
against the Fleet Generation database. Let’s get a list of Generating Units and
their current Demand and Net Generation values.
Ensure that you’re connected to Fleet Generation:
Change Boiler to UNIT, change Fuel Gas Flow to Demand, and change the units
to MW then execute the query.
Now we can add Net Generation to the mix and remove some of the optional
columns.
Be mindful of the commas: you'll need to add a comma between each attribute grouping and also remove the final comma within each attribute grouping.
Now modify the SELECT statement to remove unnecessary columns (explicitly select
Demand_Value and Net Generation_Value). Also alias the columns. We’ll discuss aliases in
the following sections.
You can copy and paste the below query if you’re having trouble getting
everything perfect:
SELECT e.Name, v.Demand_Value as Demand, v.[Net Generation_Value] as [Net Generation]
FROM Master.Element.Element e
INNER JOIN Master.Element.Value
<
'UNIT',
{
'|Demand',
'Demand_TimeStamp',
'Demand_Value'
},
{
'|Net Generation',
'Net Generation_TimeStamp',
'Net Generation_Value'
}
>v
ON e.ID = v.ElementID
WHERE e.Template = 'UNIT'
Execute the query to make sure it functions. The output should be as follows; however, keep in mind that everyone will have different data.
Field Aliases
There’s a lot going on in the previous query that may have been glossed over. Let’s take a
closer look at the individual parts starting with the first line.
Aliases are being used to provide cleaner column names, essentially renaming the columns.
We made a decision not to include the timestamp columns and the _Value part is therefore
unnecessary.
The keyword AS is used any time an ALIAS is defined; however, the AS keyword can be omitted.
In addition, square brackets (eg. v.[Net Generation_Value] as [Net Generation]) are
necessary for any fields that contain spaces. This is sort of like using double quotes when
specifying the path to a file. When PI SQL Commander sees a space, it expects that the field
or command has ended and wants to process the next thing.
Table Aliases
Tables are also being aliased in the previous query though the AS keyword is omitted:
Sometimes table name or columns are lengthy or lack clarity. Using an ALIAS can simplify
typing and clarify table field names that are otherwise unclear.
In the above query, e is being used to identify the Element Table and v is being
used to identify the Value table.
Parameters
Parameters are being passed into Master.Element.Value between the < >. If you
expand Element\Templates you’ll see the core functions that are used in the 3-
TemplateSpecificData.sql examples.
Built-in Functions
PI SQL Client has some built-in functions specific to the PI System. If you are
familiar with SQL, you may already be familiar with functions. For example,
aggregation functions such as Max() or Avg() return the maximum or average of
a group of rows.
An entire list of built-in functions is available in the SQL for RTQP Engine
Reference Guide. The documentation is also available as a PDF download on the
OSIsoft Customer portal.
A commonly used function is the ParentName function, which gives the name of
an element’s parent. For example, we can change the query as follows to get the
Station name for each UNIT:
SELECT e.Name, ParentName(e.PrimaryPath,0) as Station,
v.Demand_Value as Demand, v.[Net Generation_Value] as [Net Generation]
FROM Master.Element.Element as e
INNER JOIN Master.Element.Value
<
'UNIT',
{
'|Demand',
'Demand_TimeStamp',
'Demand_Value'
},
{
'|Net Generation',
'Net Generation_TimeStamp',
'Net Generation_Value'
}
> as v
ON e.ID = v.ElementID
WHERE e.Template = 'UNIT'
The first argument in ParentName is an AF element path, and the second
argument tells it how many levels up to look:
• ParentName(e.PrimaryPath,0) gives the direct parent (eg. Octavia
Station)
• ParentName(e.PrimaryPath,1) gives the parent of the parent (eg. NORTH
Region)
UNION Statements
You may have noticed that the query we've been running only returns Assets that use the UNIT template and not those that use the Gas Turbine template, despite the Gas Turbine template being derived from the UNIT template via template inheritance. One way to address this is with UNIONs.
In simple terms, UNIONs take the results of two queries and stack the result sets
on top of each other to form a single result set. One limitation of UNIONs is that
the input result sets must have identical columns, which may require removing
and aliasing columns to match the data sets. This will be demonstrated in the
following exercise.
The syntax is quite simple: place the keyword UNION between two queries to union them together. The OPTION statement must be at the very end:
SELECT * FROM Table1 WHERE Condition='TRUE'
UNION
SELECT * FROM Table2 WHERE Condition='TRUE'
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Create a query to display Demand and Net Generation for all UNITs, including
Gas Turbines.
Approach:
You may have noticed that the query we've been working with has been missing the Gas Turbines. One way to address this is with a UNION. It's not necessarily the best approach, but you'll use a UNION in a later exercise.
Start with the below query:
SELECT e.Name, ParentName(e.PrimaryPath,0) as Station,
v.Demand_Value as Demand, v.[Net Generation_Value] as [Net Generation]
FROM Master.Element.Element as e
INNER JOIN Master.Element.Value
<
'UNIT',
{
'|Demand',
'Demand_TimeStamp',
'Demand_Value'
},
{
'|Net Generation',
'Net Generation_TimeStamp',
'Net Generation_Value'
}
> as v
ON e.ID = v.ElementID
WHERE e.Template = 'UNIT'
Copy it, paste the copy directly below it, add a UNION in between, and change
the template in the bottom half to Gas Turbine, then execute.
There should now be 30 rows in the result set. The bottom 10 rows are the Gas
Turbines.
JOIN Statements
Rarely does data exist in one place or in one table. Sometimes the results of a query have to
come from a correlation of two or more distinct tables. To JOIN tables, a relationship is
required between the tables and must be identified in the SQL statement.
Within the joining operations, we want a result set that contains assets with useful information from both tables, like performing a logical AND operation. There should be no gaps where a match could not be found. This is called an INNER JOIN and is the default joining operation used by PI SQL Commander. Therefore, INNER JOIN and JOIN may be used interchangeably.
Two key words are used when creating joins between tables. The words JOIN and ON can
be used in the statement to identify the relationship between the tables being used.
The key word ON sets up the relationship of columns in the selected tables so the desired
rows are returned.
In our query, the time series data from the Master.Element.Value table (aliased
as v) is being JOINed to the Master.Element.Element table (aliased as e) where
the ID column from e matches the ElementID column from v.
This is necessary in order to display the element names rather than the element
ids. Master.Element.Value doesn’t contain actual element names or path
information.
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Get some experience with JOINs by combining data from multiple tables.
Approach:
In this example, we’ll create a query with the following columns:
• Net Generation Attribute Values
• Attribute Categories (the Category of Net Generation from PI AF)
• Unit Name and Station information
Start by executing the predefined query for Attribute under PISRV01\Fleet
Generation:
Note that the Path is not the element path; it is the attribute path. They are all | (root) because there are no child attributes.
Join the tables, set aliases, select only the relevant columns, and only include
attributes named Net Generation to arrive at the following query:
SELECT a.Name as [Attribute Name], a.Value, ac.Category
FROM [Master].[Element].[Attribute] as a JOIN
[Master].[Element].[AttributeCategory] as ac ON a.ID = ac.AttributeID
WHERE a.Name = 'Net Generation'
Upon inspection of the Attribute and Element columns, it looks like we can look
up the PrimaryPath by joining on the element ID.
Join the tables, set aliases, and apply the ParentName function to arrive at the
following query:
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Create Template-Specific Data Models for the Fleet Generation database to be
used in analyzing unit generation data.
Approach:
First we’ll create a new Catalog and Schema to keep our custom database objects
separated from the Master Catalog. This step isn’t mandatory, it’s just to avoid cluttering the
Master Catalog. Catalogs can be used to organize custom database objects. For example,
each person could have their own sandbox, or objects could be separated by report or
application.
Name it PowerBIReports:
Name it DataModels:
Now you can right-click the new Schema and select Create Template-Specific Data
Model…:
The process is fairly straightforward. Select whether you want to create them for Event
Frames or Elements. In our case we want Elements of type UNIT. Click Next:
Name it UNIT_Snapshot, then drag and drop all attributes except Location and Name and
click OK:
Leave the default name, drag and drop all attributes, then click OK:
Now that both data models have been specified, click Next:
Once the data models have been created, we can execute the predefined queries
and modify the query to suit the application or report:
Objective:
• Create Template-Specific Data Models for Inactivity and Gas Turbine
Temperature Anomaly Event Frames
Approach:
• Create Template-Specific Data Models for Inactivity and Gas Turbine
Temperature Anomaly Event Frames.
• Use the default names and include all attributes.
• Verify the results of the transpose function through the execution of the
pre-defined query.
Hints: The steps are almost identical to the previous exercise except this time for
Event Frames.
You’ll have to go through the wizard for each type of Event Frame.
Only the Event Frame View is necessary for each Event Frame type.
Saved Views
Often, administrators prefer to create Views for end users who are not familiar with SQL queries. Views are typically queried with a basic SELECT * query to return all data, without any WHERE clause and without selecting individual columns. This masks the complexity and size of the query (e.g., table JOINs and UNIONs of several tables) but places the burden of maintaining the view on the administrator.
PI SQL Commander supports the creation of views. Views allow you to name a
stored query and it is this name that appears in the table list when importing data
into BI clients. Views are the easiest way to allow users to select which datasets
they want from PI AF when creating a report, as they do not need to understand
the complexity of the underlying SQL query.
Views are created using SQL syntax, but PI SQL Client can give you a template
to start with.
Selecting Scripts -> Create View -> New Query Editor Window produces the
beginning of a query:
At this point, it is a matter of naming the View by replacing <view name> and pasting your query into the <view definition> placeholder.
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Create a View using a pre-existing query.
Approach:
In the PowerBIReports catalog, run the Create View script:
You should see Sample View. Execute the predefined query to confirm that it’s
functional:
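A minimal check, assuming the script created the view under the DataModels schema (adjust the name if your script placed it elsewhere):

SELECT *
FROM [PowerBIReports].[DataModels].[Sample View]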
In this part of the class, you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Import PI SQL Client query results into Power BI.
Approach:
Open Power BI and start a new report.
Rather than using the SQL Server provider, we’ll use the PI SQL Client provider, which is found under Get data -> More:
At this point you have a couple of options: you can expand Advanced and paste in a query, or you can click through the wizard and pick from the available Views. We’ll show the wizard first.
Click OK:
Expand PowerBIReports -> DataModels and the Views will be available for
selection:
Check the box next to UNIT_Snapshot. From here you can click Load to import
all columns, which is fine but will import a lot of columns we won’t ever use.
If you choose Transform Data, you’ll have the opportunity to perform a variety of
operations, notably removing columns.
But we’re not going to use this just now. Close the Power Query Editor and click Not now.
Ignore the banner; you can dismiss it, but it will just come back. We’ll eventually discard this Power BI file and start fresh in the next chapter.
Now we go through the entire Get data process again to get to the point
where we paste in a query instead of selecting a View.
Don’t click OK at the next screen or you’ll have to start the get data process
over again!
Expand Advanced options; you can paste in any query from PI SQL Commander in the SQL Statement field:
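For example, the snapshot query from earlier in the chapter could be pasted in directly. This is a sketch and assumes the names created in the previous exercises; any query that runs in PI SQL Commander will work here:

SELECT *
FROM [PowerBIReports].[DataModels].[UNIT_Snapshot]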
Click Load.
Select the Data view and confirm the data has been imported:
Close the report without saving. We’ll start fresh in the next chapter.
Discussion
Objective:
Discuss differences between PI Integrator for BA and PI SQL Client
Approach:
• Which method do you prefer to create views? PI Integrator for Business
Analytics or PI SQL Client?
• Pros and Cons of both systems?
• What format would we like the data to be in for processing by BI clients?
• What should be added to the SQL queries to improve the format?
• Do these queries match what we want in our reports?
• If not, what is lacking?
SQL Server Reporting Services is a built-in feature of Microsoft SQL Server that
provides a web-based portal for hosting reports that leverage relational data
sources. It’s not flashy, but it meets a lot of reporting requirements. As we will
see, SSRS reports are more difficult to configure than Power BI reports. A big
benefit of SSRS is that it is likely already installed and available at your
organization. If you’re lucky enough to have admin rights on your own SQL
Server, then you could get started without going to IT for additional licenses.
We will be using Microsoft SQL Server Report Builder to build a simple report and
publish this report to the SSRS Reports portal.
In this part of the class you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
• Better understand the data set used in the following chapters
Approach:
We will take a few minutes to review the PI Big Tires Co. PI AF Database. We
are going to build upon the Tire Press example, which you may recognize from
the Building PI System Assets and Analytics with PI AF Class. Open PI System
Explorer and navigate to the PI Big Tires Co. database.
The database is loosely based on a workshop one of our Systems Engineers did
with a tire company. It’s a basic hierarchy with 12 Tire Presses.
Here is how a Tire Curing Press works: raw tires are loaded individually into the press. Once the tire is loaded, the press closes and temperature and pressure are applied to cook and mold the tire. After the cooking time has elapsed, the press opens and the tire is unloaded into a cooling unit, where fans blow air over it until it reaches a specific temperature.
Objective:
Write a PI SQL Client query that returns all elements that use the Site Template.
This will be used later during the report configuration. The desired data set is:
Approach:
• Connect to the PI Big Tires Co. database in PI SQL Commander
• In the Query Compendium, find a sample query that returns all Elements
that are based on a certain template.
• Include only the Name column and change the template to ‘Site’.
• Optionally use the ORDER BY statement to sort the output alphabetically (see the sketch below)
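One possible solution sketch (verify it against the Query Compendium sample on your system):

SELECT e.Name
FROM Master.Element.Element e
WHERE e.Template = 'Site'
ORDER BY e.Name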
Objective:
Write a PI SQL Client query that returns all elements that use the PressTemplate
template and the site they belong to. This will be used later during the report
configuration. The desired data set is:
Approach:
• Start with the sample query from the previous exercise.
• Remove the Description column and change the template to
‘PressTemplate’.
• Use the ParentName() function to get the Site associated with each
Press.
• Optionally use the ORDER BY statement to sort the output alphabetically by Press (see the sketch below)
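A possible solution sketch. The ParentName() usage below is an assumption about the function’s arguments (a path plus a level); check the Query Compendium sample for the exact form used in your environment:

-- Sketch only: PrimaryPath and the ParentName() signature are assumptions
SELECT ParentName(e.PrimaryPath, 1) as Site,
       e.Name as Press
FROM Master.Element.Element e
WHERE e.Template = 'PressTemplate'
ORDER BY e.Name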
In this part of the class you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective:
Get the daily Net Tires Produced and Scrap Tires values, which will be used later
in an SSRS Report.
Approach:
The desired result set is one row per day for the past several days. The columns will be Press, Date, Net Tires Produced, and Scrap Tires.
We will start by modifying a query from the query compendium that returns
sampled data.
Double-click 3-TemplateSpecificData.sql and find the query for Sampled Boiler attribute values.
Modify the query to use PressTemplate and the Net Tires Produced and Scrap
Tires attributes. Remove unnecessary attribute properties. Your query should
then be:
SELECT e.Name, sv.*
FROM Master.Element.Element e
CROSS APPLY Master.Element.GetSampledValues
<
    'PressTemplate',
    {
        '|Net Tires Produced',
        'Net Tires Produced_Value'
    },
    {
        '|Scrap Tires',
        'Scrap Tires_Value'
    }
>(e.ID, 'y', 't', '1h') sv
WHERE e.Template = 'PressTemplate'
Change the start time to ‘T-6d-1s’, the end time to ‘T-1s’, and the sample interval to ‘1d’. This samples one second before midnight (11:59:59 PM), at the close of each day before the count resets to zero, so each row carries the date on which the production occurred rather than the day after.
Execute the query below, which includes the modifications described above.
SELECT e.Name, sv.*
FROM Master.Element.Element e
CROSS APPLY Master.Element.GetSampledValues
<
    'PressTemplate',
    {
        '|Net Tires Produced',
        'Net Tires Produced_Value'
    },
    {
        '|Scrap Tires',
        'Scrap Tires_Value'
    }
>(e.ID, 'T-6d-1s', 'T-1s', '1d') sv
WHERE e.Template = 'PressTemplate'
Format the TimeStamp column to remove the hours, minutes, and seconds using
the Format() function and add some aliases to improve the column names.
SELECT e.Name as Press,
       sv.[Net Tires Produced],
       sv.[Scrap Tires],
       FORMAT(TimeStamp, 'dd-MMM-yy') as Date
FROM Master.Element.Element e
CROSS APPLY Master.Element.GetSampledValues
<
    'PressTemplate',
    {
        '|Net Tires Produced',
        'Net Tires Produced'
    },
    {
        '|Scrap Tires',
        'Scrap Tires'
    }
>(e.ID, 'T-6d-1s', 'T-1s', '1d') sv
WHERE e.Template = 'PressTemplate'
The report is required to display the current tire production in the bottom row, using today’s date. GetSampledValues uses interpolation and hence can’t provide this value, so we will have to UNION the above query with another query that provides the current values.
A similar process to the above can be used to get the current values. You can
just copy and paste this from the workbook .docx in the class folder.
SELECT e.Name as Press,
       v.[Net Tires Produced_Value] as [Net Tires Produced],
       v.[Scrap Tires_Value] as [Scrap Tires],
       FORMAT(DATETIME('T'),'dd-MMM-yy') as Date
FROM Master.Element.Element e
INNER JOIN Master.Element.Value
<
    'PressTemplate',
    {
        '|Net Tires Produced',
        'Net Tires Produced_TimeStamp',
        'Net Tires Produced_Value'
    },
    {
        '|Scrap Tires',
        'Scrap Tires_TimeStamp',
        'Scrap Tires_Value'
    }
>v
ON e.ID = v.ElementID
WHERE e.Template = 'PressTemplate'
Finally, UNION the above two queries together and order the results by Press
and Date. The final query is then:
SELECT e.Name as Press,
       sv.[Net Tires Produced],
       sv.[Scrap Tires],
       FORMAT(TimeStamp, 'dd-MMM-yy') as Date
FROM Master.Element.Element e
CROSS APPLY Master.Element.GetSampledValues
<
    'PressTemplate',
    {
        '|Net Tires Produced',
        'Net Tires Produced'
    },
    {
        '|Scrap Tires',
        'Scrap Tires'
    }
>(e.ID, 'T-6d-1s', 'T-1s', '1d') sv
WHERE e.Template = 'PressTemplate'
UNION
SELECT e.Name as Press,
       v.[Net Tires Produced_Value] as [Net Tires Produced],
       v.[Scrap Tires_Value] as [Scrap Tires],
       FORMAT(DATETIME('T'),'dd-MMM-yy') as Date
FROM Master.Element.Element e
INNER JOIN Master.Element.Value
<
    'PressTemplate',
    {
        '|Net Tires Produced',
        'Net Tires Produced_TimeStamp',
        'Net Tires Produced_Value'
    },
    {
        '|Scrap Tires',
        'Scrap Tires_TimeStamp',
        'Scrap Tires_Value'
    }
>v
ON e.ID = v.ElementID
WHERE e.Template = 'PressTemplate'
ORDER BY Press, Date
We will use Report Builder as the report-authoring tool for the tire production
report. The client itself is a free download from Microsoft and there are numerous
tutorials available online. Let’s dive into the report configuration and learn by
example.
In this part of the class you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Approach:
Launch Report Builder from the Start Menu, Taskbar, or the desktop shortcut.
It will take a little while to connect to the report server on first launch. Eventually
you’ll be prompted to open or create a report. Select Blank Report.
Click the Title box, enter the title “Daily Tire Production”, and center the text. Typical text-editing capabilities are available on the Home tab.
The Insert tab includes all the objects that can be included in the report. Most of the visuals require data, which we have not yet imported. For now, add a text box and enter whatever text you’d like:
There’s not much on the report yet, but we can do a sanity test. On the Home tab, click Run to render the report on the client. This is used during report development, before the report is published to the report server.
Note that the ExecutionTime placeholder shows the time the report was run.
Publish the report to the Report Server with File -> Save As:
Now that the report has been saved, launch Chrome and navigate to https://pisrv01.pischool.int/Reports. There’s a shortcut on the Desktop and in the bookmarks bar.
The report should then load. At this point you can download or print the report in
a variety of formats, but most of the time you would just view the report online in
the portal.
In this part of the class you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Approach:
In order to use the queries developed in the previous exercises, we will need to
define a data source. It’s a judgement call, but much of the time it’s better to
expose the data source through the portal so that it can be shared amongst
several users and reports.
Go back to the SSRS Reports home page by clicking SQL Server Reporting Services. Three data sources are defined already: one for SQL Server, one for the NuGreen AF database via PI OLEDB Enterprise, and another for the NuGreen AF database via PI SQL Client.
Before creating our own, let’s inspect NuGreen_PISQLClient. Click the ellipsis and select Manage.
Note that the connection type is OLE DB and review the connection string. There’s no GUI for building the connection string here, so it has to be entered manually. You might decide to copy an existing connection string and modify the relevant fields to make this easier.
Click Home to go back
Since there is no GUI in the report portal for generating the connection string,
we’ll have to enter it manually or use an existing example.
Open the ConnectionString.udl file and the Data Link Properties window should launch.
Enter PISRV01 as the AF Server and PI Big Tires Co. as the AF Database, then
test the connection.
On the All tab, confirm that Integrated Security is set to SSPI. Click OK to finish
editing the file.
Now open ConnectionString.udl with a text editor to view the text, which is a connection string we can paste into the data source configuration.
Paste it into the Data Source configuration, test the connection, and click Create.
The new data source will now be available to those with access to the report
server and sufficient permissions.
In this part of the class you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective: Import the daily production data and display the results in a simple table
Approach:
Using the same Tire Production Report from the previous exercises, add the Data
Source we just created. Right-click on Data Sources -> Add Data Source…
Now that a Data Source is defined, we can add a dataset which contains the
actual imported data.
Right-click the PIBigTires_PISQLClient Data Source and select Add Dataset…
You can quickly check that the query is valid by examining the fields list. If the query returns results, the fields list will show the column headers from the result set.
Click Fields, then OK once you’ve confirmed that the fields show up.
Next, let’s add a table. From the Insert tab, select Table -> Insert Table.
Click the white space where you want the table to go, and the table object will appear:
Now simply drag and drop fields from the TireProduction dataset to add them to
the table. Add the Date, Press, Net_Tires_Produced, and Scrap_Tires fields
to the table.
Time for another sanity check: save the report and go back to the report server at https://pisrv01.pischool.int/Reports.
Open your freshly saved report and confirm that the data displays correctly.
Depending on when the report is viewed, you may notice a problem: the dates are actually cast as strings and sorted alphabetically. If the result set straddles a month boundary, the rows won’t display in chronological order. Let’s fix this.
We’re going to add another column to the dataset to convert the Date strings to
the date data type. There are of course other ways to do this.
Paste in the expression below and click OK. You could find an expression like this by searching for "SSRS Report Builder convert string to date" or something similar.
=Date.Parse(Fields!Date.Value)
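An equivalent expression is =CDate(Fields!Date.Value). Report Builder expressions are Visual Basic based, so either conversion returns a DateTime value that sorts chronologically.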
Optionally, run the report and confirm the sort order. You won’t see any difference if all the dates are in the same month.
In this part of the class you will perform a learning activity to explore the
different concepts presented in this chapter or section. You may be invited to
watch what the instructor is doing or perform the same steps at the same time.
You may play a game or hold a quiz. Your instructor will have directions.
Objective: Add user selections for Site and Press in order to filter the table
Approach:
We’ll add some parameters to filter the report. We want to be able to filter by Site
and by Press.
The first step is to add datasets for the parameter selections.
Create a new dataset.
Check the Fields tab to confirm the query is valid. There should be one field
called Site. Click OK.
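If you need a starting point, this dataset query is essentially the Site query from the earlier PI SQL Commander exercise, with the Name column aliased as Site. A sketch:

SELECT e.Name as Site
FROM Master.Element.Element e
WHERE e.Template = 'Site'
ORDER BY e.Name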
Check the Fields tab to confirm the query is valid. Click OK.
On the View tab, check the box for Parameters and the parameters section will
be added to the report.
On the Available Values tab, select Get values from a query and choose the Sites dataset. Use the Site column for both the Value field and the Label field. Click OK.
Run the report from the Home tab. When the report runs, the user will be able to select a Site. The selection doesn’t filter the report yet; this is just a sanity check.
Select the Parameters category, then double-click on Site to set the expression.
Click OK.
The filter configuration should look like the following screenshot. Click OK.
Keep in mind that the Site selection does nothing at this point except filter
the Presses Dataset. This does not yet interact with the Tire Production table.
We still have to add a Press parameter and filter the TireProduction Dataset
based on the selection.
On the Available Values tab, get the press list from the filtered Presses dataset. Click OK.
The Press selection still has no impact on the Tire Production table. We still
need to filter the TireProduction dataset based on the Press parameter.
Go back to Design Mode.
Modify the TireProduction properties.
Add a filter. Use Press as the first selection. Click the expression icon.
We want to filter the Dataset to include all rows that match the Press parameter.
Double-click Press to complete the expression. Click OK.
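For reference, the completed filter compares the dataset field to the parameter: Expression [Press], Operator =, and Value [@Press], which Report Builder stores as the expression =Parameters!Press.Value.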
Discussion
Approach:
• Which tool is easier to configure?
• How much of a game changer is the slicer capability?
• Why would you use one tool over the other?