SSIS 2005 Hands On Training Lab
1. Introduction
SQL Server Integration Services (SSIS), the successor to DTS in SQL Server 2005, is an all-new application
that provides a data integration platform ranging from easy-to-use tasks and transforms for the non-developer
to a robust object model supporting the creation of custom tasks and data transformations. With the SSIS
platform you can create solutions that integrate data from heterogeneous data sources and perform cleansing
and aggregation, as well as the work flow surrounding the data processing. SSIS goes beyond standard ETL
(Extract, Transform, Load) processing, providing components such as Web Service, XML, and WMI tasks, and
many more. Add to the rich list of out-of-the-box components a full object model underneath, and users can
create their own tasks, transformations, data sources/destinations, and log providers to suit almost any scenario.
You can gear up on the above topics from the SSIS portal on MSDN:
http://msdn.microsoft.com/SQL/sqlwarehouse/SSIS/default.aspx
where you will find links to the following two webcasts as well as other webcasts and articles.
• Introducing SQL Server Integration Services for SQL Server 2005 (Level 200)
• TechNet Webcast: Deploying, Managing and Securing Integration Services (Level 300)
3. Scenarios
• This Lab comprises several smaller labs, to better cover various portions of the SSIS product as well as to
provide natural checkpoints for the training, such that if a particular exercise cannot be completed attendees
can still participate in later exercises.
I. Installation
• Extract the zip file to the root of C:\ so you end up with “C:\_SSIS_Training”.
• Attach the following 2 databases: “SSISLOGGING.mdf” for audit and logging data (tables myfileaudit
and ssis_ErrorRows) and “SSISTRAINING.mdf” (table mydescriptions) for the data destination.
• We will be using data from the AdventureWorks database, which ships with SQL Server 2005 (optional during installation).
4 Add a Derived Column Transform to the dataflow
• Double-clicking will automatically connect the new object to the previous object, assuming it is still
selected. If not, select the upstream object and drag the green ‘output’ arrow to the object below.
1. From the Toolbox, double-click or drag a ‘Derived Column’ Transform to the Data Flow canvas.
2. If the transform is not connected to the OLE DB Source, then click on the OLE DB Source, click the
green output arrow, and connect it to the Derived Column Transform.
6 Add a Data Viewer to the data path between the Derived Column and DataReader Dest
• The viewer shows 1 ‘buffer’ of data rather than a number of rows you specify. A buffer is part of the
dataflow architecture. See the BOL topic suggestions at the bottom.
• Imagine using a data viewer to troubleshoot bad data: you are in the viewer, see the issue, and can hit the
‘Copy Data’ button to get the data onto the Windows clipboard, which you can then paste into an email
to your support team or data vendor.
1. Double-click the green Data Flow path between the Derived Column and the destination.
2. In the Data Flow Path editor click ‘Data Viewers’ on the left.
3. Click the ‘Add’ button below and leave the default ‘Grid’ as the type of viewer.
4. Select the ‘Grid’ tab at the top and note that, by default, all columns will be displayed in the viewer.
Let’s leave that for now.
5. Click ‘OK’ and ‘OK’ again.
Author: Craigg Page 9 of 53 Microsoft Corporation
8 Execute the Package
• You can add more than 1 viewer: perhaps 1 full grid, another that is only 2 key fields, and perhaps a graph.
• Data viewers are only ‘functional’ while in BIDS; they do nothing and affect nothing when the package is
executed at the command line with dtexec.exe.
• If someone asks how to troubleshoot data a row at a time, the data viewer is the answer. Again, you
cannot control the number of rows returned; it is based on 1 ‘buffer’, which is a unit of measure in the
data flow engine.
1. Save the package using the icon, or from the “File” menu choose “Save selected items”.
2. Click the button on the toolbar or select the “Debug > Start Debugging” menu option.
3. The package should execute and open the Data Viewer.
4. In the Data Viewer scroll all the way to the right and note the derived column we created is there and
should contain a number indicating the location of the “@” symbol per row.
5. Note the objects stay yellow while the data viewer is attached and open. The objects will turn green
when all the buffers have been viewed and/or you close the viewer and execution can complete.
6. This data set is small, so it is not as easy to test the full data viewer functionality, but you can use the
green arrow to advance the data viewer to the next buffer. Detach will ‘unhook’ the viewer from the
dataflow and the package will run to completion.
10 Edit the Data Viewer
• To include only our 2 derived columns and the original email field: “EmailAddress”, “Derived
column1”, and “DerivedColumn2”.
1. Double-click the green data path between the Derived Column and the destination.
2. In the Data Flow Path editor click ‘Data Viewers’ on the left.
3. The existing Grid viewer should be selected on the right.
4. Click the ‘Configure’ button at the bottom.
5. This time, let’s only have the 3 relevant columns in the viewer, so remove all fields but the 3 we want
using the < button, or remove them all with << and then add back the 3 we want with >.
6. Click OK to close the data viewer dialog.
7. Click OK to close the Data Flow Path editor dialog.
3.3.1.Comments
• It pays to verify expressions, parsing, and inbound data in general before pushing it further down your flow.
Discovering a mistake at the end just means that much more to edit on the way back up. This is by design
overall, because each component in the flow holds metadata about the objects it is dealing with as inputs
and outputs. Because each component can do so much ‘locally’ with its known metadata, and because of the
wide variety of transforms which do a wide variety of things to that local metadata, it is not realistic to have
changes made to one component automatically reflected up or down the flow.
• As another example of how flexible a platform SSIS is, consider the 3 basic levels of parsing: 1) This lab
did some basic parsing with the Derived Column transform, which can handle rather sophisticated logic.
2) If you need very advanced parsing and per-row sniffing, you are best off using the Script Component,
which allows you to use VB.NET code yet is aware of buffers and rows in the data flow task. With a bit of
advanced tweaking, the script component can allow one inbound row you parse in some way to result in
more than 1 outbound row. 3) For even more advanced handling, or perhaps just to ease re-use, there is
the Data Flow API: you can write your own custom transform for sophisticated logic, or just to make re-use
easier, because the custom transform can be made available from the Toolbox.
3.3.2.BOL
• “How to: Add a Data Viewer to a Data Flow”
• “Debugging Data Flow”
• “Precedence Constraints”
• “Creating a Transformation Component with Asynchronous Outputs”
4 Add Output Conditions
• We will create the 4 conditions in the Conditional Split, which creates 4 outputs.
• In the next step(s) we create downstream processing for each of the outputs.

  Order  Name        Condition
  1      1to1000     ProductDescriptionID > 0 && ProductDescriptionID < 1001
  2      1001to1500  ProductDescriptionID > 1000 && ProductDescriptionID < 1501
  3      1501to1800  ProductDescriptionID > 1500 && ProductDescriptionID < 1801
  4      over1801    ProductDescriptionID > 1800
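The four range conditions above amount to a simple router keyed on ProductDescriptionID. A minimal Python
sketch of the same routing (the function name is our own; output names follow the table):

```python
def split_output(product_description_id: int) -> str:
    """Route a row to one of the four Conditional Split outputs by ID range."""
    if 0 < product_description_id < 1001:
        return "1to1000"
    if 1000 < product_description_id < 1501:
        return "1001to1500"
    if 1500 < product_description_id < 1801:
        return "1501to1800"
    if product_description_id > 1800:
        return "over1801"
    # The Conditional Split sends rows matching no condition to its default output.
    return "default"

print([split_output(i) for i in (1, 1200, 1799, 5000)])
```

Note the conditions are evaluated in order, just as the Conditional Split evaluates its list top to bottom.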
5 1. Click OK to close the Conditional Split
6 Add a Flat File Destination
• We are creating downstream processing for 1 of the 4 outputs.
• We want each output from the conditional split to end in/create a file.
1. From the Toolbox add a Flat File Destination.
2. When the destination component is connected to the Conditional Split, a dialog will ask you to decide
which output to use. For this first output select “1to1000” from the previous table.
3. Open the Flat File Destination and note the box “Overwrite data in the file” is checked, so each
execution will result in a unique dataset (vs. cumulative).
4. Click ‘New’ to add a new connection manager of type ‘Delimited’.
5. Use “ProductDescriptions1” for both the name and description of the connection manager.
6. For the filename enter “C:\_SSIS_Training\LoopFiles\ProductDescriptions1.txt”.
7. In the ‘Flat File Connection Manager Editor’ window check the box “Column names in the first data row”.
8 Add 3 more destinations for the other 3 outputs
1. Add a new Flat File Destination first, then link it to the upstream Conditional Split by clicking on the
Conditional Split; note you get another green data path to use/drag to the new Flat File Destination.
2. Each flat file destination needs to use a unique file connection manager and point to/create a unique
filename. See the table below.
3. Be sure to configure the other 3 connection managers as we did the first one, e.g. check the Unicode
box, add the pipe “|” text qualifier, check the box for column names in the first row, and on the
Advanced page set the length of the Description field to 400, with only the Description field having
“Text Qualifier” set to TRUE.
9
  Path Name   Connection Mgr Name/Description  Filename to use in Connection Mgr
  1001to1500  ProductDescriptions2             C:\_SSIS_Training\LoopFiles\ProductDescriptions2.txt
  1501to1800  ProductDescriptions3             C:\_SSIS_Training\LoopFiles\ProductDescriptions3.txt
  over1801    ProductDescriptions4             C:\_SSIS_Training\LoopFiles\ProductDescriptions4.txt
1 Visually, your package will look like…
• You may want to play with the auto layout option under the Format >> Auto Layout >> Diagram menu option.
1 Add a new user variable to hold row counts
• Later, inside the Data Flow task, we will map a Row Count transform to this variable, in effect storing the
row count of the Data Flow path.
• The scope of a variable cannot be changed. If you notice the scope is wrong, delete the existing variable
and create a new one. If you are trying to create a package-level variable, click the Control Flow canvas
(not any of the containers on it); then when you create a new variable it is assumed you are creating one
at the package level.
1. From the “SSIS” menu choose “Variables”. You should see “myfilenamefull” already there.
2. Click the Add new variable button. You may want to widen the Variables window.
3. Name should be “myrowcount”.
4. Scope should be the topmost, the package itself, so “Loopandloadproductiontable”.
5. Data type of Int32.
6. Value can be left at 0.
1 Add a dataflow task
• The task will be processed once per iteration of the loop; therefore, in our case, once for each file in the folder.
1. From the Toolbox add a Data Flow Task to the inside of the loop container, and open the task. You
need to drag and drop; the easier double-click method does not work when adding objects to the
inside of a container.
1 Add Flat File Source and Connection Manager
• We define a single, specific file in this step; it could be any of our existing files. After this step we will
define a Property Expression on the new LoadProductDescriptions connection manager. The Property
Expression will dynamically alter our connection manager to load a different file per iteration of the loop.
1. Open the Data Flow Task.
2. From the Toolbox add a Flat File Source to the Data Flow Task, and open the Flat File Source.
3. Click New to create a new Flat File Connection Manager.
4. For “Description” and “Connection Manager Name” enter “LoadProductDescriptions”.
5. For the “File Name” point to our first file,
“C:\_SSIS_Training\LoopFiles\ProductDescriptions1.txt” (no quotes).
6. Check the box “Unicode”.
7. Use the default type of Delimited.
8. Enter the pipe symbol “|” as the text qualifier. ** We want to do this because some of the
descriptions contain double quotes, commas, and semicolons which can throw off normal delimiter
parsing.
9. Check the box “Column names in the first data row”.
10. Click the “Columns” page and see a preview of the data.
11. Click the “Advanced” page. The column “ProductDescriptionID” should be highlighted.
12. Change the “Text Qualifier” property to False.
13. Click the ‘Description’ field, set the OutputColumnWidth to 400, and change the “Text Qualifier”
property to True.
14. Click the column “Row Guid” and change the “Text Qualifier” property to False.
15. Click the column “Modified Date” and change the “Text Qualifier” property to False. We only want
our “|” text qualifier for the Description field/data.
16. Click OK to close the Connection Manager dialog.
17. Click OK to close the “Flat File Source Editor”.
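The effect of the pipe text qualifier in step 8 can be sketched with Python’s csv module: with “|” as the quote
character, a comma inside a qualified Description field no longer breaks the parse. (The sample row below is
hypothetical, not taken from the lab data.)

```python
import csv
import io

# One hypothetical data row: only the Description is wrapped in the "|" text qualifier.
raw = (
    "ProductDescriptionID,Description,RowGuid,ModifiedDate\n"
    "17,|Chromoly steel, cro-mo welded|,guid-0001,2004-03-11\n"
)

# quotechar="|" plays the role of the SSIS flat file "text qualifier".
rows = list(csv.reader(io.StringIO(raw), delimiter=",", quotechar="|"))
print(rows[1])  # the comma inside Description stays inside one field
```

Without the qualifier, the embedded comma would split the Description into two columns and shift every
field after it, which is exactly the mis-parse the lab is guarding against.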
2 Modify the connection string to change dynamically (via property expression) with loop iterations
• Remember the variable “myfilenamefull” we created earlier. We need it to feed our connection string per
iteration of the loop, via a property expression.
• We will build another property expression in the LoadApplications sample.
• “User::” indicates the namespace.
1. Stop execution of the package if you have not done so already.
2. In the Connection Managers window select (but do not open) the “LoadProductDescriptions” connection
manager. We want to view its properties in the property sheet, not the editor window. (**)
3. If the property sheet is not already visible on the right side of your screen, click the button or choose
“Properties” from the View menu.
4. In the properties pane click in the empty row for “Expressions” and then press the ellipsis button.
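What the Foreach loop plus the property expression accomplish can be sketched in plain Python: on each
iteration the enumerator writes the current path into the variable, and the expression copies it onto the
connection manager’s ConnectionString (the function name is our own; the file names follow this lab’s table):

```python
def run_foreach_loop(files):
    """Sketch of the Foreach File enumerator driving a connection manager."""
    connection_strings_used = []
    for path in files:
        myfilenamefull = path               # enumerator -> User::myfilenamefull
        connection_string = myfilenamefull  # property expression on ConnectionString
        connection_strings_used.append(connection_string)
        # ...the Data Flow task would now run once, loading this file...
    return connection_strings_used

files = [rf"C:\_SSIS_Training\LoopFiles\ProductDescriptions{n}.txt" for n in (1, 2, 3, 4)]
print(run_foreach_loop(files)[0])
```

The key point the sketch makes: the connection manager is configured once, but its ConnectionString is
re-evaluated from the variable on every iteration.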
4.3.Conclusion
4.3.1.Comments
• One of the most common uses for Property Expressions is for dynamic Connection Strings.
4.3.2.BOL
• “Foreach Loop Container”
• “How to: Create a Property Expression”
• ”Advanced Integration Services Expressions”
• “Using Property Expressions to Specify Data Flow Property Values”
4 View the results of the audit data…
1. Open SQL Management Studio, and connect to your SQL server “DBIL405”.
2. Expand the Databases folder, down to the database SSISLOGGING, and table “myfileaudit”.
3. Right-click the table name “myfileaudit” and choose “Open Table”.
4. You should see data rows, with a row count column matching the values we see visually.
5. You can keep re-running the packages and refresh the query in SQL Management Studio to see rows
build up. Of course the row counts match, as we are re-running the same files, but the ExecutionID and
StartTime should differ for each execution.
5.3.Conclusion
5.3.1.Comments
• While we earlier used the Audit transform to capture extra package data per row of execution (so Data Flow
level auditing), you might want to also capture data at the Control Flow level, especially when there are loops.
You can now see this is very easy to do with an Execute SQL task and the various handy system variables.
5.3.2.BOL
• “Execute SQL Task”
• “How to: Map Query Parameters to Variables in an Execute SQL Task”
2 Add Execute Process task
• Using property expressions we will configure the single task to open a different application based on the
day of the week: either notepad.exe or mspaint.exe, using a property expression on the ‘Executable’ property.
• Sunday=1, Monday=2, Tuesday=3, Wednesday=4, Thursday=5…
1. Add and open an Execute Process Task in the new package.
2. Click the “Process” page.
3. Enter “notepad.exe” for the ‘Executable’ property.
4. Click the “Expressions” page.
5. In the right pane click in the empty row for “Expressions” and then press the ellipsis button.
6. Choose the “Executable” property and either copy/paste the following expression or press the other
ellipsis button to go into the expression builder and build this yourself. It is good practice to build, and
you can test, this expression.
7. DATEPART("weekday", GETDATE()) == 5 ? "notepad.exe" : "mspaint.exe"
8. Click “OK” as needed (2-3 times).
9. Close the Execute Process task.
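The expression in step 7 can be sanity-checked with a small Python analogue. The conversion below assumes
the SSIS default where DATEPART("weekday", …) numbers days 1..7 starting at Sunday, so Thursday is 5; the
function name is our own:

```python
import datetime

def pick_executable(today: datetime.date) -> str:
    """Python analogue of: DATEPART("weekday", GETDATE()) == 5 ? "notepad.exe" : "mspaint.exe"."""
    # Python's weekday() is Mon=0..Sun=6; map it to the assumed SSIS Sun=1..Sat=7.
    ssis_weekday = (today.weekday() + 1) % 7 + 1
    return "notepad.exe" if ssis_weekday == 5 else "mspaint.exe"

print(pick_executable(datetime.date(2024, 1, 4)))  # 2024-01-04 is a Thursday
```

Testing a date-driven expression this way, against known dates, is the same habit step 6 encourages with the
expression builder.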
6.3.Conclusion
6.3.1.Comments
• Property Expressions are a very powerful feature. One of the most common uses is for dynamic connection
manager changes, as in one of the previous labs. Another is for the Send Mail Task. For example, an
expression can be used in a property expression on the Subject property of a Send Mail task so the message
arrives with the name, start, and duration information in the email subject, very handy! You can extend it and
add in a variable that is populated with a Row Count transform; then, without even opening the message,
you can see who, what, and when.
6.3.2.BOL
• “Using Property Expressions in Packages”
• “Execute Process Task”
• “Execute Process Task Editor (General Page)”
2 Edit the Execute Process task
• Using property expressions we will configure the single task to open a different application based on the
day of the week: either notepad.exe or mspaint.exe, using a property expression on the ‘Executable’ property.
• Sunday=1, Monday=2, Tuesday=3…
1. Open the Execute Process Task.
2. Click the “Expressions” page on the left.
3. Expand the ‘Expressions’ list on the right; you should see our existing expression for the Executable property:
4. DATEPART("weekday", GETDATE()) == 5 ? "notepad.exe" : "mspaint.exe"
5. Either manually edit or click the ellipsis button to edit the expression.
6. Replace “notepad.exe” with @app1 and “mspaint.exe” with @app2 (no quotes) so the expression looks like:
7. DATEPART("weekday", GETDATE()) == 5 ? @app1 : @app2
8. Click OK as needed.
9. Close the task.
3 Execute to ensure the package behaves as it did before
1. Save the package, and execute.
4 Now add an XML configuration to the package
• After you select the type of configuration you want, you select which properties of which objects are
added to the configuration file.
• After the configuration is added, the SSIS execution engine knows to look at the contents of the file
during the initial load of the package, mapping the values in the file to our variables.
1. Stop execution/debugging if you have not done so already.
2. From the SSIS menu choose “Package Configurations”.
3. Click the “Enable Package Configurations” check box.
4. Click the Add button, which will launch the configuration wizard.
5. Click Next.
6. Leave the radio button ‘Specify configuration settings directly’ selected.
7. Choose an XML configuration file as the type.
8. Enter C:\_SSIS_Training\myconfig.dtsConfig as the configuration filename.
9. Click Save.
10. Click Next.
11. Scroll up in the objects list and find our variables app1 and app2. NOTE: If you do not see the
variables listed under the package container, you likely had the scope incorrect when you created
them. Go back to the Variables window and verify the scope.
12. For each, we want to drill down to and check the box for the ‘Value’ property.
13. Click “Next” and you will see the confirmation page of the type and contents of the configuration.
14. Click “Finish” and close the Configuration Organizer.
5 Review and edit the configuration file contents
1. You want to edit the config file C:\_SSIS_Training\myconfig.dtsConfig.
2. You could use notepad.exe (you will want to choose ‘Word Wrap’ from the Format menu):
Start > Run > C:\_SSIS_Training\myconfig.dtsConfig
3. Or use BI Studio itself, by going to the File menu >> Open >> File.
7.3.Conclusion
7.3.1.Comments
• You can have more than one configuration in a package; they are applied in the order you see them in
the Configuration Organizer.
• A common practice could be to use SQL Configurations from a central DB. With SQL Configurations you
can have more than 1 package from more than 1 server all using the central configuration table.
7.3.2.BOL
• “Package Configurations”
• “Creating Package Configurations”
2 Add a DataFlow Task and Connection Mgr
1. Add and open a Data Flow Task.
2. In the Connection Managers window, right-click and choose “New Connection”.
3. Select the “MultiFlatFile” connection manager and click Add.
4. For the Name and Description enter “badrows”.
5. For the Filename, browse to the folder C:\_SSIS_Training\SourceDataErrorRows.
6. The folder should contain 3 files with names like “bad_data1.txt”.
7. Pick one of them and click “Open”.
8. Click the ‘Columns’ tab and you should see some of the data; you will likely note the first column is a
numeric field and some rows have XX, making those rows ‘bad’.
4. Add a Grid data viewer to the error path and re-execute to view the data.
5. Users will be able to look up the ErrorCode in Books Online.
6. The ErrorColumn matches the ID you can see in the Advanced Editor of the Flat File Source.
7. To see the Advanced Editor, (stop execution) right-click the flat file source and choose ‘Advanced Editor’.
8. Then click the ‘Input and Output Properties’ tab.
9. Expand the “Flat File Source Output”.
10. Then expand the “Output Columns”.
11. Click on column0 and the ID should match the one in the error rows.
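The effect of the red error output can be sketched in Python: rows whose first column fails numeric conversion
are diverted, carrying along an error code and the failing column’s ID. The -1 code and column ID 0 below are
placeholders, not real SSIS values (real codes are the ones step 5 says to look up in Books Online):

```python
def split_error_rows(rows, error_column_id=0):
    """Divert rows whose first column is not numeric, like a source's error output."""
    good, errors = [], []
    for row in rows:
        try:
            good.append((int(row[0]), *row[1:]))   # row flows down the normal output
        except ValueError:
            # -1 stands in for the real ErrorCode; error_column_id mirrors ErrorColumn.
            errors.append((*row, -1, error_column_id))
    return good, errors

good, errors = split_error_rows([("10", "ok"), ("XX", "bad")])
print(len(good), len(errors))
```

As in the lab, the diverted rows keep their original data plus the extra ErrorCode/ErrorColumn fields, which is
what makes the error path useful for later analysis.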
8. Now click the Details tab. You can select which log entries you want. For now just check the ‘Events’
box at the top, which will pick up all the log entries. Do it for each item in the Containers window: select
the container name on the left, then the ‘Events’ check box on the right.
1 View the results of the logging data…
• In an appendix you can see samples of Reporting Services reports that were built on top of centralized
log provider data.
1. Open SQL Management Studio, and connect to your SQL server “DBIL405”.
2. Expand the Databases folder, down to the database SSISLOGGING and table “sysdtslog90”. If you
cannot find the table there and you are sure you have executed the package, go back to the logging
screen and verify the settings of the chosen connection manager. Perhaps you selected one other than
SSISLOGGING and therefore the table was created in a different DB.
3. Right-click the table name “sysdtslog90” and choose “Open Table”.
4. You should see logging rows. Note there are very useful fields like executionid, which allows you to
differentiate multiple executions of the same package.
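The executionid field makes it easy to group one run’s log entries. A small sketch using an in-memory SQLite
table (the column subset and sample rows are illustrative; the real sysdtslog90 has more columns):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE sysdtslog90 (event TEXT, executionid TEXT, starttime TEXT, message TEXT)"
)
# Hypothetical entries from two executions of the same package.
rows = [
    ("PackageStart", "run-1", "2005-11-07 09:00", ""),
    ("PackageEnd",   "run-1", "2005-11-07 09:01", ""),
    ("PackageStart", "run-2", "2005-11-07 10:00", ""),
]
con.executemany("INSERT INTO sysdtslog90 VALUES (?, ?, ?, ?)", rows)

# One summary row per distinct execution of the package.
per_run = con.execute(
    "SELECT executionid, COUNT(*) FROM sysdtslog90 GROUP BY executionid ORDER BY executionid"
).fetchall()
print(per_run)
```

A GROUP BY on executionid like this is the natural starting point for the kind of Reporting Services reports
mentioned in the appendix.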
8.3.Conclusion
8.3.1.Comments
• So now we have seen 2 different approaches to processing multiple files: using a loop container and using a
multi-flat-file connection manager. One is not more ‘correct’ than the other; it just depends on the
application. With a large number of files the loop structure approach would take longer, because it needs
to start/close the data flow engine each time, vs. the other approach where only 1 data flow is used.
However, the loop approach does provide more flexibility in the control flow; for example, you can take
action after each file.
8.3.2.BOL
• “Using Error Outputs”
• “Integration Services Log Providers”
• “Implementing Logging in Packages”
2 Build the solution and deployment files
• When you ‘build’ the solution, SQL Development Studio will create the deployment file set, which includes
all of the packages from the solution, configuration files that are associated with packages, as well as any
files you may have in the “Miscellaneous” folder of your solution. This is a handy way to gather everything
a deployment needs.
1. From the Build menu choose “Build SSISTraining”.
2. If you look in the Output window you should see results similar to the following screen shot. If the
Build menu is not visible go to menu View >> Other Windows >> Output.
3 Review the Deployment File Set
• Users can then copy/move the file set to where they want to run deployment from. That machine needs
to have SSIS installed to deploy, or else it will not recognize the manifest file.
1. Go view the folder where our files were gathered: C:\_SSIS_Training\SSISTraining\bin\
2. You will see our packages (*.dtsx), the configuration file we created, and a manifest file that is used to
perform the actual deployment.
4 Install Package (Deploy) to another folder on the same machine
• From the desired deployment machine, a user can now run the manifest file and deploy (copy) packages
to any file share or SQL server where they have permissions.
• Keep in mind the packages can only execute on machines that have SSIS installed (dtexec.exe).
1. In File Explorer double-click the manifest file “SSISTraining.SSISDeploymentManifest”.
2. In the Package Installation Wizard, choose File System deployment.
3. Click Next.
4. Use the default file path: C:\Program Files\Microsoft SQL Server\90\DTS\Packages\SSISTraining
5. Click Next.
6. The installer will note it has enough information to install; click Next.
7. The install will run a bit, then should pause at a “Configure Packages” screen.
8. For now just leave the existing values, calc.exe and dtexecui.exe; click Next.
9. This screen would allow you to change current configuration values as you install. For example, if you
had configured connection managers, you could change each installation to use appropriate server
names without physically altering the package itself.
10. Note on the ‘Finish’ screen the log information about the installation.
11. Click ‘Finish’ to complete and close the installation.
5 Execute Package using DTEXECUI.exe
1. You can browse to the folder where you deployed and double-click a package to open DTEXECUI.exe
for that package.
2. You can use DTEXECUI to execute the package directly (Execute button at the bottom) or build a
command line for use in a batch file, agent, or other process.
3. When you execute via DTEXECUI.exe you will see a progress window (below) similar to what you see in
SQL BI Studio. When you execute directly with dtexec.exe from a command prompt, you will not see
that window, though dtexec.exe supports many switches including console reporting. See the help topic
suggestions at the end of this lab.
7 Open SQL Management Studio to see and execute the packages you just installed to SQL
1. Open SQL Server Management Studio and connect to your local SQL box with the initial connection dialog.
2. Once in Management Studio, connect to your local SQL Server Integration Services server. One way is
to double-click the name in the ‘Registered Servers’ pane.
8 Find and execute package “LoadApplications”
1. Expand the Stored Packages folder down to your packages in MSDB.
9.3.Conclusion
9.3.1.Comments
• The 2nd level of folders found in the SQL Management Studio ‘Stored Packages’ folder can be controlled
by the user via a configuration file used by the SSIS Windows Service. The default file installed ships with
folder names “MSDB” and “Maintenance Plans”, but the user can create whatever XML configuration file
they like (service configuration, not package configuration), and then modify a registry key to tell the
SSIS service, on service start, what file to load and where it is located. See BOL topics.
9.3.2.BOL
• “Creating a Deployment Utility”
• “How to: Create an Integration Services Package Deployment Utility”
• “Installing Packages”
• “How to: Run a Package Using the DTExecUI Utility “
• “Command Prompt Utilities (SSIS) “
• “Configuring the Integration Services Service”
• “Scheduling Package Execution in SQL Server Agent”
4
1. Save the package.
2. Execute.
3. Verify the number of rows now in the ‘mydescriptions’ table.
4. NOTE: If the wrong package executes, you need to change the default object in the solution by
right-clicking the desired package name (“LoadApplications.dtsx”) in the Solution Explorer.
10.2.2.BOL
• “Maintenance Tasks “
• “Execute SQL Task “
• The SSIS portal on MSDN. Lots of great information including white papers, Webcasts, recommended books.
http://msdn.microsoft.com/SQL/sqlwarehouse/SSIS/default.aspx
• Of course the excellent SQL Server Books Online, which ships with the product. You/customers can also
download a separate copy, handy for initial investigations when you want some details on specific features
but are not yet ready to install and play with the product.
http://www.microsoft.com/downloads/details.aspx?FamilyId=F0D182C1-C3AA-4CAC-B45C-BD15D9B072B7&displaylang=en
These are just examples of Reporting Services reports you can create based on the data. SSIS is a data integration
platform that includes various ways to produce detailed ‘instance data’ (logging, error rows, and audit information
in the flow) which customers can pull together in whatever way best suits them. That is one reason there is no
detailed/fixed support-console-like application: every customer’s needs are different, and we provide the data. The
examples here were built in SQL 2005 Reporting Services and will be made available to customers at some point,
downloadable or via an RS report pack.