Data warehouse
A data warehouse is a central repository containing stable, accurate, consistent, clearly understood data that are needed for management information and decision making across the whole organization. The data assembled in a warehouse are likely to have been drawn from a variety of source systems, and integrating these disparate sources into a consistent, enterprise-wide framework can be a major challenge: customers, for example, may be identified differently in each source system. Usually, the source data are re-organized around a particular subject, restructured specifically to suit reporting and analysis, and stored in a separate database. Data may also be summarized, though this needs careful consideration. Each of these changes can have a marked impact on performance.

Most data warehouses include a copy of the data in the organization's operational systems. Typically, copies will be taken at regular intervals in order to build up an historical database capable of revealing patterns and trends over time. Clearly the volume of data can be substantial, so the level of detail retained is a key design consideration. A warehouse may also contain external data, and other information previously kept by users in personal spreadsheets and databases, e.g. forecasts and competitor comparisons.

The physical form in which these data are held in the warehouse is another major design consideration, but it has no real bearing on whether or not something can be considered a data warehouse. In principle, everything could be stored in flat files, but in practice most data warehouses use a relational database, which is usually more efficient.
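As a sketch of this subject-oriented restructuring, the SQL below lays out a hypothetical sales warehouse as a fact table keyed to a customer dimension. All table and column names are invented for the illustration.

-- Hypothetical warehouse structure: a fact table keyed to dimension
-- tables, organized for reporting rather than for transaction updates.
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,  -- surrogate key reconciling source IDs
    source_system VARCHAR(20),          -- operational system the row came from
    customer_name VARCHAR(100)
);

CREATE TABLE sales_fact (
    month_key     INTEGER NOT NULL,     -- e.g. 202401, supports trend analysis
    customer_key  INTEGER NOT NULL REFERENCES dim_customer,
    product_key   INTEGER NOT NULL,
    quantity      INTEGER,
    sales_amount  DECIMAL(12,2)
);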
Data mart
A data mart is a similar information store created for a specific purpose, e.g. sales analysis or performance measurement. It is likely to be tailored to the needs of one or two departments or functional groups within the organization. Different data marts may be stored in different locations on different platforms using different database products.
Data staging
The most difficult and time-consuming aspect of building a data warehouse is taking data from disparate source systems, converting them into a consistent form that can be loaded into the warehouse, checking their quality and automating this process. This is known as data staging and typically accounts for 70-80% of the effort in a data warehousing initiative. Whilst the steps needed to physically move data from one system to another may be technically complex, the real issues lie with the structure and interpretation of the data itself: if all the source systems were consistent, there would be no need to build the warehouse! The political hurdles are even more significant, and include:
- establishing who owns what data;
- agreeing standard terminology, definitions and hierarchies;
- deciding which sources to use;
- securing resources to clean up the data.
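On the technical side, much of the staging effort is conversion and standardization. A minimal sketch of one such step, using invented table names: customer records from a hypothetical orders system are cleansed and mapped onto warehouse-wide keys through a cross-reference table.

-- Hypothetical staging step: standardize names and map each source
-- system's customer ID to a single warehouse-wide surrogate key.
INSERT INTO stg_customer (customer_key, customer_name, load_date)
SELECT xref.customer_key,
       UPPER(TRIM(src.cust_name)),   -- basic cleansing: case and whitespace
       CURRENT_DATE
FROM   src_orders_customers src
JOIN   customer_xref xref
  ON   xref.source_system = 'ORDERS'
 AND   xref.source_id     = src.cust_id;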
Metadata
One of the main reasons for building a data warehouse is to provide information that is clearly understood by the business. It is therefore essential to capture and store details of the origin, location, definition, quality and freshness of the data in the warehouse. This data about data is called metadata.
In an ideal world, metadata would be held in a standard format that could be shared by different components and tools in the warehouse environment. In practice, this is extremely difficult to achieve, but an industry group called the Meta Data Coalition are working towards it.
Business intelligence
Business Intelligence is a field closely associated with data warehousing, and is focused on the exploitation of data for business benefit, i.e. on reporting, analysis and decision support. Many business intelligence tools have evolved from those used to build executive information systems. Most of the analytic techniques available are well known in the operational research community, but it is instructive to review the way they are seen in a data warehousing context and by the various tool vendors. Organizations usually start with the simplest and work towards the more sophisticated in 4 stages:
Stage 1 - Query and reporting
The new data warehouse enables many more people to access the basic information they need to monitor performance regularly and take routine decisions. The focus is very much on improved management reporting.
Stage 2 - Multi-dimensional analysis
OLAP tools make it much easier for people to explore the data, investigate exceptions, and share insights. The focus shifts to encouraging more widespread use of basic quantitative analysis.
Stage 3 - Statistical analysis
Specialists are able to use more and better data from the warehouse for rigorous analysis, to test theories and establish which patterns are significant.
Stage 4 - Data mining
Genetic algorithms, neural networks and other mathematical techniques can be used to search for useful patterns and relationships that no one previously suspected. This requires large samples of data, specialist software and a combination of subject area and technical expertise.
Because these techniques require different tools and levels of expertise, it is quite normal to provide several different business intelligence tools to different user groups. Most vendors will claim to have tools covering all these techniques, but few are strong in more than one category. For example, several OLAP vendors have developed CHAID-based modules that can generate simple decision trees, which they market as data mining tools.
OLAP
The term on-line analytic processing is used to distinguish the requirements of reporting and analysis systems from those of transaction processing systems designed to run day-to-day business operations. On-line transaction processing (OLTP) focuses on capturing and updating information efficiently. This works best in a normalised, relational database, where every piece of data is stored in only one place, as part of a single record in a specific table. Management reporting, on the other hand, usually requires many records to be summarized, and information from different parts of the database to be combined, e.g. to derive a useful ratio. Good performance requires a different data structure, and the use of aggregates.

OLAP tools represent data as if it were held in one or more multi-dimensional arrays, known as cubes, with cells like a spreadsheet. These cubes often have more than 3 dimensions, so strictly speaking they should be called hypercubes, but it is much easier to visualize and explain how OLAP cubes are structured in plain 3-D. The edges of the cube represent the important dimensions of the business, such as time, country and product. One edge usually represents different measures, but some tools use separate cubes for each measure.
Each cell can be uniquely identified by specifying a member from each dimension e.g. {1999, Cost of sales, UK}. By selecting one or more members from each dimension, the user can slice and dice the cube to view almost any subset of the data from different perspectives. Dimension members may be organized into a hierarchy, with summary level members such as year, region or product group. The user can then drill down from one level to the next to see more detailed data, and then drill back up. Most OLAP tools also enable the user to switch instantly between tabular and chart formats, and to save favorite views of the data as reports for future reference. By manipulating cubes in this way, it is easy to answer questions such as these:
- What were our 3 best selling products last month?
- How does the number of customer complaints vary by store?
- Which regions have grown fastest over the last 5 years?
- How has our business mix changed since last year?
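The first question, for example, corresponds to a simple aggregate query over the warehouse. The tables and columns below are invented for the illustration, and the row-limiting clause varies by database.

-- Hypothetical query behind "What were our 3 best selling products last month?"
SELECT   p.product_name,
         SUM(f.sales_amount) AS total_sales
FROM     sales_fact f
JOIN     dim_product p ON p.product_key = f.product_key
WHERE    f.month_key = 202401          -- "last month" as a dimension member
GROUP BY p.product_name
ORDER BY total_sales DESC
FETCH FIRST 3 ROWS ONLY;               -- e.g. TOP 3 or LIMIT 3 elsewhere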
OLAP has become popular because it makes it relatively easy to explore a data warehouse or data mart, discover simple patterns and trends, and to share the insights gained.
Report types
The most common way to access a data mart or data warehouse is to run reports. Another very popular approach is to use OLAP tools. To compare different types of reporting and analysis interface, it is useful to classify reports along a spectrum of increasing flexibility and decreasing ease of use.

Standard reports are designed and built centrally, then published for general use. They are often run at regular intervals to show the latest available data and distributed to those who need or request them. They can be divided into three sub-types:
- Static reports (also known as canned reports) are completely fixed, and require no further input from the user, making them the fastest and easiest to use.
- Parameterised reports have a fixed layout, but allow the user to specify which data are to be included, usually through a series of prompts (e.g. which country and time period). They are easy to use, but take longer to initiate and, usually, to run.
- Interactive reports allow the user to manipulate the structure, layout and content of a generic report via buttons on the screen. They are a little harder to use, but once familiar with the basic interface, users have far greater flexibility, and can work much faster.
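Underneath, a parameterised report is typically a fixed query with placeholders bound at run time. A sketch, with invented names, of what such a report might issue:

-- Fixed layout, user-supplied parameters: :country and :period are
-- bound from the report prompts at run time.
SELECT   p.product_group,
         SUM(f.sales_amount) AS total_sales
FROM     sales_fact f
JOIN     dim_product  p ON p.product_key  = f.product_key
JOIN     dim_customer c ON c.customer_key = f.customer_key
WHERE    c.country   = :country
AND      f.month_key = :period
GROUP BY p.product_group;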
Ad hoc queries, as the name suggests, are queries written by (or for) the end user as a one-off exercise. The only limitations are the capabilities of the reporting tool and the data available. Ad hoc reporting requires greater expertise, but need not involve programming, as most modern reporting tools are able to generate SQL.

OLAP tools can be thought of as interactive reporting environments: they allow the user to interact with a cube of data and create views that can be saved and reused as generic, interactive reports. They are excellent for exploring summarized data, and some will allow the user to drill through from the cube into the underlying database to view the individual transaction details.

Having built a data warehouse or data mart, most organizations want to exploit it as quickly as possible. It is tempting to start by replacing all existing reports, but there is often considerable scope for rationalization, as many will have fallen into disuse. Also, it may be possible to replace hundreds of static reports with a few dozen interactive reports, and to design these so that they cover a large proportion of likely ad hoc queries as well.
Understanding these report types helps to clarify business users' requirements and select appropriate software, but it is also important to understand the needs of different types of user.
Types of user
Most warehouse implementation teams find that the user population can be divided into three broad groups:
- 80% casual users, who make infrequent use of the warehouse, and prefer static or parameterized reports.
- 15% active users, who make frequent use of standard reports, and sometimes require assistance with ad hoc requests. They are usually comfortable with interactive reports but still use static and parameterized reports.
- 5% power users, who prefer interactive reporting and frequently create their own ad hoc queries. They are often expert spreadsheet users, and regularly extract data for further analysis. Most OR analysts probably fit into this category.
In addition, the warehouse development team will need one or more expert users to write standard reports for central publication and provide training and support for end users. This role requires both business and technical knowledge, and is normally fulfilled by a management information specialist.
Data Warehousing

[Figure: a typical data warehousing architecture — data sources feed data acquisition, cleansing and integration processes, which populate the data stores; information services and information delivery support business analytics, with program management, development, and BI & DW operations surrounding the whole.]
Product Overview
This chapter includes the following topics:
- Introduction
- PowerCenter Domain
- PowerCenter Repository
- Administration Console
- Domain Configuration
- PowerCenter Client
- Repository Service
- Integration Service
- Web Services Hub
- Metadata Manager
- Reference Table Manager
Introduction
PowerCenter provides an environment that allows you to load data into a centralized location, such as a data warehouse or operational data store (ODS). You can extract data from multiple sources, transform the data according to business logic you build in the client application, and load the transformed data into file and relational targets. PowerCenter also provides the ability to view and analyze business information and browse and analyze metadata from disparate metadata repositories. PowerCenter includes the following components:
- PowerCenter domain. The PowerCenter domain is the primary unit for management and administration within PowerCenter. The Service Manager runs on a PowerCenter domain. The Service Manager supports the domain and the application services. Application services represent server-based functionality and include the Repository Service, Integration Service, Web Services Hub, and SAP BW Service.
- PowerCenter repository. The PowerCenter repository resides in a relational database. The repository database tables contain the instructions required to extract, transform, and load data.
- Administration Console. The Administration Console is a web application that you use to administer the PowerCenter domain and PowerCenter security.
- Domain configuration. The domain configuration is a set of relational database tables that stores the configuration information for the domain. The Service Manager on the master gateway node manages the domain configuration. The domain configuration is accessible to all gateway nodes in the domain.
- PowerCenter Client. The PowerCenter Client is an application used to define sources and targets, build mappings and mapplets with the transformation logic, and create workflows to run the mapping logic. The PowerCenter Client connects to the repository through the Repository Service to modify repository metadata. It connects to the Integration Service to start workflows.
- Repository Service. The Repository Service accepts requests from the PowerCenter Client to create and modify repository metadata and accepts requests from the Integration Service for metadata when a workflow runs.
- Integration Service. The Integration Service extracts data from sources and loads data to targets.
- Web Services Hub. The Web Services Hub is a gateway that exposes PowerCenter functionality to external clients through web services.
- SAP BW Service. The SAP BW Service extracts data from and loads data to SAP NetWeaver BI. If you use PowerExchange for SAP NetWeaver BI, you must create and enable an SAP BW Service in the PowerCenter domain.
- Reporting Service. The Reporting Service runs the Data Analyzer web application. Data Analyzer provides a framework for creating and running custom reports and dashboards. You can use Data Analyzer to run the metadata reports provided with PowerCenter, including the PowerCenter Repository Reports and Data Profiling Reports. Data Analyzer stores the data source schemas and report metadata in the Data Analyzer repository.
- Metadata Manager Service. The Metadata Manager Service runs the Metadata Manager web application. You can use Metadata Manager to browse and analyze metadata from disparate metadata repositories. Metadata Manager helps you understand and manage how information and processes are derived, how they are related, and how they are used. Metadata Manager stores information about the metadata to be analyzed in the Metadata Manager repository.
- Reference Table Manager Service. The Reference Table Manager Service runs the Reference Table Manager web application. Use Reference Table Manager to manage reference data such as valid, default, and cross-reference values. Reference Table Manager stores reference table metadata and the users and connection information in the Reference Table Manager repository. The reference tables are stored in a staging area.
[Figure: PowerCenter components — the PowerCenter Client tools (Designer, Workflow Manager, Workflow Monitor, Repository Manager); sources (relational, flat files, web services, applications, mainframe, other); the Service Manager with its application services (Repository Service, Integration Service, Web Services Hub, SAP BW Service, Reporting Service, Metadata Manager Service, Reference Table Manager Service); the Administration Console; and the domain configuration plus the Data Analyzer, PowerCenter, Metadata Manager, and Reference Table Manager repositories.]
Sources
PowerCenter accesses the following sources:
- Relational. Oracle, Sybase ASE, Informix, IBM DB2, Microsoft SQL Server, and Teradata.
- File. Fixed and delimited flat file, COBOL file, XML file, and web log.
- Application. You can purchase additional PowerExchange products to access business sources such as Hyperion Essbase, WebSphere MQ, IBM DB2 OLAP Server, JMS, Microsoft Message Queue, PeopleSoft, SAP NetWeaver, SAS, Siebel, TIBCO, and webMethods.
- Mainframe. You can purchase PowerExchange to access source data from mainframe databases such as Adabas, Datacom, IBM DB2 OS/390, IBM DB2 OS/400, IDMS, IDMS-X, IMS, and VSAM.
- Other. Microsoft Excel, Microsoft Access, and external web services.
Targets
PowerCenter can load data into the following targets:
- Relational. Oracle, Sybase ASE, Sybase IQ, Informix, IBM DB2, Microsoft SQL Server, and Teradata.
- File. Fixed and delimited flat file and XML.
- Application. You can purchase additional PowerExchange products to load data into business sources such as Hyperion Essbase, WebSphere MQ, IBM DB2 OLAP Server, JMS, Microsoft Message Queue, PeopleSoft EPM, SAP NetWeaver, SAP NetWeaver BI, SAS, Siebel, TIBCO, and webMethods.
- Mainframe. You can purchase PowerExchange to load data into mainframe databases such as IBM DB2 for z/OS, IMS, and VSAM.
- Other. Microsoft Access and external web services.
You can load data into targets using ODBC or native drivers, FTP, or external loaders.
PowerCenter Domain
PowerCenter has a service-oriented architecture that provides the ability to scale services and share resources across multiple machines. PowerCenter provides the PowerCenter domain to support the administration of the PowerCenter services. A domain is the primary unit for management and administration of services in PowerCenter. A domain contains the following components:
- One or more nodes. A node is the logical representation of a machine in a domain. A domain may contain more than one node. The node that hosts the domain is the master gateway for the domain. You can add other machines as nodes in the domain and configure the nodes to run application services such as the Integration Service or Repository Service. All service requests from other nodes in the domain go through the master gateway. A node runs service processes, which are the runtime representations of application services running on that node.
- Service Manager. The Service Manager is built in to the domain to support the domain and the application services. The Service Manager runs on each node in the domain. The Service Manager starts and runs the application services on a machine.
- Application services. A group of services that represent PowerCenter server-based functionality. The application services that run on each node in the domain depend on the way you configure the node and the application service.
You use the PowerCenter Administration Console to manage the domain. If you have the high availability option, you can scale services and eliminate single points of failure for services. The Service Manager and application services can continue running despite temporary network or hardware failures. High availability includes resilience, failover, and recovery for services and tasks in a domain. Figure 1-2 shows a sample domain with three nodes:
Figure 1-2. Domain with Three Nodes
Node 1 (Master Gateway): Service Manager
Node 2: Service Manager, Integration Service
Node 3: Service Manager, Repository Service
This domain has a master gateway on Node 1. Node 2 runs an Integration Service, and Node 3 runs the Repository Service.
Service Manager
The Service Manager is built in to the domain and supports the domain and the application services. The Service Manager performs the following functions:
- Alerts. Provides notifications about domain and service events.
- Authentication. Authenticates user requests from the Administration Console, PowerCenter Client, Metadata Manager, and Data Analyzer.
- Authorization. Authorizes user requests for domain objects. Requests can come from the Administration Console or from infacmd.
- Domain configuration. Manages domain configuration metadata.
- Node configuration. Manages node configuration metadata.
- Licensing. Registers license information and verifies license information when you run application services.
- Logging. Provides accumulated log events from each service in the domain. You can view logs in the Administration Console and Workflow Monitor.
- User management. Manages users, groups, roles, and privileges.
Application Services
When you install PowerCenter Services, the installation program installs the following application services:
- Repository Service. Manages connections to the PowerCenter repository.
- Integration Service. Runs sessions and workflows.
- Web Services Hub. Exposes PowerCenter functionality to external clients through web services.
- SAP BW Service. Listens for RFC requests from SAP NetWeaver BI and initiates workflows to extract from or load to SAP NetWeaver BI.
- Reporting Service. Runs the Data Analyzer application.
- Metadata Manager Service. Runs the Metadata Manager application.
- Reference Table Manager Service. Runs the Reference Table Manager application.
PowerCenter Repository
The PowerCenter repository resides in a relational database. The repository stores information required to extract, transform, and load data. It also stores administrative information such as permissions and privileges for users and groups that have access to the repository. PowerCenter applications access the PowerCenter repository through the Repository Service. You administer the repository through the PowerCenter Administration Console and command line programs. You can develop global and local repositories to share metadata:
- Global repository. The global repository is the hub of the repository domain. Use the global repository to store common objects that multiple developers can use through shortcuts. These objects may include operational or application source definitions, reusable transformations, mapplets, and mappings.
- Local repositories. A local repository is any repository within the domain that is not the global repository. Use local repositories for development. From a local repository, you can create shortcuts to objects in shared folders in the global repository. These objects include source definitions, common dimensions and lookups, and enterprise standard transformations. You can also create copies of objects in non-shared folders.
PowerCenter supports versioned repositories. A versioned repository can store multiple versions of an object. PowerCenter version control allows you to efficiently develop, test, and deploy metadata into production. You can view repository metadata in the Repository Manager. Informatica Metadata Exchange (MX) provides a set of relational views that allow easy SQL access to the PowerCenter metadata repository. You can also create a Reporting Service in the Administration Console and run the PowerCenter Repository Reports to view repository metadata.
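As a sketch of MX access, the query below lists mappings by folder through an MX view. REP_ALL_MAPPINGS is one of the MX views, but the exact columns exposed vary by PowerCenter version, so treat the column names here as assumptions.

-- List mappings per folder via an MX relational view (column names
-- are assumptions; check the MX views reference for your version).
SELECT subject_area,     -- the folder containing the mapping
       mapping_name
FROM   REP_ALL_MAPPINGS
ORDER  BY subject_area, mapping_name;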
Administration Console
The Administration Console is a web application that you use to administer the PowerCenter domain and PowerCenter security.
Domain Page
You administer the PowerCenter domain on the Domain page of the Administration Console. Domain objects include services, nodes, and licenses. You can complete the following tasks in the Domain page:
- Manage application services. Manage all application services in the domain, such as the Integration Service and Repository Service.
- Configure nodes. Configure node properties, such as the backup directory and resources. You can also shut down and restart nodes.
- Manage domain objects. Create and manage objects such as services, nodes, licenses, and folders. Folders allow you to organize domain objects and manage security by setting permissions for domain objects.
- View and edit domain object properties. View and edit properties for all objects in the domain, including the domain object.
- View log events. Use the Log Viewer to view domain, Integration Service, SAP BW Service, Web Services Hub, and Repository Service log events.
Other domain management tasks include applying licenses and managing grids and resources. Figure 1-3 shows the Domain page:
Figure 1-3. Domain Page of the PowerCenter Administration Console
Security Page
You administer PowerCenter security on the Security page of the Administration Console. You manage users and groups that can log in to the following PowerCenter applications:
- Administration Console
- PowerCenter Client
- Metadata Manager
- Data Analyzer

You can complete the following tasks in the Security page:

- Manage native users and groups. Create, edit, and delete native users and groups.
- Configure LDAP authentication and import LDAP users and groups. Configure a connection to an LDAP directory service. Import users and groups from the LDAP directory service.
- Manage roles. Create, edit, and delete roles. Roles are collections of privileges. Privileges determine the actions that users can perform in PowerCenter applications.
- Assign roles and privileges to users and groups. Assign roles and privileges to users and groups for the domain, Repository Service, Metadata Manager Service, or Reporting Service.
- Manage operating system profiles. Create, edit, and delete operating system profiles. An operating system profile is a level of security that the Integration Service uses to run workflows. The operating system profile contains the operating system user name, service process variables, and environment variables. You can configure the Integration Service to use operating system profiles to run workflows.
Domain Configuration
Configuration information for a PowerCenter domain is stored in a set of relational database tables managed by the Service Manager and accessible to all gateway nodes in the domain. The domain configuration database stores the following types of information about the domain:

- Domain configuration. Domain metadata such as host names and port numbers of nodes in the domain. The domain configuration database also stores information on the master gateway node and all other nodes in the domain.
- Usage. Includes CPU usage for each application service and the number of Repository Services running in the domain.
- Users and groups. Information on the native and LDAP users and the relationships between users and groups.
- Privileges and roles. Information on the privileges and roles assigned to users and groups in the domain.
Each time you make a change to the domain, the Service Manager updates the domain configuration database. For example, when you add a node to the domain, the Service Manager adds the node information to the domain configuration. All gateway nodes connect to the domain configuration database to retrieve domain information and update the domain configuration.
PowerCenter Client
The PowerCenter Client application consists of the following tools that you use to manage the repository, design mappings and mapplets, and create sessions to load the data:

- Designer. Use the Designer to create mappings that contain transformation instructions for the Integration Service.
- Mapping Architect for Visio. Use the Mapping Architect for Visio to create mapping templates that can be used to generate multiple mappings.
- Repository Manager. Use the Repository Manager to assign permissions to users and groups and manage folders.
- Workflow Manager. Use the Workflow Manager to create, schedule, and run workflows. A workflow is a set of instructions that describes how and when to run tasks related to extracting, transforming, and loading data.
- Workflow Monitor. Use the Workflow Monitor to monitor scheduled and running workflows for each Integration Service.
PowerCenter Designer
The Designer has the following tools that you use to analyze sources, design target schemas, and build source-to-target mappings:

- Source Analyzer. Import or create source definitions.
- Target Designer. Import or create target definitions.
- Transformation Developer. Develop transformations to use in mappings. You can also develop user-defined functions to use in expressions.
- Mapplet Designer. Create sets of transformations to use in mappings.
- Mapping Designer. Create mappings that the Integration Service uses to extract, transform, and load data.

The Designer also provides the following windows:

- Navigator. Connect to repositories and open folders within the Navigator. You can also copy objects and create shortcuts within the Navigator.
- Workspace. Open different tools in this window to create and edit repository objects, such as sources, targets, mapplets, transformations, and mappings.
- Output. View details about tasks you perform, such as saving your work or validating a mapping.
Repository Manager
Use the Repository Manager to administer repositories. You can navigate through multiple folders and repositories, and complete the following tasks:
- Manage user and group permissions. Assign and revoke folder and global object permissions.
- Perform folder functions. Create, edit, copy, and delete folders. Work you perform in the Designer and Workflow Manager is stored in folders. If you want to share metadata, you can configure a folder to be shared.
- View metadata. Analyze sources, targets, mappings, and shortcut dependencies, search by keyword, and view the properties of repository objects.

The Repository Manager has the following windows:

- Navigator. Displays all objects that you create in the Repository Manager, the Designer, and the Workflow Manager. It is organized first by repository, then by folder.
- Main. Provides properties of the object selected in the Navigator. The columns in this window change depending on the object selected in the Navigator.
- Output. Provides the output of tasks executed within the Repository Manager.
Repository Objects
You create repository objects using the Designer and Workflow Manager client tools. You can view the following objects in the Navigator window of the Repository Manager:
- Source definitions. Definitions of database objects such as tables, views, synonyms, or files that provide source data.
- Target definitions. Definitions of database objects or files that contain the target data.
- Mappings. A set of source and target definitions along with transformations containing business logic that you build into the transformation. These are the instructions that the Integration Service uses to transform and move data.
- Reusable transformations. Transformations that you use in multiple mappings.
- Mapplets. A set of transformations that you use in multiple mappings.
- Sessions and workflows. Sessions and workflows store information about how and when the Integration Service moves data. A workflow is a set of instructions that describes how and when to run tasks related to extracting, transforming, and loading data. A session is a type of task that you can put in a workflow. Each session corresponds to a single mapping.
Workflow Manager
In the Workflow Manager, you define a set of instructions to execute tasks such as sessions, emails, and shell commands. This set of instructions is called a workflow. The Workflow Manager has the following tools to help you develop a workflow:
- Task Developer. Create tasks you want to accomplish in the workflow.
- Worklet Designer. Create a worklet in the Worklet Designer. A worklet is an object that groups a set of tasks. A worklet is similar to a workflow, but without scheduling information. You can nest worklets inside a workflow.
- Workflow Designer. Create a workflow by connecting tasks with links in the Workflow Designer. You can also create tasks in the Workflow Designer as you develop the workflow.
When you create a workflow in the Workflow Designer, you add tasks to the workflow. The Workflow Manager includes tasks, such as the Session task, the Command task, and the Email task so you can design a workflow. The Session task is based on a mapping you build in the Designer. You then connect tasks with links to specify the order of execution for the tasks you created. Use conditional links and workflow variables to create branches in the workflow. When the workflow start time arrives, the Integration Service retrieves the metadata from the repository to execute the tasks in the workflow. You can monitor the workflow status in the Workflow Monitor.
Workflow Monitor
You can monitor workflows and tasks in the Workflow Monitor. You can view details about a workflow or task in Gantt Chart view or Task view. You can run, stop, abort, and resume workflows from the Workflow Monitor. You can view sessions and workflow log events in the Workflow Monitor Log Viewer. The Workflow Monitor displays workflows that have run at least once. The Workflow Monitor continuously receives information from the Integration Service and Repository Service. It also fetches information from the repository to display historic information. The Workflow Monitor consists of the following windows:
- Navigator window. Displays monitored repositories, servers, and repository objects.
- Output window. Displays messages from the Integration Service and Repository Service.
- Time window. Displays progress of workflow runs.
- Gantt Chart view. Displays details about workflow runs in chronological format.
- Task view. Displays details about workflow runs in a report format.
Repository Service
The Repository Service manages connections to the PowerCenter repository from repository clients. A repository client is any PowerCenter component that connects to the repository. The Repository Service is a separate, multi-threaded process that retrieves, inserts, and updates metadata in the repository database tables. The Repository Service ensures the consistency of metadata in the repository. The Repository Service accepts connection requests from the following PowerCenter components:
- PowerCenter Client. Use the Designer and Workflow Manager to create and store mapping metadata and connection object information in the repository. Use the Workflow Monitor to retrieve workflow run status information and session logs written by the Integration Service. Use the Repository Manager to organize and secure metadata by creating folders and assigning permissions to users and groups.
- Command line programs. Use command line programs to perform repository metadata administration tasks and service-related functions.
- Integration Service. When you start the Integration Service, it connects to the repository to schedule workflows. When you run a workflow, the Integration Service retrieves workflow task and mapping metadata from the repository. The Integration Service writes workflow status to the repository.
- Web Services Hub. When you start the Web Services Hub, it connects to the repository to access web-enabled workflows. The Web Services Hub retrieves workflow task and mapping metadata from the repository and writes workflow status to the repository.
You install the Repository Service when you install PowerCenter Services. After you install the PowerCenter Services, you can use the Administration Console to manage the Repository Service.
Integration Service
The Integration Service reads workflow information from the repository. The Integration Service connects to the repository through the Repository Service to fetch metadata from the repository. A workflow is a set of instructions that describes how and when to run tasks related to extracting, transforming, and loading data. The Integration Service runs workflow tasks. A session is a type of workflow task. A session is a set of instructions that describes how to move data from sources to targets using a mapping.

A session extracts data from the mapping sources and stores the data in memory while it applies the transformation rules that you configure in the mapping. The Integration Service loads the transformed data into the mapping targets. Other workflow tasks include commands, decisions, timers, pre-session SQL commands, post-session SQL commands, and email notification. The Integration Service can combine data from different platforms and source types. For example, you can join data from a flat file and an Oracle source. The Integration Service can also load data to different platforms and target types.

You install the Integration Service when you install PowerCenter Services. After you install the PowerCenter Services, you can use the Administration Console to manage the Integration Service.
Web Services Hub

The Web Services Hub is a gateway that exposes PowerCenter functionality to external clients through web services. It hosts the following web services:

- Batch web services. Includes operations to run and monitor sessions and workflows and access repository information. Batch web services are installed with PowerCenter.
- Real-time web services. Workflows enabled as web services that can receive requests and generate responses in SOAP message format. You create real-time web services when you enable PowerCenter workflows as web services.
Use the Administration Console to configure and manage the Web Services Hub. Use the Web Services Hub Console to view information about the web service and download WSDL files necessary for creating web service clients.
Overview
PowerCenter Getting Started provides lessons that introduce you to PowerCenter and how to use it to load transformed data into file and relational targets. The lessons in this book are designed for PowerCenter beginners. This tutorial walks you through the process of creating a data warehouse. The tutorial teaches you how to perform the following tasks:
- Create users and groups.
- Add source definitions to the repository.
- Create targets and add their definitions to the repository.
- Map data between sources and targets.
- Instruct the Integration Service to write data to targets.
- Monitor the Integration Service as it writes data to targets.
In general, you can set the pace for completing the tutorial. However, you should complete an entire lesson in one session, since each lesson builds on a sequence of related tasks.
Getting Started
The PowerCenter administrator must install and configure the PowerCenter Services and Client. Verify that the administrator has completed the following steps:
- Installed the PowerCenter Services and created a PowerCenter domain.
- Created a repository.
- Installed the PowerCenter Client.
You also need information to connect to the PowerCenter domain and repository and the source and target database tables. Use the tables in PowerCenter Domain and Repository on page 20 to write down the domain and repository information. Use the tables in PowerCenter Source and Target on page 21 to write down the source and target connectivity information. Contact the PowerCenter administrator for the necessary information. Before you begin the lessons, read Product Overview on page 1. The product overview explains the different components that work together to extract, transform, and load data.
Create a group with all privileges on a Repository Service. The privileges allow users in the group to design mappings and run workflows in the PowerCenter Client. Create a user account and assign it to the group. The user inherits the privileges of the group.
In this tutorial, you use the following PowerCenter Client tools:

- Repository Manager. You use the Repository Manager to create a folder to store the metadata you create in the lessons.
- Designer. Use the Designer to create mappings that contain transformation instructions for the Integration Service. Before you can create mappings, you must add source and target definitions to the repository. In this tutorial, you use the following tools in the Designer:
  - Source Analyzer. Import or create source definitions.
  - Target Designer. Import or create target definitions. You also create tables in the target database based on the target definitions.
  - Mapping Designer. Create mappings that the Integration Service uses to extract, transform, and load data.
- Workflow Manager. Use the Workflow Manager to create and run workflows and tasks. A workflow is a set of instructions that describes how and when to run tasks related to extracting, transforming, and loading data.
- Workflow Monitor. Use the Workflow Monitor to monitor scheduled and running workflows for each Integration Service.
Domain
Use the tables in this section to record the domain connectivity and default administrator information. If necessary, contact the PowerCenter administrator for the information.
Administrator
Use Table 2-2 to record the information you need to connect to the Administration Console as the default administrator:
Table 2-2. Default Administrator Login
Administration Console
Default Administrator User Name: Administrator
Default Administrator Password: ________
Use the default administrator account for the lesson Creating Users and Groups on page 23. For all other lessons, you use the user account that you create in the lesson Creating a User on page 25 to log in to the PowerCenter Client.
Note: The default administrator user name is Administrator. If you do not have the password for the default administrator, ask the PowerCenter administrator to provide this information or set up a domain administrator account that you can use. Record the user name and password of the domain administrator.
Note: Ask the PowerCenter administrator to provide the name of a repository where you can create the folder, mappings, and workflows in this tutorial. The user account you use to connect to the repository is the user account you create in Creating a User.
Tutorial Lesson 1
This chapter includes the following topics:
- Creating Users and Groups
- Creating a Folder in the PowerCenter Repository
- Creating Source Tables
To log in to the Administration Console:

1. Open Microsoft Internet Explorer or Mozilla Firefox.
2. In the Address field, enter the following URL for the Administration Console login page:

   http://<host>:<port>/adminconsole

   If you configure HTTPS for the Administration Console, the URL redirects to the HTTPS-enabled site. If the node is configured for HTTPS with a keystore that uses a self-signed certificate, a warning message appears. To enter the site, accept the certificate. The Informatica PowerCenter Administration Console login page appears.
3. Enter the default administrator user name and password. Use the Administrator user name and password you recorded in Table 2-2 on page 21.
4. Select Native.
5. Click Login.
6. If the Administration Assistant displays, click Administration Console.
Creating a Group
In the following steps, you create a new group and assign privileges to the group.
To create the TUTORIAL group:

1. In the Administration Console, go to the Security page.
2. Click Create Group.
3. Enter the following information for the group:

   Property      Value
   Name          TUTORIAL
   Description   Group used for the PowerCenter tutorial.

4. Click OK to save the group. The TUTORIAL group appears on the list of native groups in the Groups section of the Navigator. The details for the new group display in the right pane.
5. Select the TUTORIAL group in the Navigator.
6. Click Edit.
7. In the Edit Roles and Privileges dialog box, click the Privileges tab.
8. Expand the privileges list for the Repository Service that you plan to use.
9. Click the box next to the Repository Service name to assign all privileges to the TUTORIAL group.
10. Click OK.

Users in the TUTORIAL group now have the privileges to create workflows in any folder for which they have read and write permission.
Creating a User
The final step is to create a new user account and add that user to the TUTORIAL group. You use this user account throughout the rest of this tutorial.
To create a new user:

1. On the Security page, click Create User.
2. Enter a login name for the user account. You use this user name when you log in to the PowerCenter Client to complete the rest of the tutorial.
3. Enter a password and confirm. You must retype the password. Do not copy and paste the password.
4. Click OK to save the user account. The details for the new user account display in the right pane.
8. Select the group name TUTORIAL in the All Groups column and click Add. The TUTORIAL group displays in the Assigned Groups list.
9. Click OK to save the group assignment. The user account now has all the privileges associated with the TUTORIAL group.
Folder Permissions
Permissions allow users to perform tasks within a folder. With folder permissions, you can control user access to the folder and the tasks you permit them to perform. Folder permissions work closely with privileges. Privileges grant access to specific tasks, while permissions grant access to specific folders with read, write, and execute access. Folders have the following types of permissions:
- Read permission. You can view the folder and objects in the folder.
- Write permission. You can create or edit objects in the folder.
- Execute permission. You can run or schedule workflows in the folder.
When you create a folder, you are the owner of the folder. The folder owner has all permissions on the folder, which cannot be changed.
Connecting to the Repository

To connect to the repository:

1. Launch the Repository Manager.
2. Click Repository > Add Repository. The Add Repository dialog box appears.
3. Enter the repository and user name. Use the name of the repository in Table 2-3. Use the name of the user account you created in Creating a User.
4. Click OK.
5. Click Repository > Connect or double-click the repository to connect. The Connect to Repository dialog box appears.
6. In the connection settings section, click Add to add the domain connection information. The Add Domain dialog box appears.
7. Enter the domain name, gateway host, and gateway port number from Table 2-1 on page 21.
8. Click OK. If a message indicates that the domain already exists, click Yes to replace the existing domain.
9. In the Connect to Repository dialog box, enter the password for the Administrator user.
10. Select the Native security domain.
11. Click Connect.
Creating a Folder
For this tutorial, you create a folder where you will define the data sources and targets, build mappings, and run workflows in later lessons.
To create a new folder:

1. In the Repository Manager, click Folder > Create.
2. Enter your name prefixed by Tutorial_ as the name of the folder. By default, the user account logged in is the owner of the folder and has full permissions on the folder.
3. Click OK. The Repository Manager displays a message that the folder has been successfully created.
4. Click OK to dismiss the message.
Creating Source Tables

The tutorial SQL scripts create the following source tables:

CUSTOMERS, DEPARTMENT, DISTRIBUTORS, EMPLOYEES, ITEMS, ITEMS_IN_PROMOTIONS, JOBS, MANUFACTURERS, ORDERS, ORDER_ITEMS, PROMOTIONS, STORES
The Target Designer generates SQL based on the definitions in the workspace. Generally, you use the Target Designer to create target tables in the target database. In this lesson, you use this feature to generate the source tutorial tables from the tutorial SQL scripts that ship with the product. When you run the SQL script, you also create a stored procedure that you will use to create a Stored Procedure transformation in another lesson.
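As a rough sketch of what such a script contains — the shipped smpl_*.sql files are more extensive, and only the EMPLOYEE_ID and JOB_ID columns below are confirmed by this tutorial, the rest being placeholders:

-- Simplified sketch of a tutorial script: create a source table and
-- seed it with rows (the shipped scripts also create the stored procedure).
CREATE TABLE EMPLOYEES (
    EMPLOYEE_ID  INTEGER NOT NULL,    -- used later as the target primary key
    LAST_NAME    VARCHAR(20),         -- placeholder column
    FIRST_NAME   VARCHAR(20),         -- placeholder column
    JOB_ID       INTEGER              -- deleted from the target definition later
);

INSERT INTO EMPLOYEES (EMPLOYEE_ID, LAST_NAME, FIRST_NAME, JOB_ID)
VALUES (2100, 'Osborne', 'Mary', 100);  -- illustrative row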
To create the sample source tables:

1. Launch the Designer, double-click the icon for the repository, and log in to the repository. Use your user profile to open the connection.
2. Double-click the Tutorial_yourname folder.
3. Click Tools > Target Designer to open the Target Designer.
4. Click Targets > Generate/Execute SQL. The Database Object Generation dialog box gives you several options for creating tables.
5. Click the Connect button to connect to the source database.
6. Select the ODBC data source you created to connect to the source database.
7. Enter the database user name and password and click Connect. You now have an open connection to the source database. When you are connected, the Disconnect button appears and the ODBC name of the source database appears in the dialog box.
8. Make sure the Output window is open at the bottom of the Designer. If it is not open, click View > Output.
9. Click the browse button to find the SQL file. The SQL file is installed in the following directory:

   C:\Program Files\Informatica PowerCenter\client\bin

10. Select the SQL file appropriate to the source database platform you are using, and click Open. Alternatively, you can enter the path and file name of the SQL file.

    Platform               File
    Informix               smpl_inf.sql
    Microsoft SQL Server   smpl_ms.sql
    Oracle                 smpl_ora.sql
    Sybase ASE             smpl_syb.sql
    DB2                    smpl_db2.sql
    Teradata               smpl_tera.sql

11. Click Execute SQL file. The database now executes the SQL script to create the sample source database objects and to insert values into the source tables. While the script is running, the Output window displays the progress. The Designer generates and executes SQL scripts in Unicode (UCS-2) format.
Tutorial Lesson 2
This chapter includes the following topics:

- Creating Source Definitions
- Creating Target Definitions and Target Tables

Creating Source Definitions

To import the sample source definitions:

1. In the Designer, click Tools > Source Analyzer to open the Source Analyzer.
2. Double-click the tutorial folder to view its contents. Every folder contains nodes for sources, targets, schemas, mappings, mapplets, cubes, dimensions and reusable transformations.
3. Click Sources > Import from Database.
4. Select the ODBC data source to access the database containing the source tables.
5. Enter the user name and password to connect to this database. Also, enter the name of the source table owner, if necessary. Use the database connection information you entered in Table 2-4 on page 22. In Oracle, the owner name is the same as the user name. Make sure that the owner name is in all caps. For example, JDOE.
6. Click Connect.
7. In the Select tables list, expand the database owner and the TABLES heading. If you click the All button, you can see all tables in the source database. A list of all the tables you created by running the SQL script appears in addition to any tables already in the database.
8. Select the following tables:

   CUSTOMERS, DEPARTMENT, DISTRIBUTORS, EMPLOYEES, ITEMS, ITEMS_IN_PROMOTIONS, JOBS, MANUFACTURERS, ORDERS, ORDER_ITEMS, PROMOTIONS, STORES

   Hold down the Ctrl key to select multiple tables. Or, hold down the Shift key to select a block of tables. You may need to scroll down the list of tables to select all tables.

   Note: Database objects created in Informix databases have shorter names than those created in other types of databases. For example, the name of the table ITEMS_IN_PROMOTIONS is shortened to ITEMS_IN_PROMO.

9. Click OK to import the source definitions.
The Designer displays the newly imported sources in the workspace. You can click Layout > Scale to Fit to fit all the definitions in the workspace.

A new database definition (DBD) node appears under the Sources node in the tutorial folder. This new entry has the same name as the ODBC data source you used to access the sources you just imported. If you double-click the DBD node, the list of all the imported sources appears.
Viewing Source Definitions

To view a source definition:

1. Double-click the title bar of the source definition for the EMPLOYEES table to open the EMPLOYEES source definition. The Edit Tables dialog box appears and displays all the properties of this source definition. The Table tab shows the name of the table, business name, owner name, and the database type. You can add a comment in the Description section.
2. Click the Columns tab. The Columns tab displays the column descriptions for the source table.
Note: The source definition must match the structure of the source table. Therefore, you must not modify source column definitions after you import them.
3. Click the Metadata Extensions tab. Metadata extensions allow you to extend the metadata stored in the repository by associating information with individual repository objects. For example, you can store contact information, such as name or email address, with the sources you create. In this lesson, you create user-defined metadata extensions that define the date you created the source definition and the name of the person who created the source definition.
4. Click the Add button to add a metadata extension.
5. Name the new row SourceCreationDate and enter today's date as the value.
6. Click the Add button to add another metadata extension and name it SourceCreator.
7. Enter your first name as the value in the SourceCreator row.
8. Click Apply.
9. Click OK to close the dialog box.
10. Click Repository > Save to save the changes to the repository.
In the following steps, you copy the EMPLOYEES source definition into the Target Designer to create the target definition. Then, you modify the target definition by deleting and adding columns to create the definition you want.
To create the T_EMPLOYEES target definition:

1. In the Designer, click Tools > Target Designer to open the Target Designer.
2. Drag the EMPLOYEES source definition from the Navigator to the Target Designer workspace. The Designer creates a new target definition, EMPLOYEES, with the same column definitions as the EMPLOYEES source definition and the same database type. Next, modify the target column definitions.
3. Double-click the EMPLOYEES target definition to open it.
4. Click Rename and name the target definition T_EMPLOYEES.
Note: If you need to change the database type for the target definition, you can select the correct database type when you edit the target definition.
5. Click the Columns tab. The target column definitions are the same as the EMPLOYEES source definition.
6. Select the JOB_ID column and click the delete button.
7. Delete the following columns:
When you finish, the target definition should look similar to the following target definition:
Note that the EMPLOYEE_ID column is a primary key. The primary key cannot accept null values. The Designer selects Not Null and disables the Not Null option. You now have a column ready to receive data from the EMPLOYEE_ID column in the EMPLOYEES source table.
Note: If you want to add a business name for any column, scroll to the right and enter it.
8. 9.
Click OK to save the changes and close the dialog box. Click Repository > Save.
In the workspace, select the T_EMPLOYEES target definition. Click Targets > Generate/Execute SQL. The Database Object Generation dialog box appears.
3.
If you installed the PowerCenter Client in a different location, enter the appropriate drive letter and directory.
4. If you are connected to the source database from the previous lesson, click Disconnect, and then click Connect.
5. Select the ODBC data source to connect to the target database.
6. Enter the necessary user name and password, and then click Connect.
7. Select the Create Table, Drop Table, Foreign Key and Primary Key options.
8. Click the Generate and Execute button. To view the results, click the Generate tab in the Output window. To edit the contents of the SQL file, click the Edit SQL File button. The Designer runs the DDL code needed to create T_EMPLOYEES.
9. Click Close.
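For orientation, the script that the Designer generates resembles the DDL below. Treat this as a sketch only: the real column list and datatypes come from your EMPLOYEES source definition, and the non-key column names shown here are illustrative assumptions, not the tutorial's actual schema.

DROP TABLE T_EMPLOYEES;
CREATE TABLE T_EMPLOYEES (
    EMPLOYEE_ID  INTEGER NOT NULL,   -- primary key; cannot accept null values
    LAST_NAME    VARCHAR(30),        -- illustrative column
    FIRST_NAME   VARCHAR(30),        -- illustrative column
    OFFICE_PHONE VARCHAR(30),        -- illustrative column
    PRIMARY KEY (EMPLOYEE_ID)
);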
Tutorial Lesson 3
This chapter includes the following topics:
Creating a Pass-Through Mapping
Creating Sessions and Workflows
Running and Monitoring Workflows
The source qualifier represents the rows that the Integration Service reads from the source when it runs a session. If you examine the mapping, you see that data flows from the source definition to the Source Qualifier transformation to the target definition through a series of input and output ports.
The source provides information, so it contains only output ports, one for each column. Each output port is connected to a corresponding input port in the Source Qualifier transformation. The Source Qualifier transformation contains both input and output ports. The target contains input ports. When you design mappings that contain different types of transformations, you can configure transformation ports as inputs, outputs, or both. You can rename ports and change the datatypes.
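In relational terms, the Source Qualifier behaves like a SELECT statement with one column per connected port: when the session runs, the Integration Service reads the rows this query would return. A hedged illustration, where the column names are assumptions based on the phone-list mapping:

SELECT EMPLOYEE_ID, LAST_NAME, FIRST_NAME, OFFICE_PHONE
FROM EMPLOYEES;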
Creating a Mapping
In the following steps, you create a mapping and link columns in the source EMPLOYEES table to a Source Qualifier transformation.
To create a mapping:
1. Click Tools > Mapping Designer to open the Mapping Designer.
2. In the Navigator, expand the Sources node in the tutorial folder, and then expand the DBD node containing the tutorial sources.
3. Drag the EMPLOYEES source definition into the Mapping Designer workspace. The Designer creates a new mapping and prompts you to provide a name.
4. In the Mapping Name dialog box, enter m_PhoneList as the name of the new mapping and click OK. The naming convention for mappings is m_MappingName. The source definition appears in the workspace. The Designer creates a Source Qualifier transformation and connects it to the source definition.
5. Expand the Targets node in the Navigator to open the list of all target definitions.
6. Drag the T_EMPLOYEES target definition into the workspace. The target definition appears. The final step is to connect the Source Qualifier transformation to the target definition.
Connecting Transformations
The port names in the target definition are the same as some of the port names in the Source Qualifier transformation. When you need to link ports between transformations that have the same name, the Designer can link them based on name. In the following steps, you use the autolink option to connect the Source Qualifier transformation to the target definition.
To connect the Source Qualifier transformation to the target definition:
1. Click Layout > Autolink. The Auto Link dialog box appears.
2. Select T_EMPLOYEES in the To Transformations field.
3. Verify that SQ_EMPLOYEES is in the From Transformation field. Autolink by name and click OK. The Designer links ports from the Source Qualifier transformation to the target definition by name. A link appears between the ports in the Source Qualifier transformation and the target definition.
Note: When you need to link ports with different names, you can drag from the port of one transformation to a port of another transformation or target. If you connect the wrong columns, select the link and press the Delete key.
4. Click Layout > Arrange.
5. In the Select Targets dialog box, select the T_EMPLOYEES target, and click OK. The Designer rearranges the source, Source Qualifier transformation, and target from left to right, making it easy to see how one column maps to another.
6. Drag the lower edge of the source and Source Qualifier transformation windows until all columns appear.
7. Click Repository > Save to save the new mapping to the repository.
You create and maintain tasks and workflows in the Workflow Manager. In this lesson, you create a session and a workflow that runs the session. Before you create a session in the Workflow Manager, you need to configure database connections in the Workflow Manager.
To create a database connection:
1. Launch the Workflow Manager.
2. In the Workflow Manager, select the repository in the Navigator, and then click Repository > Connect.
3. Enter a user name and password to connect to the repository and click Connect. Use the user profile and password you entered in Table 2-3 on page 21. The native security domain is selected by default.
4. Click Connections > Relational. The Relational Connection Browser dialog box appears.
5. Click New in the Relational Connection Browser dialog box. The Select Subtype dialog box appears.
6. In the Select Subtype dialog box, select the type of database connection you want to create and click OK.
The Connection Object Definition dialog box appears with options appropriate to the selected database platform.
7. In the Name field, enter TUTORIAL_SOURCE as the name of the database connection. The Integration Service uses this name as a reference to this database connection.
8. Enter the user name and password to connect to the database.
9. Select a code page for the database connection. The source code page must be a subset of the target code page.
10. In the Attributes section, enter the database name.
11. Enter additional information necessary to connect to this database, such as the connect string, and click OK. Use the database connection information you entered for the source database in Table 2-5 on page 22. TUTORIAL_SOURCE now appears in the list of registered database connections in the Relational Connection Browser dialog box.
12. Repeat steps 5 to 11 to create another database connection called TUTORIAL_TARGET for the target database. The target code page must be a superset of the source code page. Use the database connection information you entered for the target database in Table 2-5 on page 22.
When you finish, TUTORIAL_SOURCE and TUTORIAL_TARGET appear in the list of registered database connections in the Relational Connection Browser dialog box.
13. Click Close.
You have finished configuring the connections to the source and target databases. The next step is to create a session for the mapping m_PhoneList.
To create a session:
1. In the Workflow Manager Navigator, double-click the tutorial folder to open it.
2. Click Tools > Task Developer to open the Task Developer.
3. Click Tasks > Create. The Create Task dialog box appears.
4. Select Session as the task type to create.
5. Enter s_PhoneList as the session name and click Create.
6. Select the mapping m_PhoneList and click OK. The Workflow Manager creates a reusable Session task in the Task Developer workspace.
7. Click Done in the Create Task dialog box.
8. In the workspace, double-click s_PhoneList to open the session properties. The Edit Tasks dialog box appears. You use the Edit Tasks dialog box to configure and edit session properties, such as source and target database connections, performance properties, log options, and partitioning information. In this lesson, you use most default settings. You select the source and target database connections.
9. Click the Mapping tab.
10. Select Sources in the Transformations pane on the left.
11. In the Connections settings on the right, click the Browse Connections button in the Value column for the SQ_EMPLOYEES - DB Connection. The Relational Connection Browser appears.
12. Select TUTORIAL_SOURCE and click OK.
13. Select Targets in the Transformations pane.
14. In the Connections settings, click the Edit button in the Value column for the T_EMPLOYEES - DB Connection. The Relational Connection Browser appears.
15. Select TUTORIAL_TARGET and click OK.
16. Click the Properties tab.
17. Select a session sort order associated with the Integration Service code page. For English data, use the Binary sort order.
These are the session properties you need to define for this session.
18. Click OK to close the session properties with the changes you made.
19. Click Repository > Save to save the new session to the repository.
You have created a reusable session. The next step is to create a workflow that runs the session.
Creating a Workflow
You create workflows in the Workflow Designer. When you create a workflow, you can include reusable tasks that you create in the Task Developer. You can also include non-reusable tasks that you create in the Workflow Designer. In the following steps, you create a workflow that runs the session s_PhoneList.
To create a workflow:
1. Click Tools > Workflow Designer.
2. In the Navigator, expand the tutorial folder, and then expand the Sessions node.
3. Drag the session s_PhoneList to the Workflow Designer workspace.
4. Enter wf_PhoneList as the name for the workflow. The naming convention for workflows is wf_WorkflowName.
5. Click the Browse Integration Services button to choose an Integration Service to run the workflow. The Integration Service Browser dialog box appears.
6. Select the appropriate Integration Service and click OK.
7. Click the Properties tab to view the workflow properties.
8. Enter wf_PhoneList.log for the workflow log file name.
9. Click the Scheduler tab.
By default, the workflow is scheduled to run on demand. The Integration Service only runs the workflow when you manually start the workflow. You can configure workflows to run on a schedule. For example, you can schedule a workflow to run once a day or run on the last day of the month. Click the Edit Scheduler button to configure schedule options.
10. Accept the default schedule for this workflow.
11. Click OK to close the Create Workflow dialog box. The Workflow Manager creates a new workflow in the workspace, including the reusable session you added. All workflows begin with the Start task, but you need to instruct the Integration Service which task to run next. To do this, you link tasks in the Workflow Manager.
Note: You can click Workflows > Edit to edit the workflow properties at any time.
12. Click Tasks > Link Tasks.
13. Drag from the Start task to the Session task.
14. Click Repository > Save to save the workflow in the repository. You can now run and monitor the workflow.
In the Workflow Manager, click Tools > Options. In the General tab, select Launch Workflow Monitor When Workflow Is Started. Click OK.
Next, you run the workflow and open the Workflow Monitor.
To run a workflow:
1. Verify the workflow is open in the Workflow Designer.
2. In the Workflow Manager, click Workflows > Start Workflow.
Tip: You can also right-click the workflow in the Navigator and select Start Workflow.
The Workflow Monitor opens, connects to the repository, and opens the tutorial folder.
3. Click the Gantt Chart tab at the bottom of the Time window to verify the Workflow Monitor is in Gantt Chart view.
4. In the Navigator, expand the node for the workflow. All tasks in the workflow appear in the Navigator.
Previewing Data
You can preview the data that the Integration Service loaded in the target with the Preview Data option.
To preview relational target data:
1. Open the Designer.
2. Click the Mapping Designer button.
3. In the mapping m_PhoneList, right-click the target definition T_EMPLOYEES and choose Preview Data. The Preview Data dialog box appears.
4. In the ODBC data source field, select the data source name that you used to create the target table.
5. Enter the database user name, owner name, and password.
6. Enter the number of rows you want to preview.
7. Click Connect. The Preview Data dialog box displays the data that you loaded to T_EMPLOYEES.
8. Click Close.
You can preview relational tables, fixed-width and delimited flat files, and XML files with the Preview Data option.
Tutorial Lesson 4
This chapter includes the following topics:
Using Transformations
Creating a New Target Definition and Target
Creating a Mapping with Aggregate Values
Designer Tips
Creating a Session and Workflow
Using Transformations
In this lesson, you create a mapping that contains a source, multiple transformations, and a target. A transformation is a part of a mapping that generates or modifies data. Every mapping includes a Source Qualifier transformation, representing all data read from a source and temporarily stored by the Integration Service. In addition, you can add transformations that calculate a sum, look up a value, or generate a unique ID before the source data reaches the target. Table 6-1 lists the transformations displayed in the Transformation toolbar in the Designer:
Table 6-1. Transformation Descriptions
Aggregator. Performs aggregate calculations.
Application Source Qualifier. Represents the rows that the Integration Service reads from an application, such as an ERP source, when it runs a workflow.
Application Multi-Group Source Qualifier. Represents the rows that the Integration Service reads from an application, such as a TIBCO source, when it runs a workflow. Sources that require an Application Multi-Group Source Qualifier can contain multiple groups.
Custom. Calls a procedure in a shared library or DLL.
Expression. Calculates a value.
External Procedure. Calls a procedure in a shared library or in the COM layer of Windows.
Filter. Filters data.
Input. Defines mapplet input rows. Available in the Mapplet Designer.
Joiner. Joins data from different databases or flat file systems.
Lookup. Looks up values.
Note: The Advanced Transformation toolbar contains transformations such as Java, SQL, and XML Parser transformations.
In this lesson, you complete the following tasks:
1. Create a new target definition to use in a mapping, and create a target table based on the new target definition.
2. Create a mapping using the new target definition. Add the following transformations to the mapping:
Lookup transformation. Finds the name of a manufacturer.
Aggregator transformation. Calculates the maximum, minimum, and average price of items from each manufacturer.
Expression transformation. Calculates the average profit of items, based on the average price.
3. Learn some tips for using the Designer.
4. Create a session and workflow to run the mapping, and monitor the workflow in the Workflow Monitor.
Note: You can also manually create a target definition, import the definition for an existing target from a database, or create a relational target from a transformation in the Designer.
To create the new target definition:
1. Open the Designer, connect to the repository, and open the tutorial folder.
2. Click Tools > Target Designer.
3. Drag the MANUFACTURERS source definition from the Navigator to the Target Designer workspace. The Designer creates a target definition, MANUFACTURERS, with the same column definitions as the MANUFACTURERS source definition and the same database type. Next, you add target column definitions.
4. Double-click the MANUFACTURERS target definition to open it. The Edit Tables dialog box appears.
5. Click Rename and name the target definition T_ITEM_SUMMARY.
6. Optionally, change the database type for the target definition. You can select the correct database type when you edit the target definition.
7. Click the Columns tab. The target column definitions are the same as the MANUFACTURERS source definition.
8. For the MANUFACTURER_NAME column, change precision to 72, and clear the Not Null column.
9. Add the following columns with the Money datatype, and select Not Null: MAX_PRICE, MIN_PRICE, AVG_PRICE, and AVG_PROFIT. Use the default precision and scale with the Money datatype. If the Money datatype does not exist in the database, use Number (p,s) or Decimal. Change the precision to 15 and the scale to 2.
10. Click Apply.
11. Click the Indexes tab to add an index to the target table.
If the target database is Oracle, skip to the final step. You cannot add an index to a column that already has the PRIMARY KEY constraint added to it.
12. In the Indexes section, click the Add button.
13. Enter IDX_MANUFACTURER_ID as the name of the new index, and then press Enter.
14. Select the Unique index option.
15. In the Columns section, click Add. The Add Column To Index dialog box appears. It lists the columns you added to the target definition.
16. Select MANUFACTURER_ID and click OK.
17. Click OK to save the changes to the target definition, and then click Repository > Save.
To create the target table:
1. Select the table T_ITEM_SUMMARY, and then click Targets > Generate/Execute SQL.
2. In the Database Object Generation dialog box, connect to the target database.
3. Click Generate from Selected tables, and select the Create Table, Primary Key, and Create Index options. Leave the other options unchanged.
4. Click Generate and Execute. The Designer notifies you that the file MKT_EMP.SQL already exists.
5. Click OK to override the contents of the file and create the target table. The Designer runs the SQL script to create the T_ITEM_SUMMARY table.
6. Click Close.
The mapping you create next performs the following tasks:
Finds the most expensive and least expensive item in the inventory for each manufacturer. Use an Aggregator transformation to perform these calculations.
Calculates the average price and profitability of all items from a given manufacturer. Use an Aggregator and an Expression transformation to perform these calculations.
You need to configure the mapping to perform both simple and aggregate calculations. For example, use the MIN and MAX functions to find the most and least expensive items from each manufacturer.
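The aggregate part of the mapping is the pipeline equivalent of a grouped query. If you computed the same summary directly in SQL against the ITEMS source, it would look roughly like this (a sketch; the finished mapping also supplies the manufacturer name and the profit column):

SELECT MANUFACTURER_ID,
       MAX(PRICE) AS MAX_PRICE,
       MIN(PRICE) AS MIN_PRICE,
       AVG(PRICE) AS AVG_PRICE
FROM ITEMS
GROUP BY MANUFACTURER_ID;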
To create the new mapping:
1. Switch from the Target Designer to the Mapping Designer.
2. Click Mappings > Create. When prompted to close the current mapping, click Yes.
3. In the Mapping Name dialog box, enter m_ItemSummary as the name of the mapping.
4. From the list of sources in the tutorial folder, drag the ITEMS source definition into the mapping.
5. From the list of targets in the tutorial folder, drag the T_ITEM_SUMMARY target definition into the mapping.
To create the Aggregator transformation:
1. Click Transformation > Create to create an Aggregator transformation.
2. Click Aggregator and name the transformation AGG_PriceCalculations. Click Create, and then click Done. The naming convention for Aggregator transformations is AGG_TransformationName. The Mapping Designer adds an Aggregator transformation to the mapping.
3. Click Layout > Link Columns.
When you drag ports from one transformation to another, the Designer copies the port description and links the original port to its copy. If you click Layout > Copy Columns, every port you drag is copied, but not linked.
4. From the Source Qualifier transformation, drag the PRICE column into the Aggregator transformation. A copy of the PRICE port now appears in the new Aggregator transformation. The new port has the same name and datatype as the port in the Source Qualifier transformation. The Aggregator transformation receives data from the PRICE port in the Source Qualifier transformation. You need this information to calculate the maximum, minimum, and average product price for each manufacturer.
5. Drag the MANUFACTURER_ID port into the Aggregator transformation. You need another input port, MANUFACTURER_ID, to provide the information for the equivalent of a GROUP BY statement. By adding this second input port, you can define the groups (in this case, manufacturers) for the aggregate calculation. This organizes the data by manufacturer.
6. Double-click the Aggregator transformation, and then click the Ports tab.
7. Clear the Output (O) column for PRICE. You want to use this port as an input (I) only, not as an output (O). Later, you use data from PRICE to calculate the average, maximum, and minimum prices.
8. Select the Group By option for the MANUFACTURER_ID column. When you select the Group By option for MANUFACTURER_ID, the Integration Service groups all incoming rows by manufacturer ID when it runs the session.
9. Click the Add button three times to add three new ports.
10. Configure the three new ports as output ports named OUT_MAX_PRICE, OUT_MIN_PRICE, and OUT_AVG_PRICE.
Tip: You can select each port and click the Up and Down buttons to position the output ports after the input ports in the list.
Now, you need to enter the expressions for all three output ports, using the functions MAX, MIN, and AVG to perform aggregate calculations.
To enter the expressions:
1. Click the open button in the Expression column of the OUT_MAX_PRICE port to open the Expression Editor.
2. The Formula section of the Expression Editor displays the expression as you develop it. Use other sections of this dialog box to select the input ports to provide values for an expression, enter literals and operators, and select functions to use in the expression.
3. Double-click the Aggregate heading in the Functions section of the dialog box. A list of all aggregate functions now appears.
4. Double-click the MAX function on the list. The MAX function appears in the window where you enter the expression. To perform the calculation, you need to add a reference to an input port that provides data for the expression.
5. Move the cursor between the parentheses next to MAX.
6. Click the Ports tab. This section of the Expression Editor displays all the ports from all transformations appearing in the mapping.
7. Double-click the PRICE port in the AGG_PriceCalculations transformation.
A reference to this port now appears within the expression. The final step is to validate the expression.
8. Click Validate. The Designer displays a message that the expression parsed successfully. The syntax you entered has no errors.
9. Click OK to close the message box from the parser, and then click OK again to close the Expression Editor.
To enter the remaining expressions:
1. Enter and validate the following expressions for the other two output ports:
Port            Expression
OUT_MIN_PRICE   MIN(PRICE)
OUT_AVG_PRICE   AVG(PRICE)
Both MIN and AVG appear in the list of Aggregate functions, along with MAX.
2. Click OK to close the Edit Transformations dialog box.
3. Click Repository > Save and view the messages in the Output window. When you save changes to the repository, the Designer validates the mapping. You may notice an error message indicating that you have not connected the targets. You connect the targets later in this lesson.
To add the Expression transformation:
1. Click Transformation > Create.
2. Select Expression and name the transformation EXP_AvgProfit. Click Create, and then click Done. The naming convention for Expression transformations is EXP_TransformationName. The Mapping Designer adds an Expression transformation to the mapping.
3. Open the Expression transformation.
4. Add a new input port, IN_AVG_PRICE, using the Decimal datatype with precision of 19 and scale of 2.
5. Add a new output port, OUT_AVG_PROFIT, using the Decimal datatype with precision of 19 and scale of 2.
Note: Verify OUT_AVG_PROFIT is an output port, not an input/output port. You cannot enter expressions in input/output ports.
6. Enter the following expression for OUT_AVG_PROFIT: IN_AVG_PRICE * 0.2
7. Validate the expression.
8. Close the Expression Editor and then close the EXP_AvgProfit transformation.
9. Connect OUT_AVG_PRICE from the Aggregator to the new input port.
10. Click Repository > Save.
To add the Lookup transformation:
1. Create a Lookup transformation and name it LKP_Manufacturers. The naming convention for Lookup transformations is LKP_TransformationName. A dialog box prompts you to identify the source or target database to provide data for the lookup. When you run a session, the Integration Service must access the lookup table.
2. Click Source.
3. Select the MANUFACTURERS table from the list and click OK.
4. Click Done to close the Create Transformation dialog box. The Designer now adds the transformation. Use source and target definitions in the repository to identify a lookup source for the Lookup transformation. Alternatively, you can import a lookup source.
5. Open the Lookup transformation.
6. Add a new input port, IN_MANUFACTURER_ID, with the same datatype as MANUFACTURER_ID. In a later step, you connect the MANUFACTURER_ID port from the Aggregator transformation to this input port. IN_MANUFACTURER_ID receives MANUFACTURER_ID values from the Aggregator transformation. When the Lookup transformation receives a new value through this input port, it looks up the matching value from MANUFACTURERS.
Note: By default, the Lookup transformation queries and stores the contents of the lookup table before the rest of the transformation runs, so it performs the join through a local copy of the table that it has cached.
7. Click the Condition tab, and click the Add button. An entry for the first condition in the lookup appears. Each row represents one condition in the WHERE clause that the Integration Service generates when querying records.
8. Verify that the condition compares the MANUFACTURER_ID lookup table column with the IN_MANUFACTURER_ID transformation port using the = operator.
Note: If the datatypes, including precision and scale, of these two columns do not match, the Designer displays a message and marks the mapping invalid.
9. Click OK. You now have a Lookup transformation that reads values from the MANUFACTURERS table and performs lookups using values passed through the IN_MANUFACTURER_ID input port. The final step is to connect this Lookup transformation to the rest of the mapping.
10. Click Layout > Link Columns.
11. Connect the MANUFACTURER_ID output port from the Aggregator transformation to the IN_MANUFACTURER_ID input port in the Lookup transformation.
12. Click Repository > Save.
So far in this lesson, you have completed the following tasks:
Created a target definition and target table.
Created a mapping.
Added transformations.
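In SQL terms, the cached Lookup transformation behaves roughly as if it ran a query like the one below once, then probed the result by key for each row arriving from the Aggregator. This is an approximation for understanding, not the exact statement PowerCenter issues:

-- Approximate cache-build query for the MANUFACTURERS lookup
SELECT MANUFACTURER_ID, MANUFACTURER_NAME
FROM MANUFACTURERS;
-- Each IN_MANUFACTURER_ID value is then matched against the cached rows,
-- as if by: WHERE MANUFACTURER_ID = <IN_MANUFACTURER_ID value>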
To connect to the target:
1. Drag the following output ports to the corresponding input ports in the target:
Transformation  Output Port        Target Input Port
Lookup          MANUFACTURER_ID    MANUFACTURER_ID
Lookup          MANUFACTURER_NAME  MANUFACTURER_NAME
Aggregator      OUT_MIN_PRICE      MIN_PRICE
Aggregator      OUT_MAX_PRICE      MAX_PRICE
Aggregator      OUT_AVG_PRICE      AVG_PRICE
Expression      OUT_AVG_PROFIT     AVG_PROFIT
2. Click Repository > Save. Verify in the Output window that the mapping is valid.
Designer Tips
This section includes tips for using the Designer. You learn how to complete the following tasks:
Use the Overview window to navigate the workspace.
Arrange the transformations in the workspace.
Using the Overview Window
To use the Overview window:
1. Click View > Overview Window. You can also use the Toggle Overview Window icon.
2. Drag the viewing rectangle (the dotted square) within this window. As you move the viewing rectangle, the perspective on the mapping changes.
Arranging Transformations
The Designer can arrange the transformations in a mapping. When you use this option to arrange the mapping, you can arrange the transformations in normal view, or as icons.
To arrange a mapping:
1. Click Layout > Arrange. The Select Targets dialog box appears showing all target definitions in the mapping.
2. Select Iconic to arrange the transformations as icons in the workspace.
3. Select T_ITEM_SUMMARY and click OK. The following mapping shows how the Designer arranges all transformations in the pipeline connected to the T_ITEM_SUMMARY target definition.
You now have two mappings:
m_PhoneList. A pass-through mapping that reads employee names and phone numbers.
m_ItemSummary. A more complex mapping that performs simple and aggregate calculations and lookups.
You have a reusable session based on m_PhoneList. Next, you create a session for m_ItemSummary in the Workflow Manager. You create a workflow that runs both sessions.
To create the session:
1. Open the Task Developer and click Tasks > Create.
2. Create a Session task and name it s_ItemSummary. Click Create. In the Mappings dialog box, select the mapping m_ItemSummary and click OK.
3. Click Done.
4. Open the session properties for s_ItemSummary.
5. Click the Connections setting on the Mapping tab. Select the source database connection TUTORIAL_SOURCE for SQ_ITEMS. Use the database connection you created in Configuring Database Connections in the Workflow Manager on page 43.
6. Click the Connections setting on the Mapping tab. Select the target database connection TUTORIAL_TARGET for T_ITEM_SUMMARY. Use the database connection you created in Configuring Database Connections in the Workflow Manager on page 43.
7. Click OK to close the session properties.
Now that you have two sessions, you can create a workflow and include both sessions in the workflow. When you run the workflow, the Integration Service runs all sessions in the workflow, either simultaneously or in sequence, depending on how you arrange the sessions in the workflow.
To create a workflow:
1. Click Tools > Workflow Designer.
2. Click Workflows > Create to create a new workflow. If a workflow is already open, the Workflow Manager prompts you to close the current workflow. Click Yes to close any current workflow. The workflow properties appear.
3. Name the workflow wf_ItemSummary_PhoneList.
4. Click the Browse Integration Service button to select an Integration Service to run the workflow. The Integration Service Browser dialog box appears.
5. Select the Integration Service you use and click OK.
6. Click the Properties tab and select Write Backward Compatible Workflow Log File. The default name of the workflow log file is wf_ItemSummary_PhoneList.log.
7. Click the Scheduler tab. By default, the workflow is scheduled to run on demand. Keep this default.
8. Click OK to close the Create Workflow dialog box. The Workflow Manager creates a new workflow in the workspace including the Start task.
9. From the Navigator, drag the s_ItemSummary session to the workspace. Then, drag the s_PhoneList session to the workspace.
10. Click the Link Tasks button on the toolbar.
11. Drag from the Start task to the s_ItemSummary Session task.
12. Drag from the Start task to the s_PhoneList Session task.
By default, when you link both sessions directly to the Start task, the Integration Service runs both sessions at the same time when you run the workflow. If you want the Integration Service to run the sessions one after the other, connect the Start task to one session, and connect that session to the other session.
13. Click Repository > Save to save the workflow in the repository. You can now run and monitor the workflow.
To run the workflow:
1. Right-click the Start task in the workspace and select Start Workflow from Task.
Tip: You can also right-click the workflow in the Navigator and select Start Workflow.
The Workflow Monitor opens and connects to the repository and opens the tutorial folder. If the Workflow Monitor does not show the current workflow tasks, right-click the tutorial folder and select Get Previous Runs.
2. Click the Gantt Chart tab at the bottom of the Time window to verify the Workflow Monitor is in Gantt Chart view.
Note: You can also click the Task View tab at the bottom of the Time window to view the Workflow Monitor in Task view. You can switch back and forth between views at any time.
3. In the Navigator, expand the node for the workflow. All tasks in the workflow appear in the Navigator.
To view a log:
1. Right-click the workflow and select Get Workflow Log to view the Log Events window for the workflow.
-or-
Right-click a session and select Get Session Log to view the Log Events window for the session.
2. Select a row in the log. The full text of the message appears in the section at the bottom of the window.
3. Sort the log file by column by clicking on the column heading.
4. Optionally, click Find to search for keywords in the log.
5. Optionally, click Save As to save the log as an XML document.
Log Files
When you created the workflow, the Workflow Manager assigned default workflow and session log names and locations on the Properties tab. The Integration Service writes the log files to the locations specified in the session properties.
Tutorial Lesson 5
This chapter includes the following topics:
Creating Targets
Creating a Mapping with Fact and Dimension Tables
Creating a Workflow

In this lesson, you add the following transformations to the mapping:
Stored Procedure. Call a stored procedure and capture its return values.
Filter. Filter data that you do not need, such as discontinued items in the ITEMS table.
Sequence Generator. Generate unique IDs before inserting rows into the target.
You create a mapping that outputs data to a fact table and its dimension tables. Figure 7-1 shows the mapping you create in this lesson:
Figure 7-1. Mapping with Fact and Dimension Tables
Creating Targets
Before you create the mapping, create the following target tables:
F_PROMO_ITEMS. A fact table of promotional items.
D_ITEMS, D_PROMOTIONS, and D_MANUFACTURERS. Dimensional tables.
To create the new targets:
1. Open the Designer, connect to the repository, and open the tutorial folder.
2. Click Tools > Target Designer. To clear the workspace, right-click the workspace, and select Clear All.
3. Click Targets > Create.
4. In the Create Target Table dialog box, enter F_PROMO_ITEMS as the name of the new target table, select the database type, and click Create.
5. Repeat step 4 to create the other tables needed for this schema: D_ITEMS, D_PROMOTIONS, and D_MANUFACTURERS. When you have created all these tables, click Done.
6. Open each new target definition, and add the following columns to the appropriate table:
D_ITEMS
Column     Datatype  Precision  Not Null  Key
ITEM_ID    Integer   NA         Not Null  Primary Key
ITEM_NAME  Varchar   72
PRICE      Money     default

D_PROMOTIONS
Column          Datatype  Precision  Not Null  Key
PROMOTION_ID    Integer   NA         Not Null  Primary Key
PROMOTION_NAME  Varchar   72
DESCRIPTION     Varchar   default
START_DATE      Datetime  default
END_DATE        Datetime  default

D_MANUFACTURERS
Column             Datatype  Precision  Not Null  Key
MANUFACTURER_ID    Integer   NA         Not Null  Primary Key
MANUFACTURER_NAME  Varchar   72

F_PROMO_ITEMS
Column              Datatype  Precision  Not Null  Key
PROMO_ITEM_ID       Integer   NA         Not Null  Primary Key
FK_ITEM_ID          Integer   NA                   Foreign Key
FK_PROMOTION_ID     Integer   NA                   Foreign Key
FK_MANUFACTURER_ID  Integer   NA                   Foreign Key
The next step is to generate and execute the SQL script to create each of these new target tables.
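As a point of reference, the keys in the tables above correspond to DDL like the following for the fact table. This is a hedged sketch: the Designer generates the real script in the next procedure, and constraint syntax varies by database.

CREATE TABLE F_PROMO_ITEMS (
    PROMO_ITEM_ID      INTEGER NOT NULL,
    FK_ITEM_ID         INTEGER,
    FK_PROMOTION_ID    INTEGER,
    FK_MANUFACTURER_ID INTEGER,
    PRIMARY KEY (PROMO_ITEM_ID),
    FOREIGN KEY (FK_ITEM_ID)         REFERENCES D_ITEMS (ITEM_ID),
    FOREIGN KEY (FK_PROMOTION_ID)    REFERENCES D_PROMOTIONS (PROMOTION_ID),
    FOREIGN KEY (FK_MANUFACTURER_ID) REFERENCES D_MANUFACTURERS (MANUFACTURER_ID)
);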
To create the tables:
1. Select all the target definitions.
2. Click Targets > Generate/Execute SQL.
3. In the Database Object Generation dialog box, connect to the target database.
4. Select Generate from Selected Tables, and select the options for creating the tables and generating keys.
5. Click Generate and Execute.
6. Click Close.
To create the mapping:
1. In the Designer, switch to the Mapping Designer, and create a new mapping.
2. Name the mapping m_PromoItems.
3. From the list of target definitions, select the tables you just created and drag them into the mapping.
4. From the list of source definitions, add the following source definitions to the mapping:
5. Delete all Source Qualifier transformations that the Designer creates when you add these source definitions.
6. Add a Source Qualifier transformation named SQ_AllData to the mapping, and connect all the source definitions to it.
When you create a single Source Qualifier transformation, the Integration Service improves performance by issuing a single read on the source database instead of multiple reads.
7. Click View > Navigator to close the Navigator window to allow extra space in the workspace.
8. Click Repository > Save.
To create the Filter transformation:
1. Create a Filter transformation and name it FIL_CurrentItems.
2. Drag the following ports from the Source Qualifier transformation into the Filter transformation: ITEM_ID, ITEM_NAME, PRICE, and DISCONTINUED_FLAG.
3. Open the Filter transformation.
4. Click the Properties tab to specify the filter condition.
5. Click the Open button in the Filter Condition field. The Expression Editor dialog box appears.
6. Select the word TRUE in the Formula field and press Delete.
7. Click the Ports tab.
8. Enter DISCONTINUED_FLAG = 0.
9. Click Validate, and then click OK. The new filter condition now appears in the Value field.
10. Click OK to close the transformation.
Now, you need to connect the Filter transformation to the D_ITEMS target table. Currently sold items are written to this target.
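The Filter transformation plays the role of a WHERE clause inside the pipeline. The equivalent restriction expressed in SQL would read as follows (illustrative only; the filtering actually happens in the Integration Service, not in the source database):

SELECT ITEM_ID, ITEM_NAME, PRICE
FROM ITEMS
WHERE DISCONTINUED_FLAG = 0;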
To connect the Filter transformation:
1. Connect the ports ITEM_ID, ITEM_NAME, and PRICE to the corresponding columns in D_ITEMS.
2. Click Repository > Save.
The Sequence Generator transformation has the following properties:
The starting number (normally 1).
The current value stored in the repository.
The number that the Sequence Generator transformation adds to its current value for every request for a new ID.
The maximum value in the sequence.
A flag indicating whether the Sequence Generator transformation counter resets to the minimum value once it has reached its maximum value.
The Sequence Generator transformation has two output ports, NEXTVAL and CURRVAL, which correspond to the two pseudo-columns in a sequence. When you query a value from the NEXTVAL port, the transformation generates a new value. In the new mapping, you add a Sequence Generator transformation to generate IDs for the fact table F_PROMO_ITEMS. Every time the Integration Service inserts a new row into the target table, it generates a unique ID for PROMO_ITEM_ID.
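The NEXTVAL and CURRVAL ports mirror the pseudo-columns of a database sequence. For comparison, an Oracle-style sequence behaves like this (illustrative only; the Sequence Generator keeps its current value in the repository rather than in the database):

CREATE SEQUENCE PROMO_ITEM_SEQ START WITH 1 INCREMENT BY 1;
SELECT PROMO_ITEM_SEQ.NEXTVAL FROM DUAL;  -- generates and returns a new value
SELECT PROMO_ITEM_SEQ.CURRVAL FROM DUAL;  -- returns the last value generated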
To create the Sequence Generator transformation:
1. Create a Sequence Generator transformation and name it SEQ_PromoItemID.
2. Open the Sequence Generator transformation.
3. Click the Ports tab. The two output ports, NEXTVAL and CURRVAL, appear in the list.
Note: You cannot add any new ports to this transformation or reconfigure NEXTVAL and CURRVAL.
4. Click the Properties tab. The properties for the Sequence Generator transformation appear. You do not have to change any of these settings.
5. Click OK.
6. Connect the NEXTVAL column from the Sequence Generator transformation to the PROMO_ITEM_ID column in the target table F_PROMO_ITEMS.
7. Click Repository > Save.
DB2
CREATE PROCEDURE SP_GET_ITEM_COUNT
(IN ARG_ITEM_ID INT, OUT SP_RESULT INT, OUT SQLCODE_OUT INT)
LANGUAGE SQL
P1: BEGIN
-- Declare variables
DECLARE SQLCODE INT DEFAULT 0;
-- Declare handler
DECLARE EXIT HANDLER FOR SQLEXCEPTION
SET SQLCODE_OUT = SQLCODE;
SELECT COUNT(*)
INTO SP_RESULT
FROM ORDER_ITEMS
WHERE ITEM_ID = ARG_ITEM_ID;
SET SQLCODE_OUT = SQLCODE;
END P1
Teradata
CREATE PROCEDURE SP_GET_ITEM_COUNT
(IN ARG_ITEM_ID INTEGER, OUT SP_RESULT INTEGER)
BEGIN
SELECT COUNT(*)
INTO :SP_RESULT
FROM ORDER_ITEMS
WHERE ITEM_ID = :ARG_ITEM_ID;
END;
In the mapping, add a Stored Procedure transformation to call this procedure. The Stored Procedure transformation returns the number of orders containing an item to an output port.
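Before wiring the procedure into the mapping, you can sanity-check it directly in the database. On DB2, for example, a call looks roughly like this; the item ID 10 is an arbitrary illustrative value, and the exact syntax for binding OUT parameters varies by client:

CALL SP_GET_ITEM_COUNT(10, ?, ?);
-- the first ? receives SP_RESULT (the order count),
-- the second ? receives SQLCODE_OUT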
To create the Stored Procedure transformation:
1. Create a Stored Procedure transformation and name it SP_GET_ITEM_COUNT. The Import Stored Procedure dialog box appears.
2. Select the ODBC connection for the source database. Enter a user name, owner name, and password. Click Connect.
3. Select the stored procedure named SP_GET_ITEM_COUNT from the list and click OK.
4. In the Create Transformation dialog box, click Done. The Stored Procedure transformation appears in the mapping.
5. Open the Stored Procedure transformation, and click the Properties tab.
6. Click the Open button in the Connection Information section. The Select Database dialog box appears.
7. Select the source database and click OK. You can call stored procedures in both source and target databases.
Note: You can also select the built-in database connection variable, $Source. When you use $Source or $Target, the Integration Service determines which source database connection to use when it runs the session. If it cannot determine which connection to use, it fails the session.
8. Click OK.
9. Connect the ITEM_ID column from the Source Qualifier transformation to the ITEM_ID column in the Stored Procedure transformation.
10. Connect the RETURN_VALUE column from the Stored Procedure transformation to the NUMBER_ORDERED column in the target table F_PROMO_ITEMS.
11. Click Repository > Save.
To complete the mapping:
1. Connect the following columns from the Source Qualifier transformation to the targets:
Source Qualifier   Target Table     Column
PROMOTION_ID       D_PROMOTIONS     PROMOTION_ID
PROMOTION_NAME     D_PROMOTIONS     PROMOTION_NAME
DESCRIPTION        D_PROMOTIONS     DESCRIPTION
START_DATE         D_PROMOTIONS     START_DATE
END_DATE           D_PROMOTIONS     END_DATE
MANUFACTURER_ID    D_MANUFACTURERS  MANUFACTURER_ID
MANUFACTURER_NAME  D_MANUFACTURERS  MANUFACTURER_NAME
2. Click Repository > Save.
The mapping is now complete. You can create and run a workflow with this mapping.
Creating a Workflow
In this part of the lesson, you complete the following steps:
1. Create a workflow.
2. Add a non-reusable session to the workflow.
3. Define a link condition before the Session task.
To create the workflow:
1. Click Tools > Workflow Designer.
2. Click Workflows > Create to create a new workflow.
3. Name the workflow wf_PromoItems.
4. Click the Browse Integration Service button to select the Integration Service to run the workflow. The Integration Service Browser dialog box appears.
5. Select the Integration Service you use and click OK.
6. Click the Scheduler tab. By default, the workflow is scheduled to run on demand. Keep this default.
7. Click OK to close the Create Workflow dialog box. The Workflow Manager creates a new workflow in the workspace including the Start task.
To add a non-reusable session:
1. Click Tasks > Create. The Create Task dialog box appears. The Workflow Designer provides more task types than the Task Developer. These tasks include the Email and Decision tasks.
2. Create a Session task and name it s_PromoItems. Click Create.
3. In the Mappings dialog box, select the mapping m_PromoItems and click OK.
4. Click Done.
5. Open the session properties for s_PromoItems.
6. Click the Mapping tab.
7. Select the source database connection for the sources connected to the SQ_AllData Source Qualifier transformation.
8. Select the target database for each target definition.
9. Click OK to save the changes.
10. Click the Link Tasks button on the toolbar.
11. Drag from the Start task to s_PromoItems.
12. Click Repository > Save to save the workflow in the repository.
In the following steps, you create a link condition before the Session task and use the built-in workflow variable WORKFLOWSTARTTIME. You define the link condition so the Integration Service runs the session if the workflow start time is before the date you specify.
To define a link condition:
1. Double-click the link from the Start task to the Session task. The Expression Editor appears.
2. Expand the Built-in node on the PreDefined tab. The Workflow Manager displays the two built-in workflow variables, SYSDATE and WORKFLOWSTARTTIME.
3. Enter the following expression in the expression window. Be sure to enter a date later than today's date:
WORKFLOWSTARTTIME < TO_DATE('8/30/2007','MM/DD/YYYY')
Tip: You can double-click the built-in workflow variable on the PreDefined tab and double-click the TO_DATE function on the Functions tab to enter the expression quickly.
4. Press Enter to create a new line in the Expression. Add a comment by typing the following text:
5. Click Validate to validate the expression. The Workflow Manager displays a message in the Output window.
6. Click OK.
After you specify the link condition in the Expression Editor, the Workflow Manager validates the link condition and displays it next to the link in the workflow.
7. Click Repository > Save.
To run the workflow:
1. Right-click the workflow in the Navigator and select Start Workflow. The Workflow Monitor opens and connects to the repository and opens the tutorial folder.
2. Click the Gantt Chart tab at the bottom of the Time window to verify the Workflow Monitor is in Gantt Chart view.
3. In the Navigator, expand the node for the workflow. All tasks in the workflow appear in the Navigator.
4. In the Properties window, click Session Statistics to view the workflow results. If the Properties window is not open, click View > Properties View. The results from running the s_PromoItems session are as follows:
F_PROMO_ITEMS    40 rows inserted
D_ITEMS          13 rows inserted
D_MANUFACTURERS  11 rows inserted
D_PROMOTIONS      3 rows inserted
Tutorial Lesson 6
This chapter includes the following topics:
Using XML Files
Creating the XML Source
Creating the Target Definition
Creating a Mapping with XML Sources and Targets
Creating a Workflow
To import the XML source definition:
1. Open the Designer, connect to the repository, and open the tutorial folder.
2. Click Tools > Source Analyzer.
3. Click Sources > Import XML Definition.
4. Click Advanced Options.
The Change XML Views Creation and Naming Options dialog box opens.
5. Select Override All Infinite Lengths and enter 50.
6. Configure all other options as shown and click OK to save the changes.
7. In the Import XML Definition dialog box, navigate to the client\bin directory under the PowerCenter installation directory and select the Employees.xsd file. Click Open. The XML Definition Wizard opens.
8. Verify that the name for the XML definition is Employees and click Next.
9. Because you only need to work with a few elements and attributes in the Employees.xsd file, you skip creating a definition with the XML Wizard. Instead, create a custom view in the XML Editor. With a custom view in the XML Editor, you can exclude the elements and attributes that you do not need in the mapping.
10. Select Skip Create XML Views and click Finish. When you skip creating XML views, the Designer imports metadata into the repository, but it does not create the XML view. In the next step, you use the XML Editor to add groups and columns to the XML view.
To work with these three instances separately, you pivot them to create three separate columns in the XML definition. You create a custom XML view with columns from several groups. You then pivot the occurrence of SALARY to create the columns, BASESALARY, COMMISSION, and BONUS. Figure 8-2 shows the XML Editor:
Figure 8-2. XML Editor
To edit the XML definition:
1. Double-click the XML definition or right-click the XML definition and select Edit XML Definition to open the XML Editor.
2. Click XMLViews > Create XML View to create a new XML view.
3. From the EMPLOYEE group, select DEPTID and right-click it.
4. Choose Show XPath Navigator.
5. Expand the EMPLOYMENT group so that the SALARY column appears.
6. From the XPath Navigator, select the following elements and attributes and drag them into the new view:
7. Click the Mode icon on the XPath Navigator and choose Advanced Mode.
8. Select the SALARY column and drag it into the XML view.
Note: The XPath Navigator must include the EMPLOYEE column at the top when you drag SALARY to the XML view.
The resulting view includes the elements and attributes shown in the following view:
9. Drag the SALARY column into the new XML view two more times to create three pivoted columns.
Note: Although the new columns appear in the column window, the view shows one instance of SALARY.
The wizard adds three new columns in the column view and names them SALARY, SALARY0, and SALARY1.
10. Rename the new columns. Use the information in the following table to modify the name and pivot properties:
Column Name  New Column Name  Not Null  Pivot Occurrence
SALARY       BASESALARY       Yes       1
SALARY0      COMMISSION                 2
SALARY1      BONUS                      3
Note: To update the pivot occurrence, click the Xpath of the column you want to edit. The Specify Query Predicate for Xpath window appears. Select the column name and change the pivot occurrence.
11. Click File > Apply Changes to save the changes to the view.
12. Click File > Exit to close the XML Editor. The following source definition appears in the Source Analyzer.
Note: The pivoted SALARY columns do not display the names you entered in the Columns window. However, when you drag the ports to another transformation, the edited column names appear in the transformation.
13. Click Repository > Save to save the changes to the XML definition.
Each department has a separate target and the structure for each target is the same.
Each target contains salary and department information for employees in the Sales or Engineering department.
Because the structure for the target data is the same for the Engineering and Sales groups, use two instances of the target definition in the mapping. In the following steps, you import the Sales_Salary schema file and create a custom view based on the schema.
To import and edit the XML target definition:
1. In the Designer, switch to the Target Designer. If the workspace contains targets from other lessons, right-click the workspace and choose Clear All.
2. Click Targets > Import XML Definition.
3. Navigate to the Tutorial directory in the PowerCenter installation directory, and select the Sales_Salary.xsd file. Click Open. The XML Definition Wizard appears.
4. Name the XML definition SALES_SALARY and click Next.
5. Select Skip Create XML Views and click Finish. The XML Wizard creates the SALES_SALARY target with no columns or groups.
6. Double-click the XML definition to open the XML Editor.
7. Click XMLViews > Create XML View. The XML Editor creates an empty view.
8. Right-click the DEPARTMENT group in the Schema Navigator and select Show XPath Navigator.
9. From the XPath Navigator, drag DEPTNAME and DEPTID into the empty XML view. The XML Editor names the view X_DEPARTMENT.
Note: The XML Editor may transpose the order of the attributes DEPTNAME and DEPTID. If this occurs, add the columns in the order they appear in the Schema Navigator. Transposing the order of attributes does not affect data consistency.
10. In the X_DEPARTMENT view, right-click the DEPTID column, and choose Set as Primary Key.
11. Click XMLViews > Create XML View. The XML Editor creates an empty view.
12. From the EMPLOYEE group in the Schema Navigator, open the XPath Navigator.
13. From the XPath Navigator, drag EMPID, FIRSTNAME, LASTNAME, and TOTALSALARY into the empty XML view. The XML Editor names the view X_EMPLOYEE.
14. Right-click the X_EMPLOYEE view and choose Create Relationship. Drag the pointer from the X_EMPLOYEE view to the X_DEPARTMENT view to create a link.
15. The XML Editor creates a DEPARTMENT foreign key in the X_EMPLOYEE view that corresponds to the DEPTID primary key.
16. Click File > Apply Changes and close the XML Editor. The XML definition now contains the groups DEPARTMENT and EMPLOYEE.
17. Click Repository > Save.
In the following steps, you create a mapping that contains the following components:
The Employees XML source definition you created.
The DEPARTMENT relational source definition you created in Creating Source Definitions on page 31.
Two instances of the SALES_SALARY target definition you created.
An Expression transformation to calculate the total salary for each employee.
You pass the data from the Employees source through the Expression and Router transformations before sending it to two target instances. You also pass data from the relational table through another Router transformation to add the department names to the targets. You need data for the sales and engineering departments.
To create the mapping:
1. In the Designer, switch to the Mapping Designer and create a new mapping.
2. Name the mapping m_EmployeeSalary.
3. Drag the Employees XML source definition into the mapping.
4. Drag the DEPARTMENT relational source definition into the mapping. By default, the Designer creates a source qualifier for each source.
5. Drag the SALES_SALARY target definition into the mapping two times.
6. Rename the second instance of SALES_SALARY as ENG_SALARY.
7. Click Repository > Save. Because you have not completed the mapping, the Designer displays a warning that the mapping m_EmployeeSalary is invalid.
Next, you add an Expression transformation and two Router transformations. Then, you connect the source definitions to the Expression transformation. You connect the pipeline to the Router transformations and then to the two target definitions.
To calculate the total salary:
1. Create an Expression transformation and name it EXP_TotalSalary. The new transformation appears.
2. Click Done.
3. Drag all the ports from the XML Source Qualifier transformation to the EXP_TotalSalary Expression transformation. The input/output ports in the XML Source Qualifier transformation are linked to the input/output ports in the Expression transformation.
4. Open the Expression transformation.
5. On the Ports tab, add an output port named TotalSalary. Use the Decimal datatype with precision of 10 and scale of 2.
6. Enter the following expression for TotalSalary:
BASESALARY + COMMISSION + BONUS
7. Validate the expression and click OK.
8. Click OK to close the transformation.
9. Click Repository > Save.
To route the employee salary data:
1. Create a Router transformation and name it RTR_Salary. Then click Done.
2. In the Expression transformation, select the following columns and drag them to RTR_Salary: EMPID, DEPTID, LASTNAME, FIRSTNAME, and TotalSalary.
The Designer creates an input group and adds the columns you drag from the Expression transformation.
3. Open the RTR_Salary Router transformation.
4. On the Groups tab, add two new groups. Change the group names and set the filter conditions. Use the following table as a guide:
Group Name   Filter Condition
Sales        DEPTID = 'SLS'
Engineering  DEPTID = 'ENG'
The Designer adds a default group to the list of groups. All rows that do not meet the condition you specify in the group filter condition are routed to the default group. If you do not connect the default group, the Integration Service drops the rows.
5. Click OK to close the transformation.
6. In the workspace, expand the RTR_Salary Router transformation to see all groups and ports.
7. Click Repository > Save.
Next, you create another Router transformation to filter the Sales and Engineering department data from the DEPARTMENT relational source.
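Conceptually, each Router makes a single pass over its input and behaves like several filtered queries run side by side. Illustrative SQL for the RTR_Salary groups, where EMPLOYEE_ROWS is a hypothetical stand-in for the rows arriving from EXP_TotalSalary:

-- Sales group
SELECT EMPID, DEPTID, LASTNAME, FIRSTNAME, TotalSalary
FROM EMPLOYEE_ROWS WHERE DEPTID = 'SLS';
-- Engineering group
SELECT EMPID, DEPTID, LASTNAME, FIRSTNAME, TotalSalary
FROM EMPLOYEE_ROWS WHERE DEPTID = 'ENG';
-- rows matching neither condition go to the default group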
To route the department data:
1. Create a Router transformation and name it RTR_DeptName. Then click Done.
2. Drag the DeptID and DeptName ports from the DEPARTMENT Source Qualifier transformation to the RTR_DeptName Router transformation.
3. Open RTR_DeptName.
4. On the Groups tab, add two new groups. Change the group names and set the filter conditions using the following table as a guide:

   Group Name     Filter Condition
   Sales          DEPTID = 'SLS'
   Engineering    DEPTID = 'ENG'

5. Click OK to close the transformation.
6. In the workspace, expand the RTR_DeptName Router transformation to see all groups and columns.
7. Click Repository > Save.
To connect the targets:
1. Connect the following ports from RTR_Salary groups to the ports in the XML target definitions:

   Router Group   Router Port    Target         Target Group   Target Port
   Sales          EMPID1         SALES_SALARY   EMPLOYEE       EMPID
                  DEPTID1                                      DEPTID (FK)
                  LASTNAME1                                    LASTNAME
                  FIRSTNAME1                                   FIRSTNAME
                  TotalSalary1                                 TOTALSALARY
   Engineering    EMPID3         ENG_SALARY     EMPLOYEE       EMPID
                  DEPTID3                                      DEPTID (FK)
                  LASTNAME3                                    LASTNAME
                  FIRSTNAME3                                   FIRSTNAME
                  TotalSalary3                                 TOTALSALARY

2. Connect the following ports from RTR_DeptName groups to the ports in the XML target definitions:

   Router Group   Router Port    Target         Target Group   Target Port
   Sales          DEPTID1        SALES_SALARY   DEPARTMENT     DEPTID
                  DEPTNAME1                                    DEPTNAME
   Engineering    DEPTID3        ENG_SALARY     DEPARTMENT     DEPTID
                  DEPTNAME3                                    DEPTNAME

3. Click Repository > Save.
The mapping is now complete. When you save the mapping, the Designer displays a message that the mapping m_EmployeeSalary is valid.
Creating a Workflow
In the following steps, you create a workflow with a non-reusable session to run the mapping you just created.
Note: Before you run the workflow based on the XML mapping, verify that the Integration Service that runs the workflow can access the source XML file. Copy the Employees.xml file from the Tutorial folder to the $PMSourceFileDir directory for the Integration Service. Usually, this is the SrcFiles directory in the Integration Service installation directory.
To create the workflow:
1. Open the Workflow Manager.
2. Connect to the repository and open the tutorial folder.
3. Go to the Workflow Designer.
4. Click Workflows > Wizard. The Workflow Wizard opens.
5. Name the workflow wf_EmployeeSalary and select a service on which to run the workflow. Then click Next.
6. Select the m_EmployeeSalary mapping for the session.
7. Click Next.
8. Click Run on demand and click Next. The Workflow Wizard displays the settings you chose.
9. Click Finish to create the workflow. The Workflow Wizard creates a Start task and session. You can add other tasks to the workflow later.
10. Click Repository > Save to save the new workflow.
11. Double-click the s_m_EmployeeSalary session to open it for editing.
12. Click the Mapping tab.
13. Select the connection for the SQ_DEPARTMENT Source Qualifier transformation.
14. Verify that the Employees.xml file is in the specified source file directory.
15. Click the ENG_SALARY target instance on the Mapping tab and verify that the output file name is eng_salary.xml.
16. Click the SALES_SALARY target instance on the Mapping tab and verify that the output file name is sales_salary.xml.
17. Click OK to close the session.
18. Click Repository > Save.
19. Run and monitor the workflow. The Integration Service creates the eng_salary.xml and sales_salary.xml files.
Naming Conventions
Transformations
Table A-1 lists the recommended naming convention for transformations:
Table A-1. Naming Conventions for Transformations
Transformation                   Naming Convention
Aggregator                       AGG_TransformationName
Application Source Qualifier     ASQ_TransformationName
Custom                           CT_TransformationName
Expression                       EXP_TransformationName
External Procedure               EXT_TransformationName
Filter                           FIL_TransformationName
HTTP                             HTTP_TransformationName
Java                             JTX_TransformationName
Joiner                           JNR_TransformationName
Lookup                           LKP_TransformationName
MQ Source Qualifier              SQ_MQ_TransformationName
Normalizer                       NRM_TransformationName
Rank                             RNK_TransformationName
Router                           RTR_TransformationName
Sequence Generator               SEQ_TransformationName
Sorter                           SRT_TransformationName
Targets
The naming convention for targets is: T_TargetName.
Mappings
The naming convention for mappings is: m_MappingName.
Mapplets
The naming convention for mapplets is: mplt_MappletName.
Sessions
The naming convention for sessions is: s_MappingName.
Worklets
The naming convention for worklets is: wl_WorkletName.
Workflows
The naming convention for workflows is: wf_WorkflowName.
Introduction

Course Objectives
By the end of this course you will:
- Understand how to use the major PowerCenter components for development
- Be able to build basic ETL mappings and mapplets (a mapplet is a subset of a mapping)
- Be able to create, run and monitor workflows
- Understand available options for loading target data
- Be able to troubleshoot most problems
Note: The course does not cover PowerCenter optional features or XML support.
About Informatica
- Founded in 1993
- Leader in enterprise solution products
- Headquarters in Redwood City, CA
- Public company since April 1999 (INFA)
- 2000+ customers, including over 80% of the Fortune 100
- Strategic partnerships with IBM, HP, Accenture, SAP, and many others
- Worldwide distributorship
Informatica Products
- PowerCenter: ETL batch and real-time data integration
- PowerAnalyzer: BI reporting web-browser interface with reports, dashboards, indicators, alerts; handles real-time metrics
- SuperGlue: centralized metadata browsing across the enterprise, including PowerCenter, PowerAnalyzer, DBMS, BI tools, and data modeling tools
- PowerExchange: data access to mainframe, mid-size system and complex files
- PowerCenter Connect products: data access to transactional applications and real-time services
Informatica Resources
- www.informatica.com provides information (under Services) on:
  - Professional Services
  - Education Services
- my.informatica.com: sign up to access:
  - Technical Support
  - Product documentation (under Tools, online documentation)
  - Velocity Methodology (under Services)
  - Knowledgebase
  - Webzine
  - Mapping templates
- devnet.informatica.com: sign up for the Informatica Developers Network:
  - Discussion forums
  - Web seminars
  - Technical papers
(Diagram: operational source systems hold current, transaction-level data, normalized or de-normalized, optimized for transaction response time. The ETL process extracts that data, then aggregates, cleanses and consolidates it, applies business rules, and de-normalizes it for the warehouse.)
PowerCenter 7 Architecture
(Diagram: sources and targets connect natively to the Informatica Server, which communicates over TCP/IP with the Repository Server; the Repository Server reads and writes the Repository natively. Client tools shown: Designer, Workflow Manager, Workflow Monitor, Repository Manager and the Repository Server Administrative Console. Not shown: client ODBC connections from the Designer to sources and targets for metadata.)

Platforms:
- Client tools run on Windows
- Servers run on AIX, HP-UX, Solaris, Red Hat Linux, Windows
- Repositories run on any major RDBMS
PowerCenter development follows six basic steps:
1. Create Source definition(s)
2. Create Target definition(s)
3. Create a Mapping
4. Create a Session Task
5. Create a Workflow with Task components
6. Run the Workflow and verify the results
Source definitions can be imported from a relational database, a flat file, or an XML object, or created manually.

(Diagrams: for each import path, the Designer sends the definition (DEF) over TCP/IP through the Repository Server to the Repository Agent, which writes it natively to the Repository.)
XML definitions can be imported from an XML schema (XSD), a DTD, or an XML file.

Data Previewer
Preview data in:
- Relational database sources
- Flat file sources
- Relational database targets
- Flat file targets
Metadata Extensions
Metadata extensions allow developers and partners to extend the metadata stored in the Repository. Metadata extensions can be:
- User-defined: PowerCenter users can define and create their own metadata
- Vendor-defined: third-party application vendors create metadata lists; for example, applications such as Ariba or PowerCenter Connect for Siebel can add information such as contacts, version, etc.
Metadata extensions:
- Can be reusable or non-reusable
- A non-reusable metadata extension can be promoted to reusable, but this is irreversible (except by an Administrator)
- Reusable metadata extensions are associated with all repository objects of that object type
- A non-reusable metadata extension is associated with a single repository object
- Administrator or Super User privileges are required to manage reusable metadata extensions
(Diagram: relational source definitions can be imported from database tables, views, or synonyms.)
Heterogeneous Targets
By the end of this section you will be familiar with:
- Heterogeneous target types
- Heterogeneous target limitations
- Target conversions
(Diagram: a single mapping loads two Oracle tables and a flat file. The tables are either in two different databases, or require different (schema-specific) connect strings; one target is a flat file load.)

Target conversions:
- Relational target to flat file target
- Relational target to any other relational database type
Transformation Views
A transformation has three views:
- Iconized: shows the transformation in relation to the rest of the mapping
- Normal: shows the flow of data through the transformation
- Edit: shows transformation ports (= table columns) and properties, and allows editing
Expression Transformation
Performs calculations using non-aggregate functions at row level.
Ports: mixed; variable ports allowed. Create the expression in an output or variable port.
Usage: performs the majority of data manipulation.
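For example, a row-level expression in an output port might compute a salary-with-commission value from the tutorial's ports (a sketch; assumes COMMISSION can be null):

IIF(ISNULL(COMMISSION), BASESALARY, BASESALARY + COMMISSION)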
Expression Editor
An expression formula is a calculation or conditional statement for a specific port in a transformation. It performs its calculation based on ports, functions, operators, variables, constants and return values from other transformations.
Expression Validation
The Validate or OK button in the Expression Editor will:
- Parse the current expression
- Perform remote port searching (resolves references to ports in other transformations)
- Parse default values
- Check spelling, the correct number of arguments in functions, and other syntactical errors
Character Functions
Used to manipulate character data:
- CHRCODE returns the numeric value (ASCII or Unicode) of the first character of the string passed to this function
- CONCAT is for backward compatibility only; use || instead
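For example (PROD_DESC, FIRST_NAME and LAST_NAME being hypothetical string ports):

CHRCODE(PROD_DESC)                -- numeric code of the first character
FIRST_NAME || ' ' || LAST_NAME    -- preferred over CONCAT(FIRST_NAME, LAST_NAME)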
Conversion Functions
Used to convert datatypes:
- TO_CHAR (numeric)
- TO_DATE
- TO_DECIMAL
- TO_FLOAT
- TO_INTEGER
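For example, converting hypothetical string ports to typed values:

TO_DATE(DATE_STR, 'MM/DD/YYYY')   -- string to date/time
TO_DECIMAL(AMOUNT_STR, 2)         -- string to decimal with scale 2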
Data Cleansing Functions
Used to process data during data cleansing. METAPHONE and SOUNDEX create indexes based on English pronunciation (two different standards).
Date Functions
Used to round, truncate, or compare dates; extract one part of a date; or perform arithmetic on a date. To pass a string to a date function, first use the TO_DATE function to convert it to a date/time datatype.
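For example (ORDER_DATE being a hypothetical date/time port):

ADD_TO_DATE(ORDER_DATE, 'DD', 30)   -- 30 days after the order date
TRUNC(ORDER_DATE, 'MM')             -- first day of the order month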
Test Functions
Used to test if a lookup result is null, and to validate data.
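For example, a sketch that validates a hypothetical string port before converting it:

IIF(IS_NUMBER(QTY_STR), TO_INTEGER(QTY_STR), 0)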
Variable Ports
- Can be used in another variable port or an output port expression
- Local to the transformation (a variable port cannot also be an input or output port)
- Used for temporary storage
- Variable ports can remember values across rows, which is useful for comparing values
- Variables are initialized (numeric to 0, string to '') when the mapping logic is processed
- Variable ports are not visible in Normal view, only in Edit view
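As a sketch of the across-rows behavior, a change-detection pattern can use two variable ports and one output port (all names hypothetical; because variable ports are evaluated top to bottom, v_PREV_ID still holds the previous row's value when v_IS_NEW is computed):

v_IS_NEW:  IIF(CUST_ID = v_PREV_ID, 0, 1)   -- compare with the previous row
v_PREV_ID: CUST_ID                          -- then remember the current row's id
o_IS_NEW:  v_IS_NEW                         -- expose the flag on an output port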
Informatica Datatypes
Native datatypes are specific to the source and target database types, and display in source and target tables within the Mapping Designer. Transformation datatypes allow mix and match of source and target database types; when connecting ports, native and transformation datatypes must be compatible (or must be explicitly converted). For further information, see the PowerCenter Client Help > Index > port-to-port data conversion.
Mappings
By the end of this section you will be familiar with:
- The Mapping Designer interface
- Transformation objects and views
- The Source Qualifier transformation
- The Expression transformation
- Mapping validation
Mapping Designer

(Screenshots: the Mapping Designer workspace and an iconized mapping.)

Source Qualifier Transformation
Ports: all input/output.
Usage: convert datatypes. For relational sources, the Source Qualifier can also:
- Modify the SQL statement
- Apply a User Defined Join
- Apply a Source Filter
- Use sorted ports
- Select DISTINCT
- Run Pre/Post SQL
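As an illustration, for a relational source the Source Qualifier properties might carry SQL fragments such as (table and port names hypothetical):

Source Filter:     EMPLOYEES.BASESALARY > 0
User Defined Join: EMPLOYEES.DEPTID = DEPARTMENT.DEPTID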
Connection Validation
Examples of invalid connections in a Mapping:
- Connecting ports with incompatible datatypes
- Connecting output ports to a Source
- Connecting a Source to anything but a Source Qualifier or Normalizer transformation
Mapping Validation
Mappings must:
- Be valid for a Session to run
- Be end-to-end complete and contain valid expressions
- Pass all data flow rules
Mappings are always validated when saved; they can also be validated without being saved. The Output Window displays the reason a mapping is invalid.
Workflows
By the end of this section, you will be familiar with:
- The Workflow Manager GUI interface
- Creating and configuring Workflows
- Workflow properties
- Workflow components
- Workflow tasks

(Screenshot: the Workflow Manager workspace, status bar and output window.)
Workflow Structure
A Workflow is a set of instructions for the Informatica Server to perform data transformation and load. It combines the logic of Session Tasks, other types of Tasks and Worklets. The simplest Workflow is composed of a Start Task, a Link and one other Task (for example, a Session Task).
Task Developer
- Create Session, Shell Command and Email tasks
- Tasks created in the Task Developer are reusable

Worklet Designer
- Creates objects that represent a set of tasks
- Worklet objects are reusable
Reusable Tasks
Three types of reusable Tasks:
- Session: a set of instructions to execute a specific Mapping
- Command: specific shell commands to run during any Workflow
- Email: sends email during the Workflow

Use the Task Developer to create reusable tasks. These tasks then appear in the Navigator and can be dragged and dropped into any workflow.
Command Task
Specify one or more UNIX shell or DOS commands to run during the Workflow. Commands run in the Informatica Server (UNIX or Windows) environment. Command task status (successful completion or failure) is held in the pre-defined task variable $command_task_name.STATUS. Each Command Task shell command can execute before the Session begins or after the Informatica Server executes a Session. Command tasks can be reusable or non-reusable.
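For example, a downstream link condition could test a hypothetical Command task named cmd_CopyFiles:

$cmd_CopyFiles.STATUS = SUCCEEDED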
A Command task specifies one (or more) UNIX shell or DOS (NT, Win2000) commands to run at a specific point in the workflow, and becomes a component of a workflow (or worklet). If created in the Task Developer, the Command task is reusable; if created in the Workflow Designer, it is not. Commands can also be invoked under the Components tab of a Session task to run pre- or post-session.
Email Task
Configure the Informatica Server to send email at any point in the Workflow. An Email task becomes a component in a Workflow (or Worklet). If configured in the Task Developer, the Email task is reusable (optional). Emails can also be invoked under the Components tab of a Session task to run pre- or post-session.
Non-Reusable Tasks
Six additional Tasks are available in the Workflow Designer:
- Decision
- Assignment
- Timer
- Control
- Event Wait
- Event Raise
Decision Task
Specifies a condition to be evaluated in the Workflow. Use the Decision task in branches of a Workflow, and use link conditions downstream to control execution flow by testing the Decision result.
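For instance (task names hypothetical), a Decision task condition and a downstream link condition testing its result might read:

Decision condition: $s_LoadSales.STATUS = SUCCEEDED AND $s_LoadEng.STATUS = SUCCEEDED
Link condition:     $Dec_BothLoads.Condition = 1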
Assignment Task
Assigns a value to a Workflow variable. Variables are defined in the Workflow object.
Timer Task
Waits for a specified period of time before executing the next Task.
Control Task
Stops or aborts the workflow; the control option is set on the Properties tab.
DINESH IT SOLUTIONS, #612, Annapurna Block, Aditya Enclave, Ameerpet Ph: 9959063476, 8885689446
112
Event Raise Task
The Event Raise task triggers a user-defined event when the Informatica Server executes it.
Session Task
Server instructions to run the logic of ONE specific mapping: e.g. source and target data location specifications, memory allocation, optional Mapping overrides, scheduling, processing and load instructions. A Session task becomes a component of a Workflow (or Worklet). If configured in the Task Developer, the Session task is reusable (optional).
Sample Workflow
(Diagram: Session 1 and Session 2 run concurrently, then their flows combine into a Command Task. Note: although only Session tasks are shown, the components can be any tasks.)
Creating a Workflow
When creating a Workflow, customize the Workflow properties (including the workflow log display) and select a Server on which to run the workflow.
Workflow Scheduler
Workflows can be run on demand or scheduled to run at specified times or intervals.
Workflow Links
Links are required to connect Workflow tasks, and can be used to create branches in a Workflow. All links are executed unless a link condition is used which makes a link false.
Workflow Variables
Workflow variables are used in Decision tasks and conditional links (edit the task or link to enter the condition). They include:
- Pre-defined variables
- User-defined variables: set in the Workflow properties, Variables tab; can persist across sessions
- Task-specific variables
Workflow Summary
1. Add Sessions and other Tasks to the Workflow
2. Connect all Workflow components with Links
3. Save the Workflow
4. Start the Workflow
Session Tasks
After this section, you will be familiar with:
- How to create and configure Session Tasks
- Session Task source and target properties
Select menu Tasks | Create and select Session from the drop-down menu, then set the session properties.
Monitoring Workflows
By the end of this section you will be familiar with:
- The Workflow Monitor GUI interface
- Monitoring views
- Server monitoring modes
- Filtering displayed items
- Actions initiated from the Workflow Monitor
- Truncating Monitor Logs
Workflow Monitor
The Workflow Monitor is the tool for monitoring Workflows and Tasks. Choose between two views: Gantt chart view and Task view.
Monitoring Operations
Perform operations in the Workflow Monitor:
- Stop, Abort, or Restart a Task, Workflow or Worklet
- Resume a suspended Workflow after a failed Task is corrected
- Reschedule or Unschedule a Workflow

Stopping a Session task means the Server stops reading data. Aborting a Session task has a timeout period; if the Server has not finished processing and committing data during the timeout period, the threads and processes associated with the Session are killed.
- Monitoring filters can be set using drop-down menus; this minimizes the items displayed in Task view
- Right-click on a Session to retrieve the Session Log (copied from the Server to the local PC client)
- Start, Stop, Abort and Resume Tasks, Workflows and Worklets
- The filter toolbar lets you select the type of tasks to filter, select servers to filter, filter tasks by specified criteria, and display recent runs
- The Repository Manager's Truncate Log option clears the Workflow Monitor logs
Debugger
By the end of this section you will be familiar with:
- Creating a Debug Session
- Debugger windows and indicators
- Debugger functionality and options
- Viewing data with the Debugger
- Setting and using Breakpoints
- Tips for using the Debugger
Debugger Features
A wizard-driven tool that runs a test session:
- View source / target data
- View transformation data
- Set breakpoints and evaluate expressions
- Initialize variables
- Manually change variable values
- Data can be loaded or discarded
- The debug environment can be saved for later use
Debugger Interface
(Screenshot: the Debugger interface, showing Edit Breakpoints, the Debugger Mode indicator, and a solid yellow arrow marking the current transformation.)

Set Breakpoints
1. Edit the breakpoint
2. Choose global or a specific transformation
3. Choose to break on a data condition or on error; optionally skip rows
4. Add the breakpoint(s)
5. Add data conditions
Debugger Tips
- The Server must be running before starting a Debug Session
- When the Debugger is started, a spinning icon displays; spinning stops when the Debugger Server is ready
- The flashing yellow/green arrow points to the current active Source Qualifier; the solid yellow arrow points to the current Transformation instance
- Next Instance proceeds a single step at a time; one row moves from transformation to transformation
- Step to Instance examines one transformation at a time, following successive rows through the same transformation
Filter Transformation
Drops rows conditionally.
Ports: all input/output. Specify a Filter condition.
Usage: filter rows from the input flow.
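For example, a filter condition (hypothetical ports) that keeps only usable salary rows:

TOTALSALARY > 0 AND NOT ISNULL(DEPTID)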
Sorter Transformation
Sorts data from any source (relational tables or flat files), at any point in a data flow. The sort takes place on the Informatica Server machine, and multiple sort keys are supported; the Sorter transformation is often more efficient than a sort performed on a database with an ORDER BY clause.
Ports: input/output. Define one or more sort keys and a sort order for each key.
Example of usage: sort data before an Aggregator to improve performance.
Sorter Properties
The cache size can be adjusted (the default is 8 MB). Ensure sufficient memory is available on the Informatica Server, or the Session task will fail.

Aggregator Transformation
By the end of this section you will be familiar with:
- Basic Aggregator functionality
- Creating subtotals with the Aggregator
- Aggregator expressions
- Aggregator properties
- Using sorted data
Aggregator Transformation
Performs aggregate calculations.
Ports: mixed I/O ports allowed; variable ports allowed; Group By allowed. Create expressions in variable and output ports.
Usage: standard aggregations.

Aggregate Expressions
Aggregate functions are supported only in the Aggregator transformation. Conditional aggregate expressions are also supported; the conditional SUM format is SUM(value, condition).
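For example, using the tutorial's ports, an Aggregator output port grouped by DEPTID could subtotal the sales department only (a sketch):

SUM(TOTALSALARY, DEPTID = 'SLS')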
Aggregator Functions
Aggregator functions: AVG, COUNT, FIRST, LAST, MAX, MEDIAN, MIN, PERCENTILE, STDDEV, SUM, VARIANCE. These functions:
- Return summary values for non-null data in selected ports
- Can be used only in Aggregator transformations, and only in output ports
- Calculate a single value (and row) for all records in a group
- Allow only one aggregate function to be nested within an aggregate function
- Can be used with conditional statements

Aggregator Properties
The Sorted Input property instructs the Aggregator to expect the data to be sorted. Set the Aggregator cache sizes for the Informatica Server machine.
Sorted Data
The Aggregator can handle sorted or unsorted data. Sorted data can be aggregated more efficiently, decreasing total processing time: the Server caches data from each group and releases the cached data upon reaching the first record of the next group, so each separate group (one row) is released as soon as the last row in the group is aggregated. Data must be sorted according to the order of the Aggregator's Group By ports, and the performance gain will depend upon varying factors. With unsorted data, no rows are released from the Aggregator until all rows are aggregated.
Active Transformation
Can operate on groups of data rows and/or can change the number of rows in the data flow. Examples: Aggregator, Filter, Source Qualifier.
Joiner Transformation
By the end of this section you will be familiar with:
- When to join in the Source Qualifier and when in the Joiner transformation
- Homogeneous joins
- Heterogeneous joins
- Joiner properties
- Joiner conditions
- Nested joins

Note: the example holds true with a Normalizer instead of a Source Qualifier. Exceptions are Mapplet Input and sorted Joiner transformations.
Joiner Transformation
Performs heterogeneous joins on different data flows. The Joiner is an active transformation.
Ports: all input or input/output; M denotes a port that comes from the master source.
Examples: join two flat files; join two tables from different databases; join a flat file with a relational table.

Joiner Conditions
The join condition matches one or more pairs of ports between the master and detail sources.
Joiner Properties
Join types:
- Normal (inner)
- Master outer
- Detail outer
- Full outer
Set the Joiner caches. The Joiner can accept sorted data (configure the join condition to use the sort origin ports).

Nested Joins
Used to join three or more heterogeneous sources.
Lookup Transformation
By the end of this section you will be familiar with:
- Lookup principles
- Lookup properties
- Lookup conditions
- Lookup techniques
- Caching considerations
- Persistent caches
Lookup Transformation
Looks up values in a database table or flat file and provides data to other components in a mapping.
Ports: mixed; L denotes a Lookup port; R denotes a port used as a return value (unconnected Lookup only; see later). Specify the Lookup condition.
Usage: get related values; verify whether records exist or whether data has changed.
Lookup Conditions and Properties
Lookup properties include the Lookup table name and the Lookup condition.
Lookup Caching
Caching can significantly impact performance.
Cached:
- Lookup table data is cached locally on the Server
- Mapping rows are looked up against the cache
- Only one SQL SELECT is needed
Uncached:
- Each mapping row needs one SQL SELECT
Policy on multiple match: use the first value, use the last value, or report an error.
Rule of thumb: cache if the number (and size) of records in the Lookup table is small relative to the number of mapping rows requiring the lookup.
Persistent Caches
By default, Lookup caches are not persistent: when the session completes, the cache is erased. The cache can be made persistent with the Lookup properties. When the Session completes, the persistent cache is stored on the server hard disk; the next time the Session runs, the cached data is loaded fully or partially into RAM and reused. A named persistent cache may be shared by different sessions. The Lookup properties also let you set a prefix for the persistent cache file name and reload the persistent cache. Persistent caches can improve performance, but stale data may pose a problem.
Target Options
By the end of this section you will be familiar with:
- The default target load type
- Target properties
- Update override
- Constraint-based loading
Target Properties
On the Session task's Mappings tab (Edit Tasks), select the target instance and set the target load type, row loading operations, and error handling.
Constraint-based Loading
Delete SQL: DELETE FROM <target> WHERE <primary key> = <pkvalue>

(Diagram: three targets related as PK1 -> FK1/PK2 -> FK2.) To maintain referential integrity, primary keys must be loaded before their corresponding foreign keys, here in the order Target1, Target2, Target3.

Active transformation: can change the number of rows on the data flow. Examples: Source Qualifier, Aggregator, Joiner, Sorter, Filter.

Active source: an active transformation that generates rows and cannot match an output row with a distinct input row. Examples: Source Qualifier, Aggregator, Joiner, Sorter (the Filter is NOT an active source).

Active group: a group of targets in a mapping being fed by the same active source.
Example 1: with only one active source, rows for Targets 1, 2, and 3 will be loaded properly and maintain referential integrity.

Example 2: with two active sources, it is not possible to control whether rows for Target3 will be loaded before or after those for Target2.

Update Strategy Transformation
Ports: all input/output. Specify the Update Strategy expression; IIF or DECODE logic determines how to handle the record.
Example: updating Slowly Changing Dimensions.
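For instance (CUSTOMER_KEY being a hypothetical port holding a lookup result), an Update Strategy expression might use the built-in row-operation constants:

IIF(ISNULL(CUSTOMER_KEY), DD_INSERT, DD_UPDATE)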
Router Transformation
Sends rows to multiple filter conditions.
Ports: all input/output. Specify a filter condition for each group.
Usage: link source data in one pass to multiple filter conditions.
Router Groups
- Input group (always one)
- User-defined groups: each group has one condition; ALL group conditions are evaluated for EACH row, so one row can pass multiple conditions; unlinked group outputs are ignored
- Default group (always one): can capture rows that fail all group conditions
Sequence Generator Transformation
Ports: two predefined output ports, NEXTVAL and CURRVAL; no input ports allowed.
System Variables
- SYSDATE: the current datetime on the machine hosting the Informatica Server
- SESSSTARTTIME: used with any function that accepts transformation date/time datatypes; not to be used in a SQL override; has a constant value
- $$$SessStartTime: returns the system date value as a string, using the system clock on the machine hosting the Informatica Server; the format of the string is database type dependent; used in SQL overrides; has a constant value

Mapping parameters and variables have a user-defined name, a datatype, an aggregation type, and an optional initial value.
Parameter Files
You can specify a parameter file for a session in the session editor. The parameter file contains a [folder.session name] heading and initializes each parameter and variable for that session. For example:

[Production.s_m_MonthlyCalculations]
$$State=MA
$$Time=10/1/2000 00:00:00
$InputFile1=sales.txt
$DBConnection_target=sales
$PMSessionLogFile=D:/session logs/firstrun.txt
Unconnected Lookups
By the end of this section you will know:
- The unconnected Lookup technique
- Unconnected Lookup functionality
- Differences from the connected Lookup
Unconnected Lookup
- Physically unconnected from other transformations: NO data flow arrows lead to or from an unconnected Lookup
- Lookup data is called from the point in the Mapping that needs it
- The Lookup function can be set within any transformation that supports expressions (for example, a function in an Aggregator can call the unconnected Lookup)

The condition is evaluated for each row, but the Lookup function is called only if the condition is satisfied. For example:

IIF(ISNULL(customer_id), :lkp.MYLOOKUP(order_no))

Here the Lookup is called only when customer_id is null (true for 2 percent of all rows), saving unnecessary lookups.
Connected Lookup:
- Part of the mapping data flow
- Returns multiple values (by linking output ports to another transformation)
- Executed for every record passing through the transformation
- More visible: shows where the lookup values are used
- Default values are used

Unconnected Lookup:
- Separate from the mapping data flow
- Returns one value, by checking the Return (R) port option for the output port that provides the return value
- Only executed when the lookup function is called
- Less visible, as the lookup is called from an expression within another transformation
- Default values are ignored
- You must check a Return port in the Ports tab, else it fails at runtime
Mapplets
By the end of this section you will be familiar with:
- The Mapplet Designer
- Mapplet advantages
- Mapplet types
- Mapplet rules
- Active and Passive Mapplets
- Mapplet Parameters and Variables
Mapplet Designer
Mapplet Advantages
- Useful for repetitive tasks / logic
- Represents a set of transformations
- Mapplets are reusable: use an instance of a Mapplet in a Mapping, and changes to a Mapplet are inherited by all instances
- The Server expands the Mapplet at runtime
Unsupported Transformations
You cannot use the following in a mapplet:
- Normalizer transformation
- XML source definitions
- Target definitions
- Other mapplets
External Sources
The Mapplet contains a Mapplet Input transformation and receives data from the Mapping it is used in.

Mapplet Input Transformation
A passive, connected transformation with output ports only. Only those ports connected from an Input transformation to another transformation will display in the resulting Mapplet. Connecting the same port to more than one transformation is disallowed; pass it to an Expression transformation first.

Mixed Sources
The Mapplet contains one or more Mapplet Input transformations AND one or more Source Qualifiers. It receives data from the Mapping it is used in, AND from within the Mapplet. The resulting Mapplet HAS input ports; when used in a Mapping, the Mapplet may occur at any point in mid-flow.
CAUTION: Changing a passive Mapplet into an active Mapplet may invalidate Mappings which use that Mapplet, so do an impact analysis in Repository Manager first.
Multiple Active Mapplets, or Active and Passive Mapplets, cannot populate the same target instance.
Reusable Transformations
By the end of this section you will be familiar with:
- The Transformation Developer
- Reusable transformation rules
- Promoting transformations to reusable
- Copying reusable transformations
Transformation Developer
Make a transformation reusable from the outset, or test it in a mapping first and promote it to reusable later.
Reusable Transformations
Define once, reuse many times. Reusable transformations:
- Can be a copy or a shortcut
- Edit ports only in the Transformation Developer
- Can have their properties edited in the mapping
- Instances dynamically inherit changes
- Caution: changing reusable transformations can invalidate mappings
- Note: Source Qualifier transformations cannot be made reusable

To copy a reusable transformation into a mapping, drop the transformation into the mapping and save the changes to the Repository.

Error Handling
By the end of this section you will be familiar with:
- How to log errors to a flat file or relational table
- When and how to use source row logging
Error Types
Transformation error: the data row has only passed partway through the mapping transformation logic when an error occurs within a transformation.

Data reject: the data row is fully transformed according to the mapping logic, but due to a data issue it cannot be written to the target. A data reject can be forced by an Update Strategy.

Data rejects are appended to the reject file (one .bad file per target) or written to row error tables or a file.
Error logging settings include the Error Log Type, Log Row Data, and Log Source Row Data options.
The first column of the reject file is the row indicator:
- 0 = INSERT, e.g. 0,D,1313,D,Regulator System,D,Air Regulators,D,250.00,D,150.00,D
- 1 = UPDATE, e.g. 1,D,1314,D,Second Stage Regulator,D,Air Regulators,D,365.00,D,265.00,D
- 2 = DELETE, e.g. 2,D,1390,D,First Stage Regulator,D,Air Regulators,D,170.00,D,70.00,D
- 3 = REJECT, e.g. 3,D,2341,D,Depth/Pressure Gauge,D,Small Instruments,D,105.00,D,5.00,D
Connection objects can be created for native databases, MQ Series queues, FTP (File Transfer Protocol) files, custom applications, and external database loaders.
FTP Connection
Create an FTP connection with instructions to the Server to FTP flat files. FTP connections are used in Session tasks.
Session Configuration
Session configurations define properties to be reusable across different sessions. They are defined at folder level, and one of the Workflow Manager tools must be open in order to access them.
Worklets
- An object representing a set or grouping of Tasks
- Can contain any Task available in the Workflow Manager
- Worklets expand and execute inside a Workflow
- A Workflow which contains a Worklet is called the parent Workflow
- Worklets CAN be nested
- Reusable Worklets: create in the Worklet Designer
- Non-reusable Worklets: create in the Workflow Designer
Reusable Worklet
In the Worklet Designer, select Worklets | Create.
Non-Reusable Worklet
1. Create a worklet task in the Workflow Designer
2. Right-click on the new worklet and select Open Worklet
3. The workspace switches to the Worklet Designer
Workflow variable types:
- Built-in, pre-defined variables
- User-defined variables: set in workflow or worklet properties; reset in Assignment tasks
- Session parameters: set in a parameter file; constant for the session (for example, $DBConnectionORCL or $InputFile1)
Performance Tuning