INFORMATICA POWERCENTER 8.1.0


INFORMATICA POWERCENTER 8.1.0 DESIGNER

integration * intelligence * insight

Content: Working with PowerCenter 8 Designer

Designer Overview

The Designer is used to create mappings that contain transformation instructions for the Integration Service. The Designer provides the following tools to analyze sources, design target schemas, and build source-to-target mappings:

Source Analyzer - imports or creates source definitions.
Target Designer - imports or creates target definitions.
Transformation Developer - develops transformations to use in mappings. We can also develop user-defined functions to use in expressions.
Mapplet Designer - creates sets of transformations to use in mappings.
Mapping Designer - creates mappings that the Integration Service uses to extract, transform, and load data.

Designer Overview

Designer Windows
The Designer displays the following windows:

Navigator - connects to repositories and opens folders. We can also copy objects and create shortcuts within the Navigator.
Workspace - opens the different tools to create and edit repository objects, such as sources, targets, mapplets, transformations, and mappings.
Output - displays details about tasks you perform, such as saving your work or validating a mapping.

Designer Overview

Designer Windows (continued)
Status bar - displays the status of the operation you perform.
Overview - an optional window that simplifies viewing a workspace containing a large mapping or many objects. It outlines the visible area in the workspace and highlights selected objects in color.
Instance data - view transformation data while you run the Debugger to debug a mapping.
Target data - view target data while you run the Debugger to debug a mapping.

Designer Overview

Informatica PowerCenter 8 can access the following data sources:

Relational: Oracle, Sybase ASE, Informix, IBM DB2, Microsoft SQL Server, Teradata
File: flat file, COBOL file, XML file, web log
Application: Hyperion Essbase, IBM MQSeries, IBM DB2 OLAP Server, JMS, Microsoft Message Queue, PeopleSoft, SAP NetWeaver, SAS, Siebel, TIBCO, webMethods
Mainframe: Adabas, Datacom, IBM DB2 OS/390, IBM DB2 OS/400, IDMS, IDMS-X, IMS, VSAM
Other: Microsoft Excel, Microsoft Access, external web services

Designer Overview

Informatica PowerCenter 8 can load data into the following targets:

Relational: Oracle, Sybase ASE, Informix, IBM DB2, Microsoft SQL Server, Teradata
File: flat file, XML file
Application: Hyperion Essbase, IBM MQSeries, IBM DB2 OLAP Server, JMS, Microsoft Message Queue, mySAP, PeopleSoft EPM, SAP BW, SAS, Siebel, TIBCO, webMethods
Mainframe: IBM DB2 OS/390, IBM DB2 OS/400, VSAM
Other: Microsoft Access, external web services

About Transformations

A transformation is a repository object that generates, modifies, or passes data. We configure logic in a transformation that the Integration Service uses to transform data. The Designer provides a set of transformations that perform specific functions. Transformations in a mapping represent the operations the Integration Service performs on the data. Data passes into and out of transformations through ports that we link in a mapping or mapplet.

Transformations can be active or passive. An active transformation can change the number of rows that pass through it; a passive transformation does not.

Transformations can be connected to the data flow, or unconnected. An unconnected transformation is not connected to other transformations in the mapping; it is called from within another transformation and returns a value to that transformation.

About Transformations

Designer Transformations
Aggregator - performs aggregate calculations, such as "group by" sums and averages.
Expression - calculates values using expressions on a row-by-row basis.
Filter - filters data with a single condition.
Joiner - joins data from separate databases, files, or ODBC sources.
Lookup - looks up values, creating a local copy (cache) of the lookup data.
Normalizer - transforms denormalized data into normalized data.
Rank - selects only the top (or bottom) ranked rows.
Router - like the Filter, but with multiple conditions.
Sequence Generator - generates unique IDs for target tables.
Source Qualifier - represents and filters source rows (SQL override, select distinct, join, etc.).
Stored Procedure - runs stored procedures in the database and captures their return values.
Update Strategy - flags records in the target for insert, delete, or update (defined inside a mapping).
Java - provides a simple native programming interface to define transformation functionality with the Java programming language.
Reusable transformation - a transformation that can be used in multiple mappings.

Tasks to incorporate a transformation into a mapping (using the Mapping Designer, Transformation Developer, or Mapplet Designer):
Create the transformation.
Configure the transformation.
Link the transformation to other transformations and target definitions.

Types of Transformations

Active transformations: Filter, Router, Update Strategy, Aggregator, Sorter, Rank, Joiner, Normalizer
Passive transformations: Sequence Generator, Stored Procedure, Expression, Lookup

Aggregator Transformation

The Aggregator is an active transformation. The Aggregator transformation allows us to perform aggregate calculations, such as averages and sums. Unlike the Expression transformation, the Aggregator transformation can perform calculations on groups; the Expression transformation permits calculations on a row-by-row basis only. We can use conditional clauses to filter rows, providing more flexibility than standard SQL. The Integration Service performs aggregate calculations as it reads, and stores the necessary group and row data in an aggregate cache.

Components of the Aggregator transformation:
Aggregate expression
Group by port
Sorted input
Aggregate cache

Aggregate Expression
An aggregate expression can include conditional clauses and non-aggregate functions. It can also include one aggregate function nested within another, such as MAX( COUNT( ITEM )).

Aggregate Functions
The following aggregate functions can be used within an Aggregator transformation, and one aggregate function can be nested within another: AVG, COUNT,

Aggregator Transformation

Aggregate Functions (continued): FIRST, LAST, MAX, MEDIAN, MIN, PERCENTILE, STDDEV, SUM, VARIANCE

Conditional Clauses
We use conditional clauses in the aggregate expression to reduce the number of rows used in the aggregation. The conditional clause can be any clause that evaluates to TRUE or FALSE.

Null Values in Aggregate Functions
When we configure the Integration Service, we can choose how it handles null values in aggregate functions: treat them as NULL or as zero. By default, the Integration Service treats null values as NULL in aggregate functions.

Creating an Aggregator Transformation

In the Mapping Designer, click Transformation > Create. Select the Aggregator transformation.
Enter a name for the Aggregator, click Create, and then click Done. The Designer creates the Aggregator transformation.
Drag the ports to the Aggregator transformation. The Designer creates input/output ports for each port we include.
Double-click the title bar of the transformation to open the Edit Transformations dialog box.
Select the Ports tab. Click the Group By option for each column you want the Aggregator to use in creating groups.
Click Add and enter a name and data type for the aggregate expression port. Make the port an output port by clearing Input (I).
Click in the right corner of the Expression field to open the Expression Editor. Enter the aggregate expression, click Validate, and click OK.
Add default values for specific ports.
Select the Properties tab and enter settings as necessary. Click OK.
Click Repository > Save.
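Outside of PowerCenter, the logic of an Aggregator with a group-by port, a conditional clause, and default NULL handling can be sketched in Python. This is only an illustration; the column names and rows are hypothetical, not part of any PowerCenter API:

```python
# Sketch of Aggregator behavior: group by STORE_ID and sum QTY,
# counting only rows where PRICE > 10 (a conditional clause) and
# treating NULL (None) quantities as NULL, i.e. skipping them.
from collections import defaultdict

rows = [
    {"STORE_ID": 1, "QTY": 5,    "PRICE": 12.0},
    {"STORE_ID": 1, "QTY": 3,    "PRICE": 8.0},   # fails PRICE > 10
    {"STORE_ID": 2, "QTY": None, "PRICE": 15.0},  # NULL skipped
    {"STORE_ID": 2, "QTY": 7,    "PRICE": 20.0},
]

totals = defaultdict(int)
for row in rows:
    # conditional clause: include the row only when PRICE > 10
    if row["PRICE"] > 10 and row["QTY"] is not None:
        totals[row["STORE_ID"]] += row["QTY"]

print(dict(totals))  # {1: 5, 2: 7}
```

Note how one output row is produced per group, which is why the Aggregator is classified as an active transformation.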

Expression Transformation

We can use the Expression transformation to calculate values in a single row before we write to the target, to test conditional statements, and to perform any non-aggregate calculation. To perform calculations involving multiple rows, such as sums or averages, use the Aggregator transformation instead.

Creating an Expression Transformation
In the Mapping Designer, click Transformation > Create. Select the Expression transformation and click OK. The naming convention for Expression transformations is EXP_TransformationName.
Create the input ports. If the input transformation is available, we can select Link Columns from the Layout menu and drag each port used in the calculation into the Expression transformation, or we can open the transformation and create each port manually. Repeat this step for each input port we want to add to the expression.
Create the output ports we need.

Expression Transformation

Setting the Expression
Enter the expression in the Expression Editor for the output port (the port's Input option must be cleared). Check the expression syntax by clicking Validate.

Connecting to the Next Transformation
Connect the output ports to the next transformation or target.

Selecting a Tracing Level
Select a tracing level on the Properties tab to determine the amount of transaction detail reported in the session log file. Click Repository > Save.
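The row-by-row nature of the Expression transformation, in contrast to the group-wise Aggregator, can be sketched in Python. The port names (QTY, PRICE, TOTAL) are hypothetical:

```python
# Sketch of Expression behavior: each input row yields exactly one
# output row; the expression TOTAL = QTY * PRICE is evaluated per row,
# with no grouping and no change in the number of rows (passive).
rows = [
    {"QTY": 2, "PRICE": 10.0},
    {"QTY": 3, "PRICE": 4.0},
]

out = [{**row, "TOTAL": row["QTY"] * row["PRICE"]} for row in rows]
print(out)
```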

Filter Transformation

A Filter transformation is an active transformation. We can filter rows in a mapping with a Filter transformation. We pass all the rows from a source transformation through the Filter transformation and then enter a filter condition for the transformation. All ports in a Filter transformation are input/output, and only rows that meet the condition pass through the Filter transformation.

Creating a Filter Transformation
In the Mapping Designer, click Transformation > Create. Select the Filter transformation. Enter a name, and click OK. The naming convention for Filter transformations is FIL_TransformationName.
Select and drag all the ports from a source qualifier or other transformation to add them to the Filter transformation. After we select and drag ports, copies of these ports appear in the Filter transformation. Each column has both an input and an output port.
Double-click the title bar of the Filter transformation to edit the transformation properties.

Filter Transformation

Click the Value section of the condition, and then click the Open button. The Expression Editor appears.
Enter the filter condition we want to apply. Use values from the input ports of the transformation as part of this condition; however, we can also use values from output ports in other transformations. We may have to fix syntax errors before continuing. Click OK.
Select the tracing level, and click OK to return to the Mapping Designer.
Click Repository > Save.

Filter Transformation Tips
Use the Filter transformation as early in the mapping as possible.
Use the Source Qualifier transformation to filter at the source when possible.
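A minimal Python sketch of the Filter transformation, using a hypothetical condition SALARY > 50000 on made-up rows:

```python
# Sketch of Filter behavior: rows for which the condition evaluates
# TRUE pass through; the rest are dropped. Because the row count can
# change, the Filter is an active transformation.
rows = [
    {"EMP_ID": 1, "SALARY": 30000},
    {"EMP_ID": 2, "SALARY": 80000},
    {"EMP_ID": 3, "SALARY": 55000},
]

# filter condition: SALARY > 50000
passed = [row for row in rows if row["SALARY"] > 50000]
print([row["EMP_ID"] for row in passed])  # [2, 3]
```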

Joiner Transformation

A Joiner transformation is an active transformation. The Joiner transformation is used to join source data from two related heterogeneous sources residing in different locations or file systems. We can also join data from the same source. The Joiner transformation joins sources with at least one matching column, using a condition that matches one or more pairs of columns between the two sources.

We can join the following sources:
Two relational tables existing in separate databases.
Two flat files in potentially different file systems.
Two different ODBC sources.
A relational table and an XML source.
A relational table and a flat file source.
Two instances of the same XML source.

Creating a Joiner Transformation
In the Mapping Designer, click Transformation > Create. Select the Joiner transformation. Enter a name, and click OK. The naming convention for Joiner transformations is JNR_TransformationName.

Joiner Transformation

Drag all the input/output ports from the first source into the Joiner transformation. The Designer creates input/output ports for these source fields as detail fields by default; we can edit this property later. Select and drag all the input/output ports from the second source into the Joiner transformation. The Designer configures the second set of source fields as master fields by default.

Editing the Transformation
Double-click the title bar of the Joiner transformation to open the Edit Transformations dialog box. Select the Ports tab. Add default values for specific ports as necessary.

Setting the Condition
Select the Condition tab and set the condition; click the Add button to add a condition. Then click the Properties tab, configure properties for the transformation, and click OK.

Joiner Transformation

Defining the Join Type
A join is a relational operator that combines data from multiple tables into a single result set. We define the join type on the Properties tab of the transformation. The Joiner transformation supports the following types of joins:
Normal
Master Outer
Detail Outer
Full Outer

Joiner Transformation Tips
Perform joins in a database when possible.
Join sorted data when possible.
For an unsorted Joiner transformation, designate the source with fewer rows as the master source.
For a sorted Joiner transformation, designate the source with fewer duplicate key values as the master source.
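The effect of the join condition can be sketched in Python. This is a simplified illustration, assuming a hypothetical condition MASTER.ITEM_ID = DETAIL.ITEM_ID and made-up rows; it shows only a normal join (matching rows) and a full outer join (matching plus unmatched rows from both sources, with missing ports set to NULL):

```python
# Sketch of Joiner behavior with condition MASTER.ITEM_ID = DETAIL.ITEM_ID.
master = [{"ITEM_ID": 1, "NAME": "pen"}, {"ITEM_ID": 2, "NAME": "ink"}]
detail = [{"ITEM_ID": 1, "QTY": 10}, {"ITEM_ID": 3, "QTY": 4}]

by_id = {m["ITEM_ID"]: m for m in master}

# Normal join: only rows satisfying the join condition pass.
normal = [{**by_id[d["ITEM_ID"]], **d} for d in detail if d["ITEM_ID"] in by_id]

# Full outer join: also keep unmatched rows from both sources,
# filling the ports from the missing side with NULL (None).
matched_ids = {d["ITEM_ID"] for d in detail}
full_outer = (normal
              + [{**d, "NAME": None} for d in detail if d["ITEM_ID"] not in by_id]
              + [{**m, "QTY": None} for m in master if m["ITEM_ID"] not in matched_ids])

print(len(normal), len(full_outer))  # 1 3
```

The master outer and detail outer types fall between these two extremes, keeping unmatched rows from only one of the two sources.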

Lookup Transformation

A Lookup transformation is a passive transformation. Use a Lookup transformation in a mapping to look up data in a flat file or a relational table, view, or synonym. We can import a lookup definition from any flat file or relational database to which both the PowerCenter Client and the Integration Service can connect. We can use multiple Lookup transformations in a mapping. The Integration Service queries the lookup source based on the lookup ports in the transformation and compares Lookup transformation port values to lookup source column values based on the lookup condition.

Types of Lookup Transformations

Connected Lookup
Receives input values directly from the pipeline.
The cache includes all lookup columns used in the mapping.
If there is no match for the lookup condition, it returns the default value for all output ports.
Can pass multiple output values to another transformation.
Supports user-defined default values.

Unconnected Lookup
Receives input values from another transformation calling a :LKP expression.
Uses a static cache.
The cache includes all lookup/output ports in the lookup condition.
If there is no match for the lookup condition, it returns NULL.
Passes one output value to another transformation.
Does not support user-defined default values.
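The difference in no-match behavior can be sketched in Python. The lookup source, port names, and default value here are hypothetical; the cache is modeled as a plain dictionary:

```python
# Sketch of Lookup no-match behavior: a connected Lookup can return a
# user-defined default value on no match, while an unconnected :LKP
# call returns NULL (None).
lookup_cache = {101: "Books", 102: "Music"}  # ITEM_ID -> CATEGORY

def connected_lookup(item_id, default="UNKNOWN"):
    # connected: supports a user-defined default value
    return lookup_cache.get(item_id, default)

def unconnected_lookup(item_id):
    # unconnected: no match returns NULL (None)
    return lookup_cache.get(item_id)

print(connected_lookup(101), connected_lookup(999), unconnected_lookup(999))
```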

Lookup Transformation

Tasks of Lookup Transformations
Get a related value.
Perform a calculation.
Update slowly changing dimension tables.

A Lookup transformation can be connected or unconnected, and cached or uncached.

Lookup Components
We define the following components when we configure a Lookup transformation in a mapping:
Lookup source
Ports
Properties
Condition
Metadata extensions

Lookup Transformation

Creating a Lookup Transformation
In the Mapping Designer, click Transformation > Create. Select the Lookup transformation. Enter a name for the transformation and click OK. The naming convention for Lookup transformations is LKP_TransformationName.
In the Select Lookup Table dialog box, we can choose among the following options:
Choose an existing table or file definition.
Choose to import a definition from a relational table or file.
Skip, to create a manual definition.
If we want to manually define the Lookup transformation, click the Skip button. Define input ports for each lookup condition we want to define.

Lookup Transformation

For Lookup transformations that use a dynamic lookup cache, associate an input port or sequence ID with each lookup port. On the Properties tab, set the properties for the lookup. Click OK.

Configuring Unconnected Lookup Transformations
An unconnected Lookup transformation is separate from the pipeline in the mapping. We write an expression using the :LKP reference qualifier to call the lookup from within another transformation:
Add input ports.
Add the lookup condition, for example:
ITEM_ID = IN_ITEM_ID
PRICE <= IN_PRICE
Designate a return value.
Call the lookup through an expression:
:LKP.lookup_transformation_name(argument, argument, ...)
Double-click the Lookup transformation to open the Edit Transformations dialog box.

Lookup Transformation

Set the remaining properties on the Ports tab and the Properties tab.

Lookup Transformation Tips
Add an index to the columns used in a lookup condition.
Place conditions with an equality operator (=) first.
Cache small lookup tables.
Join tables in the database when possible.
Use a persistent lookup cache for static lookups.
Call unconnected Lookup transformations with the :LKP reference qualifier.

Lookup Caches

The Integration Service builds a cache in memory when it processes the first row of data in a cached Lookup transformation. It allocates memory for the cache based on the amount we configure in the transformation or session properties. The Integration Service stores condition values in the index cache and output values in the data cache, and queries the cache for each row that enters the transformation. The Integration Service also creates cache files, by default in $PMCacheDir.

Types of lookup caches:
Persistent cache
Recache from database
Static cache
Dynamic cache
Shared cache
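The split between the index cache (condition values) and the data cache (output values) can be sketched in Python. The source rows and column names are hypothetical, and the real caches are sized and spilled to files; this only illustrates the lookup path:

```python
# Sketch of the index/data cache split: condition (key) values go in
# the index cache; output values go in the data cache.
source_rows = [(1, "NY", "East"), (2, "CA", "West")]  # (ID, STATE, REGION)

index_cache = {}   # condition value -> position in data cache
data_cache = []    # output values
for key, state, region in source_rows:
    index_cache[key] = len(data_cache)
    data_cache.append((state, region))

def cached_lookup(key):
    # queried once per row entering the transformation
    pos = index_cache.get(key)
    return data_cache[pos] if pos is not None else None

print(cached_lookup(2))  # ('CA', 'West')
```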

Rank Transformation

The Rank transformation is an active transformation. The Rank transformation allows us to select only the top or bottom rank of data. The Rank transformation differs from the transformation functions MAX and MIN in that it selects a group of top or bottom values, not just one value.

Creating a Rank Transformation
In the Mapping Designer, click Transformation > Create. Select the Rank transformation. Enter a name for the Rank; the naming convention for Rank transformations is RNK_TransformationName. Enter a description for the transformation; this description appears in the Repository Manager.
Click Create, and then click Done. The Designer creates the Rank transformation.
Link columns from an input transformation to the Rank transformation.
On the Ports tab, select the Rank (R) option for the port used to measure ranks. If we want to create groups for ranked rows, select Group By for the port that defines the group.

Rank Transformation

On the Properties tab, select whether we want the top or bottom rank. For the Number of Ranks option, enter the number of rows we want to select for the rank. Change the other Rank transformation properties if necessary. Click OK. Click Repository > Save.
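The combination of a Group By port, a Rank port, and the Number of Ranks property can be sketched in Python, with hypothetical DEPT/SALES columns:

```python
# Sketch of Rank behavior: select the top N rows per group,
# ranked by the value in the Rank (R) port.
from collections import defaultdict

rows = [
    {"DEPT": "A", "SALES": 50},
    {"DEPT": "A", "SALES": 90},
    {"DEPT": "A", "SALES": 70},
    {"DEPT": "B", "SALES": 20},
    {"DEPT": "B", "SALES": 60},
]

groups = defaultdict(list)
for row in rows:
    groups[row["DEPT"]].append(row)   # Group By port: DEPT

number_of_ranks = 2                   # "Number of Ranks" property
top = []
for dept_rows in groups.values():
    # Rank (R) port: SALES; top rank selected on the Properties tab
    dept_rows.sort(key=lambda r: r["SALES"], reverse=True)
    top.extend(dept_rows[:number_of_ranks])

print([(r["DEPT"], r["SALES"]) for r in top])
```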

Sequence Generator Transformation

A Sequence Generator transformation is a passive transformation. The Sequence Generator transformation generates numeric values. We can use the Sequence Generator to create unique primary key values or to cycle through a sequential range of numbers.

The Sequence Generator transformation is a connected transformation. The Integration Service generates a value each time a row enters a connected transformation, even if that value is not used. When NEXTVAL is connected to the input port of another transformation, the Integration Service generates a sequence of numbers. When CURRVAL is connected to the input port of another transformation, the Integration Service generates the NEXTVAL value plus one.

We can make a Sequence Generator reusable and use it in multiple mappings. We might reuse a Sequence Generator when we perform multiple loads to a single target. For example, if we have a large input file that we separate into three sessions running in parallel, we can use a Sequence Generator to generate primary key values. If we used different Sequence Generators, the Integration Service might generate duplicate key values. Instead, we can use one reusable Sequence Generator for all three sessions to provide a unique value for each target row.

Sequence Generator Transformation

Tasks with a Sequence Generator Transformation
Create keys.
Replace missing values.
Cycle through a sequential range of numbers.

Creating a Sequence Generator Transformation
In the Mapping Designer, click Transformation > Create. Select the Sequence Generator transformation. The naming convention for Sequence Generator transformations is SEQ_TransformationName. Enter a name for the Sequence Generator, click Create, and then click Done. The Designer creates the Sequence Generator transformation.
Double-click the title bar of the transformation to open the Edit Transformations dialog box.
Select the Properties tab. Enter settings as necessary. Click OK.
To generate new sequences during a session, connect the NEXTVAL port to at least one transformation in the mapping.
Click Repository > Save.

Sequence Generator Transformation

Sequence Generator Ports
The Sequence Generator provides two output ports: NEXTVAL and CURRVAL. Use the NEXTVAL port to generate a sequence of numbers by connecting it to a transformation or target. We connect the NEXTVAL port to a downstream transformation to generate the sequence based on the Current Value and Increment By properties. Connecting NEXTVAL to multiple transformations generates unique values for each row in each transformation. We might connect NEXTVAL to two target tables in a mapping to generate unique primary key values.

Connecting NEXTVAL to Two Target Tables in a Mapping
We configure the Sequence Generator transformation as follows: Current Value = 1, Increment By = 1. When we run the workflow, the Integration Service generates the following primary key values for the T_ORDERS_PRIMARY and T_ORDERS_FOREIGN target tables:

T_ORDERS_PRIMARY PRIMARY KEY: 1, 3, 5, 7, 9
T_ORDERS_FOREIGN PRIMARY KEY: 2, 4, 6, 8, 10

Sequence Generator Transformation

Sequence Generator and Expression Transformation
To pass the same sequence values to both targets, route NEXTVAL through an Expression transformation and connect the Expression output to both targets. We configure the Sequence Generator transformation as follows: Current Value = 1, Increment By = 1. The output key values for the T_ORDERS_PRIMARY and T_ORDERS_FOREIGN target tables are:

T_ORDERS_PRIMARY PRIMARY KEY: 1, 2, 3, 4, 5
T_ORDERS_FOREIGN PRIMARY KEY: 1, 2, 3, 4, 5

Sequence Generator Transformation

CURRVAL is the NEXTVAL value plus one, or more generally NEXTVAL plus the Increment By value. We typically only connect the CURRVAL port when the NEXTVAL port is already connected to a downstream transformation. When a row enters the transformation connected to the CURRVAL port, the Integration Service passes the last-created NEXTVAL value plus one.

Connecting CURRVAL and NEXTVAL Ports to a Target
We configure the Sequence Generator transformation as follows: Current Value = 1, Increment By = 1. When we run the workflow, the Integration Service generates the following values for NEXTVAL and CURRVAL:

NEXTVAL: 1, 2, 3, 4, 5
CURRVAL: 2, 3, 4, 5, 6

If we connect the CURRVAL port without connecting the NEXTVAL port, the Integration Service passes a constant value for each row.
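The NEXTVAL/CURRVAL table above can be reproduced with a small Python sketch. The class and method names are hypothetical; only the numeric behavior follows the description:

```python
# Sketch of NEXTVAL/CURRVAL: NEXTVAL advances by Increment By for each
# row; CURRVAL is the last-created NEXTVAL plus the Increment By value.
class SequenceGenerator:
    def __init__(self, current_value=1, increment_by=1):
        self.value = current_value
        self.increment_by = increment_by

    def nextval(self):
        v = self.value
        self.value += self.increment_by
        return v

    def currval(self, last_nextval):
        return last_nextval + self.increment_by

seq = SequenceGenerator(current_value=1, increment_by=1)
pairs = []
for _ in range(5):
    n = seq.nextval()
    pairs.append((n, seq.currval(n)))

print(pairs)  # [(1, 2), (2, 3), (3, 4), (4, 5), (5, 6)]
```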

Sequence Generator Transformation

Connecting Only the CURRVAL Port to a Target
For example, we configure the Sequence Generator transformation as follows: Current Value = 1, Increment By = 1. When we run the workflow, the Integration Service generates the following constant value for CURRVAL:

CURRVAL: 1 (for every row)

Source Qualifier Transformation

A Source Qualifier is an active transformation. The Source Qualifier represents the rows that the Integration Service reads when it executes a session. When we add a relational or flat file source definition to a mapping, a Source Qualifier transformation is created automatically.

Tasks of the Source Qualifier Transformation
We can use the Source Qualifier to perform the following tasks:
Join data originating from the same source database.
Filter records when the Integration Service reads source data.
Specify an outer join rather than the default inner join.
Specify sorted ports.
Select only distinct values from the source.
Create a custom query to issue a special SELECT statement for the Integration Service to read source data.

Default Query of the Source Qualifier
For relational sources, the Integration Service generates a query for each Source Qualifier when it runs a session. The default query is a SELECT statement for each source column used in the mapping.

Source Qualifier Transformation

To View the Default Query
From the Properties tab, select SQL Query.
Click Generate SQL.
Click Cancel to exit.

Example of a Source Qualifier Transformation
We might want to see all the orders for the month, including order number, order amount, and customer name. The ORDERS table includes the order number and amount of each order, but not the customer name. To include the customer name, we need to join the ORDERS and CUSTOMERS tables.

Setting the Source Qualifier Properties
Double-click the title bar of the transformation to open the Edit Transformations dialog box.
Select the Properties tab.
Enter settings as necessary.

Source Qualifier Transformation

SQL Query
We can override the query in the Source Qualifier transformation. From the Properties tab, select SQL Query; the SQL Editor displays. Click Generate SQL.

Joining Source Data
We can use one Source Qualifier transformation to join data from multiple relational tables. These tables must be accessible from the same instance or database server. Use the Joiner transformation for heterogeneous sources and to join flat files.

Sorted Ports
In the Mapping Designer, open a Source Qualifier transformation and click the Properties tab. Click in Number of Sorted Ports and enter the number of ports we want to sort. The Integration Service adds the configured number of columns to an ORDER BY clause, starting from the top of the Source Qualifier transformation. The sort order of the source database must correspond to the sort order configured for the session.
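The interaction between the default query and the Number of Sorted Ports setting can be sketched as follows. This is an illustration of the rule, not Informatica internals; the table and column names are invented:

```python
def default_query(table, connected_ports, sorted_ports=0):
    """Build a Source Qualifier-style default SELECT statement.

    The ORDER BY clause takes the first `sorted_ports` columns,
    counted from the top of the transformation.
    """
    sql = "SELECT " + ", ".join(f"{table}.{c}" for c in connected_ports)
    sql += f" FROM {table}"
    if sorted_ports > 0:
        sql += " ORDER BY " + ", ".join(connected_ports[:sorted_ports])
    return sql


print(default_query("ORDERS", ["ORDER_ID", "ORDER_AMT", "CUST_ID"], sorted_ports=2))
# SELECT ORDERS.ORDER_ID, ORDERS.ORDER_AMT, ORDERS.CUST_ID FROM ORDERS ORDER BY ORDER_ID, ORDER_AMT
```

Setting Number of Sorted Ports to 2 here adds the first two ports to the ORDER BY clause, matching the "starting from the top of the transformation" rule.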

Stored Procedure Transformation

A Stored Procedure is a passive transformation. A Stored Procedure transformation is an important tool for populating and maintaining databases: database administrators create stored procedures to automate tasks that are too complicated for standard SQL statements.

Stored procedures run in either connected or unconnected mode. The mode we use depends on what the stored procedure does and how we plan to use it in a session. We can configure both connected and unconnected Stored Procedure transformations in a mapping.

Connected: The flow of data through the mapping also passes through the Stored Procedure transformation. All data entering the transformation through the input ports affects the stored procedure. Use a connected Stored Procedure transformation when we need data from an input port sent as an input parameter to the stored procedure, or the results of a stored procedure sent as an output parameter to another transformation.

Unconnected: An unconnected Stored Procedure transformation is not connected directly to the flow of the mapping. It either runs before or after the session, or is called by an expression in another transformation in the mapping.

Stored Procedure Transformation

Creating a Stored Procedure Transformation
After we configure and test a stored procedure in the database, we create the Stored Procedure transformation in the Mapping Designer.

To import a stored procedure:
In the Mapping Designer, click Transformation > Import Stored Procedure.
Select the database that contains the stored procedure from the list of ODBC sources.
Enter the user name, owner name, and password to connect to the database, and click Connect.
Select the procedure to import and click OK. The Stored Procedure transformation appears in the mapping, named after the stored procedure we selected.
Open the transformation and click the Properties tab.
Select the database where the stored procedure exists from the Connection Information row.
If we changed the name of the Stored Procedure transformation to something other than the name of the stored procedure, enter the Stored Procedure Name.
Click OK.
Click Repository > Save to save changes to the mapping.

Update Strategy

An Update Strategy is an active transformation. When we design a data warehouse, we need to decide what type of information to store in targets. As part of the target table design, we need to determine whether to maintain all the historic data or just the most recent changes. The model we choose determines how we handle changes to existing rows.

In PowerCenter, we set the update strategy at two different levels:
Within a session
Within a mapping

Setting the Update Strategy
We use the following steps to define an update strategy:
To control how rows are flagged for insert, update, delete, or reject within a mapping, add an Update Strategy transformation to the mapping. Update Strategy transformations are essential if we want to flag rows destined for the same target for different database operations, or if we want to reject rows.
Define how to flag rows when we configure a session. We can flag all rows for insert, delete, or update, or we can select the Data Driven option, where the Integration Service follows instructions coded into Update Strategy transformations within the session mapping.
Define insert, update, and delete options for each target when we configure a session. On a target-by-target basis, we can allow or disallow inserts and deletes.

Update Strategy

Creating an Update Strategy Transformation
In the Mapping Designer, click Transformation > Create.
Select the Update Strategy transformation. The naming convention for Update Strategy transformations is UPD_TransformationName.
Enter a name for the transformation, click Create, and click Done. The Designer creates the Update Strategy transformation.
Drag all ports from another transformation representing the data we want to pass through the Update Strategy transformation. The Designer creates a copy of each port we drag and connects the new port to the original port. Each port in the Update Strategy transformation is an input/output port. Normally, we select all of the columns destined for a particular target; after they pass through the Update Strategy transformation, each row is flagged for update, insert, delete, or reject.
Double-click the title bar of the transformation to open the Edit Transformations dialog box.
Click the Properties tab.
Click the button in the Update Strategy Expression field. The Expression Editor appears.

Update Strategy

Enter an update strategy expression to flag rows as inserts, deletes, updates, or rejects.
Validate the expression and click OK.
Click OK to save the changes.
Connect the ports in the Update Strategy transformation to another transformation or a target instance.
Click Repository > Save.

Setting the Update Strategy for a Session
When we configure a session, we have several options for handling specific database operations, including updates.

Specifying an Operation for All Rows
When we configure a session, we can select a single database operation for all rows using the Treat Source Rows As setting. The Treat Source Rows As property provides the following options:
Insert
Delete
Update
Data Driven
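In Data Driven mode, the update strategy expression typically uses IIF with the DD_* constants (DD_INSERT = 0, DD_UPDATE = 1, DD_DELETE = 2, DD_REJECT = 3). A Python sketch of the flagging logic; the row fields and the business rule are invented for illustration:

```python
# PowerCenter's data-driven flag constants
DD_INSERT, DD_UPDATE, DD_DELETE, DD_REJECT = 0, 1, 2, 3


def flag_row(row):
    """Mimic IIF(amount < 0, DD_REJECT,
               IIF(exists_in_target, DD_UPDATE, DD_INSERT))."""
    if row["amount"] < 0:
        return DD_REJECT  # rejected rows never reach the target
    return DD_UPDATE if row["exists_in_target"] else DD_INSERT


rows = [
    {"amount": 100, "exists_in_target": False},
    {"amount": 250, "exists_in_target": True},
    {"amount": -5,  "exists_in_target": True},
]
print([flag_row(r) for r in rows])  # [0, 1, 3]
```

Each flag tells the Integration Service which database operation to perform for that row when it writes to the target.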

Update Strategy

Specifying Operations for Individual Target Tables
Once we determine how to treat all rows in the session, we also need to set update strategy options for individual targets. Define the update strategy options in the Transformations view on the Mapping tab of the session properties. We can set the following update strategy options for individual target tables:
Insert. Select this option to insert a row into a target table.
Delete. Select this option to delete a row from a table.
Update. We have the following options in this situation:
Update as Update. Update each row flagged for update if it exists in the target table.
Update as Insert. Insert each row flagged for update.
Update else Insert. Update the row if it exists; otherwise, insert it.
Truncate table. Select this option to truncate the target table before loading data.

Router Transformation

A Router transformation is an active transformation. A Router transformation is similar to a Filter transformation because both transformations let us use a condition to test data. A Filter transformation tests data for one condition and drops the rows that do not meet it. A Router transformation, however, tests data for one or more conditions and gives us the option to route rows that do not meet any of the conditions to a default output group. If we need to test the same input data against multiple conditions, use a Router transformation in a mapping instead of creating multiple Filter transformations to perform the same task.

Creating a Router Transformation
In the Mapping Designer, click Transformation > Create.
Select the Router transformation. The naming convention for Router transformations is RTR_TransformationName.
Enter a name for the transformation and click OK.

Input Values in the Router Transformation
Select and drag all the desired ports from a transformation to add them to the Router transformation.
Double-click the title bar of the Router transformation to edit the transformation properties.

Router Transformation

Setting the Properties on the Ports Tab and Properties Tab
(Screenshots: Ports Tab, Properties Tab)

Groups Tab in the Router Transformation
Click the Group Filter Condition field to open the Expression Editor.
Enter a group filter condition.
Click Validate to check the syntax of the conditions we entered.
Click OK.
Connect group output ports to transformations or targets.
Click Repository > Save.

Router Transformation

A Router transformation has the following types of groups:
Input
Output

There are two types of output groups:
User-defined groups
Default group

Router Transformation Components: Working with Ports
A Router transformation has input ports and output ports. Input ports reside in the input group, and output ports reside in the output groups. We can create input ports by copying them from another transformation or by manually creating them on the Ports tab.
(Screenshot: Ports tab in the Router Transformation)

Router Transformation

Connecting Router Transformations in a Mapping
When we connect transformations to a Router transformation in a mapping, consider the following rules:
We can connect one group to one transformation or target.
We can connect one output port in a group to multiple transformations or targets.
We can connect multiple output ports in one group to multiple transformations or targets.
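The routing behavior described above — user-defined groups with filter conditions, plus a default group for rows that match nothing — can be sketched in Python. The group names and conditions are invented for illustration:

```python
def route(rows, groups):
    """Send each row to every user-defined group whose condition it meets;
    rows that meet no condition fall into the DEFAULT group."""
    out = {name: [] for name in groups}
    out["DEFAULT"] = []
    for row in rows:
        matched = False
        for name, condition in groups.items():
            if condition(row):
                out[name].append(row)
                matched = True
        if not matched:
            out["DEFAULT"].append(row)
    return out


groups = {
    "HIGH": lambda r: r["amount"] > 1000,   # group filter conditions
    "LOW":  lambda r: r["amount"] < 100,
}
result = route([{"amount": 5000}, {"amount": 50}, {"amount": 500}], groups)
print({k: len(v) for k, v in result.items()})  # {'HIGH': 1, 'LOW': 1, 'DEFAULT': 1}
```

Note that a row satisfying several group conditions is passed to each of those groups, which is why one Router can replace several Filter transformations reading the same data.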

Reusable Transformation

A reusable transformation is a transformation that can be used in multiple mappings. We can create most transformations as either non-reusable or reusable, but we can create the External Procedure transformation only as a reusable transformation. When we add a reusable transformation to a mapping, we add an instance of the transformation; the definition of the transformation still exists outside the mapping.

Methods to Create a Reusable Transformation
Design it in the Transformation Developer. In the Transformation Developer, we can build new reusable transformations.
Promote a non-reusable transformation from the Mapping Designer. After we add a transformation to a mapping, we can promote it to the status of reusable transformation. The transformation designed in the mapping then becomes an instance of a reusable transformation maintained elsewhere in the repository.

Reusable Transformation

Creating a Reusable Transformation
Go to the Transformation Developer and click Transformation > Create.
To promote an existing transformation to reusable: in the Mapping Designer, double-click the transformation, select the Transformation tab, and select Make Reusable.

Changes That Can Invalidate a Mapping
When we delete a port or multiple ports in a transformation.
When we change a port datatype, we make it impossible to map data from that port to another port that uses an incompatible datatype.
When we change a port name, expressions that refer to the port are no longer valid.
When we enter an invalid expression in the reusable transformation, mappings that use the transformation are no longer valid. The Integration Service cannot run sessions based on invalid mappings.

Java Transformation

The Java transformation is an active or passive connected transformation that provides a simple native programming interface for defining transformation functionality with the Java programming language. We create Java transformations by writing Java code snippets that define the transformation logic. The PowerCenter Client uses the Java Development Kit (JDK) to compile the Java code and generate byte code for the transformation. The Integration Service uses the Java Runtime Environment (JRE) to execute the generated byte code at run time.

Steps to Define a Java Transformation
Create the transformation in the Transformation Developer or Mapping Designer.
Configure input and output ports and groups for the transformation. Use port names as variables in the Java code snippets.
Configure the transformation properties.
Use the code entry tabs in the transformation to write and compile the Java code for the transformation.
Locate and fix compilation errors in the Java code for the transformation.

Java Transformation

Enter the ports and use those ports as identifiers in the Java code.
Go to the Java code entry tab, enter the Java code, click Compile, and check the result in the Output window.
Create a session and workflow, and run the session.

Functions
Some functions used in the Designer:
AVG: AVG( numeric_value [, filter_condition ] )
MAX: MAX( value [, filter_condition ] )
MIN: MIN( value [, filter_condition ] )
INSTR: INSTR( string, search_value [, start [, occurrence ] ] )
SUBSTR: SUBSTR( string, start [, length ] )
IS_DATE: IS_DATE( value )
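INSTR and SUBSTR in the transformation language use 1-based string positions, and INSTR returns 0 when the search value is absent. A simplified Python sketch of those semantics (positive start positions only; negative starts, which search backward, are not modeled):

```python
def instr(string, search_value, start=1, occurrence=1):
    """1-based INSTR: position of the nth occurrence, or 0 if absent."""
    pos = start - 1
    for _ in range(occurrence):
        pos = string.find(search_value, pos)
        if pos == -1:
            return 0
        pos += 1  # find() is 0-based; +1 makes the result 1-based
    return pos


def substr(string, start, length=None):
    """1-based SUBSTR: `length` characters from position `start`."""
    begin = start - 1
    return string[begin:] if length is None else string[begin:begin + length]


print(instr("informatica", "a"))    # 7
print(substr("informatica", 3, 4))  # form
```

These two functions come up constantly in port expressions, so the off-by-one difference from 0-based languages such as Java is worth keeping in mind.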

Working With Flat Files

To use flat files as sources, targets, and lookups in a mapping, we must import or create the definitions in the repository. We can import or create flat file source definitions in the Source Analyzer, create flat file target definitions in the Target Designer, and import flat file lookups or use existing file definitions in a Lookup transformation. When we create a session with a file source, we can specify a source file location different from the location we used when we imported the file source definition.

Importing a Flat File
Click Sources > Import from File and select the file.
Select Delimited and click Next.
Name the columns and change the datatypes if needed.
Click Finish.
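What the delimited-file import does — split each line on a delimiter and map the pieces to named, typed columns — is roughly what a CSV reader does. A Python sketch; the file contents and column names are invented:

```python
import csv
import io

# A small delimited "flat file" (contents invented for illustration)
flat_file = io.StringIO(
    "ORDER_ID,ORDER_AMT,CUST_NAME\n"
    "101,250.00,Smith\n"
    "102,75.50,Jones\n"
)

reader = csv.DictReader(flat_file)     # the first row supplies column names
rows = list(reader)
print([r["CUST_NAME"] for r in rows])  # ['Smith', 'Jones']
```

As in the import wizard, everything arrives as text; assigning real datatypes (here, ORDER_AMT as a decimal) is a separate step.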

Working With Flat Files

Editing a Flat File Definition
Table tab: Edit properties such as table name, business name, and flat file properties.
Columns tab: Edit column information such as column names, datatypes, precision, and formats.
Properties tab: Edit the default numeric and datetime format properties in the Source Analyzer and the Target Designer.
Metadata Extensions tab: Extend the metadata stored in the repository by associating information with repository objects, such as flat file definitions.

User Defined Functions

We can create user-defined functions using the PowerCenter transformation language. Create user-defined functions to reuse expression logic and build complex expressions. User-defined functions are available to other users in a repository.

Once we create user-defined functions, we can manage them from the User-Defined Function Browser dialog box. We can also use them as functions in the Expression Editor, where they display on the User-Defined Functions tab.

We create a user-defined function in the Transformation Developer and configure the following information:
Name
Type
Description
Arguments
Syntax

Steps to Create User-Defined Functions
In the Transformation Developer, click Tools > User-Defined Functions.
Click New. The Edit User-Defined Function dialog box appears.
Enter a function name.
Select a function type. If we create a public user-defined function, we cannot change the function to private when we edit it.

User Defined Functions

Optionally, enter a description of the user-defined function. We can enter up to 2,000 characters.
Create arguments for the user-defined function. When we create arguments, configure the argument name, datatype, precision, and scale. We can select transformation datatypes.
Click Launch Editor to create an expression that contains the arguments we defined.
Click OK. The Designer assigns the datatype of the data the expression returns; the datatypes have the precision and scale of transformation datatypes. The expression displays in the User-Defined Function Browser dialog box.
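The idea behind a user-defined function — name an expression once, then reuse it from many port expressions — can be sketched in Python. The function name and logic are invented for illustration:

```python
def remove_extra_spaces(value):
    """A 'public' user-defined function: collapse runs of whitespace
    to single spaces and trim the ends."""
    return " ".join(value.split())


# Reused from two different "expressions" in two different mappings
full_name = remove_extra_spaces("John   Smith")
city = remove_extra_spaces("  New   York ")
print(full_name, "|", city)  # John Smith | New York
```

If the cleanup rule ever changes, it is edited in one place, which is exactly the maintenance benefit the User-Defined Function Browser provides.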

Mapplet Designer

A mapplet is a reusable object that we create in the Mapplet Designer. It contains a set of transformations, and we reuse that transformation logic in multiple mappings. When we use a mapplet in a mapping, we use an instance of the mapplet. Like a reusable transformation, any change made to the mapplet is inherited by all instances of the mapplet.

Usage of Mapplets
Include source definitions. Use multiple source definitions and source qualifiers to provide source data for a mapping.
Accept data from sources in a mapping. If we want the mapplet to receive data from the mapping, we use an Input transformation to receive source data.
Include multiple transformations. A mapplet can contain as many transformations as we need.
Pass data to multiple transformations. We can create a mapplet to feed data to multiple transformations.
Contain unused ports. We do not have to connect all mapplet input and output ports in a mapping.

Mapplet Designer

Limitations of Mapplets
We cannot connect a single port in the Input transformation to multiple transformations in the mapplet.
An Input transformation must receive data from a single active source.
A mapplet must contain at least one Input transformation or source definition with at least one port connected to a transformation in the mapplet; the same applies to the Output transformation.
When a mapplet contains a source qualifier that has an override for the default SQL query, we must connect all of the source qualifier output ports to the next transformation within the mapplet.
We cannot include PowerMart 3.5-style LOOKUP functions in a mapplet.
We cannot include the following objects: Normalizer transformations, COBOL sources, XML Source Qualifier transformations, XML sources and targets, pre- and post-session stored procedures, and other mapplets.

Data Profiling

Data profiling is a technique used to analyze source data. PowerCenter Data Profiling can help us evaluate source data and detect patterns and exceptions. We can profile source data to suggest candidate keys, detect data patterns, and evaluate join criteria.

Use Data Profiling to analyze source data in the following situations:
During mapping development.
During production, to maintain data quality.

To profile source data, we create a data profile. We can create a data profile based on a source or mapplet in the repository. Data profiles contain functions that perform calculations on the source data. The repository stores the data profile as an object. We can apply profile functions to a column within a source, to a single source, or to multiple sources.

We can create the following types of data profiles:
Auto profile. Contains a predefined set of functions for profiling source data. Use an auto profile during mapping development.
Custom profile. Use a custom profile during mapping development to validate documented business rules about the source data. We can also use a custom profile to monitor data quality or validate the results of BI reports.

Data Profiling

Steps to Create an Auto Profile
When we create an auto profile, we can profile groups or columns in the source, or we can profile the entire source.

Select the source definition in the Source Analyzer, or the mapplet in the Mapplet Designer, that we want to profile.
Launch the Profile Wizard from the following Designer tools:
Source Analyzer: click Sources > Profiling > Create Auto Profile.
Mapplet Designer: click Mapplets > Profiling > Create Auto Profile.
The Auto Profile Column Selection dialog box opens if we set the default data profile options to open it when creating an auto profile, or if the source definition contains 25 or more columns.
Optionally, click Description to add a description for the data profile. Enter a description of up to 200 characters and click OK.
Optionally, select the groups or columns in the source that we want to profile. By default, all columns or groups are selected.
Select Load Verbose Data if we want the Integration Service to write verbose data to the Data Profiling warehouse during the profile session. By default, the Load Verbose Data option is disabled.
Click Next.
Select additional functions to include in the auto profile. We can also clear functions we do not want to include.

Data Profiling

Optionally, click Save As Default to create new default functions based on the functions selected here.
Optionally, click Profile Settings to enter settings for domain inference and structure inference tuning; modify the default profile settings and click OK.
Select Configure Session to configure the session properties after we create the data profile.
Click Next if we selected Configure Session, or click Finish if we did not. The Designer generates a data profile and a profile mapping based on the profile functions.
Configure the Profile Run options and click Next.
Configure the Session Setup options and click Finish.

Data Profiling

We can create a custom profile from the following Designer tools:
Source Analyzer: click Sources > Profiling > Create Custom Profile.
Mapplet Designer: click Mapplets > Profiling > Create Custom Profile.
Profile Manager: click Profile > Create Custom.

To create a custom profile, complete the following steps:
Enter a data profile name and optionally add a description.
Add sources to the data profile.
Add, edit, or delete profile functions and enable session configuration.
Configure the profile functions.
Configure the profile session if we enabled session configuration.

Profile Manager

The Profile Manager is a tool that helps us manage data profiles. It is used to set default data profile options, work with data profiles in the repository, run profile sessions, view profile results, and view sources and mapplets with at least one profile defined for them. When we launch the Profile Manager, we can access profile information for the open folders in the repository.

There are two views in the Profile Manager:
Profile View: displays the data profiles in the open folders in the repository.
Source View: displays the source definitions in the open folders in the repository for which we have defined data profiles.
(Screenshots: Profile View, Source View)

Debugger Overview

We can debug a valid mapping to gain troubleshooting information about data and error conditions.

The Debugger is used in the following situations:
Before we run a session. After we save a mapping, we can run some initial tests with a debug session before we create and configure a session in the Workflow Manager.
After we run a session. If a session fails or if we receive unexpected results in the target, we can run the Debugger against the session. We might also run the Debugger against a session if we want to debug the mapping using the configured session properties.

Debugging a mapping involves the following steps:
Create breakpoints. Create breakpoints in a mapping where we want the Integration Service to evaluate data and error conditions.
Configure the Debugger. Use the Debugger Wizard to configure the Debugger for the mapping. Select the session type the Integration Service uses when it runs the Debugger.
Run the Debugger. Run the Debugger from within the Mapping Designer. When we run the Debugger, the Designer connects to the Integration Service, which initializes the Debugger and runs the debugging session and workflow.
Monitor the Debugger. While we run the Debugger, we can monitor the target data, transformation and mapplet output data, the debug log, and the session log.
Modify data and breakpoints. When the Debugger pauses, we can modify data and see the effect on transformations, mapplets, and targets as the data moves through the pipeline. We can also modify breakpoint information.

Debugger Overview

Create Breakpoints
In the mapping, click Mappings > Debugger > Edit Breakpoints.
Choose the instance name and breakpoint type, then click Add to add the breakpoints.
Enter the condition for a data breakpoint type.
Enter the number of errors to allow before the Debugger stops.

Run the Debugger
Click Mappings > Debugger > Start Debugger.
Click Next, then choose Create Debug Session, or choose an existing session.
Click Next.

Debugger Overview

Choose the connections for the target and source, and click Next.
Click Next.
(Screenshot: Debug Indicators)

The End