Tuesday 15 March 2011

Configuring Informatica File Transfer Protocol


Informatica File Transfer Protocol can be used to transfer/move files from different environments into a pre-defined landing zone. It can also be used to transfer files to destination folders/directories. The Integration Service can use FTP to access any machine it can connect to, including mainframes.

Configuring FTP in Informatica Workflow
To use FTP file sources and targets in a session:
  • Create an FTP connection object in the Workflow Manager and configure the connection attributes.
  • Configure the session to use the FTP connection object in the session properties.
  • Specify the remote file name in the connection value of the session properties.
Guidelines
  • Specify the source or target output directory in the session properties. If it is not specified, the Integration Service stages the file in the directory where the Integration Service runs on UNIX, or in the Windows system directory.
  • Sessions cannot run concurrently if they use the same FTP source file or target file located on a mainframe.
  • If a workflow containing a session that stages an FTP source or target from a mainframe is aborted, the same workflow cannot be run again until the connection times out.
  • Configure an FTP connection to use SSH File Transfer Protocol (SFTP) when connecting to an SFTP server. SFTP enables file transfer over a secure data stream. The Integration Service creates an SSH2 transport layer that enables a secure connection and access to the files on an SFTP server.
  • To run a session using an FTP connection to an SFTP server that requires public key authentication, the public key and private key files must be accessible on the nodes where the session will run.
Configuring Remote Filename
Attribute: Remote Filename
Description: The remote file name for the source or target. If an indirect source file is used, enter the indirect source file name. Use 7-bit ASCII characters for the file name; the session fails if it encounters a remote file name with Unicode characters. If a path is provided along with the source file name, the Integration Service ignores the path entered in the Default Remote Directory field. The session fails if the file name and path are enclosed in single or double quotation marks.

Attribute: Is Staged
Description: Stages the source or target file on the Integration Service machine. Default is "Not Staged".

Attribute: Is Transfer Mode ASCII
Description: Changes the transfer mode. When enabled, the Integration Service uses ASCII transfer mode. Use ASCII mode when transferring files to or from Windows machines, to ensure that the end-of-line characters in text files are translated properly. When disabled, the Integration Service uses binary transfer mode. Use binary mode when transferring files on UNIX machines. Default is disabled.
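When an indirect source is used, the remote file name points to a file list: a plain text file containing one source file path per line. A minimal sketch of such a file list (the directory and file names are hypothetical):

/landing_zone/customers_east.dat
/landing_zone/customers_west.dat
/landing_zone/customers_south.dat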

Tuesday 1 March 2011

Informatica – User Defined Functions


Informatica User Defined Functions are similar to built-in functions: they are created once and executed many times. Transformation logic that is common across ports is an ideal candidate for a User Defined Function.
Transformation Logic implemented without User Defined Functions

The validation IIF( ISNULL(LTRIM(RTRIM(INPUT))), 'TRUE', 'FALSE' ) is being performed in multiple ports.
The disadvantage of this approach is that any change to the validation must be made in every port.
This can be addressed by creating a User Defined Function and incorporating the logic there.
Steps to Create User Defined Functions
Step 1: In the Designer, right-click the User-Defined Functions folder in a repository folder and click "New".
Step 2: In the editor, add the transformation logic/validation that needs to be performed. Click OK and validate the UDF.
User Defined Function – Type:
Public if the function is callable from any expression. Private if the function is only callable from another user-defined function.
To call a User Defined Function from a port:
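A User Defined Function is referenced with the :UDF prefix in a port expression. As a sketch, assume the null-check logic above is saved as a public UDF named CHK_EMPTY with a single string argument INPUT (the function and port names here are assumptions for illustration). The UDF body is:

IIF( ISNULL(LTRIM(RTRIM(INPUT))), 'TRUE', 'FALSE' )

A port expression then calls it as:

:UDF.CHK_EMPTY(CUSTOMER_NAME)

Any change to the validation is now made once inside the UDF rather than in every port.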

Tuesday 25 January 2011

Informatica Pushdown Optimization


What is Pushdown Optimization and things to consider

The process of pushing transformation logic to the source or target database by the Informatica Integration Service is known as Pushdown Optimization. When a session is configured for Pushdown Optimization, the Integration Service translates the transformation logic into SQL queries and sends them to the database. The source or target database then executes the SQL queries to process the transformations.

How does Pushdown Optimization (PO) work?

The Integration Service generates database-specific SQL statements when a native database driver is used. With ODBC drivers, the Integration Service cannot detect the database type and generates ANSI SQL. The Integration Service can usually push more transformation logic to a database through a native driver than through an ODBC driver.
For any SQL override, the Integration Service creates a view (PM_*) in the database while executing the session task and drops the view after the task completes. Similarly, it also creates sequences (PM_*) in the database.
The database schema (the Source Qualifier connection, the Lookup connection) should have the Create View / Create Sequence privilege; otherwise the session will fail.
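For the privileges, a minimal Oracle sketch, assuming the session connects as a user named ETL_USER (the user name is hypothetical):

-- Grant the privileges the Integration Service needs to create its PM_* views and sequences
GRANT CREATE VIEW TO ETL_USER;
GRANT CREATE SEQUENCE TO ETL_USER;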

A few benefits of using PO

  • No memory or disk space is required to manage caches on the Informatica server for the Aggregator, Lookup, Sorter, and Joiner transformations, as the transformation logic is pushed to the database.
  • The SQL generated by the Integration Service can be viewed before running the session through the Pushdown Optimization Viewer, making it easier to debug.
  • When inserting into targets, the Integration Service does row-by-row processing using bind variables (only a soft parse: processing time, but no repeated parse time). With Pushdown Optimization, the statement is executed once, as the comparison below shows.
Without using Pushdown Optimization:

INSERT INTO EMPLOYEES (ID_EMPLOYEE, EMPLOYEE_ID, FIRST_NAME, LAST_NAME, EMAIL,
  PHONE_NUMBER, HIRE_DATE, JOB_ID, SALARY, COMMISSION_PCT,
  MANAGER_ID, MANAGER_NAME, DEPARTMENT_ID)
VALUES (:1, :2, :3, :4, :5, :6, :7, :8, :9, :10, :11, :12, :13)
-- executes 7012352 times

With Pushdown Optimization:

INSERT INTO EMPLOYEES (ID_EMPLOYEE, EMPLOYEE_ID, FIRST_NAME, LAST_NAME, EMAIL,
  PHONE_NUMBER, HIRE_DATE, JOB_ID, SALARY, COMMISSION_PCT,
  MANAGER_ID, MANAGER_NAME, DEPARTMENT_ID)
SELECT
  CAST(PM_SJEAIJTJRNWT45X3OO5ZZLJYJRY.NEXTVAL AS NUMBER(15, 2)),
  EMPLOYEES_SRC.EMPLOYEE_ID,
  EMPLOYEES_SRC.FIRST_NAME,
  EMPLOYEES_SRC.LAST_NAME,
  CAST((EMPLOYEES_SRC.EMAIL || '@gmail.com') AS VARCHAR2(25)),
  EMPLOYEES_SRC.PHONE_NUMBER,
  CAST(EMPLOYEES_SRC.HIRE_DATE AS date),
  EMPLOYEES_SRC.JOB_ID,
  EMPLOYEES_SRC.SALARY,
  EMPLOYEES_SRC.COMMISSION_PCT,
  EMPLOYEES_SRC.MANAGER_ID,
  NULL,
  EMPLOYEES_SRC.DEPARTMENT_ID
FROM (EMPLOYEES_SRC
  LEFT OUTER JOIN EMPLOYEES PM_Alkp_emp_mgr_1
    ON (PM_Alkp_emp_mgr_1.EMPLOYEE_ID = EMPLOYEES_SRC.MANAGER_ID))
WHERE ((EMPLOYEES_SRC.MANAGER_ID = (
    SELECT PM_Alkp_emp_mgr_1.EMPLOYEE_ID
    FROM EMPLOYEES PM_Alkp_emp_mgr_1
    WHERE (PM_Alkp_emp_mgr_1.EMPLOYEE_ID = EMPLOYEES_SRC.MANAGER_ID)))
  OR (0=0))
-- executes 1 time

Things to note when using PO

There are cases where the Integration Service and Pushdown Optimization can produce different result sets for the same transformation logic. This can happen during data type conversion, handling null values, case sensitivity, sequence generation, and sorting of data.
The database and Integration Service produce different output when the following settings and conversions are different:
  • Nulls treated as the highest or lowest value: when sorting data, the Integration Service can treat null values as the lowest value, while the database may treat null values as the highest value in the sort order (see the SQL sketch after this list).
  • SYSDATE built-in variable: the built-in variable SYSDATE in the Integration Service returns the current date and time for the node running the service process, whereas in the database SYSDATE returns the current date and time of the machine hosting the database. If the two machines are in different time zones, the results can vary.
  • Date Conversion: The Integration Service converts all dates before pushing transformations to the database and if the format is not supported by the database, the session fails.
  • Logging: When the Integration Service pushes transformation logic to the database, it cannot trace all the events that occur inside the database server. The statistics the Integration Service can trace depend on the type of pushdown optimization. When the Integration Service runs a session configured for full pushdown optimization and an error occurs, the database handles the errors. When the database handles errors, the Integration Service does not write reject rows to the reject file.
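The nulls difference above is visible in plain SQL. A minimal Oracle sketch against a hypothetical EMPLOYEES table: ascending sorts place NULLs last by default (nulls treated as highest), and NULLS FIRST reproduces the Integration Service's "nulls lowest" ordering:

SELECT EMPLOYEE_ID, COMMISSION_PCT
FROM EMPLOYEES
ORDER BY COMMISSION_PCT;              -- NULL rows appear last (treated as highest)

SELECT EMPLOYEE_ID, COMMISSION_PCT
FROM EMPLOYEES
ORDER BY COMMISSION_PCT NULLS FIRST;  -- NULL rows appear first (treated as lowest)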

Monday 3 January 2011

Informatica Performance Improvement Tips


We often come across situations where the Data Transformation Manager (DTM) takes more time to read from a source or to write to a target. The following standards/guidelines can improve the overall performance.
  • Use a single Source Qualifier to join source tables that reside in the same schema.
  • Make use of the Source Qualifier "Filter" property if the source type is relational.
  • If subsequent sessions do a lookup on the same table, use a persistent cache in the first session. The data remains in the cache and is available for the subsequent sessions.
  • Use flags as integers, as integer comparison is faster than string comparison.
  • Use the table with the smaller number of records as the master table in joins.
  • While reading from flat files, define the appropriate data types instead of reading everything as strings and converting later.
  • Connect only the ports that are required by subsequent transformations; check whether the remaining ports can be removed.
  • Suppress the generated ORDER BY by placing '--' at the end of the SQL override in Lookup transformations (see the first sketch after this list).
  • Minimize the number of Update Strategy transformations.
  • Group by simple columns in transformations such as Aggregator and Source Qualifier.
  • Use a Router transformation in place of multiple Filter transformations.
  • Turn off verbose logging when moving mappings to UAT/production environments.
  • For large volumes of data, drop indexes before loading and recreate them after the load (see the second sketch after this list).
  • For large volumes of records, use bulk load and increase the commit interval to a higher value.
  • Set 'Commit on Target' in the sessions.
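As referenced above, a minimal sketch of suppressing the ORDER BY in a Lookup SQL override (the table and column names are hypothetical). PowerCenter appends its own ORDER BY on all lookup ports to the override; ending the query with '--' comments that clause out, so only the ORDER BY you wrote on the condition column is executed:

SELECT EMPLOYEE_ID, EMPLOYEE_NAME
FROM EMPLOYEES
ORDER BY EMPLOYEE_ID --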
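And a minimal Oracle sketch of the drop/recreate-index tip, typically placed in the pre- and post-session SQL of the session (the index and table names are hypothetical):

-- Pre-session SQL: drop the index so the load does not maintain it row by row
DROP INDEX EMP_NAME_IDX;

-- Post-session SQL: recreate the index after the load completes
CREATE INDEX EMP_NAME_IDX ON EMPLOYEES (LAST_NAME);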