Informatica Sample Resumes

Informatica Sample Resume: 5+ Years Experience

Professional Summary:

·   8+ years of IT experience across the Software Development Life Cycle (SDLC), including requirements gathering, design, implementation and testing.

·   5+ years of technical and functional experience in Decision Support Systems and data warehousing, implementing ETL (Extract, Transform and Load) using Informatica Power Center 8.6/7.1.3/6.2.

·   Solid understanding of ETL design principles and strong practical experience implementing ETL processes with Informatica. 
·   Extensively worked on Power Center Mapping Designer, Mapplet Designer, Transformation Developer, Warehouse Designer, Workflow Manager, Repository Manager and Workflow Monitor. 
·   Well acquainted with performance tuning of sources, targets, mappings and sessions to overcome bottlenecks. 
·   Sound understanding of data warehousing concepts and Dimensional Modeling (Star schema and Snowflake schema). 
·   Strong analytical and conceptual skills in database design and implementation of RDBMS concepts. 
·   Experienced in Oracle database development using PL/SQL, Stored Procedures, Triggers, Functions and Packages. 
·   Good working knowledge of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ). 
·   Experience in UNIX shell scripting (file validations, file downloads, workflow executions). 
·   Development experience on Windows NT/95/98/2000/XP and UNIX platforms. 
·   Excellent communication and interpersonal skills. 
·   Expertise in OLTP/OLAP system study, analysis and E-R modeling, and in developing Star and Snowflake database schemas; good understanding of Relational Data Modeling and Slowly Changing Dimensions.

·   Resourceful, creative problem-solver with a proven aptitude for analyzing complex customer requirements and business problems and designing/implementing innovative custom solutions. 
·   Exceptional problem-solving and sound decision-making capabilities, recognized by associates for data quality, alternative solutions, and confident, accurate decision making.

Technical Skills: 

ETL Tools: Informatica Power Center 8.6/7.1.3/6.2
Operating Environment: UNIX (Solaris 9.0), AIX 5.2, MS Windows 95/98/NT/2000/XP
Database Tools: TOAD 7.x/8.5
Databases: Oracle 10g/9i/8i, DB2, SQL Server 2000
Scripting: UNIX shell scripting
Other Tools: MS Visio 2007

Education/Technical Certifications: 
·   Master’s in Computer Science; Bachelor of Arts (India) 


Professional Experience  

Goodyear Inc., OH Mar 2009 – Present 
Sr. ETL Developer 
Description: World Services Data Warehouse

Goodyear is the world’s largest tire company, operating more than 2,000 tire and auto service center outlets. The high-level scope of this project is to extract data from collection computers at the plants, transform it to fulfill the business rules, and load it into EDW tables. Once the data is available in the EDW tables, BI reports are generated against the EDW for the business users. By collecting barcode data from the plants into a standard database and integrating it with production, transportation, registration, and adjustment data, Goodyear gains visibility into each tire’s life cycle.

Responsibilities: 
·   Assisted Business Analyst with drafting the requirements, implementing design and development of various components of ETL for various applications. 
·   Worked extensively on performing the ETL for the Level 1 as-is staging, Level 2 staging and DW load based on the business rules and transformation specifications. 
·   Extensively used Transformation Language functions in the mappings to produce the desired results. 
·   Used session logs, workflow logs and the Debugger to debug sessions and analyze problems associated with the mappings and generic scripts. 
·   Analyzed requirements from the users and created, reviewed the specifications for the ETL. 
·   Designed the incremental load strategy; created reusable Transformations, Mapplets, Mappings, Sessions and Workflows. 
·   Used session parameters and mapping variables/parameters, and created parameter files to enable flexible workflow runs based on changing variable values. 
·   Created complex ETL mappings to load data using transformations such as Source Qualifier, Sorter, Aggregator, Expression, Joiner, Dynamic Lookup, connected and unconnected Lookups, Filter, Sequence Generator, Router and Update Strategy. 
·   Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems. 
·   Implemented Slowly Changing Dimensions (SCD Types 1 and 2). 
·   Worked with complex mappings using transformations such as Expression, Router, Lookup, Filter, Joiner, Source Qualifier, Stored Procedure and Aggregator. 
·   Tuned existing mappings for performance to reduce the total ETL process time. 
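The SCD handling in the responsibilities above is normally built inside Informatica mappings; as a minimal illustration of the underlying Type 1 versus Type 2 logic, here is a hypothetical Python sketch (the table layout, column names and customer data are invented for the example, not taken from the project):

```python
from datetime import date

# Hypothetical in-memory customer dimension; each row is a dict.
# Type 1: overwrite the attribute in place (no history kept).
# Type 2: expire the current row and insert a new current version.

def apply_scd1(dim_rows, natural_key, new_city):
    """Type 1: overwrite city on the current row for natural_key."""
    for row in dim_rows:
        if row["cust_id"] == natural_key and row["is_current"]:
            row["city"] = new_city
    return dim_rows

def apply_scd2(dim_rows, natural_key, new_city, change_date):
    """Type 2: expire the current row and append a new version."""
    next_sk = max(r["sk"] for r in dim_rows) + 1  # next surrogate key
    for row in dim_rows:
        if row["cust_id"] == natural_key and row["is_current"]:
            row["is_current"] = False
            row["end_date"] = change_date
            dim_rows.append({
                "sk": next_sk, "cust_id": natural_key,
                "city": new_city, "start_date": change_date,
                "end_date": None, "is_current": True,
            })
            break
    return dim_rows

dim = [{"sk": 1, "cust_id": "C100", "city": "Akron",
        "start_date": date(2009, 1, 1), "end_date": None, "is_current": True}]
apply_scd1(dim, "C100", "Canton")                  # overwrite, one row remains
apply_scd2(dim, "C100", "Columbus", date(2009, 6, 1))  # versioned change
```

After the Type 2 change the dimension holds two versions: the expired Canton row and a current Columbus row with its own surrogate key, which is the history-preserving behavior a Type 2 mapping implements.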

Environment: Informatica Power Center 8.6, Oracle 10g, Windows NT, Flat files (fixed width/delimited), MS-Excel, UNIX shell scripting.

Centene Corporation, St. Louis, MO Mar 2008 – Feb 2009 
ETL Developer 
Description: Claims Processing (Medicare/Medicaid)

The client is a leading multi-line healthcare enterprise that provides programs and related services to individuals receiving benefits under Medicaid, including the State Children’s Health Insurance Program (SCHIP), as well as Aged, Blind, or Disabled (ABD), Foster Care, Long-Term Care and Medicare (Special Needs Plans). 
Working as a technical analyst, I was responsible for diverse technical activities that included managing the ETL processes, analyzing and documenting technical specifications, reverse engineering process flows to deduce the corresponding business and process rules, and maintaining the User Acceptance Test (UAT) manual.

Responsibilities: 
·   Analyzed, inspected and laid the framework for Claims process Engine, a process designed and customized by external vendors for the client. 
·   Comprehensively analyzed and systematically documented the end-to-end flow of the inbound and outbound claims files (HIPAA 834) using Informatica B2B. 
·   Extensively worked on processing structured and unstructured data. 
·   Used Informatica B2B Data Transformation, which supports transformations and mappings, via XML, of most healthcare industry standards, including HIPAA 275 and 277. 
·   Resolved the anomalies arising in the file processing across individual stages of data flow. 
·   Deduced the business rules and process rules from the stored procedures that were an active part of the MEMBERS load reporting layer. 
·   Designed, Developed and unit tested the ETL process for loading the Medicaid/Medicare records across STAGE, LOAD and ACCESS schemas. 
·   Accomplished the Provider Directory load (a five-step sequential process) using Informatica for the states of AZ and FL. 
·   Extended the functionalities of existing ETL process of Medicaid for Medicare. 
·   Extensively worked on the transformations like Source Qualifier, Filter, Joiner, Aggregator, Expression and Lookup. 
·   Used session logs, workflow logs and debugger to debug the session and analyze the problem associated with the mappings and generic scripts. 
·   Maintained documents for Design reviews, Engineering Reviews, ETL Technical specifications, Unit test plans, Migration checklists and Schedule plans. 
·   Worked with HIPAA 5010, which reduces risk and provides flexibility through complete bi-directional transaction crosswalks. 
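Resolving anomalies in staged file processing, as described above, typically starts with a pre-load validation pass over the delimited extracts. A minimal, hypothetical Python sketch (the pipe-delimited layout, field count and member-ID rule are invented for illustration, not the client's actual file spec):

```python
import csv
import io

# Assumed layout: member_id|claim_id|amount|service_date
# Records with a wrong field count or a blank member ID are routed
# to a reject list instead of being passed on to the stage load.
EXPECTED_FIELDS = 4

def validate_claims(lines):
    """Split records into loadable rows and rejects with line numbers."""
    good, rejects = [], []
    for lineno, rec in enumerate(csv.reader(lines, delimiter="|"), start=1):
        if len(rec) != EXPECTED_FIELDS or not rec[0].strip():
            rejects.append((lineno, rec))
        else:
            good.append(rec)
    return good, rejects

# Two sample records: the second is missing its member ID.
sample = io.StringIO("M1|CL9|120.50|2008-05-01\n|CL10|80.00|2008-05-02\n")
good, rejects = validate_claims(sample)
```

The same check is often done in UNIX shell as part of the file-validation scripts mentioned in the summary; the point is that malformed records are quarantined with their line numbers before the ETL load, so anomalies can be traced back to a specific stage and record.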

Environment: Informatica Power Center 8.1/7.1.3, HIPAA 5010, SQL Server 2000, DTS, Oracle 10g, Windows NT, VB, Flat files (fixed width/delimited), MS-Excel, UNIX shell scripting.

Circuit City Stores Inc., Richmond, VA Nov 2006 – Feb 2008 
ETL Developer/Analyst 
Description: Floor and Space Planning (FSP)

Circuit City is a Fortune 200 company and the third largest consumer electronics retailer in the United States, with over $11 billion USD in sales. Working as an ETL designer, I was primarily involved in designing, developing and testing the mappings for processing floor- and space-related data using Informatica. The FSP application team generates the source data using the JDA tool, which is then staged and transformed in the EDW and subsequently stored in the EDM for reporting. The sources and targets included Oracle, DB2 and flat files.