- Conducted requirements-gathering sessions with various users and Subject Matter Experts.
- Performed requirements analysis and effort estimation
- Analyzed business requirements and worked closely with the various application teams and business teams to develop ETL procedures that are consistent across all applications and systems
- Created the ETL framework and reviewed the high-level design document
- Created the logical and physical data models
- Performed extensive performance tuning of programs, ETL procedures, and processes
- Created and implemented the architecture for Informatica PowerExchange CDC processes (Oracle, MySQL, and AS400 sources) feeding a queue consumed by TIBCO, for the largest project of the year.
- Architected and developed Informatica batch and real-time (CDC) processes to feed Informatica MDM (Master Data Management), serving as a single point of access for customer data across applications.
- Mentored team members and other groups on Informatica architecture and development.
- Tools included: Informatica 9.5.x, Erwin 11.x, Informatica MDM 9.1, and Informatica PowerExchange CDC.
Senior Consultant / Informatica Architect / Oracle DBA - Informatica / Erwin
- Designed, created, and managed Informatica environments using the Admin Console and Repository Manager toolsets.
- Created complex workflows, mappings, and other Informatica objects to load EDI data from FTP, XML/XSD, and relational sources into decision-support star-schema data marts.
- Performed extensive performance tuning on Informatica mappings, sessions and workflows
- Created Unix scripts using pmcmd for Informatica execution automation and scheduling.
- Created Oracle packages and stored procedures to build a database monitoring and reporting tool and to support ETL activities.
- Used Informatica Metadata Manager for management of mapping and external metadata.
- Performed Oracle DBA tasks in an Oracle 10g environment, including database creation, tuning/optimization, backups (RMAN, Import/Export), and RAC/Grid management.
- Created logical and physical data models using Erwin, including the use of naming standards (.nsm) files and Global Domain Dictionaries.
Data Warehouse Architect / Informatica Architect
- Architected the Informatica tools Power Center and Power Analyzer on different servers, including user and security setup for each environment. Involved in repository setup, promotion of code across environments, and versioning of objects.
- Performed business analysis and modeling of the HTS mart (high-throughput screening experiment data), created source-to-target mapping matrices, and carried out data cleansing and data profiling on source and target objects.
- Worked with the ETL team to load data from various sources into staging, data warehouse, and data mart layers using Informatica Power Center 7.1.x (Repository Manager, Designer, Server Manager, Workflow Manager, and Workflow Monitor)
- Extensively used Informatica debugger to validate mappings and to gain troubleshooting information about data and error conditions.
- Coded loads of large data volumes into fact tables, on the order of 1.2 billion rows, using source partitioning and parallel partitioning at the target; used multiple sessions with the bulk-load option for performance
- Performance tuning for Power Center included design changes, splitting mappings into multiple mappings, adjusting cache values, running multiple sessions, using parallel partitions at different stages of the mappings at the session level, and selecting from the source only the fields needed at the target.
- Responsible for the ETL process, including extraction, cleansing, aggregation, reorganization, transformation, calculation, and loading of assay data. Responsibilities included data modeling using TOAD to implement a star-schema data warehouse and data marts for experimental data on Oracle. Created search capabilities to drill up from the data marts to the data warehouse. Created a dynamic pivot for the data mart that processed source data in multiple threads, making the load efficient and scalable.