

  • Senior data integration engineer with 9+ years of experience in the data warehouse industry in the analysis, design, development and implementation of large data processing applications.
  • Ab Initio specialist skilled in extraction, cleansing, dimensional data modeling (star & snowflake schemas), transformation, wrangling and profiling of source data; integration of varied sources such as mainframe, SAS files, XML, JSON and HDFS; and data migration, leveraging Ab Initio components such as Rollup, Join, Lookup, ICFF, Scan and many more.
  • Experienced in data architecture, implementing data warehousing dimensional design patterns and fundamentals, and parallel data extraction & loading techniques with Teradata and other databases.
  • Hands-on experience with generic graphs, PDL, Data Quality Environment (Data Profiler) and Conduct>It. Experienced in optimizing and tuning SQL queries and indexing in Teradata.
  • Strong UNIX shell scripting skills with hands-on experience scheduling and automating Ab Initio jobs through the Tivoli job scheduler, AutoSys and cron.
  • Excellent at preparing test cases, test plans/frameworks and testing data models. Proficient in documenting ETL strategy and detailed design specifications for the ETL process.
  • Provided valuable input into project plans and schedules, translating business requirements into conceptual, logical and physical data models. Delivered regular status updates to project and technical managers; consistently completed project elements on time, adhering to all strict project requirements.
  • Good experience with the EME data store, air commands and version control tools such as GitHub and SVN.
  • Experienced in big data technologies such as Hadoop Hive, Sqoop and NoSQL databases; continually engaged with new and emerging technologies.
  • Experienced in the Scrum/Agile model, managing quality processes and coordinating go-live production releases. Strong in supporting, executing, validating and monitoring ETL production jobs.
  • Inspiring leader with an innate understanding of team dynamics and interpersonal relationships; builds winning teams that take pride in their accomplishments. Proven track record of coordinating requirements, establishing work priorities, organizing assignments and mentoring team members.

Work History

Aug 2016 – Present

Data Integration Engineer

Shutterfly Inc
  • Design, develop and test large data processing Ab Initio/Hadoop ETL jobs. Work with business analysts to develop analytic solutions that meet business requirements.
  • Design star and snowflake data models, prepare data mappings from source to staging to dimensional models (as needed), and manage metadata.
  • Write and execute BTEQ scripts to create and load tables in Teradata.
  • Use Swagger APIs and parse JSON objects to derive data from RESTful web services.
  • Prepare wrapper scripts in Python/UNIX shell to execute jobs.
  • Design and test data warehouse schemas and query designs; monitor data warehouse performance.
  • Prepare user documentation for applications and maintain the source library.
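As an illustration of the JSON-parsing and wrapper scripting described above, the following is a minimal Python sketch; all names, fields and the sample payload are hypothetical, not code from the projects listed:

```python
import json


def json_to_delimited(payload, fields, sep="|"):
    """Flatten a JSON array of objects (e.g. a RESTful web-service
    response) into delimited records for a downstream ETL load.
    Missing fields become empty strings; field layout is hypothetical."""
    records = json.loads(payload)
    return [sep.join(str(rec.get(f, "")) for f in fields) for rec in records]


if __name__ == "__main__":
    # Sample payload standing in for a REST response body.
    sample = '[{"order_id": 1, "sku": "A-100"}, {"order_id": 2, "sku": "B-200"}]'
    for line in json_to_delimited(sample, ["order_id", "sku"]):
        print(line)  # e.g. 1|A-100
```

In practice a wrapper like this would fetch the payload over HTTP and write the delimited output to a landing file for the ETL graph to consume.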

Technical Environments: Ab Initio GDE 3.2.4, Co-Operating 3.2.3, Enterprise Metadata Environment (EME), Linux, GitHub, Teradata, Automic scheduler and Hadoop ecosystem.

Apr 2015 – Jul 2016

Data Analytic Consultant

Allstate Insurance
  • Profile, analyze and evaluate auto & property policy data and prepare data platforms using Ab Initio for actuaries. Leverage the enterprise data warehouse and different tools to develop corporate research data files, information platforms and data spaces.
  • Evaluate technical designs and data mapping documents; utilize the designs to develop accurate, defect-free code and test programs using Ab Initio, SQL, shell scripting, Java, Python, Oracle and SAS in a UNIX environment.
  • Execute, monitor and validate data for key policy, quote and claim applications; provide on-site support, perform root-cause analysis and fix any issues that arise in data sets. Participate in production support transition meetings.
  • Participate in code reviews and test plan reviews.
  • Provide input in creating detailed ETL development estimates and change requests.
  • Perform performance tuning in both UNIX ETL and Oracle database environments.
  • Work closely with data scientists, actuaries and business managers to prepare requested data sets. Prepare use case documents to evaluate and select the most suitable ETL tools.

Technical Environments: Ab Initio GDE 3.2.2, Co-Operating 3.2.4, Linux, Oracle 12c, Python, Java, SAS, Tivoli Workload Scheduler and Hadoop ecosystem

Nov 2013 – Apr 2015

Sr. Ab Initio Consultant

Discover Financial Services, IL
  • Provided subject matter expertise in the analysis and preparation of specifications and plans for development/modification of the EDW.
  • Designed PULSE data marts and supported business analytics using the Teradata database. Proactively identified and communicated scope, design and development issues and recommended potential solutions.
  • Developed ETL code to meet all technical specifications and business requirements according to the established designs; assisted and mentored other team members.
  • Coordinated work with offshore team members, business analysts and data analysts to resolve gaps in EDW design specifications.
  • Coordinated with data modelers/DBAs to design data models and integrate database changes.
  • Created and scheduled new jobs and supported existing production job runs.
  • Kept ETL artifacts in EME and deployed them to production using the Live_Queue process.

Technical Environments: Ab Initio GDE 3.1.7, Co-Operating 3.1.4, Linux, Teradata, Tivoli Workload Scheduler and Cognos reporting

Jan 2010 – Nov 2013

Project Lead - Ab Initio Specialist

Syntel Inc
  • Discussed requirements with business analysts, data architects and QA, and prepared technical specifications.
  • Designed, developed and tested Ab Initio graphs, Java batch programs and web applications.
  • Implemented various parallelism techniques (data, component and pipeline parallelism), repartitioning techniques and multi-partition files and flows to speed up data processing wherever applicable.
  • Wrote shell scripts to invoke graphs from the scheduler.
  • Maintained the issue log, prioritized issues and resolved critical ones immediately; coordinated remaining issues with offshore team members, resolved doubts related to requirements and design, and guided them on technical issues.
  • Installed and configured Ab Initio products, created EME projects in each of the environments and set up sandboxes in the physical environments.
  • Actively involved in reviewing code and testing Ab Initio graphs, Java programs and shell scripts for correctness.
  • Created and deployed Ab Initio tags and Java packages in the test environment; supported production release activities.

Technical Environments: Ab Initio GDE 3.0.4, Co-Operating 3.0.4, HP-UX, IBM DB2 9, SQL and PL/SQL, InetSoft reporting server, AutoSys scheduler, Excel, MS Visio, Java and Tomcat web server

Jan 2007 – Dec 2009

Programmer Analyst

Cognizant Technology Solution
  • Document, develop, test and deploy changes to web applications and the reporting portal.
  • Consult with business users and various data teams to investigate issues and prepare technical specification documents based on analysis.
  • Enhance functionality of existing Java programs; prepare and review test cases.
  • Analyze data and write complex SQL queries to prepare reports requested by business users.
  • Create reports using Jaspersoft Studio and prepare dashboards using KavaChart libraries.
  • Provide third-level support for business process flows in the production environment.
  • Monitor production jobs, resolve issues occurring in daily operations and inform business users.

Technical Environments: Java, Oracle, UNIX, IBM MQ, Jaspersoft BI, JSP, Struts, Hibernate, Spring and WebLogic application server

Awards & Recognition

  • Discover Financial Services, 2014: Global Resources Exceptional Achievement Tribute – Best of the Best Team Player
  • Syntel Inc (2012 and 2013): Speed and Smart Value Award