Preeti Ghargi – Data Engineer Analyst

Preeti Ghargi Mobile: +91-8217721145

Email: [email protected]
LinkedIn: www.linkedin.com/in/preetighargi

Professional Summary
• Experienced Informatica Developer - Over 8 years of comprehensive expertise in designing,
developing, and implementing ETL solutions using Informatica PowerCenter and Informatica
BDM.
• Metadata Management Proficiency - Skilled in managing metadata using Informatica Metadata
Manager, ensuring data integrity and governance throughout the ETL process.
• Data Visualization and BI Dashboard Creation - Proficient in crafting visually compelling data
visualization cards and Business Intelligence (BI) dashboards, transforming complex datasets into
actionable insights for informed decision-making using Power BI.
• Proven Track Record - Demonstrated success in delivering high-quality data integration projects
across diverse domains; adept at analysing complex business requirements and optimizing
performance to ensure seamless data workflows. Strong team player with excellent
communication and collaboration abilities.
Technical Skillset

Query Languages                   MySQL, Teradata
ETL Tools                         Informatica PowerCenter, Informatica BDM 10.2, Informatica IICS (knowledgeable), DataStream
Programming                       Python
Visualization and Reporting Tools DOMO, Power BI
Other                             UNIX scripting, Informatica Metadata Manager, MHUB, ESP scheduling tool, IBM Tivoli scheduling tool, MS Excel

Work Experience
Infosys – Technology Lead, 3 years 11 months (July 2020 – Present)

Truist Banking Account

Responsible for ETL design and development of the UPCS application using Informatica PowerCenter 10.2 (Feb 2021 – Present)

Role & Responsibilities:
• Led a team of 4 developers in designing, developing, and implementing ETL solutions for the
UPCS application's Escheat process, as part of the merger of SunTrust and BB&T into Truist.
• Developed non-deposit application processing procedures, including new issues, updates, and
closing data across multiple business units. Implemented a unified data structure to efficiently
send information to the UPCS application.
• Developed Unix scripts to manage technical debt and automate summary and closing email
notifications, using HTML formats generated by the Informatica process to streamline
communication.
• Developed ESP and Control Direct jobs for complete process automation, adhering to
production standards.
• Created unit test requests (UTRs), performed unit testing, and provided pre- and
post-production support.

Responsible for generating metadata lineages using the Informatica Metadata Manager tool and scheduling via MHUB (July 2020 – Jan 2021)

• Exported applications and metadata lineages efficiently.
• Conducted link failure analysis.
• Created link files using PDFs obtained from SNOW requests.
• Analyzed and compared link files, PDFs, and IMM lineage extracts.
• Generated mapping files based on provided source and target details.
• Scheduled and managed job processing using MHUB.
• Extracted metadata reports using MHUB.

Tata Consultancy Services – System Engineer, 4 years 7 months (Dec 2015 – June 2020)

AVIVA Insurance Account

Responsible for ETL design and development of General Insurance data marts for the Aviva Insurance, UK account (Jan 2018 – June 2020)

Role & Responsibilities:
• Designed a generic dynamic mapping capable of loading data from user-specified sources and
targets, subsequently listed as a reusable component within the team, resulting in a 30%
reduction in average build time.
• Processed data from diverse source systems, including flat files, fixed-length column files,
CSV, XML, and relational databases, and loaded it into destinations such as UNIX servers,
HDFS, AWS S3, and Oracle Database.
• Implemented parsing of XML data to CSV format using the serializer of the Data Processor
Transformation.
• Created scalable tables in Hive using the Avro schema format to efficiently handle large
volumes of data.
• Developed a Hive table for processing fixed-length columns, improving data management and
processing efficiency.
• Utilized mapping variables/parameters for test loads and developed parameter files to enable
flexible workflow runs, allowing variable values to be adjusted during scheduled loads. This
approach enhanced the adaptability and efficiency of workflow execution.
• Crafted HiveQL queries to validate both HDFS files and Hive table data, ensuring adherence
to specified requirements.
• Employed the minus-query technique to verify consistency between source and target staging
data sets, strengthening data quality assurance processes.
• Collaborated on the creation of job definitions and job streams to schedule tasks using the
IBM Tivoli Workload Scheduler, coordinating with the team to ensure seamless integration
and efficient execution of scheduled jobs within the designated environment.

Responsible for design and development of Business Intelligence solutions for the Target Retail account (March 2016 – December 2017)

Role & Responsibilities:
• Engaged in the entire Software Development Life Cycle (SDLC), from business analysis
through development, testing, deployment, and documentation.
• Created databases, tables, triggers, macros, views, stored procedures, functions, joins, and
hash indexes within the Teradata database environment.
• Developed Unix scripts to automate fetching files from Outlook.
• Created dataflow jobs in DOMO to merge disparate datasets using Amazon Redshift SQL.
• Designed business dashboards for trend analysis of retail Key Performance Indicator (KPI)
metrics.
• Managed scheduling and automation of ETL jobs on the DataStream scheduler.
• Generated design and technical documents for the metrics developed.
• Conducted knowledge-transfer sessions to onboard new resources.
• Maintained ongoing interaction with business users and facilitated demo sessions to enhance
their understanding of visualizations, ensuring effective communication and alignment between
technical solutions and business requirements.

Training, Certifications and Achievements


• Trained in the Business Intelligence and Performance Management stream at TCS ILP, Trivandrum.
• Honored in 2021 with the prestigious INSTA Award for exemplary proficiency in successfully
navigating a high-need project; demonstrated a keen ability to manage complexities, optimize
resources, and deliver results surpassing expectations within challenging project environments.

Academic Qualifications

Degree/Class            Year  Institution/School                          University/Board       Percentage
B.E. (Computer Science) 2015  Basaveshwara Engineering College, Bagalkot  VTU                    7.86 CGPA
Pre-University          2011  JSS Pre University College, Dharwad         Karnataka State Board  82.60%
Higher Secondary        2009  Loyola Convent High School, Gadag           Karnataka State Board  90.08%
