GUGLOTHU SATHISH
Email: GUGLOTHUSATHISH@GMAIL.COM
Phone: 8317563281
Key Expertise
5.4 years of Information Technology experience in Data Migration/Data Integration, application
support, and project maintenance, delivering ETL/ELT functionality using Azure Data Factory (ADF V2).
Currently working as an ETL developer with Azure Data Factory, Azure Data Lake Storage Gen2, Azure
Databricks, PySpark, and Azure Blob Storage.
Proficient in designing & developing strategies for the Extraction, Transformation, and Loading (ETL)
mechanism.
Migrated data using Azure Data Factory, creating new pipelines and data flows.
Hands-on experience in migrating SQL databases to Azure Data Lake, Azure SQL Database, Snowflake,
and Azure SQL Data Warehouse, and in controlling and granting database access.
Built multiple Data Lakes and defined how data is received, validated, transformed, and then published.
Worked with Azure Data Factory (ADF), Integration Runtime (IR) including Self-Hosted IR, file system
data ingestion, and relational data ingestion.
Built scheduled ETL pipelines in ADF.
In-depth experience in ETL design specification, DataStage design & development, quality processes,
code deployment, Agile, ARM templates, and PowerShell.
Implemented Performance Tuning Techniques while designing ETL applications.
Effective in cross-functional and global environments, managing multiple tasks and assignments
concurrently with strong communication skills.
Strong understanding of Data Warehousing principles, including fact tables, dimension tables, and
schema modeling.
PROFESSIONAL EXPERIENCE
1. Working as Azure Developer for Infosys (SUBHASHINI SOFTWARE SOLUTIONS LTD) from October 2023
to date.
2. Worked as Azure Developer for Wipro (CONFIANZA SOLUTIONS PVT LTD) from April 2018 to October
2023.
PROFESSIONAL SUMMARY
RELEVANT EXPERIENCE: 2.7 YEARS
Hands-on development experience with Azure BI services such as ADF, Azure Blob Storage, ADLS, Logic
Apps, Key Vault, Azure SQL DB, and Snowflake.
Created medium and complex initial & incremental load pipelines to move data from heterogeneous
sources such as on-premises SQL Server and flat files into Azure Data Lake and then into
Azure SQL DB.
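The incremental-load pipelines mentioned above typically follow the "watermark" pattern: copy only rows modified since the last recorded high-water mark, then advance the mark. A minimal sketch in plain Python, with sqlite3 standing in for the source SQL Server and the Azure SQL sink (all table and column names are hypothetical):

```python
# Watermark-based initial/incremental load sketch. sqlite3 stands in for
# the on-premises SQL Server source and the Azure SQL sink; in ADF this
# logic maps to a Lookup (read watermark), a Copy Data activity with a
# filtered source query, and a Stored Procedure to advance the watermark.
import sqlite3

src = sqlite3.connect(":memory:")
src.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, modified_at TEXT);
    INSERT INTO orders VALUES (1, '2023-01-01'), (2, '2023-02-01'), (3, '2023-03-01');
""")

sink = sqlite3.connect(":memory:")
sink.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, modified_at TEXT);
    CREATE TABLE watermark (table_name TEXT PRIMARY KEY, last_value TEXT);
    INSERT INTO watermark VALUES ('orders', '1900-01-01');
""")

def incremental_load(table: str) -> int:
    """Copy rows modified since the stored watermark, then advance it."""
    (last,) = sink.execute(
        "SELECT last_value FROM watermark WHERE table_name = ?", (table,)
    ).fetchone()
    rows = src.execute(
        f"SELECT id, modified_at FROM {table} WHERE modified_at > ?", (last,)
    ).fetchall()
    if rows:
        sink.executemany(f"INSERT OR REPLACE INTO {table} VALUES (?, ?)", rows)
        new_mark = max(r[1] for r in rows)
        sink.execute(
            "UPDATE watermark SET last_value = ? WHERE table_name = ?",
            (new_mark, table),
        )
    return len(rows)

loaded = incremental_load("orders")  # initial run copies all 3 rows
again = incremental_load("orders")   # nothing newer than the mark: 0 rows
```

The same run doubles as the initial load (the seed watermark predates all rows) and as each subsequent delta load, which is why one parameterized pipeline covers both cases.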
Created dynamic pipelines and datasets as re-usable components to carry out data transformations and
data movement.
Hands-on experience in creating pipelines with activities such as Copy Data, Lookup, Stored Procedure,
ForEach, Execute Pipeline, Set Variable, Wait, Until, etc.
Involved in creating databases and servers, and setting up server configurations and firewall settings.
Experienced in creating Azure Key Vault to provide extra security for credentials and access keys by
creating secrets, keys, and certificates.
Experienced in creating Logic Apps to send email notifications, upload file attachments from a mailbox
to Blob storage, and trigger pipelines.
Designed and developed ETL integration patterns using ADF.
Developed a framework for converting existing mappings.
Created PySpark data frames to bring data from heterogeneous sources into Azure Data Lake.
Involved in writing various notebooks using DataFrames built from the required files in ADLS.
Worked on reading and writing multiple data formats such as JSON, ORC, and Parquet on ADLS using PySpark.
Provided guidance to the development team using PySpark as the ETL platform.
Worked on creating Logic Apps, Key Vaults, and triggers.
EDUCATIONAL QUALIFICATION
Bachelor of Technology from Jawaharlal Nehru Technological University College of Engineering Hyderabad,
2016, with 67%.
TECHNICAL SKILLS
Data Integration (ETL/ELT) Tools: Azure Data Factory (ADF V2)
Azure Data Factory Developer (Cloud): Azure Data Factory (ADF), Azure SQL Database, Azure Data Lake
Storage Gen2, Azure SQL Data Warehouse, Azure Storage, Azure Blob Storage, Agile, ARM, PowerShell.
PROJECT DETAILS
Project 3: Nov 2023 to date.
Project Title : Data migration into ADLS
Client : Rabo Bank
Role : ADF Developer.
Team Size : 11
Environment : Azure Data Factory, SQL Server 2008R2
Client Overview:
Rabobank is a Netherlands-based organization specialized in mortgages and loans. The company has interactive
offices in more than 12 countries.
Business Objectives:
The client has a loans portal where customers can take loans online. All the online loans and mortgage data is
received through SQL tables, and sales data from offices is received in the form of flat files. We created SSIS
packages to load data from stage tables to data marts, and also to load flat-file data into the data marts.
Roles& Responsibilities:
Monitored jobs and notified the ETL team whenever an issue arose or a job failed.
Created stored procedures, views, and source queries required by the ETL team.
Created documents for error logging, task tracking, etc.
Project 2: Feb 2023 to Oct 2023
Project Title: TCCC Digital Product Authoring
Client: Coca-Cola
Technologies: Azure Data Factory, ADLS Gen2, Logic Apps, Azure SQL Database, SQL Server
Role: ADF Developer
Duration: January 2023 to date
Client Overview:
The Coca-Cola Company is an American multinational beverage corporation. The Company has interests in the
manufacturing, retailing, and marketing of non-alcoholic beverage concentrates and syrups. Digital Product Authoring
(DPA) aims to orchestrate product data so that it can be authored once and shared across the TCCC ecosystem. It is an
essential undertaking for realizing The Company’s vision of digital growth and innovation. Delivering on enterprise-
level ambitions like this requires a comprehensive, coherent, and coordinated approach. DPA will coordinate with and
in some cases catalyse other efforts designed to strengthen the Company’s capabilities in Product Innovation,
Development, Commercialization, and Management.
Roles & Responsibilities:
Hands-on experience in migrating SQL databases to Azure Data Lake, Azure Data Lake Analytics, and Azure SQL
Database, and in controlling and granting database access.
Built multiple Data Lakes and defined how data is received, validated, transformed, and then published.
Worked with Azure Data Factory (ADF), Integration Runtime (IR) including Self-Hosted IR, file system data
ingestion, and relational data ingestion.
Built scheduled ETL pipelines in ADF.
Designed and implemented end-to-end data solutions (storage, integration, processing) in Azure.
Designed and implemented database solutions in Azure SQL Data Warehouse and Azure SQL.
Recreated existing application logic and functionality in the Azure Data Lake, Data Factory, and SQL Database
environment; gained project implementation experience using Azure Data Factory/Data Flows.
Migrated data from traditional database systems to Azure databases.
Project 1: Jan 2021 to Dec 2022.
Project Title : Business Process Management for IGT
Client : International Game Technology
Role : ADF Project Support.
Team Size : 8
Environment : Azure Data Factory, SQL Server 2008R2
Client Overview:
International Game Technology is a Nevada-based company specialized in the design, development, and
manufacturing of Gaming Machines, Network System products, and Online & Mobile Gaming Solutions. The company
has interactive offices in more than 16 countries.
Business Objectives:
The client has an online sales portal where customers can buy products online. All the online sales data is
received through SQL tables, and sales data from offices is received in the form of flat files. We created SSIS
packages to load data from stage tables to data marts, and also to load flat-file data into the data marts.
Roles& Responsibilities:
Monitored jobs and notified the ETL team whenever an issue arose or a job failed.
Created stored procedures, views, and source queries required by the ETL team.
Created documents for error logging, task tracking, etc.
(SATHISH)