Jaswanth K
Email: jaswantk1985@yahoo.com
Mobile: +91-8148495654
Professional Summary
● Having a total of 14+ years of experience in the IT industry.
● Having 5+ years of experience in Microsoft Azure cloud technologies.
● A self-starter with a positive attitude, a willingness to learn new concepts/technologies, and acceptance of challenges.
● Excellent Technical, Interpersonal and Management skills.
● Experienced in Azure Data Factory.
● Very strong experience in ETL design.
● Exposure to Azure Cloud computing technologies.
● Hands-on experience in Azure Services – Azure Data Lake Store (ADLS), Azure Data Lake Analytics
(ADLA), Azure Data Factory (ADF).
● Prepared project documentation such as setup documents, test scripts and functional specification documents.
● Hands-on experience in Azure Data Factory and its core concepts such as Datasets, Pipelines, Activities, Scheduling and Execution.
● Excellent knowledge of ADF building components – Integration Runtime, Linked Services, Data
Sets, Pipelines, Activities
● Designed and developed data ingestion pipelines from on-premises to different layers into the ADLS using
Azure Data Factory (ADF V2)
● Experience with integration of data from multiple data sources
● Good knowledge of PolyBase external tables in Azure SQL Data Warehouse.
● Designed Azure Logic Apps to send pipeline success/failure alert emails, file-unavailability notifications, etc.
● Knowledge of data extraction from on-premises sources and delta-extraction methods from source systems to ADLS.
● Extensively worked on the Copy Data activity.
● Worked on Get Metadata, Lookup, Stored Procedure, ForEach, If Condition and Execute Pipeline activities.
● Orchestrated data integration pipelines in ADF using various activities such as Get Metadata, Lookup, ForEach, Wait, Execute Pipeline, Set Variable, Filter, Until, etc.
● Implemented a dynamic pipeline to extract multiple files into multiple targets with a single pipeline.
● Managed data recovery for Azure Data Factory pipelines.
● Monitored and managed Azure Data Factory.
● Experience in SQL – joins, correlated queries, sub-queries, etc.
● Good knowledge of stored procedures, functions, triggers, views, etc.
● Involved in system configuration, functional testing of the application, integration of the modules and managing business users.
● Possess sound knowledge of Business process and Data flow.
● Able to work within an aggressive and demanding project schedule and environment.
● Experience in working with extended teams to investigate and resolve critical system problems identified during implementation.
● Experience in code reviews, integration and end-user support.
● Automated execution of ADF pipelines using Triggers
● Experience in Production support
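As an illustration of the dynamic single-pipeline pattern mentioned above, a trimmed-down ADF pipeline definition might look like the following sketch (source/sink typeProperties omitted; the pipeline, dataset and parameter names are hypothetical, not from an actual project):

```json
{
  "name": "pl_copy_many_files",
  "properties": {
    "parameters": {
      "fileList": { "type": "array" }
    },
    "activities": [
      {
        "name": "ForEachFile",
        "type": "ForEach",
        "typeProperties": {
          "items": { "value": "@pipeline().parameters.fileList", "type": "Expression" },
          "activities": [
            {
              "name": "CopyOneFile",
              "type": "Copy",
              "inputs": [ { "referenceName": "ds_source_file", "type": "DatasetReference",
                            "parameters": { "fileName": "@item().source" } } ],
              "outputs": [ { "referenceName": "ds_sink_file", "type": "DatasetReference",
                             "parameters": { "fileName": "@item().target" } } ]
            }
          ]
        }
      }
    ]
  }
}
```

Each trigger run passes a list of source/target file pairs, so one pipeline definition covers any number of extractions.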
PROFESSIONAL QUALIFICATION
● MCA from Annamalai University in 2009.
TECHNICAL PROFICIENCY
Key Competencies & Skills
Operating Systems: Unix, Windows
ETL Tools: Azure Data Factory, Azure Data Lake Analytics, Spark
Languages: SQL, PySpark
Databases: Oracle, SQL Server, Azure SQL Data Warehouse
CAREER PROFILE
● Working as Technical Lead in Infinite Computer Solutions from Jan 2021 to date
● Worked as Senior Engineer in AstraZeneca India Pvt Ltd from Nov 2016 to Nov 2021
● Worked in BCT Consultancy Services from Jun 2016 to Nov 2016
● Worked in Sreeven Infocom Limited from Sep 2014 to May 2016
● Worked as Senior Specialist in HCL Technologies from Feb 2013 to Sep 2014
● Worked as Storage Admin in AGS Infotech Ltd from July 2009 to Jan 2013
Project
Client: Molina Healthcare
Role: Azure Data Engineer
Technologies & Tools: Azure Data Factory V2, Logic Apps, Key Vault, Oracle, Flat Files
Duration: Jan 2021 to date
Project Description:
This project builds an integrated data warehouse, with Azure Data Factory used as the ETL tool. Data is extracted from an Oracle source into the data lake and Azure Synapse Analytics. The data warehouse serves as the base for all reporting requirements.
Roles & Responsibilities:
● Created pipelines to extract data from on-premises source systems to Azure Data Lake Storage; extensively worked on Copy activities and implemented copy behaviours such as Flatten Hierarchy, Preserve Hierarchy and Merge Files. Implemented error handling through the Copy activity.
● Exposure to Azure Data Factory activities such as Lookup, Stored Procedure, If Condition, ForEach, Set Variable, Append Variable, Get Metadata, Filter and Wait.
● Created dynamic pipelines to handle multiple sources extracting to multiple targets; extensively used Azure Key Vault to configure the connections in linked services.
● Configured and implemented Azure Data Factory triggers and scheduled the pipelines; monitored the scheduled pipelines and configured alerts to get notified of pipeline failures.
● Implemented delta-logic extractions for various sources with the help of a control table; implemented data frameworks to handle deadlocks, recovery and logging of pipeline data.
● Reviewed individual work on ingesting data into Azure Data Lake and provided feedback based on reference architecture, naming conventions, guidelines and best practices.
● Developed Spark (Python) notebooks to transform and partition data and organize files in ADLS.
● Implemented end-to-end logging frameworks for Data Factory pipelines.
● Extracted data from different sources such as flat files and Oracle to load into the SQL database.
● Involved in preparation and execution of unit, integration and end-to-end test cases.
● Used the COPY command to bulk-load data.
● Created internal and external stages and transformed data during load.
● Used the FLATTEN table function to produce a lateral view of VARIANT, OBJECT and ARRAY columns.
● Identifying areas for modification in existing programs and subsequently developing these modifications.
● Used temporary and transient tables on different datasets.
● Cloned production data for code modifications and testing.
● Shared sample data by granting the customer access for UAT.
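The delta-logic extraction described above (a control table stores the last watermark per source table, and each run pulls only rows modified since then before advancing the watermark) can be sketched in plain Python. This is a minimal illustration only: sqlite3 stands in for the Oracle source, and all table and column names are hypothetical.

```python
import sqlite3

# In-memory stand-in for the source database plus the control table
# that tracks the last extracted watermark per source table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE control (source_table TEXT PRIMARY KEY, last_watermark INTEGER);
CREATE TABLE orders (id INTEGER PRIMARY KEY, modified_at INTEGER);
INSERT INTO control VALUES ('orders', 100);
INSERT INTO orders VALUES (1, 90), (2, 150), (3, 200);
""")

def delta_extract(conn, source_table):
    """Pull only rows changed since the stored watermark, then advance it."""
    (watermark,) = conn.execute(
        "SELECT last_watermark FROM control WHERE source_table = ?",
        (source_table,)).fetchone()
    rows = conn.execute(
        f"SELECT id, modified_at FROM {source_table} "
        "WHERE modified_at > ? ORDER BY id",
        (watermark,)).fetchall()
    if rows:
        # Advance the watermark so the next run skips rows already extracted.
        new_watermark = max(modified for _, modified in rows)
        conn.execute(
            "UPDATE control SET last_watermark = ? WHERE source_table = ?",
            (new_watermark, source_table))
    return rows

changed = delta_extract(conn, "orders")  # only rows with modified_at > 100
```

A second run against an unchanged source returns no rows, which is what makes the extraction restartable after a failure.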
Certifications:
DP-203 – Microsoft Certified: Azure Data Engineer Associate
DECLARATION
I hereby confirm that the information given above is true to the best of my knowledge.
Place: Chennai Jaswanth K