MADHAVI LATHA T
PROFESSIONAL SUMMARY
● Certified SnapLogic professional
● 8 years of IT experience evaluating, designing, configuring, developing, testing, and managing
integrations using the SnapLogic platform
● Strong ability to translate functional specifications and user stories into technical specifications.
● Good knowledge of web services such as SOAP and REST.
● Good experience building SnapLogic pipelines, error handling, scheduled tasks, and alerts.
● Excellent working experience with APIs and scripting.
● Experienced in working with offshore teams in multicultural, dynamic team environments.
● Good database knowledge, including Oracle RDBMS/PL-SQL and Snowflake.
● Demonstrated leadership in identifying and implementing appropriate technology solutions, solving
business problems, and supporting company strategies; mentors junior employees effectively.
● Effective both independently and in teams; a self-starter with high adaptability to new technologies.
TECHNICAL SKILLS
SnapLogic: SnapLogic Designer, SnapLogic pipelines, Manager, error handling,
scheduled tasks & alerts, Dashboard
Web Technologies: HTML, DHTML, JSP, XML, XSL
Operating Systems: Linux, UNIX
Databases: Oracle PL/SQL, RDBMS, Snowflake, SQL, SQL Server, MS Access
Web Services: SOAP and REST APIs
Other Environments: Data storage, ETL, Azure Cloud, AWS, scripting
EDUCATIONAL QUALIFICATIONS
● Bachelor of Commerce | Vikrama Simhapuri University (VSU) | 2017 | Rama Krishna PG College, Alluru,
Nellore District
● Master of Commerce | Vikrama Simhapuri University (VSU) | April 2019 | Rama Krishna PG College,
Alluru, Nellore District
PROFESSIONAL EXPERIENCE
Accenture Solutions | April 2023 to present | Packaged App Developer
Project: Humana
Responsibilities:
● Supervised and managed a team of 5 members
● Created detailed user stories and translated them into technical specifications
● Created automated pipelines in AWS CodePipeline to deploy Docker containers in AWS ECS using
services such as CloudFormation, CodeBuild, CodeDeploy, S3, and Puppet
● Responsible for developing, supporting, and maintaining ETL (Extract, Transform, Load)
processes using Informatica PowerCenter
● Set up data sharing between two Snowflake accounts
● Developed and delivered team training sessions to keep team members up to date
● Built SnapLogic pipelines to ingest data from a variety of sources such as Oracle, S3, and SFTP
● Made API calls to fetch backend data as JSON responses and parsed them on the UI for
DOM/CSS manipulation
● Created database objects such as tables, views, materialized views, procedures, and packages using
Oracle tools such as Toad, PL/SQL Developer, and SQL*Plus
● Set up S3 buckets for artifact management and data storage of batch files for a multi-cluster batch
application
● Handled patching and upgrade activities
● Worked on Continuous Integration (CI)/Continuous Delivery (CD) pipelines for Azure Cloud Services
using Chef
● Scheduled database tasks: jobs, alerts, emails, and notifications
● Performed Teradata Relational Database Management System (RDBMS) analysis with Business
Objects to develop reports, interactive drill-down charts, balanced scorecards, and dynamic dashboards
● Created IAM policies for delegated administration within Amazon Web Services (AWS) and
configured IAM users, roles, and policies to grant fine-grained access to AWS resources
● Created SSIS packages with error handling
● Managed Snap Pack version changes and downgraded versions when discrepancies occurred
● Created reports using SQL, MS Excel, UNIX, shell scripting, and Perl
● Supported 300+ production and non-production pipelines
● Maintained documentation of Snap Pack version changes, patching activities, and newly observed errors
Buzzworks Business Pvt Ltd | Mar 2021 to May 2022 | SnapLogic Engineer
Project: Mindtree: This project involved several key functional tasks. The objective of this program was to control
reservation allocation to stores for high-demand/low-supply products and to enable dynamically setting
allocation thresholds per store at the level of specified SKUs.
Responsibilities:
● Developed and supported SnapLogic pipelines for multiple systems on the GameStop project
● Improved batch performance through data processing using Perl, shell scripting, and Unix
● Built time-critical and real-time data processing jobs using SnapLogic Ultra Pipelines for the order
management application
● Built the logical and physical data models for Snowflake as per required changes
● Designed, developed, managed, and monitored SnapLogic pipelines to process FTO and e-commerce data
using the SnapLogic Elastic Integration iPaaS
● Extensively used ETL to extract data from source files (flat files and DB2) and load it
into the target database
● Resolved critical issues and Snaplex-related problems within SLA times
● Implemented Event Handlers and Error Handling in SQL Server Integration Services packages and
notified process results to various user communities.
● Managed Servers on the Amazon Web Services (AWS) platform instances using Puppet configuration
management.
● Identified, evaluated, PoC-designed, tested, and automated data storage architectures around object and
file storage, as well as cloud-integrated backup and recovery pipelines
● Generated server-side Oracle PL/SQL scripts for data manipulation and validation, and materialized views
for remote instances
● Documented simulation steps for debugging using shell scripting
● Worked with a variety of source and destination file types and systems
● Used ZooKeeper to manage offsets for the APIs and to prevent message loss when passing data from one
API to another in the system
● Performed regression testing and participated in load testing of pipelines with QA support
● Worked on heavy data loads from ETL to SnapLogic
● Defined Snowflake virtual warehouse sizing for different types of workloads
● Developed a dashboard for SnapLogic TIDAL job dependencies
● Created tasks and dependencies in SnapLogic through TIDAL
App icon IT | Jan 2019 to Feb 2021 | SnapLogic Consultant
Responsibilities:
● Started initially as a SnapLogic support engineer
● Helped establish ETL procedures and standards for objects, implemented performance
enhancements, and migrated objects from Development, QA, and Stage to Production environments
● Built S3 buckets, managed their policies, and used S3 and Glacier for data storage
and backup on AWS
● Worked on Oracle Databases, RedShift and Snowflake
● Created parent-child pipelines and nested pipelines
● Developed Oracle PL/SQL triggers and master tables for automatic creation of primary keys
● Created scripts for all identified workflows, incorporating proper error handling using
standard VuGen functions and C/Java/VB utility functions
● Developed Kafka messaging in APIs to act as producer and consumer using a Kafka wrapper
● Validated pipelines by testing input and output data
● Configured BGP routes to enable ExpressRoute connections between on-premises data centers and Azure
● Monitored 300+ pipelines daily
● Analyzed the source system and involved in designing the ETL data load.
● Performed backend scripting and parsing using Perl and Python, escalating issues to the corresponding
teams by email without delay to meet daily SLAs