
Name: SABITHA BEKKAM

Email: [email protected]
Phone: 510-458-6952

Summary:
• Over 7 years of Software Life Cycle experience in system analysis, design, development, implementation, maintenance, and production support of data warehouse applications using SQL, PL/SQL, SSIS, SSRS, Power BI, Alteryx, Azure Data Factory, and Python, with domain knowledge in Healthcare, Insurance, and Payments.
• AWS Certified cloud engineer with 7 years of experience in the IT industry spanning systems administration, change management, Software Configuration Management (SCM), build and release management, and process engineering.
• Extensively involved throughout the Software Development Life Cycle (SDLC), from initial planning through project implementation.
• Experience in coordinating with users to gather source data files, define data elements, business rules, and data and process flow documentation, and to prepare high-level and low-level design documents.
• Experienced in automating, configuring, and deploying instances in AWS and Azure environments and data centers; familiar with EC2, CloudWatch, CloudFormation, and managing security groups on AWS.
• Private Cloud Environment - Leveraging AWS and Puppet to rapidly provision internal computer
systems for various clients.
• Developed Puppet modules and roles/profiles for installation and configuration of the software required for various applications/blueprints.
• Over 7 years of programming experience as an Oracle SQL/PL/SQL developer in the analysis, design, and implementation of business applications, with expertise in client-server application development using Oracle 11g/10g/9i/8i and PL/SQL.
• Experience in designing, developing, analyzing, and implementing client-server, web, and desktop-based applications using the C# language.
• Developed pipelines through scripts in StreamSets ETL. Extensive ETL/Informatica PowerCenter and data integration experience developing ETL mappings and scripts using Informatica PowerCenter 10.x/9.x/8.x/7.x and IDQ. Experience in data extraction from heterogeneous sources using Teradata and Informatica PowerCenter.
• 5 years of experience as an Oracle EBS application developer in the implementation, customization, and upgrade of Oracle E-Business Suite Release 12.
• Strong knowledge of Azure storage schematics such as Gen1 and Gen2; experience using Snowflake to build data marts with the data residing in Azure Storage.
• Worked on migrating table DDLs, views, and stored procedures from Hive to SQL, Netezza to Snowflake, and Teradata to Snowflake; designed and implemented a fully operational, production-grade, large-scale data solution on the Snowflake Data Warehouse.
• Generated the underlying data for reports and exported cleaned data from Excel spreadsheets, flat files, MS Access, and CSV files to the data warehouse.
• Ability to create SSIS packages using transformations such as Lookup, Derived Column, Conditional Split, Data Conversion, Aggregate, Merge Join, and Sort.
• Database development for both OLTP and OLAP systems using MSSQL Server.
• Created and edited maps for various retail transactions including EDI 810, 820, and 832. Performed setup, troubleshooting, and data-validation configuration to ensure that all inbound and outbound EDI transactions meet trading-partner requirements.
• Worked on EDI 834 enrollment and reconciliation implementations based on the Affordable Care Act-compatible companion guide; cross-referenced EDI trading-partner items against customer records.
• Provided training on the EDI 834 and 820 transactions and on the reconciliation processes.
• Experience in database backup, recovery, and disaster recovery procedures. Experienced in creating and using stored procedures, triggers, views, user-defined functions, sub-queries, and joins for complex queries involving multiple tables, with exception handlers.
• Hands-on experience in Azure cloud services (PaaS & IaaS): Storage, Web Apps, Active Directory, Application Insights, Logic Apps, Data Factory, Service Bus, and Azure Monitoring.
• Strong understanding of Release Management process and required applications.
• Wrote automation code for translations using JavaScript in an Eclipse phrase tree to automate translation from Teradata, Hive, T-SQL, and PostgreSQL to Snowflake, and from Azure to Synapse.
• Hands-on experience with unified data analytics in the Databricks workspace user interface, managing Databricks notebooks.
• Tested HIPAA EDI 834 and 270/271 transactions according to the test scenarios and verified the data in the different modules.
• Expertise in DWH technical architecture, design, business requirement definition, and data modeling. Able to load and export data using Teradata utilities such as TPT, FastLoad, MultiLoad, FastExport, and TPump.
• Experience in UNIX shell scripting, job scheduling, and server communication. Able to design schemas for BigQuery.
• Extensive experience in implementing projects using Agile (Kanban) and Waterfall methodologies.

CERTIFICATIONS:
AWS Certified Developer - Associate

Technical skills:
ETL Tools: Informatica 10.1.1/10.1/9.6/9.1/8.6.1/8.1 (Source Analyzer, Mapping Designer, Workflow Monitor, Workflow Manager, Data Cleansing, Data Quality, Repository, Metadata), StreamSets, Data Mart, OLAP, OLTP, SQL Server SSIS

Data Modeling Tools: Erwin

BI Tools: Cognos, Business Objects

Databases: Oracle 11g/10g/9i/8i, IBM DB2, MS SQL Server (2008, 2005, 2000, 7.0, 6.5, 6.0), MS Access, DTS, Snowflake, Hive, Netezza; T-SQL, PL/SQL

Other Languages/Applications: 8.4.X to 9.5.X, C, C++, C#, Visual Basic 6, Oracle Applications R12.0.6

Other Tools: Toad, SQL Developer, Crystal Reports, SQL Assistant, Alteryx, SQL Server 2012 Reporting Services (SSRS), Hadoop YARN, Spark, Spark Streaming

Programming Languages: SQL, Java, PL/SQL, T-SQL, UNIX Shell Scripting

Job Scheduling: Shell Scripting, Autosys, Tidal, Control-M

Environment: MS Windows 2012/2008/2005, UNIX

AWS (Amazon Web Services): Certified Associate Developer; EC2, VPC, ELB, S3, EBS, RDS, Route 53, CloudFormation, Auto Scaling, Lambda; AWS CLI, Jenkins, Chef, Terraform, Nginx, Tomcat, JBoss

DBA Tools: SQL, Erwin, TOAD, PL/SQL, T-SQL, VMware

Professional Experience
Fannie Mae
Washington, DC / Virginia September 2022 - Present

Role: Sr. AWS SQL Developer


• Developed and maintained automated testing frameworks using AWS services such as CodeBuild and CodeDeploy, resulting in a 50% reduction in testing time and a 25% increase in test coverage.
• Built S3 buckets and managed policies for S3 buckets and used S3 bucket and Glacier for storage
and backup on AWS.
• Work with other teams to help develop the Puppet infrastructure to conform to various
requirements including security and compliance of managed servers.
• Perform troubleshooting and monitoring of the Linux server on AWS using Splunk.
• Management and administration of AWS services: CLI, EC2, VPC, S3, ELB, Glacier, Route 53, CloudTrail, IAM, and Trusted Advisor.
• Worked on JIRA for defect/issues logging & tracking and documented all my work using
CONFLUENCE.
• Used services such as GitHub, AWS CodePipeline, Jenkins, and AWS Elastic Beanstalk to create a deployment pipeline.
• Tune SQL statements using hints for maximum efficiency and performance; create and maintain/modify PL/SQL packages; mentor others in the creation of complex SQL statements; perform data modeling; and create, maintain, and modify complex database triggers and data migration scripts. Made changes in the database according to new requirements (new tables in the existing database and new fields in existing tables).
• Constructed and implemented multiple-table links requiring complex join statements, including outer joins and self joins (see the sketch below).
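A minimal sketch of those join patterns, using hypothetical employees/departments tables (the names are illustrative, not taken from the project):

    -- Self-join: pair each employee with their manager (top-level managers have no match).
    SELECT e.name AS employee,
           m.name AS manager
    FROM   employees e
    LEFT OUTER JOIN employees m
           ON e.manager_id = m.emp_id;

    -- Outer join: list every department, including those with no employees yet.
    SELECT d.dept_name,
           COUNT(e.emp_id) AS headcount
    FROM   departments d
    LEFT OUTER JOIN employees e
           ON e.dept_id = d.dept_id
    GROUP BY d.dept_name;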
• Developed Shell scripts for job automation and daily backup.
• Documented the business process through Designer. Created, debugged, and modified stored procedures, triggers, tables, views, and user-defined functions.
• Good experience in architecting and configuring cloud VPCs with private and public subnets in AWS.
• Built and coordinated an automated build & release CI/CD process using GitLab, Jenkins, and Puppet on hybrid IT infrastructure. Involved in designing and developing with Amazon EC2, Amazon S3, Amazon RDS, Elastic Load Balancing, Amazon SWF, Amazon SQS, and other AWS infrastructure services.
• Running build jobs and integration tests on Jenkins Master/Slave configuration.
• Managed Servers on the Amazon Web Services (AWS) platform instances using Puppet
configuration management.
• Involved in maintaining the reliability, availability, and performance of Amazon Elastic Compute
Cloud (Amazon EC2) instances.
• Conduct systems design, feasibility and cost studies and recommend cost-effective cloud solutions
such as Amazon Web Services (AWS).
• Responsible for monitoring AWS resources using CloudWatch and application resources using Nagios.
• Created AWS Multi-Factor Authentication (MFA) for instance RDP/SSH logon, worked with teams
to lockdown security groups.
• Wrote UNIX shell scripts to automate jobs and scheduled cron jobs for job automation using Crontab.
• Wrote SQL queries and cursors using embedded SQL and PL/SQL; tested and debugged the applications.
• Separated tables and indexes onto different locations. Tuned applications and the database using EXPLAIN PLAN, TKPROF, SQL TRACE, ANALYZE, and hints; performed Oracle query tuning and optimization, with extensive use of bulk binds and BULK COLLECT (sketched below).
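A sketch of that tuning workflow, assuming a hypothetical orders table and index:

    -- Capture and display the optimizer's plan for a hinted statement.
    EXPLAIN PLAN FOR
      SELECT /*+ INDEX(o orders_status_idx) */ *
      FROM   orders o
      WHERE  o.status = 'OPEN';
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- Bulk binds: fetch the candidate rows in one round trip, then apply the
    -- DML across the whole collection with FORALL instead of row-by-row loops.
    DECLARE
      TYPE t_ids IS TABLE OF orders.order_id%TYPE;
      l_ids t_ids;
    BEGIN
      SELECT order_id BULK COLLECT INTO l_ids
      FROM   orders
      WHERE  status = 'CLOSED';

      FORALL i IN 1 .. l_ids.COUNT
        DELETE FROM orders WHERE order_id = l_ids(i);
    END;
    /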
• Utilized tools like TOAD during development of the application.
• Used database triggers to keep a history of inserts, updates, and deletes, and for all kinds of audit routines (a sketch follows).
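A minimal sketch of that audit-trigger pattern; account and account_audit are hypothetical tables, not names from the project:

    CREATE OR REPLACE TRIGGER trg_account_audit
    AFTER INSERT OR UPDATE OR DELETE ON account
    FOR EACH ROW
    DECLARE
      l_action VARCHAR2(10);
    BEGIN
      l_action := CASE
                    WHEN INSERTING THEN 'INSERT'
                    WHEN UPDATING  THEN 'UPDATE'
                    ELSE                'DELETE'
                  END;
      -- Record who changed what, and when, for every affected row.
      INSERT INTO account_audit (account_id, action, changed_by, changed_at)
      VALUES (COALESCE(:NEW.account_id, :OLD.account_id),
              l_action, USER, SYSTIMESTAMP);
    END;
    /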
• Extensively used advanced PL/SQL collection features such as nested tables.

Next Pathway, New York, NY February 2021 - September 2022
Role: Sr. SQL Developer/ Data Engineer
Responsibilities:
• Expert knowledge of the Teradata platform and associated tools. Expert design/coding skills and unit-testing methodologies and techniques.
• Demonstrated competency in all phases of business intelligence and data warehousing projects,
from inception to production deployment. Solid understanding of data warehousing principles,
concepts, and best practices (e.g., ODS, Data Marts, Staging)
• Worked on migrating table DDLs, views, and stored procedures from Hive to SQL, Netezza to Snowflake, Teradata to Snowflake, Teradata to Synapse, and PostgreSQL to Snowflake using a tool called SHIFT.
• Wrote automation code to translate source code using JavaScript in an Eclipse phrase tree, automating code translation from various databases to the cloud environment, Snowflake.
• Knowledge of JSON and PostgreSQL support for JSON. Experience with NoSQL databases (e.g.
Apache Cassandra) and NoSQL support in PostgreSQL
• Analyzed and reviewed business & functional requirement documents and the technical specifications for the HIPAA EDI transactions.
• Developed a data warehouse model in Snowflake for over 100 datasets using WhereScape.
• Validated data from SQL Server to Snowflake to ensure an apples-to-apples match.
• Developed stored procedures/views in Snowflake and used them in Talend for loading dimensions and facts. Designed, developed, tested, implemented, and supported data warehousing ETL using Talend.
• Experience with Snowflake multi-cluster warehouses (see the DDL sketch below); built the logical and physical data models for Snowflake as changes required.
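For illustration, Snowflake multi-cluster warehouse DDL of the kind that bullet implies; the name and limits here are assumptions, not project values:

    CREATE WAREHOUSE IF NOT EXISTS etl_wh
      WAREHOUSE_SIZE    = 'MEDIUM'
      MIN_CLUSTER_COUNT = 1
      MAX_CLUSTER_COUNT = 4          -- scale out under concurrent load
      SCALING_POLICY    = 'STANDARD'
      AUTO_SUSPEND      = 300        -- seconds idle before suspending
      AUTO_RESUME       = TRUE;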
• Developed Flat format/xml/EDI maps for various transactions like 810, 850, 832. Maintained EDI
documentation such as work instructions and procedures.
• Created reports utilizing SSRS, Excel services, Power BI and deployed them on SharePoint Server as
per business requirements.
• Build DataSync job on Windows Azure to synchronize data from SQL 2012 databases to SQL Azure.
• Creating and Managing Virtual Machines in Windows Azure and setting up communication with the
help of Endpoints and VM Migrations from Transitional hosts on Canada Boxes.
• Updated the instance counts for the Red Box services in the production Azure subscription.
• Designed system for the Azure platform to ensure massive compute resources are available to
allow NKR to scale internationally. Involved in writing complex SQL Scripts using excel sheets for
loading data into maintenance tables.
• Work in progress on POC to migrate to Windows Azure to address scalability and performance
issues.
• Extensively used SSIS Data Profiler Task to profile target source systems (tables & views) prior to
building ETL solutions, which was instrumental in cleaning data for data consistency and designing
table structures.
• Built various SSIS packages having different tasks and transformations for various clients and
Scheduled SSIS packages. Created SSIS packages to validate, extract, transform and load data from
Oracle into SQL Server database. Used SSIS, Import/Export to Copy, move data from one server to
another, Excel to SQL Server.
• Used SQL Azure extensively for customer-lookup and other database needs. Moderated and contributed to the support forums (Azure Networking, Azure Active Directory, Azure Storage) for the Microsoft Developer Network, including partners and MVPs.
• Worked on Google Cloud components, Google Container Builder, and GCP client libraries. Architected infrastructure on the Google platform using GCP services and automated GCP infrastructure using Cloud Deployment Manager.
• Scheduled tasks in Windows Task Scheduler to run Python scripts that generate reports at frequent intervals and send email alerts.
• Worked on developing applications in Hadoop and big data technologies: Pig, Hive, Kafka, Spark, and Scala. Involved in loading data into the Hadoop Distributed File System and Pig in order to preprocess the data.
• Responsible for estimating cluster size and for monitoring and troubleshooting the Spark Databricks cluster.
• Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to the data marts.
• Responsible for sending quality data through a secure channel to downstream systems using role-based access control and StreamSets.
• Built pipeline solutions to integrate data from multiple heterogeneous systems using StreamSets Data Collector and Azure.
• Very good knowledge of RDBMS topics; able to write complex queries, stored procedures, functions, packages, and triggers using SQL and PL/SQL.
• Experience with Oracle-supplied packages, dynamic SQL, records, and PL/SQL tables. Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.
• Experience in various methodologies like Waterfall and Agile.
• Experience in working on Unix/Linux operating systems.

Environment: Informatica PowerCenter 10.1.1, Snowflake, Hive, Netezza, T-SQL, Oracle 11g, Azure, IDQ, UNIX, PL/SQL, SQL*Plus, TOAD, Teradata 14.0, MS Excel, Active Batch V12 Console, Cognos, BigQuery, SQL Server Management Studio 2016, Hadoop YARN, Spark, Spark Streaming, AWS (EC2, VPC, ELB, S3, EBS, RDS, Route 53, CloudFormation, Auto Scaling, Lambda), StreamSets, Python, Git, SQL, Jira, ASP.NET Core.

Cigna HealthSpring, Nashville, TN July 2018 - January 2021


Role: Sr. SQL Developer

Responsibilities:
• Worked with business analysts for requirement gathering, business analysis, and translated the
business requirements into technical specifications to build the Enterprise data warehouse.
• Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to the data marts.
• Worked with the DBA team on the TDM process; the Data Acquisition TDM tool enables rapid development of ETL processes by copying one or more tables from SQL Server or flat files to any Teradata core system.
• In the TDM development tool, worked on ABC metadata entries, table/view generation, and Active Batch development.
• Analyze requirements provided, design and develop integration packages as per the client request
and automate them.
• Wrote complex queries to incorporate the business logic and retrieve all the certification and candidate details for further reporting purposes.
• Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the
performance of the conversion mapping.
• Created the schema, implemented all stored procedures and 4 SSIS packages, created the deployment scripts, and managed the code in Microsoft Team Foundation Server.
• Performed unit tests on all code and packages.
• Updated existing and created new reports using Microsoft SQL Server Reporting Services. The team
consisted of 2 developers.
• Performed front line code reviews for other development teams.
• Converted the User Requirements into Business Requirements Document, Functional and Technical
Requirements Documents.
• Created Business Process Models from the requirement specifications.
• Worked on developing Tables, Views, Indexes, Stored Procedures, Triggers, Queries, and Macros
using MS SQL Server and Oracle databases.
• Involved in partitioning tables that take bulk inserts, deletes, and updates to improve performance (a T-SQL sketch follows).
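A minimal T-SQL sketch of monthly range partitioning for a bulk-modified table (names and boundary dates are hypothetical):

    CREATE PARTITION FUNCTION pf_monthly (date)
    AS RANGE RIGHT FOR VALUES ('2020-01-01', '2020-02-01', '2020-03-01');

    CREATE PARTITION SCHEME ps_monthly
    AS PARTITION pf_monthly ALL TO ([PRIMARY]);

    -- Rows are placed in partitions by claim_date, so bulk deletes and loads
    -- touch only the affected partitions.
    CREATE TABLE dbo.fact_claims
    (
        claim_id   BIGINT        NOT NULL,
        claim_date DATE          NOT NULL,
        amount     DECIMAL(12,2) NOT NULL
    ) ON ps_monthly (claim_date);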
• Extracted data from xml and flat files using SSIS.
• Scheduled Sessions and Batch Process based on demand, run on time, run only once using Informatica
Server Manager.
• Worked on data integration using Informatica for the extraction, transformation, and loading of data from various database source systems.
• Designed and developed efficient SSIS packages for processing fact and dimension tables using
transformations like lookup, merge, merge join, script component and slowly changing dimension.
• Created stored procedures and handled query performance issues.
• Handled clustering between two Production Servers and performed daily backups and developed
recovery procedures.
• Experienced in unit testing; tested T-SQL code and SSIS packages in the testing environment.
• Created PL/SQL scripts to extract data from the operational database into simple flat text files using the UTL_FILE package (sketched below).
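A sketch of that flat-file extraction pattern, assuming a hypothetical DATA_DIR directory object and customers table:

    DECLARE
      l_file UTL_FILE.FILE_TYPE;
    BEGIN
      l_file := UTL_FILE.FOPEN('DATA_DIR', 'customers.txt', 'w');
      FOR rec IN (SELECT customer_id, customer_name FROM customers) LOOP
        UTL_FILE.PUT_LINE(l_file, rec.customer_id || '|' || rec.customer_name);
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    EXCEPTION
      WHEN OTHERS THEN
        -- Close the file handle on any failure before re-raising.
        IF UTL_FILE.IS_OPEN(l_file) THEN
          UTL_FILE.FCLOSE(l_file);
        END IF;
        RAISE;
    END;
    /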
• Developed advanced PL/SQL packages, procedures, triggers, functions, indexes, and collections to implement business logic using SQL Navigator. Wrote conversion scripts using SQL, PL/SQL, stored procedures, functions, and packages to migrate data from a SQL Server database to an Oracle database.
• Configured Database Mail to send automatic mails to the relevant people when an SSIS process fails.
• Defined virtual warehouse sizing in Snowflake for different types of workloads (see the sizing sketch below).
• Scheduled Snowflake jobs using NiFi; used NiFi to ping Snowflake to keep client sessions alive.
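By way of example, per-workload warehouse sizing in Snowflake; warehouse names and sizes are assumptions, not project values:

    -- Separate warehouses isolate workloads; each is sized independently.
    CREATE WAREHOUSE IF NOT EXISTS load_wh
      WAREHOUSE_SIZE = 'LARGE'  AUTO_SUSPEND = 60  AUTO_RESUME = TRUE;
    CREATE WAREHOUSE IF NOT EXISTS report_wh
      WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 300 AUTO_RESUME = TRUE;

    -- Resize without downtime if a workload outgrows its warehouse.
    ALTER WAREHOUSE report_wh SET WAREHOUSE_SIZE = 'SMALL';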
• Experience with Snowflake multi-cluster warehouses and Snowflake virtual warehouses.
• Consulted on Snowflake data platform solution architecture, design, development, and deployment, focused on bringing a data-driven culture across enterprises.
• Development of REST APIs in Python and MariaDB, providing backend capabilities to interface with
OpenStack and other downstream APIs.
• Experience in reviewing Python code while running troubleshooting test cases and bug issues. Understood Python files in the OpenStack environment and made necessary changes where needed.
• Used Google Cloud Functions with Python to load data into BigQuery for CSV files arriving in a GCS bucket (see the equivalent SQL below).
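The function itself used the Python client; the equivalent load expressed in BigQuery SQL looks roughly like this (dataset, table, and bucket names are hypothetical):

    LOAD DATA INTO reporting.daily_feed
    FROM FILES (
      format = 'CSV',
      uris = ['gs://landing-bucket/incoming/*.csv'],
      skip_leading_rows = 1
    );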
• Administered MS SQL Server creating User Logins with appropriate Roles, dropping, and locking the
logins, monitoring User Accounts, granting privileges to users.
• Additional projects included automation of a manual process as the start of a data warehouse
project.

Environment: Informatica PowerCenter 10.1.1, Hadoop, Hive, Oracle 11g, Azure, IDQ, UNIX, PL/SQL, SQL*Plus, TOAD, Teradata 14.0, MS Excel, Active Batch V12 Console, Cognos, BigQuery, SQL Server Management Studio 2016, AWS (EC2, VPC, ELB, S3, EBS, RDS, Route 53, CloudFormation, Auto Scaling, Lambda), Git, SQL, Jira.

FedEx, Memphis, TN September 2017 – June 2018


Sr. SQL/ETL Developer
Responsibilities:
• Interacted actively with Business Analysts and Data Modelers on Mapping documents and
Design process for various Sources and Targets.
• Plan, design, and implement application database code objects, such as stored procedures and
views. Build and maintain SQL scripts, indexes, and complex queries for data analysis and
extraction.
• Provide database coding to support business applications using Sybase T-SQL. Perform quality
assurance and testing of SQL server environment. Involved in Analysis, Design, and
Development and Testing phases.
• In-depth knowledge and hands-on experience of RDBMS systems, specifically Sybase, Teradata, and/or
Oracle.
• Involved in setting up development standards for the power BI project. Experience with
performance tuning new and existing stored procedures. Developed and modified SQL queries
and stored procedures to meet business requirements.
• Performed tests and produced results to ensure accurate compliance with project request
requirements and analyzed the business requirements from the given specifications.
• Wrote stored procedures and functions using SQL Server 2012 and T-SQL. Maintained the various database objects (tables, views, stored procedures) in SQL Server 2012. Performed server optimization, tuned indexes and queries, and consolidated the SQL Server environment. Used maintenance plans on SQL Server databases to rebuild indexes and refresh statistics for performance (examples below).
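That maintenance work boils down to T-SQL commands like these (the table name is hypothetical):

    ALTER INDEX ALL ON dbo.Orders REBUILD;        -- recreate the table's indexes
    UPDATE STATISTICS dbo.Orders WITH FULLSCAN;   -- refresh optimizer statistics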
• Designed and developed efficient SSIS packages for processing fact and dimension tables using
transformations like Fuzzy lookup, lookup, merge, merge join, script component.
• Design and implementation of Security Model, policies and procedures. Query Optimization
using Query Analyzer, Profiler, show plan, Index Tuning and Stats Time Tool.
• Created and modified SQL Plus, PL/SQL and SQL Loader scripts for data conversions.
Developed and modified triggers, packages, functions and stored procedures for data
conversions and PL/SQL procedures to create database objects dynamically based on user
inputs. Created Snapshot Replication for the Development and Testing Databases across
different Servers.
• Configured, developed, and monitored GCP services and other cloud services (IAM, networking, etc.) to increase delivery speed by 65%.
• Assured system reliability by protecting GCP with various precautions, from employee security training to promptly deploying patches for system vulnerabilities.
• Integrated custom visuals based on business requirements using Power BI Desktop. Converted all operational Excel reports into Power BI reports. Worked with all the transformation types available in the Power BI query editor. Scheduled automatic refreshes in the Power BI service.
• Regular responsibilities included updating staging and dimensional databases as well as rebuilding the dimensions and cubes in Analysis Services (SSAS).
• Extensively used Teradata SQL Assistant to write different queries to understand the data in
source tables and supported during QA/UAT/PROD deployments and bug fixes.
• Coded various stored procedures for the purpose of application development. Design and
implement data access stored procedures and triggers for automating tasks.
• Extensively used joins and sub-queries to simplify complex queries involving multiple tables. Created stored procedures, triggers, views, and functions for applications.
• Migrated the legacy databases from SQL Server 2000 to SQL Server 2005 using SSIS.
• Wrote T-SQL statements for data retrieval and was involved in performance tuning of T-SQL queries and stored procedures. Troubleshot complex queries and stored procedures. Implemented and enforced best practices, standard operating procedures, etc.
• Reviewed code to identify SQL changes to improve system performance. Handled
performance tuning and break/fix of existing T-SQL code. Troubleshot Reports by cross
checking Report Layouts, Stored Procedures and SSIS Packages. Developed several Complex
Reports using SQL Server Reporting Services (SSRS).
• Used Execute SQL Task, Data Flow Task, Execute Package Task, etc.
• Hands-on experience in Python, SQL, Azure, and Databricks. Experience building data platforms using the Azure stack.
• Database design skills including normalization theory, star schema design, and data modeling tools, with proven experience in process and data flows and forward/reverse engineering of logical/physical RDBMS layouts.
• Expertise in working with Teradata and SQL: PL/SQL stored procedures, Oracle stored procedures, packages, cursors, triggers, tables, constraints, views, indexes, sequences, and synonyms in a distributed environment.
• Extensive experience with cloud computing technologies and a proven track record of consuming AWS, GCP, and similar public cloud environments.
• Strong experience in automating Vulnerability Management patching and CI/CD using Chef
and other tools like GitLab, Jenkins, and AWS/Open Stack.
• In depth Knowledge of AWS cloud service like Compute, Network, Storage, and Identity &
access management.
• Hands-on Experience in configuration of Network architecture on AWS with VPC, Subnets,
Internet gateway, NAT, Route table.
• Created ad-hoc reports, drill-down reports, and pivot and tabular reports. Upgraded SQL Server 2000 DTS packages to SSIS for uploading various file formats into MS SQL Server 2005 databases using SQL Server Integration Services (SSIS).

Environment: Informatica Power Center 9.6, Oracle 11g, Azure, Autosys, IDQ, Big Query, Cognos, AWS,
UNIX, PL/SQL, IDQ, Teradata V 13.0, SQL* PLUS, TOAD, Teradata SQL Assistant, MS Excel.
SunTrust Bank, Alpharetta, GA Jan 2016 - July 2017
SQL Developer
Responsibilities:
• Effective communication with data architects, designers, application developers, and senior management in order to collaborate on projects that involve multiple teams in a highly time-sensitive environment.
• Effectively involved in the allocation and review of various development activities/tasks with onshore counterparts. Assisted in defining database requirements; analyzed existing models and reports looking for opportunities to improve their efficiency, and troubleshot various performance issues.
• High proficiency with SQL, Transact-SQL, stored procedures, and Relational Database Management Systems (RDBMS).

• Developed highly optimized stored procedures, functions, and database views to implement business logic, and created clustered and non-clustered indexes. Involved in performance monitoring, tuning, and capacity planning.
• Advised optimization of slow performance queries by looking at Execution Plan for better tuning of the
database. Translated business requirements into BI application designs and solutions.
• Created SSIS package to load data from Flat Files, Excel and XML Files to Data warehouse and Report-
Data mart using Lookup, Derived Columns, Sort, Aggregate, Pivot Transformation, and Slowly Changing
Dimension.
• Used various SSIS tasks such as Conditional Split, Multicast, Fuzzy Lookup etc., which did Data
Scrubbing, including data validation checks during Staging, before loading the data into the Data
warehouse.
• In-depth RDBMS experience in Microsoft SQL Server; Expert T-SQL programming skills including query
optimization, stored procedures, views and functions; Ability to read, interpret and improve query
execution plan.
• Responsible for implementation of data viewers, Logging, error configurations for error handling the
packages. Involved in complete life cycle in creating SSIS packages, building, deploying and executing
the packages in both the environments (Development and Production).
• Created and maintained data flow diagrams for existing and future ETL processes. Designed, deployed,
and maintained complex canned reports using SQL Server 2008 Reporting Services (SSRS).
• Customized reports by adding Filters, Calculations, Prompts, Summaries and Functions. Created
reports to retrieve data using Stored Procedures that accept parameters.
• Created parameterized queries; generated tabular reports, sub-reports, cross-tabs, and drill-down reports using expressions, functions, charts, maps, sorting, data source definitions, and subtotals for the reports.
• Published PDF reports from the Report Server to the SharePoint portal. Maintained and tuned Teradata production and development systems.
• Understood the business logic behind every piece of code and documented requirements in a reverse-engineering fashion.
• Optimized query performance, session performance, and reliability; performed performance tuning of Informatica components for daily and monthly incremental-load tables. Documented and presented the production/support documents for the developed components when handing the application over to the production support team.

Environment: Informatica Power Center, Teradata SQL Assistant 12.0, Teradata V12.0R2, AWS, Oracle10g/9i,
MS SQL server 2005/2012, Business Objects, Autosys, Toad 7.6, SQL, PL/SQL, Unix Shell Scripting, Windows.


Education Details:

Masters: Master's in Engineering Management, 2017, Christian Brothers University, TN
Bachelors: Bachelor's in Computer Science and Engineering, 2015, JNTU, Hyderabad
