Sabitha Bekkam
AWS SQL Developer
Email: [email protected]
Phone: 510-458-6952
Summary:
• Over 7 years of Software Life Cycle experience in system analysis, design, development, implementation, maintenance, and production support of data warehouse applications using SQL, PL/SQL, SSIS, SSRS, Power BI, Alteryx, Azure Data Factory, and Python, with domain knowledge in Healthcare, Insurance, and Payments.
• AWS-certified cloud engineer with around 7 years of experience in the IT industry, spanning systems administration, change management, Software Configuration Management (SCM), build and release management, and process engineering.
• Extensively involved throughout the Software Development Life Cycle (SDLC), from initial planning through project implementation.
• Experience in coordinating with users to gather source data files, define data elements and business rules, document data and process flows, and prepare high-level and low-level design documents.
• Experienced in automating, configuring, and deploying instances in AWS and Azure environments and data centers; familiar with EC2, CloudWatch, CloudFormation, and managing security groups on AWS.
• Private cloud environments: leveraged AWS and Puppet to rapidly provision internal computer systems for various clients.
• Developed Puppet modules and roles/profiles to install and configure the software required by various applications/blueprints.
• Over 7 years of programming experience as an Oracle SQL and PL/SQL developer in the analysis, design, and implementation of business applications, with expertise in client-server application development using Oracle 11g/10g/9i/8i and PL/SQL.
• Experience in designing, developing, analyzing, and implementing client-server, web, and desktop applications using C#.
• Developed pipelines through scripts in StreamSets ETL. Extensive ETL/Informatica PowerCenter and data integration experience developing ETL mappings and scripts using Informatica PowerCenter 10.x/9.x/8.x/7.x and IDQ, and in data extraction from heterogeneous sources using Teradata and Informatica PowerCenter.
• 5 years of experience as an Oracle EBS application developer in the implementation, customization, and upgrade of Oracle E-Business Suite R12 applications.
• Strong knowledge of Azure storage generations such as Data Lake Storage Gen1 and Gen2. Experience using Snowflake to build data marts from data residing in Azure storage.
• Worked on migrating table DDLs, views, and stored procedures from Hive to SQL, Netezza to Snowflake, and Teradata to Snowflake; designed and implemented a fully operational, production-grade, large-scale data solution on the Snowflake data warehouse.
• Generated the underlying data for reports and exported cleaned data from Excel spreadsheets, flat files, MS Access, and CSV files to the data warehouse.
• Ability to create SSIS packages using transformations such as Lookup, Derived Column, Conditional Split, Data Conversion, Aggregate, Merge Join, and Sort.
• Database development for both OLTP and OLAP systems using MS SQL Server.
• Created and edited maps for various retail EDI transactions, including 810, 820, and 832. Performed setup, troubleshooting, and data validation configuration to ensure that all inbound and outbound EDI transactions meet specifications.
• Worked on EDI 834 enrollment and reconciliation implementations based on the Affordable Care Act-compatible companion guide, and cross-referenced EDI trading partner items to customer records.
• Provided training on the EDI 834 and 820 transactions and on the reconciliation processes.
• Experience in database backup, recovery, and disaster recovery procedures. Experienced in creating and using stored procedures, triggers, views, user-defined functions, sub-queries, and joins for complex queries involving multiple tables, with exception handlers.
• Hands-on experience with Azure cloud services (PaaS and IaaS): Storage, Web Apps, Active Directory, Application Insights, Logic Apps, Data Factory, Service Bus, and Azure Monitor.
• Strong understanding of Release Management process and required applications.
• Wrote automation code in JavaScript (in Eclipse) against parse trees to automate SQL translation from Teradata, Hive, T-SQL, and PostgreSQL to Snowflake, and from Azure to Synapse.
• Hands-on experience with unified data analytics in the Databricks workspace user interface, including managing Databricks notebooks.
• Tested HIPAA EDI 834 and 270/271 transactions according to the test scenarios and verified the data in the different modules.
• Expertise in DWH technical architecture, design, business requirement definition, and data modeling. Able to load and export data using Teradata utilities such as TPT, FastLoad, MultiLoad, FastExport, and TPump.
• Experience in UNIX shell scripting, job scheduling, and server communication. Able to design schemas for BigQuery.
• Extensive experience in implementing projects using Agile (Kanban) and Waterfall methodologies.
CERTIFICATIONS:
AWS Certified Developer - Associate
Technical skills:
ETL Tools: Informatica 10.1.1/10.1/9.6/9.1/8.6.1/8.1 (Source Analyzer, Mapping Designer, Workflow Monitor, Workflow Manager, Data Cleansing, Data Quality, Repository, Metadata), StreamSets, Data Mart, OLAP, OLTP, SQL Server SSIS
Databases: Oracle 11g/10g/9i/8i, IBM DB2, MS SQL Server (2008, 2005, 2000, 7.0, 6.5, 6.0), MS Access, DTS, Snowflake, Hive, T-SQL, Netezza, PL/SQL 8.4.x to 9.5.x, C, C++, C#, Visual Basic 6, Visual Basic, Oracle Applications R12.0.6
Other Tools: TOAD, SQL Developer, Crystal Reports, SQL Assistant, Alteryx, SQL Server 2012 Reporting Services (SSRS), Hadoop YARN, Spark, Spark Streaming
Programming Languages: SQL, Java, PL/SQL, T-SQL, UNIX Shell Scripting
AWS (Amazon Web Services): Certified Associate Developer; EC2, VPC, ELB, S3, EBS, RDS, Route 53, CloudFormation, AWS Auto Scaling, Lambda, AWS CLI, Jenkins, Chef, Terraform, Nginx, Tomcat, JBoss
DBA Tools: SQL, Erwin, TOAD, PL/SQL, T-SQL, VMware
Professional Experience
Fannie Mae
Washington, DC / Virginia September 2022 - Present
Environment: Informatica PowerCenter 10.1.1, Snowflake, Hive, Netezza, T-SQL, Oracle 11g, Azure, IDQ, UNIX, PL/SQL, SQL*Plus, TOAD, Teradata 14.0, MS Excel, ActiveBatch V12 Console, Cognos, BigQuery, SQL Server Management Studio 2016, Hadoop YARN, Spark, Spark Streaming, AWS (EC2, VPC, ELB, S3, EBS, RDS, Route 53, CloudFormation, AWS Auto Scaling, Lambda), StreamSets, Python, Git, SQL, Jira, ASP.NET Core.
Responsibilities:
• Worked with business analysts on requirement gathering and business analysis, and translated the business requirements into technical specifications to build the enterprise data warehouse.
• Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to the data marts.
• Worked with the DBA team on the TDM process; the Data Acquisition TDM tool enables rapid development of ETL processes by copying one or more tables from SQL Server or flat files to any Teradata core system.
• Used the TDM development tool for ABC metadata entries, table/view generation, and ActiveBatch development.
• Analyzed the requirements provided, then designed, developed, and automated integration packages per client requests.
• Wrote complex queries to incorporate the business logic and retrieve all certification and candidate details for further reporting purposes.
• Extensively used the TOAD utility to execute SQL scripts and worked on SQL to enhance the performance of the conversion mapping.
• Created the schema, implemented all stored procedures and four SSIS packages, created the deployment scripts, and managed the code in Microsoft Team Foundation Server.
• Performed unit tests on all code and packages.
• Updated existing reports and created new ones using Microsoft SQL Server Reporting Services, on a team of two developers.
• Performed front-line code reviews for other development teams.
• Converted the user requirements into a Business Requirements Document and Functional and Technical Requirements Documents.
• Created Business Process Models from the requirement specifications.
• Worked on developing Tables, Views, Indexes, Stored Procedures, Triggers, Queries, and Macros
using MS SQL Server and Oracle databases.
• Involved in partitioning tables that have bulk inserts, deletes, and updates to improve performance, as sketched below.
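A hedged T-SQL sketch of this partitioning pattern; the table, boundary dates, and filegroup mapping are hypothetical, not taken from the project:

    -- Hypothetical monthly range partitioning for a table with heavy bulk DML.
    CREATE PARTITION FUNCTION pf_LoadDate (date)
        AS RANGE RIGHT FOR VALUES ('2023-01-01', '2023-02-01', '2023-03-01');

    -- Map every partition to the PRIMARY filegroup for simplicity.
    CREATE PARTITION SCHEME ps_LoadDate
        AS PARTITION pf_LoadDate ALL TO ([PRIMARY]);

    -- Creating the table on the scheme lets bulk deletes switch or truncate
    -- whole partitions instead of touching individual rows.
    CREATE TABLE dbo.FactSales (
        SaleID   bigint        NOT NULL,
        LoadDate date          NOT NULL,
        Amount   decimal(18,2) NULL
    ) ON ps_LoadDate (LoadDate);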
• Extracted data from XML and flat files using SSIS.
• Scheduled sessions and batch processes (on demand, run on time, run only once) using Informatica Server Manager.
• Worked on data integration using Informatica for the extraction, transformation, and loading of data from various database source systems.
• Designed and developed efficient SSIS packages for processing fact and dimension tables using transformations such as Lookup, Merge, Merge Join, Script Component, and Slowly Changing Dimension; a T-SQL analogue of the SCD logic is sketched below.
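The Slowly Changing Dimension component works graphically inside SSIS; as a hedged illustration only, the same Type 1 (overwrite) logic can be expressed in T-SQL with a MERGE, using hypothetical dimension and staging tables:

    -- Hypothetical Type 1 SCD upsert: overwrite changed attributes, insert new members.
    MERGE dbo.DimCustomer AS tgt
    USING stg.Customer AS src
        ON tgt.CustomerKey = src.CustomerKey
    WHEN MATCHED AND (tgt.City <> src.City OR tgt.Segment <> src.Segment) THEN
        UPDATE SET tgt.City = src.City, tgt.Segment = src.Segment
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerKey, City, Segment)
        VALUES (src.CustomerKey, src.City, src.Segment);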
• Created stored procedures and handled query performance issues.
• Handled clustering between two production servers, performed daily backups, and developed recovery procedures.
• Performed unit testing of T-SQL code and SSIS packages in the testing environment.
• Created PL/SQL scripts to extract data from the operational database into simple flat text files using the UTL_FILE package, as sketched below.
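A minimal PL/SQL sketch of this UTL_FILE extract pattern; the EXPORT_DIR directory object, file name, and employees table are hypothetical:

    -- Hypothetical flat-file extract: write query results as pipe-delimited text.
    DECLARE
        v_file UTL_FILE.FILE_TYPE;
    BEGIN
        v_file := UTL_FILE.FOPEN('EXPORT_DIR', 'employees.txt', 'W');
        FOR rec IN (SELECT employee_id, last_name FROM employees) LOOP
            UTL_FILE.PUT_LINE(v_file, rec.employee_id || '|' || rec.last_name);
        END LOOP;
        UTL_FILE.FCLOSE(v_file);
    EXCEPTION
        WHEN OTHERS THEN
            IF UTL_FILE.IS_OPEN(v_file) THEN
                UTL_FILE.FCLOSE(v_file);
            END IF;
            RAISE;
    END;
    /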
• Developed advanced PL/SQL packages, procedures, triggers, functions, indexes, and collections to implement business logic using SQL Navigator. Wrote conversion scripts using SQL, PL/SQL, stored procedures, functions, and packages to migrate data from a SQL Server database to an Oracle database.
• Configured Database Mail to send automatic notifications to the responsible people when an SSIS process fails; a sketch follows below.
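A hedged T-SQL sketch of such a failure notification, typically invoked from an OnError event handler or a job step; the mail profile and recipient list are hypothetical:

    -- Hypothetical failure alert via Database Mail (requires a configured mail profile).
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'ETL_Alerts',
        @recipients   = 'etl-support@example.com',
        @subject      = 'SSIS package failure',
        @body         = 'The nightly load package failed; check the SSIS catalog logs.';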
• Defined virtual warehouse sizing in Snowflake for different types of workloads.
• Scheduled different Snowflake jobs using NiFi, and used NiFi to ping Snowflake to keep client sessions alive.
• Have experience with Snowflake multi-cluster warehouses and virtual warehouses; a sizing sketch follows below.
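A hedged Snowflake SQL sketch of multi-cluster virtual warehouse sizing; the warehouse name, size, and limits are illustrative only:

    -- Hypothetical multi-cluster warehouse: scales out for concurrency,
    -- suspends itself when idle to save credits.
    CREATE WAREHOUSE IF NOT EXISTS reporting_wh
        WITH WAREHOUSE_SIZE = 'MEDIUM'
             MIN_CLUSTER_COUNT = 1
             MAX_CLUSTER_COUNT = 3
             SCALING_POLICY = 'STANDARD'
             AUTO_SUSPEND = 300  -- seconds of inactivity before suspending
             AUTO_RESUME = TRUE;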
• Consulted on Snowflake data platform solution architecture, design, development, and deployment, focused on bringing a data-driven culture across the enterprise.
• Developed REST APIs in Python backed by MariaDB, providing backend capabilities to interface with OpenStack and other downstream APIs.
• Experience in reviewing Python code, running troubleshooting test cases, and working bug issues. Understood Python files in the OpenStack environment and made necessary changes where needed.
• Used Google Cloud Functions with Python to load data into BigQuery for CSV files arriving in a GCS bucket; the SQL form of the load is sketched below.
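The Cloud Function itself was written in Python; keeping these sketches in SQL, the equivalent load can be expressed as a BigQuery LOAD DATA statement, with hypothetical dataset, table, and bucket names:

    -- Hypothetical load of arriving CSV files from a GCS bucket into BigQuery.
    LOAD DATA INTO mydataset.daily_sales
    FROM FILES (
        format = 'CSV',
        skip_leading_rows = 1,
        uris = ['gs://my-landing-bucket/incoming/*.csv']
    );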
• Administered MS SQL Server: created user logins with appropriate roles, dropped and locked logins, monitored user accounts, and granted privileges to users (see the sketch below).
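A minimal T-SQL sketch of that administration work; the login, database, and role names are hypothetical:

    -- Hypothetical: create a login, map it to a database user, grant a role.
    CREATE LOGIN etl_reader WITH PASSWORD = 'Str0ng!Passw0rd';
    USE SalesDW;
    CREATE USER etl_reader FOR LOGIN etl_reader;
    ALTER ROLE db_datareader ADD MEMBER etl_reader;
    -- Lock a login out without dropping it:
    ALTER LOGIN etl_reader DISABLE;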
• Additional projects included automating a manual process as the start of a data warehouse project.
Environment: Informatica PowerCenter 9.6, Oracle 11g, Azure, Autosys, IDQ, BigQuery, Cognos, AWS, UNIX, PL/SQL, Teradata V13.0, SQL*Plus, TOAD, Teradata SQL Assistant, MS Excel.
SunTrust Bank, Alpharetta, GA Jan 2016 – July 2017
SQL Developer
Responsibilities:
• Communicated effectively with data architects, designers, application developers, and senior management to collaborate on projects involving multiple teams in a highly time-sensitive environment.
• Effectively involved in the allocation and review of various development activities/tasks with onshore counterparts. Assisted in defining the database requirements; analyzed existing models and reports looking for opportunities to improve their efficiency, and troubleshot various performance issues. Highly proficient with SQL, Transact-SQL, stored procedures, and relational database management systems (RDBMS).
• Developed highly optimized stored procedures, functions, and database views to implement the business logic, and created clustered and non-clustered indexes (see the sketch below). Involved in performance monitoring, tuning, and capacity planning.
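A hedged T-SQL sketch of that indexing pattern; the table and column names are hypothetical:

    -- Hypothetical: clustered index on the key, plus a covering
    -- non-clustered index so common lookups avoid key lookups.
    CREATE CLUSTERED INDEX IX_Orders_OrderID
        ON dbo.Orders (OrderID);

    CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
        ON dbo.Orders (CustomerID)
        INCLUDE (OrderDate, TotalAmount);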
• Advised on optimizing slow queries by examining their execution plans for better database tuning. Translated business requirements into BI application designs and solutions.
• Created SSIS packages to load data from flat files, Excel, and XML files into the data warehouse and report data mart using Lookup, Derived Column, Sort, Aggregate, and Pivot transformations and Slowly Changing Dimension.
• Used various SSIS tasks such as Conditional Split, Multicast, and Fuzzy Lookup for data scrubbing, including data validation checks during staging, before loading the data into the data warehouse.
• In-depth RDBMS experience in Microsoft SQL Server; expert T-SQL programming skills including query optimization, stored procedures, views, and functions; ability to read, interpret, and improve query execution plans.
• Responsible for implementing data viewers, logging, and error configurations for error handling in the packages. Involved in the complete life cycle of creating SSIS packages: building, deploying, and executing the packages in both the development and production environments.
• Created and maintained data flow diagrams for existing and future ETL processes. Designed, deployed,
and maintained complex canned reports using SQL Server 2008 Reporting Services (SSRS).
• Customized reports by adding Filters, Calculations, Prompts, Summaries and Functions. Created
reports to retrieve data using Stored Procedures that accept parameters.
• Created parameterized queries and generated tabular reports, sub-reports, cross-tabs, and drill-down reports using expressions, functions, charts, maps, sorting, data source definitions, and subtotals.
• Published PDF reports from the Report Server to the SharePoint portal. Maintained and tuned Teradata production and development systems.
• Understood the business logic behind every piece of code and documented requirements in a reverse-engineering fashion.
• Optimized query performance, session performance, and reliability, and performance-tuned Informatica components for daily and monthly incremental-load tables. Documented and presented the production/support documents for the developed components when handing over the application to the production support team.
Environment: Informatica PowerCenter, Teradata SQL Assistant 12.0, Teradata V12.0R2, AWS, Oracle 10g/9i, MS SQL Server 2005/2012, Business Objects, Autosys, TOAD 7.6, SQL, PL/SQL, UNIX shell scripting, Windows.
Environment: Oracle RDBMS 9i, Informatica, Java, SQL*Plus Reports, SQL*Loader, XML, TOAD.
Education Details:
Masters: Master's in Engineering Management, 2017, Christian Brothers University, TN
Bachelors: Computer Science and Engineering, 2015, JNTU, Hyderabad