
Venkata Kolli

Tel: 904-514-1327
Email-id: venkatkolli96@[Link]

Summary:
 Over 14 years of IT experience covering the complete SDLC, including analysis, design, development, testing, implementation, and maintenance of application software.
 Experience in designing and developing OLAP and OLTP applications, and in building interface, conversion, and data migration processes in client/server OLAP environments.
 Two years of experience working with the Snowflake cloud data warehouse, its administration, and AWS S3/EC2, migrating on-premises data warehouses to Snowflake.
 Experience with Snowflake DW and good understanding of Snowflake architecture and processing.
 Experience tuning Snowflake performance using the Query Profile, caching, and virtual warehouse scaling (out/in and up/down) in multi-cluster warehouses.
 Thorough understanding of normalization and de-normalization concepts, and hands-on experience designing normalized and de-normalized databases for transactional and analytical systems.
 Good experience in data warehousing, including data analysis, data manipulation, programming, testing, implementation, and report generation.
 Good experience in Agile Development Methodology.
 Proficient in Snowflake cloud, Oracle 12c/11g/10g/9i, SQL Server 2005/2008, Teradata v2r5, and DB2 databases, with knowledge of Python scripting.
 Strong data warehousing experience using Informatica, extensively designing mappings and workflows, configuring servers, and scheduling sessions using UNIX shell scripts.
 Extensive ETL experience covering data sourcing, transformation, mapping, and conversion, along with data modeling.
 Designed and developed PL/SQL functions, stored procedures, cursors, packages and triggers.
 Maintained development, test and production repositories using Repository Manager. Also used
Repository Manager and Metadata Manager to maintain the metadata, Security, Backup and Locks.
 Designed the Data warehouse and Data marts related ETL procedures for extracting the data from
all legacy systems to the target system.
 Well acquainted with Informatica Designer Components - Source Analyzer, Warehouse Designer,
Transformation Developer, Mapplet and Mapping Designer.
 Expert in Performance Tuning using Oracle Hints, Explain plan, TKPROF, Partitioning.
 Developed Informatica mappings, sessions, and workflows; tuned them for better performance; and generated OBIEE reports.
 Knowledge of Job scheduling tools to create, modify, troubleshoot jobs, schedule and automate
daily/weekly/monthly/quarterly jobs.
 Hands-on experience using UNIX and shell commands to create jobs, automate processes, move processes to the background, and monitor processes.
 Experience in Data mining and analyzing data to identify the Fact and Dimension tables.
 Experience providing on-demand production support and delivering critical and routine patches for technical and business-related bugs.
 Active team player committed to meeting deadlines, with strong communication, analytical, and time management skills.
 Good experience in production support, creating use cases, requirement matrix development and
Testing for Telecom Amdocs Billing solutions.
 Extensive experience in Sun Solaris, HP Unix, Linux Operating Systems and expertise in Shell scripts
and Python scripts.
 Strong Analytical, Organizational, Inter-personal and Communication Skills.
 Helped team members resolve technical bottlenecks.
Technical Skills:
RDBMS: Oracle 12c/11g/10g/9i/8i, MS SQL Server 2005/2008, Netezza 7.2, Teradata v2r5, DB2, Cache
Cloud Tools: Snowflake cloud, AWS S3
Database Utilities/Tools: OEM, SQL*Net, Import/Export, SQL*Loader, Pro*C, TOAD 12.8, TOAD Data Modeler 5.5, SQL Developer, DTS (Data Transformation Services), Aginity Workbench 4.0
Telecom/CRM: Metasolve TBS, Amdocs Clarify, Amdocs Ensemble, Amdocs Enabler, Micromuse Netcool, Omnibus
Data Warehouse / BI Tools: Informatica PowerCenter 10.2.0/9.5/8.x, BODS 4.2, DataStage 11.3, BOXI R3 (InfoView, Designer, WebI, DeskI), Crystal Reports 2008/XI/10.0, Erwin 7.2, SSIS, SSRS, OBIEE
Operating Systems: HP-UX, AIX, Solaris 2.x, UNIX SVR4, Linux, Windows NT 2003/2000/4.0, Windows XP/7/8/10
Microsoft Applications: MS Word, Excel, MS Access, MS PowerPoint, SharePoint 2007/2010
Languages: PL/SQL, UNIX shell scripts, Java, JDK 2.0/1.x, JCL, SQL, XML, TSL, UML, Transact-SQL (T-SQL), PHP, Drupal, Apache, Python
Others: Control-M Scheduler 9.0, Airflow Scheduler 1.10.12

USAA (United Services Automobile Association), Plano, TX (Data Engineer) Aug’21 – Till Date

 Migrated data from on-premises systems to the cloud data warehouse for the Life Company, with R2R catch-up validation between source and target data loads for all models, including Type 1 and Type 2 tables.
 Responsible for loading, validating, and monitoring the Netezza-to-Snowflake historical and incremental loads.
 Created DDL scripts for Snowflake stage and target tables based on the source database.
 Raised control- and log-table script requests for the Snowflake environment.
 Created and executed config files for all warehouse and mart tables with duplicate-data exclusion options.
 Replicated DataStage job business rules in Snowflake procedures, writing complex SnowSQL scripts in the Snowflake cloud data warehouse for business analysis and reporting.
 Used import and export from internal stage (Snowflake) and external stage (S3 Bucket).
 Developed various complex queries, procedures, and functions used by application modules in Snowflake.
 Loaded data into Snowflake tables from the internal stage and from the local drive.
 Used the COPY, LIST, PUT, and GET commands to validate internal and external stage files and to bulk load data from the external stage (AWS S3).
 Designed and developed the logic for handling slowly changing dimension table loads, flagging records with the update strategy to populate the desired target data.
 Involved in cleansing, extraction of data and defined quality rules for the warehouse.
 Performed unit testing and user acceptance testing to verify that data extracted from the various source systems was loaded into the target according to user requirements.
 Created Stored Procedures to transform the Data and worked extensively in SQL for various
needs of the transformations while loading the data.
 Performed Stored procedure Query Performance tuning.
 Interacted with lead developers, system analysts, business users, architects, test analysts, project managers, and peer developers to analyze system requirements and to design and develop software applications.
 Environment: Snowflake cloud, UNIX, Python, FileZilla, WinSCP, Windows NT, DB2, Netezza 7.2, Control M, Gitlab, Aginity Workbench 4.0, JIRA.
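A reconciliation step like the source-to-target validation described above can be sketched in a few lines of Python; the table names and row counts below are purely illustrative assumptions, not details from the actual engagement:

```python
def reconcile(source_counts, target_counts):
    """Compare per-table row counts between source and target loads,
    returning only the tables whose counts disagree."""
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table, 0)
        if src != tgt:
            mismatches[table] = (src, tgt)
    return mismatches

# Hypothetical counts from a Netezza source and a Snowflake target.
source = {"POLICY_DIM": 1200, "CLAIM_FACT": 54000}
target = {"POLICY_DIM": 1200, "CLAIM_FACT": 53990}
print(reconcile(source, target))  # → {'CLAIM_FACT': (54000, 53990)}
```

In practice the counts would come from `SELECT COUNT(*)` queries against each system, but the comparison logic is the same.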
VNSNY (Visiting Nurse Service of NY), New York, NY (Data Warehouse Developer) Jul’16 – Jul’21

 Involved in full life cycle development, including design, ETL strategy, troubleshooting, reporting, and identifying facts and dimensions.
 Designed the ETL processes using Informatica to load data from Oracle, Flat Files (Fixed
Width), and Excel files to staging database and from staging to the target ORACLE Warehouse
database and generating OBIEE Reports.
 Used import and export from the internal stage (Snowflake) and the external stage (S3 bucket).
 Wrote complex SnowSQL scripts, queries, procedures, and functions used by different application modules in the Snowflake cloud data warehouse for business analysis and reporting.
 Implemented best practices for the creation of mappings, sessions, and workflows, and for performance optimization.
 Created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
 Involved in dimensional modeling (star schema) of the data warehouse and used Erwin to design the business processes, dimensions, and measured facts.
 Bulk loaded data from the external stage (AWS S3) and loaded files staged in Snowflake using the COPY command.
 Created Python scripts to upload/download files to/from S3 bucket.
 Created Unix scripts to validate files from the S3 bucket, ensuring files adhered to business rules.
 Developed Snowflake procedures to ingest data from file to Snowflake.
 Transformed Informatica workflows into Snowflake procedures and developed validation scripts
to ensure data correctness.
 Loaded data into Snowflake tables from the internal stage and from the local machine.
 Used COPY, LIST, PUT and GET commands for validating internal and external stage files.
 Designed and developed the logic for handling slowly changing dimension table loads, flagging records with the update strategy to populate the desired target data.
 Involved in cleansing and extraction of data and defined quality process for the warehouse.
 Involved in performance tuning and optimization of Informatica mappings and sessions, using features like partitions and data/index caches to manage very large data volumes. Used alternative methods (e.g., Oracle bulk insert/update) to process upwards of 100 million records.
 Involved in migration of mappings and sessions from development repository to production
repository.
 Performed unit testing and user acceptance testing to verify that data extracted from the various source systems was loaded into the target according to user requirements.
 Created Stored Procedures to transform the Data and worked extensively in PL/SQL for various
needs of the transformations while loading the data.
 Wrote SQL queries and PL/SQL programs and performed query-level performance tuning.
 Interacted with lead developers, system analysts, business users, architects, test analysts, project managers, and peer developers to analyze system requirements and to design and develop software applications.
 Created new extracts for external vendors and used Informatica ETLs in new workflows to move data out of multiple data sources.
 Unit tested code changes, filled out required documentation such as installation instructions, and followed standards and procedures.
 Environment: Informatica 10.X/Informatica PowerCenter 10.X, UNIX, Python, FileZilla, WinSCP, MS Access, Windows NT, Oracle 12c, DB2, Teradata, MS SQL Server, OBIEE, Snowflake cloud, Control M, Airflow, SVN, Erwin 4.0, PL/SQL, T-SQL, TOAD 9.5, TFS Version Controller, JIRA.
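The slowly-changing-dimension handling mentioned above follows the standard Type 2 pattern: when an incoming record differs from the current dimension row, expire the old version and insert the new one as current. A minimal sketch, with assumed field names (`key`, `attrs`, `current`), not the actual implementation:

```python
def apply_scd2(current_rows, incoming):
    """Expire changed dimension rows and append new current versions
    (SCD Type 2); current_rows dicts are mutated in place when expired."""
    by_key = {r["key"]: r for r in current_rows if r["current"] == "Y"}
    out = list(current_rows)
    for rec in incoming:
        cur = by_key.get(rec["key"])
        if cur is None:
            out.append({**rec, "current": "Y"})   # brand-new key
        elif cur["attrs"] != rec["attrs"]:
            cur["current"] = "N"                  # expire old version
            out.append({**rec, "current": "Y"})   # insert new version
    return out
```

A production version would also carry effective/expiry dates on each row; they are omitted here to keep the flagging logic visible.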
ESPN, Bristol, CT (Sr. Oracle PL/SQL / ETL Developer) Oct’15 – Jun’16

 Analyzed system requirements to create high-level and detailed design documents with the assistance of the requirements team and architects.
 Prepared the required application design documents based on functionality required.
 Analyzed the ETL processes using BODS to load data from Oracle, flat files (fixed width), and Excel files to the staging database, and from staging to the target enterprise data warehouse database.
 Developed highly complex applications using Oracle 11g database as back-end with expertise in
design and development of Oracle PL/SQL Packages and Procedures.
 Designed and developed Oracle objects such as Tables, Views, Indexes, Stored Procedures,
Functions, Packages in PL/SQL, Materialized Views, External tables and Dynamic SQL.
 Used concepts like Partitioning, Partition pruning, Index by Tables, Bulk Collect, Pipelined
Functions, Insert All, Global Temporary tables.
 Developed extraction, transformation, and load (ETL) scripts using SQL and PL/SQL utilities, and provided solutions to critical issues, enhancing the project's performance and productivity.
 As a key PL/SQL developer, defined the required analytics to design and develop ad sales routines, assisted various work streams in identifying and resolving data irregularities, and helped establish management reporting processes and the requisite supporting technology.
 Constructed complex SQL queries with sub-queries and inline views per the functional needs in the Business Requirements Document (BRD). Created SQL scripts to perform unit and component testing of financial calculations such as interest receivables.
 Designed and developed the logic for handling slowly changing dimension table loads, flagging records using PL/SQL to populate the desired data.
 Performed unit testing and user acceptance testing to verify that data extracted from the various source systems was loaded into the target according to user requirements.
 Used Web Services for authentication of the end users.
 Wrote SQL queries and PL/SQL programs and performed query-level performance tuning.
 Interacted with lead developers, system analysts, business users, architects, test analysts, project managers, and peer developers to analyze system requirements, design, and development.
 Created new extracts for external vendors and used BODS ETLs in new workflows to move data out of multiple data sources.
 Unit tested code changes, filled out required documentation such as installation instructions, and followed standards and procedures.
 Based on the business requirements, created functional design documents and technical design specification documents for the ETL process.
 Performed Backend Testing, Data analysis by executing the SQL queries.
 Environment: Oracle 11g, Business Objects Data Services 4.2, UNIX, WinSCP, ClearCase, ClearQuest, Erwin 4.0, PL/SQL, Toad 9.5, PLSQL Developer 7, Windows 7 Pro, AB Initio, Business Objects, DOORS, TFS Version Controller, JIRA.
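As a rough illustration of the kind of interest-receivable check mentioned above, a simple-interest accrual is just principal × rate × days / basis; the 360-day basis below is an assumed day-count convention, not one stated in the resume:

```python
def interest_receivable(principal, annual_rate, days, basis=360):
    """Accrue simple interest over a number of days, rounded to cents.
    The day-count basis (360 here) is an assumption for illustration."""
    return round(principal * annual_rate * days / basis, 2)

# 100,000 at 5% for 90 days on a 360-day basis: 100000 * 0.05 * 90/360
interest_receivable(100000, 0.05, 90)  # → 1250.0
```

A unit-test SQL script would compare a value computed this way against the amount the ETL loaded for the same account and period.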

NBC Universal, New York, NY (Sr. Oracle PL/SQL / ETL Developer) Mar’14 – Sep’15

 Analyzed system requirements to create high-level and detailed design documents with the assistance of the requirements team and architects.
 Involved in full life cycle development, including design, ETL strategy, troubleshooting, reporting, and identifying facts and dimensions.
 Prepared the required application design documents based on functionality required.
 Designed the ETL processes using Informatica to load data from Oracle, Flat Files (Fixed
Width), and Excel files to staging database and from staging to the target Teradata Warehouse
database.
 Wrote queries, procedures, and functions used by different application modules.
 Implemented best practices for the creation of mappings, sessions, and workflows, and for performance optimization.
 Created mappings in Informatica using transformations such as Source Qualifier, Sorter, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, Normalizer, and Sequence Generator.
 Developed highly complex applications using Oracle 11g database as back-end with expertise in
design and development of Oracle PL/SQL Packages and Procedures.
 Designed and developed Oracle objects such as Tables, Views, Indexes, Stored Procedures
and Functions in PL/SQL, Packages in PL/SQL, Materialized Views, and Dynamic SQL.
 Used concepts like Partitioning, Partition pruning, Index by Tables, Bulk Collect, Pipelined
Functions, Insert All, Global Temporary tables.
 Developed extraction, transformation, and load (ETL) scripts using SQL and PL/SQL utilities, and provided solutions to critical issues, enhancing the project's performance and productivity.
 As a key PL/SQL developer, defined the required analytics to design and develop financial routines, assisted various work streams in identifying and resolving data irregularities, and helped establish management reporting processes and the requisite supporting technology.
 Constructed complex SQL queries with sub-queries and inline views per the functional needs in the Business Requirements Document (BRD). Created SQL scripts to perform unit and component testing of financial calculations such as interest receivables.
 Involved in dimensional modeling (star schema) of the data warehouse and used ERwin to design the business processes, dimensions, and measured facts.
 Designed and developed the logic for handling slowly changing dimension table loads, flagging records with the update strategy to populate the desired target data.
 Involved in performance tuning and optimization of Informatica mappings and sessions using
features like partitions and data/index cache to manage very large volume of data.
 Involved in migration of mappings and sessions between repositories.
 Performed unit testing and user acceptance testing to verify that data extracted from the various source systems was loaded into the target according to user requirements.
 Created Stored Procedures to transform the Data and worked extensively in PL/SQL for various
needs of the transformations while loading the data.
 Wrote SQL queries and PL/SQL programs and performed query-level performance tuning.
 Conducted status meetings with project managers, escalated issues when necessary, and conducted meetings for issue resolution.
 Interacted with lead developers, system analysts, business users, architects, test analysts, project managers, and peer developers to analyze system requirements and to design and develop software applications.
 Created new extracts for external vendors and used Informatica ETLs in new workflows to move data out of multiple data sources.
 Unit tested code changes, filled out required documentation such as installation instructions, and followed standards and procedures.
 Based on the business requirements, created functional design documents and technical design specification documents for the ETL process.
 Worked extensively on End-to-End System Test.
 Performed Backend Testing, Data analysis by executing the SQL queries.
 Environment: Oracle 11g, Informatica 9.X/Informatica PowerCenter 9.5.1/8.6, UNIX, WinSCP, ClearCase, ClearQuest, DB2, Teradata, Erwin 4.0, PL/SQL, Toad 9.5, PLSQL Developer 7, Windows XP 5.1.2, AB Initio, Business Objects, DOORS.

SEI Investments, Oaks, PA (Sr. Oracle PL/SQL / ETL Developer) May’13 – March'14

 Analyzed system requirements to create high-level and detailed design documents with the assistance of the requirements team and architects.
 Worked closely and supported members of teams such as Architecture, Change management,
Database Management, Technical operations, System Test, UAT, Requirements and Business
teams during entire life cycle of the project.
 Developed highly complex applications using Oracle 11g database as back-end with expertise in
design and development of Oracle PL/SQL Packages and Procedures.
 Designed and developed Oracle objects such as Tables, Views, Indexes, Stored Procedures
and Functions in PL/SQL, Packages in PL/SQL, Materialized Views, and Dynamic SQL.
 Used concepts like Partitioning, Partition pruning, Index by Tables, Bulk Collect, Pipelined
Functions, Insert All, Global Temporary tables.
 Worked on several transformations such as Source Qualifier, Sorter, Expression, Filter, Aggregator, Joiner, Lookup, Router, Update Strategy, Normalizer, and Sequence Generator in Informatica.
 Developed extraction, transformation, and load (ETL) scripts using SQL and PL/SQL utilities, and provided solutions to critical issues, enhancing the project's performance and productivity.
 As a key PL/SQL developer, defined the required analytics to design and develop financial routines, assisted various work streams in identifying and resolving data irregularities, and helped establish management reporting processes and the requisite supporting technology.
 Designed and developed complex Informatica mappings using Expression, Joiner, Router, Aggregator, Filter, Lookup, and Normalizer transformations to move data between various applications.
 Constructed complex SQL queries with sub-queries and inline views per the functional needs in the Business Requirements Document (BRD). Created SQL scripts to perform unit and component testing of financial calculations such as interest receivables.
 Handled code defects, enhancements, and release management using tools like HP Quality Center and Rally.
 Tuned queries for faster data retrieval using optimizer hints such as PARALLEL, ORDERED, APPEND, and USE_HASH, and used customized v$ scripts to monitor database processes and performance.
 Extracted data from flat files, Oracle, and SAP systems using the BAPI/RFC transformation, and loaded the data into flat files and Oracle using Informatica PowerCenter.
 Based on the business requirements, created functional design documents and technical design specification documents for the ETL process.
 Worked extensively on End-to-End System Test.
 Performed Backend Testing, Data analysis by executing the SQL queries.
 Environment: Oracle 11g, Toad 9.5, PLSQL Developer 7, Informatica PowerCenter 8.6.0, Windows 7, StarTeam Version Controller, HP Quality Center, Rally.

AT&T Tele Communications, Richardson, TX (Sr. Oracle PL/SQL Developer) June’09 – Apr’13

 Analyzed system requirements to create high-level and detailed design documents with the assistance of the requirements team and architects.
 Worked closely and supported members of teams such as Architecture, Change management,
Database Management, Technical operations, System Test, UAT, Requirements and Business
teams during entire life cycle of the project.
 Developed highly complex applications using Oracle 11g database as back-end with expertise in
design and development of Oracle PL/SQL Packages and Procedures.
 Developed extraction, transformation, and load (ETL) scripts using SQL and PL/SQL utilities, and provided solutions to critical issues, enhancing the project's performance and productivity.
 Constructed complex SQL queries with sub-queries and inline views per the functional needs in the Business Requirements Document (BRD). Created SQL scripts to perform unit and component testing of financial calculations such as interest receivables.
 Worked on various AMDOCS environments, Ensemble and Enabler.
 Created BANs for various LECs and geo codes to ensure a production-like environment.
 Worked extensively on End-to-End System Test.
 Validated QA bills, Direct bills and sent the IR file for validation.
 Performed Backend Testing, Data analysis by executing the SQL queries.
 Created and tested Batch Processes developed in UNIX, PL/SQL.
 Environment: Oracle 11g/10g, Clear case, Clear quest, Toad 9.5, PLSQL Developer 7, Sun OS
5.8, Windows XP 5.1.2, AB Initio, Business Objects, DOORS.

QWEST Communications, Dublin, OH (Sr. Oracle Database Developer) Feb‘07 – Jun ‘09

 Involved in creating the data model for DDR; designed schemas, tables, materialized views, range partitions, and indexes.
 Created ETL processes to process data arriving in various forms, including message queues (mainframe Mercator queue), flat files, and materialized views.
 Used Oracle features like global temporary tables, analytical functions, MERGE, and table functions.
 Created views for reporting purposes, involving complex SQL queries with sub-queries, inline views, multi-table joins, WITH clauses, and outer joins, per the functional needs in the Business Requirements Document (BRD).
 Created or modified various daily and monthly UNIX Perl/shell scripts to include new requirements and enhancements; accordingly, Oracle stored procedures, functions, triggers, and packages were created or modified to meet business users' expectations.
 Created SQR reports to analyze, extract, transform, load, and integrate data into consumable PDF, CSV, TXT, and XLS formats, meeting all critical milestones using data warehouse techniques.
 Proficiently used the debug feature in SQL Navigator and TOAD for debugging PL/SQL programs.
 Created Purge process to remove or archive data older than the required span.
 Worked closely with DBAs to enhance query and program performance; used tuning hints like INDEX, ORDERED, USE_NL, and USE_HASH, and used Explain Plan, Autotrace, SQL Trace, and TKPROF to analyze query execution plans.
 Refreshed the dev and test databases with production data for testing needs.
 Worked with different databases such as SQL Server, Sybase, and DB2.
 Created comprehensive documentation for the data mapping elements by identifying the various data sources and mapping each line item within the reporting requirements to the specific database column, using the DDR Business Requirement Document and use cases.
 Environment: Oracle 10g, SQL * Plus, TOAD, SQL Navigator, SQL*Loader, import/export, UNIX
(AIX 5.2), J2EE, Windows XP, SQL Server 2008
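The purge process described above boils down to partitioning records by a retention cutoff: rows older than the required span are removed or archived, the rest kept. A minimal sketch with assumed field names, not the actual implementation:

```python
from datetime import datetime, timedelta

def split_for_purge(records, retention_days, now):
    """Partition records into (keep, purge) sets by a retention window;
    the 'created' field name is an assumption for illustration."""
    cutoff = now - timedelta(days=retention_days)
    keep = [r for r in records if r["created"] >= cutoff]
    purge = [r for r in records if r["created"] < cutoff]
    return keep, purge
```

In the database itself the same idea would typically be a `DELETE ... WHERE created < :cutoff` (or a partition drop) plus an archive insert.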
