
Tegan Ramos

[email protected]

PROFESSIONAL SUMMARY:
 More than a decade of experience as a Sr. Data Architect, Sr. ABI/Teradata DBA/Developer, Sr. Database Administrator, and Sr. BI Manager, with a strong understanding of BI tools such as Hyperion, Tableau, Spotfire, and OBIEE, as well as ABI tools such as Express>It, Co>Operating System, Technical Repository, GDE, Data Profiler, Data Quality Environment, Metadata Hub, Control Center, and Conduct>It.
 Extensive experience managing large IT teams (15-30 people).
 Worked as Sr. Data Modeler, Sr. Teradata Database Administrator, Sr. ETL Developer, Sr. Architect, Sr. Project Manager, and Sr. BI Manager.
 6+ years of experience as a DevOps and Performance Engineer and Systems Architect.
 Architecture, design, and deployment of solutions on bare metal, VMware, and Amazon Web Services.
 Hands-on experience with most layers of AWS offerings, including integration and migration of existing bare-metal solutions into virtualized hosting environments using EC2, S3, VPC, ELB, and Auto Scaling, with CloudWatch metrics integration.
 Set up databases in AWS, including MSSQL and MySQL on RDS, as well as MongoDB and DynamoDB; configured storage using S3 buckets and instance backups to S3. Configured and maintained DNS systems using BIND, Route 53 (AWS), and PowerDNS.
 Experience managing large sharded MongoDB clusters.
 Strong knowledge of MongoDB installation, patching, troubleshooting, and performance.
 Hands-on management of the MongoDB life cycle, including sizing, automation, monitoring, and tuning.
 Extensive knowledge of MongoDB Ops Manager, Cloud Manager, and Atlas.
 Experienced in the principles and best practices of Software Configuration Management (SCM), including compiling, packaging, deploying, and managing application configurations with Git, Subversion (SVN), and TFS on Linux and Windows platforms. Worked with SDLC methodologies such as Agile and Waterfall.
 Exceptional background in analysis, design, and data modeling (using ERwin, Embarcadero, Oracle Designer, and ARIS), and in the development and implementation of Master Data Management (MDM).
 Seasoned in Enterprise Data Warehouse (EDW) design (star and snowflake schemas), performance tuning, and query optimization, with years of hands-on experience in SQL, FastLoad, BTEQ, MultiLoad, and TPump, and knowledge of various database systems (Teradata, Sybase, Oracle, SQL Server, IBM DB2).
 Business Intelligence professional with solid IT experience and demonstrated expertise in managing all facets of ETL processes: SAP ERP, Ab Initio, Teradata, SSIS, Oracle Data Integrator (ODI), Oracle GoldenGate, IBM DataStage, and Informatica.
 Hands-on experience with ABI tools such as Express>It, Ab Initio productivity software for analyst-driven application development, and DQE, which enables users to detect data quality issues in input sources.
 Experience negotiating/managing projects and leading small teams.
 Proficient in the MicroStrategy BI suite, the Microsoft BI stack, Microsoft SharePoint Server (MOSS), Oracle OBIEE, IBM Cognos, SAS, and SAP BW technologies.
 Experienced in Big Data analytics and massively parallel processing (MPP) architectures such as Teradata, Greenplum, and Netezza.
 Knowledge of Big Data concepts and experience with non-relational databases and Hadoop (HDFS).
 Adept at all stages of project lifecycle, from gathering business requirements and writing system requirements
to development, testing, production support, and completion.
 Years of experience writing intricate reports using COTS software including, but not limited to, Microsoft Reporting Services (SSRS), SAP BusinessObjects, and Tableau.
 Expertise in developing BI roadmaps to provide better reporting solutions and to support advanced predictive and responsive analysis on existing OLTP systems.
 Good knowledge of the Hadoop ecosystem, HDFS, and Cloudera Impala; hands-on experience with Hive, Spark, Scala, MongoDB, and MemSQL.
 Well versed in cloud systems such as SnapLogic and Aspera.

Certifications:
TOGAF 9 Certification - January 2017
Teradata Certified Master (V2R5) - May 2007–June 2007

Areas of Expertise:

Good understanding of ERP systems such as SAP, SnapLogic, and Oracle; data and process modeling, data architecture, OLAP and data warehousing, ODI architecture, the TOGAF 9 framework, ETL (Extract-Transform-Load), Ab Initio, Hadoop, Talend, Teradata & Netezza, data mining/MicroStrategy, C, C++, Java, VB, Fortran, TFS, SAS Enterprise Miner, SAS Predictive Analysis, SAS Decision Manager, SAS GRID, ClearCase, ClearQuest, Team Concert, Jenkins, Coverity

TECHNICAL SKILLS:
Data Modeling: Erwin, Visio, ER/Studio
Big Data: Hadoop, MapReduce, Hive, Oozie, YARN, HBase, Pig
ETL tools: Ab Initio, Informatica 9.1, Datastage
TOGAF 9 Architecture Framework: Core Concepts, Intro to the ADM, The Enterprise Continuum and tools, ADM Phases,
ADM Guidelines and Techniques, Architecture Governance, Architecture Views, Viewpoints and Stakeholders, Building
Blocks, ADM Deliverables, TOGAF Reference Models
Programming Languages & Tools: Java, COBOL, Pascal, C, C++, C#, JSON, Lisp, Eclipse, DOORS
Teradata: Fastload, Bteq, FastExport, Tpump, Mload, TPT
RDBMS: Redshift, Oracle, Access, DB2, SQL Server
Systems: Unix/Linux, Windows, Mainframe
Business Intelligence: IBM Cognos, SAS BI, OBIEE, MicroStrategy

PROFESSIONAL EXPERIENCE:
Client: Boeing, Seattle, WA/Tampa, FL July 2017 – Present
Role: Sr. Data Architect/Teradata
 Develop batch processes in Ab Initio that query an Oracle server containing payroll information and load that data into our EDW environment via the Teradata utilities FastLoad and BTEQ.
 Develop batch processes in Ab Initio that decrypt provided feeds, transform the data per business requirements, and load it into our EDW environment via FastLoad and BTEQ.
 Develop several CRM batch processes via Ab Initio.
 Setup Teradata Data Labs.
 Configure MongoDB replication with replica set factors, arbiters, voting, priorities, server distribution, and slave delays.
 Implement JavaScript for DML operations in MongoDB.
 Create, configure, and monitor shard sets.
 Build various types of indexes on different collections to achieve good performance in MongoDB.
 Create documents in MongoDB.
 Mentor BI Developers and Sr. Developers on how to transfer data to the data labs.
 Propose tools for manipulating very large data sets prior to loading them into the data labs.
 Create ER diagrams and conceptual, logical, and physical data models.
 Designed and developed 3-tier web applications migrated to and hosted in Azure.
 Develop ABI batch processes using TD as the target DB, using both DQE and Express>It.

 Adapted web application to run in Windows Azure against SQL Azure using Azure Queues for background
processing.
 Design logical and physical data models using the Erwin data modeling tool and Visio.
 Troubleshoot database issues related to performance, queries, and stored procedures.
 Fine-tune existing scripts and processes to achieve increased performance and reduced load times for faster user query performance.
 Prepared architecture plan to create the Azure Cloud environment to host migrated IaaS and PaaS role
instances for refactored applications and databases.
 Accountable for architecture deliverables, ensuring all project goals are met within project timelines.
 Perform mapping between source and target data, as well as logical-to-physical model mapping and mapping from third normal form to dimensional models.
 Manage and lead a team of developers designing a data mart using both the ABI and DataStage ETL tools.
 Create, validate, and update the data dictionary, analyzing documentation to make sure the information captured is correct.
 Provide architecture and design support for business-initiated requests and projects.
 Write Teradata and Netezza SQL queries for joins and table modifications.
 Gathered initial requirements and translated them into functional specifications.
 Configured master data submodules to match requirements and goals, including the customer master, material master, and sales areas.
 Customized new sales order types, item categories, and schedule lines for inflight products.
 Performed unit testing and debugging, and trained end-users.
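The MongoDB replication setup described above (replica sets, arbiters, voting priorities, slave delays) can be sketched as a small config-document builder. This is an illustrative sketch only, not the actual configuration used on the project: the function name, host names, and defaults are all hypothetical. The resulting document is the shape one would hand to MongoDB's `replSetInitiate` command.

```python
def build_replset_config(name, hosts, arbiter=None, delayed=None, delay_secs=3600):
    """Build a MongoDB replica-set config document (illustrative sketch).

    hosts   -- list of "host:port" strings; the first gets the highest priority
    arbiter -- optional "host:port" for a vote-only arbiter member
    delayed -- optional "host:port" for a hidden, slave-delayed member
    """
    members = []
    for i, host in enumerate(hosts):
        members.append({
            "_id": i,
            "host": host,
            # First listed host is the preferred primary.
            "priority": 2 if i == 0 else 1,
            "votes": 1,
        })
    if arbiter:
        members.append({"_id": len(members), "host": arbiter,
                        "arbiterOnly": True, "votes": 1})
    if delayed:
        # Delayed members must be hidden with priority 0.
        # (MongoDB 5.0+ renames slaveDelay to secondaryDelaySecs.)
        members.append({"_id": len(members), "host": delayed,
                        "priority": 0, "hidden": True, "votes": 0,
                        "slaveDelay": delay_secs})
    return {"_id": name, "members": members}


# Hypothetical hosts for illustration only.
cfg = build_replset_config(
    "rs0",
    ["db1.example.com:27017", "db2.example.com:27017"],
    arbiter="arb1.example.com:27017",
    delayed="dr1.example.com:27017",
)
```

In a live deployment this document would be passed to the server, for example via PyMongo's `client.admin.command("replSetInitiate", cfg)`.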

Client: Bank of America, Charlotte, NC/Tampa, FL February 2017 – July 2017
Role: Sr. Data Architect/Sr. Developer/Sr. BI Manager/DBA
 Propose Architectural design changes to improve data warehouse performance.
 Develop ABI batch processes using TD as the target DB, using both DQE and Express>It.
 Develop ABI batch processes.
 Visualize a data architecture design from high level to low level, and design performance objects for each level.
 Create ER diagrams and conceptual, logical, and physical data models.
 Design logical and physical data models using the Erwin data modeling tool and Visio.
 Troubleshoot database issues related to performance, queries, and stored procedures.
 Fine-tune existing scripts and processes to achieve increased performance and reduced load times for faster user query performance.
 Performed reverse engineering for a wide variety of relational DBMSs, including Microsoft Access, Oracle, and Teradata, connecting to existing databases and creating graphical representations (E-R diagrams) using Erwin 9.2.
 Accountable for architecture deliverables, ensuring all project goals are met within project timelines.
 Perform mapping between source and target data, as well as logical-to-physical model mapping and mapping from third normal form to dimensional models.
 Manage and lead a team of developers designing a data mart using both the ABI and DataStage ETL tools.
 Create, validate, and update the data dictionary, analyzing documentation to make sure the information captured is correct.
 Utilized Version Control and Continuous Integration System like GitHub, with deployments of containerized
applications via Docker onto a highly available Kubernetes cluster hosted in Azure.
 Used Teradata utilities such as FastExport and MultiLoad for handling various tasks.
 Provide architecture and design support for business-initiated requests and projects.
 Work as part of the DevOps team on internal automation and build configuration management.
 Developed shell scripts for automation of the build and release process.
 Automated the front-end platform into highly scalable, consistent, repeatable infrastructure using a high degree of automation with Chef, Jenkins, and CloudFormation.

 Performed data analysis and data profiling using complex SQL on various source systems, including Oracle 10g/11g and Teradata.
 Maintain user accounts (IAM), RDS, and Route 53 services in AWS. Set up and maintain automated environments using Chef recipes and cookbooks for different applications.
 Developed Python scripts to automate log rotation of multiple logs from web servers.
 Work with users and IT to develop solutions that make transactions faster, more accurate, and error-proof.
 Establish partnerships with department managers and healthy relationships with employees.
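The Python log-rotation automation mentioned above might look something like the minimal standard-library sketch below. This is an assumed implementation, not the actual scripts: the paths, retention count, and naming scheme are hypothetical, and a production version would typically run from cron and coordinate with the web server's open file handles.

```python
import gzip
import os
import shutil
import time


def rotate_log(path, keep=5):
    """Rename the active log with a timestamp, gzip it, and prune old archives."""
    if not os.path.exists(path):
        return None
    stamp = time.strftime("%Y%m%d%H%M%S")
    rotated = f"{path}.{stamp}"
    os.rename(path, rotated)
    # Compress the rotated copy, then drop the uncompressed file.
    with open(rotated, "rb") as src, gzip.open(rotated + ".gz", "wb") as dst:
        shutil.copyfileobj(src, dst)
    os.remove(rotated)
    # Keep only the newest `keep` archives for this log.
    prefix = os.path.basename(path) + "."
    directory = os.path.dirname(path) or "."
    archives = sorted(
        f for f in os.listdir(directory)
        if f.startswith(prefix) and f.endswith(".gz")
    )
    for old in archives[:-keep]:
        os.remove(os.path.join(directory, old))
    return rotated + ".gz"


# Demo under a temporary directory (hypothetical log content).
import tempfile
tmp = tempfile.mkdtemp()
log_path = os.path.join(tmp, "access.log")
with open(log_path, "w") as f:
    f.write("GET /index 200\n")
archive = rotate_log(log_path, keep=3)
```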

Client: Brighthouse/Spectrum, Tampa, FL September 2015 – February 2017


Role: Sr. Data Architect/Sr. TD DB Administrator/Sr. BI Manager
 Manage and lead a team of developers to design the ETL workflows and SOA based system integration.
 Work on technologies such as Oracle and Oracle Data Integrator.
 Design and develop interfaces to load data from multiple sources, such as relational databases and flat files, into an Oracle database.
 Designed the ER diagrams, logical model (relationships, cardinality, attributes, and candidate keys), and physical database (capacity planning, object creation, and aggregation strategies) for Oracle and Teradata per business requirements using Erwin 9.5.
 Develop packages and scenarios using interfaces and variables.
 Debug master jobs and packages using the operator navigator.
 Implemented SCD Type 2 on dimension tables.
 Architect system integration using Ab Initio as the ETL tool and Teradata as the target DB.
 Use Ab Initio DQE suite to perform data profiling, validation and checks on data quality.
 Integrate Active Directory CRM with ODS using Ab Initio and capture real time updates from Active Directory.
 Develop and maintain the data models (Conceptual, Logical, & Physical) for the ODS using ER Studio and develop
the OLAP model for performing analytics.
 Used BTEQ for SQL queries and TPump for loading data from source systems into Teradata tables.
 Use optimization techniques for faster data transfer from one system to another, for example the Salesforce Bulk API and pushdown optimization.
 Use DTSWIZARD and SSIS for rapid prototyping in SQL Server and testing proofs of concept.
 Use Perl scripting for preliminary file processing on the Unix box.
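The SCD Type 2 work mentioned above can be illustrated with a small in-memory Python sketch: when a tracked attribute changes, the current dimension row is end-dated and a new current version is appended. The function, column names, and flags are hypothetical; in practice this logic would run in ODI/SQL against the Oracle and Teradata databases, not in Python.

```python
from datetime import date

# Conventional "open-ended" end date for the current row version.
HIGH_DATE = date(9999, 12, 31)


def apply_scd2(dimension, incoming, key, tracked, today):
    """Apply SCD Type 2 changes to a list of dimension rows (illustrative).

    dimension -- list of dicts carrying eff_date/end_date/current flags
    incoming  -- list of source records keyed by `key`
    tracked   -- attributes whose change triggers a new row version
    """
    current = {row[key]: row for row in dimension if row["current"]}
    for rec in incoming:
        old = current.get(rec[key])
        if old is None or any(old[c] != rec[c] for c in tracked):
            if old is not None:
                # Close out the existing version.
                old["end_date"] = today
                old["current"] = False
            new_row = dict(rec)
            new_row.update(eff_date=today, end_date=HIGH_DATE, current=True)
            dimension.append(new_row)
    return dimension


# Demo: a customer moves city, and a brand-new customer arrives.
dim = [{"cust_id": 1, "city": "Tampa", "eff_date": date(2015, 1, 1),
        "end_date": HIGH_DATE, "current": True}]
apply_scd2(dim,
           [{"cust_id": 1, "city": "Miami"}, {"cust_id": 2, "city": "Orlando"}],
           key="cust_id", tracked=["city"], today=date(2016, 6, 1))
```

The old Tampa row is closed with an end date, and two current rows (Miami, Orlando) are appended, which is the history-preserving behavior that distinguishes Type 2 from a Type 1 overwrite.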

Client: United Airlines, Miami, FL February 2007 – September 2015


Role: Sr. Developer/Principal Architect – Data Warehouse Group/Sr. Teradata DB Administrator
Led the architecture direction and coordinated the full SDLC of all projects.
Developed and maintained several realtime applications as well as some batch applications.
 Develop a C++ INMOD application for Teradata using OCX ActiveX that allows loading several UA logos into the Data Warehouse.
 Develop a batch process in Ab Initio that queries a SQL Server instance containing mobile payment data and loads that data into our EDW environment via the Teradata utilities FastLoad and BTEQ.
 Develop a Wi-Fi heartbeat batch process in Ab Initio that decrypts a provided feed, transforms the data per business requirements, and loads it into our EDW via FastLoad and BTEQ.
 Develop a batch process that scrapes our Active Directory information via Ab Initio and loads that data into our EDW via FastLoad and BTEQ, helping HR and other departments complete their duties more effectively.
 Develop several batch applications within Ab Initio that migrate data from DB2, Oracle, Netezza, and SQL Server to Teradata.
 Develop some CRM batch processes via Ab Initio consisting of these phases:
 Gather source data from our EDW environment, load it via FastLoad and BTEQ within Ab Initio, and send it to a third party for cleansing.

 Upon receiving the responses from the third party, load the data, which is divided into source cleanse, rejects, and infobase, into our EDW for business analysis using both DQE and Express>It.
 Develop a C++ real-time application consisting of two phases:
 Data gathering, performed by an application service that gathers data from a mainframe and stores it in data files.
 Data loading, performed by an automated system that kicks off a set of ETL tools to transform and bring the data into the Data Warehouse via FastLoad and BTEQ.
 Maintain several real-time applications using the GoldenGate interface.
 Some are complex: a SQL team extracts data to a log file, and a process replicates that data into the data warehouse via stored procedures using many-to-one mapping.
 Develop an ASP.NET/C# web service with several modules allowing developers in different departments to retrieve data in XML format on the back end.
 Help improve the Cargo architecture and data loading within Ab Initio.
 Develop an intricate C++ BIDT global distribution application consisting of eight modules: Amadeus, Abacus, Infini, Galileo, Axxess, Sabre, TravelSky, and Topas.
 Each module's data is loaded via a C++ INMOD prior to being loaded into the DW via ETL transformation scripts.
 Develop a PNR web application (written in Java) used by several other developer groups within UA IT; the project consists of several modules.
 Develop a comprehensive, modular, enterprise-wide Java travel management system that automates and integrates most of UA's business processes. The product addresses critical business functions such as Operations, Sales Management and Distribution, Financial Management, and Revenue Accounting and Interline Billing, and consists of car, hotel, and flight modules.
 Develop a batch process that exports multiple daily files via the Teradata FastExport utility to a downstream process that helps determine whether flights over the next 14 days will be understaffed.
 Alert Type: a daily load that brings alert description changes into the EDW via the Teradata MultiLoad utility.
 Chase Cardholder: loads first-to-favorite Chase cardholder data into the DW via the Teradata TPump utility.
 Ion Track: exports a file via the Teradata FastExport utility that includes passenger counts by cabin and elite level.
 CMMA MNS/AMEX: loads audit files for distribution planning via the Teradata utilities FastLoad and BTEQ.
 Developed new EME model projects in coordination with design and technology teams.
 Assisted in the overall design and development of software applications and systems.
 Provided technical assistance in developing file systems to support data, migration, and archive directories.
 Implemented procedures for management of Ab Initio applications and NDC servers.
 Responded to storage requests for submission to appropriate support groups.
 Prepared automated scripts, watcher files, and home and temporary directories for other teams.
 Executed effective processes for installation and validation on DR servers during drills.
 Supported technical teams in defining requirements and creating architectural specifications.
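Many of the batch loads above follow the same FastLoad pattern: drop the error tables, define the input record layout, and insert into the target. As a rough sketch of that pattern, a Python helper generating a minimal FastLoad control script might look like the following. The table, file, and logon values are placeholders, not the actual UA environment, and real scripts would add SESSIONS, CHECKPOINT, and proper error handling.

```python
def fastload_script(table, datafile, columns, delimiter="|"):
    """Generate a minimal Teradata FastLoad control script (illustrative)."""
    err1, err2 = f"{table}_err1", f"{table}_err2"
    # FastLoad VARTEXT input fields are defined as VARCHARs.
    define = ",\n   ".join(f"{c} (VARCHAR(255))" for c in columns)
    insert_cols = ", ".join(columns)
    insert_vals = ", ".join(f":{c}" for c in columns)
    return f"""\
.LOGON tdpid/user,password;
DROP TABLE {err1};
DROP TABLE {err2};
BEGIN LOADING {table} ERRORFILES {err1}, {err2};
SET RECORD VARTEXT "{delimiter}";
DEFINE
   {define}
   FILE={datafile};
INSERT INTO {table} ({insert_cols})
VALUES ({insert_vals});
END LOADING;
.LOGOFF;
"""


# Hypothetical staging table and feed file for illustration.
script = fastload_script("edw.payroll_stg", "payroll.dat",
                         ["emp_id", "pay_date", "amount"])
```

Generating the control script from metadata like this keeps the load definition in one place; the emitted text would then be fed to the `fastload` command-line utility.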

Client: Continental Group Inc., Miami, FL June 2002 – September 2005


Role: Teradata Consultant
 Respond to user and developer requests, helping them with day-to-day issues with the Teradata product.
 Assist developers with their assigned projects using the TD ETL tools, assist BI users with their queries using SQL Assistant, and help the automation team automate complex projects.

Client: Ministry of Finances, Port-au-Prince, Haiti 1999 – June 2002


Role: Teradata Developer
 Maintain and develop several batch and real-time projects using the TD DB. Some of these projects involved migrating data from several DBs to TD.
EDUCATION:
Master of Science in IT – Florida International University, January 2012 – December 2013
Bachelor of Science in Computer Science with a minor in French – Florida International University, May 2004 – Fall 2006
Miami Dade College (MDC), Miami, FL – Associate of Arts in Computer Science, Graduation Date: May 2004
