Resume Lisa Pang

Lisa Pang is pursuing a Master of Science in Computer Science at the University of East London, with a GPA of 90/100. She has professional experience as a Data Engineer at China CITIC Bank, where she optimized data pipelines and developed automation tools, and as a Data Analyst at Partner Cloud Information Technology, managing advertising budgets and improving reporting efficiency. Her technical skills include Python, SQL, and various data visualization tools, and she is fluent in Mandarin and English.


Lisa Pang

+65 84760398 | lisa_819@[Link] | LinkedIn

EDUCATION
UNIVERSITY OF EAST LONDON SINGAPORE
Master of Science in Computer Science Jun 2024 – Jun 2025
GPA: 90/100
Core Courses: Advanced Software Engineering, Artificial Intelligence, Cloud Computing, Machine Learning, Data Analytics

PROFESSIONAL EXPERIENCE
CHINA CITIC BANK BEIJING, CN
Data Engineer Jun 2022 – May 2024
 Data Pipeline Optimization: Refactored Python ETL scripts to improve data cleaning efficiency by 35%, increasing
customer and product transaction data processing from 50GB to 120GB.
 Data Warehouse Construction: Designed and implemented a MySQL data warehouse model for product push
campaigns, prize activities, and customer metrics, supporting the generation of 10+ business line reports while
reducing data redundancy by 40%.
 Automation Tool Development: Automated daily, weekly, and monthly data update tasks using Airflow scheduling,
reducing manual intervention time from 2 hours per day to just 10 minutes.
 Dashboard Development: Built reports and dashboards in Tableau, Power BI, and other visualization tools,
delivering monthly data dashboards that helped business teams quickly identify issues, improving efficiency by 30%.

PARTNER CLOUD INFORMATION TECHNOLOGY BEIJING, CN
Data Analyst Jun 2020 – May 2022
 Advertising Data Analysis: Managed an annual $3 million advertising budget, optimizing strategies through
audience segmentation, A/B testing of ad creatives, bid adjustments, and budget reallocation across platforms.
These tactics led to an 18% improvement in average CTR, driving more efficient campaign performance.
 Business Operations Support: Developed a linear attribution model using Python and SQL to analyze customer
touchpoints across different marketing channels. The model increased the accuracy of social product customer LTV
predictions by 22%, providing actionable insights for better budget allocation and customer targeting.
 Dashboard Automation: Designed and implemented an automated reporting system using Tableau and Airflow to
streamline data visualization and reporting processes. The system improved the team's analysis efficiency by 35%,
enabling faster decision-making and more timely strategic recommendations.

PROJECT
RETAIL BANKING CUSTOMER CREDIT SCORING MODEL OPTIMIZATION BEIJING, CN
China CITIC Bank Jun 2022 – May 2024
 Refactored the traditional SAS credit scoring system to the PySpark platform, enhancing the efficiency of the ETL
process by 50% and enabling terabyte-scale historical data backtracking analysis.
 Developed an automated feature warehouse using Airflow and Snowflake to dynamically generate over 100 risk
dimension indicators, increasing the model AUC to 0.82.
 Implemented data lineage tracking with Apache Atlas to meet BCBS 239 compliance requirements, reducing audit
preparation time by 65%.

SKILLS & INTERESTS
 Technical: Python (Pandas/NumPy), SQL; Spark, Hadoop, AWS (Redshift/S3/Glue), Airflow; Tableau, Google Analytics
 Languages: Mandarin (Mother Tongue), English (Fluent)
 Certificates: AWS Certified Machine Learning, Generative AI Vocational Skill Base (linked)
