JITHIN RAMPILLA
9802224462 | rampillajithin@[Link]
PROFESSIONAL SUMMARY:
• Python Developer with over 9 years of progressive experience, including Big Data development and Data Analytics (Python machine learning).
• Worked on data migration, data cleansing, data validation and other projects involving PySpark, Scala, Hive, Sqoop, Pandas and NumPy.
• Experience developing projects using Spark with Python.
• In-depth understanding of Spark architecture, including Spark Core and Spark SQL.
• Developed Spark programs using the Python APIs.
• Strong experience in Big Data technologies such as Spark SQL, PySpark and Hive.
• Experience in data pre-processing, data analysis and machine learning to derive insights from structured and unstructured data.
• Experience as a Python Automation Developer in the cable industry, developing scripts in Python and testing voice, video and data cable services.
• Developed scripts in Python and Selenium for automation testing on tablets and RVP boards such as Skylake, Kaby Lake and SoC platforms.
• Experience with PySpark on AWS S3 and EMR.
• Knowledge of Kubernetes, Docker, containerization, Gen AI and Django.
• Excellent interpersonal and communication skills, efficient time management and organizational skills, and the ability to handle multiple tasks and work well in a team environment.
TECHNICAL SKILLS:
Hadoop Ecosystem: Hadoop, Big Data, HDFS, PySpark, Hive
Data Analytics: Pandas, NumPy
Machine Learning: scikit-learn, seaborn, matplotlib
Languages: Python
Databases: MySQL, Oracle, SQLite, DynamoDB
Amazon Services: S3, EMR, DynamoDB, CloudWatch, EC2, Lambda, Step Functions, SNS, SQS, VPC, CloudFormation, Redshift, EventBridge, Glue
Palantir Foundry: Contour, Data Lineage, Code Workbook, Ontology, Slate, Object Manager, Function-based Actions, Fusion
RESTful APIs: Requests, PycURL, Flask, Postman
Scripting: AutoIt, HTML, CSS, JSON, Shell
Hardware: SoC, Kaby Lake, Skylake, STB (Set-Top Box)
Operating Systems: Windows, Linux
DevOps Tools: Jenkins, Git, Bitbucket, Selenium, Splunk, Jira
IDEs: PyCharm, IDLE, Eclipse, Jupyter
EDUCATION:
Five-year Integrated Bachelor & Master of Technology (Computer Science) | 2013 | Andhra University
PROFESSIONAL EXPERIENCE:
June 2023 – Dec 2024  Python Developer | Bank of America, Dallas, TX
Received communication streams from different platforms (Skype, Microsoft, etc.) and developed the common processing thread in Python. Responsible for engineering and development in support of the Bank's Enterprise Regulatory E-Communications.
• Developed Python code to translate files of varying types (e.g., XML, JSON, CSV, EML) into a specific XML format using the Aspose, Pandas and pyodbc modules.
• Extracted and filtered the required data from the database using Pandas.
• Developed Python programs to parse XML documents and load data to the database.
• Designed and implemented a data pipeline to process semi-structured data, integrating raw data from 100+ data feed sources using Python.
• Implemented object-oriented design principles to enhance code readability and
maintainability.
• Integrated Python with web development tools and web services.
• Designed and developed web services and RESTful APIs using requests, PycURL and Flask; tested with Postman.
• Built scalable microservices using Flask and Django.
• Performed unit testing using pytest and participated in code reviews.
• Captured and logged exceptions and abnormalities with automated reporting, and validated requirements.
Environment: Python, MySQL, Git, JIRA, Pandas, NumPy, Bitbucket, Postman, PycURL, Requests, docx, BeautifulSoup, Aspose, Smarsh, pyodbc
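The file-type translation work above can be illustrated with a minimal, standard-library-only sketch (the production pipeline used Aspose and pyodbc; the record fields and the <Messages>/<Message> target schema here are hypothetical stand-ins):

```python
import json
import xml.etree.ElementTree as ET

def records_to_xml(json_text: str) -> str:
    """Translate a JSON array of records into a simple XML document.

    The <Messages>/<Message> schema is a hypothetical stand-in for the
    actual target XML format.
    """
    records = json.loads(json_text)
    root = ET.Element("Messages")
    for rec in records:
        msg = ET.SubElement(root, "Message")
        for key, value in rec.items():
            ET.SubElement(msg, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

sample = '[{"sender": "alice", "body": "hello"}]'
print(records_to_xml(sample))
```

The same loop extends to CSV or EML inputs by swapping the parser while keeping the XML-building step unchanged.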
Dec 2022 – May 2023  Python Developer | Intel Corporation, Dallas, TX
• Automation development using Python, contributing to data analysis with machine learning models; developed tools and libraries in Python.
• Developed Spark jobs using Python; automated all possible test cases and developed automation scripts.
• Built scalable microservices using Flask.
• Fluency in Python with working knowledge of ML and statistical libraries.
• Extensively used Pandas, NumPy, seaborn and matplotlib while developing various machine learning algorithms.
• Experience in Python development and scientific programming, using NumPy and Pandas for data manipulation.
• Triaged and debugged failures; performed root cause analysis.
• Predicted values using different machine learning models.
Environment: Python 3.6.4, HSD, Git, JIRA, Jenkins, Pandas, NumPy, scikit-learn, seaborn, matplotlib
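The value-prediction work above used scikit-learn models; the underlying idea can be sketched with a stdlib-only ordinary-least-squares fit (the sample data below is illustrative):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

def predict(model, x):
    """Predict y for a new x from a fitted (slope, intercept) pair."""
    slope, intercept = model
    return slope * x + intercept

model = fit_line([1, 2, 3, 4], [2, 4, 6, 8])
print(predict(model, 5))  # a perfectly linear sample predicts 10.0
```

In practice the same fit-then-predict shape is what `sklearn`'s `fit()`/`predict()` API wraps, with richer models behind it.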
May 2020 – Jan 2022  Python Developer | Airbus, Bangalore, India
AIRBUS: Skywise is a major Airbus initiative to provide better services to airlines by collecting and analysing data from in-service aircraft to ease maintenance activities (component supply, repair, full airframe maintenance).
• Predicted workload using machine learning techniques and developed different models to improve the accuracy of the predicted values.
• Developed a UI using the tkinter module for the KDB tool application.
• Led a cross-functional team of data professionals in developing a machine learning platform that leveraged quantum computing algorithms, resulting in an improvement in model accuracy.
• Built an ontology-backed time-series analysis and monitoring product using PySpark and Python, enabling clients to visually analyze dashboards.
• Developed and maintained PySpark data pipelines for processing large datasets.
• Collaborated with Skywise experts to create interactive dashboards that visualized flight
patterns and performance data
• Created data-rich dashboards, widgets and dropdowns from MPD datasets using Slate, CSS, HTML and JavaScript.
• Designed and implemented PySpark, Python and SQL jobs to extract and analyze data, providing insights into aircraft performance metrics.
• Created a Python based GUI application for Freight Tracking and processing.
• Worked with Code Repository, Code Workbook, Contour and Data Lineage for connecting different databases.
• Designed and implemented data pipelines for data processing and transformation using Code Repository, Code Workbook, Data Lineage and data health checks.
• Worked on a Python-based data analytics platform with a JavaScript frontend.
• Monitored data quality and data health using Data Lineage health checks.
• Used Workshop to build interactive, high-quality applications and create layouts.
• Developed multiple dashboards for quicker data processing using Palantir’s Skywise
Platform: Data Ingestion, Data Preparation, Data Visualization etc.
• Extensive ETL design skills, maintaining and developing data pipelines using Python and PySpark on Foundry (Palantir).
• Used Monocle for data lineage when creating datasets; connected and analyzed datasets using Contour.
Environment: Palantir Foundry, Code Workbook, Ontology, Function-based Actions, Object Views, Fusion, Code Repository, Health Checks, Contour, Slate, Workshop, Quiver
Nissan
• Developed scripts for the Nissan project in the Alexa Developer Console using Python.
• Extracted and filtered the required data using PySpark and Pandas.
• Worked in the Amazon Alexa Developer Console and created skills for various scenarios.
• Worked with DynamoDB, EC2 instances and other AWS services.
• Created Lambda functions to fire emails and update data in the database.
• Added support for Amazon EMR, S3 and Redshift, and created a VPC for migration; worked with EMR clusters, VPCs and EC2 instances.
• Created DAGs and worked with Amazon Managed Workflows for Apache Airflow.
• Connected Python and PySpark with AWS using the boto and awscli modules.
• Worked with CloudWatch, AWS Batch, Step Functions, SNS and SQS to monitor and send notifications for Airflow tasks.
Environment: AWS services, Python 3.6.4, Pandas, NumPy, scikit-learn, seaborn, matplotlib, Palantir Foundry, MySQL, Flask, Postman
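The email-firing/database-updating Lambda functions above can be sketched as a handler; this stdlib-only stand-in returns the actions instead of executing them so it runs without AWS (a deployed version would hand them to boto3 SES/DynamoDB clients, and the SQS-style event shape and field names here are hypothetical):

```python
import json

def lambda_handler(event, context=None):
    """Parse an SQS-style event into email and DB-update actions.

    Returns the planned actions rather than executing them, so the
    parsing logic is testable locally without AWS credentials.
    """
    actions = []
    for record in event.get("Records", []):
        body = json.loads(record["body"])  # each record carries a JSON body
        actions.append({
            "email_to": body["owner"],
            "subject": f"Status change for {body['item_id']}",
            "db_update": {"item_id": body["item_id"], "status": body["status"]},
        })
    return {"statusCode": 200, "actions": actions}

event = {"Records": [{"body": json.dumps(
    {"owner": "ops@example.com", "item_id": "42", "status": "done"})}]}
print(lambda_handler(event)["actions"][0]["email_to"])
```

Keeping side effects out of the parsing step is what makes this kind of handler easy to unit-test before wiring it to SNS/SQS triggers.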
Dec 2018 – May 2020  Python Developer | CITI, Chennai, India
EAP Analytics: The objective of this project is to provide an execution platform, with user interfaces as needed, for EAP execution-module management and execution based on modules written in Python. The scope includes a platform to execute Python-based modules and store the module outputs in Optima Retail EAP.
• Developed Spark jobs using Python; uploaded and processed data from various structured and unstructured sources, handling large datasets using Pandas data frames and MySQL.
• Engineered a distributed ETL framework using PySpark and Airflow, processing large volumes of data from diverse sources and improving data processing efficiency.
• Developed Airflow DAGs for executing the framework.
• Conducted exploratory data analysis and data cleaning, ensuring data quality, consistency and integrity using Pandas and NumPy.
• Conducted exploratory data analysis using Spark SQL with various data sources such as JSON, Parquet, CSV, Oracle and Hive.
Environment: Python 2.7.8, Airflow, Shell, HDFS, PySpark, Hive, Pandas, NumPy, Linux, Bitbucket
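The data-quality and consistency checks above were done with Pandas/NumPy at scale; the same idea in a stdlib-only sketch over list-of-dict rows (the column names are hypothetical):

```python
def quality_report(rows, required):
    """Count missing values per required column and exact-duplicate rows."""
    missing = {col: 0 for col in required}
    seen, duplicates = set(), 0
    for row in rows:
        for col in required:
            if row.get(col) in (None, ""):
                missing[col] += 1
        key = tuple(sorted(row.items()))  # canonical form for duplicate check
        if key in seen:
            duplicates += 1
        seen.add(key)
    return {"missing": missing, "duplicates": duplicates}

rows = [
    {"module": "eap1", "runtime": 12},
    {"module": "eap1", "runtime": 12},   # exact duplicate
    {"module": "", "runtime": None},     # missing values
]
print(quality_report(rows, ["module", "runtime"]))
```

In Pandas the equivalents are `isna().sum()` and `duplicated().sum()`; the sketch just makes the rule being checked explicit.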
May 2017 – Dec 2018  Python Developer | COMCAST, Chennai, India
Python Development: Developed scripts in Python for automation testing tools. Responsible for designing, developing, implementing and executing tests for Customer Premises Equipment (CPE) video products; assisted in identifying and resolving problems and defects, and in reproducing bugs.
• Developed scripts in Python and Selenium for Comcast tools.
• Used the Pandas and NumPy APIs for triage, loading raw data and exporting it to various formats.
• Developed libraries and tools for data transformation and aggregation.
• Experience with Python, Jupyter and the scientific computing stack (NumPy, SciPy, Pandas and matplotlib).
• Troubleshot, fixed and deployed many Python bug fixes for the two main applications that were a primary source of data for both customers and the internal customer service team.
• Wrote Python scripts to parse JSON documents and load the data into the database.
• Generated various graphical capacity-planning reports using Python packages such as NumPy and matplotlib.
Environment: Python, Pandas, NumPy, Jenkins, GitHub, Test Manager, Splunk, PyCharm
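The JSON-parsing and database-loading scripts above can be sketched with the standard library (sqlite3 stands in here for the production database; the table and field names are hypothetical):

```python
import json
import sqlite3

def load_json(conn, json_text):
    """Parse a JSON array of device records and insert them into SQLite."""
    records = json.loads(json_text)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS devices (device_id TEXT, status TEXT)")
    conn.executemany(
        "INSERT INTO devices (device_id, status) VALUES (?, ?)",
        [(r["device_id"], r["status"]) for r in records])
    conn.commit()
    return len(records)

conn = sqlite3.connect(":memory:")
n = load_json(conn, '[{"device_id": "stb-01", "status": "online"}]')
print(n, conn.execute("SELECT status FROM devices").fetchone()[0])
```

Parameterized `executemany` keeps the insert safe and fast regardless of how many records the JSON document carries.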
Dec 2014 – Apr 2017  Python Developer | Intel Corporation, Bangalore, India
Developed scripts in Python and Selenium for automation testing on tablets and RVP boards such as Skylake, Kaby Lake and SoC platforms.
• Developed scripts in Python for the NPK tool.
• Performed R&D to meet automation expectations on particular platforms.
• Provided solutions for issues faced during execution.
• Developed scripts and libraries in Python.
• Developed scripts using Python, Selenium, Beautiful Soup and web scraping for Intel tools.
Environment: Skylake, Kaby Lake, Touch Robo, Touchless Tool Kit, Brain Box, NPK Tool, Python 2.7.7, AutoIt, Batch, Windows 10, PyCharm
Oct 2013 – Apr 2014  Technical Associate | Dell Technologies
Responsible for diagnosing, troubleshooting and repairing computer systems, software systems and applications, and providing phone-based technical support for Dell client products. Assisted customers in determining problems and provided resolutions to technical and service issues, analyzing and solving software and hardware problems by investigating potential solutions.