Task 1: Analysis of the Global Terrorism Database


Overview
The Global Terrorism Database analysis aimed to uncover patterns and trends in
global terrorism incidents, utilizing data manipulation, exploratory data analysis,
and machine learning techniques.

Goals
1. Comprehensive data collection
2. Understanding patterns and trends
3. Comparative analysis
Methods Used
• Data collection and pre-processing: obtain the dataset from reliable sources, then remove
duplicate entries, resolve inconsistencies, standardize variables, and address missing or
incomplete data.

• Perform analyses such as the following (a brief code sketch appears after this list):


• 1. Time series analysis: examine trends, seasonality, and cyclical patterns in the data,
which helps identify periods of increased or decreased activity.

• 2. Geographic analysis: identify regions or countries that are hotspots for terrorism and
explore spatial relationships and clustering.

• Data visualization: Visualizing the data through various plots and charts aids in
understanding patterns and relationships.

• Identify trends, patterns, and insights from the data through charts, graphs, and summary
statistics.
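
A minimal Python sketch of the pre-processing, time series, and geographic steps described above. The file name, encoding, and column names (iyear, country_txt) are assumptions based on a typical Global Terrorism Database export, not details given in this report.

import pandas as pd
import matplotlib.pyplot as plt

# Load the dataset (file name and encoding are assumptions)
df = pd.read_csv("globalterrorismdb.csv", encoding="latin-1", low_memory=False)

# Pre-processing: drop exact duplicates and rows missing key fields
df = df.drop_duplicates()
df = df.dropna(subset=["iyear", "country_txt"])

# Time series analysis: number of incidents per year
incidents_per_year = df.groupby("iyear").size()
incidents_per_year.plot(kind="line", title="Terrorism incidents per year")
plt.xlabel("Year")
plt.ylabel("Number of incidents")
plt.show()

# Geographic analysis: top 10 countries by incident count
print(df["country_txt"].value_counts().head(10))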
Task 2: Web scraping of a books website
Overview

The web scraping project involved extracting book-related information, such as
titles, authors, ratings, and reviews, from a books website using Python libraries
such as BeautifulSoup and Selenium.

Goals
• Gather data available on websites
• Collect the following information about every product on the website:
book title, price, availability, image, category, and rating
Methods Used
• Identify the target website and define the data to be extracted (e.g., book titles, authors,
ratings)

• Some Python libraries for web scraping:


• Beautiful Soup
• Requests
• LXML
• Selenium

• Analyze the website structure and develop scraping scripts to automate the data extraction
process (a brief sketch follows this list).
• Handle challenges such as website changes, data formatting, and data quality.
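
A minimal scraping sketch using Requests and Beautiful Soup from the library list above. The URL and CSS selectors assume the public practice site books.toscrape.com; the actual target site and its markup may differ.

import requests
from bs4 import BeautifulSoup

url = "https://books.toscrape.com/"
response = requests.get(url, timeout=10)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

books = []
for article in soup.select("article.product_pod"):
    title = article.h3.a["title"]                              # full title sits in the link's title attribute
    price = article.select_one("p.price_color").text           # e.g. "£51.77"
    availability = article.select_one("p.instock.availability").text.strip()
    rating = article.select_one("p.star-rating")["class"][1]   # e.g. "Three"
    image = article.select_one("img")["src"]                   # relative image path
    books.append({"title": title, "price": price, "availability": availability,
                  "rating": rating, "image": image})

for book in books[:5]:
    print(book)

Category information sits on each book's detail page, so collecting it would take a further request per book, which additional Requests calls or Selenium can handle.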
Hardware/software requirements
• Supported processor architectures: Intel (or compatible), 32-bit and 64-bit
• RAM: 1 GB
Conclusion

My internship in machine learning and data science has been an enriching experience
that has allowed me to develop my skills and knowledge in these fields. Over the
course of the internship, I had the opportunity to work on several projects and tasks,
including data cleaning, exploratory data analysis, predictive modeling, and
deployment of web-based interfaces. Through these projects, I gained practical
experience in applying machine learning and data science techniques to real-world
problems and learned how to effectively communicate my findings to stakeholders.
Overall, this internship has provided me with valuable skills, knowledge, and experience
that will be highly relevant to my future career in this field.
