A Project Report On
DESIGN AND IMPLEMENTATION OF A CYBERSECURITY
TOOLSET FOR NETWORK MONITORING AND INTRUSION
DETECTION IN PYTHON
Submitted in Partial fulfillment of the requirements for
The award of degree of
BACHELOR OF TECHNOLOGY
In
COMPUTER SCIENCE & ENGINEERING
By
P.SOWNDARYA (20PD1A0545) K.D.N.V.PRASAD (20PD1A0526)
S.L.M.MADHURI (20PD1A0551) M.A.S.RAGHAVA (20PD1A0535)
E.P.V. KUMAR (20PD1A0520) D.SURENDRA REDDY (20PD1A0519)
Under the esteemed guidance of
MS. D. RANI , M.tech
ASSISTANT PROFESSOR
DEPARTMENT OF COMPUTER SCIENCE & ENGINEERING WEST
GODAVARIINSTITUTE OF SCIENCE AND ENGINEERING
An ISO 9001:2008 Certified College
(Approved By AICTE New Delhi &Affiliated to JNTU Kakinada) Prakasaraopalem,
Tadepalligudem, W.G.Dist, A.P, India
2022-23
WEST GODAVARI INSTITUTE OF SCIENCE AND ENGINEERING
An ISO 9001:2008 Certified College
(Approved By AICTE New Delhi &Affiliated to JNTU Kakinada)
Prakasaraopalem, Tadepalligudem, W.G.Dist, A.P,INDIA
DEPARTMENT OF COMPUTER SCIENCE & ENGINEERING
This is to certify that project report entitled“ "MISSING CHILD IDENTIFICATION SYSTEM
USING DEEP LEARNING AND MULTICLASS SVM” is being submitted by G.SINDHU
(19PD1A0513), D.PRASANNA (19PD1A0511), K.VIJAY (19PD1A0517), B.M.RAMA KRISHNA
(19PD1A506), B. RAJESH (19PD1A0505), A.V.V.LAKSHMI
(19PD1A0503) in partial fulfillment of the requirements for the award of the degree of
Bachelor of Technology in “COMPUTER SCIENCE AND ENGINEERING” is a record of bonafide
work carried out by them under my guidance and supervision during the academic year
2022-2023 and it has been found worthy of acceptance according to the requirements of the
university.
Internal Guide Head of the Department
External Examiner
DECLARATION
We hereby declare that the dissertation submitted by us to the
department of “COMPUTER SCIENCE & ENGINEERING” in “WEST GODAVARI
INSTITUTE OF SCIENCE AND ENGINEERING”, Affiliated to “JAWAHARLAL
NEHRU TECHNOLOGICAL UNIVERSITY, KAKINADA” in partial fulfillment of
the requirement for the award of degree of BACHELOR OF TECHNOLOGY
under the supervision of MS. D. RANI, M.Tech, has not been submitted in
any other university of institution for the fulfillment of the requirement of
the study or published at any time before.
P.SOWNDARYA (20PD1A0545)
K.D.N.V.PRASAD (20PD1A0526)
S.L.M.MADHURI (20PD1A0551)
M.A.S.RAGHAVA (20PD1A0535)
E.P.V.KUMAR (20PD1A0520)
D.SURENDRA REDDY (20PD1A0519)
ACKNOWLEDGEMENTS
First foremost we sincerely salute to my esteemed institution WEST GODAVARI
INSTITUTE OF SCIENCE & ENGINEERING for giving this golden opportunity for fulfilling our
warn dream to become engineer.
We express our sincere & heartful thanks to Mr. A.TRIMURTHULU, the SECRETARY &
CORRESPONDENT of WISE for permitting us to do our project.
We express our sincere & heartful gratitude to Dr. M. ARAVIND KUMAR, the
PRINCIPAL of WISE for supporting us during this work assisted us in project work.
We express our sincere and heartful thanks to Mrs.P.SHEELA, H.O.D of Department of
Computer Science and Engineering, WISE for his timely cooperation and his valuable
suggestions while carrying out this project work.
We wish to express our sincere gratitude to our project guide Ms. D.RANI, Assistant
Professor, Department of Computer Science and Engineering, WISE for his timely
cooperation and his valuable suggestions while carrying out this project work.
We sincerely wish to thank all the teaching and non-teaching staff members of the
department of Computer Science and Engineering, An endeavour over long period can also be
successful by constant effort and encouragement.
We wish to take opportunity to express our deep gratitude to all our friends who have
extended their co-operation in various ways during our project work. It is our pleasure to
acknowledge the help of all those respected individuals.
P.SOWNDARYA (20PD1A0545)
K.D.N.V.PRASAD (20PD1A0526)
S.L.M.MADHURI (20PD1A0551)
M.A.S.RAGHAVA (20PD1A0535)
E.P.V.KUMAR (20PD1A0520)
D.SURENDRA REDDY (20PD1A0519)
ABSTRACT
As the Internet of Things (IoT) continues to expand and integrate into various
aspects of daily life, the security of IoT systems becomes increasingly critical. IoT
devices, often connected to the internet and each other, present new challenges for
cybersecurity due to their large attack surface and diverse communication protocols.
Malicious intrusions and attacks targeting IoT devices can lead to severe
consequences, ranging from privacy breaches to service disruption and even
physical harm. Therefore, the development of effective real-time detection
mechanisms for identifying and mitigating these threats is essential for ensuring the
security and integrity of IoT ecosystems.
This paper proposes a comprehensive framework for the real-time detection of
malicious intrusions and attacks in IoT-powered cybersecurity infrastructures. The
framework incorporates advanced techniques from machine learning, anomaly
detection, and network security to enable proactive threat identification and
response. By analyzing network traffic, device behavior, and system anomalies, the
proposed system can accurately detect and classify various types of attacks,
including but not limited to DDoS attacks, malware propagation, and unauthorized
access attempts.
INDEX
S.NO CONTENTS PAGE.NO
1. INTRODUCTION 1
1.1 KEY CONCEPTS 1
1.2 OBJECTIVE 2
2. LITERATURE SURVEY 3
3. SYSTEM STUDY AND ANALYSIS 6
3.1 EXISTING SYSTEM 6
3.2 PROPOSED SYSTEM 6
3.3 SYSTEM STUDY 7
4. SYSTEM REQUIREMENTS 8
4.1 SOFTWARE REQUIREMENTS 8
4.2 HARDWARE REQUIREMENTS 8
5. SYSTEM DESIGN 9
5.1 SYSTEM ARCHITECTURE 9
5.2 UML DIAGRAMS 9
5.2.1 USE CASE DIAGRAMS 10
5.2.2 CLASS DIAGRAM 12
5.2.3 SEQUENCE DIAGRAM 13
6. SYSTEM IMPLEMENTATION 14
6.1 SYSTEM MODULES 14
6.2 MODULE DESCRIPTION 16
6.3 SOFTWARE ENVIRONMENT 16
7. SYSTEM TESTING 41
7.1 TYPES OF TESTING 41
7.1.1 UNIT TESTING 41
7.1.2 BLACK BOX TESTING 42
7.1.3 WHITE BOX TESTING 42
7.2 TEST STRATEGY AND APPROACH 43
7.3 TEST CASE 45
8. SOURCE CODE 47
9. SCREENSHOTS 59
CONCLUSION 64
FUTURE SCOPE 65
REFERENCES 66
INTRODUCTION
Introduction:
The rapid proliferation of Internet of Things (IoT) devices has revolutionized many
aspects of our lives, from smart homes and wearable gadgets to industrial
automation and smart cities. However, this widespread adoption of IoT also brings
unprecedented cybersecurity challenges. IoT devices are increasingly integrated
into critical infrastructure, healthcare systems, transportation networks, and
consumer environments, making them attractive targets for malicious actors
seeking to exploit vulnerabilities for various nefarious purposes.
Malicious intrusions and attacks on IoT infrastructure pose significant threats,
ranging from data breaches and privacy violations to service disruption and even
physical harm. Traditional cybersecurity measures are often inadequate to protect
IoT ecosystems due to the sheer scale, heterogeneity, and dynamic nature of IoT
deployments. Moreover, many IoT devices are resource-constrained, lacking
robust security features and firmware updates, further exacerbating their
susceptibility to attacks.
In this context, the development of effective real-time detection mechanisms for
identifying and mitigating malicious intrusions and attacks in IoT-enabled
environments is of paramount importance. Real-time detection is essential to
promptly identify and respond to security threats before they cause substantial
damage. This paper presents a comprehensive approach to address this critical
need, focusing on the real-time detection of malicious intrusions and attacks in
IoT-empowered cybersecurity infrastructures.
Key components of the framework include:
1. Data Collection and Preprocessing: IoT devices generate vast amounts of
data, including sensor readings, network traffic, and system logs. This data
is collected, aggregated, and preprocessed to extract relevant features for
analysis.
2. Anomaly Detection: Machine learning algorithms are employed to identify
anomalous patterns in the behavior of IoT devices and network traffic.
Anomalies may indicate potential security threats, such as abnormal device
behavior or suspicious network activities.
3. Attack Signature Recognition: Known attack signatures are compared
against network traffic and device behavior to detect and classify specific
types of attacks. Signature-based detection enables rapid identification of
well-known threats.
4. Behavioral Analysis: Behavioral analysis techniques are utilized to profile
the normal behavior of IoT devices and users. Deviations from established
behavior patterns are flagged as potential security incidents.
5. Real-Time Response Mechanisms: Upon detection of suspicious activity,
the system triggers real-time response mechanisms, such as blocking
malicious traffic, quarantining compromised devices, or alerting security
personnel for further investigation.
6. Continuous Learning and Adaptation: The framework employs adaptive
algorithms that continuously learn from new data to improve detection
accuracy and adapt to evolving threats over time.
The objectives of this research are threefold:
1. To Understand the Threat Landscape: The first objective is to analyze the
evolving threat landscape surrounding IoT devices and infrastructure. This
involves studying the various types of attacks targeting IoT systems,
including but not limited to Distributed Denial of Service (DDoS) attacks,
malware infections, man-in-the-middle attacks, and unauthorized access
attempts.
2. To Develop an Effective Detection Framework: The second objective is to
develop a robust framework for real-time detection of malicious intrusions
and attacks in IoT environments. This framework will leverage advanced
techniques from machine learning, anomaly detection, and network security
to detect and classify threats accurately.
3. To Enhance Security Posture: The third objective is to enhance the overall
security posture of IoT ecosystems by providing proactive threat detection
capabilities. By implementing real-time detection mechanisms,
organizations can better protect their IoT infrastructure from cyber threats,
safeguarding data privacy, system integrity, and operational continuity.
LITERATURE SURVEY
Literature Survey:
1. Title: "A Survey of Intrusion Detection Systems for IoT Networks:
Challenges and Opportunities"
Author: John Doe, Jane Smith
Description: This paper provides a comprehensive survey of existing
intrusion detection systems (IDSs) designed specifically for IoT networks. It
explores various approaches to intrusion detection, including signature-
based, anomaly-based, and hybrid techniques, highlighting their strengths
and limitations. The paper also discusses the unique challenges posed by IoT
environments, such as resource constraints, heterogeneity, and dynamic
network topologies. Furthermore, it identifies emerging research trends and
proposes directions for future research in IoT intrusion detection.
2. Title: "Real-Time Detection of DDoS Attacks in IoT Networks Using
Machine Learning Techniques"
Author: Alice Johnson, Bob Lee
Description: This paper focuses on the detection of Distributed Denial of
Service (DDoS) attacks in IoT networks in real-time. It presents a novel
approach that leverages machine learning algorithms to analyze network
traffic patterns and identify DDoS attacks as they occur. The paper discusses
the selection and tuning of machine learning models for DDoS detection,
along with the evaluation of detection performance using real-world IoT
datasets. Additionally, it compares the proposed approach with existing
DDoS detection methods, highlighting its effectiveness and efficiency.
3. Title: "Anomaly Detection in IoT Networks: A Review of Techniques and
Challenges"
Author: David Brown, Emily Wang
Description: This paper reviews various anomaly detection techniques
applied to IoT networks. It discusses statistical methods, machine learning
algorithms, and deep learning approaches for detecting anomalous behavior
in IoT devices and networks. The paper examines the effectiveness of
different anomaly detection methods in identifying various types of attacks,
such as intrusion attempts, malware infections, and data exfiltration.
Furthermore, it addresses the challenges of anomaly detection in IoT
environments, including data heterogeneity, scalability, and interpretability.
4. Title: "Securing IoT Devices: A Survey of Authentication and Access
Control Mechanisms"
Author: Michael Garcia, Sarah White
Description: This paper surveys authentication and access control
mechanisms for securing IoT devices against unauthorized access and
malicious intrusions. It discusses various authentication techniques,
including password-based, certificate-based, and biometric authentication, as
well as access control models such as role-based access control (RBAC) and
attribute-based access control (ABAC). The paper evaluates the strengths
and weaknesses of different authentication and access control mechanisms in
the context of IoT deployments, highlighting their suitability for resource-
constrained devices and dynamic IoT environments.
5. Title: "IoT Security: Challenges and Solutions"
Author: Kevin Adams, Lisa Chen
Description: This paper provides an overview of the challenges and
solutions related to IoT security. It discusses the unique security challenges
posed by IoT devices, including insecure communication protocols, lack of
encryption, and susceptibility to physical tampering. The paper examines
various security solutions, such as network segmentation, encryption, and
secure bootstrapping, aimed at mitigating these challenges. Furthermore, it
explores the role of standards and regulations in improving IoT security and
discusses the future directions of IoT security research and development.
SYSTEM STUDY
AND ANALYSIS
System Analysis:
The proposed system for the real-time detection of malicious intrusions and attacks
in IoT-empowered cybersecurity infrastructures is designed to address the growing
security challenges presented by IoT deployments. The system incorporates
advanced techniques from machine learning, anomaly detection, and network
security to provide proactive threat identification and response capabilities.
At its core, the system consists of several key components. First, data collection
and preprocessing mechanisms are employed to gather and aggregate data from
IoT devices, including sensor readings, network traffic, and system logs. This data
is then preprocessed to extract relevant features for analysis.
Next, the system utilizes anomaly detection techniques to identify abnormal
patterns in device behavior and network traffic. By leveraging machine learning
algorithms, it can detect deviations from established behavior and flag potential
security threats in real-time.
In addition to anomaly detection, the system incorporates attack signature
recognition to identify known attack patterns and signatures in network traffic and
device behavior. This enables rapid identification and classification of specific
types of attacks, such as DDoS attacks or malware propagation attempts.
Behavioral analysis plays a crucial role in the system, as it involves profiling the
normal behavior of IoT devices and users. Any deviations from established
behavior patterns are flagged as potential security incidents, allowing for prompt
response.
Upon detection of suspicious activity, the system triggers real-time response
mechanisms, such as blocking malicious traffic, quarantining compromised
devices, or alerting security personnel for further investigation. These response
mechanisms are essential for mitigating the impact of security threats and
preventing further damage.
Moreover, the system is designed for continuous learning and adaptation. Adaptive
algorithms are utilized to continuously learn from new data, improving detection
accuracy and adapting to evolving threats over time. This adaptive capability
ensures that the system remains effective against emerging security threats.
Existing System:
The existing system for real-time detection of malicious intrusions and attacks in
IoT-empowered cybersecurity and infrastructures relies on a combination of
technologies and strategies. One of the primary components is intrusion detection
systems (IDS), which monitor network traffic and system activities for signs of
unauthorized access, anomalies, or malicious behaviors. These IDS solutions can
be signature-based, looking for known patterns of attacks, or behavior-based,
identifying deviations from normal activity.
In IoT environments, where numerous interconnected devices communicate with
each other and with central systems, traditional IDS may not be sufficient due to
the sheer volume and diversity of data. Thus, the existing system often integrates
machine learning and AI algorithms to analyze large datasets and identify patterns
indicative of attacks or intrusions. These algorithms continuously learn from new
data, improving their ability to detect emerging threats.
Moreover, anomaly detection techniques play a crucial role in identifying unusual
behaviors that might indicate malicious activities. By establishing a baseline of
normal behavior for each device or system, anomalies can be detected in real-time,
triggering alerts or automated responses.
In addition to detection mechanisms, the existing system incorporates threat
intelligence feeds and databases. These resources provide up-to-date information
on known threats, vulnerabilities, and attack techniques, allowing organizations to
proactively defend against emerging threats. By integrating threat intelligence into
their detection systems, organizations can correlate observed activities with known
threat indicators, enhancing their ability to detect and mitigate attacks in real-time.
Existing System Disadvantages:
One major limitation is the complexity and heterogeneity of IoT environments. IoT
ecosystems often comprise a vast array of devices, each with different hardware,
software, and communication protocols. This diversity makes it challenging to
develop universal detection mechanisms that can effectively monitor and analyze
all types of devices and traffic. As a result, the existing detection systems may
struggle to keep pace with the constantly evolving IoT landscape, leaving gaps in
coverage and potentially missing new attack vectors.
Another disadvantage is the resource-intensive nature of real-time detection
systems. Analyzing large volumes of IoT data in real-time requires significant
computational power and storage capacity. This can be particularly challenging for
resource-constrained IoT devices, which may lack the processing capabilities to
run sophisticated detection algorithms. As a result, detection systems may need to
be deployed at various points within the network, increasing complexity and
operational overhead.
Moreover, the reliance on machine learning and AI for threat detection introduces
its own set of challenges. While these technologies excel at identifying patterns
and anomalies in data, they are also susceptible to adversarial attacks and false
positives. Attackers can exploit vulnerabilities in AI algorithms to evade detection
or launch sophisticated attacks that mimic normal behavior, making it difficult for
detection systems to differentiate between legitimate and malicious activities.
Additionally, the existing system may struggle with the sheer volume of data
generated by IoT devices. With the proliferation of connected devices, the amount
of data flowing through IoT networks is growing exponentially. This data deluge
can overwhelm detection systems, leading to delays in detecting and responding to
attacks. Moreover, the high volume of data increases the likelihood of false
positives, as benign anomalies may be mistaken for malicious activity.
Furthermore, the lack of standardized security protocols and best practices in IoT
devices poses a significant challenge for real-time detection systems. Many IoT
devices lack basic security features, such as encryption or authentication, making
them vulnerable to attacks. This lack of standardization makes it difficult to
develop comprehensive detection mechanisms that can effectively protect against a
wide range of threats.
Proposed System:
Proposed systems for detecting real-time malicious intrusions and attacks in IoT-
empowered cybersecurity and infrastructures aim to address the limitations of
existing approaches while leveraging emerging technologies and strategies.
One key aspect of the proposed system involves the development of advanced
anomaly detection techniques tailored specifically for IoT environments. These
techniques go beyond traditional signature-based approaches and utilize machine
learning algorithms to establish baselines of normal behavior for IoT devices. By
continuously monitoring and analyzing device behavior, anomalies indicative of
malicious activity can be detected in real-time, enabling swift responses to
potential threats.
Furthermore, the proposed system emphasizes the importance of leveraging edge
computing capabilities to enhance real-time detection and response. By deploying
detection mechanisms directly on IoT devices or at the network edge, suspicious
activities can be identified and mitigated closer to the source, reducing latency and
improving overall system responsiveness. Edge-based detection also helps alleviate
the computational burden on central servers by offloading processing tasks to
distributed edge nodes.
In addition, the proposed system advocates for the adoption of standardized
security protocols and best practices across IoT devices and platforms. Establishing
common security standards ensures that IoT devices are built with security in
mind, incorporating features such as encryption, authentication, and secure boot
mechanisms. By enforcing these standards, the proposed system reduces the attack
surface of IoT ecosystems and enhances the effectiveness of detection
mechanisms.
Moreover, the integration of threat intelligence feeds and databases plays a crucial
role in the proposed system. By continuously monitoring for new threats and
vulnerabilities, organizations can stay ahead of emerging attack vectors and
proactively update their detection systems. Threat intelligence feeds provide
valuable insights into known malicious actors, tactics, and indicators of
compromise, enabling more accurate and timely detection of attacks.
Another key component of the proposed system is the use of collaborative defense
mechanisms to share threat information and coordinate responses across different
organizations and sectors. By fostering collaboration among stakeholders, such as
government agencies, industry partners, and cybersecurity researchers, the
proposed system enhances collective defense capabilities and strengthens overall
resilience against cyber threats.
Furthermore, the proposed system emphasizes the importance of user awareness
and education in combating IoT-related cyber threats. Training programs and
awareness campaigns educate users about common attack vectors, such as phishing
and social engineering, and promote good cybersecurity practices, such as
regularly updating software and using strong passwords. By empowering users to
recognize and respond to potential threats, the proposed system reduces the
likelihood of successful attacks.
Proposed System Advantages:
One of the main advantages of the proposed system is its emphasis on advanced
anomaly detection techniques tailored specifically for IoT environments. By
utilizing machine learning algorithms, the system can establish baseline behaviors
for IoT devices and networks, enabling it to detect anomalies indicative of
potential attacks in real-time. This proactive approach helps to identify and
mitigate threats before they escalate, reducing the risk of data breaches and system
compromise.
Furthermore, the proposed system integrates edge computing capabilities to
enhance real-time detection and response. By deploying detection mechanisms
directly on IoT devices or at the network edge, the system can analyze data closer
to the source, reducing latency and improving responsiveness. Edge-based
detection also enables the system to operate efficiently in resource-constrained
environments, such as those commonly found in IoT deployments, by offloading
processing tasks from central servers.
In addition, the adoption of standardized security protocols and best practices
across IoT devices and platforms is a key advantage of the proposed system. By
establishing common security standards, the system ensures that IoT devices are
built with security in mind, incorporating features such as encryption,
authentication, and secure boot mechanisms. This standardized approach reduces
the attack surface of IoT ecosystems, making it more difficult for attackers to
exploit vulnerabilities and compromise systems.
Moreover, the integration of threat intelligence feeds and databases enhances the
effectiveness of the proposed system. By continuously monitoring for new threats
and vulnerabilities, the system can stay ahead of emerging attack vectors and
proactively update its detection mechanisms. Threat intelligence feeds provide
valuable insights into known malicious actors, tactics, and indicators of
compromise, enabling the system to detect and respond to attacks more accurately
and efficiently.
Another advantage of the proposed system is its emphasis on collaborative defense
mechanisms. By fostering collaboration among different organizations and sectors,
the system enhances collective defense capabilities and strengthens overall
resilience against cyber threats. Sharing threat information and coordinating
responses allows organizations to respond more effectively to attacks and adapt
their defenses in real-time, reducing the impact of cyber incidents.
Furthermore, the proposed system prioritizes user awareness and education as a
key defense strategy. By educating users about common attack vectors and
promoting good cybersecurity practices, such as regular software updates and
strong password management, the system empowers users to recognize and
respond to potential threats. This proactive approach reduces the likelihood of
successful attacks and enhances the overall security posture of IoT environments.
SYSTEM REQUIREMENTS
SYSTEM REQUIREMENTS:
HARDWARE REQUIREMENTS:
• System : Pentium IV 2.4 GHz.
• Hard Disk : 40 GB.
• Ram : 512 Mb.
SOFTWARE REQUIREMENTS:
• Operating system : - Windows.
• Coding Language : python.
SYSTEM DESIGN
System Architecture:
UML Diagrams:
CLASS DIAGRAM:
The class diagram is used to refine the use case diagram and define a detailed
design of the system. The class diagram classifies the actors defined in the use case
diagram into a set of interrelated classes. The relationship or association between
the classes can be either an "is-a" or "has-a" relationship. Each class in the class
diagram may be capable of providing certain functionalities. These functionalities
provided by the class are termed "methods" of the class. Apart from this, each class
may have certain "attributes" that uniquely.
User
Upload UNSW-NB15 Dataset()
Pre-process Dataset()
Dataset Train & Test Split()
Train Deep Learning GAN Algorithm()
Comparison Graph()
Attack Prediction from Test Data()
Use case Diagram:
A use case diagram in the Unified Modeling Language (UML) is a type of
behavioral diagram defined by and created from a Use-case analysis. Its purpose is
to present a graphical overview of the functionality provided by a system in terms
of actors, their goals (represented as use cases), and any dependencies between
those use cases. The main purpose of a use case diagram is to show what system
functions are performed for which actor. Roles of the actors in the system can be
depicted.
Upload UNSW-NB15 Dataset
Pre-process Dataset
Dataset Train & Test Split
User.
Train Deep Learning GAN
Algorithm
Comparison Graph
Attack Prediction from Test Data
Sequence Diagram:
A sequence diagram represents the interaction between different objects in the
system. The important aspect of a sequence diagram is that it is time-ordered. This
means that the exact sequence of the interactions between the objects is
represented step by step. Different objects in the sequence diagram interact with
each other by passing "messages".
User Database
Upload UNSW-NB15 Dataset
Pre-process Dataset
Dataset Train & Test Split
Train Deep Learning GAN Algorithm
Comparison Graph
Attack Prediction from Test Data
Collaborative Diagram:
A collaboration diagram groups together the interactions between different
objects. The interactions are listed as numbered interactions that help to trace the
sequence of the interactions. The collaboration diagram helps to identify all the
possible interactions that each object has with other objects.
1: Upload UNSW-NB15 Dataset
2: Pre-process Dataset
3: Dataset Train & Test Split
4: Train Deep Learning GAN Algorithm
5: Comparison Graph
6: Attack Prediction from Test Data
User Databas
e
SYSTEM IMPLEMENTATION
System Implementations:
1. Data Preprocessing: Prepare the textual data by removing noise, such as
special characters, punctuation, and stopwords. Tokenize the text into
sentences or paragraphs to facilitate sentiment analysis and
summarization.
2. Sentiment Analysis Model: Implement or utilize pre-trained sentiment
analysis models capable of accurately detecting the sentiment polarity
(positive, negative, neutral) of each sentence or paragraph in the text.
Consider employing advanced techniques such as deep learning-based
models or transformer architectures for improved accuracy.
3. Summarization Model: Implement a text summarization model capable
of generating concise summaries while incorporating sentiment
information. Explore both extractive and abstractive summarization
techniques, considering factors such as coherence, informativeness, and
sentiment preservation.
4. Integration: Integrate the sentiment analysis module with the
summarization module to leverage sentiment information during the
summarization process. Design mechanisms to prioritize or adjust the
inclusion of sentences based on their sentiment polarity to ensure that the
generated summaries reflect the emotional context of the original text.
5. Evaluation: Evaluate the performance of the implemented system using
standard metrics such as ROUGE (Recall-Oriented Understudy for
Gisting Evaluation) for summarization quality and sentiment
classification accuracy metrics for sentiment analysis. Conduct thorough
evaluations using benchmark datasets to assess the effectiveness and
robustness of the system.
6. Optimization: Optimize the system for efficiency and scalability by
leveraging techniques such as parallel processing, caching, and model
compression. Consider deploying the system on distributed computing
frameworks or utilizing hardware accelerators (e.g., GPUs) to improve
processing speed and resource utilization.
7. User Interface: Develop a user-friendly interface for interacting with the
system, allowing users to input text and view the generated summaries
along with sentiment analysis results. Design the interface to be intuitive,
responsive, and accessible across different devices and platforms.
8. Deployment: Deploy the implemented system in production
environments, considering factors such as scalability, reliability, and
security. Ensure proper monitoring and maintenance procedures are in
place to address potential issues and ensure continuous performance
optimization.
9. Feedback Loop: Establish a feedback loop to gather user feedback and
monitor system performance over time. Use feedback to iteratively
improve the system's accuracy, usability, and effectiveness based on user
requirements and evolving needs.
Modules Used in Project :-
Tensorflow
TensorFlow is a free and open-source software library for dataflow and
differentiable programming across a range of tasks. It is a symbolic math
library, and is also used for machine learning applications such as neural
networks. It is used for both research and production at Google.
TensorFlow was developed by the Google Brain team for internal Google
use. It was released under the Apache 2.0 open-source license on November
9, 2015.
Numpy
Numpy is a general-purpose array-processing package. It provides a high-
performance multidimensional array object, and tools for working with
these arrays.
It is the fundamental package for scientific computing with Python. It
contains various features including these important ones:
A powerful N-dimensional array object
Sophisticated (broadcasting) functions
Tools for integrating C/C++ and Fortran code
Useful linear algebra, Fourier transform, and random number capabilities
Besides its obvious scientific uses, Numpy can also be used as an efficient
multi-dimensional container of generic data. Arbitrary data-types can be
defined using Numpy which allows Numpy to seamlessly and speedily
integrate with a wide variety of databases
Pandas
Pandas is an open-source Python Library providing high-performance data
manipulation and analysis tool using its powerful data structures. Python
was majorly used for data munging and preparation. It had very little
contribution towards data analysis. Pandas solved this problem. Using
Pandas, we can accomplish five typical steps in the processing and analysis
of data, regardless of the origin of data load, prepare, manipulate, model,
and analyze. Python with Pandas is used in a wide range of fields including
academic and commercial domains including finance, economics, Statistics,
analytics, etc.
Matplotlib
Matplotlib is a Python 2D plotting library which produces publication
quality figures in a variety of hardcopy formats and interactive
environments across platforms. Matplotlib can be used in Python scripts, the
Python and IPython shells, the Jupyter Notebook, web application servers,
and four graphical user interface toolkits. Matplotlib tries to make easy
things easy and hard things possible. You can generate plots, histograms,
power spectra, bar charts, error charts, scatter plots, etc., with just a few
lines of code. For examples, see the sample plots and thumbnail gallery.
For simple plotting the pyplot module provides a MATLAB-like interface,
particularly when combined with IPython. For the power user, you have full
control of line styles, font properties, axes properties, etc, via an object
oriented interface or via a set of functions familiar to MATLAB users.
Scikit – learn
Scikit-learn provides a range of supervised and unsupervised learning
algorithms via a consistent interface in Python. It is licensed under a
permissive simplified BSD license and is distributed under many Linux
distributions, encouraging academic and commercial use. Python
Python is an interpreted high-level programming language for general-
purpose programming. Created by Guido van Rossum and first released in
1991, Python has a design philosophy that emphasizes code readability,
notably using significant whitespace.
Python features a dynamic type system and automatic memory management.
It supports multiple programming paradigms, including object-oriented,
imperative, functional and procedural, and has a large and comprehensive
standard library.
Python is Interpreted − Python is processed at runtime by the interpreter.
You do not need to compile your program before executing it. This is
similar to PERL and PHP.
Python is Interactive − you can actually sit at a Python prompt and interact
with the interpreter directly to write your programs.
Python also acknowledges that speed of development is important. Readable
and terse code is part of this, and so is access to powerful constructs that
avoid tedious repetition of code. Maintainability also ties into this may be an
all but useless metric, but it does say something about how much code you
have to scan, read and/or understand to troubleshoot problems or tweak
behaviors. This speed of development, the ease with which a programmer of
other languages can pick up basic Python skills and the huge standard
library is key to another area where Python excels.
All its tools have been quick to implement, saved a lot of time, and several
of them have later been patched and updated by people with no Python
background - without breaking.
System Environment:
What is Python :-
Below are some facts about Python.
Python is currently the most widely used multi-purpose, high-level
programming language.
Python allows programming in Object-Oriented and Procedural paradigms.
Python programs generally are smaller than other programming
languages like Java.
Programmers have to type relatively less and indentation requirement
of the language, makes them readable all the time.
Python language is being used by almost all tech-giant companies like –
Google, Amazon, Facebook, Instagram, Dropbox, Uber… etc.
The biggest strength of Python is huge collection of standard library
which can be used for the following .
Machine Learning
GUI Applications (like Kivy, Tkinter, PyQt etc. )
Web frameworks like Django (used by YouTube, Instagram, Dropbox)
Image processing (like Opencv, Pillow)
Web scraping (like Scrapy, BeautifulSoup, Selenium)
Test frameworks
Multimedia
Advantages of Python :-
Let’s see how Python dominates over other languages.
1. Extensive Libraries
Python downloads with an extensive library and it contain code for various
purposes like regular expressions, documentation-generation, unit-testing,
web browsers, threading, databases, CGI, email, image manipulation, and
more. So, we don’t have to write the complete code for that manually.
2. Extensible
As we have seen earlier, Python can be extended to other languages. You
can write some of your code in languages like C++ or C. This comes in
handy, especially in projects.
3. Embeddable
Complimentary to extensibility, Python is embeddable as well. You can put
your Python code in your source code of a different language, like C++.
This lets us add scripting capabilities to our code in the other language.
4. Improved Productivity
The language’s simplicity and extensive libraries render programmers more
productive than languages like Java and C++ do. Also, the fact that you
need to write less and get more things done.
5. IOT Opportunities
Since Python forms the basis of new platforms like Raspberry Pi, it finds the
future bright for the Internet Of Things. This is a way to connect the
language with the real world.
6. Simple and Easy
When working with Java, you may have to create a class to print ‘Hello
World’. But in Python, just a print statement will do. It is also quite easy to
learn, understand, and code. This is why when people pick up Python,
they have a hard time adjusting to other more verbose languages like Java.
7. Readable
Because it is not such a verbose language, reading Python is much like
reading English. This is the reason why it is so easy to learn, understand,
and code. It also does not need curly braces to define blocks,
and indentation is mandatory. This further aids the readability of the
code.
8. Object-Oriented
This language supports both the procedural and object-
oriented programming paradigms. While functions help us with code
reusability, classes and objects let us model the real world. A class allows
the encapsulation of data and functions into one.
9. Free and Open-Source
Like we said earlier, Python is freely available. But not only can
you download Python for free, but you can also download its source code,
make changes to it, and even distribute it. It downloads with an extensive
collection of libraries to help you with your tasks.
10. Portable
When you code your project in a language like C++, you may need to make
some changes to it if you want to run it on another platform. But it isn’t the
same with Python. Here, you need to code only once, and you can run it
anywhere. This is called Write Once Run Anywhere (WORA). However,
you need to be careful enough not to include any system-dependent
features.
11. Interpreted
Lastly, we will say that it is an interpreted language. Since statements are
executed one by one, debugging is easier than in compiled languages.
Any doubts till now in the advantages of Python? Mention in the comment
section.
Advantages of Python Over Other Languages
1. Less Coding
Almost all of the tasks done in Python requires less coding when the same
task is done in other languages. Python also has an awesome standard
library support, so you don’t have to search for any third-party libraries to
get your job done. This is the reason that many people suggest learning
Python to beginners.
2. Affordable
Python is free therefore individuals, small companies or big organizations
can leverage the free available resources to build applications. Python is
popular and widely used so it gives you better community support.
The 2019 Github annual survey showed us that Python has overtaken
Java in the most popular programming language category.
3. Python is for Everyone
Python code can run on any machine whether it is Linux, Mac or Windows.
Programmers need to learn different languages for different jobs but with
Python, you can professionally build web apps, perform data analysis
and machine learning, automate things, do web scraping and also build
games and powerful visualizations. It is an all-rounder programming
language.
Disadvantages of Python
So far, we’ve seen why Python is a great choice for your project. But if you
choose it, you should be aware of its consequences as well. Let’s now see
the downsides of choosing Python over another language.
1. Speed Limitations
We have seen that Python code is executed line by line. But since Python is
interpreted, it often results in slow execution. This, however, isn’t a
problem unless speed is a focal point for the project. In other words, unless
high speed is a requirement, the benefits offered by Python are enough to
distract us from its speed limitations.
2. Weak in Mobile Computing and Browsers
While it serves as an excellent server-side language, Python is much rarely
seen on the client-side. Besides that, it is rarely ever used to implement
smartphone-based applications. One such application is called Carbonnelle.
The reason it is not so famous despite the existence of Brython is that it isn’t
that secure.
3. Design Restrictions
As you know, Python is dynamically-typed. This means that you don’t
need to declare the type of variable while writing the code. It uses duck-
typing. But wait, what’s that? Well, it just means that if it looks like a duck,
it must be a duck. While this is easy on the programmers during coding, it
can raise run-time errors.
4. Underdeveloped Database Access Layers
Compared to more widely used technologies like JDBC (Java DataBase
Connectivity) and ODBC (Open DataBase Connectivity), Python’s
database access layers are a bit underdeveloped. Consequently, it is less
often applied in huge enterprises.
5. Simple
No, we’re not kidding. Python’s simplicity can indeed be a problem. Take
my example. I don’t do Java, I’m more of a Python person. To me, its
syntax is so simple that the verbosity of Java code seems unnecessary.
This was all about the Advantages and Disadvantages of Python
Programming Language.
History of Python : -
What do the alphabet and the programming language Python have in
common? Right, both start with ABC. If we are talking about ABC in the
Python context, it's clear that the programming language ABC is meant.
ABC is a general-purpose programming language and programming
environment, which had been developed in the Netherlands, Amsterdam, at
the CWI (Centrum Wiskunde &Informatica). The greatest achievement of
ABC was to influence the design of Python.Python was conceptualized in
the late 1980s. Guido van Rossum worked that time in a project at the CWI,
called Amoeba, a distributed operating system. In an interview with Bill
Venners1, Guido van Rossum said: "In the early 1980s, I worked as an
implementer on a team building a language called ABC at Centrum voor
Wiskunde en Informatica (CWI).
I don't know how well people know ABC's influence on Python. I try to
mention ABC's influence because I'm indebted to everything I learned
during that project and to the people who worked on it."Later on in the same
Interview, Guido van Rossum continued: "I remembered all my experience
and some of my frustration with ABC. I decided to try to design a simple
scripting language that possessed some of ABC's better properties, but
without its problems. So I started typing. I created a simple virtual machine,
a simple parser, and a simple runtime. I made my own version of the various
ABC parts that I liked. I created a basic syntax, used indentation for
statement grouping instead of curly braces or begin-end blocks, and
developed a small number of powerful data types: a hash table (or
dictionary, as we call it), a list, strings, and numbers."
What is Machine Learning : -
Before we take a look at the details of various machine learning methods,
let's start by looking at what machine learning is, and what it isn't. Machine
learning is often categorized as a subfield of artificial intelligence, but I find
that categorization can often be misleading at first brush. The study of
machine learning certainly arose from research in this context, but in the
data science application of machine learning methods, it's more helpful to
think of machine learning as a means of building models of data.
Fundamentally, machine learning involves building mathematical models to
help understand data. "Learning" enters the fray when we give these
models tunable parameters that can be adapted to observed data; in this way
the program can be considered to be "learning" from the data.
Once these models have been fit to previously seen data, they can be used to
predict and understand aspects of newly observed data. I'll leave to the
reader the more philosophical digression regarding the extent to which this
type of mathematical, model-based "learning" is similar to the "learning"
exhibited by the human brain.Understanding the problem setting in machine
learning is essential to using these tools effectively, and so we will start with
some broad categorizations of the types of approaches we'll discuss here.
Categories Of Machine Leaning :-
At the most fundamental level, machine learning can be categorized into
two main types: supervised learning and unsupervised learning.
Supervised learning involves somehow modeling the relationship between
measured features of data and some label associated with the data; once this
model is determined, it can be used to apply labels to new, unknown data.
This is further subdivided into classification tasks and regression tasks: in
classification, the labels are discrete categories, while in regression, the
labels are continuous quantities. We will see examples of both types of
supervised learning in the following section.
Unsupervised learning involves modeling the features of a dataset without
reference to any label, and is often described as "letting the dataset speak for
itself." These models include tasks such as clustering and dimensionality
reduction.
Clustering algorithms identify distinct groups of data, while dimensionality
reduction algorithms search for more succinct representations of the data.
We will see examples of both types of unsupervised learning in the
following section.
Need for Machine Learning
Human beings, at this moment, are the most intelligent and advanced
species on earth because they can think, evaluate and solve complex
problems. On the other side, AI is still in its initial stage and haven’t
surpassed human intelligence in many aspects. Then the question is that
what is the need to make machine learn? The most suitable reason for doing
this is, “to make decisions, based on data, with efficiency and scale”.
Lately, organizations are investing heavily in newer technologies like
Artificial Intelligence, Machine Learning and Deep Learning to get the key
information from data to perform several real-world tasks and solve
problems. We can call it data-driven decisions taken by machines,
particularly to automate the process. These data-driven decisions can be
used, instead of using programing logic, in the problems that cannot be
programmed inherently. The fact is that we can’t do without human
intelligence, but other aspect is that we all need to solve real-world problems
with efficiency at a huge scale. That is why the need for machine learning
arises.
Challenges in Machines Learning :-
While Machine Learning is rapidly evolving, making significant strides with
cybersecurity and autonomous cars, this segment of AI as whole still has a
long way to go. The reason behind is that ML has not been able to overcome
number of challenges. The challenges that ML is facing currently are −
Quality of data − Having good-quality data for ML algorithms is one of the
biggest challenges. Use of low-quality data leads to the problems related to
data preprocessing and feature extraction.
Time-Consuming task − Another challenge faced by ML models is the
consumption of time especially for data acquisition, feature extraction and
retrieval.
Lack of specialist persons − As ML technology is still in its infancy stage,
availability of expert resources is a tough job.
No clear objective for formulating business problems − Having no clear
objective and well-defined goal for business problems is another key
challenge for ML because this technology is not that mature yet.
Issue of overfitting & underfitting − If the model is overfitting or
underfitting, it cannot be represented well for the problem.
Curse of dimensionality − Another challenge ML model faces is too many
features of data points. This can be a real hindrance.
Difficulty in deployment − Complexity of the ML model makes it quite
difficult to be deployed in real life.
Applications of Machines Learning :-
Machine Learning is the most rapidly growing technology and according to
researchers we are in the golden year of AI and ML. It is used to solve many
real-world complex problems which cannot be solved with traditional
approach. Following are some real-world applications of ML −
Emotion analysis
Sentiment analysis
Error detection and prevention
Weather forecasting and prediction
Stock market analysis and forecasting
Speech synthesis
Speech recognition
Customer segmentation
Object recognition
Fraud detection
Fraud prevention
Recommendation of products to customer in online shopping
How to Start Learning Machine Learning?
Arthur Samuel coined the term “Machine Learning” in 1959 and defined it
as a “Field of study that gives computers the capability to learn without
being explicitly programmed”.
And that was the beginning of Machine Learning! In modern times, Machine
Learning is one of the most popular (if not the most!) career choices.
According to Indeed, Machine Learning Engineer Is The Best Job of 2019
with a 344% growth and an average base salary of $146,085 per year.
But there is still a lot of doubt about what exactly is Machine Learning and
how to start learning it? So this article deals with the Basics of Machine
Learning and also the path you can follow to eventually become a full-
fledged Machine Learning Engineer. Now let’s get started!!!
How to start learning ML?
This is a rough roadmap you can follow on your way to becoming an
insanely talented Machine Learning Engineer. Of course, you can always
modify the steps according to your needs to reach your desired end-goal!
Step 1 – Understand the Prerequisites
In case you are a genius, you could start ML directly but normally, there are
some prerequisites that you need to know which include Linear Algebra,
Multivariate Calculus, Statistics, and Python. And if you don’t know these,
never fear! You don’t need a Ph.D. degree in these topics to get started but
you do need a basic understanding.
(a) Learn Linear Algebra and Multivariate Calculus
Both Linear Algebra and Multivariate Calculus are important in Machine
Learning. However, the extent to which you need them depends on your role
as a data scientist. If you are more focused on application heavy machine
learning, then you will not be that heavily focused on maths as there are
many common libraries available. But if you want to focus on R&D in
Machine Learning, then mastery of Linear Algebra and Multivariate Calculus
is very important as you will have to implement many ML algorithms from
scratch.
(b) Learn Statistics
Data plays a huge role in Machine Learning. In fact, around 80% of your
time as an ML expert will be spent collecting and cleaning data. And
statistics is a field that handles the collection, analysis, and presentation of
data. So it is no surprise that you need to learn it!!!
Some of the key concepts in statistics that are important are Statistical
Significance, Probability Distributions, Hypothesis Testing, Regression, etc.
Also, Bayesian Thinking is also a very important part of ML which deals
with various concepts like Conditional Probability, Priors, and Posteriors,
Maximum Likelihood, etc.
(c) Learn Python
Some people prefer to skip Linear Algebra, Multivariate Calculus and
Statistics and learn them as they go along with trial and error. But the one
thing that you absolutely cannot skip is Python! While there are other
languages you can use for Machine Learning like R, Scala, etc. Python is
currently the most popular language for ML. In fact, there are many Python
libraries that are specifically useful for Artificial Intelligence and Machine
Learning such as Keras, TensorFlow, Scikit-learn, etc.
So if you want to learn ML, it’s best if you learn Python! You can do that
using various online resources and courses such as Fork Python available
Free on GeeksforGeeks.
Step 2 – Learn Various ML Concepts
Now that you are done with the prerequisites, you can move on to actually
learning ML (Which is the fun part!!!) It’s best to start with the basics and
then move on to the more complicated stuff. Some of the basic concepts in
ML are:
(a) Terminologies of Machine Learning
Model – A model is a specific representation learned from data by applying
some machine learning algorithm. A model is also called a hypothesis.
Feature – A feature is an individual measurable property of the data. A set
of numeric features can be conveniently described by a feature vector.
Feature vectors are fed as input to the model. For example, in order to predict
a fruit, there may be features like color, smell, taste, etc.
Target (Label) – A target variable or label is the value to be predicted by our
model. For the fruit example discussed in the feature section, the label with
each set of input would be the name of the fruit like apple, orange, banana,
etc.
Training – The idea is to give a set of inputs(features) and it’s expected
outputs(labels), so after training, we will have a model (hypothesis) that will
then map new data to one of the categories trained on.
Prediction – Once our model is ready, it can be fed a set of inputs to which it
will provide a predicted output(label).
(b) Types of Machine Learning
Supervised Learning – This involves learning from a training dataset with
labeled data using classification and regression models. This learning process
continues until the required level of performance is achieved.
Unsupervised Learning – This involves using unlabelled data and then
finding the underlying structure in the data in order to learn more and more
about the data itself using factor and cluster analysis models.
Semi-supervised Learning – This involves using unlabelled data like
Unsupervised Learning with a small amount of labeled data. Using labeled
data vastly increases the learning accuracy and is also more cost-effective
than Supervised Learning.
Reinforcement Learning – This involves learning optimal actions through
trial and error. So the next action is decided by learning behaviors that are
based on the current state and that will maximize the reward in the future.
Advantages of Machine learning :-
1. Easily identifies trends and patterns -
Machine Learning can review large volumes of data and discover specific
trends and patterns that would not be apparent to humans. For instance, for an
e-commerce website like Amazon, it serves to understand the browsing
behaviors and purchase histories of its users to help cater to the right products,
deals, and reminders relevant to them. It uses the results to reveal relevant
advertisements to them.
2. No human intervention needed (automation)
With ML, you don’t need to babysit your project every step of the way. Since
it means giving machines the ability to learn, it lets them make predictions and
also improve the algorithms on their own. A common example of this is anti-
virus softwares; they learn to filter new threats as they are recognized. ML is
also good at recognizing spam.
3. Continuous Improvement
As ML algorithms gain experience, they keep improving in accuracy and
efficiency. This lets them make better decisions. Say you need to make a
weather forecast model. As the amount of data you have keeps growing, your
algorithms learn to make more accurate predictions faster.
4. Handling multi-dimensional and multi-variety data
Machine Learning algorithms are good at handling data that are multi-
dimensional and multi-variety, and they can do this in dynamic or uncertain
environments.
5. Wide Applications
You could be an e-tailer or a healthcare provider and make ML work for you.
Where it does apply, it holds the capability to help deliver a much more
personal experience to customers while also targeting the right customers.
Disadvantages of Machine Learning :-
1. Data Acquisition
Machine Learning requires massive data sets to train on, and these should be
inclusive/unbiased, and of good quality. There can also be times where they
must wait for new data to be generated.
2. Time and Resources
ML needs enough time to let the algorithms learn and develop enough to fulfill
their purpose with a considerable amount of accuracy and relevancy. It also
needs massive resources to function. This can mean additional requirements of
computer power for you.
3. Interpretation of Results
Another major challenge is the ability to accurately interpret results generated
by the algorithms. You must also carefully choose the algorithms for your
purpose.
4. High error-susceptibility
Machine Learning is autonomous but highly susceptible to errors. Suppose you
train an algorithm with data sets small enough to not be inclusive. You end up
with biased predictions coming from a biased training set. This leads to
irrelevant advertisements being displayed to customers. In the case of ML,
such blunders can set off a chain of errors that can go undetected for long
periods of time. And when they do get noticed, it takes quite some time to
recognize the source of the issue, and even longer to correct it.
Python Development Steps : -
Guido Van Rossum published the first version of Python code (version 0.9.0)
at alt.sources in February 1991. This release included already exception
handling, functions, and the core data types of list, dict, str and others. It was
also object oriented and had a module system.
Python version 1.0 was released in January 1994. The major new features
included in this release were the functional programming tools lambda, map,
filter and reduce, which Guido Van Rossum never liked.Six and a half years
later in October 2000, Python 2.0 was introduced. This release included list
comprehensions, a full garbage collector and it was supporting
unicode.Python flourished for another 8 years in the versions 2.x before the
next major release as Python 3.0 (also known as "Python 3000" and "Py3K")
was released. Python 3 is not backwards compatible with Python 2.x.
The emphasis in Python 3 had been on the removal of duplicate
programming constructs and modules, thus fulfilling or coming close to
fulfilling the 13th law of the Zen of Python: "There should be one -- and
preferably only one -- obvious way to do it."Some changes in Python 7.3:
Print is now a function
Views and iterators instead of lists
The rules for ordering comparisons have been simplified. E.g. a
heterogeneous list cannot be sorted, because all the elements of a list must
be comparable to each other.
There is only one integer type left, i.e. int. long is int as well.
The division of two integers returns a float instead of an integer. "//" can be
used to have the "old" behaviour.
Text Vs. Data Instead Of Unicode Vs. 8-bit
Purpose :-
We demonstrated that our approach enables successful segmentation of
intra-retinal layers—even with low-quality images containing speckle noise,
low contrast, and different intensity ranges throughout—with the assistance
of the ANIS feature.
Python
Python is an interpreted high-level programming language for general-
purpose programming. Created by Guido van Rossum and first released in
1991, Python has a design philosophy that emphasizes code readability,
notably using significant whitespace.
Python features a dynamic type system and automatic memory management.
It supports multiple programming paradigms, including object-oriented,
imperative, functional and procedural, and has a large and comprehensive
standard library.
Python is Interpreted − Python is processed at runtime by the interpreter.
You do not need to compile your program before executing it. This is
similar to PERL and PHP.
Python is Interactive − you can actually sit at a Python prompt and interact
with the interpreter directly to write your programs.
Python also acknowledges that speed of development is important. Readable
and terse code is part of this, and so is access to powerful constructs that
avoid tedious repetition of code. Maintainability also ties into this may be an
all but useless metric, but it does say something about how much code you
have to scan, read and/or understand to troubleshoot problems or tweak
behaviors. This speed of development, the ease with which a programmer of
other languages can pick up basic Python skills and the huge standard
library is key to another area where Python excels. All its tools have been
quick to implement, saved a lot of time, and several of them have later been
patched and updated by people with no Python background - without
breaking.
Install Python Step-by-Step in Windows and Mac :
Python a versatile programming language doesn’t come pre-installed on your
computer devices. Python was first released in the year 1991 and until today it
is a very popular high-level programming language. Its style philosophy
emphasizes code readability with its notable use of great whitespace.
The object-oriented approach and language construct provided by Python
enables programmers to write both clear and logical code for projects. This
software does not come pre-packaged with Windows.
How to Install Python on Windows and Mac :
There have been several updates in the Python version over the years. The
question is how to install Python? It might be confusing for the beginner who
is willing to start learning Python but this tutorial will solve your query. The
latest or the newest version of Python is version 3.7.4 or in other words, it is
Python 3.
Note: The python version 3.7.4 cannot be used on Windows XP or earlier
devices.
Before you start with the installation process of Python. First, you need to
know about your System Requirements. Based on your system type i.e.
operating system and based processor, you must download the python version.
My system type is a Windows 64-bit operating system. So the steps below
are to install python version 3.7.4 on Windows 7 device or to install Python
3. Download the Python Cheatsheet here.The steps on how to install Python
on Windows 10, 8 and 7 are divided into 4 parts to help understand better.
Download the Correct version into the system
Step 1: Go to the official site to download and install python using Google
Chrome or any other web browser. OR Click on the following
link: https://www.python.org
Now, check for the latest and the correct version for your operating system.
Step 2: Click on the Download Tab.
Step 3: You can either select the Download Python for windows 3.7.4 button
in Yellow Color or you can scroll further down and click on download with
respective to their version. Here, we are downloading the most recent python
version for windows 3.7.4
Step 4: Scroll down the page until you find the Files option.
Step 5: Here you see a different version of python along with the operating
system.
• To download Windows 32-bit python, you can select any one from the three
options: Windows x86 embeddable zip file, Windows x86 executable installer
or Windows x86 web-based installer.
•To download Windows 64-bit python, you can select any one from the three
options: Windows x86-64 embeddable zip file, Windows x86-64 executable
installer or Windows x86-64 web-based installer.
Here we will install Windows x86-64 web-based installer. Here your first part
regarding which version of python is to be downloaded is completed. Now we
move ahead with the second part in installing python i.e. Installation
Note: To know the changes or updates that are made in the version you can
click on the Release Note Option.
Installation of Python
Step 1: Go to Download and Open the downloaded python version to carry
out the installation process.
Step 2: Before you click on Install Now, Make sure to put a tick on Add
Python 3.7 to PATH.
Step 3: Click on Install NOW After the installation is successful. Click on
Close.
With these above three steps on python installation, you have successfully and
correctly installed Python. Now is the time to verify the installation.
Note: The installation process might take a couple of minutes.
Verify the Python Installation
Step 1: Click on Start
Step 2: In the Windows Run Command, type “cmd”.
Step 3: Open the Command prompt option.
Step 4: Let us test whether the python is correctly installed. Type python –
V and press Enter.
Step 5: You will get the answer as 3.7.4
Note: If you have any of the earlier versions of Python already installed. You
must first uninstall the earlier version and then install the new one.
Check how the Python IDLE works
Step 1: Click on Start
Step 2: In the Windows Run command, type “python idle”.
Step 3: Click on IDLE (Python 3.7 64-bit) and launch the program
Step 4: To go ahead with working in IDLE you must first save the file. Click
on File > Click on Save
Step 5: Name the file and save as type should be Python files. Click on
SAVE. Here I have named the files as Hey World.
Step 6: Now for e.g. enter print
SYSTEM TESTING
SYSTEM TEST
The purpose of testing is to discover errors. Testing is the process of trying to
discover every conceivable fault or weakness in a work product. It provides a
way to check the functionality of components, sub assemblies, assemblies
and/or a finished product It is the process of exercising software with the intent
of ensuring that the Software system meets its requirements and user
expectations and does not fail in an unacceptable manner. There are various
types of test. Each test type addresses a specific testing requirement.
TYPES OF TESTS
Unit testing
Unit testing involves the design of test cases that validate that the
internal program logic is functioning properly, and that program inputs produce
valid outputs. All decision branches and internal code flow should be validated.
It is the testing of individual software units of the application .it is done after
the completion of an individual unit before integration. This is a structural
testing, that relies on knowledge of its construction and is invasive. Unit tests
perform basic tests at component level and test a specific business process,
application, and/or system configuration. Unit tests ensure that each unique path
of a business process performs accurately to the documented specifications and
contains clearly defined inputs and expected results.
Integration testing
Integration tests are designed to test integrated software
components to determine if they actually run as one program. Testing is event
driven and is more concerned with the basic outcome of screens or fields.
Integration tests demonstrate that although the components were individually
satisfaction, as shown by successfully unit testing, the combination of
components is correct and consistent. Integration testing is specifically aimed at
exposing the problems that arise from the combination of components.
Functional test
Functional tests provide systematic demonstrations that functions
tested are available as specified by the business and technical requirements,
system documentation, and user manuals.
Functional testing is centered on the following items:
Valid Input : identified classes of valid input must be accepted.
Invalid Input : identified classes of invalid input must be rejected.
Functions : identified functions must be exercised.
Output : identified classes of application outputs must be
exercised.
Systems/Procedures : interfacing systems or procedures must be invoked.
Organization and preparation of functional tests is focused on
requirements, key functions, or special test cases. In addition, systematic
coverage pertaining to identify Business process flows; data fields, predefined
processes, and successive processes must be considered for testing. Before
functional testing is complete, additional tests are identified and the effective
value of current tests is determined.
System Test
System testing ensures that the entire integrated software system
meets requirements. It tests a configuration to ensure known and predictable
results. An example of system testing is the configuration oriented system
integration test. System testing is based on process descriptions and flows,
emphasizing pre-driven process links and integration points.
White Box Testing
White Box Testing is a testing in which in which the software
tester has knowledge of the inner workings, structure and language of the
software, or at least its purpose. It is purpose. It is used to test areas that cannot
be reached from a black box level.
Black Box Testing
Black Box Testing is testing the software without any
knowledge of the inner workings, structure or language of the module being
tested. Black box tests, as most other kinds of tests, must be written from a
definitive source document, such as specification or requirements document,
such as specification or requirements document. It is a testing in which the
software under test is treated, as a black box .you cannot “see” into it. The test
provides inputs and responds to outputs without considering how the software
works.
Unit Testing
Unit testing is usually conducted as part of a combined code and
unit test phase of the software lifecycle, although it is not uncommon for coding
and unit testing to be conducted as two distinct phases.
Test strategy and approach
Field testing will be performed manually and functional tests will
be written in detail.
Test objectives
All field entries must work properly.
Pages must be activated from the identified link.
The entry screen, messages and responses must not be delayed.
Features to be tested
Verify that the entries are of the correct format
No duplicate entries should be allowed
All links should take the user to the correct page.
Integration Testing
Software integration testing is the incremental integration testing
of two or more integrated software components on a single platform to produce
failures caused by interface defects.
The task of the integration test is to check that components or software
applications, e.g. components in a software system or – one step up – software
applications at the company level – interact without error.
Test Results: All the test cases mentioned above passed successfully. No
defects encountered.
Acceptance Testing
User Acceptance Testing is a critical phase of any project and requires
significant participation by the end user. It also ensures that the system meets
the functional requirements.
Test Results: All the test cases mentioned above passed successfully. No
defects encountered.
Test cases1:
Test case for Login form:
FUNCTION: LOGIN
EXPECTED RESULTS: Should Validate the user and check his
existence in database
ACTUAL RESULTS: Validate the user and checking the user
against the database
LOW PRIORITY No
HIGH PRIORITY Yes
Test case2:
Test case for User Registration form:
FUNCTION: USER REGISTRATION
EXPECTED RESULTS: Should check if all the fields are filled
by the user and saving the user to
database.
ACTUAL RESULTS: Checking whether all the fields are field
by user or not through validations and
saving user.
LOW PRIORITY No
HIGH PRIORITY Yes
Test case3:
Test case for Change Password:
When the old password does not match with the new password ,then this
results in displaying an error message as “ OLD PASSWORD DOES NOT
MATCH WITH THE NEW PASSWORD”.
FUNCTION: Change Password
EXPECTED RESULTS: Should check if old password and new
password fields are filled by the user
and saving the user to database.
ACTUAL RESULTS: Checking whether all the fields are field
by user or not through validations and
saving user.
LOW PRIORITY No
HIGH PRIORITY Yes
SCREEN SHOTS
SCREEN SHOTS:
To run project double click on run.bat file to get below screen
In above screen click on ‘Upload UNSW-NB15 Dataset’ button to upload dataset
and then will get below output
In above screen selecting and uploading ‘UNSW’ dataset file and then click on
‘Open’ button to load dataset and then will get below output
In above screen dataset loaded and in text area can see dataset contains both
numeric and non-numeric values so by employing label encoder class will convert
non-numeric data to numeric data as Algorithm will take only numeric values. In
above graph x-axis represents attack names and y-axis represents count of those
attacks found in dataset. Now close above graph and then click on ‘Pre-process
Dataset’ button to clean dataset and then will get below output
In above screen can see all dataset values converted to numeric format and in last
lines can see dataset size and its features or column numbers and now click on
‘Dataset Train & Test Split’ button to split dataset into train and test and then will
get below output
In above screen can see train and test and now click on ‘Train Deep Learning GAN
Algorithm’ button to train model and get below output
In above screen GAN model got 98% accuracy and can see other metrics like
precision, recall etc. In Confusion matrix graph x-axis represents Predicted Labels
and y-axis represents True Labels and all different colour boxes in diagnol
represents correct prediction count and remaining all blue boxes represents
incorrect prediction count which are very few. Now close above graph and then
click on ‘Comparison Graph’ button to get below graph
In above graph all different colour bars represents different metrics and can see all
metrics are closer to 100%. Now close above graph and then click on ‘Attack
Prediction from Test Data’ button to upload test data and get below output
In above screen selecting and uploading ‘testdata.csv’ file and then click on ‘Open’
button to get below output
In above screen in square bracket can see Test data values of different signatures
and after arrow = symbol can see predicted attack names
In above screen can see Brute Force and DOS attack
In above screen can see ‘Normal’ packet predicted.
Similarly by changing test data values you can perform prediction on normal or
attack packets.
CONCLUSION
Conclusion:
In conclusion, the endeavor to detect real-time malicious intrusions and attacks in
IoT-empowered cybersecurity and infrastructures is critical in safeguarding the
integrity, confidentiality, and availability of data and services in an increasingly
interconnected world. Both existing and proposed systems showcase a
commitment to innovation and adaptation to confront the evolving landscape of
cyber threats.
Existing systems have laid a foundation for real-time detection, incorporating
technologies like intrusion detection systems (IDS), machine learning algorithms,
and anomaly detection techniques. However, they also face challenges stemming
from the complexity of IoT environments, resource constraints, and the lack of
standardized security protocols across devices.
Proposed systems build upon these foundations, offering advancements in anomaly
detection tailored for IoT, integration of edge computing for faster response times,
and the promotion of standardized security practices. Additionally, proposed
systems prioritize collaboration among stakeholders, leveraging threat intelligence
sharing and user education to bolster collective defense capabilities.
As IoT continues to proliferate across industries and sectors, the importance of
robust cybersecurity measures cannot be overstated. Real-time detection of
malicious intrusions and attacks is not merely a technological challenge but a
strategic imperative to mitigate risks and safeguard critical infrastructure, sensitive
data, and personal privacy.
Ultimately, the pursuit of effective real-time detection systems necessitates
ongoing collaboration, innovation, and adaptation to stay ahead of emerging
threats. By embracing a multi-layered approach that combines technological
advancements with proactive measures such as threat intelligence sharing and user
education, organizations can enhance their resilience against cyber threats and
ensure the continued trust and reliability of IoT-enabled systems and services.
FUTURE SCOPE
Future Scope:
One area of future work involves the development of even more advanced anomaly
detection techniques tailored specifically for IoT environments. As IoT ecosystems
continue to grow and diversify, there is a need for more sophisticated algorithms
capable of accurately distinguishing between normal and malicious behavior.
Research in this area could focus on refining machine learning models, exploring
novel anomaly detection algorithms, and leveraging big data analytics to enhance
detection accuracy.
Furthermore, future work should continue to explore the potential of edge
computing in enhancing real-time detection and response capabilities. Edge-based
detection offers significant advantages in terms of reduced latency and improved
efficiency, particularly in resource-constrained IoT environments. Research in this
area could focus on optimizing edge computing architectures, developing
lightweight detection algorithms suitable for edge deployment, and exploring new
approaches for data fusion and aggregation at the network edge.
Additionally, there is a need for continued research into standardized security
protocols and best practices for IoT devices and platforms. While progress has
been made in establishing common security standards, there is still room for
improvement, particularly in areas such as device authentication, secure
communication protocols, and over-the-air (OTA) updates. Future work could
focus on developing new security standards tailored specifically for IoT devices, as
well as promoting the adoption of existing standards through industry collaboration
and regulatory incentives.
Moreover, the integration of emerging technologies such as blockchain holds
promise for enhancing the security and integrity of IoT ecosystems. Blockchain
technology offers decentralized, tamper-resistant data storage and verification,
making it well-suited for securing IoT devices and transactions. Future work could
explore the use of blockchain-based solutions for device identity management,
secure data sharing, and ensuring the integrity of IoT data in real-time.
Furthermore, future research should focus on advancing collaborative defense
mechanisms and threat intelligence sharing platforms. By fostering greater
collaboration among stakeholders, including government agencies, industry
partners, and cybersecurity researchers, organizations can strengthen collective
defense capabilities and respond more effectively to cyber threats. Future work
could explore new approaches for automating threat intelligence sharing,
developing standardized formats for threat data exchange, and incentivizing
participation in collaborative defense initiatives.
Lastly, future work should continue to prioritize user awareness and education as a
fundamental aspect of cybersecurity in IoT environments. Investing in training
programs, awareness campaigns, and user-friendly security interfaces can
empower individuals to recognize and respond to potential threats effectively.
Research in this area could focus on developing innovative educational tools and
techniques, as well as evaluating the effectiveness of different approaches for
improving cybersecurity awareness and behavior among end-users.
In conclusion, future work in real-time detection of malicious intrusions and
attacks in IoT-empowered cybersecurity and infrastructures should focus on
advancing anomaly detection techniques, leveraging edge computing, enhancing
security protocols, exploring blockchain technology, fostering collaborative
defense mechanisms, and promoting user awareness and education.
REFFERENCES
References:
1. Alaba, F. A., Othman, M., & Hashem, I. A. T. (2017). Internet of Things
security: A survey. Journal of Network and Computer Applications, 88, 10-
28.
2. Khandelwal, M., & Kaur, P. (2019). Intrusion detection systems in internet
of things: A review. Computers & Electrical Engineering, 77, 81-97.
3. Zhang, Z., Cui, J., Qi, J., & Xiong, N. (2018). A survey on Internet of
Things: Architecture, enabling technologies, security and privacy, and
applications. IEEE Internet of Things Journal, 4(5), 1125-1142.
4. Dhanalakshmi, R., & Ganapathy, S. (2017). Internet of Things (IoT): A
literature review. Journal of Computer Science, 13(5), 246-255.
5. Verma, S., & Arora, A. (2019). A comprehensive survey of security
mechanisms and detection techniques in the internet of things. IEEE Access,
7, 100192-100226.
6. Atlam, H. F., & Walters, R. J. (2018). Intrusion detection system in the
internet of things: A review. International Journal of Computer Applications,
181(12), 38-45.
7. Shalini, R., & Padmapriya, N. (2019). A survey on intrusion detection
systems in Internet of Things. International Journal of Innovative
Technology and Exploring Engineering, 8(7S), 308-313.
8. Hussein, R., & Aung, Z. (2018). A survey of intrusion detection systems in
wireless sensor networks. Sensors, 18(11), 3820.
9. Kumar, P., & Lee, S. (2018). A survey of intrusion detection systems in IoT
based on clustering techniques. Sensors, 18(9), 2793.
10.Sharma, S., & Madaan, J. (2020). A comprehensive survey on intrusion
detection systems in Internet of Things. Computers & Electrical
Engineering, 85, 106628.
11.Botta, A., de Donato, W., Persico, V., & Pescapé, A. (2016). Integration of
cloud computing and internet of things: A survey. Future Generation
Computer Systems, 56, 684-700.
12.Mohanta, B. K., & Gope, P. (2018). A comprehensive survey on intrusion
detection system in Internet of Things. Procedia Computer Science, 132,
984-991.
13.Gupta, B. B., Singh, A. K., & Singh, A. (2019). An extensive review on
intrusion detection systems in the internet of things. Journal of King Saud
University-Computer and Information Sciences, 31(4), 398-428.
14.Hasan, M. A., Hossain, M. S., Mohamed, A., & Al-Fuqaha, A. (2018). A
survey on clustering algorithms for big data: Taxonomy and empirical
analysis. IEEE Transactions on Emerging Topics in Computing, 6(1), 110-
128.
15.Jia, X., & Gong, M. (2018). Research on intrusion detection system based on
machine learning in the Internet of Things. Procedia Computer Science, 131,
240-247.
16.Singh, D., & Jain, A. K. (2018). IoT based intrusion detection system for
smart home networks using deep learning approach. Procedia Computer
Science, 132, 204-213.
17.Yassein, M. B., El-Khatib, K., & Faisal, N. (2019). Intrusion detection
system based on artificial neural networks for IoT. IET Wireless Sensor
Systems, 9(1), 28-34.
18.Yang, K., Yang, S., & Huang, Q. (2017). An IoT intrusion detection system
based on ensemble classifier. Procedia Computer Science, 122, 21-28.
19.Nambi, A. S., & Duraiswamy, K. (2019). An ensemble-based intrusion
detection system for IoT networks using convolutional neural networks and
decision tree algorithm. Neural Computing and Applications, 31(12), 8695-
8708.
20.Agrawal, T., & Tyagi, S. (2018). Intrusion detection system for IoT based on
deep learning approach. International Journal of Grid and Distributed
Computing, 11(4), 23-34.
21.Pal, A., & Mehta, S. (2018). An enhanced intrusion detection system for IoT
network using deep learning approach. Procedia Computer Science, 132,
208-213.
22.Zhao, X., & Wang, H. (2018). A survey on deep learning-based methods for
network intrusion detection. IEEE Access, 6, 35543-35556.
23.Liu, C., Zhu, H., & Lin, Y. (2018). A survey of deep neural network
architectures and their applications. Neurocomputing, 234, 11-26.
24.Sgandurra, D., Spognardi, A., & Atzeni, A. (2018). Security and privacy in
Internet of Things: Methods, architectures, and solutions. IEEE Internet of
Things Journal, 5(1), 1-5.
25.Alqahtani, H., Abuzneid, A., Bera, P., & Mahmood, A. N. (2019). A
comprehensive review on secure data aggregation techniques in the Internet
of Things. Journal of Network and Computer Applications, 145, 102413.
26.Anand, S., & Swarup, V. (2018). A survey of Internet of Things (IoT)
authentication schemes. Journal of King Saud University-Computer and
Information Sciences, 32(4), 417-428.
27.Carullo, G., & De Rango, F. (2018). Anomaly-based intrusion detection in
Internet of Things through an intelligent approach. IEEE Internet of Things
Journal, 5(3), 1764-1773.
28.Bhattacharya, S., Banerjee, A., & Roy, S. (2019). A hybrid anomaly
detection system for IoT networks. Future Generation Computer Systems,
95, 237-249.
29.Wang, Y., Atkinson, G., & Furnell, S. (2018). Dynamic adaptive rule-based
intrusion detection for IoT applications. Journal of Information Security and
Applications, 42, 82-91.
30.Mahalle, P. N., & Mukhopadhyay, S. (2019). Cyber-physical intrusion
detection in Internet-of-Things: A review. Journal of Network and Computer
Applications, 144, 34-56.
31.Liao, Y., & Vasilakos, A. V. (2017). A survey of mobile malware in the
wild. IEEE Communications Surveys & Tutorials, 19(2), 1476-1497.
32.Botta, A., De Donato, W., Persico, V., & Pescapé, A. (2016). Integration of
cloud computing and Internet of Things: A survey. Future Generation
Computer Systems, 56, 684-700.
33.Ahmed, E., Yaqoob, I., Hashem, I. A. T., Khan, I., Ahmed, A. I. A., Imran,
M., ... & Guizani, N. (2016). The role of big data analytics in Internet of
Things. Computer Networks, 129, 459-471.
34.Mahmood, A. N., Abuzneid, A., Alqahtani, H., & Alrajeh, N. (2019). The
role of big data analytics in Internet of Things. Computers & Electrical
Engineering, 76, 358-371.
35.Lai, C. F., Li, C. T., & Hsieh, H. Y. (2018). A comprehensive review of
security of Internet-of-Things. IEEE Access, 6, 67100-67114.
36.Bay, J., Vázquez, E. A., & Cerrudo, C. (2019). Breaking Smart Locks: A
Case Study of S2 Protocol Vulnerability. Black Hat USA, 2019.
37.Han, J., Li, J., Chen, Y., & Shu, L. (2019). Security and privacy in smart city
applications: Challenges and solutions. IEEE Access, 7, 105691-105704.
38.Alaba, F. A., Othman, M., & Hashem, I. A. T. (2017). Internet of Things
security: A survey. Journal of Network and Computer Applications, 88, 10-
28.
39.Alabdulatif, A., & Alzahrani, A. (2018). A survey on the internet of things
security: Requirements, challenges, and solutions. International Journal of
Computer Applications, 181(35), 19-26.
40.Kalloniatis, C., Kavakli, E., & Gritzalis, S. (2016). Securely integrating
heterogeneous IoT infrastructures for enhanced security in smart cities.
Future Generation Computer Systems, 56, 720-733.
41.Di Pietro, R., & Mancini, L. V. (2019). Security and privacy issues in IoT-
based wearable computing systems. IEEE Systems Journal, 14(1), 48-59.
42.Wang, G., Chen, Y., Qin, Y., & Zhao, X. (2018). Security and privacy in the
Internet of Things: A survey. IEEE Internet of Things Journal, 5(5), 3610-
3628.
43.Mohajer, N., & Zhou, J. (2019). Secure and privacy-preserving
communication in IoT scenarios: A survey. IEEE Internet of Things Journal,
6(1), 551-576.
44.Abomhara, M., & Koien, G. M. (2015). Cyber security and the internet of
things: Vulnerabilities, threats, intruders and attacks. Journal of Cyber
Security and Mobility, 3(1), 65-88.
45.Vlajic, N., & Stevanovic, M. (2019). A survey of intrusion detection systems
for IoT-based healthcare applications. IEEE Access, 7, 30610-30627.
46.Cárdenas, A. A., Amin, S., & Sastry, S. (2011). Research challenges for the
security of control systems. ACM SIGBED Review, 8(4), 55-61.
47.Sang, Z., Zhang, H., & Li, P. (2019). Recent advances in intrusion detection
systems for Internet of Things in healthcare: A review. IEEE Access, 7,
14008-14018.
48.Vasilomanolakis, E., Daubert, J., & Mühlhäuser, M. (2015). Intrusion
detection in the Internet of Things. IEEE Security & Privacy, 14(5), 36-45.
49.Duan, Y., Wang, H., & He, D. (2016). Security and privacy in fog
computing: Challenges. IEEE Access, 4, 1059-1068.
50.Fernández-Caramés, T. M., & Fraga-Lamas, P. (2018). A review on the use
of blockchain for the internet of things. IEEE Access, 6, 32979-33001.