
Credit Card Fraud Detection Using Fuzzy Logic and Neural Network

ABSTRACT:

In this paper, a two-stage neuro-fuzzy expert system is proposed for credit card fraud
detection. An incoming transaction is initially processed by a pattern-matching system in the first
stage. This component comprises a fuzzy clustering module and an address-matching module,
each of which assigns a score to the transaction based on its extent of deviation. A fuzzy
inference system computes a suspicion score by combining these scores and accordingly
classifies the transaction as genuine, suspicious, or fraudulent. Once a transaction is flagged as
suspicious, a neural network trained on historical transactions is employed in the second stage to
verify whether it was an actual fraudulent action or an occasional deviation by the legitimate
user. The effectiveness of the proposed system has been verified through experiments and
comparative analysis with other systems.
INTRODUCTION:

The use of credit cards as a mode of payment for online as well as everyday purchases has increased
over the last few decades. As a consequence, credit card fraud has also proliferated.
Credit card fraud is practiced with the intention of acquiring goods without paying for them. Such
illegitimate actions are carried out in two possible ways: by using stolen credit
cards physically (physical fraud) and by exploiting the card details without the knowledge of the
genuine cardholder (virtual fraud) via online transactions. The losses due to credit card fraud
have been very high over the past few years.
LITERATURE SURVEY:

TITLE:
Credit card fraud detection with a neural-network
AUTHORS:
Ghosh, S., and Reilly, D. L.

ABSTRACT:
Fraud in credit card transactions is common today, as most of us use credit card
payment methods frequently. This is due to the advancement of technology and the increase
in online transactions, resulting in frauds that cause huge financial losses. Therefore, there is a need for
effective methods to reduce these losses. In addition, fraudsters find ways to steal the credit card
information of users by sending fake SMS messages and calls, as well as through masquerading attacks,
phishing attacks, and so on. This paper applies multiple machine learning algorithms,
such as support vector machine (SVM), k-nearest neighbor (KNN), and artificial neural network
(ANN), to predict the occurrence of fraud. Further, we compare the supervised machine learning
and deep learning techniques in their ability to differentiate between fraud and non-fraud
transactions.
2.1 EXISTING SYSTEM:
In detecting credit card fraud, various techniques have been used, including neural networks (NN),
genetic algorithms, data mining, game-theoretic approaches, support vector machines, and meta-
learning. The artificial neural network (ANN) was the first technique employed in credit
card fraud detection (CCFD). A feasibility study performed by Ghosh and Reilly for
Mellon Bank to test the potency of the ANN in CCFD achieved a 20–40% reduction in
losses due to fraud [4]. An NN-based data mining technique known as CARDWATCH was
proposed by Aleskerov et al. [5]. An online CCFDS was presented by Dorronsoro et al.
based on a neural classifier, where Fisher's discriminant analysis is employed to distinguish
fraudulent activities from normal ones.

2.2 Limitations of Existing Methods:

The fraud detection problem can be viewed as a data mining problem, where the goal is to
determine whether a transaction is genuine or fraudulent. Chan et al.
divided a huge number of transactions into smaller subsets and then applied distributed
data mining to build prototypes of users' behaviors. Chiu and Tsai also
applied data mining concepts in their work, where a Web service for data exchange was taken into
consideration. The prime concern in such real-life problems is that fraudulent
transactions form a comparatively small fraction of the data, so the task amounts to finding rare
events within a huge set of genuine transactions. This may result in false alarms in many cases,
which therefore require minimization. In case of failure to detect a fraud case, there is a direct loss
to the company; moreover, the follow-up actions taken in addressing false alerts are costly
too. In addition, changes in the behavior of cardholders and fraudsters must be captured
by the CCFDS to reduce misclassification.
PROPOSED METHOD:

3.1 PROPOSED SYSTEM:

In this paper, the author applies fuzzy logic membership functions to correctly
classify credit card transactions as fraudulent, suspicious, or legal. The proposed
approach extracts membership values from the dataset using fuzzy membership functions.
After extracting all values, the class label is assigned as GENUINE if LOW values
dominate, SUSPICIOUS if MEDIUM values dominate, and FRAUD if HIGH values
dominate.

The code below extracts all membership values. Here we use 0 for LOW, 1 for
MEDIUM, and 2 for HIGH, since the LSTM and fuzzy algorithms accept only numeric
values, not character values.
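As a minimal sketch, assuming triangular membership functions over a single transaction attribute (the attribute, ranges, and sample values below are illustrative assumptions rather than the author's exact code), the LOW/MEDIUM/HIGH codes could be derived as follows:

import numpy as np

def triangular(x, a, b, c):
    # Triangular membership function rising from a to its peak at b
    # and falling back to zero at c.
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def membership_codes(values):
    # Map each value to 0 (LOW), 1 (MEDIUM), or 2 (HIGH) by taking
    # the membership function with the highest degree.
    lo, hi = values.min(), values.max()
    mid = (lo + hi) / 2.0
    eps = 1e-9  # avoids division by zero at the range edges
    low = triangular(values, lo - eps, lo, mid)
    medium = triangular(values, lo, mid, hi)
    high = triangular(values, mid, hi, hi + eps)
    return np.argmax(np.vstack([low, medium, high]), axis=0)

amounts = np.array([12.0, 250.0, 999.0, 40.0, 760.0])  # hypothetical amounts
print(membership_codes(amounts))  # -> [0 0 2 0 2]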

3.1.1 Proposed Methodology:

In this work, we propose a neuro-fuzzy expert system for credit card fraud detection
(NFES_CCFD) which integrates evidence obtained from two distinct sources, based on different
transaction attributes, to analyze the deviation of a user's behavior from his normal spending
profile. Furthermore, a learning mechanism based on an NN is used to confirm the suspicious cases.
The proposed FDS is divided into the following four components:
HARDWARE & SOFTWARE REQUIREMENTS:

HARDWARE REQUIREMENTS:

 System : i3 or above
 RAM : 4 GB
 Hard Disk : 40 GB

SOFTWARE REQUIREMENTS:

 Operating system : Windows 8 or above
 Coding Language : Python
SYSTEM DESIGN

4.1 SYSTEM ARCHITECTURE:


4.2 DATA FLOW DIAGRAM

4.3 UML DIAGRAMS:

UML stands for Unified Modeling Language. UML is a standardized,
general-purpose modeling language in the field of object-oriented software
engineering. The standard is managed, and was created, by the Object
Management Group.
The goal is for UML to become a common language for creating models of
object-oriented computer software. In its current form, UML comprises two
major components: a meta-model and a notation. In the future, some form of
method or process may also be added to, or associated with, UML.
The Unified Modeling Language is a standard language for specifying,
visualizing, constructing, and documenting the artifacts of software systems, as
well as for business modeling and other non-software systems.
The UML represents a collection of best engineering practices that have
proven successful in the modeling of large and complex systems.
The UML is a very important part of developing object-oriented software
and the software development process. The UML uses mostly graphical notations
to express the design of software projects.

GOALS:
The primary goals in the design of the UML are as follows:
 Provide users a ready-to-use, expressive visual modeling language so
that they can develop and exchange meaningful models.
 Provide extensibility and specialization mechanisms to extend the
core concepts.
 Be independent of particular programming languages and
development processes.
 Provide a formal basis for understanding the modeling language.
 Encourage the growth of the OO tools market.
 Support higher-level development concepts such as collaborations,
frameworks, patterns, and components.
 Integrate best practices.
4.3.1 USE CASE DIAGRAM:

A use case diagram in the Unified Modeling Language (UML) is a type of
behavioral diagram defined by and created from a use-case analysis. Its purpose is
to present a graphical overview of the functionality provided by a system in terms
of actors, their goals (represented as use cases), and any dependencies between
those use cases. The main purpose of a use case diagram is to show which system
functions are performed for which actor. Roles of the actors in the system can be
depicted.
[Use case diagram: the user actor is connected to the use cases upload credit card fraud dataset, calculate fuzzy membership function, run fuzzy logic algorithm, run LSTM algorithm, LSTM training graph, MSE comparison, and exit.]

4.3.2 CLASS DIAGRAM:

In software engineering, a class diagram in the Unified Modeling Language
(UML) is a type of static structure diagram that describes the structure of a system
by showing the system's classes, their attributes, operations (or methods), and the
relationships among the classes. It explains which class contains information.

[Class diagram: a USER class associated with a DATASET class; the USER class exposes the operations upload credit card fraud dataset(), calculate fuzzy membership function(), run fuzzy logic algorithm(), run LSTM algorithm(), LSTM training graph(), MSE comparison graph(), and exit().]

4.3.3 SEQUENCE DIAGRAM:

A sequence diagram in the Unified Modeling Language (UML) is a kind of interaction
diagram that shows how processes operate with one another and in what order. It is
a construct of a message sequence chart. Sequence diagrams are sometimes called
event diagrams, event scenarios, and timing diagrams.
[Sequence diagram between the user and the dataset: upload credit card fraud dataset, calculate fuzzy membership function, run fuzzy logic algorithm, run LSTM algorithm, LSTM training graph, MSE comparison graph, exit.]

4.3.4 COLLABORATION DIAGRAM:

A collaboration (communication) diagram in the Unified Modeling Language
shows the interactions between objects as a set of numbered messages. The
numbering makes the sequence of calls explicit, while the layout emphasizes the
structural relationships between the participating objects.

[Collaboration diagram between the user and the dataset, with numbered messages: 1: upload credit card fraud dataset; 2: calculate fuzzy membership function; 3: run fuzzy logic algorithm; 4: run LSTM algorithm; 5: LSTM training graph; 6: MSE comparison graph; 7: exit.]
MODULES

To implement this project we have designed the following modules:

1) Upload Credit Card Fraud Dataset: using this module we upload the dataset
to the application.
2) Calculate Fuzzy Membership Functions: using this module we extract
all membership function values from the dataset.
3) Run Fuzzy Logic Algorithm: using this module we train the fuzzy
algorithm on the membership values and then test it in terms of
Mean Square Error (MSE, the prediction error). The lower the MSE, the
better the algorithm.
4) Run LSTM Algorithm: using this module we train the LSTM algorithm
on the same membership values and then calculate its MSE.
5) LSTM Training Graph: using this module we plot the LSTM training and
testing MSE.
6) MSE Comparison Graph: using this module we plot the MSE comparison
graph between the fuzzy algorithm and the LSTM (a sketch of this step follows below).
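As a small sketch of how the comparison in module 6 could be produced (the variable names y_test, fuzzy_pred, and lstm_pred are hypothetical outputs of modules 3 and 4, not names from the project's code):

import matplotlib.pyplot as plt
from sklearn.metrics import mean_squared_error

def plot_mse_comparison(y_test, fuzzy_pred, lstm_pred):
    # Compute the prediction error of each model on the same test labels.
    scores = {
        "Fuzzy Logic": mean_squared_error(y_test, fuzzy_pred),
        "LSTM": mean_squared_error(y_test, lstm_pred),
    }
    plt.bar(list(scores.keys()), list(scores.values()))
    plt.ylabel("Mean Square Error")
    plt.title("MSE Comparison: Fuzzy Logic vs LSTM")
    plt.show()
    return scores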
1.1 CLASSIFICATION

Classification is a technique where we categorize data into a given number of
classes. The main goal of a classification problem is to identify the category/class under
which new data will fall. Image classification is the task of assigning an
input image one label from a fixed set of categories. This is one of the core
problems in computer vision that, despite its simplicity, has a large variety of
practical applications. Many other seemingly distinct computer vision tasks (such
as object detection and segmentation) can be reduced to image classification.

1.2 Classification Algorithms:
LSTM networks extend recurrent neural networks (RNNs) and are mainly designed to deal with
situations in which RNNs do not work. An RNN is an algorithm that
processes the current input by taking into account the output of previous events (feedback) and
then storing it in its memory for a brief amount of time (short-term memory). Among its
many applications, the most well-known ones are in the areas of non-Markovian speech
control and music composition. However, there are some drawbacks to RNNs. The first is the
failure to store information over long periods of time. Sometimes a piece of data stored a
considerable time ago is needed to determine the present output, but RNNs are
incapable of managing such "long-term dependencies." The second issue is that there is
no fine control over which part of the context needs to be carried forward and which part of the
past should be forgotten. Other issues with RNNs are the exploding and vanishing
gradients (explained later) that occur while training an RNN through backpropagation. To address
these problems, the Long Short-Term Memory (LSTM) network was introduced. It was designed so that the
vanishing gradient problem is almost entirely eliminated, while the training model is
left unaffected. LSTMs handle long time lags in certain problems and also cope with
the effects of noise, distributed representations, and continuous values. With LSTMs, there is no
need to keep a finite number of states beforehand, as required by the
hidden Markov model (HMM). LSTMs provide a wide range of parameters such as learning
rates and input and output biases, so no fine adjustments are needed. The effort to
update each weight is reduced to O(1) with LSTMs, similar to Back Propagation
Through Time (BPTT), which is a significant advantage.
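As an illustration of the LSTM networks described above, here is a minimal Keras sketch of a three-class (genuine / suspicious / fraud) sequence classifier; the input shape, layer sizes, and random stand-in data are illustrative assumptions, not the project's exact configuration:

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Hypothetical data: sequences of 10 membership codes (0/1/2) per sample.
n_samples, time_steps, n_features = 1000, 10, 1
X = np.random.randint(0, 3, (n_samples, time_steps, n_features)).astype("float32")
y = np.random.randint(0, 3, n_samples)  # 0 = genuine, 1 = suspicious, 2 = fraud

model = Sequential([
    LSTM(64, input_shape=(time_steps, n_features)),  # gated memory over the sequence
    Dense(32, activation="relu"),
    Dense(3, activation="softmax"),  # one probability per class
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)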

1.3 Machine Learning:

Machine learning is a growing technology which enables computers
to learn automatically from past data. Machine learning uses various
algorithms for building mathematical models and making predictions
using historical data or information. Currently, it is being used for various
tasks such as image recognition, speech recognition, email
filtering, Facebook auto-tagging, recommender systems, and many
more. Machine learning is a subset of artificial intelligence that is
mainly concerned with the development of algorithms which allow a
computer to learn from data and past experience on its own.

Advantages of Machine Learning:
Machine learning is a field of computer science and artificial intelligence that
deals with the task of teaching computers to learn from data without being explicitly
programmed. It is a type of data mining that allows computers to "learn" on their own by
analyzing data sets and using pattern recognition. Machine learning has many benefits,
including improved accuracy, efficiency, and decision-making.

Handling large amounts of data: With the ever-growing volume of data generated
every day, it is increasingly difficult for humans to process and make sense of all this
information. Machine learning can help businesses handle large amounts of data more
efficiently and effectively, and even use decision trees to act on the information.

Reducing bias: Machine learning algorithms need not share the personal biases that can
distort human judgment, although they can still inherit bias present in their training data.
Used carefully, machine learning can help reduce bias in business decisions.

Improving accuracy: Machine learning algorithms can achieve much higher accuracy
than humans when making predictions or classifying labeled data. This improved
accuracy can lead to better business outcomes and increased profits.

Discovering patterns and correlations: Machine learning can help businesses uncover
patterns and correlations in data that they may not have been able to detect otherwise.
These learning systems can lead to better decision-making and a deeper understanding of
the data.

Making predictions about future events: Machine learning algorithms can predict future
events, such as consumer behavior, stock prices, and election outcomes.
1.4 Applications of Machine Learning

Image Recognition

Image recognition is one of the most common applications of machine
learning. It is used to identify objects, persons, places, digital images, etc. A popular
use case of image recognition and face detection is the automatic friend tagging suggestion:
Facebook provides a feature of auto friend tagging suggestions. Whenever we upload a
photo with our Facebook friends, we automatically get a tagging suggestion with
names, and the technology behind this is machine learning's face detection and
recognition algorithm. It is based on the Facebook project named "DeepFace," which is
responsible for face recognition and person identification in pictures.

Speech Recognition:

While using Google, we get an option of "Search by voice"; this comes
under speech recognition, and it is a popular application of machine learning.
Speech recognition is the process of converting voice instructions into text,
and it is also known as "speech to text" or "computer speech recognition."
At present, machine learning algorithms are widely used in various
speech recognition applications. Google Assistant, Siri, Cortana, and
Alexa use speech recognition technology to follow voice
instructions.

Traffic Prediction:
If we want to visit a new place, we take the help of Google Maps, which shows
us the correct path with the shortest route and predicts the traffic conditions. It
predicts traffic conditions, such as whether traffic is clear, slow-moving, or
heavily congested, with the help of two inputs:

1. Real-time location of the vehicle from the Google Maps app and sensors

2. Average time taken on past days at the same time of day

Product Recommendations

Machine learning is widely used by various e-commerce and
entertainment companies such as Amazon, Netflix, etc., for product
recommendations to the user. Whenever we search for a product on
Amazon, we start getting advertisements for the same product
while surfing the internet in the same browser, and this is because of machine
learning.

1.5 DEEP LEARNING

Deep learning technology is based on artificial neural networks (ANNs).
These ANNs constantly receive learning algorithms and continuously growing amounts
of data to increase the efficiency of the training process. The larger the data
volumes are, the more efficient this process is. The training is called
deep because, as time passes, a neural network covers a growing
number of levels. The deeper the network penetrates, the higher its
productivity is. Deep learning algorithms can create new tasks to solve current ones.

Advantages of Deep Learning

Creating New Features. One of the main benefits of deep learning over various
machine learning algorithms is its ability to generate new features from a
limited series of features located in the training data set. Therefore, deep
learning algorithms can create new tasks to solve current ones. What does this
mean for data scientists working in technological startups? Since deep
learning can create features without human intervention, data
scientists can save much time on working with big data by relying on this technology.
It allows them to use more complex sets of features in comparison with traditional
machine learning software.

Advanced Analysis
Due to its improved data processing models, deep learning generates actionable results
when solving data science tasks. While machine learning works only with labelled
data, deep learning supports unsupervised learning techniques that allow the system to become
smarter on its own. The capacity to determine the most important features allows deep
learning to efficiently provide data scientists with concise and reliable analysis results.

 APPLICATIONS OF DEEP LEARNING

Automatic Speech Recognition

Large-scale automatic speech recognition is the first and most convincing successful case
of deep learning. LSTM RNNs can learn "very deep learning" tasks that involve multi-second
intervals containing speech events separated by thousands of discrete time steps, where one time
step corresponds to about 10 milliseconds. LSTM with forget gates is competitive with
traditional speech recognizers on certain tasks. The initial success in speech recognition was
based on small-scale recognition tasks based on TIMIT. The data set contains 630 speakers from
eight major dialects of American English, where each speaker reads 10 sentences. Its small size
lets many configurations be tried. More importantly, the TIMIT task concerns phone-sequence
recognition, which, unlike word-sequence recognition, allows weak phone bigram language
models. This lets the strength of the acoustic modelling aspects of speech recognition be more
easily analysed.
Image Recognition

A common evaluation set for image classification is the MNIST database.
MNIST is composed of handwritten digits and includes 60,000 training examples and
10,000 test examples. As with TIMIT, its small size lets users test multiple configurations.
A comprehensive list of results on this set is available. Deep learning-based image recognition
has become "superhuman", producing more accurate results than human contestants. This first
occurred in 2011.

Military
The United States Department of Defense applied deep learning to train robots in new
tasks through observation.

Bioinformatics

An autoencoder ANN was used in bioinformatics to predict gene ontology annotations
and gene-function relationships. In medical informatics, deep learning was used to predict sleep
quality based on data from wearables and to predict health complications from electronic
health record data.

Self-Driving Cars

Deep learning is the force that is bringing autonomous driving to life. Millions of sets of
data are fed to a system to build a model, to train the machines to learn, and then to test the results
in a safe environment. Data from cameras, sensors, and geo-mapping is helping
create succinct and sophisticated models that navigate through traffic and identify paths, signage,
pedestrian-only routes, and real-time elements like traffic volume and road blockages.

Fraud Detection

Another domain benefitting from deep learning is the banking and financial sector, which
is plagued with the task of fraud detection as money transactions go digital. Autoencoders
in Keras and TensorFlow are being developed to detect credit card frauds,
saving billions of dollars in recovery and insurance costs for financial institutions. Fraud prevention and
detection are done based on identifying patterns in customer transactions and credit scores,
and identifying anomalous behaviour and outliers.
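As a hedged sketch of the autoencoder idea mentioned above (the feature count, layer sizes, threshold, and random stand-in data are illustrative assumptions): the network is trained to reconstruct normal transactions, and records it reconstructs poorly are flagged as potential fraud.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

n_features = 30  # e.g. the number of fields in a transaction record
normal = np.random.rand(5000, n_features).astype("float32")  # stand-in data

autoencoder = Sequential([
    Dense(16, activation="relu", input_shape=(n_features,)),
    Dense(8, activation="relu"),   # compressed representation
    Dense(16, activation="relu"),
    Dense(n_features, activation="linear"),
])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(normal, normal, epochs=10, batch_size=64, verbose=0)

def is_anomalous(x, threshold=0.1):
    # A high reconstruction error suggests the transaction deviates
    # from the normal behaviour the autoencoder has learned.
    err = np.mean((autoencoder.predict(x, verbose=0) - x) ** 2, axis=1)
    return err > threshold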

 CNN ARCHITECTURES
Convolutional Neural Networks (CNN, or ConvNet) are a special kind of multi-layer
neural networks, designed to recognize visual patterns directly from pixel images with minimal pre-
processing.

VGGNet

VGGNet consists of 16 convolutional layers and is very appealing because of its very
uniform architecture. Similar to AlexNet, it uses only 3x3 convolutions, but lots of filters. It was trained on 4
GPUs for 2–3 weeks. It is currently the most preferred choice in the community for extracting
features from images. The weight configuration of the VGGNet is publicly available and has been used
in many other applications and challenges as a baseline feature extractor. VGG stands for Visual
Geometry Group. VGGNet is a neural network that performed very well in the ImageNet Large
Scale Visual Recognition Challenge (ILSVRC) in 2014. It scored first place on the image
localization task and second place on the image classification task.

VGG-19:
VGG-19 is a convolutional neural network (CNN) architecture developed by the Visual
Geometry Group (VGG) at the University of Oxford. It is a deep neural network with 19 layers
and was introduced as part of the ImageNet Large Scale Visual Recognition Challenge
(ILSVRC) in 2014. One of the key innovations in VGG-19 is its deep architecture, which allows
for a more expressive representation of image features. VGG-19 is trained on the ImageNet data
set, which consists of millions of labeled images from thousands of categories.

Inception v3 algorithm:
Inception-v3 is a convolutional neural network (CNN) architecture developed by Google for
image recognition and classification tasks. It is an improvement over the original Inception
model. The Inception-v3 architecture is designed to be deeper and more powerful than previous
CNNs. It consists of 48 layers, including convolutional layers, pooling layers, and fully
connected layers. It also includes several unique features such as the Inception module. This
technique reduces the number of parameters in the model, which helps to reduce overfitting and
improve performance.
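Both architectures discussed above can be loaded, pretrained, through keras.applications; a small sketch (the input shapes are the standard defaults for each network):

from tensorflow.keras.applications import VGG19, InceptionV3

# Load both networks with ImageNet weights, without the classifier head,
# so they can serve as fixed feature extractors.
vgg = VGG19(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
inception = InceptionV3(weights="imagenet", include_top=False,
                        input_shape=(299, 299, 3))
print(len(vgg.layers), len(inception.layers))  # rough sense of each network's depth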

ADVANTAGES OF CNN
 Once trained, the predictions are pretty fast.

 A CNN can train with any number of inputs and layers.

 Neural networks work best with more data points.

 It is one of the most powerful models for classification.

DISADVANTAGES OF CNN

 High computational cost.

 They tend to need a lot of training data.

APPLICATIONS OF CNN

Image Recognition
CNNs are often used in image recognition systems. In 2012, an error rate of 0.23 percent
on the MNIST database was reported. Another paper on using CNNs for image classification
reported that the learning process was "surprisingly fast"; in the same paper, the best published
results as of 2011 were achieved on the MNIST database and the NORB database. Subsequently,
a similar CNN called AlexNet won the ImageNet Large Scale Visual Recognition Challenge
2012. Image recognition, in the context of machine vision, is the ability of software to identify
objects, places, people, writing, and actions in images. Computers can use machine vision
technologies, in combination with a camera and
artificial intelligence software, to achieve image recognition.
Video Analysis

Compared to image data domains, there is relatively little work on applying CNNs to
video classification. Video is more complex than images since it has another
(temporal) dimension. However, some extensions of CNNs into the video domain have been
explored. One approach is to treat space and time as equivalent dimensions of the input and
perform convolutions in both time and space. Another way is to fuse the features of two
convolutional neural networks, one for the spatial and one for the temporal stream. Long short-
term memory (LSTM) recurrent units are typically incorporated after the CNN to account for
inter-frame or inter-clip dependencies. Unsupervised learning schemes for training spatio-
temporal features have been introduced, based on Convolutional Gated Restricted Boltzmann Machines.

Historic and Environmental Collections

CNNs are also used for more complex purposes such as natural history
collections. These collections act as key players in documenting major parts of history such as
biodiversity, evolution, habitat loss, biological invasion, and climate change.

Understanding Climate

CNNs can help investigate the reasons why we see such drastic changes and how we could
experiment in curbing their effects. It is said that the data in such natural history collections can also
provide greater social and scientific insights, but this would require skilled human resources such as
researchers who can physically visit these types of repositories.

Decoding Facial Recognition

Facial recognition is broken down by a convolutional neural network into the following major
components:

 Identifying every face in the picture.

 Focusing on each face despite external factors, such as light, angle, pose, etc.

 Identifying unique features.

 Comparing all the collected data with already existing data in the database
to match a face with a name. A similar process is followed for scene
labelling as well.

Analysing Documents

Convolutional neural networks can also be used for document analysis. This is not just useful for
handwriting analysis, but also has a major stake in recognizers. For a machine to be able to scan an
individual's writing and then compare it to the wide database it has, it must execute almost a million
commands a minute. It is said that with the use of CNNs and newer models and algorithms, the error rate has
been brought down to a minimum of 0.4% at a character level, though its complete testing is yet to be
widely seen.
SOFTWARE ENVIRONMENT:

What is Python:

Below are some facts about Python.

Python is currently the most widely used multi-purpose, high-level programming language.

Python allows programming in object-oriented and procedural paradigms. Python programs
are generally smaller than those written in other programming languages like Java.

Programmers have to type relatively less, and the indentation requirements of the language make
their code readable all the time.

The Python language is used by almost all tech-giant companies like Google, Amazon,
Facebook, Instagram, Dropbox, Uber, etc.

The biggest strength of Python is its huge collection of standard libraries, which can be used for the
following:

 Machine Learning

 GUI Applications (like Kivy, Tkinter, PyQt, etc.)

 Web frameworks like Django (used by YouTube, Instagram, Dropbox)

 Image processing (like OpenCV, Pillow)

 Web scraping (like Scrapy, BeautifulSoup, Selenium)

 Test frameworks

 Multimedia

Advantages of Python:

Let's see how Python dominates over other languages.

1. Extensive Libraries

Python ships with an extensive library containing code for various purposes like regular
expressions, documentation generation, unit testing, web browsers, threading, databases, CGI,
email, image manipulation, and more. So, we don't have to write the complete code for those tasks
manually.
2. Extensible

As we have seen earlier, Python can be extended to other languages. You can write some of your
code in languages like C++ or C. This comes in handy, especially in projects.

3. Embeddable

Complimentary to extensibility, Python is embeddable as well. You can put your Python code in
your source code of a different language, like C++. This lets us add scripting capabilities to our
code in the other language.

4. Improved Productivity

The language's simplicity and extensive libraries render programmers more productive than
languages like Java and C++ do. Also, you need to write less to get more things
done.

5. IoT Opportunities

Since Python forms the basis of new platforms like Raspberry Pi, it finds the future bright for the
Internet of Things. This is a way to connect the language with the real world.

6. Simple and Easier to Learn

When working with Java, you may have to create a class to print 'Hello World'. But in Python,
just a print statement will do. It is also quite easy to learn, understand, and code. This is why,
when people pick up Python, they have a hard time adjusting to other, more verbose languages
like Java.

7. Readable

Because it is not such a verbose language, reading Python is much like reading English. This is
the reason why it is so easy to learn, understand, and code. It also does not need curly braces to
define blocks, and indentation is mandatory. This further aids the readability of the code.

8. Object-Oriented

This language supports both the procedural and object-oriented programming paradigms. While
functions help us with code reusability, classes and objects let us model the real world. A class
allows the encapsulation of data and functions into one.

9. Free and Open-Source

Like we said earlier, Python is freely available. But not only can you download Python for free,
but you can also download its source code, make changes to it, and even distribute it. It
downloads with an extensive collection of libraries to help you with your tasks.
10. Portable

When you code your project in a language like C++, you may need to make some changes to it if
you want to run it on another platform. But it isn’t the same with Python. Here, you need to code
only once, and you can run it anywhere. This is called Write Once Run Anywhere (WORA).
However, you need to be careful enough not to include any system-dependent features.

11. Interpreted

Lastly, we will say that it is an interpreted language. Since statements are executed one by
one, debugging is easier than in compiled languages.


Advantages of Python Over Other Languages:

1. Less Coding

Almost all tasks done in Python require less coding than when the same task is done in other
languages. Python also has awesome standard library support, so you don't have to search for
any third-party libraries to get your job done. This is the reason many people suggest
learning Python to beginners.

2. Affordable

Python is free, so individuals, small companies, or big organizations can leverage the freely
available resources to build applications. Python is popular and widely used, so it gives you better
community support.

The 2019 GitHub annual survey showed that Python had overtaken Java in the most popular
programming language category.

3. Python is for Everyone

Python code can run on any machine whether it is Linux, Mac or Windows. Programmers need
to learn different languages for different jobs but with Python, you can professionally build web
apps, perform data analysis and machine learning, automate things, do web scraping and also
build games and powerful visualizations. It is an all-rounder programming language.

Disadvantages of Python
So far, we’ve seen why Python is a great choice for your project. But if you choose it, you should
be aware of its consequences as well. Let’s now see the downsides of choosing Python over
another language.

1. Speed Limitations

We have seen that Python code is executed line by line. But since Python is interpreted, it often
results in slow execution. This, however, isn’t a problem unless speed is a focal point for the
project. In other words, unless high speed is a requirement, the benefits offered by Python are
enough to distract us from its speed limitations.

2. Weak in Mobile Computing and Browsers

While it serves as an excellent server-side language, Python is rarely seen on the client
side. Besides that, it is rarely ever used to implement smartphone-based applications. One such
application is called Carbonnelle.

The reason it is not so famous, despite the existence of Brython, is that it isn't that secure.

3. Design Restrictions

As you know, Python is dynamically-typed. This means that you don’t need to declare the type
of variable while writing the code. It uses duck-typing. But wait, what’s that? Well, it just means
that if it looks like a duck, it must be a duck. While this is easy on the programmers during
coding, it can raise run-time errors.

4. Underdeveloped Database Access Layers

Compared to more widely used technologies like JDBC (Java DataBase Connectivity) and
ODBC (Open DataBase Connectivity), Python's database access layers are a
bit underdeveloped. Consequently, it is less often applied in huge enterprises.

5. Simple

No, we’re not kidding. Python’s simplicity can indeed be a problem. Take my example. I don’t
do Java, I’m more of a Python person. To me, its syntax is so simple that the verbosity of Java
code seems unnecessary.

This was all about the Advantages and Disadvantages of Python Programming Language.

History of Python:

What do the alphabet and the programming language Python have in common? Right, both start
with ABC. If we are talking about ABC in the Python context, it's clear that the programming
language ABC is meant. ABC is a general-purpose programming language and programming
environment, which was developed in the Netherlands, Amsterdam, at the CWI (Centrum
Wiskunde & Informatica). The greatest achievement of ABC was to influence the design of
Python. Python was conceptualized in the late 1980s. Guido van Rossum worked at that time on a
project at the CWI called Amoeba, a distributed operating system. In an interview with Bill
Venners, Guido van Rossum said: "In the early 1980s, I worked as an implementer on a team
building a language called ABC at Centrum voor Wiskunde en Informatica (CWI). I don't know
how well people know ABC's influence on Python. I try to mention ABC's influence because I'm
indebted to everything I learned during that project and to the people who worked on it." Later on
in the same interview, Guido van Rossum continued: "I remembered all my experience and some
of my frustration with ABC. I decided to try to design a simple scripting language that possessed
some of ABC's better properties, but without its problems. So I started typing. I created a simple
virtual machine, a simple parser, and a simple runtime. I made my own version of the various
ABC parts that I liked. I created a basic syntax, used indentation for statement grouping instead
of curly braces or begin-end blocks, and developed a small number of powerful data types: a
hash table (or dictionary, as we call it), a list, strings, and numbers."

What is Machine Learning:

Before we take a look at the details of various machine learning methods, let's start by looking at
what machine learning is, and what it isn't. Machine learning is often categorized as a subfield of
artificial intelligence, but I find that categorization can often be misleading at first brush. The
study of machine learning certainly arose from research in this context, but in the data science
application of machine learning methods, it's more helpful to think of machine learning as a
means of building models of data.

Fundamentally, machine learning involves building mathematical models to help understand
data. "Learning" enters the fray when we give these models tunable parameters that can be
adapted to observed data; in this way the program can be considered to be "learning" from the
data. Once these models have been fit to previously seen data, they can be used to predict and
understand aspects of newly observed data. I'll leave to the reader the more philosophical
digression regarding the extent to which this type of mathematical, model-based "learning" is
similar to the "learning" exhibited by the human brain. Understanding the problem setting in
machine learning is essential to using these tools effectively, and so we will start with some
broad categorizations of the types of approaches we'll discuss here.

Categories of Machine Learning:

At the most fundamental level, machine learning can be categorized into two main types:
supervised learning and unsupervised learning.

Supervised learning involves somehow modeling the relationship between measured features of
data and some label associated with the data; once this model is determined, it can be used to
apply labels to new, unknown data. This is further subdivided into classification tasks
and regression tasks: in classification, the labels are discrete categories, while in regression, the
labels are continuous quantities. We will see examples of both types of supervised learning in the
following section.

Unsupervised learning involves modeling the features of a dataset without reference to any label,
and is often described as "letting the dataset speak for itself." These models include tasks such
as clustering and dimensionality reduction. Clustering algorithms identify distinct groups of data,
while dimensionality reduction algorithms search for more succinct representations of the data.
We will see examples of both types of unsupervised learning in the following section.
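As a brief, self-contained illustration of the two categories (using scikit-learn's bundled iris data purely for convenience):

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X, y = load_iris(return_X_y=True)

# Supervised: the classifier is fit on features *and* labels.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print("classifier accuracy:", clf.score(X, y))

# Unsupervised: the clustering sees only the features, never the labels.
km = KMeans(n_clusters=3, n_init=10).fit(X)
print("cluster sizes:", [list(km.labels_).count(i) for i in range(3)])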

Need for Machine Learning

Human beings, at this moment, are the most intelligent and advanced species on earth because
they can think, evaluate, and solve complex problems. On the other side, AI is still in its initial
stage and hasn't surpassed human intelligence in many aspects. The question, then, is: what
is the need to make machines learn? The most suitable reason for doing this is "to make
decisions, based on data, with efficiency and scale".

Lately, organizations have been investing heavily in newer technologies like Artificial Intelligence,
Machine Learning, and Deep Learning to get key information from data to perform several
real-world tasks and solve problems. We can call these data-driven decisions taken by machines,
particularly to automate the process. These data-driven decisions can be used, instead of
programming logic, in problems that cannot be programmed inherently. The fact is that we
can't do without human intelligence, but the other aspect is that we all need to solve real-world
problems with efficiency at a huge scale. That is why the need for machine learning arises.

Challenges in Machine Learning:

While machine learning is rapidly evolving, making significant strides with cybersecurity and
autonomous cars, this segment of AI as a whole still has a long way to go. The reason behind this is
that ML has not been able to overcome a number of challenges. The challenges that ML is facing
currently are:

Quality of data − Having good-quality data for ML algorithms is one of the biggest challenges.
Use of low-quality data leads to problems related to data preprocessing and feature
extraction.

Time-consuming task − Another challenge faced by ML models is the consumption of time,
especially for data acquisition, feature extraction, and retrieval.

Lack of specialist persons − As ML technology is still in its infancy, the availability of expert
resources is limited.

No clear objective for formulating business problems − Having no clear objective and well-
defined goal for business problems is another key challenge for ML, because this technology is
not that mature yet.

Issue of overfitting & underfitting − If the model is overfitting or underfitting, it cannot
represent the problem well.

Curse of dimensionality − Another challenge ML models face is having too many features in the data
points. This can be a real hindrance.

Difficulty in deployment − The complexity of ML models makes them quite difficult to deploy
in real life.

Applications of Machine Learning:

Machine learning is the most rapidly growing technology and, according to researchers, we are in
the golden year of AI and ML. It is used to solve many real-world complex problems which
cannot be solved with a traditional approach. The following are some real-world applications of ML:

 Emotion analysis

 Sentiment analysis

 Error detection and prevention

 Weather forecasting and prediction

 Stock market analysis and forecasting

 Speech synthesis

 Speech recognition

 Customer segmentation

 Object recognition

 Fraud detection

 Fraud prevention

 Recommendation of products to customers in online shopping

How to Start Learning Machine Learning?
Arthur Samuel coined the term “Machine Learning” in 1959 and defined it as a “Field of study
that gives computers the capability to learn without being explicitly programmed”.

And that was the beginning of Machine Learning! In modern times, Machine Learning is one of
the most popular (if not the most!) career choices. According to Indeed, Machine Learning
Engineer Is The Best Job of 2019 with a 344% growth and an average base salary
of $146,085 per year.

But there is still a lot of doubt about what exactly Machine Learning is and how to start learning
it. So this article deals with the basics of Machine Learning and also the path you can follow to
eventually become a full-fledged Machine Learning Engineer. Now let's get started!!!

How to start learning ML?

This is a rough roadmap you can follow on your way to becoming an insanely talented Machine
Learning Engineer. Of course, you can always modify the steps according to your needs to reach
your desired end-goal!

Step 1 – Understand the Prerequisites

In case you are a genius, you could start ML directly but normally, there are some prerequisites
that you need to know which include Linear Algebra, Multivariate Calculus, Statistics, and
Python. And if you don’t know these, never fear! You don’t need a Ph.D. degree in these topics
to get started but you do need a basic understanding.

(a) Learn Linear Algebra and Multivariate Calculus

Both Linear Algebra and Multivariate Calculus are important in Machine Learning. However,
the extent to which you need them depends on your role as a data scientist. If you are more
focused on application heavy machine learning, then you will not be that heavily focused on
maths as there are many common libraries available. But if you want to focus on R&D in
Machine Learning, then mastery of Linear Algebra and Multivariate Calculus is very important
as you will have to implement many ML algorithms from scratch.

(b) Learn Statistics

Data plays a huge role in Machine Learning. In fact, around 80% of your time as an ML expert
will be spent collecting and cleaning data. And statistics is a field that handles the collection,
analysis, and presentation of data. So it is no surprise that you need to learn it!!!
Some of the key concepts in statistics that are important are Statistical Significance, Probability
Distributions, Hypothesis Testing, Regression, etc. Bayesian Thinking is also a very
important part of ML, dealing with concepts like Conditional Probability, Priors and
Posteriors, Maximum Likelihood, etc.

(c) Learn Python


Some people prefer to skip Linear Algebra, Multivariate Calculus, and Statistics and learn them
as they go along, with trial and error. But the one thing that you absolutely cannot skip is Python!
While there are other languages you can use for Machine Learning, like R, Scala, etc., Python is
currently the most popular language for ML. In fact, there are many Python libraries that are
specifically useful for Artificial Intelligence and Machine Learning, such
as Keras, TensorFlow, Scikit-learn, etc.

So if you want to learn ML, it’s best if you learn Python! You can do that using various online
resources and courses such as Fork Python available Free on GeeksforGeeks.

Step 2 – Learn Various ML Concepts

Now that you are done with the prerequisites, you can move on to actually learning ML (Which
is the fun part!!!) It’s best to start with the basics and then move on to the more complicated
stuff. Some of the basic concepts in ML are:

(a) Terminologies of Machine Learning

 Model – A model is a specific representation learned from data by applying some


machine learning algorithm. A model is also called a hypothesis.

 Feature – A feature is an individual measurable property of the data. A set of numeric


features can be conveniently described by a feature vector. Feature vectors are fed as
input to the model. For example, in order to predict a fruit, there may be features like
color, smell, taste, etc.

 Target (Label) – A target variable or label is the value to be predicted by our model. For
the fruit example discussed in the feature section, the label with each set of input would
be the name of the fruit like apple, orange, banana, etc.

 Training – The idea is to give a set of inputs (features) and their expected outputs (labels),
so that after training we will have a model (hypothesis) that will map new data to one of
the categories it was trained on.

 Prediction – Once our model is ready, it can be fed a set of inputs to which it will provide
a predicted output(label).
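A toy example tying these terms together (the features, labels, and numbers are made up for illustration):

from sklearn.tree import DecisionTreeClassifier

# Features: [color code (0 = red, 1 = orange), weight in grams].
features = [[0, 150], [0, 170], [1, 120], [1, 130]]
labels = ["apple", "apple", "orange", "orange"]  # targets

model = DecisionTreeClassifier()  # the hypothesis to be learned
model.fit(features, labels)       # training
print(model.predict([[1, 125]]))  # prediction -> ['orange']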

(b) Types of Machine Learning

 Supervised Learning – This involves learning from a training dataset with labeled data
using classification and regression models. This learning process continues until the
required level of performance is achieved.
 Unsupervised Learning – This involves using unlabelled data and then finding the
underlying structure in the data in order to learn more and more about the data itself using
factor and cluster analysis models.

 Semi-supervised Learning – This involves using unlabelled data like Unsupervised


Learning with a small amount of labeled data. Using labeled data vastly increases the
learning accuracy and is also more cost-effective than Supervised Learning.

 Reinforcement Learning – This involves learning optimal actions through trial and error.
So the next action is decided by learning behaviors that are based on the current state and
that will maximize the reward in the future.

Advantages of Machine Learning:

1. Easily Identifies Trends and Patterns

Machine Learning can review large volumes of data and discover specific trends and patterns
that would not be apparent to humans. For instance, for an e-commerce website like Amazon, it
serves to understand the browsing behaviors and purchase histories of its users to help cater to
the right products, deals, and reminders relevant to them. It uses the results to reveal relevant
advertisements to them.

2. No Human Intervention Needed (Automation)

With ML, you don't need to babysit your project every step of the way. Since it means giving
machines the ability to learn, it lets them make predictions and also improve the algorithms on
their own. A common example of this is antivirus software; it learns to filter new threats as
they are recognized. ML is also good at recognizing spam.

3. Continuous Improvement

As ML algorithms gain experience, they keep improving in accuracy and efficiency. This lets
them make better decisions. Say you need to make a weather forecast model. As the amount of
data you have keeps growing, your algorithms learn to make more accurate predictions faster.

4. Handling multi-dimensional and multi-variety data

Machine Learning algorithms are good at handling data that are multi-dimensional and multi-
variety, and they can do this in dynamic or uncertain environments.

5. Wide Applications

You could be an e-tailer or a healthcare provider and make ML work for you. Where it does
apply, it holds the capability to help deliver a much more personal experience to customers while
also targeting the right customers.
Disadvantages of Machine Learning:

1. Data Acquisition

Machine Learning requires massive data sets to train on, and these should be inclusive/unbiased,
and of good quality. There can also be times where they must wait for new data to be generated.

2. Time and Resources

ML needs enough time to let the algorithms learn and develop enough to fulfill their purpose
with a considerable amount of accuracy and relevancy. It also needs massive resources to
function. This can mean additional requirements of computer power for you.

3. Interpretation of Results

Another major challenge is the ability to accurately interpret results generated by the algorithms.
You must also carefully choose the algorithms for your purpose.

4. High error-susceptibility

Machine Learning is autonomous but highly susceptible to errors. Suppose you train an
algorithm with data sets small enough to not be inclusive. You end up with biased predictions
coming from a biased training set. This leads to irrelevant advertisements being displayed to
customers. In the case of ML, such blunders can set off a chain of errors that can go undetected
for long periods of time. And when they do get noticed, it takes quite some time to recognize the
source of the issue, and even longer to correct it.

Python Development Steps:

Guido van Rossum published the first version of the Python code (version 0.9.0) at alt.sources in
February 1991. This release already included exception handling, functions, and the core data
types of list, dict, str, and others. It was also object-oriented and had a module system.
Python version 1.0 was released in January 1994. The major new features included in this release
were the functional programming tools lambda, map, filter, and reduce, which Guido van
Rossum never liked. Six and a half years later, in October 2000, Python 2.0 was introduced. This
release included list comprehensions and a full garbage collector, and it supported
Unicode. Python flourished for another 8 years in the versions 2.x before the next major release,
Python 3.0 (also known as "Python 3000" and "Py3K"), came out. Python 3 is not backwards
compatible with Python 2.x. The emphasis in Python 3 was on the removal of duplicate
programming constructs and modules, thus fulfilling or coming close to fulfilling the 13th law of
the Zen of Python: "There should be one -- and preferably only one -- obvious way to do
it." Some changes in Python 3.0:
 Print is now a function

 Views and iterators instead of lists

 The rules for ordering comparisons have been simplified. E.g. a heterogeneous list cannot
be sorted, because all the elements of a list must be comparable to each other.

 There is only one integer type left, i.e. int; long is int as well.

 The division of two integers returns a float instead of an integer. "//" can be used to get
the "old" behaviour.

 Text vs. data instead of Unicode vs. 8-bit
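The changes above can be verified directly in a Python 3 interpreter:

print("hello")         # print is now a function, not a statement

print(7 / 2)           # 3.5 -- dividing two integers returns a float
print(7 // 2)          # 3   -- "//" recovers the old floor-division behaviour

print(type(2 ** 100))  # <class 'int'> -- a single, unbounded integer type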

Purpose:

We demonstrated that our approach enables successful segmentation of intra-retinal layers—


even with low-quality images containing speckle noise, low contrast, and different intensity
ranges throughout—with the assistance of the ANIS feature.

Python

Python is an interpreted high-level programming language for general-purpose programming.


Created by Guido van Rossum and first released in 1991, Python has a design philosophy that
emphasizes code readability, notably using significant whitespace.

Python features a dynamic type system and automatic memory management. It supports multiple
programming paradigms, including object-oriented, imperative, functional and procedural, and
has a large and comprehensive standard library.

 Python is Interpreted − Python is processed at runtime by the interpreter. You do not need
to compile your program before executing it. This is similar to PERL and PHP.

 Python is Interactive − you can actually sit at a Python prompt and interact with the
interpreter directly to write your programs.

Python also acknowledges that speed of development is important. Readable and terse code is
part of this, and so is access to powerful constructs that avoid tedious repetition of code.
Maintainability also ties into this; it may be an all but useless metric, but it does say something
about how much code you have to scan, read, and/or understand to troubleshoot problems or
tweak behaviors. This speed of development, the ease with which a programmer of other
languages can pick up basic Python skills, and the huge standard library are key to another area
where Python excels: all its tools have been quick to implement, have saved a lot of time, and several
of them have later been patched and updated by people with no Python background, without
breaking.
Modules Used in Project:

TensorFlow

TensorFlow is a free and open-source software library for dataflow and differentiable
programming across a range of tasks. It is a symbolic math library, and is also used for machine
learning applications such as neural networks. It is used for both research and production
at Google.

TensorFlow was developed by the Google Brain team for internal Google use. It was released
under the Apache 2.0 open-source license on November 9, 2015.

Numpy

Numpy is a general-purpose array-processing package. It provides a high-performance
multidimensional array object and tools for working with these arrays.

It is the fundamental package for scientific computing with Python. It contains various features,
including these important ones:

 A powerful N-dimensional array object

 Sophisticated (broadcasting) functions

 Tools for integrating C/C++ and Fortran code

 Useful linear algebra, Fourier transform, and random number capabilities

Besides its obvious scientific uses, Numpy can also be used as an efficient multi-dimensional
container of generic data. Arbitrary data types can be defined using Numpy, which allows
Numpy to seamlessly and speedily integrate with a wide variety of databases.
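A short taste of the features listed above:

import numpy as np

a = np.arange(6).reshape(2, 3)  # an N-dimensional array object
print(a * 10)                   # broadcasting: the scalar is applied elementwise

m = np.random.rand(3, 3)
print(np.linalg.det(m))         # linear algebra and random number capabilities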

Pandas

Pandas is an open-source Python library providing high-performance data manipulation and analysis tools built on its powerful data structures. Before Pandas, Python was mainly used for data munging and preparation and contributed very little to data analysis; Pandas solved this problem. Using Pandas, we can accomplish five typical steps in the processing and analysis of data, regardless of the origin of the data: load, prepare, manipulate, model, and analyze. Python with Pandas is used in a wide range of academic and commercial domains, including finance, economics, statistics, and analytics.
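For instance, the load-prepare-analyze steps described above might look like this (a sketch only; the file name and column names are illustrative, not the project's actual dataset):

    import pandas as pd

    # Load: read a (hypothetical) transactions file into a DataFrame.
    df = pd.read_csv("transactions.csv")

    # Prepare: drop incomplete rows and derive a new column.
    df = df.dropna()
    df["is_large"] = df["amount"] > 1000

    # Analyze: summary statistics grouped by the derived category.
    print(df.groupby("is_large")["amount"].describe())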

Matplotlib

Matplotlib is a Python 2D plotting library which produces publication-quality figures in a variety of hardcopy formats and interactive environments across platforms. Matplotlib can be used in Python scripts, the Python and IPython shells, the Jupyter Notebook, web application servers, and four graphical user interface toolkits. Matplotlib tries to make easy things easy and hard things possible. You can generate plots, histograms, power spectra, bar charts, error charts, scatter plots, etc., with just a few lines of code. For examples, see the sample plots and thumbnail gallery in the Matplotlib documentation.

For simple plotting the pyplot module provides a MATLAB-like interface, particularly when combined with IPython. For the power user, you have full control of line styles, font properties, axes properties, etc., via an object-oriented interface or via a set of functions familiar to MATLAB users.
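A few lines of pyplot are indeed enough for a simple plot; for example (the data values here are made up for illustration):

    import matplotlib.pyplot as plt

    epochs = [1, 2, 3, 4, 5]
    mse = [0.9, 0.5, 0.3, 0.2, 0.15]   # illustrative values only

    plt.plot(epochs, mse, marker="o")
    plt.xlabel("Epoch")
    plt.ylabel("MSE")
    plt.title("Training error")
    plt.show()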

Scikit-learn

Scikit-learn provides a range of supervised and unsupervised learning algorithms via a consistent
interface in Python. It is licensed under a permissive simplified BSD license and is distributed
under many Linux distributions, encouraging academic and commercial use.
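As a small illustration of that consistent interface (a sketch on a toy generated dataset standing in for transaction data, not the project's actual model):

    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score

    # A toy binary-classification dataset for demonstration purposes.
    X, y = make_classification(n_samples=500, n_features=10, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    # Every scikit-learn estimator follows the same fit/predict interface.
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(accuracy_score(y_test, clf.predict(X_test)))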

Install Python Step-by-Step on Windows and Mac:


Python, a versatile programming language, doesn't come pre-installed on your computer. Python was first released in 1991 and remains a very popular high-level programming language today. Its design philosophy emphasizes code readability, with its notable use of significant whitespace.

The object-oriented approach and language constructs provided by Python enable programmers to write clear and logical code for projects. This software does not come pre-packaged with Windows.

How to Install Python on Windows and Mac :

There have been several updates to Python over the years. The question is: how do you install Python? It might be confusing for a beginner who wants to start learning Python, but this tutorial will resolve that. At the time of writing, the latest version of Python is 3.7.4, in other words Python 3.

Note: Python 3.7.4 cannot be used on Windows XP or earlier.

Before you start with the installation process of Python, you first need to know your system requirements. You must download the Python version that matches your system type, i.e. your operating system and processor. The system used here is a Windows 64-bit operating system, so the steps below install Python 3.7.4 on a Windows device. The steps on how to install Python on Windows 10, 8, and 7 are divided into four parts to help you understand them better.

Download the Correct version into the system

Step 1: Go to the official site to download and install Python using Google Chrome or any other web browser, or click on the following link: https://www.python.org
Now check for the latest and the correct version for your operating system.

Step 2: Click on the Download Tab.

Step 3: You can either select the yellow "Download Python 3.7.4" button for Windows, or scroll further down and click on the download for your required version. Here, we are downloading the most recent Python version for Windows, 3.7.4.
Step 4: Scroll down the page until you find the Files option.

Step 5: Here you see a different version of python along with the operating system.

• To download 32-bit Python for Windows, you can select any one of three options: Windows x86 embeddable zip file, Windows x86 executable installer, or Windows x86 web-based installer.
• To download 64-bit Python for Windows, you can select any one of three options: Windows x86-64 embeddable zip file, Windows x86-64 executable installer, or Windows x86-64 web-based installer.

Here we will use the Windows x86-64 web-based installer. This completes the first part, choosing which version of Python to download. Now we move ahead with the second part of installing Python, i.e. the installation itself.

Note: To know the changes or updates made in a version, you can click on the Release Notes option.

Installation of Python

Step 1: Go to Downloads and open the downloaded Python installer to carry out the installation process.

Step 2: Before you click on Install Now, make sure to tick Add Python 3.7 to PATH.
Step 3: Click on Install Now. After the installation is successful, click on Close.
With these three steps, you have successfully and correctly installed Python. Now it is time to verify the installation.

Note: The installation process might take a couple of minutes.

Verify the Python Installation

Step 1: Click on Start

Step 2: In the Windows Run Command, type “cmd”.

Step 3: Open the Command prompt option.

Step 4: Let us test whether Python is correctly installed. Type python -V and press Enter.
Step 5: You will see the installed version, e.g. Python 3.7.4.

Note: If you have an earlier version of Python already installed, you may want to uninstall it before installing the new one.

Check how the Python IDLE works

Step 1: Click on Start

Step 2: In the Windows search, type "idle".

Step 3: Click on IDLE (Python 3.7 64-bit) and launch the program

Step 4: To go ahead with working in IDLE, you must first save the file. Click on File > Save.
Step 5: Name the file and set the save-as type to Python files. Click on Save. Here the file has been named Hey World.

Step 6: Now, for example, enter a print statement and run the module to confirm that IDLE works.


SYSTEM TESTING

5.1.1 SYSTEM TEST
The purpose of testing is to discover errors. Testing is the process of trying to discover every conceivable fault or weakness in a work product. It provides a way to check the functionality of components, subassemblies, assemblies, and/or a finished product. It is the process of exercising software with the intent of ensuring that the software system meets its requirements and user expectations and does not fail in an unacceptable manner. There are various types of test, and each test type addresses a specific testing requirement.

5.1.2 Unit Testing
Unit testing involves the design of test cases that validate that the internal program logic is functioning properly and that program inputs produce valid outputs. All decision branches and internal code flow should be validated. It is the testing of individual software units of the application; it is done after the completion of an individual unit and before integration. This is structural testing that relies on knowledge of the unit's construction and is invasive. Unit tests perform basic tests at component level and test a specific business process, application, and/or system configuration. Unit tests ensure that each unique path of a business process performs accurately to the documented specifications and contains clearly defined inputs and expected results.
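In Python this is commonly done with the built-in unittest module. A minimal sketch (the function under test is hypothetical, not the project's actual scoring code):

    import unittest

    def suspicion_score(amount, limit):
        """Toy example: flag transactions above the card limit."""
        return 1.0 if amount > limit else 0.0

    class TestSuspicionScore(unittest.TestCase):
        def test_within_limit(self):
            self.assertEqual(suspicion_score(100, 500), 0.0)

        def test_over_limit(self):
            self.assertEqual(suspicion_score(900, 500), 1.0)

    if __name__ == "__main__":
        unittest.main()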

5.1.3 Integration Testing
Integration tests are designed to test integrated software components to determine if they actually run as one program. Testing is event driven and is more concerned with the basic outcome of screens or fields. Integration tests demonstrate that although the components were individually satisfactory, as shown by successful unit testing, the combination of components is correct and consistent. Integration testing is specifically aimed at exposing the problems that arise from the combination of components.
5.1.4 Functional Test
Functional tests provide systematic demonstrations that functions tested are available as specified by the business and technical requirements, system documentation, and user manuals.
Functional testing is centered on the following items:

Valid Input : identified classes of valid input must be accepted.

Invalid Input : identified classes of invalid input must be rejected.

Functions : identified functions must be exercised.

Output : identified classes of application outputs must be exercised.

Systems/Procedures : interfacing systems or procedures must be invoked.

Organization and preparation of functional tests is focused on requirements, key functions, or special test cases. In addition, systematic coverage pertaining to identifying business process flows, data fields, predefined processes, and successive processes must be considered for testing. Before functional testing is complete, additional tests are identified and the effective value of current tests is determined.

5.1.5 System Test

System testing ensures that the entire integrated software system meets requirements. It tests a configuration to ensure known and predictable results. An example of system testing is the configuration-oriented system integration test. System testing is based on process descriptions and flows, emphasizing pre-driven process links and integration points.

5.1.6 White Box Testing

White Box Testing is testing in which the software tester has knowledge of the inner workings, structure, and language of the software, or at least its purpose. It is used to test areas that cannot be reached from a black box level.
5.1.7 Black Box Testing
Black Box Testing is testing the software without any knowledge of the inner workings, structure, or language of the module being tested. Black box tests, like most other kinds of tests, must be written from a definitive source document, such as a specification or requirements document. It is testing in which the software under test is treated as a black box: you cannot "see" into it. The test provides inputs and responds to outputs without considering how the software works.

Unit Testing

Unit testing is usually conducted as part of a combined code and unit test phase
of the software lifecycle, although it is not uncommon for coding and unit testing to be
conducted as two distinct phases.

Test strategy and approach

Field testing will be performed manually and functional tests will be written in
detail.
Test objectives
 All field entries must work properly.
 Pages must be activated from the identified link.
 The entry screen, messages and responses must not be delayed.

Features to be tested
 Verify that the entries are of the correct format
 No duplicate entries should be allowed
 All links should take the user to the correct page.

5.1.8 Integration Testing

Software integration testing is the incremental integration testing of two or more integrated software components on a single platform to produce failures caused by interface defects.

The task of the integration test is to check that components or software applications (e.g. components in a software system or, one step up, software applications at the company level) interact without error.

Test Results: All the test cases mentioned above passed successfully. No defects were encountered.

Acceptance Testing
User Acceptance Testing is a critical phase of any project and requires significant participation by the end user. It also ensures that the system meets the functional requirements.

Test Results: All the test cases mentioned above passed successfully. No defects were encountered.
CHAPTER-6:

RESULTS AND SCREENSHOTS:

To run the project, double-click on the 'run.bat' file to get the below screen.

In the above screen, click on the 'Upload Credit Card Fraud Dataset' button to upload the dataset and get the below output.

In the above screen, select and upload the fraud dataset, then click on the 'Open' button to load the dataset and get the below output.

In the above screen the dataset is loaded. From this dataset we need to calculate the fuzzy membership function, so click on the 'Calculate Fuzzy Membership Functions' button to calculate the fuzzy values and get the below output.
Note: in this project we have used 0 for LOW, 1 for MEDIUM, and 2 for HIGH, as in the sketch below.
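A sketch of this encoding (the threshold values here are purely illustrative; in the project the membership values are derived from the dataset):

    # Map a transaction amount to the LOW/MEDIUM/HIGH codes used by the GUI.
    def fuzzy_label(amount, medium_threshold=500, high_threshold=2000):
        if amount < medium_threshold:
            return 0   # LOW
        elif amount < high_threshold:
            return 1   # MEDIUM
        return 2       # HIGH

    print([fuzzy_label(a) for a in (120, 800, 5000)])   # [0, 1, 2]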

In the above screen we have extracted fuzzy values from the dataset. In the graph, the x-axis represents the type of transaction and the y-axis represents the number of records available in the dataset for that transaction type. Now close the graph and click on the 'Run Fuzzy Logic Algorithm' button to train the fuzzy logic on the membership values and get the below output.
In the above screen fuzzy training has completed and we got an MSE of 0.59 (the algorithms are trained on a randomly split train and test set, so the MSE may vary from run to run). Now click on the 'Run LSTM Algorithm' button to train the LSTM and get the below output.
In the above screen the LSTM achieved an MSE of 0.0032, which is lower than that of the fuzzy algorithm. Now click on the 'LSTM Training Graph' button to get the below graph.

In the above graph the x-axis represents the training epoch and the y-axis represents the MSE. We can see that the MSE decreases with each epoch, and for any model a decreasing MSE is considered desirable. Now close the graph and click on the 'MSE Comparison Graph' button to get the below graph.

In the above graph the x-axis represents the algorithm names and the y-axis represents the MSE values. We can see that the LSTM's MSE is very low compared to the fuzzy algorithm's, so the LSTM performs better than the fuzzy algorithm.
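For reference, an LSTM of the kind compared above could be trained and scored on MSE roughly as follows (a minimal Keras sketch with placeholder random data and shapes, not the exact project code):

    import numpy as np
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import LSTM, Dense

    # Placeholder data: 1000 sequences of 10 time steps with 5 features each.
    X = np.random.rand(1000, 10, 5)
    y = np.random.rand(1000, 1)

    model = Sequential([
        LSTM(32, input_shape=(10, 5)),
        Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=5, verbose=0)

    print("MSE:", model.evaluate(X, y, verbose=0))   # lower is better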
CHAPTER-7:

CONCLUSION:
In this research, we have presented a novel scheme for CCFD by combining a rule-based fuzzy inference system with a learning component that uses a back-propagation neural network. We have tested the proposed system by carrying out experiments using stochastic models. Based on the results obtained, we infer that incorporating a neural network along with fuzzy inferencing is appropriate for addressing this sort of real-world issue.
References :
1. Inscoe, S. W.: Global Consumers: Losing Confidence in the Battle against Fraud. ACI Universal Payments, June 2014, www.aciworldwide.com.

2. Credit Card Fraud Statistics-2013, http://www.cardhub.com/edu/creditfraud-Statistics, 5 Feb 2015.

3. Online Fraud is Twelve Times Higher than Offline Fraud, http://sellitontheweb.com/ezine/news034.shtml, 20 June 2007.

4. Ghosh, S., Reilly, D.L.: Credit card fraud detection with a neural-network. In: Proceedings of the Annual International Conference on System Science, pp. 621–630 (1994).

5. Aleskerov, E., Freisleben, B., Rao, B.: CARDWATCH: a neural-network based database mining system for credit card fraud detection. In: Proceedings of the Computational Intelligence for Financial Engineering, pp. 220–226 (1997).

6. Dorronsoro, J.R., Ginel, F., Sanchez, C., Cruz, C.S.: Neural fraud detection in credit card operations. IEEE Transactions on Neural Networks, pp. 827–834 (1997).

7. Liu, P., Li, L.: A Game-Theoretic Approach for Attack Prediction. Technical Report, PSU-S2-2002-01, Penn State University (2002).

8. Vatsa, V., Sural, S., Majumdar, A.K.: A game-theoretic approach to credit card fraud detection. In: Proceedings of the International Conference on Information Systems Security, Lecture Notes in Computer Science, vol. 3803, pp. 263–276 (2005).

9. Quah, J.T.S., Srinagesh, M.: Real-time credit fraud detection using computational intelligence. Expert Systems with Applications, 35, pp. 1721–1732 (2008).

10. Panigrahi, S., Kundu, A., Sural, S., Majumdar, A.: Credit card fraud detection: a fusion approach using Dempster–Shafer theory and Bayesian learning.
