Credit Card Fraud Detection Using Fuzzy Logic and Neural Network
ABSTRACT:
In this paper, a two-stage neuro-fuzzy expert system is proposed for credit card fraud
detection. An incoming transaction is initially processed by a pattern-matching system in the first
stage. This component comprises a fuzzy clustering module and an address-matching module,
each of which assigns a score to the transaction based on its extent of deviation. A fuzzy
inference system combines these scores into a suspicion score and accordingly classifies the
transaction as genuine, suspicious, or fraudulent. Once a transaction is flagged as suspicious, a
neural network trained on historical transactions is employed in the second stage to verify
whether it is an actual fraudulent action or an occasional deviation by the legitimate user. The
effectiveness of the proposed system has been verified through experiments and comparative
analysis with other systems.
INTRODUCTION:
The use of credit cards as a mode of payment for online as well as everyday purchases has
increased over the last few decades. As a consequence, credit card fraud has also proliferated.
Credit card fraud is practiced with the intention of acquiring goods without paying for them.
Such illegitimate actions are carried out in two ways: by using stolen credit cards physically
(physical fraud) and by exploiting the card details without the knowledge of the genuine
cardholder via online transactions (virtual fraud). The losses due to credit card fraud have been
very high in the past few years.
LITERATURE SURVEY:
TITLE:
Credit card fraud detection with a neural-network
AUTHORS:
Ghosh, S., Reilly, D.L.
ABSTRACT:
Frauds in credit card transactions are common today, as credit card payment methods are used
more and more frequently. This is due to the advancement of technology and the increase in
online transactions, resulting in frauds that cause huge financial losses. Therefore, effective
methods are needed to reduce these losses. In addition, fraudsters find ways to steal users'
credit card information by sending fake SMS messages and calls, and through masquerading
attacks, phishing attacks, and so on. This paper aims at using multiple machine learning
algorithms, such as the support vector machine (SVM), k-nearest neighbors (k-NN), and artificial
neural network (ANN), to predict the occurrence of fraud. Further, we compare the supervised
machine learning and deep learning techniques employed to differentiate between fraud and
non-fraud transactions.
2.1 EXISTING SYSTEM:
Various techniques have been used to detect credit card fraud, including neural networks (NN),
genetic algorithms, data mining, game-theoretic approaches, support vector machines, and meta-
learning. The artificial neural network (ANN) was the first technique employed in credit card
fraud detection (CCFD). Ghosh and Reilly performed a feasibility study for Mellon Bank to test
the potency of the ANN in CCFD and achieved a 20–40% reduction in fraud losses [4]. An
NN-based data mining technique known as CARDWATCH was proposed by Aleskerov et al. [5].
Dorronsoro et al. presented an online CCFD system based on a neural classifier in which Fisher's
discriminant analysis is employed to distinguish fraudulent activities from normal ones [6].
3.1 PROPOSED SYSTEM:
In this paper, the author applies fuzzy logic membership functions to classify credit card
transactions as fraudulent, suspicious, or legal. The proposed approach first extracts the
following values from the dataset using fuzzy membership functions. After extracting all values,
the class label is assigned: LEGAL if LOW values dominate, SUSPICIOUS if MEDIUM values
dominate, and FRAUD if HIGH values dominate.
In the screen below, we show the code to extract all membership values. Here we use 0 for LOW,
1 for MEDIUM, and 2 for HIGH, since the LSTM and fuzzy modules accept only numeric values,
not character values.
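Illustratively, that encoding step might look like the Python sketch below; the column names are invented placeholders, not the actual fields of the project's dataset.

import pandas as pd

# Hypothetical sample; the real column names come from the fraud dataset.
df = pd.DataFrame({"overdraft": ["LOW", "HIGH", "MEDIUM"],
                   "usage_frequency": ["MEDIUM", "HIGH", "LOW"]})

# Encode fuzzy membership labels numerically, since the LSTM and fuzzy
# modules accept only numeric values: LOW -> 0, MEDIUM -> 1, HIGH -> 2.
label_map = {"LOW": 0, "MEDIUM": 1, "HIGH": 2}
encoded = df.replace(label_map)
print(encoded)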
HARDWARE REQUIREMENTS:
System: i3 processor or above.
RAM: 4 GB.
Hard Disk: 40 GB.
SOFTWARE REQUIREMENTS:
GOALS:
The primary goals in the design of the UML are as follows:
Provide users with a ready-to-use, expressive visual modeling language so
that they can develop and exchange meaningful models.
Provide extensibility and specialization mechanisms to extend the
core concepts.
Be independent of particular programming languages and
development processes.
Provide a formal basis for understanding the modeling language.
Encourage the growth of the OO tools market.
Support higher-level development concepts such as collaborations,
frameworks, patterns, and components.
Integrate best practices.
4.3.1 USE CASE DIAGRAM:
[Use case diagram; user actions include MSE comparison and exit.]
4.3.2 CLASS DIAGRAM:
[Class diagram showing the USER and DATASET classes.]
4.3.4 COLLABORATION DIAGRAM:
1) Upload Credit Card Fraud Dataset: this module uploads the dataset to the application.
2) Calculate Fuzzy Membership Functions: this module extracts all membership function
values from the dataset.
3) Run Fuzzy Logic Algorithm: this module trains the fuzzy algorithm on the membership
values and then tests it in terms of Mean Square Error (MSE, the prediction error). The
lower the MSE, the better the algorithm.
4) Run LSTM Algorithm: this module trains the LSTM algorithm on the same membership
values and then calculates its MSE.
5) LSTM Training Graph: this module plots the LSTM training and testing MSE.
6) MSE Comparison Graph: this module plots the MSE comparison graph between the
fuzzy algorithm and the LSTM (a sketch of this comparison follows the list).
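As an illustration, the MSE comparison in module 6 might be computed and plotted as in the sketch below; the prediction arrays are invented stand-ins for the outputs of the trained fuzzy and LSTM models.

import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import mean_squared_error

# Invented ground truth and predictions; in the application these come
# from the two trained models evaluated on the same test split.
y_true = np.array([0, 2, 1, 0, 2])
fuzzy_pred = np.array([0, 1, 1, 1, 2])
lstm_pred = np.array([0, 2, 1, 0, 1])

mse = {"Fuzzy": mean_squared_error(y_true, fuzzy_pred),
       "LSTM": mean_squared_error(y_true, lstm_pred)}

# The lower bar marks the better-performing algorithm.
plt.bar(list(mse.keys()), list(mse.values()))
plt.ylabel("MSE")
plt.title("MSE Comparison: Fuzzy vs LSTM")
plt.show()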
1.1 CLASSIFICATION
1.2 Classification Algorithms:
LSTM networks extend recurrent neural networks (RNNs) and are designed mainly to deal with
situations in which plain RNNs do not work. An RNN is an algorithm that processes the current
input while taking into account the output of previous events (feedback), storing that context in
its internal memory for a brief amount of time (short-term memory). Among its many
applications, the most well-known are in non-Markovian speech control and music composition.
However, RNNs have some drawbacks. The first is their failure to retain information over long
periods of time: sometimes data stored a considerable time ago is needed to determine the
current output, and RNNs are incapable of managing such "long-term dependencies." The
second issue is the lack of fine control over which part of the context should be carried forward
and which part of the past should be forgotten. A further problem is the exploding or vanishing
gradients (explained later) that occur when training an RNN through backpropagation.
The Long Short-Term Memory (LSTM) network was therefore introduced. It was designed so
that the vanishing-gradient problem is almost entirely eliminated and training is unaffected.
LSTMs solve problems with long time lags and also cope with noise, distributed representations,
and continuous values. Unlike a hidden Markov model (HMM), an LSTM does not require a
finite number of states to be fixed in advance. LSTMs expose a wide range of parameters, such
as learning rates and input and output biases, so extensive fine-tuning is not needed. The effort
to update each weight is reduced to O(1) with LSTMs, similar to Back Propagation Through
Time (BPTT), which is a significant advantage.
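To make this concrete, a minimal LSTM classifier in Keras looks like the sketch below; the data shapes and layer sizes are illustrative, not those used in this project.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Toy data: 100 sequences of 10 time steps with 3 features each.
X = np.random.rand(100, 10, 3)
y = np.random.randint(0, 2, size=(100,))

model = Sequential([
    LSTM(32, input_shape=(10, 3)),   # gated memory cells retain long-term context
    Dense(1, activation="sigmoid"),  # binary output, e.g. fraud vs genuine
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=16, verbose=0)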
Advantages of Machine Learning:
Machine learning is a field of computer science and artificial intelligence that deals with
teaching computers to learn from data without being explicitly programmed. Closely related
to data mining, it allows computers to "learn" on their own by analyzing data sets and using
pattern recognition. Machine learning has many benefits, including improved accuracy,
efficiency, and decision-making.
Handling large amounts of data: With the ever-growing volume of data generated
every day, it is increasingly difficult for humans to process and make sense of all this
information. Machine learning can help businesses handle large amounts of data more
efficiently and effectively, and can even use models such as decision trees to act on the
information.
Reducing bias: Machine learning algorithms are not biased toward certain data sets,
unlike human beings, who may have personal biases that can distort their judgment. As a
result, machine learning can help reduce bias in business decisions.
Improving accuracy: Machine learning algorithms can achieve much higher accuracy
than humans when making predictions or classifying labeled data. This improved
accuracy can lead to better business outcomes and increased profits.
Discovering patterns and correlations: Machine learning can help businesses uncover
patterns and correlations in data that they may not have been able to detect otherwise.
These learning systems can lead to better decision-making and a deeper understanding of
the data.
Making predictions about future events: Machine learning algorithms can predict future
events, such as consumer behavior, stock prices, and election outcomes.
1.4 Applications of Machine Learning
Image Recognition
Speech Recognition:
Traffic Prediction:
If we want to visit a new place, we take the help of Google Maps, which shows
us the correct path with the shortest route and predicts the traffic conditions. It
predicts whether traffic is clear, slow-moving, or heavily congested with the help
of two sources:
1. Real-time location of the vehicle from the Google Maps app and sensors
Product Recommendations
1.5 DEEP LEARNING
Advantages of Deep Learning
Creating New Features: One of the main benefits of deep learning over various machine
learning algorithms is its ability to generate new features from a limited series of features
located in the training data set. Therefore, deep learning algorithms can create new tasks to
solve current ones. What does this mean for data scientists working in technological startups?
Since deep learning can create features without human intervention, data scientists can save
much time when working with big data by relying on this technology. It allows them to use more
complex sets of features in comparison with traditional machine learning software.
Advanced Analysis
Due to its improved data processing models, deep learning generates actionable results
when solving data science tasks. While machine learning works only with labelled data,
deep learning supports unsupervised learning techniques that allow the system to become
smarter on its own. The capacity to determine the most important features allows deep
learning to efficiently provide data scientists with concise and reliable analysis results.
APPLICATIONS OF DEEP LEARNING
Automatic Speech Recognition
Large-scale automatic speech recognition is the first and most convincing successful case
of deep learning. LSTM RNNs can learn "Very Deep Learning" tasks that involve multi-second
intervals containing speech events separated by thousands of discrete time steps, where one time
step corresponds to about 10 milliseconds. LSTM with forget gates is competitive with
traditional speech recognizers on certain tasks. The initial success in speech recognition was
based on small-scale recognition tasks based on TIMIT. The data set contains 630 speakers from
eight major dialects of American English, where each speaker reads 10 sentences. Its small size
lets many configurations be tried. More importantly, the TIMIT task concerns phone-sequence
recognition, which, unlike word-sequence recognition, allows weak phone bigram language
models. This lets the strength of the acoustic modelling aspects of speech recognition be more
easily analysed.
Image Recognition
A common evaluation set for image classification is the MNIST database. MNIST is composed
of handwritten digits and includes 60,000 training examples and 10,000 test examples. As with
TIMIT, its small size lets users test multiple configurations. A comprehensive list of results on
this set is available. Deep learning-based image recognition has become "superhuman",
producing more accurate results than human contestants. This first occurred in 2011.
Military
The United States Department of Defense applied deep learning to train robots in new tasks
through observation.
Bioinformatics
An autoencoder ANN was used in bioinformatics to predict gene ontology annotations
and gene–function relationships. In medical informatics, deep learning was used to predict sleep
quality based on data from wearables and to predict health complications from electronic
health record data.
Self-Driving Cars
Deep learning is the force that is bringing autonomous driving to life. A million sets of
data are fed to a system to build a model, to train the machines to learn, and then to test the
results in a safe environment. Data from cameras, sensors, and geo-mapping is helping create
succinct and sophisticated models to navigate through traffic and identify paths, signage,
pedestrian-only routes, and real-time elements like traffic volume and road blockages.
Fraud Detection
Another domain benefiting from deep learning is the banking and financial sector, which
is plagued with the task of fraud detection as money transactions go digital. Autoencoders
in Keras and TensorFlow are being developed to detect credit card frauds, saving billions of
dollars in recovery and insurance costs for financial institutions. Fraud prevention and
detection are based on identifying patterns in customer transactions and credit scores, and
identifying anomalous behaviour and outliers.
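A minimal sketch of this idea in Keras follows, assuming the autoencoder is trained only on genuine transactions so that high reconstruction error flags anomalies; the feature count and threshold are invented for illustration.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Train only on genuine transactions (30 features, values invented).
genuine = np.random.rand(1000, 30)

autoencoder = Sequential([
    Dense(16, activation="relu", input_shape=(30,)),  # encoder compresses the input
    Dense(30, activation="sigmoid"),                  # decoder reconstructs it
])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(genuine, genuine, epochs=10, batch_size=32, verbose=0)

# Transactions the model reconstructs poorly are flagged as suspicious.
new_tx = np.random.rand(5, 30)
errors = np.mean((autoencoder.predict(new_tx) - new_tx) ** 2, axis=1)
suspicious = errors > np.percentile(errors, 95)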
CNN ARCHITECTURES
Convolutional Neural Networks (CNNs, or ConvNets) are a special kind of multi-layer
neural network, designed to recognize visual patterns directly from pixel images with minimal
pre-processing.
VGGNet
VGGNet (VGG-16) consists of 16 weight layers and is very appealing because of its very
uniform architecture. Similar to AlexNet, it uses only 3x3 convolutions, but with lots of filters.
It was trained on 4 GPUs for 2–3 weeks. It is currently a highly preferred choice in the
community for extracting features from images. The weight configuration of VGGNet is publicly
available and has been used in many other applications and challenges as a baseline feature
extractor. VGG stands for Visual Geometry Group. VGGNet is a neural network that performed
very well in the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) in 2014. It scored
first place on the image localization task and second place on the image classification task.
VGG-19:
VGG-19 is a convolutional neural network (CNN) architecture developed by the Visual
Geometry Group (VGG) at the University of Oxford. It is a deep neural network with 19 layers
and was introduced as part of the ImageNet Large Scale Visual Recognition Challenge
(ILSVRC) in 2014. One of the key innovations in VGG-19 is its deep architecture, which allows
for a more expressive representation of image features. VGG-19 is trained on the ImageNet data
set, which consists of millions of labeled images from thousands of categories.
Inception v3 algorithm:
Inception-v3 is a convolutional neural network (CNN) architecture developed by Google for
image recognition and classification tasks. It is an improvement over the original Inception
model. The Inception-v3 architecture is designed to be deeper and more powerful than previous
CNNs. It consists of 48 layers, including convolutional layers, pooling layers, and fully
connected layers. It also includes several unique features, such as the Inception module. This
technique reduces the number of parameters in the model, which helps to reduce overfitting and
improve performance.
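Both VGG-19 and Inception-v3 ship pretrained with Keras, so a minimal sketch of loading them needs only a few lines (the ImageNet weights are downloaded on first use):

from tensorflow.keras.applications import VGG19, InceptionV3

# Pretrained ImageNet weights are fetched automatically on the first call.
vgg = VGG19(weights="imagenet")              # 19-layer VGG network
inception = InceptionV3(weights="imagenet")  # Inception-v3 network
vgg.summary()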
ADVANTAGES OF CNN
Once trained, the predictions are pretty fast.
DISADVANTAGES OF CNN
APPLICATIONS OF CNN
Image Recognition
CNNs are often used in image recognition systems. In 2012, an error rate of 0.23 percent
on the MNIST database was reported. Another paper on using CNNs for image classification
reported that the learning process was "surprisingly fast"; in the same paper, the best published
results as of 2011 were achieved on the MNIST database and the NORB database. Subsequently,
a similar CNN called AlexNet won the ImageNet Large Scale Visual Recognition Challenge
2012. Image recognition, in the context of machine vision, is the ability of software to identify
objects, places, people, writing, and actions in images. Computers can use machine vision
technologies in combination with a camera and artificial intelligence software to achieve image
recognition.
Video Analysis
Compared to image data domains, there is relatively little work on applying CNNs to
video classification. Video is more complex than images since it has another (temporal)
dimension. However, some extensions of CNNs into the video domain have been explored.
One approach is to treat space and time as equivalent dimensions of the input and perform
convolutions in both time and space. Another way is to fuse the features of two convolutional
neural networks, one for the spatial and one for the temporal stream. Long short-term memory
(LSTM) recurrent units are typically incorporated after the CNN to account for inter-frame or
inter-clip dependencies. Unsupervised learning schemes for training spatio-temporal features
have been introduced, based on Convolutional Gated Restricted Boltzmann Machines.
Understanding Climate

Facial Recognition
Facial recognition is broken down by a convolutional neural network into the following major
components:
Identifying unique features.
Comparing all the collected data with already existing data in the database
to match a face with a name.
A similar process is followed for scene labelling as well.
Analysing Documents
Convolutional neural networks can also be used for document analysis. This is not just useful for
handwriting analysis, but also has a major stake in recognizers. For a machine to be able to scan an
individual's writing and then compare that to the wide database it has, it must execute almost a million
commands a minute. It is said that with the use of CNNs and newer models and algorithms, the error rate
has been brought down to a minimum of 0.4% at a character level, though complete testing of this is yet
to be widely seen.
SOFTWARE ENVIRONMENT:
What is Python:
Python is currently the most widely used multi-purpose, high-level programming language.
Programmers have to type relatively little, and the language's indentation requirement keeps
code readable at all times.
The Python language is used by almost all tech-giant companies, like Google, Amazon,
Facebook, Instagram, Dropbox, Uber, etc.
The biggest strength of Python is its huge collection of standard libraries, which can be used
for the following:
Machine Learning
Test frameworks
Multimedia
Advantages of Python :-
1. Extensive Libraries
Python ships with an extensive library containing code for various purposes like regular
expressions, documentation generation, unit testing, web browsers, threading, databases, CGI,
email, image manipulation, and more. So, we don't have to write the complete code for these
manually.
2. Extensible
As we have seen earlier, Python can be extended with other languages. You can write some of
your code in languages like C++ or C. This comes in handy, especially in performance-critical
parts of projects.
3. Embeddable
Complimentary to extensibility, Python is embeddable as well. You can put your Python code in
your source code of a different language, like C++. This lets us add scripting capabilities to our
code in the other language.
4. Improved Productivity
The language's simplicity and extensive libraries render programmers more productive than
languages like Java and C++ do. Also, you need to write less code to get more done.
5. IOT Opportunities
Since Python forms the basis of new platforms like Raspberry Pi, the future looks bright for it in
the Internet of Things. This is a way to connect the language with the real world.
6. Simple
When working with Java, you may have to create a class to print 'Hello World'. But in Python,
just a print statement will do. It is also quite easy to learn, understand, and code. This is why,
when people pick up Python, they have a hard time adjusting to other, more verbose languages
like Java.
7. Readable
Because it is not such a verbose language, reading Python is much like reading English. This is
the reason why it is so easy to learn, understand, and code. It also does not need curly braces to
define blocks, and indentation is mandatory. This further aids the readability of the code.
8. Object-Oriented
This language supports both the procedural and object-oriented programming paradigms. While
functions help us with code reusability, classes and objects let us model the real world. A class
allows the encapsulation of data and functions into one.
9. Free and Open-Source
Like we said earlier, Python is freely available. But not only can you download Python for free,
you can also download its source code, make changes to it, and even distribute it. It comes with
an extensive collection of libraries to help you with your tasks.
10. Portable
When you code your project in a language like C++, you may need to make some changes to it if
you want to run it on another platform. But it isn’t the same with Python. Here, you need to code
only once, and you can run it anywhere. This is called Write Once Run Anywhere (WORA).
However, you need to be careful enough not to include any system-dependent features.
11. Interpreted
Lastly, we will say that it is an interpreted language. Since statements are executed one by
one, debugging is easier than in compiled languages.
1. Less Coding
Almost all of the tasks done in Python require less coding than the same tasks done in other
languages. Python also has awesome standard library support, so you don't have to search for
any third-party libraries to get your job done. This is the reason many people suggest learning
Python to beginners.
2. Affordable
Python is free, so individuals, small companies, or big organizations can leverage the freely
available resources to build applications. Python is popular and widely used, so it gives you
better community support.
The 2019 Github annual survey showed us that Python has overtaken Java in the most popular
programming language category.
Python code can run on any machine whether it is Linux, Mac or Windows. Programmers need
to learn different languages for different jobs but with Python, you can professionally build web
apps, perform data analysis and machine learning, automate things, do web scraping and also
build games and powerful visualizations. It is an all-rounder programming language.
Disadvantages of Python
So far, we’ve seen why Python is a great choice for your project. But if you choose it, you should
be aware of its consequences as well. Let’s now see the downsides of choosing Python over
another language.
1. Speed Limitations
We have seen that Python code is executed line by line. But since Python is interpreted, it often
results in slow execution. This, however, isn’t a problem unless speed is a focal point for the
project. In other words, unless high speed is a requirement, the benefits offered by Python are
enough to distract us from its speed limitations.
2. Weak in Mobile Computing and Browsers
While it serves as an excellent server-side language, Python is rarely seen on the client side.
Besides that, it is rarely ever used to implement smartphone-based applications. One such
application is called Carbonnelle. The reason it is not so famous despite the existence of
Brython is that it isn't that secure.
3. Design Restrictions
As you know, Python is dynamically-typed. This means that you don’t need to declare the type
of variable while writing the code. It uses duck-typing. But wait, what’s that? Well, it just means
that if it looks like a duck, it must be a duck. While this is easy on the programmers during
coding, it can raise run-time errors.
5. Simple
No, we’re not kidding. Python’s simplicity can indeed be a problem. Take my example. I don’t
do Java, I’m more of a Python person. To me, its syntax is so simple that the verbosity of Java
code seems unnecessary.
This was all about the Advantages and Disadvantages of Python Programming Language.
History of Python :-
What do the alphabet and the programming language Python have in common? Right, both start
with ABC. If we are talking about ABC in the Python context, it's clear that the programming
language ABC is meant. ABC is a general-purpose programming language and programming
environment developed at the CWI (Centrum Wiskunde & Informatica) in Amsterdam, the
Netherlands. The greatest achievement of ABC was to influence the design of Python. Python
was conceptualized in the late 1980s, when Guido van Rossum was working at the CWI on a
project called Amoeba, a distributed operating system. In an interview with Bill Venners, Guido
van Rossum said: "In the early 1980s, I worked as an implementer on a team building a language
called ABC at Centrum voor Wiskunde en Informatica (CWI). I don't know how well people
know ABC's influence on Python. I try to mention ABC's influence because I'm indebted to
everything I learned during that project and to the people who worked on it." Later in the same
interview, Guido van Rossum continued: "I remembered all my experience and some of my
frustration with ABC. I decided to try to design a simple scripting language that possessed some
of ABC's better properties, but without its problems. So I started typing. I created a simple
virtual machine, a simple parser, and a simple runtime. I made my own version of the various
ABC parts that I liked. I created a basic syntax, used indentation for statement grouping instead
of curly braces or begin-end blocks, and developed a small number of powerful data types: a
hash table (or dictionary, as we call it), a list, strings, and numbers."
Before we take a look at the details of various machine learning methods, let's start by looking at
what machine learning is, and what it isn't. Machine learning is often categorized as a subfield of
artificial intelligence, but I find that categorization can often be misleading at first brush. The
study of machine learning certainly arose from research in this context, but in the data science
application of machine learning methods, it's more helpful to think of machine learning as a
means of building models of data.
At the most fundamental level, machine learning can be categorized into two main types:
supervised learning and unsupervised learning.
Supervised learning involves somehow modeling the relationship between measured features of
data and some label associated with the data; once this model is determined, it can be used to
apply labels to new, unknown data. This is further subdivided into classification tasks
and regression tasks: in classification, the labels are discrete categories, while in regression, the
labels are continuous quantities. We will see examples of both types of supervised learning in the
following section.
Unsupervised learning involves modeling the features of a dataset without reference to any label,
and is often described as "letting the dataset speak for itself." These models include tasks such
as clustering and dimensionality reduction. Clustering algorithms identify distinct groups of data,
while dimensionality reduction algorithms search for more succinct representations of the data.
We will see examples of both types of unsupervised learning in the following section.
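As a brief illustration of both types (using scikit-learn's bundled iris data, not anything from this project), the sketch below fits one supervised and one unsupervised model to the same features:

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X, y = load_iris(return_X_y=True)

# Supervised: learn the mapping from measured features to known labels.
clf = LogisticRegression(max_iter=200).fit(X, y)
print(clf.predict(X[:3]))

# Unsupervised: find structure in the features alone, ignoring the labels.
km = KMeans(n_clusters=3, n_init=10).fit(X)
print(km.labels_[:3])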
Human beings are, at this moment, the most intelligent and advanced species on earth because
they can think, evaluate, and solve complex problems. AI, on the other hand, is still in its initial
stage and hasn't surpassed human intelligence in many aspects. The question, then, is why we
need to make machines learn. The most suitable reason is "to make decisions, based on data,
with efficiency and scale".
Lately, organizations have been investing heavily in newer technologies like Artificial
Intelligence, Machine Learning, and Deep Learning to extract key information from data in order
to perform several real-world tasks and solve problems. We can call these data-driven decisions
taken by machines, particularly to automate processes. These data-driven decisions can be used,
instead of programming logic, in problems that cannot be programmed inherently. The fact is
that we can't do without human intelligence, but the other aspect is that we all need to solve
real-world problems efficiently at a huge scale. That is why the need for machine learning arises.
While Machine Learning is rapidly evolving, making significant strides with cybersecurity and
autonomous cars, this segment of AI as a whole still has a long way to go. The reason is that ML
has not yet been able to overcome a number of challenges. The challenges that ML currently
faces are:
Quality of data − Having good-quality data for ML algorithms is one of the biggest challenges.
Use of low-quality data leads to problems related to data preprocessing and feature extraction.
No clear objective for formulating business problems − Having no clear objective and well-
defined goal for business problems is another key challenge for ML because this technology is
not that mature yet.
Curse of dimensionality − Another challenge ML model faces is too many features of data
points. This can be a real hindrance.
Machine Learning is the most rapidly growing technology, and according to researchers, we are
in the golden year of AI and ML. It is used to solve many real-world complex problems which
cannot be solved with a traditional approach. Following are some real-world applications of ML:
Emotion analysis
Sentiment analysis
Speech synthesis
Speech recognition
Customer segmentation
Object recognition
Fraud detection
Fraud prevention
And that was the beginning of Machine Learning! In modern times, Machine Learning is one of
the most popular (if not the most!) career choices. According to Indeed, Machine Learning
Engineer Is The Best Job of 2019 with a 344% growth and an average base salary
of $146,085 per year.
But there is still a lot of doubt about what exactly Machine Learning is and how to start learning
it. So this article deals with the basics of Machine Learning and also the path you can follow to
eventually become a full-fledged Machine Learning Engineer. Now let's get started!!!
This is a rough roadmap you can follow on your way to becoming an insanely talented Machine
Learning Engineer. Of course, you can always modify the steps according to your needs to reach
your desired end-goal!
In case you are a genius, you could start ML directly but normally, there are some prerequisites
that you need to know which include Linear Algebra, Multivariate Calculus, Statistics, and
Python. And if you don’t know these, never fear! You don’t need a Ph.D. degree in these topics
to get started but you do need a basic understanding.
Both Linear Algebra and Multivariate Calculus are important in Machine Learning. However,
the extent to which you need them depends on your role as a data scientist. If you are more
focused on application heavy machine learning, then you will not be that heavily focused on
maths as there are many common libraries available. But if you want to focus on R&D in
Machine Learning, then mastery of Linear Algebra and Multivariate Calculus is very important
as you will have to implement many ML algorithms from scratch.
Data plays a huge role in Machine Learning. In fact, around 80% of your time as an ML expert
will be spent collecting and cleaning data. And statistics is a field that handles the collection,
analysis, and presentation of data. So it is no surprise that you need to learn it!!!
Some of the key concepts in statistics that are important are Statistical Significance, Probability
Distributions, Hypothesis Testing, Regression, etc. Bayesian Thinking is also a very important
part of ML, dealing with concepts like Conditional Probability, Priors and Posteriors, Maximum
Likelihood, etc.
So if you want to learn ML, it’s best if you learn Python! You can do that using various online
resources and courses such as Fork Python available Free on GeeksforGeeks.
Now that you are done with the prerequisites, you can move on to actually learning ML (Which
is the fun part!!!) It’s best to start with the basics and then move on to the more complicated
stuff. Some of the basic concepts in ML are:
Target (Label) – A target variable or label is the value to be predicted by our model. For
the fruit example discussed in the feature section, the label with each set of input would
be the name of the fruit like apple, orange, banana, etc.
Training – The idea is to give a set of inputs (features) and their expected outputs (labels),
so after training, we will have a model (hypothesis) that will then map new data to one of
the categories it was trained on.
Prediction – Once our model is ready, it can be fed a set of inputs to which it will provide
a predicted output (label); see the sketch after this list.
Supervised Learning – This involves learning from a training dataset with labeled data
using classification and regression models. This learning process continues until the
required level of performance is achieved.
Unsupervised Learning – This involves using unlabelled data and then finding the
underlying structure in the data in order to learn more and more about the data itself using
factor and cluster analysis models.
Reinforcement Learning – This involves learning optimal actions through trial and error.
So the next action is decided by learning behaviors that are based on the current state and
that will maximize the reward in the future.
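A minimal sketch tying these terms together, using the fruit example with invented numeric encodings:

from sklearn.tree import DecisionTreeClassifier

# Features: [weight_g, color_code]; target (label): fruit name.
X_train = [[150, 0], [170, 0], [120, 1], [110, 1]]
y_train = ["apple", "apple", "banana", "banana"]

# Training: fit a model (hypothesis) on the features and their labels.
model = DecisionTreeClassifier().fit(X_train, y_train)

# Prediction: feed new inputs and receive a predicted label.
print(model.predict([[160, 0]]))  # -> ['apple']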
Machine Learning can review large volumes of data and discover specific trends and patterns
that would not be apparent to humans. For instance, for an e-commerce website like Amazon, it
serves to understand the browsing behaviors and purchase histories of its users to help cater to
the right products, deals, and reminders relevant to them. It uses the results to reveal relevant
advertisements to them.
With ML, you don't need to babysit your project every step of the way. Since it means giving
machines the ability to learn, it lets them make predictions and also improve the algorithms on
their own. A common example of this is anti-virus software; it learns to filter new threats as
they are recognized. ML is also good at recognizing spam.
3. Continuous Improvement
As ML algorithms gain experience, they keep improving in accuracy and efficiency. This lets
them make better decisions. Say you need to make a weather forecast model. As the amount of
data you have keeps growing, your algorithms learn to make more accurate predictions faster.
Machine Learning algorithms are good at handling data that are multi-dimensional and multi-
variety, and they can do this in dynamic or uncertain environments.
5. Wide Applications
You could be an e-tailer or a healthcare provider and make ML work for you. Where it does
apply, it holds the capability to help deliver a much more personal experience to customers while
also targeting the right customers.
Disadvantages of Machine Learning :-
1. Data Acquisition
Machine Learning requires massive data sets to train on, and these should be inclusive/unbiased,
and of good quality. There can also be times where they must wait for new data to be generated.
2. Time and Resources
ML needs enough time to let the algorithms learn and develop enough to fulfill their purpose
with a considerable amount of accuracy and relevancy. It also needs massive resources to
function. This can mean additional requirements of computing power for you.
3. Interpretation of Results
Another major challenge is the ability to accurately interpret results generated by the algorithms.
You must also carefully choose the algorithms for your purpose.
4. High error-susceptibility
Machine Learning is autonomous but highly susceptible to errors. Suppose you train an
algorithm with data sets small enough to not be inclusive. You end up with biased predictions
coming from a biased training set. This leads to irrelevant advertisements being displayed to
customers. In the case of ML, such blunders can set off a chain of errors that can go undetected
for long periods of time. And when they do get noticed, it takes quite some time to recognize the
source of the issue, and even longer to correct it.
Guido van Rossum published the first version of Python code (version 0.9.0) at alt.sources in
February 1991. This release already included exception handling, functions, and the core data
types of list, dict, str, and others. It was also object-oriented and had a module system.
Python version 1.0 was released in January 1994. The major new features included in this release
were the functional programming tools lambda, map, filter and reduce, which Guido van Rossum
never liked. Six and a half years later, in October 2000, Python 2.0 was introduced. This release
included list comprehensions, a full garbage collector, and support for Unicode. Python
flourished for another 8 years in the versions 2.x before the next major release, Python 3.0 (also
known as "Python 3000" and "Py3K"), appeared. Python 3 is not backwards compatible with
Python 2.x. The emphasis in Python 3 was on the removal of duplicate programming constructs
and modules, thus fulfilling or coming close to fulfilling the 13th law of the Zen of Python:
"There should be one -- and preferably only one -- obvious way to do it." Some changes in
Python 3.0:
Print is now a function
The rules for ordering comparisons have been simplified. E.g. a heterogeneous list cannot
be sorted, because all the elements of a list must be comparable to each other.
There is only one integer type left, i.e. int; long is now int as well.
The division of two integers returns a float instead of an integer; "//" can be used to get
the "old" behaviour, as the snippet below shows.
Purpose :-
Python
Python features a dynamic type system and automatic memory management. It supports multiple
programming paradigms, including object-oriented, imperative, functional and procedural, and
has a large and comprehensive standard library.
Python is Interpreted − Python is processed at runtime by the interpreter. You do not need
to compile your program before executing it. This is similar to PERL and PHP.
Python is Interactive − you can actually sit at a Python prompt and interact with the
interpreter directly to write your programs.
Python also acknowledges that speed of development is important. Readable and terse code is
part of this, and so is access to powerful constructs that avoid tedious repetition of code.
Maintainability also ties into this. It may be an all-but-useless metric, but it does say something
about how much code you have to scan, read and/or understand to troubleshoot problems or
tweak behaviors. This speed of development, the ease with which a programmer of other
languages can pick up basic Python skills, and the huge standard library are key to another area
where Python excels: all its tools have been quick to implement, have saved a lot of time, and
several of them have later been patched and updated by people with no Python background -
without breaking.
Modules Used in Project :-
Tensorflow
TensorFlow is a free and open-source software library for dataflow and differentiable
programming across a range of tasks. It is a symbolic math library, and is also used for machine
learning applications such as neural networks. It is used for both research and production
at Google.
TensorFlow was developed by the Google Brain team for internal Google use. It was released
under the Apache 2.0 open-source license on November 9, 2015.
Numpy
It is the fundamental package for scientific computing with Python. It contains various features,
the most important ones being: a powerful N-dimensional array object; sophisticated
(broadcasting) functions; tools for integrating C/C++ and Fortran code; and useful linear
algebra, Fourier transform, and random number capabilities.
Besides its obvious scientific uses, NumPy can also be used as an efficient multi-dimensional
container of generic data. Arbitrary data types can be defined using NumPy, which allows
NumPy to seamlessly and speedily integrate with a wide variety of databases.
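For instance, a structured dtype (fields invented for illustration) lets a NumPy array act as such a container of generic data:

import numpy as np

# A structured dtype stores heterogeneous records in one array.
records = np.array([("alice", 25, 55.0), ("bob", 30, 72.5)],
                   dtype=[("name", "U10"), ("age", "i4"), ("weight", "f4")])
print(records["age"].mean())  # vectorised access to a single field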
Pandas
Pandas is an open-source library providing high-performance, easy-to-use data structures and
data analysis tools for the Python programming language.
Matplotlib
Matplotlib is a Python 2D plotting library which produces publication quality figures in a variety
of hardcopy formats and interactive environments across platforms. Matplotlib can be used in
Python scripts, the Python and IPython shells, the Jupyter Notebook, web application servers,
and four graphical user interface toolkits. Matplotlib tries to make easy things easy and hard
things possible. You can generate plots, histograms, power spectra, bar charts, error charts,
scatter plots, etc., with just a few lines of code. For examples, see the sample plots and thumbnail
gallery.
For simple plotting the pyplot module provides a MATLAB-like interface, particularly when
combined with IPython. For the power user, you have full control of line styles, font properties,
axes properties, etc, via an object oriented interface or via a set of functions familiar to
MATLAB users.
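For example, a complete labelled plot takes only a few lines of pyplot:

import matplotlib.pyplot as plt

xs = [1, 2, 3, 4]
ys = [1, 4, 9, 16]
plt.plot(xs, ys, "o-", label="y = x^2")  # markers joined by a line
plt.xlabel("x")
plt.ylabel("y")
plt.legend()
plt.show()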
Scikit-learn
Scikit-learn provides a range of supervised and unsupervised learning algorithms via a consistent
interface in Python. It is licensed under a permissive simplified BSD license and is distributed
under many Linux distributions, encouraging academic and commercial use.
The object-oriented approach and language construct provided by Python enables programmers
to write both clear and logical code for projects. This software does not come pre-packaged with
Windows.
There have been several updates to Python over the years. The question is how to install Python.
It might be confusing for a beginner who is willing to start learning Python, but this tutorial will
solve your query. The latest version of Python is 3.7.4; in other words, it is Python 3.
Note: The python version 3.7.4 cannot be used on Windows XP or earlier devices.
Before you start with the installation process of Python, you first need to know about your
system requirements. You must download the Python version based on your system type, i.e.,
operating system and processor. My system type is a Windows 64-bit operating system, so the
steps below install Python version 3.7.4 (Python 3) on a Windows device. The steps on how to
install Python on Windows 10, 8, and 7 are divided into 4 parts to help you understand better.
Step 1: Go to the official site to download and install python using Google Chrome or any other
web browser. OR Click on the following link: https://www.python.org
Now, check for the latest and the correct version for your operating system.
Step 3: You can either select the yellow 'Download Python 3.7.4' button for Windows, or you
can scroll further down and click on the download for the respective version. Here, we are
downloading the most recent Python version for Windows, 3.7.4.
Step 4: Scroll down the page until you find the Files option.
Step 5: Here you see a different version of python along with the operating system.
• To download Windows 32-bit python, you can select any one from the three options: Windows
x86 embeddable zip file, Windows x86 executable installer or Windows x86 web-based
installer.
• To download Windows 64-bit python, you can select any one from the three options: Windows
x86-64 embeddable zip file, Windows x86-64 executable installer or Windows x86-64 web-
based installer.
Here we will install Windows x86-64 web-based installer. Here your first part regarding which
version of python is to be downloaded is completed. Now we move ahead with the second part in
installing python i.e. Installation
Note: To know the changes or updates that are made in the version you can click on the Release
Note Option.
Installation of Python
Step 1: Go to Download and Open the downloaded python version to carry out the installation
process.
Step 2: Before you click on Install Now, Make sure to put a tick on Add Python 3.7 to PATH.
Step 3: Click on Install Now. After the installation is successful, click on Close.
With these above three steps on python installation, you have successfully and correctly installed
Python. Now is the time to verify the installation.
Step 4: Let us test whether Python is correctly installed. Type python -V and press Enter.
Step 5: You will get the answer as 3.7.4
Note: If you have any of the earlier versions of Python already installed. You must first uninstall
the earlier version and then install the new one.
Step 3: Click on IDLE (Python 3.7 64-bit) and launch the program
Step 4: To go ahead with working in IDLE you must first save the file. Click on File > Click on
Save
Step 5: Name the file, and the save-as type should be Python files. Click on SAVE. Here I have
named the file Hey World.
5.1.1 SYSTEM TEST
The purpose of testing is to discover errors. Testing is the process of trying to discover every
conceivable fault or weakness in a work product. It provides a way to check the functionality of
components, subassemblies, assemblies, and/or a finished product. It is the process of exercising
software with the intent of ensuring that the software system meets its requirements and user
expectations and does not fail in an unacceptable manner. There are various types of test, and
each test type addresses a specific testing requirement.
5.1.2 Unit testing
Unit testing involves the design of test cases that validate that the internal program logic is
functioning properly and that program inputs produce valid outputs. All decision branches and
internal code flow should be validated. It is the testing of individual software units of the
application; it is done after the completion of an individual unit and before integration. This is
structural testing that relies on knowledge of the unit's construction and is invasive. Unit tests
perform basic tests at the component level and test a specific business process, application,
and/or system configuration. Unit tests ensure that each unique path of a business process
performs accurately to the documented specifications and contains clearly defined inputs and
expected results.
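A minimal unit test written with Python's built-in unittest module might look like the following; the classify function is a made-up stand-in for a unit under test, not the project's actual code.

import unittest

def classify(score):
    """Toy stand-in for a transaction-scoring function under test."""
    return "fraud" if score > 0.7 else "genuine"

class TestClassify(unittest.TestCase):
    def test_fraud_branch(self):
        self.assertEqual(classify(0.9), "fraud")

    def test_genuine_branch(self):
        self.assertEqual(classify(0.1), "genuine")

if __name__ == "__main__":
    unittest.main()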
5.1.3 Integration testing
Integration tests are designed to test integrated software components to determine if they actually
run as one program. Testing is event driven and is more concerned with the basic outcome of
screens or fields. Integration tests demonstrate that although the components were individually
satisfactory, as shown by successful unit testing, the combination of components is correct and
consistent. Integration testing is specifically aimed at exposing the problems that arise from the
combination of components.
5.1.4 Functional test
Functional tests provide systematic demonstrations that functions tested are
available as specified by the business and technical requirements, system
documentation, and user manuals.
Functional testing is centered on the following items:
Unit Testing
Unit testing is usually conducted as part of a combined code and unit test phase
of the software lifecycle, although it is not uncommon for coding and unit testing to be
conducted as two distinct phases.
Field testing will be performed manually and functional tests will be written in
detail.
Test objectives
All field entries must work properly.
Pages must be activated from the identified link.
The entry screen, messages and responses must not be delayed.
Features to be tested
Verify that the entries are of the correct format
No duplicate entries should be allowed
All links should take the user to the correct page.
The task of the integration test is to check that components or software applications, e.g.
components in a software system or – one step up – software applications at the company level –
interact without error.
Test Results: All the test cases mentioned above passed successfully. No defects were encountered.
Acceptance Testing
User Acceptance Testing is a critical phase of any project and requires significant participation
by the end user. It also ensures that the system meets the functional requirements.
Test Results: All the test cases mentioned above passed successfully. No defects were encountered.
CHAPTER-6:
RESULTS SCREENSHOTS:
In the above screen, click on the 'Upload Credit Card Fraud Dataset' button to upload the
dataset and get the output below.
In the above screen, select and upload the fraud dataset, then click on the 'Open' button to load
the dataset and get the output below.
In the above screen, the dataset is loaded; from this dataset we need to calculate the fuzzy
membership functions, so click on the 'Calculate Fuzzy Membership Functions' button to
calculate the fuzzy values and get the output below.
Note: in this project we have used 0 for LOW, 1 for MEDIUM, and 2 for HIGH.
In the above screen, we have extracted the fuzzy values from the dataset; in the graph, the x-axis
represents the type of transaction and the y-axis represents the number of records available in
the dataset for that transaction type. Now close the graph and click on the 'Run Fuzzy Logic
Algorithm' button to train the fuzzy logic on the membership values and get the output below.
In the above screen, fuzzy training has completed and we obtained an MSE of 0.59 (the
algorithms are trained on randomly split train and test data, so the MSE may vary between runs).
Now click on the 'Run LSTM Algorithm' button to train the LSTM and get the output below.
In the above screen, the LSTM achieved an MSE of 0.0032, which is lower than that of the fuzzy
algorithm. Now click on the 'LSTM Training Graph' button to get the graph below.
In the above graph, the x-axis represents the training epoch and the y-axis represents the MSE;
we can see that the MSE decreases with each epoch, and a decreasing MSE is considered best
for any model. Now close the graph and click on the 'MSE Comparison Graph' button to get the
graph below.
In the above graph, the x-axis represents the algorithm names and the y-axis represents the MSE
values; the LSTM's MSE is much lower than the fuzzy algorithm's, so the LSTM performs better.
CHAPTER-7:
CONCLUSION:
In this research, we have presented a novel scheme for CCFD by combining a rule-based fuzzy
inference system and a learning component that uses a back-propagation neural network. We
have tested the proposed system by carrying out experiments using stochastic models. Based on
the results obtained, we infer that incorporating a neural network along with fuzzy inferencing is
appropriate for addressing this sort of real-world issue.
References:
1. Inscoe, S.W.: Global Consumers: Losing Confidence in the Battle against Fraud. ACI
Universal Payments, June 2014, www.aciworldwide.com.
4. Ghosh, S., Reilly, D.L.: Credit card fraud detection with a neural-network. In: Proceedings of
the Annual International Conference on System Science, pp. 621–630 (1994).
5. Aleskerov, E., Freisleben, B., Rao, B.: CARDWATCH: a neural-network based database
mining system for credit card fraud detection. In: Proceedings of the Computational Intelligence
for Financial Engineering, pp. 220–226 (1997).
6. Dorronsoro, J.R., Ginel, F., Sanchez, C., Cruz, C.S.: Neural fraud detection in credit card
operations. IEEE Transactions on Neural Networks, pp. 827–834 (1997).
7. Liu, P., Li, L.: A Game-Theoretic Approach for Attack Prediction. Technical Report, PSU-S2-
2002-01, Penn State University (2002).
8. Vatsa, V., Sural, S., Majumdar, A.K.: A game-theoretic approach to credit card fraud
detection. In: Proceedings of the International Conference on Information Systems Security,
Lecture Notes in Computer Science, vol. 3803, pp. 263–276 (2005).
9. Quah, J.T.S., Sriganesh, M.: Real-time credit card fraud detection using computational
intelligence. Expert Systems with Applications, 35, pp. 1721–1732 (2008).
10. Panigrahi, S., Kundu, A., Sural, S., Majumdar, A.K.: Credit card fraud detection: a fusion
approach using Dempster–Shafer theory and Bayesian learning.