Vignesh
DEPARTMENT OF
COMPUTER SCIENCE & ENGINEERING
SREE CHAITANYA COLLEGE OF ENGINEERING
(Affiliated to JNTUH, HYDERABAD)
THIMMAPUR, KARIMNAGAR, TELANGANA-505
APRIL - 2025
CERTIFICATE
This is to certify that the research project report entitled “DESIGNING
SECURE AND EFFICIENT BIOMETRIC CLOUD SERVICE ACCESS”, submitted in
partial fulfillment of the requirement for the award of the degree of
Bachelor of Technology in Computer Science & Engineering of Jawaharlal
Nehru Technological University, Hyderabad, is a record of work carried
out under our supervision. The result embodied in this report has not
been submitted to any other university or institute for the award of
any degree.
External Examiner
DECLARATION
We hereby declare that the work which is being presented in this report entitled,
“Designing Secure and Efficient Biometric Cloud Service
Access”,
submitted towards the partial fulfillment of the requirements for the
award of the degree of Bachelor of Technology in Computer
Science & Engineering, Sree Chaitanya College of Engineering,
Karimnagar, is an authentic record of our own work carried out under
the supervision of Dr. [Link], Associate Professor,
Department of CSE, Sree Chaitanya College of Engineering,
Karimnagar.
DATE :
PLACE :
ACKNOWLEDGEMENT
We would like to express our sincere gratitude to our Project Guide, Dr. M.
VARUN, Assistant Professor, whose knowledge and guidance have motivated us
to achieve goals we never thought possible. The time we have spent working
under his supervision has truly been a pleasure.
We also thank Dr. KHAJA ZIAUDDIN, HOD & Associate Professor of the CSE
Department for providing seamless support and knowledge for the entire project
work and also for providing right suggestions at every phase of the development of
the project. He has consistently been a source of motivation, encouragement, and
inspiration.
TABLE OF CONTENTS
1. Certificate 2
2. Declaration 3
3. Acknowledgement 4
4. Abstract 7
5. Table of Contents 9-10
6. List of Figures 12
7. List of Tables 13
8. Abbreviations 13
9. List of Symbols 14
Chapter 1: INTRODUCTION 15-19
1.1: OVERVIEW 16
1.2: MOTIVATION 16
1.7: OBJECTIVE 19
5.1.2: MODULE DESCRIPTION 24
5.1.3.B: 26
Chapter 4: IMPLEMENTATION 30
4.1: MODULE DESCRIPTION 30
Chapter 5: RESULTS 31-42
Chapter 6: TESTING 43-51
Chapter 7: CONCLUSION 52
LIST OF FIGURES
INTRODUCTION
1.1 OVERVIEW
In this project, we seek to design a secure and efficient authentication
protocol. Specifically, we will first provide an alternative to conventional
password-based authentication mechanisms. Then, we demonstrate
how one can establish secure communication between the communicating
parties involved in the authentication protocol, without having any
secret pre-loaded (i.e., shared) information.
1.2 MOTIVATION
With the rapid growth of cloud services, securing user access has
become a major challenge. Traditional authentication methods still rely
on password storage and shared keys, which are vulnerable to theft and
misuse. Recent studies have exposed serious flaws in many well-known
authentication protocols, highlighting the urgent need for a better
solution.
Motivated by these challenges, this project seeks to design a secure
authentication system that protects user credentials and reduces key
management overhead, offering a safer and more efficient way to
access cloud services.
1.3 EXISTING SYSTEM
❖ Jiang et al. designed a password-based user authentication
scheme for wireless sensor networks (WSNs). This is a two-factor
authentication scheme as it relies on both a smart card and some
password. During the user registration process, an authorized
user registers or re-registers with the trusted gateway node
(GWN).
❖ Althobaiti et al. proposed a biometric-based user authentication
mechanism for WSNs. However, their scheme is insecure against
impersonation attacks and man-in-the-middle attacks [23]. Das
[23] then proposed a new biometric-based user authentication
approach.
❖ Xue et al. also designed a temporal-credential-based mutual
authenticated key agreement mechanism for WSNs. In their
scheme, the remote authorized users are permitted to access
authorized sensor nodes in order to obtain information and also
to send some important commands to the sensor nodes in WSN.
In this scheme, the GWN issues temporal credentials to each user
and sensor node deployed in WSN with the help of the
password-based authentication mechanism.
❖ Later, Li et al. demonstrated that Xue et al.’s scheme fails to
resist stolen-verifier, offline password guessing, insider, many
logged-in users, and smart card lost attacks. He et al. also
demonstrated that Xue et al.’s scheme is insecure against user
impersonation, off-line password guessing, modification and
sensor node impersonation attacks.
❖ Dhillon and Kalra designed a biometric based user authenticated
key agreement mechanism for secure access to services
provided by Internet of Things (IoT) devices. Though this scheme
uses lightweight operations, it does not protect against DoS
attacks as it uses the perceptual hashing (bio hashing) operation
instead of fuzzy extractor. This is primarily because the bio
hashing technique hardly creates a unique value BH(BIOi) from
the biometric data BIO of a legitimate user Ui at different input
times though it may reduce output error, where BH is the
biohashing function.
❖ Kaul and Awasthi designed an authenticated key agreement
scheme, but it was later revealed to be insecure against user
impersonation and off-line password guessing attacks. In
addition, the scheme of Kaul and Awasthi does not preserve user
anonymity. Therefore, Kang et al. proposed an enhanced
biometric-based user authentication scheme. However, this
scheme is insecure against DoS attacks and also impersonation
attacks, where a privileged-insider attacker can easily mount such
an attack.
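The bio-hashing limitation noted above (the same finger rarely yields an identical raw reading, so hashing readings directly denies legitimate users) can be illustrated with a toy Python sketch. The feature values are made up, and the coarse quantization is only a stand-in for a real fuzzy extractor built on error-correcting codes:

```python
import hashlib

def plain_hash(reading):
    # Hashing raw biometric samples: any sensor noise changes the digest entirely.
    return hashlib.sha256(bytes(reading)).hexdigest()

def quantized_hash(reading, step=16):
    # A crude error-tolerant front end: quantize each sample into coarse
    # bins before hashing, so small noise maps to the same digest.
    # (Real schemes use fuzzy extractors with error correction instead.)
    bins = bytes(min(v // step, 255) for v in reading)
    return hashlib.sha256(bins).hexdigest()

enrolled = [200, 70, 130, 90]   # mock fingerprint feature vector
noisy    = [201, 69, 131, 90]   # same finger, slight sensor noise

assert plain_hash(enrolled) != plain_hash(noisy)          # exact hash breaks
assert quantized_hash(enrolled) == quantized_hash(noisy)  # coarse bins match
```

Note that simple binning itself still fails at bin boundaries, which is exactly why practical schemes prefer fuzzy extractors over bio-hashing-style quantization.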
1.4 DISADVANTAGES OF EXISTING SYSTEM
o In the existing work, the system is less effective due to the
absence of user fingerprint image authentication.
o The system is less secure due to the absence of a Message
Authentication Code.
➢ A message authentication mechanism, as an alternative to the
existing message authentication protocols (i.e., Message
Authentication Code (MAC)), is introduced.
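As an illustration of the MAC idea referred to above, the sketch below uses HMAC-SHA-256 from Python's standard library. The session key and message here are placeholders; in the proposed scheme the key would come from the biometric key agreement:

```python
import hmac
import hashlib

# Illustrative key and message; not values from the actual protocol.
session_key = b"derived-session-key"
message = b"UPLOAD file=report.pdf ts=1700000000"

# Sender computes an authentication tag over the message.
tag = hmac.new(session_key, message, hashlib.sha256).hexdigest()

# Receiver recomputes the tag and compares in constant time.
expected = hmac.new(session_key, message, hashlib.sha256).hexdigest()
assert hmac.compare_digest(tag, expected)

# Any tampering with the message invalidates the tag.
forged = hmac.new(session_key, b"UPLOAD file=evil.exe", hashlib.sha256).hexdigest()
assert not hmac.compare_digest(tag, forged)
```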
1.7 OBJECTIVE
The objective of this project is to develop a secure and efficient
authentication protocol for accessing cloud services.
It aims to replace traditional password-based systems with a more
secure alternative, reducing the risk of credential theft.
The project focuses on building a protocol that does not rely on storing
user passwords or sharing secret keys in advance.
It also seeks to create a secure communication channel between users
and servers without assuming full trust in any remote server.
Overall, the goal is to enhance security, minimize authentication
overhead, and make cloud access safer and faster for users.
CHAPTER 2
Despite these efforts, designing a secure and efficient
authentication protocol that avoids credential storage and pre-
shared key dependencies remains an open challenge. Many protocols
also introduce high computational overhead due to multiple
encryption and decryption operations, making them impractical for real-
time cloud applications. Motivated by these shortcomings, the current
project aims to propose a novel authentication mechanism that
does not rely on stored passwords or pre-shared secret keys. Instead, it
focuses on dynamically establishing trust between users and servers,
ensuring secure communication even in partially trusted environments.
This approach seeks to minimize risks while enhancing both the security
and efficiency of cloud service access.
CHAPTER 3
CHAPTER 4
Software Requirements:
➢ Operating System - Windows 7
➢ Coding Language - Python
➢ Front End - Tkinter
➢ Back End - MySQL
➢ Web Server - Flask
CHAPTER 5
5.0 DESIGN AND IMPLEMENTATION
5.1 ARCHITECTURE OF THE PROPOSED SYSTEM
5.1.1 ARCHITECTURE
Data Owner:
In this module, the data owner initially has to register with the
cloud server and get authorized. After authorization from the cloud,
the data owner can encrypt and add files to the cloud server, and
thereafter can View All Uploaded Files and View All Transactions.
Remote Server:
The remote server manages a cloud to provide data storage
service. Data owners encrypt their data files and store them in the
cloud for sharing with cloud end users, and it performs the following
operations: View All Owners and Authorize, View All Users and
Authorize, View All Cloud Files, View All Transactions, View All
Attackers, View File Score Results, View Time Delay Results, and
View Throughput Results.
Authenticate Server:
The CA generates the content key and the secret key requested by
the end user, and can also View All Attackers.
Client:
The user has to register and log in to access the files in the cloud.
The user is authorized by the cloud to verify the registration. The
user can then View All Files and Download them.
5.1.3 DATA FLOW DIAGRAM
LEVEL-0
(Level-0 DFD: Data owner → View All Uploaded Files → Client)
Fig 5.1.3.C: FLOW CHART
Fig 5.1.3.D: FLOW CHART
• Performance Monitoring: Tracks system performance and detects
attacks.
3. Authenticate Server Module
• Key Generation: Generates and distributes encryption keys based
on biometric authentication.
• Authentication: Verifies biometric data to grant access.
• Security Monitoring: Detects and alerts on unauthorized access
attempts.
4. Client Module
• Registration: Client registers biometric data for authentication.
• Login: Client logs in using biometric data.
• File Access: Views/downloads files after successful authentication.
5.2 ALGORITHMS
Step 1: User Registration
1. Capture Biometric Data (e.g., Fingerprint or Facial
Recognition).
2. Store Biometric Data securely in a local database or cloud
storage (here, it will be a mock database).
3. Create a User Account (username, password, and biometric
data).
4. Encrypt User Data before storing in the cloud for security.
Step 2: User Login & Authentication
1. Prompt User for Login (Username and Biometric Scan).
2. Verify Username against the stored records.
3. Authenticate Biometric Data by comparing the entered
biometric scan with stored data.
4. If Authentication Succeeds, grant access to the cloud system
(view files, upload files).
5. If Authentication Fails, deny access and prompt for re-entry.
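Steps 1 and 2 above can be sketched against the mock database mentioned in Step 1. All function names here are hypothetical, PBKDF2 merely stands in for whatever template protection the real scheme uses, and exact byte-for-byte matching assumes the front end has already produced a stable template (e.g., via a fuzzy extractor):

```python
import hashlib
import secrets

users = {}  # mock database: username -> (salt, protected template)

def register(username, biometric_template: bytes):
    # Step 1: protect the captured template with a salted, slow hash
    # before storing it in the mock database.
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", biometric_template, salt, 100_000)
    users[username] = (salt, digest)

def login(username, biometric_scan: bytes) -> bool:
    # Step 2: recompute under the stored salt and compare in constant time.
    if username not in users:
        return False
    salt, digest = users[username]
    candidate = hashlib.pbkdf2_hmac("sha256", biometric_scan, salt, 100_000)
    return secrets.compare_digest(candidate, digest)

register("alice", b"mock-fingerprint-template")
assert login("alice", b"mock-fingerprint-template")    # access granted
assert not login("alice", b"wrong-template")           # access denied
assert not login("bob", b"mock-fingerprint-template")  # unknown user
```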
Step 3: File Upload
1. Prompt User to Upload a File.
2. Encrypt the File before uploading to ensure data security.
3. Store the File in the cloud (simulated with a mock database).
4. Confirm Upload to the user.
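Step 3's encrypt-before-upload can be mocked as below. The SHA-256 counter-mode keystream is purely illustrative, not a vetted cipher; a real deployment would use an authenticated cipher such as AES-GCM:

```python
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy counter-mode keystream built from SHA-256 (illustration only).
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

mock_cloud = {}  # simulated cloud storage: filename -> ciphertext

def upload(key, filename, plaintext):
    mock_cloud[filename] = keystream_xor(key, plaintext)

def download(key, filename):
    # XOR with the same keystream is its own inverse.
    return keystream_xor(key, mock_cloud[filename])

key = b"session-key-from-authentication"  # hypothetical session key
upload(key, "report.txt", b"quarterly results")
assert mock_cloud["report.txt"] != b"quarterly results"   # stored encrypted
assert download(key, "report.txt") == b"quarterly results"
```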
Step 4: Performance Monitoring & Attack Detection
1. Continuously Monitor for Suspicious Activity (e.g., failed login
attempts).
2. Alert User in case of security threats (e.g., repeated failed
authentication).
3. Ensure efficient File Upload/Download Speed and Access Time.
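Step 4's detection of repeated failed authentication might be sketched as a simple consecutive-failure counter; the threshold and data structures are illustrative, not from the actual implementation:

```python
import time

FAIL_LIMIT = 3           # illustrative threshold
failed = {}              # username -> consecutive failed attempts
alerts = []              # (timestamp, username, reason)

def record_attempt(username, success):
    # Reset the counter on success; count consecutive failures and raise
    # an alert once the threshold is crossed (possible brute-force attack).
    if success:
        failed[username] = 0
        return
    failed[username] = failed.get(username, 0) + 1
    if failed[username] >= FAIL_LIMIT:
        alerts.append((time.time(), username, "repeated failed authentication"))

def is_locked(username):
    return failed.get(username, 0) >= FAIL_LIMIT

for _ in range(3):
    record_attempt("mallory", success=False)

assert is_locked("mallory")
assert len(alerts) == 1 and alerts[0][1] == "mallory"
record_attempt("alice", success=True)
assert not is_locked("alice")
```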
5.2.1 ER DIAGRAM
In the future, some form of method or process may also be added to,
or associated with, UML.
The Unified Modeling Language is a standard language for specifying,
visualizing, constructing, and documenting the artifacts of software
systems, as well as for business modeling and other non-software
systems. The UML is a very important part of developing object-
oriented software and the software development process. The UML
uses mostly graphical notations to express the design of software
projects.
UML DIAGRAMS
A UML diagram is a diagram based on the UML (Unified Modeling
Language) with the purpose of visually representing a system along
with its main actors, roles, actions, artifacts or classes, in order to
better understand, alter, maintain, or document information about the
system. The goal is for UML to become a computer language for
creating models of object-oriented computer software.
Fig 5.2.3: Use Case Diagram
5.2.4 CLASS DIAGRAM
In software engineering, a class diagram in the Unified Modeling
Language (UML) is a type of static structure diagram that describes the
structure of a system by showing the system's classes, their attributes,
operations (or methods), and the relationships among the classes. It
explains which class contains information.
Fig 5.2.5: Sequence Diagram
5.2.6 DATA BASE DESIGN
Input Design
Input design plays a vital role in the life cycle of software
development and requires very careful attention from developers. The
goal of input design is to feed data to the application as accurately
as possible, so inputs must be designed effectively so that errors
occurring during data entry are minimized. According to software
engineering concepts, the input forms or screens are designed to
provide validation control over the input limit, range, and other
related validations.
This system has input screens in almost all the modules. Error
messages are developed to alert the user whenever he commits a
mistake and to guide him in the right way so that invalid entries are
not made. This is discussed in more detail under module design.
Input design is the process of converting user-created input into a
computer-based format. The goal of input design is to make data
entry logical and free from errors; errors in the input are
controlled by the input design. The application has been developed in
a user-friendly manner. The forms have been designed in such a way
that during processing the cursor is placed in the position where
data must be entered. In certain cases, the user is also provided
with an option to select an appropriate input from various
alternatives related to the field.
Validations are required for each data item entered. Whenever a user
enters erroneous data, an error message is displayed, and the user
can move on to the subsequent pages only after completing all the
entries in the current page.
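As an illustration, the limit and range validations described above might look like the following; the field names and bounds are hypothetical, not taken from the project's forms:

```python
def validate_username(value: str):
    # Limit check: alphanumeric, 3-20 characters.
    if not (3 <= len(value) <= 20 and value.isalnum()):
        return "Username must be 3-20 alphanumeric characters."
    return None  # no error

def validate_port(value: str):
    # Range check: numeric field restricted to a sensible range.
    if not value.isdigit():
        return "Port must contain digits only."
    if not (1 <= int(value) <= 65535):
        return "Port must be between 1 and 65535."
    return None

assert validate_username("alice01") is None
assert validate_username("a!") is not None    # invalid entry rejected
assert validate_port("8080") is None
assert validate_port("99999") is not None     # out of range
```

Each validator returns an error message rather than raising, so the form layer can display it next to the offending field and keep the cursor there.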
Output Design
The output from the computer is required mainly to create an
efficient method of communication within the company, primarily
between the project leader and his team members, in other words, the
administrator and the clients. The output of the system
allows the project leader to manage his clients in terms of creating
new clients and assigning new projects to them, maintaining a record
of project validity, and providing folder-level access to each client
on the user side depending on the projects allotted to him. After
completion of a project, a new project may be assigned to the client.
User authentication procedures are maintained at the initial
stages itself. A new user may be created by the administrator himself
or a user can himself register as a new user but the task of assigning
projects and validating a new user rest with the administrator only.
The application starts running when it is executed for the first time.
The server has to be started, and then Internet Explorer is used as
the browser. The project runs on the local area network, so the
server machine serves as the administrator while the other
connected systems act as the clients. The developed system is
highly user-friendly and can be easily understood by anyone using it,
even for the first time.
CHAPTER 6
6.1 System Testing
The purpose of testing is to discover errors. Testing is the
process of trying to discover every conceivable fault or
weakness in a work product. It provides a way to check the
functionality of components, subassemblies, assemblies,
and/or a finished product. It is the process of exercising
software with the intent of ensuring that the software
system meets its requirements and user expectations and
does not fail in an unacceptable manner. There are various
types of tests. Each test type addresses a specific testing
requirement.
process performs accurately to the documented
specifications and contains clearly defined inputs and
expected results.
6.2.2 Integration Testing
Integration tests are designed to test integrated software
components to determine if they actually run as one
program. Testing is event driven and is more concerned
with the basic outcome of screens or fields. Integration tests
demonstrate that although the components were
individually satisfactory, as shown by successful unit
testing, the combination of components is correct and
consistent. Integration testing is specifically aimed at
exposing the problems that arise from the combination of
components.
6.2.3 Functional Testing
Functional tests provide systematic demonstrations that
functions tested are available as specified by the business
and technical requirements, system documentation, and
user manuals.
Functional testing is centered on the following items:
Valid Input: identified classes of valid input must be accepted.
Invalid Input: identified classes of invalid input must be rejected.
Functions: identified functions must be exercised.
Output: identified classes of application outputs must be exercised.
Systems/Procedures: interfacing systems or procedures must be invoked.
Organization and preparation of functional tests is focused on
requirements, key functions, or special test cases. In addition,
systematic coverage pertaining to identifying business process flows,
data fields, predefined processes, and successive processes must
be considered for testing. Before functional testing is
complete, additional tests are identified and the effective
value of current tests is determined.
6.2.4 System Testing
System testing ensures that the entire integrated software
system meets requirements. It tests a configuration to
ensure known and predictable results. An example of
system testing is the configuration-oriented system
integration test. System testing is based on process
descriptions and flows, emphasizing pre-driven process
links and integration points.
6.2.5 White Box Testing
White box testing is testing in which the
software tester has knowledge of the inner workings,
structure, and language of the software, or at least its
purpose. It is used to test areas that cannot be
reached from a black-box level.
Unit Testing
Unit testing is usually conducted as part of a combined code
and unit test phase of the software lifecycle, although it is
not uncommon for coding and unit testing to be conducted
as two distinct phases.
Test strategy and approach
Field testing will be performed manually and functional tests
will be written in detail.
Test Objectives
• All field entries must work properly.
• Pages must be activated from the identified link.
• The entry screen, messages and responses must not be
delayed.
Features to be Tested
• Verify that the entries are of the correct format
• No duplicate entries should be allowed
• All links should take the user to the correct page.
Integration Test
Software integration testing is the incremental integration
testing of two or more integrated software components on a
single platform to produce failures caused by interface
defects.
The task of the integration test is to check that components
or software applications, e.g. components in a software
system or – one step up – software applications at the
company level – interact without error.
Test Results
All the test cases mentioned above passed successfully. No defects
encountered.
Acceptance Testing
User Acceptance Testing is a critical phase of any project
and requires significant participation by the end user. It also
ensures that the system meets the functional requirements.
Test Results
All the test cases mentioned above passed successfully. No defects
encountered.
Unit Testing
Unit testing focuses verification effort on the smallest unit
of Software design that is the module. Unit testing exercises
specific paths in a module’s control structure to ensure
complete coverage and maximum error detection. This test
focuses on each module individually, ensuring that it
functions properly as a unit. Hence, the naming is Unit
Testing.
During this testing, each module is tested individually
and the module interfaces are verified for consistency
with the design specification. All important processing paths are
tested for the expected results. All error-handling paths are
also tested.
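A unit test for a single module, as described above, can be organized with Python's unittest framework. The module under test here is a stand-in filename check, not code from the actual project:

```python
import unittest

def is_valid_filename(name: str) -> bool:
    # Hypothetical module under test: a simple upload-filename check.
    return bool(name) and "/" not in name and len(name) <= 64

class TestFilenameModule(unittest.TestCase):
    def test_accepts_normal_name(self):
        self.assertTrue(is_valid_filename("report.pdf"))

    def test_rejects_empty_and_path_traversal(self):
        self.assertFalse(is_valid_filename(""))
        self.assertFalse(is_valid_filename("../etc/passwd"))

    def test_rejects_overlong_name(self):
        self.assertFalse(is_valid_filename("x" * 65))

# Run the module's tests programmatically.
runner = unittest.TextTestRunner(verbosity=0)
result = runner.run(unittest.TestLoader().loadTestsFromTestCase(TestFilenameModule))
```

Each test exercises one processing or error-handling path of the module in isolation, which is exactly the intent of unit testing described in this section.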
Integration Testing
Integration testing addresses the issues associated with the
dual problems of verification and program construction.
After the software has been integrated a set of high order
tests are conducted. The main objective in this testing
process is to take unit-tested modules and build a program
structure that has been dictated by design.
The following are the types of integration testing:
1. Top-Down Integration
This method is an incremental approach to the construction
of program structure. Modules are integrated by
moving downward through the control hierarchy, beginning
with the main program module. The module subordinates to
the main program module are incorporated into the
structure in either a depth first or breadth first manner.
In this method, the software is tested from main module
and individual stubs are replaced when the test proceeds
downwards.
2. Bottom-Up Integration
This method begins the construction and testing with the
modules at the lowest level in the program structure. Since
the modules are integrated from the bottom up, processing
required for modules subordinate to a given level is always
available and the need for stubs is eliminated. The bottom-
up integration strategy may be implemented with the
following steps:
▪ The low-level modules are combined into clusters
that perform a specific software sub-function.
▪ A driver (i.e., the control program for
testing) is written to coordinate test case input and
output.
▪ The cluster is tested.
▪ Drivers are removed and clusters are combined moving
upward in the program structure
The bottom-up approach tests each module individually,
and then each module is integrated with a main
module and tested for functionality.
OTHER TESTING METHODOLOGIES
User Acceptance Testing
User Acceptance of a system is the key factor for the
success of any system. The system under consideration is
tested for user acceptance by constantly keeping in touch
with the prospective system users at the time of developing
and making changes wherever required. The system
developed provides a friendly user interface that can easily
be understood even by a person who is new to the system.
Output Testing
After performing the validation testing, the next step is
output testing of the proposed system, since no system
could be useful if it does not produce the required output in
the specified format. The outputs generated or displayed by the
system under consideration are tested by asking the users about
the format required by them. Hence the output
format is considered in two ways: one is on screen and
the other in printed format.
VALIDATION CHECKING
Validation checks are performed on the following fields.
Text Field
A text field can contain only a number of characters
less than or equal to its size. The text fields are
alphanumeric in some tables and alphabetic in other tables.
An incorrect entry always flashes an error message.
Numeric Testing
The numeric field can contain only numbers from 0 to 9.
An entry of any other character flashes an error message. The
individual modules are checked for accuracy and for what they
have to perform. Each module is subjected to a test run
along with sample data. The individually tested modules
are integrated into a single system. Testing involves
executing the program with real data; the existence of any
program defect is inferred from the output. The testing
should be planned so that all the requirements are
individually tested.
A successful test is one that brings out the defects for
inappropriate data and produces output revealing the
errors in the system.
Preparation of Test Data
The above testing is done by taking various kinds of test
data. Preparation of test data plays a vital role in
system testing. After preparing the test data, the system
under study is tested using that test data. While testing the
system using test data, errors are again uncovered and
corrected by using the above testing steps, and corrections are
also noted for future use.
Using Live Test Data
Live test data are those that are actually extracted from
organization files. After a system is partially constructed,
programmers or analysts often ask users to key in a set of
data from their normal activities. Then, the systems person
uses this data as a way to partially test the system. In other
instances, programmers or analysts extract a set of live
data from the files and have them entered themselves.
It is difficult to obtain live data in sufficient amounts to
conduct extensive testing. And, although it is realistic data
that will show how the system will perform for the typical
processing requirement, assuming that the live data
entered are in fact typical, such data generally will not test
all combinations or formats that can enter the system. This
bias toward typical values then does not provide a true
systems test and in fact ignores the cases most likely to
cause system failure.
Using Artificial Test Data
Artificial test data are created solely for test purposes,
since they can be generated to test all combinations of
formats and values. In other words, the artificial data, which
can quickly be prepared by a data generating utility
program in the information systems department, make
possible the testing of all logic and control paths through
the program. The most effective test programs use artificial
test data generated by persons other than those who wrote
the programs. Often, an independent team of testers
formulates a testing plan, using the systems specifications.
The developed package has satisfied all the
requirements specified in the software requirement
specification and was accepted.
User Training
Whenever a new system is developed, user training is
required to educate them about the working of the system
so that it can be put to efficient use by those for whom the
system has been primarily designed. For this purpose, the
normal working of the project was demonstrated to the
prospective users. Its working is easily understandable and
since the expected users are people who have good
knowledge of computers, the use of this system is very
easy.
Maintenance
This covers a wide range of activities including correcting
code and design errors. To reduce the need for maintenance
in the long run, we have more accurately defined the user’s
requirements during the process of system development.
Depending on the requirements, this system has been
developed to satisfy the needs to the largest possible
extent. With development in technology, it may be possible
to add many more features based on the requirements in the
future. The coding and design are simple and easy to
understand, which will make maintenance easier.
Testing Strategy
A strategy for system testing integrates system test cases
and design techniques into a well-planned series of steps
that results in the successful construction of software. The
testing strategy must incorporate test planning, test case
design, test execution, and the resultant data collection and
evaluation. A strategy for software testing must
accommodate low-level tests that are necessary to verify
that a small source code segment has been correctly
implemented, as well as high-level tests that validate
major system functions against user requirements.
Software testing is a critical element of software quality
assurance and represents the ultimate review of
specification design and coding. Testing represents an
interesting anomaly for the software. Thus, a series of
testing are performed for the proposed system before the
system is ready for user acceptance testing.
System Testing
Software once validated must be combined with other
system elements (e.g. Hardware, people, database). System
testing verifies that all the elements are proper and that
overall system function performance is achieved. It also
tests to find discrepancies between the system and its
original objective, current specifications and system
documentation.
Unit Testing
In unit testing, different modules are tested against the
specifications produced during the design of the modules.
Unit testing is essential for verification of the code produced
during the coding phase; hence the goal is to test the
internal logic of the modules. Using the detailed design
description as a guide, important control paths are tested to
uncover errors within the boundary of the modules. This
testing is carried out during the programming stage itself. In
this type of testing step, each module was found to be
working satisfactorily with regard to the expected output
from the module.
In due course, the latest technology advancements
will be taken into consideration. As part of the technical
build-up, many components of the networking system will be
generic in nature so that future projects can either use or
interact with them. The future holds a lot to offer to the
development and refinement of this project.
CHAPTER-7
OUTPUT AND OUTPUT SCREENS
CHAPTER-8
CONCLUSION
Biometrics has unique advantages over conventional password-
and token-based security systems, as evidenced by its increased
adoption (e.g., on Android and iOS devices). In this project, we
introduced a biometric-based mechanism to authenticate a user
seeking to access services and computational resources from a
remote location. Our proposed approach allows one to generate a
private key from a fingerprint biometric; our evaluation reveals
that it is possible to generate the same key from a user's
fingerprint with 95.12% accuracy. Our proposed session key
generation approach using two biometric readings does not require
any prior shared information.
A comparison of our approach with other similar authentication
protocols reveals that our protocol is more resilient to several
known attacks.
Future Enhancement
Future research includes exploring other biometric traits and also
multimodal biometrics for other sensitive applications (e.g., in
national security matters).
REFERENCES
1. C. Neuman, S. Hartman, and K. Raeburn, “The Kerberos network
authentication service (V5),” RFC 4120, 2005.
8. S. Zhu, S. Setia, and S. Jajodia, “LEAP: Efficient security mechanisms
for large-scale distributed sensor networks,” in Proc. ACM Conference on
Computer and Communications Security (CCS), Washington, D.C., USA,
October 2003, pp. 62–72.