
A PROJECT REPORT

on

“PLANT DISEASE DETECTION SYSTEM”


Submitted to

KIIT Deemed to be University

In Partial Fulfilment of the Requirement for the Award of

BACHELOR’S DEGREE IN
INFORMATION TECHNOLOGY
BY
HARSHITA OLIVE AROHAN 2105718
REDDY SWAMY 2105727
NOMULA SRIMLIKA 21051911
NANCY BAKHLA 2105472
GUNUPATI THIRUMALA REDDY 21052760

UNDER THE GUIDANCE OF PRADEEP KANDULA

SCHOOL OF COMPUTER ENGINEERING

KALINGA INSTITUTE OF INDUSTRIAL TECHNOLOGY


BHUBANESWAR, ODISHA - 751024
December 2024
KIIT Deemed to be University
School of Computer Engineering
Bhubaneswar, ODISHA 751024

CERTIFICATE
This is to certify that the project entitled

"PLANT DISEASE DETECTION SYSTEM"


submitted by

HARSHITA OLIVE AROHAN 2105718


REDDY SWAMY 2105727
NANCY BAKHLA 2105472
NOMULA SRIMLIKA 21051911
GUNUPATI THIRUMALA REDDY 21052760

is a record of bona fide work carried out by them, in partial fulfilment of the
requirement for the award of the Degree of Bachelor of Engineering (Information
Technology) at KIIT Deemed to be University, Bhubaneswar. This work was done
during the year 2023-2024, under our guidance.

Date: 10/12/2024 (Pradeep Kandula)


Project Guide
Acknowledgements

We are profoundly grateful to Pradeep Kandula of Kalinga Institute of
Industrial Technology for his expert guidance and continuous encouragement
throughout the project, from its commencement to its completion.

Harshita Olive Arohan

Reddy Swamy

Nomula Srimlika

Nancy Bakhla

Gunupati Thirumala Reddy


ABSTRACT
With agriculture supporting billions of people globally, it is one of the most
important sectors of the global economy. Nonetheless, crop production is still
seriously threatened by plant diseases, which can result in monetary losses and food
shortages. Conventional plant disease diagnosis techniques usually involve
laboratory-based testing or manual inspection by agricultural specialists, both of
which can be expensive, time-consuming, and occasionally unreliable. Given the
improvements in technology, there is an urgent need for automated systems that can
quickly and effectively diagnose plant diseases, enabling farmers to make early
decisions and avoid significant crop loss.

Through the use of machine learning and image processing techniques, this project
seeks to address the difficulties associated with plant disease identification. The
system detects and identifies a variety of diseases using digital photos of plant
leaves, stems, or fruits. The first step in the procedure is capturing images, either
with a camera or a smartphone, which are then pre-processed to improve their
quality and eliminate noise. Methods such as color normalization, segmentation,
and histogram equalization are used to ensure the images are of good quality and
suitable for analysis.

Following preprocessing, the system extracts pertinent characteristics from the
image, including color, texture, and shape, all of which are important markers of
plant diseases. The extracted characteristics are fed into a Convolutional Neural
Network (CNN), a machine learning model that has demonstrated remarkable
efficacy in image classification tasks. Because the network is trained on a sizable
dataset of labeled images, it can identify patterns that differentiate healthy plants
from unhealthy ones. After training, the model can accurately identify the plant
disease and classify new images.

The user is subsequently presented with the analysis's findings, which include
comprehensive details about the disease, such as its name, symptoms, and
suggested treatments. With this diagnostic data, farmers can quickly implement
corrective measures, such as removing affected plant parts, adjusting environmental
conditions, or applying pesticides. Additionally, the system gives farmers practical
insights that can improve resource management, including minimizing
environmental impact and optimizing pesticide use.

Because of the system's user-friendly design, even farmers with little technological
knowledge can interact with it with ease. The solution can be implemented as a
mobile application or on web-based platforms, making it accessible from almost any
location, including remote areas with limited connectivity. By offering a dependable
and easily available tool for early disease identification, the system contributes to
reducing crop losses, increasing agricultural productivity, and promoting sustainable
farming practices. By incorporating artificial intelligence into routine farming
operations, this project is a major step toward modernizing agriculture and
ultimately promoting environmental sustainability and global food security.

Keywords: Machine Learning (ML), Deep Learning (DL), Convolutional Neural
Networks (CNNs), Image Processing, Feature Extraction, Disease Classification,
Crop Health, Digital Agriculture, Precision Agriculture, Leaf Image Analysis,
Noise Reduction, Morphological Features, Texture Analysis, Color Analysis,
Dataset Annotation, Real-Time Diagnosis, Pest and Disease Management,
Plant Pathology, Sustainable Farming.
Contents
1 Introduction 1

2 Basic Concepts and Literature Review 3


2.1 Basic Concepts 3
2.1.1 OpenCV 3
2.1.2 TensorFlow / Keras 3
2.1.3 NumPy 3
2.1.4 Matplotlib / Seaborn 3
2.1.5 Machine Learning for Plant Disease Classification 4
2.2 Literature Review 4

3 Requirement Specifications 6
3.1 Functional Requirements 6
3.2 Non-Functional Requirements 6
3.3 Project Planning 6
3.4 Project Requirements 7
3.5 Project Analysis 7
3.6 System Design 7
3.6.1 Design Constraints 7
3.6.2 System Architecture 8

4 Implementation 10
4.1 Methodology 10
4.2 Testing Plan 11
4.3 Result Analysis 12
4.4 Quality Assurance 13

5 Standards Adopted 14
5.1 Design Standards 14
5.2 Coding Standards 14
5.3 Testing Standards 14
6 Conclusion and Future Scope 17
6.1 Conclusion 17
6.2 Future Scope 17
References 18

Individual Contribution 19

Plagiarism Report 24
List of Figures

1. Architecture of plant disease detection system 2


2. Flow diagram illustrating the steps of the Plant Disease Detection 9
PLANT DISEASE DETECTION SYSTEM

Chapter 1

Introduction
Agriculture forms the backbone of global food systems, supporting the
livelihoods of billions and contributing significantly to the world economy. In
this context, the health of crops is paramount to ensuring sustainable agricultural
practices and food security. However, plant diseases remain one of the most
persistent challenges in agriculture, leading to significant reductions in crop
yields, quality deterioration, and economic losses. Traditional methods of plant
disease detection rely heavily on manual inspection by experienced farmers or
agricultural experts, which can often be time-consuming, inconsistent, and
infeasible for large-scale farming operations.

The advent of digital technology and artificial intelligence (AI) offers a


transformative opportunity to address these challenges. Automated plant disease
detection systems, powered by advanced image processing and machine learning
techniques, have emerged as a potential solution to revolutionize how plant
health is monitored and managed. These systems use visual data, such as images
of leaves, stems, or fruits, to detect signs of diseases, enabling early diagnosis
and timely intervention.

To classify plant diseases accurately, this project explores both traditional


machine learning models and modern deep learning techniques such as
Convolutional Neural Networks (CNNs). These models are trained on datasets
comprising images of healthy and diseased plants, enabling the system to learn
the distinguishing features of various plant diseases. The goal is to achieve high
accuracy in identifying specific diseases and differentiating them from healthy
plant conditions.

This project aims to develop an automated system for plant disease detection,
leveraging the power of image processing and machine learning. The system
involves several key stages, starting with image acquisition, where high-quality
images of plants are collected using cameras or smartphones. The images
undergo preprocessing to enhance their quality by reducing noise and
normalizing features such as lighting and color. Advanced feature extraction
techniques are employed to capture critical attributes like texture, color patterns,
and morphological shapes, which are crucial for identifying diseases.
Motivation:
Agriculture plays a pivotal role in sustaining human life and the global
economy. However, plant diseases continue to pose a significant challenge,
causing severe losses in crop yields and quality, particularly in regions
where farming is the primary source of livelihood. The problem is further
exacerbated by the lack of timely disease detection, which often results in
the excessive use of pesticides, environmental damage, and food insecurity.

Traditionally, identifying plant diseases has relied on manual inspection by


farmers or agricultural experts. While this approach has been effective in
certain scenarios, it is limited by human error, the need for expertise, and
the inability to scale across large fields or regions. Additionally, many
farmers lack access to expert advice or diagnostic tools, leaving them
vulnerable to the devastating effects of undiagnosed plant diseases. With
the rapid advancement of technology, particularly in the fields of artificial
intelligence, machine learning, and image processing, there is an
opportunity to revolutionize how plant diseases are detected and managed.
The motivation for this project stems from the desire to harness these
technologies to create an accessible, efficient, and accurate solution for
farmers and agricultural experts.

Objectives

1. The primary objective of this project is to develop an efficient and user-
friendly system for detecting plant diseases using advanced image
processing and machine learning techniques.
2. The system aims to accurately identify the type of disease affecting a
plant based on visual symptoms, such as spots, discoloration, or
deformities, captured in digital images.
3. By leveraging a robust machine learning model, the project seeks to
achieve high precision and reliability in disease detection, enabling
farmers to take timely and informed actions to mitigate crop losses.
4. This solution also aims to streamline the process of plant disease
identification, reducing the need for manual inspections by experts and
making it accessible to farmers in remote areas.

In addition to disease detection, the project aspires to contribute to


sustainable farming practices by promoting the judicious use of resources.
By diagnosing diseases early, the system can help minimize the excessive
use of pesticides, which can harm the environment and increase farming
costs. The ultimate goal is to create a scalable and deployable tool,
integrated into mobile or web-based platforms, that empowers farmers,
agronomists, and researchers to improve crop health management,
enhance productivity, and ensure food security.
Chapter 2

Basic Concepts/ Literature Review


The Plant Disease Detection system aims to address the challenges of identifying
plant diseases using digital images. With advancements in machine learning (ML)
and image processing, the project leverages these technologies to automatically
detect and classify diseases affecting plants. By utilizing a Convolutional Neural
Network (CNN), a type of deep learning model, the system processes images of
plant leaves, stems, or fruits, extracts relevant features like color, texture, and
shape, and classifies the plant as either healthy or diseased. Preprocessing
techniques such as noise reduction, color normalization, and segmentation are
applied to improve image quality before feeding them into the model. Once the
model is trained on a dataset of labeled images, it can accurately detect and
diagnose diseases in real-time, providing farmers with actionable insights and
recommendations for disease management.
2.1 Basic Concepts

2.1.1 OpenCV: OpenCV (Open Source Computer Vision Library) is used for
image processing tasks such as resizing, noise reduction, color normalization,
and feature extraction. It provides efficient tools to manipulate and analyze
images, which is essential for preprocessing plant images before they are fed into
the machine learning model.
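As a hedged illustration only, the sketch below shows the kind of OpenCV preprocessing pipeline described above. The 128x128 target size, the use of non-local means denoising, and histogram equalization on the luminance channel are illustrative assumptions, not values taken from the project.

```python
import cv2
import numpy as np

def preprocess_leaf_image(path, size=(128, 128)):
    """Load a leaf image and apply resize, denoise, and color normalization."""
    img = cv2.imread(path)  # BGR image from disk
    if img is None:
        raise ValueError(f"Could not read image: {path}")
    img = cv2.resize(img, size)  # uniform size for the CNN input
    img = cv2.fastNlMeansDenoisingColored(img, None, 10, 10, 7, 21)  # noise reduction
    # Histogram equalization on the luminance channel (simple color normalization)
    ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)
    ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])
    img = cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)
    return img.astype(np.float32) / 255.0  # scale pixel values to [0, 1]
```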

2.1.2 TensorFlow / Keras: TensorFlow and its high-level API Keras are used for
building, training, and deploying the Convolutional Neural Network (CNN)
model. These libraries provide powerful tools for deep learning and allow for the
development of high-accuracy classification models, making them the core of
the disease detection mechanism in the system.
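The report does not list the exact layer configuration, so the following is only a minimal sketch of a Keras CNN of the kind described above; the filter counts, the 128x128x3 input shape, and the num_classes parameter are illustrative assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn(num_classes, input_shape=(128, 128, 3)):
    """A small CNN with convolutional, pooling, and fully connected layers."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),                            # regularization
        layers.Dense(num_classes, activation="softmax") # one probability per class
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```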

2.1.3 NumPy: NumPy is a fundamental package for scientific computing in
Python. It is used for handling large arrays and matrices, which is essential for
efficient image data manipulation and model training in deep learning tasks.

2.1.4 Matplotlib / Seaborn: These visualization libraries are used to plot graphs,
training and validation loss curves, and accuracy charts, helping to monitor the
training process and evaluate the performance of the model.
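As an illustrative sketch (assuming a Keras History object named history returned by model.fit), training and validation curves of the kind mentioned above can be plotted as follows:

```python
import matplotlib.pyplot as plt

def plot_history(history):
    """Plot training/validation accuracy and loss curves from a Keras History."""
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    ax1.plot(history.history["accuracy"], label="train accuracy")
    ax1.plot(history.history["val_accuracy"], label="validation accuracy")
    ax1.set_xlabel("epoch")
    ax1.legend()
    ax2.plot(history.history["loss"], label="train loss")
    ax2.plot(history.history["val_loss"], label="validation loss")
    ax2.set_xlabel("epoch")
    ax2.legend()
    plt.tight_layout()
    plt.show()
```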

2.1.5 Machine Learning for Plant Disease Classification

Machine learning has gained significant attention in plant disease detection due
to its ability to automatically learn and classify complex patterns from data.
Traditional machine learning models, such as Support Vector Machines (SVM)
and Random Forests, have been applied to classify plant diseases based on
extracted features like color, texture, and shape. These models require careful
manual feature extraction, and their performance is highly dependent on the
quality of the features selected.
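As a hedged sketch of this traditional pipeline (not the project's own implementation), handcrafted features could be classified with an SVM using scikit-learn; the feature extraction here is reduced to simple per-channel color statistics purely for illustration.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def color_features(image):
    """Toy handcrafted features: per-channel mean and standard deviation."""
    return np.concatenate([image.mean(axis=(0, 1)), image.std(axis=(0, 1))])

def train_svm(images, labels):
    """images: list of HxWx3 arrays scaled to [0, 1]; labels: list of class ids."""
    X = np.array([color_features(img) for img in images])
    X_train, X_test, y_train, y_test = train_test_split(
        X, labels, test_size=0.2, random_state=42)
    clf = SVC(kernel="rbf", C=1.0)
    clf.fit(X_train, y_train)
    print("Held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
    return clf
```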

In recent years, deep learning, particularly Convolutional Neural Networks


(CNNs), has revolutionized the field. CNNs are well-suited for image-
based tasks as they automatically extract relevant features from images
through multiple layers of convolution. Studies have demonstrated that
CNNs outperform traditional machine learning models in plant disease
detection tasks, achieving high accuracy even with minimal preprocessing.
For example, a study by Mohanty et al. (2016) showed that a deep CNN
model could classify plant diseases with an accuracy of over 99% using a
dataset of 54,000 images from 14 different plant species. This has led to
the widespread adoption of deep learning techniques in recent plant disease
detection research.
Chapter 3

Requirement Specifications
The requirement specifications for the Plant Disease Detection project are
divided into functional and non-functional requirements, along with hardware
and software needs.

3.1 Functional Requirements:
Image Acquisition:
The system should allow users to upload or capture high-quality images of
plant leaves, stems, or fruits.
Image Preprocessing:
The system should preprocess images to reduce noise, normalize brightness, and
enhance features for better analysis.
Feature Extraction:
Extract relevant features such as texture, color, shape, and patterns from the
input images.

3.2 Non-Functional Requirements:
• The system should achieve a classification accuracy above 90%.
• The system should be scalable to handle multiple types of crops and diseases.
• The system should process and deliver results within a few seconds.

3.3 Project Planning
Effective project planning is essential to ensure that all aspects of the Plant
Disease Detection system are developed, tested, and deployed successfully.
Below is a detailed plan that includes phases, milestones, tasks, and timelines for
the project.
Project Phases
Research and Requirement Analysis
Data Collection and Preprocessing.

3.4 Project Requirements


The success of the Plant Disease Detection project depends on meeting
both functional and non-functional requirements.
User Input: To personalize the system's interface or provide a user-
specific diagnosis.
Text Processing: Text processing refers to the manipulation and analysis
of text data, which is crucial for extracting meaningful information from
unstructured or semi-structured data.
Summarization: The Plant Disease Detection project aims to develop an
automated system that uses advanced image processing and machine
learning techniques to identify and classify plant diseases.
Note Generation: The Plant Disease Detection system utilizes image
processing and machine learning to automatically detect and diagnose plant
diseases.
User Interface: The User Interface (UI) is a critical component of the
Plant Disease Detection system, as it provides an easy way for users to
interact with the system and access its features.
3.5 Project Analysis:
The primary objective of this project is to develop an automated plant disease
detection system that uses image processing and machine learning techniques.

The system aims to:
• Identify plant diseases through digital images of plant leaves, stems, or fruits.
• Provide accurate diagnoses of plant diseases in real-time, allowing for timely
intervention by farmers.
• Suggest appropriate treatments based on the detected disease.
• Reduce crop losses by detecting diseases early and minimizing the need for
excessive pesticide use.
• Improve agricultural practices by providing data-driven insights and
recommendations to farmers, especially in remote areas with limited access to
experts.

Fig1: Architecture of plant disease detection system


3.6 System Design
The Plant Disease Detection system follows a client-server architecture where
users upload plant images for disease analysis, and the server processes the
images using image processing and machine learning techniques. The system
integrates a Convolutional Neural Network (CNN) for disease classification,
providing real-time results and actionable insights to users. The system is
designed to be accessible via both web and mobile applications, ensuring broad
accessibility for farmers.

3.6.1 System Architecture


Here is the diagram illustrating the architecture of the Plant Disease
Detection system, showing the flow between the components such as the
user interface, image preprocessing, machine learning model, database,
and the server. The diagram highlights how the client-side interacts with the
server to process images and return results.

Fig2: Flow diagram illustrating the steps of the Plant Disease Detection

Fig2 - It shows the data flow from the user uploading an image to the system,
image preprocessing, feeding the image to a machine learning model for disease
classification, and displaying the result to the user.
Fig3: Flowchart for processing in Plant Disease Detection

Fig3 shows a flowchart for video-based processing in plant disease detection. Video input
from a camera or drone is followed by video preprocessing, which removes and enhances
frames for clarity. Feature extraction techniques are then applied to the segmented plant
sections to identify important characteristics such as leaf texture and color. Potential plant
diseases are then classified using a convolutional neural network (CNN). The results are
visualized with markers or heatmaps, and a summary describing the detected diseases and
their severity is produced, giving farmers useful information.

Key Points:
Image Processing: Preprocessing techniques such as noise reduction,
normalization, and feature extraction ensure accurate disease detection.
Machine Learning (CNN): A deep learning model (CNN) classifies plant
diseases based on the features extracted from images.
User Feedback: Users provide feedback on the diagnosis, helping to refine the
model and improve accuracy over time.
Chapter 4
Implementation
4.1 Methodology
The implementation of the Plant Disease Detection project involves
the following structured steps:

Data Collection:
• Images of both healthy and diseased plants were gathered from open-source
repositories such as Kaggle and other publicly accessible sources to create a
comprehensive collection of plant leaves.
• The dataset contains labeled images grouped by disease type and serves as the
basis for training and testing the models.

Data Preprocessing:
• Every image was resized to a uniform size to guarantee consistency throughout
the model training process.
• To improve the model's capacity to generalize to new data, augmentation
techniques such as rotation, flipping, and brightness adjustments were used to
artificially increase the dataset's size (a minimal sketch is given below).
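A minimal sketch of such augmentation using Keras' ImageDataGenerator; the specific rotation range, flip, brightness values, and the data/train directory layout are assumptions chosen for illustration.

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Augment training images on the fly while streaming them from a folder tree
# organised as data/train/<class_name>/*.jpg (directory layout assumed).
train_datagen = ImageDataGenerator(
    rescale=1.0 / 255,           # scale pixels to [0, 1]
    rotation_range=25,           # random rotations
    horizontal_flip=True,        # random horizontal flips
    brightness_range=(0.8, 1.2)  # random brightness changes
)

train_generator = train_datagen.flow_from_directory(
    "data/train",                # hypothetical dataset path
    target_size=(128, 128),
    batch_size=32,
    class_mode="categorical",
)
```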

Model Architecture:
• A Convolutional Neural Network (CNN) was used because it is effective at
extracting features from image data.
• The model was built with Python frameworks such as TensorFlow and Keras,
allowing convolutional, pooling, and fully connected layers to be seamlessly
integrated.

Training and Evaluation:
• To train the model and evaluate its performance, the dataset was separated
into training and validation sets.
• The model's robustness and dependability were ensured by evaluating it
using metrics including accuracy, precision, recall, and F1 score (see the
evaluation sketch below).
• The final model demonstrated its efficacy in disease detection with a
95% classification accuracy.
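The exact evaluation code is not given in the report; the following sketch shows one common way to compute these metrics with scikit-learn, assuming y_true and y_pred are arrays of integer class labels for the validation set.

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

def evaluate(y_true, y_pred):
    """Report accuracy, precision, recall, and F1 (macro-averaged over classes)."""
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "precision": precision_score(y_true, y_pred, average="macro"),
        "recall": recall_score(y_true, y_pred, average="macro"),
        "f1": f1_score(y_true, y_pred, average="macro"),
    }
```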

User Interface Development:
• To make the system usable for end users, a graphical user interface (GUI)
was created with Python's Tkinter framework.
• Through the interface, users can upload pictures of plant leaves, and the
system returns the predicted disease along with a confidence level (a rough
sketch of this workflow follows below).
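The following is only a rough sketch of the Tkinter workflow described above; the saved model file plant_disease_model.h5 and the class_names list are illustrative assumptions, not artefacts of this project.

```python
import tkinter as tk
from tkinter import filedialog
import numpy as np
import cv2
import tensorflow as tf

model = tf.keras.models.load_model("plant_disease_model.h5")  # assumed model file
class_names = ["healthy", "early_blight", "late_blight"]      # illustrative labels

def classify_image():
    """Ask the user for a leaf image, run the CNN, and show the prediction."""
    path = filedialog.askopenfilename(filetypes=[("Images", "*.jpg *.png *.jpeg")])
    if not path:
        return
    img = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2RGB)
    img = cv2.resize(img, (128, 128)).astype(np.float32) / 255.0
    probs = model.predict(img[np.newaxis, ...])[0]
    idx = int(np.argmax(probs))
    result_label.config(text=f"{class_names[idx]} ({probs[idx]:.1%} confidence)")

root = tk.Tk()
root.title("Plant Disease Detection")
tk.Button(root, text="Upload leaf image", command=classify_image).pack(pady=10)
result_label = tk.Label(root, text="No image classified yet")
result_label.pack(pady=10)
root.mainloop()
```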
4.2 Testing Plan
During the testing process, several test cases were used to confirm the system's
functionality, correctness, and responsiveness:

Tests of Image Input: Verified that the system handles images with different
resolutions and formats correctly.
Prediction Accuracy Tests: Compared the model's categorization outcomes to
test data that had been labeled.
Interface Functionality Tests: Verified that the backend model and user interface
interacted smoothly.

Table 1: Test Cases

Test ID | Test Case Title            | Test Condition                                   | System Behavior                               | Expected Result
T01     | Image Input Format Testing | Provide images in .jpg, .png, and .jpeg formats. | System accepts the images without error.      | Successfully processes the images.
T02     | Disease Classification     | Upload a leaf image of a known disease.          | Predicts the disease with confidence.         | Correct disease classification displayed.
T03     | Invalid Input Handling     | Upload a non-leaf image (e.g., object image).    | Rejects input and displays an error message.  | Error message indicating invalid input.
T04     | User Interface Navigation  | Interact with buttons for uploading images.      | GUI responds without crashing.                | Smooth functionality and response.
T05     | Multiple Image Processing  | Upload images in batch.                          | System processes all images sequentially.     | Disease prediction for all images.
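As an illustrative sketch only, test cases such as T01 and T03 from Table 1 could be automated with pytest; the helper preprocess_leaf_image, its module name, and its rejection behaviour for unreadable files are assumptions carried over from the earlier preprocessing sketch.

```python
import numpy as np
import cv2
import pytest
from preprocessing import preprocess_leaf_image  # hypothetical module from this project

def test_accepts_common_image_formats(tmp_path):
    # T01: .jpg and .png inputs should be processed without error.
    for ext in ("jpg", "png"):
        path = tmp_path / f"leaf.{ext}"
        cv2.imwrite(str(path), np.full((64, 64, 3), 120, dtype=np.uint8))
        result = preprocess_leaf_image(str(path))
        assert result.shape == (128, 128, 3)

def test_rejects_invalid_input(tmp_path):
    # T03: a file that is not a readable image should raise an error.
    bad = tmp_path / "not_an_image.txt"
    bad.write_text("this is not an image")
    with pytest.raises(ValueError):
        preprocess_leaf_image(str(bad))
```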
4.3 Result Analysis
The following criteria were used to examine the system's output:

Accuracy: On the validation dataset, 95% classification accuracy was attained.

Efficiency: The average processing time per image was less than two seconds,
making the system appropriate for real-time use.

Reliability: The system performed consistently under various leaf conditions
and plant disease types.

Fig 4: The model detecting the leaf disease as cedar apple rust.

4.4 Quality Assurance
• Model Validation and Testing: Using performance criteria like accuracy
and precision, the CNN model was validated and tested, yielding
consistent results with a 95% accuracy rate. Regression testing made sure
that upgrades didn't interfere with already-existing features.
• User Feedback and Error Handling: To handle faulty inputs, strong error
handling procedures were put in place, and users were guided through
fixes via unambiguous error messages.
• Cross-Platform Compatibility: To guarantee consistent functioning and
user experience across several platforms, the system was tested on
Windows, macOS, and Linux.
• Coding Standards and Documentation: Readability and maintainability
were guaranteed by following PEP 8 coding standards. Thorough
documentation was kept up to date for troubleshooting and future
development.
Chapter 5
Standards Adopted
5.1 Design Standards
• Convolutional Neural Network Architecture: The Convolutional Neural
Network (CNN) is built and trained using industry-standard frameworks
such as TensorFlow and Keras. These frameworks guarantee the
scalability and modularity of the model design.
• User Interface Design: The interface was created with Python's Tkinter
module and adheres to usability principles including clarity, simplicity,
and ease of navigation, making it accessible to non-technical users.
• System Integration: To guarantee smooth operation, data flow
consistency, and performance efficiency, all parts, including the CNN
model, preprocessing module, and user interface, were integrated.

5.2 Coding Standards

For code quality and maintainability, we adopted some general coding
standards:

• PEP 8 Compliance: By using appropriate formatting, indentation, and
naming conventions, the project complied with Python's PEP 8
guidelines, guaranteeing understandable and maintainable code.
• Modular Programming: Breaking the code into reusable modules and
functions improved maintainability and made debugging and future
development easier.
• Error Logging: To help with prompt resolution and system stability,
strong logging procedures were put in place to monitor faults (a minimal
example is sketched below).
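A minimal sketch of such logging with Python's standard logging module; the log file name and the simulated failure are illustrative assumptions.

```python
import logging

# Minimal logging setup of the kind described above (file name is illustrative).
logging.basicConfig(
    filename="plant_disease_app.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)
logger = logging.getLogger("plant_disease")

try:
    raise ValueError("Could not read image: corrupted file")  # simulated fault
except ValueError:
    logger.exception("Image preprocessing failed")  # records message plus traceback
```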

5.3 Testing Standards

• Unit Testing: To guarantee accuracy and dependability, individual parts,
including the CNN model and preprocessing modules, underwent
extensive testing.
• Integration Testing: To ensure seamless functioning, the interaction
between the backend model, user interface, and other modules was
verified.
• Regression Testing: Regression testing was used to make sure system
updates and modifications didn't adversely affect already-existing
functionality.
• Cross-Platform Testing: To ensure compatibility and consistent behavior,
the system was tested across multiple operating systems.
Chapter 6
Conclusion and Future Scope
6.1 Conclusion
The Plant Disease Detection project effectively illustrates how machine
learning can be applied in agriculture by offering a practical and effective
tool for early plant disease detection. The method achieves a high accuracy
of 95% by using Convolutional Neural Networks (CNNs) for image-based
disease classification, guaranteeing accurate findings.

By reducing crop losses, encouraging prompt treatments, and allowing


farmers to identify plant illnesses early, this method has major advantages
for the agriculture industry. The system is feasible for general usage
because of its user-friendly interface, which guarantees accessibility for
people with no technical knowledge.

The project's accomplishments highlight AI's potential to solve important


agricultural problems and open the door to more creative and sustainable
farming methods.

6.2 Future Scope

• Extension to More Crops and Diseases: To improve the system's
applicability, the dataset will be expanded to encompass a greater range of
crops and diseases.
• Real-Time Monitoring: Using drones or Internet of Things technologies to
allow for extensive, real-time field monitoring.
• Development of Mobile and Web Applications: Developing web-based and
mobile versions of the system to improve accessibility for agricultural
professionals and farmers.
• Multilingual Support: Including multilingual features to serve a wide range
of users in various geographical areas.
• Integration with Agricultural Systems: Working together with currently
available agricultural platforms and technologies to offer a complete farm
management ecosystem.
References

[1] S. P. Mohanty, D. P. Hughes, and M. Salathé (2016). Using deep learning for image-based plant disease
detection. Frontiers in Plant Science, 7, 1419.
[2] K. P. Ferentinos (2018). Deep learning models for plant disease detection and diagnosis. Computers and
Electronics in Agriculture, 145, 311–318.
[3] M. Brahimi, K. Boukhalfa, and A. Moussaoui (2017). Deep learning for tomato diseases: classification and
symptom visualization. Applied Artificial Intelligence, 31(4), 299–315.
[4] A. Fuentes, S. Yoon, S. Kim, D. S. Park, and J. J. Sena (2018). A robust deep-learning-based detector for
real-time detection of diseases and pests on tomato plants. Sensors, 18(11), 3731.
[5] A. Picon, A. Alvarez-Gila, M. Seitz, A. Ortiz-Barredo, and J. Echazarra (2019). Deep convolutional neural
networks for crop disease classification using mobile capture devices in the wild. Computers and
Electronics in Agriculture, 161, 280–290.
[6] A. Singh, S. Sarkar, B. Ganapathysubramanian, and A. K. Singh (2016). Machine learning for
high-throughput stress phenotyping in plants. Trends in Plant Science, 21(2), 110–124.
[7] S. Sladojevic, M. Arsenovic, A. Anderla, D. Culibrk, and D. Stefanovic (2016). Deep neural networks based
recognition of plant diseases by leaf image classification. Computational Intelligence and Neuroscience,
2016, 1–11.
[8] Z. Sun, F. Zheng, J. Chu, L. Zhang, K. Du, and J. Ma (2018). A segmentation approach for greenhouse
vegetable foliar disease spot images using color information and region growing. Computers and
Electronics in Agriculture, 142, 110–117.
[9] S. Zhang, W. Huang, and C. Zhang (2019). Three-channel convolutional neural networks for identifying
crop leaf diseases. Cognitive Systems Research, 53, 31–41.
[10] A. Camargo, J. Caicedo, and H. Reyes (2015). Optimizing deep convolutional networks for plant
identification. CLEF Working Notes, 1391, 467–475.
[11] J. Boulent, P. L. St-Charles, S. Foucher, and J. Théau (2019). Convolutional neural networks for the
automatic identification of plant diseases. Frontiers in Plant Science, 10, 941.
[12] M. S. Ullah, M. A. Hossain, and E. Hossain (2019). A color and texture based approach for the detection
and classification of plant leaf diseases using a KNN classifier. Computers and Electronics in Agriculture,
159, 233–244.
[13] S. B. Shuvo and S. Momen (2020). A hybrid deep learning method for the identification of plant diseases
using images. International Journal of Advanced Computer Science and Applications, 11(1), 289–298.
[14] H. Zhang, Y. Li, and Y. Li (2020). An overview of computer vision tools for identifying plant diseases.
Biosystems Engineering, 195, 135–153.
[15] Y. Sun, W. Liu, J. Ma, and J. Liu (2017). Crop disease diagnosis using feature selection based on deep
learning. Neurocomputing, 237, 241–250.
[16] J. Amara, A. Algergawy, and B. Bouaziz (2017). A deep learning-based approach for banana leaf diseases
classification. BTW Workshops, 79–88.
[17] L. Yingchun, S. Njuki, L. Yujian, and E. C. Too (2019). A comparative study of fine-tuning deep learning
models for plant disease identification. Computers and Electronics in Agriculture, 161, 272–279.
[18] M. H. Saleem, K. M. Arif, and J. Potgieter (2019). Plant disease detection and classification by deep
learning. Plants, 8(11), 468.
[19] M. Uçar, K. Akyol, Ü. Atila, and E. Uçar (2021). Plant leaf disease classification using the EfficientNet
deep learning model. Ecological Informatics, 61, 101182.
[20] E. Hossain, M. A. Hossain, and M. S. Ullah (2019). A color and texture based approach for the detection
and classification of plant leaf diseases using a KNN classifier. Computers and Electronics in Agriculture,
159, 233–244.
INDIVIDUAL CONTRIBUTION REPORT:

Plant Disease Detection System

Gunupati Thirumala Reddy


21052760

Abstract: This project creates an innovative and intelligent solution designed to revolutionize
agricultural practices by providing early and accurate detection of plant diseases. The system
addresses the critical need for timely and effective intervention in agricultural management.
It leverages cutting-edge technologies, including image processing, machine learning, and
deep learning algorithms, to analyze plant health based on visual symptoms such as
discoloration, spots, or structural deformities on leaves, stems, or fruits.

Individual contribution and findings: I focused on developing the project's user
interface (UI) using Streamlit. My responsibilities included:
• UI Design and Layout: I designed a user-friendly interface with a clear layout for user
interaction. This included elements such as uploading an image of the plant, after which
the disease can be detected.
• User Interaction Design: I implemented functionalities for users to upload images, view
a dashboard, and obtain the predicted disease of the plant.

Individual contribution to project report preparation: I primarily contributed to
the following chapters of the project report:
• Chapter 2: Literature Review
• Chapter 3: Requirement Specification (focusing on Project Requirements)
• Chapter 5: Standards Adopted (focusing on Coding Standards)

Individual contribution for project presentation and demonstration: During
the project presentation, I will be responsible for:
• Highlighting the user interface design aspects and functionalities.
• Walking through a user scenario demonstrating how to interact with the application to
obtain the disease of the plant.

Full Signature of Supervisor: Full signature of the student:


……………………………. ……………………………..
INDIVIDUAL CONTRIBUTION REPORT:

Plant Disease Detection System

HARSHITA OLIVE AROHAN


2105718

Abstract: This project creates an innovative and intelligent solution designed to revolutionize
agricultural practices by providing early and accurate detection of plant diseases. The system
addresses the critical need for timely and effective intervention in agricultural management.
It leverages cutting-edge technologies, including image processing, machine learning, and
deep learning algorithms, to analyze plant health based on visual symptoms such as
discoloration, spots, or structural deformities on leaves, stems, or fruits.

Individual contribution and findings: I focused on developing the project's user
interface (UI) using Streamlit. My responsibilities included:
• UI Design and Layout: I designed a user-friendly interface with a clear layout for user
interaction. This included elements such as uploading an image of the plant, after which
the disease can be detected.
• User Interaction Design: I implemented functionalities for users to upload images, view
a dashboard, and obtain the predicted disease of the plant.

Individual contribution to project report preparation: I primarily contributed to
the following chapters of the project report:
• Chapter 2: Literature Review
• Chapter 3: Requirement Specification (focusing on Project Requirements)
• Chapter 5: Standards Adopted (focusing on Coding Standards)

Individual contribution for project presentation and demonstration: During
the project presentation, I will be responsible for:
• Highlighting the user interface design aspects and functionalities.
• Walking through a user scenario demonstrating how to interact with the application to
obtain the disease of the plant.

Full Signature of Supervisor: Full signature of the student:


……………………………. ……………………………..
INDIVIDUAL CONTRIBUTION REPORT:

Plant Disease Detection System

NANCY BAKHLA
2105472

Abstract: This project creates an innovative and intelligent solution designed to revolutionize
agricultural practices by providing early and accurate detection of plant diseases. The system
addresses the critical need for timely and effective intervention in agricultural management.
It leverages cutting-edge technologies, including image processing, machine learning, and
deep learning algorithms, to analyze plant health based on visual symptoms such as
discoloration, spots, or structural deformities on leaves, stems, or fruits.

Individual contribution and findings: I focused on data analysis and report
writing for this project. My responsibilities included:
• Data Analysis: I analyzed the disease predictions generated by the application using
various techniques to assess their quality, clarity, and information coverage.
• Report Writing and Documentation: I played a key role in compiling the project report,
including writing chapters related to:
  o Literature review on existing solutions for disease detection.
  o Results analysis, presenting the effectiveness of the application in
    generating appropriate disease predictions.
  o Conclusion and future scope, summarizing the project's achievements
    and outlining potential areas for further development.
• Test Plan Development: I collaborated with the team to develop a
comprehensive test plan outlining various test cases covering core functionalities.
• Test Case Execution: I conducted rigorous testing of the application, including unit testing
of individual functionalities, integration testing of component interaction, and user
acceptance testing (UAT) to gather user feedback on usability and functionality.

Individual contribution to project report preparation: I contributed to
the overall structure and organization of the project report and ensured all sections
followed consistent formatting and referencing styles.
• Chapter 2: Related Work (focusing on Literature Review)
• Chapter 5: Implementation (focusing on Testing Plan)
• Chapter 6: Conclusion and Future Scope (analysis of results and future directions)

Individual contribution for project presentation and demonstration: During
the project presentation, I will be responsible for:
• Discussing the data analysis methods used to evaluate the application's performance.
• Presenting key findings from the analysis of the generated disease predictions.
• Highlighting the project's overall effectiveness and potential future advancements.

Full Signature of Supervisor: Full signature of the student:


……………………………. ……………………………..
INDIVIDUAL CONTRIBUTION REPORT:

Plant Disease Detection System

NOMULA SRIMLIKA
21051911

Abstract: This project creates an innovative and intelligent solution designed to revolutionize
agricultural practices by providing early and accurate detection of plant diseases. The system
addresses the critical need for timely and effective intervention in agricultural management.
It leverages cutting-edge technologies, including image processing, machine learning, and
deep learning algorithms, to analyze plant health based on visual symptoms such as
discoloration, spots, or structural deformities on leaves, stems, or fruits.

Individual contribution and findings: I assumed the role of project manager
and facilitated effective communication throughout the development process. My
responsibilities included:
• Project Planning and Scheduling: I collaborated with the team to create a
project plan with clear milestones, task allocation, and timelines for development,
testing, and report writing.
• Communication and Coordination: I facilitated regular team meetings to
ensure everyone was aligned with project goals, addressed any roadblocks or
challenges, and ensured smooth collaboration.
• Error Handling: I incorporated error handling mechanisms to gracefully
handle potential issues like invalid user input, API errors, or unexpected model
responses. Informative error messages were displayed to guide users in case of
problems.

Individual contribution to project report preparation: I contributed to
the overall structure and organization of the project report and ensured all sections
followed consistent formatting and referencing styles.

Individual contribution for project presentation and demonstration: During
the project presentation, I will be responsible for:
• Providing a high-level overview of the project, including its goals,
methodology, and key achievements.
• Introducing the team members and highlighting their individual contributions.
• Addressing any general project-related questions and ensuring a smooth
presentation flow.

Full Signature of Supervisor: Full signature of the student:


……………………………. ……………………………..
INDIVIDUAL CONTRIBUTION REPORT:

Plant Disease Detection System

REDDY SWAMY
2105727

Abstract: This project creates an innovative and intelligent solution designed to revolutionize
agricultural practices by providing early and accurate detection of plant diseases. The system
addresses the critical need for timely and effective intervention in agricultural management.
It leverages cutting-edge technologies, including image processing, machine learning, and
deep learning algorithms, to analyze plant health based on visual symptoms such as
discoloration, spots, or structural deformities on leaves, stems, or fruits.

Individual contribution and findings: I focused on developing the project's user
interface (UI) using Streamlit. My responsibilities included:
• UI Design and Layout: I designed a user-friendly interface with a clear layout for user
interaction. This included elements such as uploading an image of the plant, after which
the disease can be detected.
• User Interaction Design: I implemented functionalities for users to upload images, view
a dashboard, and obtain the predicted disease of the plant.

Individual contribution to project report preparation: I primarily contributed to
the following chapters of the project report:
• Chapter 2: Literature Review (focusing on Streamlit)
• Chapter 3: Requirement Specification (focusing on Project Requirements)
• Chapter 5: Standards Adopted (focusing on Coding Standards)

Individual contribution for project presentation and demonstration: During
the project presentation, I will be responsible for:
• Highlighting the user interface design aspects and functionalities.
• Walking through a user scenario demonstrating how to interact with the application to
obtain the disease of the plant.

Full Signature of Supervisor: Full signature of the student:


……………………………. ……………………………

TURNITIN PLAGIARISM REPORT


(This report is mandatory for all the projects and plagiarism
must be below 25%)
