IOT BASED PRECISION FARMING

Elvin Paul K S1, UG Scholar, ETE, Bangalore Institute of Technology, Bangalore, Karnataka, paulkselvin@gmail.com
Sudha B2, Assistant Professor, ETE, Bangalore Institute of Technology, Bangalore, Karnataka, sudha.bhuvanagiri89@gmail.com
Abhishek S3, UG Scholar, ETE, Bangalore Institute of Technology, Bangalore, Karnataka, abhishek.sh30082002@gmail.com
Supreeth Raj M4, UG Scholar, ETE, Bangalore Institute of Technology, Bangalore, Karnataka, supreethrajm3@gmail.com
Nagababu A V5, UG Scholar, ETE, Bangalore Institute of Technology, Bangalore, Karnataka, avnagababu029@gmail.com
Abstract: This project focuses on leveraging drone images of pests, captured with advanced sensors, for pest detection in crops, combined with image processing methods to identify diseases. The ultimate goal is to enhance crop health and productivity through timely and targeted pesticide application. Image processing techniques are used to detect signs of diseases and pests in the captured images, and a machine learning CNN algorithm enhances the system's ability to accurately classify and diagnose crop health issues. Upon detection of pests, the IoT platform triggers a response mechanism to deploy a precision pesticide spraying system. This ensures targeted and localized treatment, reducing the overall use of pesticides and minimizing environmental impact. In this project, the pests are captured as images using a camera. These images are then processed further, the dominant features are extracted from them, and the features are compared using algorithms, mainly a CNN that detects the variance in color and its dominance in the recorded samples. This helps in faster and more cost-effective treatment of such diseases.

Keywords: CNN, IoT, Sprayer robot, Image Processing, ZigBee Module, Precision.

I. INTRODUCTION

In recent years, robotics in the agriculture sector, implemented on the precision agriculture concept, has emerged as a new technology. The primary drivers of farming process automation are the reduction of time and energy spent on monotonous farming operations and the enhancement of yield productivity through the application of precision farming principles to each crop on an individual basis. The design of such robots is modeled on a particular approach and on considerations of the agricultural environment in which they are going to work. A prototype of an autonomous agriculture robot is presented which is specifically designed for spraying pesticides.

Robotic systems play an immense role in all sections of society, organizations and industrial units. The main concept of this research is to develop a mechanized device that helps in on-farm operations such as identifying pests using machine learning [1] and spraying at pre-designated distances and depths, with all applicable sensors for monitoring humidity and temperature.

This system has two main sections, a monitoring station and a control station, which intercommunicate using wireless ZigBee or Wi-Fi communication technologies. The control station as well as the robotic station possesses amenities such as a soil moisture sensor, an ultraviolet sensor, a robotic system with motors, an ARM microcontroller, and a power supply. The sick plants are categorized, and a camera equipped with Internet of Things technology is used to take pictures of the afflicted areas of the plants [2]. After that, pre-processing, modification, and grouping are applied to these pictures. The processor then receives these images as input and, using the CNN algorithm, compares them with the set of tested and pre-trained pests [3]. An automatic pesticide sprayer applies the pesticide to the specific area of the leaf if the UAV [4] image that was provided shows damage. If not, the image is automatically discarded by the processor, and the spraying robot [5] moves further. This project aims to make the automated system easily available to farmers, using the device for early detection of diseases in plants. Robotics is included in this system: the images are captured using the drone, and processing of the images is done by the processor integrated in it. After evaluation of the diseases, the real-time results are sent to the farmer/owner of the field through a Bluetooth HC-05 Android app [6] and displayed on the LCD.

Disease detection involves digital image capture, image pre-processing (noise removal, color transformation, and histogram equalization), K-means segmentation, feature extraction, and classification using the support vector machine algorithm, a supervised learning algorithm. The processing done with these components has two stages. The first processing stage is the training phase, often known as the offline phase. During this stage, an image analyzer examines a set of input photographs of leaves (both damaged and healthy), and specific features are extracted. The classifier then receives these attributes as input together with the information identifying whether each image depicts a healthy or diseased leaf. The classifier thus ascertains the relationship between the extracted features and the potential conclusion about the existence of the disease [7]. The system is trained as a result.
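As a concrete illustration of the two-phase flow described above, the following is a minimal Python sketch (not the authors' code) of the offline training and online classification stages, assuming OpenCV and scikit-learn are available; the file names, labels, and the simple HSV color-histogram feature are placeholder assumptions.

    # Illustrative sketch of the offline (training) and online (classification)
    # phases. File names, labels, and the color-histogram feature are
    # placeholder assumptions, not the authors' exact implementation.
    import cv2
    import numpy as np
    from sklearn.svm import SVC

    def extract_features(image_bgr):
        """One simple leaf feature: a normalized HSV color histogram."""
        hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
        hist = cv2.calcHist([hsv], [0, 1], None, [16, 16], [0, 180, 0, 256])
        return cv2.normalize(hist, hist).flatten()

    # Offline phase: train on labeled leaf images (0 = healthy, 1 = diseased).
    train_paths = ["healthy_leaf.jpg", "diseased_leaf.jpg"]  # placeholder data
    train_labels = [0, 1]
    X = np.array([extract_features(cv2.imread(p)) for p in train_paths])
    clf = SVC(kernel="rbf").fit(X, train_labels)

    # Online phase: classify a newly captured image.
    query = extract_features(cv2.imread("captured_leaf.jpg"))
    print("diseased" if clf.predict([query])[0] == 1 else "healthy")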
II. LITERATURE SURVEY

From the literature survey carried out (summarized in Table 1), Paper 1 discusses the key components of IoT-based smart agriculture systems, such as sensor nodes, communication protocols, data analytics platforms, and control mechanisms. It highlights the importance of each component and its functioning in optimizing agricultural practices. Paper 2 covers the role of OpenCV in analyzing images or video feeds captured from agricultural fields to identify pests or signs of pest infestation, including various image processing techniques such as segmentation, feature extraction, and classification. Paper 3 covers the integration of AI technology with UAVs for pest recognition, discussing the hardware and software components of the AI drone system, including the UAV platform, on-board cameras, computational resources, and software algorithms for image analysis. Paper 4 covers the process of training a deep learning model on the dataset and fine-tuning its parameters to optimize performance, and presents results from performance evaluations, including metrics such as accuracy and precision. These papers have respective drawbacks such as connectivity issues, improper detection, outdated technology, and problems in retrieval of the given data.

Problems Recognized

From the literature survey carried out, several problems were identified in the existing technology of pest detection by drones and spraying methods. They are as follows:

Pest Infestation in Crops
Farmers often face challenges in identifying and monitoring pest infestations in their crops efficiently. Traditional methods of pest detection can be time-consuming and may not provide real-time information.

Inefficient Pesticide Use
Conventional methods of pesticide application may result in overuse or underuse, leading to increased costs and potential environmental harm. Lack of precision in pesticide application can harm non-target organisms and soil health.

Manual Monitoring and Treatment
Monitoring and treating crops manually are labor-intensive tasks, and the effectiveness of pest management may vary. Timely intervention is crucial, and delays in identifying and addressing pest issues can lead to crop losses.

Table 1: Comparison of various IEEE papers
Suggested Solution

Following the difficulties noted in the current system, the following solutions are proposed:
Creation and advancement of an efficient, automatic system to detect diseases in affected plants through image processing of the pests, which minimizes the required professional intervention and thus reduces the expenditure involved in spraying pesticides by traditional methods.
The classification software classifies the various pests, while the hardware consists of a sprayer bot communicating through a wireless module. This in turn increases efficiency and precision as required: the efficiency on the software side comes from the CNN machine learning, and the precision comes from the nozzle of the sprayer.
Hardware and Software Components:
The hardware and software requirements are depicted in the component diagram of Figure 1.
The project's hardware requirements are:
Arduino Uno
DC Motor
Water Pump
Power Supply Module
Relay
ZigBee Module
Ultrasonic Sensor
Bluetooth Module
Soil Moisture Sensor
Solar Panel
Battery
LCD (16x2)
Motor Drive Circuit
The project's software requirements are:
Arduino IDE
TensorFlow
OpenCV
Flask
Python
Embedded C
The motor drive circuit is responsible for the shaft direction. Fig 2 shows the practical model implementation, where the long vertical yellow pole is the shaft responsible for the movement of the pesticide spraying nozzle; the practical model is built according to the connection diagram.

Fig 2: Practical model
III. METHODOLOGY

A. Hardware

Proposed Methodology:
The primary goal of this project is to provide precise farming equipment. The hardware component helps minimize the use of pesticides and aids precise spraying, since it uses a full-cone nozzle, while the software component gives an accurate and precise classification of the pests. The project's block diagram, depicted in Figure 1, is the connection diagram, which includes the different motors and sensors needed for movement and sensing.

Fig 1: Connection diagram

The sensing and all motion control of the robot are processed by the Arduino UNO microcontroller.
The Arduino UNO is connected to the soil moisture sensor and the ultrasonic sensor: the soil moisture sensor is employed to ascertain the percentage of water content in the soil, and the ultrasonic sensor is employed to ascertain the amount of liquid present in the pesticide tank.
Power is supplied to the Arduino UNO through an external rechargeable battery source with a 12V supply.
The battery is connected to the power supply module, which is needed because the various electronic components require different voltages (for example, the Arduino UNO requires 5V, while the DC motors require 12V).
The solar panel is connected to the battery; during a sunny day it produces a good 12V, which helps recharge the battery.
A Zener diode is connected between the solar panel and the battery to prevent current from flowing in the reverse direction, from the battery to the solar panel.
The 16x2 LCD screen is connected to the microcontroller to provide visible output to the user.
The Arduino Uno controls the DC motors through the motor drive circuit.
The ZigBee module is the device used for wireless communication between the robot and the laptop (a laptop-side control sketch follows this list).
Three DC motors are utilized in the project; two of them are employed to move the robot, while the third moves the sprayer's shaft, so that the sprayer's height can be changed.
The DC motors are controlled using motor drivers. Motor drivers are used because the DC motors require a 12V supply; the Arduino board is integrated with the motor driver, which enables the Arduino to control the DC motors operating on the 12V supply (an L290D IC is used in the motor drive controller).
The L290D IC is responsible for power amplification and also enables bidirectional movement of the robot.
The Bluetooth module is also integrated with the Arduino Uno for movement control of the robot.
The Bluetooth module allows wireless control of the robot, so that it can be controlled from the user's phone.
The relay is connected to the Arduino to control the water pump; the relay is used to overcome the voltage difference between the Arduino and the water pump. The pump end is connected to the nozzle for spraying.
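To make the wireless control concrete, here is a minimal laptop-side sketch in Python, assuming the pyserial package and a simple one-character command protocol (F to drive forward, S to spray); the port name and the protocol are illustrative assumptions, not the project's documented interface.

    # Illustrative laptop-side controller for the sprayer robot over the
    # ZigBee/Bluetooth serial link. The port name and one-character command
    # protocol are assumptions made for this sketch.
    import serial

    PORT = "/dev/ttyUSB0"  # placeholder: ZigBee/Bluetooth serial adapter

    with serial.Serial(PORT, baudrate=9600, timeout=1) as link:
        link.write(b"F")         # drive the wheel motors forward
        reply = link.readline()  # e.g. soil-moisture or tank-level telemetry
        print("robot:", reply.decode(errors="replace").strip())
        link.write(b"S")         # energize the relay to run the water pump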
B. Software:

Image Pre-Processing:
The images obtained from the camera are subjected to pre-processing to increase their quality. The pre-processing steps may include color transformation, noise removal, histogram equalization, green masking, etc. The technique of color transformation is used to increase the quality of the image: the RGB image is converted to grey and also to HSI (a short illustrative sketch follows this section).

Feature Extraction:
An image has many features, mainly color, texture and shape. Three features are mainly considered: the color histogram, and texture and shape descriptors.

Classification Using CNN:
In this module, the images are classified by a Convolutional Neural Network classifier. To assess the relevant elements and identify distinguishing characteristics for pest identification, a number of features are combined.

Results and Analysis:
The outcomes, which come from suitable training of the CNN model, are far more effective than those of earlier models. The study shows the kinds of pests that are found, and the model's effectiveness is gauged by how well it can detect the pests.
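The pre-processing steps named above can be illustrated with a short OpenCV sketch; the grayscale conversion, histogram equalization, blur-based noise removal, and HSV green mask below are one plausible realization under assumed threshold values, not the project's exact pipeline.

    # Illustrative OpenCV pre-processing: color transformation, histogram
    # equalization, noise removal, and green masking. The input file and the
    # HSV green range are placeholder assumptions.
    import cv2
    import numpy as np

    img = cv2.imread("captured_leaf.jpg")              # BGR image from camera
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)       # RGB -> grey transform
    equalized = cv2.equalizeHist(gray)                 # histogram equalization
    denoised = cv2.GaussianBlur(equalized, (5, 5), 0)  # simple noise removal

    # Green masking: keep leaf pixels, suppress soil and background.
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([25, 40, 40]), np.array([95, 255, 255]))
    leaf_only = cv2.bitwise_and(img, img, mask=mask)
    cv2.imwrite("leaf_preprocessed.jpg", leaf_only)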
Software Applications

Arduino IDE:
The Arduino IDE software is used to program the hardware components of the project, which includes the coding for the sensors, the movement of the sprayer bot, and the shaft movement of the sprayer.
The software is written in Embedded C; the Arduino IDE is the main interface between the hardware and the user, through which the user can program the required behavior into the hardware.

Fig 4: Arduino IDE software sketch screen

Dataset Collection:
The dataset is collected by obtaining images with the drone-mounted digital camera connected to the laptop.
The captured images are subjected to further pre-processing.
Data augmentation is also a part of dataset collection: the captured set of images is augmented to the resolution required by the user (one possible approach is sketched below).

Fig 3: Steps involved in image processing
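One possible way to carry out the augmentation step is sketched here with Keras's ImageDataGenerator (an assumed tool choice; the directory layout and parameter values are also placeholders):

    # Illustrative data augmentation for the collected drone images.
    # Directory layout and parameter values are placeholder assumptions.
    from tensorflow.keras.preprocessing.image import ImageDataGenerator

    augmenter = ImageDataGenerator(
        rescale=1.0 / 255,     # normalize pixel values
        rotation_range=20,     # random rotations
        horizontal_flip=True,  # mirrored leaves
        zoom_range=0.2,        # simulates varying drone altitude
    )

    # Streams (image, label) batches resized to the resolution the user needs.
    batches = augmenter.flow_from_directory(
        "dataset/train", target_size=(128, 128),
        batch_size=32, class_mode="categorical",
    )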
TensorFlow:
TensorFlow is a free and open-source software library for artificial intelligence and machine learning. While it may be applied to many different tasks, its main focus is on deep neural network training and inference.
The Google Brain team created it for internal use in research and production at Google. It was first released in 2015 under the Apache License 2.0. In September 2019, Google published TensorFlow 2.0, an improved version.
TensorFlow is compatible with a large number of programming languages, such as Python, JavaScript, C++, and Java, which makes it useful for a wide range of applications across numerous industries.

Fig 5: TensorFlow program snippet

Fig 5 shows the Python code for the machine learning CNN, where the many required libraries are imported.
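Since Fig 5 itself is not reproduced here, the following minimal Keras sketch indicates the kind of CNN it depicts; the layer sizes, input resolution, and the assumed four pest classes are illustrative, not the exact network of Fig 5.

    # Minimal illustrative Keras CNN for pest classification. Layer sizes,
    # input resolution, and the 4-class output are assumptions.
    from tensorflow.keras import layers, models

    NUM_CLASSES = 4  # placeholder: number of pest categories in the dataset

    model = models.Sequential([
        layers.Input(shape=(128, 128, 3)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()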
OpenCV:
Programming functions for real-time computer vision are what the OpenCV (Open Source Computer Vision Library) library mostly provides. It was initially created by Intel and later supported by Willow Garage and Itseez (which Intel later purchased).
The cross-platform library is available under the Apache License 2 as free and open-source software. OpenCV has had GPU acceleration for real-time operations since 2011.
Among the application areas of OpenCV are:
Toolkits for 2D and 3D features
Estimation of egomotion
Facial recognition systems
Fig 6 shows the code snippet for the required OpenCV usage: it shows how the real-time part of the program is run, and the smaller snippet shows how OpenCV is implemented.

Fig 6: OpenCV program snippet

Flask:
Flask is a Python-based micro web framework. It is categorized as a microframework since it doesn't require any particular libraries or tools.
It has no form validation, database abstraction layer, or other components for which existing third-party libraries already provide common functionality. However, Flask supports extensions that add application features as if they were implemented in Flask itself. Extensions exist for object-relational mappers, form validation, upload handling, multiple open authentication protocols, and other common framework utilities.
Flask's features include:
Development server and debugger
Jinja templating
Comprehensive support for unit testing
RESTful request dispatching
Support for secure cookies
Fig 7 gives the code snippet of the web application built in Flask for the project (a minimal sketch of such an app follows below).

Fig 7: Code snippet for the Flask web application
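To give a flavor of the kind of web application Fig 7 shows, here is a minimal hedged Flask sketch of an upload-and-classify endpoint; the route name, model file, and class labels are assumptions for illustration, not the project's actual code.

    # Minimal illustrative Flask app: the user uploads a pest image and the
    # CNN's predicted class is returned. Route, model file, and class names
    # are placeholder assumptions.
    import numpy as np
    from flask import Flask, request, jsonify
    from tensorflow.keras.models import load_model
    from tensorflow.keras.preprocessing.image import img_to_array, load_img

    app = Flask(__name__)
    model = load_model("pest_cnn.h5")                    # assumed trained model
    CLASSES = ["aphid", "mite", "whitefly", "healthy"]   # assumed labels

    @app.route("/predict", methods=["POST"])
    def predict():
        request.files["image"].save("upload.jpg")        # image from web page
        x = img_to_array(load_img("upload.jpg", target_size=(128, 128))) / 255.0
        probs = model.predict(np.expand_dims(x, axis=0))[0]
        return jsonify({"pest": CLASSES[int(np.argmax(probs))],
                        "confidence": float(np.max(probs))})

    if __name__ == "__main__":
        app.run(debug=True)  # Flask's built-in development server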
FLOW CHART

A flow chart depicting the stages involved in IoT Based Precision Farming is illustrated in Fig 8. This flowchart gives a detailed explanation of the working of the model.

Fig 8: Flow chart of the process of IoT Based Precision Farming
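The loop the flowchart describes can be outlined in Python; every helper below is a stub standing in for a stage discussed earlier (capture, CNN classification, spray command), so this is a sketch of the control flow under assumed thresholds rather than working project code.

    # Outline of the flowchart's control loop. Each helper is a stub for a
    # stage described earlier; thresholds and return values are assumptions.
    import time

    def capture_image():
        return "captured_leaf.jpg"    # stub: frame from the drone camera

    def classify(path):
        return "aphid", 0.91          # stub: CNN label and confidence

    def send_spray_command():
        print("spray command sent")   # stub: serial 'S' command to the robot

    CONFIDENCE_THRESHOLD = 0.8        # assumed decision threshold

    for _ in range(3):                # stands in for the continuous field loop
        label, confidence = classify(capture_image())
        if label != "healthy" and confidence >= CONFIDENCE_THRESHOLD:
            send_spray_command()      # pest detected: targeted spraying
        else:
            print("no pest; robot moves to the next plant")
        time.sleep(1)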
IV. RESULTS

Figure 9(a) depicts the software page of the project, where the pest images are inserted for testing. Fig 9(b) depicts the output for the given image along with the required precision, Fig 9(c) depicts the sprayer tank level obtained with the help of the ultrasonic sensor, Fig 9(d) shows the pest detection by the hardware, and Fig 9(e) shows the soil moisture level obtained with the help of the sensor.

In this model, IoT Based Precision Farming is done by both hardware and software methods. The software parts and their results are depicted in Fig 9(a) and Fig 9(b), where the system identifies and classifies the pests, while Figures 9(c), 9(d) and 9(e) depict the hardware results. The hardware model is shown in Fig 9(f).

Fig 9(a): Web application page
Fig 9(b): Software result of the classification
Fig 9(c): Sprayer tank level
Fig 9(d): Pest detected by the hardware
Fig 9(e): Soil moisture level
Fig 9(f): The hardware practical model
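Where the model's effectiveness is gauged by its detection quality, metrics such as the accuracy and precision mentioned in the literature survey can be computed as in this small sketch; the label arrays are invented example data, not the project's measurements.

    # Illustrative computation of evaluation metrics. The ground-truth and
    # predicted labels below are invented example data.
    from sklearn.metrics import accuracy_score, precision_score

    y_true = [1, 0, 1, 1, 0, 1, 0, 0]  # 1 = pest present, 0 = healthy
    y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

    print("accuracy :", accuracy_score(y_true, y_pred))
    print("precision:", precision_score(y_true, y_pred))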
V. CONCLUSION

• The prototype is non-invasive, low-cost and user-friendly.
• This prototype may be substantially less expensive than commercially sold products. The prototype's efficacy and efficiency result in improved power control and little energy loss.
• This self-powered device uses solar power and is connected to the central gateway using the communication protocol.
• The experiment effectively illustrates how precision agriculture and image processing may revolutionize pest detection and management by applying pesticides precisely and intervening early.
• The automated and data-driven method encourages environmental sustainability in addition to improving efficiency and cost-effectiveness.
ACKNOWLEDGEMENT

We would like to convey our heartfelt appreciation to our Head of Department, Dr. M Rajeswari, our Project Guide, Prof. Sudha B, and Dr. Boraiah, for their invaluable assistance and advice throughout this project.
REFERENCES

[1] Dipti D. Desai, Priyanka B. Patil, "A review of the literature on IoT based smart agriculture monitoring and control system on IoT", 2023.
[2] Aishwarya M S, Karthik K, Rachana N, Nandan D, "A Survey on Pest Detection System", on OpenCV, IoT, 5G technology, 2022.
[3] Abhishek Kamal, Adarsh, Aviral Kumar Gopal, "Pest Recognition on UAV: AI Drone", on cutting-edge recognition technology, 2022.
[4] A. G. Mazare, L. M. Lonescu, D. Visan, "Pest detection system for agricultural crops using intelligent image analysis", on deep learning artificial neural networks, 2021.
[5] Ankit Singh, Abhishek Gupta, Akash Bhosale, Sumeet Poddar, "Agribot: An Agriculture Robot", International Journal of Advanced Research in Computer and Communication Engineering, Vol. 4, Issue 1, January 2015.
[6] N. Firthous Begum, P. Vignesh, "Design and Implementation of Pick and Place Robot with Wireless Charging Application", International Journal of Science and Research (IJSR), 2013.
[7] Buniyamin N, Wan Ngah W. A. J., Sariff N, Mohamad Z, "A Simple Local Path Planning Algorithm for Autonomous Mobile Robots", International Journal of Systems Applications, Engineering & Development, Issue 2, Volume 5, 2011.