Final Blackbook
of the degree of
By
University of Mumbai
2016-17
CERTIFICATE OF APPROVAL
is a bonafide work of
Submitted to the University of Mumbai in partial fulfillment of the requirement for the award of the
degree of
ii
PROJECT REPORT APPROVAL FOR B. E.
Examiners
1. --------------------------------------------
2. --------------------------------------------
Date:
Place:
iii
Declaration
I declare that this written submission represents my ideas in my own words and where others'
ideas or words have been included, I have adequately cited and referenced the original sources. I
also declare that I have adhered to all principles of academic honesty and integrity and have not
misrepresented or fabricated or falsified any idea/data/fact/source in my submission. I understand
that any violation of the above will be cause for disciplinary action by the Institute and can also
evoke penal action from the sources which have thus not been properly cited or from whom
proper permission has not been taken when needed.
Date:
iv
Acknowledgement
A project is a planned set of interrelated tasks to be executed over a fixed period and within certain cost and other limitations. The success of every project depends on a number of factors, such as the team working on it, the guide, and the people who provide support and inspiration.
We would like to take the opportunity to express our sincere thanks to our project guide Prof.
Santosh Tamboli for his valuable guidance and support throughout the process. He has always
been there to help us with doubts and difficulties. His guidance has played a very important role
for us.
Also, for arranging timely reviews and giving us proper prior instruction about the project events,
we would like to thank Prof. Deepali Nayak and Prof. Kanchan Dhuri.
Lastly, we would like to thank all those who have been a part of this project and have helped make it a success so far.
SHUBHANGEE KANADE
POOJA SINGH
v
Table of Contents
Chapter no. Page title Page No.
Abstract vii
List of figures viii
List of tables ix
1 Project overview 1
2 Introduction and Motivation 3
2.1. Theory behind the project 4
2.2. Problem definition 6
2.3. Need for project 7
3 Literature Survey 8
4 Analysis and Design 12
4.1. Process Model 13
4.2. Flow of project 16
4.3. Feasibility study 17
4.4. Cost analysis 19
4.5. Data Flow Diagrams 20
4.6. UML Diagrams 25
4.7. Technologies used 31
4.8. System architecture 34
5 Implementation 36
5.1. Graphical user interface 37
5.2. Code 43
5.3. Test cases 62
6 Conclusion and future scope 64
6.1. Conclusion 65
6.2. Future scope 66
7 References 67
Abstract
The proposed system is a mobile application based on speech synthesis and speech recognition. It guides blind people travelling by local trains through features such as train schedule, ticket fare information, access to the current location, and notifications. When the application starts, it speaks out all the menus (the features mentioned above) listed on the menu screen and then waits for user input. To get train schedule information, the user speaks the source and destination stations of the journey and receives, one by one, the trains arriving at the source station that go to the specified destination. The same procedure gives ticket fare information, i.e. the user again only needs to provide the source and destination stations. To know the current location, the user just speaks the option "my current location" and receives a speech output of his or her whereabouts. Another feature, Notification, alerts the user about the next station before the train reaches it.
vii
List of Figures
viii
List of Tables
ix
PROJECT
OVERVIEW
1
1.Project Overview:
The project helps blind or visually impaired people to have a better experience while travelling by local trains, as the application provides voice-based responses. When blind people travel by local trains they find themselves in difficulty: they need someone's assistance to know which train is arriving at the platform, and they need help to find the correct platform unless there is an announcement for it. Sometimes they also have to ask people around them about the next station so that they can get down at their desired destination. This project addresses the problem by providing all the features of M-Indicator in voice-based form, such as train schedule information (train arrival time, the platform at which the train will arrive at the source as well as the destination station, and the time at which the train reaches the destination) and ticket fare information (single journey ticket, one-month pass, and quarterly pass, for both first class and second class). Along with these two features, some additional features, namely information about the current location and notification of the upcoming railway station, also come under the project scope. Overall, the project gives users a completely new experience of an application that serves as a guide while travelling by local trains.
2
INTRODUCTION
AND
MOTIVATION
3
2.Introduction & Motivation
2.1. Introduction
Android powers hundreds of millions of mobile devices in more than 190 countries around the
world. It's the largest installed base of any mobile platform and growing fast—every day another
million users power up their Android devices for the first time and start looking for apps, games,
and other digital content. Android gives us a world-class platform for creating apps and games for
Android users everywhere, as well as an open marketplace for distributing to them instantly.
As per the World Health Organization (WHO) statistics of June 2012, there are a total of 285 million visually impaired individuals across the globe, with 39 million blind and 246 million with low vision. Approximately 90% of these people live in developing countries, and 82% of blind people are aged 50 and above.
One of the most common problems that many blind and visually impaired people experience is the day-to-day challenge of coping with their impairment. Equipment such as Braille, reading glasses, or a walking stick are just some of the things that help visually impaired people get along with their lives.
When it comes to transportation, there are many more problems that blind people face, especially while using local transport. For sighted people living in a city like Mumbai, use of local transport, specifically local trains, has become simpler after the launch of an application like M-Indicator. M-Indicator provides many features, a few of which are schedules for suburban trains (covering all four lines: Central, Western, Harbour and Trans-Harbour), the platform number at which a particular train will halt, and the position of the door from which to exit. All these features help us travel by train. But the problem with M-Indicator is that it is entirely GUI based, which makes it almost impossible for blind and visually impaired people to use.
With the advancement of technology, a common Android smartphone equipped with specific applications can aid visually impaired and blind people in daily functioning. Nowadays Android provides a built-in screen reader and voice recognition, making it easier for blind people to use smartphones. Still, these features alone do not let blind people use applications like M-Indicator. So we are building an application named B-Indicator, where B stands for Blind, which converts the GUI-based application into one that blind people can use conveniently. It will help blind and visually impaired people by providing a more user-friendly environment and better user satisfaction. It helps blind people get all related information, such as the train schedule and the current location, and lets them travel by train like sighted people. Not only blind people but also sighted people can use this application to reduce the effort of typing manually.
The project provides all its features in voice-based form. Using the speech recognition and speech synthesis approach, the proposed system will guide and help the blind and visually impaired in the same way that M-Indicator helps sighted people.
Our objective is to enable blind people to travel and use local transport just like sighted people, by overcoming the difficulties they usually face while travelling through Mumbai by local trains.
5
2.2. Problem Definition:
B-Indicator is a concept that converts the GUI-based application into one that blind people can use conveniently. It will help blind and visually impaired people by providing a more user-friendly environment and better user satisfaction. It helps blind people get all related information, such as the train schedule and the current location, and lets them travel by train like sighted people. Not only blind people but also sighted people can use this application to reduce the effort of typing manually.
Since interaction will be easy, both the blind and the visually impaired benefit as follows:
1. The number of blind people using local trains to travel may increase, as they will already have an application to guide them.
6
2.3. Need for the Project
When we are at a new place and travelling by local transport, especially local trains, people in Mumbai usually take the help of M-Indicator and travel easily. But blind people are not able to use M-Indicator as it is GUI based. Blind people who are new to Mumbai, or who want to travel, may use transportation services such as train, bus, auto, taxi, metro, monorail and ferry. While travelling by train, however, they may not know when the train will arrive and on which platform unless there is an announcement or they ask someone.
So we have made an Android application named "B-Indicator" for blind people, which overcomes the difficulties they face while travelling by local trains by providing speech input and output, making them feel at ease while travelling.
7
LITERATURE
SURVEY
8
3. Literature Survey
Android Text Messaging Application For Visually Impaired People:
This paper presents an application of Android's voice-to-text and text-to-voice capabilities on mobile devices. The messaging system presented in the paper is voice enabled: the application listens to your messages and then responds with voice commands by talking. It converts your text into voice and your voice into text.
Implementing speech synthesis and speech recognition functions is quite easy in an Android application because of the built-in support provided by the Android platform's Google APIs. Speech synthesis is used for converting text into voice; similarly, speech recognition is used for converting voice input into text.
The application provides total voice interaction, i.e. it provides a guide consisting of voice-based instructions in which the voice commands for performing various operations are explained to the user. The user interacts with the application entirely through voice commands, so it provides a better user interface and interaction facility. All notifications and alerts received from SMS are processed into voice by the application.
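As a rough illustration of this built-in support, the following sketch (our own, not taken from the surveyed paper; the class name and request code are arbitrary) shows how Android's TextToSpeech and RecognizerIntent APIs are typically wired together. The same two APIs are the basis of the B-Indicator code presented later in this report.

import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.speech.RecognizerIntent;
import android.speech.tts.TextToSpeech;

// Minimal sketch of Android's built-in speech synthesis and recognition support.
public class SpeechDemoActivity extends Activity implements TextToSpeech.OnInitListener {

    private static final int REQ_SPEECH = 100;   // arbitrary request code
    private TextToSpeech tts;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        tts = new TextToSpeech(this, this);       // engine is ready once onInit() fires
    }

    @Override
    public void onInit(int status) {
        if (status == TextToSpeech.SUCCESS) {
            // speech synthesis: text in, voice out
            tts.speak("Welcome", TextToSpeech.QUEUE_FLUSH, null);
        }
    }

    private void listen() {
        // speech recognition: voice in, text out, handled by the platform recognizer
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        startActivityForResult(intent, REQ_SPEECH);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQ_SPEECH && resultCode == RESULT_OK && data != null) {
            // the first entry is the most confident transcription
            String spokenText =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS).get(0);
            tts.speak("You said " + spokenText, TextToSpeech.QUEUE_FLUSH, null);
        }
    }
}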
ARM7 based Smart bus Passenger-Alert System using GSM with GPS based Location
Identification:
This paper presents a solution to the main problems faced by passengers who use the bus as their means of transportation. The main reason for the discomfort is that unpredictable bus schedules have made it virtually impossible to estimate the time of arrival at the required destination. The paper presents an intelligent real-time alarming system which senses the destination location taken as input from the passenger. The system also includes an Android mobile alert application. When the passenger reaches the destination, the bus system alerts the passenger through an SMS to the Android application on the passenger's mobile. It is also assisted by an alarming system for excessive temperature or smoke, which can lead to fire accidents; this has become a serious problem in AC buses nowadays.
In this alert system, the input is given by the passenger to a device connected to the bus, which in turn is linked with the GPS system for location identification. Once the passenger's destination is reached, an alarm is triggered on the passenger's mobile through an SMS, sent with the help of the GSM system.
In other words, GPS can be used to access the user's current location, and the user can be notified by an SMS notification with the help of GSM.
This paper presents menu selection using hand gestures. The method uses X, Y plane translation of the fingers by holding them across a boundary; the fingers are moved in an arc shape to select a particular menu item. It also allows selection in multi-level menus and provides easy navigation between and through the menus using free hand gestures. The menu format enables users to learn and use hand gestures to perform on-screen actions, to spot icons and their meanings, and to navigate the menu. Navigation through and between the graphical user interface of the menu structure is done using gesture states such as X and Y translations of the user's finger with boundary crossing.
This paper presents a passenger bus alert system for the blind, in which a blind person at the bus station carries a ZigBee unit that is recognized by the ZigBee unit in the bus, indicating in the bus that a blind person is present at the station, so that the bus stops at that particular station. The desired bus that the blind person wants to take is notified to him with the help of the HM2007 speech recognition system. The blind person gives input about the destination using a microphone, and the voice recognition system recognizes it. The input is then analyzed by the microcontroller, which generates the bus numbers corresponding to the location provided. These bus numbers are converted into audio output using the APR9600 voice synthesizer. The ZigBee transceiver in the bus sends the bus number to the transceiver carried by the blind person, and the bus number is announced through the headphones. The blind person takes the right bus parked in front of him, and when the destination is reached it is announced by means of the GPS-634R module, which is connected to the controller and the voice synthesizer that produces the audio output. This project is also aimed at helping elderly people with independent navigation.
11
ANALYSIS
AND
DESIGN
12
4. Analysis and Design
4.1. Process Model Used for the Project:
The reason for using the Iterative model is that the proposed system basically consists of many development and testing cycles. In the Iterative model, the process starts with a simple implementation of a small set of the software requirements and iteratively enhances the evolving versions until the complete system is implemented and ready to be deployed.
After the completion of one cycle we had a better idea about the implementation and the improvements to be made in the next cycle.
An iterative life cycle model does not attempt to start with a full specification of requirements.
Instead, development begins by specifying and implementing just part of the software, which is
then reviewed in order to identify further requirements. This process is then repeated, producing a
new version of the software at the end of each iteration of the model.
13
Why the Iterative model?
• We had a clear overall idea of the project after the first cycle itself.
• Since our project had many modules to be implemented, after every iterative cycle it was clear how the features would appear and what changes needed to be implemented in the coming cycles.
14
Iterative and Incremental development is a combination of iterative design (or the iterative method) and the incremental build model. During software development, more than one iteration of the software development cycle may be in progress at the same time. This process may be described as an "evolutionary acquisition" or "incremental build" approach.
In this incremental model, the whole requirement is divided into various builds. During each
iteration, the development module goes through the requirements, design, implementation and
testing phases. Each subsequent release of the module adds function to the previous release. The
process continues till the complete system is ready as per the requirement.
15
4.2. Flow of Project
A flowchart is a type of diagram that represents an algorithm, workflow or process, showing the steps as boxes of various kinds and their order by connecting them with arrows. This diagrammatic representation illustrates a solution model to a given problem. Flowcharts are used in analyzing, designing, documenting or managing a process or program in various fields.
16
4.3. Feasibility Study:
Technical Feasibility
The implemented proposed system i.e., B-Indicator – An Android Application for blind and
visually impaired, is totally based on the android technology and our whole concept is based on
recognition of user’s voice using microphone.
Android is the most popular operating system for mobile devices and is easy to use; about 80% of the mobile devices in the current market run Android. So we have used Android as the operating system platform. To store the train schedule and ticket fare data we have used an SQLite database, as Android provides built-in support for SQLite. The other requirements for the proposed system are a laptop and an Android-enabled smartphone.
Economic Feasibility
Start-Up Costs:
In our system, the equipment needed would be a smartphone and a laptop. The total cost of acquiring this equipment would be up to Rs. 37,000 (the cost will be lower if one already possesses these items).
Source of Financing:
The source of financing would be the two students working on the project itself.
17
User Feasibility
From the user's perspective, our system is very user friendly: the user only has to carry the smartphone and open the application. Once the application opens, it speaks out all its features and asks the user to select one of them. The moment the user calls a feature, the application navigates to the corresponding activity and similarly keeps acting on the user's commands. In short, the application recognizes every command of the user and provides an instant response in the form of speech.
18
4.4. Cost Analysis:
LOC Calculation:
Folder                      SLOC
TOTAL (across 5 folders)    6039
19
4.5. DFD (DATA FLOW DIAGRAM)
A data flow diagram (DFD) is a graphical representation of the "flow" of data through
an information system, modelling its process aspects. A DFD is often used as a preliminary step to
create an overview of the system without going into great detail, which can later be elaborated.
DFDs can also be used for the visualization of data processing (structured design).
A DFD shows what kind of information will be input to and output from the system, how the data
will advance through the system, and where the data will be stored.
LEVEL 0 DFD
DFD Level 0 is also called a Context Diagram. It’s a basic overview of the whole system or
process being analyzed or modeled. It’s designed to be an at-a-glance view, showing the system
as a single high-level process, with its relationship to external entities. It should be easily
understood by a wide audience, including stakeholders, business analysts, data analysts and
developers.
Level 0 shows the basic flow of the B-Indicator system: the user makes a query request and gets the query response as output.
20
LEVEL 1 DFD
21
DFD Level 1 provides a more detailed breakout of pieces of the Context Level Diagram.
You will highlight the main functions carried out by the system, as you break down the
high-level process of the Context Diagram into its sub-processes.
Here in Level 1 of B-Indicator, the main functions of the system are highlighted, such as getting information related to the train schedule and to the user's current location.
22
LEVEL 2 DFD
23
DFD Level 2 then goes one step deeper into parts of Level 1. It may require more text to reach the necessary level of detail about the system's functioning.
Here in Level 2 the system remains the same as in Level 1 but goes one step deeper: it shows that a train-related query can be either a train schedule query or a ticket fare query. Similarly, a location-based query can be further divided into current-location access and notification of the coming station.
24
4.6.UML diagrams
UML stands for Unified Modeling Language, which is used in object-oriented software engineering. Although typically used in software engineering, it is a rich language that can be used to model an application's structure, behaviour and even business processes.
25
A use case is a list of actions or event steps, typically defining the interactions between a role (known in the Unified Modeling Language as an actor) and a system, to achieve a goal. The actor can be a human or another external system.
Following is the explanation of the use case diagram of the B-Indicator system:
Title (goal): To obtain train schedule, ticket fare, current location and next-station information through speech.
Primary Actor: User.
Scope: Used for railway transportation by blind people in Mumbai.
Level: User level.
Story: The user can open the application anywhere and at any time to get information about local trains. The user opens the application and chooses any feature among train schedule information, ticket fare, current location information and notification of the next coming station. In order to use the current location and notification features, the user needs an internet connection and the location service enabled on his/her smartphone.
26
Figure 4.7.1: Activity for notification Figure 4.7.2: Activity for train schedule
27
Figure 4.7.4: Activity diagram for Project(B-Indicator)
28
An activity diagram is basically a flow chart that represents the flow from one activity to another.
Following is the explanation of the activity diagram of B-Indicator:
• After initialization of the app, the user is given a choice of feature selection.
• The application takes the input from the user and, if the input is valid, the corresponding command gets executed.
• After processing the user input, the output is displayed and spoken by the speaker.
29
4.6.3: Class Diagram
30
4.7. Technologies Used
4.7.1. Hardware
Smart Phone
4.7.2. Software
SQLite Browser
31
4.7.2.1. SQLite
• "Stand-alone" or "self-contained"
• Provides a lightweight disk-based database that doesn't require a separate server process
• "Zero-configuration" database engine
• Multi-platform
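As a minimal sketch of how such an embedded database can be read on Android, a helper class might look like the one below. Only the table name towards_panvel appears in the project code; the database name, column names and schema used here are assumptions for illustration.

import android.content.Context;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

// Illustrative helper only; the real schema of the project's database may differ.
public class ScheduleDbHelper extends SQLiteOpenHelper {

    public ScheduleDbHelper(Context context) {
        super(context, "bindicator.db", null, 1);   // database name assumed
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        // one row per train: departure time in minutes since midnight, platform, destination
        db.execSQL("CREATE TABLE IF NOT EXISTS towards_panvel ("
                + "train_time INTEGER, platform INTEGER, destination TEXT)");
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        db.execSQL("DROP TABLE IF EXISTS towards_panvel");
        onCreate(db);
    }

    // Trains leaving at or after the given time (minutes since midnight), earliest first.
    public Cursor trainsAfter(int currTime) {
        SQLiteDatabase db = getReadableDatabase();
        return db.rawQuery(
                "SELECT train_time, platform, destination FROM towards_panvel"
                        + " WHERE train_time >= ? ORDER BY train_time",
                new String[]{String.valueOf(currTime)});
    }
}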
4.7.2.2. Android
32
JAVA Programming Language.
Android applications are developed using the Java language. As of now, that’s really your only
option for native applications. Java is a very popular programming language developed by Sun
Microsystems (now owned by Oracle). Developed long after C and C++, Java incorporates many of
the powerful features of those powerful languages while addressing some of their drawbacks. Still,
programming languages are only as powerful as their libraries. These libraries exist to help
developers build applications.
Android relies heavily on these Java fundamentals. The Android SDK includes many standard Java
libraries (data structure libraries, math libraries, graphics libraries, networking libraries and
everything else you could want) as well as special Android libraries that will help to develop
awesome Android applications.
33
4.8. System Architecture
34
A system architecture is a conceptual model that defines the structure, behavior, and more views of
a system. An architecture description is a formal description and representation of a system,
organized in a way that supports reasoning about the structures and behaviors of the system. A
system architecture can comprise system components that will work together to implement the
overall system.
Hardware Components:
The hardware components consist of the smartphone's microphone as the voice recognizer (input), and the screen and speaker of the phone, which are used for output.
Mic:
The mic is used as the input sensor for recognizing the user's voice. Once the voice is recognized, it is processed, the corresponding query is fired, and the output is spoken by the speaker of the Android smartphone.
Speaker:
The speaker of the Android smartphone is used to speak the output of the query execution.
35
IMPLEMENTATION
36
5.Implementation
37
5.1. Graphical User Interface
2) Displaying Train schedule input screen
38
3) Displaying Train schedule information
39
4) Ticket Fare option screen 5) Ticket Fare input screen
40
6) Displaying Ticket fare information 7) Displaying current location of user
41
8) Displaying Notification of coming station
42
5.2.Code
AndroidManifest.xml
<application
android:allowBackup="true"
android:icon="@drawable/app_icon"
android:label="@string/app_name"
android:supportsRtl="true"
android:theme="@style/AppTheme">
<activity android:name=".Dao.Splash_Screen">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
<activity android:name=".Dao.Train_Src_Dest" />
<activity android:name=".Dao.MainActivity" />
<activity android:name=".Dao.ShowTrainSchedule" />
<activity android:name=".Dao.ShowFare" />
<activity android:name=".Dao.UserLocation" />
<activity android:name=".Dao.Notification_Enable" />
</application>
</manifest>
43
activity_main.xml
<!-- opening tag restored to match the closing tag below -->
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">
<ListView
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:id="@+id/list1"
android:layout_alignParentTop="true"
android:layout_alignParentStart="true" />
<TextView
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:inputType="text"
android:id="@+id/opt_txt"
android:textColor="#FFFFFF"
android:layout_alignParentBottom="true"
android:layout_alignParentStart="true" />
</RelativeLayout>
44
MainActivity.java
package com.example.mahesh.bindicator.Dao;
import android.annotation.TargetApi;
import android.content.ActivityNotFoundException;
import android.content.Intent;
import android.content.res.TypedArray;
import android.os.Build;
import android.os.Bundle;
import android.os.Handler;
import android.speech.RecognizerIntent;            // added: required by the code below
import android.speech.tts.TextToSpeech;            // added: required by the code below
import android.support.v7.app.AppCompatActivity;   // added: required by the code below
import android.view.KeyEvent;                      // added: required by the code below
import android.widget.AdapterView;                 // added: required by the code below
import android.widget.ListView;                    // added: required by the code below
import android.widget.TextView;
import com.example.mahesh.bindicator.R;
import com.example.mahesh.bindicator.pojo.RowItem;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Locale;

// class declaration restored from the listeners used below
public class MainActivity extends AppCompatActivity
        implements TextToSpeech.OnInitListener, AdapterView.OnItemClickListener {

ListView myListView;
List<RowItem> rowItems;
String[] text1;
TypedArray icon;
TextToSpeech textToSpeech; // text to speech object
String txt_speech;
private final int SPEECH_RECOGNITION_SRC_CODE = 1;
private TextView opt_txt;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
rowItems = new ArrayList<RowItem>();
text1 = getResources().getStringArray(R.array.titles);
icon = getResources().obtainTypedArray(R.array.icon);
myListView.setOnItemClickListener(this);
}
}
@Override
public void onDestroy() {
textToSpeech.shutdown();
super.onDestroy();
}
@Override
public void onInit(int Text2SpeechCurrentStatus) {
if (Text2SpeechCurrentStatus == TextToSpeech.SUCCESS) {
textToSpeech.setLanguage(Locale.US);
// button.setEnabled(true);
txt_speech = "Welcome to B Indicator. press volume up button and speak one of the option
Train Schedule Fare my current location Notification User Query ";
TextToSpeechFunction(txt_speech);
}
}
//End text to speech
//voice recognition
@Override
public boolean dispatchKeyEvent(KeyEvent event) {
int action = event.getAction();
int keyCode = event.getKeyCode(); //get volume event
switch (keyCode) {
case KeyEvent.KEYCODE_VOLUME_UP:
if (action == KeyEvent.ACTION_DOWN) {
opt_txt.setFocusable(true);
opt_txt.setEnabled(true);
opt_txt.setCursorVisible(true);
startSpeechToTextOpt();
//delay function
Handler delayHandler= new Handler();
Runnable r=new Runnable()
{
@Override
public void run() {
return super.dispatchKeyEvent(event);
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
}
//after choosing option goto next activity
private void goToNextActivity(String str){
if(str.equalsIgnoreCase("Train Schedule")){
Intent intent=new Intent(MainActivity.this, Train_Src_Dest.class);
startActivity(intent);
}
else if(str.equalsIgnoreCase("Fare") || str.equalsIgnoreCase("Fair")){
Intent intent=new Intent(MainActivity.this, FareOption.class);
startActivity(intent);
}
else if(str.equalsIgnoreCase("My Current Location")){
Intent intent=new Intent(MainActivity.this, UserLocation.class);
startActivity(intent);
}
else if(str.equalsIgnoreCase("Notification")){
Intent intent=new Intent(MainActivity.this, Notification_Enable.class);
startActivity(intent);
}
else if(str.equalsIgnoreCase("User Query")){
Intent intent=new Intent(MainActivity.this, ClientSocket.class);
startActivity(intent);
}
}
}
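The listing above calls TextToSpeechFunction() and startSpeechToTextOpt() without showing their bodies, which fall outside this excerpt. A plausible sketch of the two helpers, inferred from how they are used, is given below; the actual implementations may differ.

// Sketch only: the real bodies are not part of the excerpt above.
private void TextToSpeechFunction(String text) {
    // speak the given text through the phone speaker
    textToSpeech.speak(text, TextToSpeech.QUEUE_FLUSH, null);
}

private void startSpeechToTextOpt() {
    // launch the platform speech recognizer to capture the chosen menu option
    Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
    intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
            RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
    intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speak an option");
    try {
        startActivityForResult(intent, SPEECH_RECOGNITION_SRC_CODE);
    } catch (ActivityNotFoundException e) {
        // device has no speech recognizer installed
        TextToSpeechFunction("Speech recognition is not supported on this device");
    }
}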
49
Train_Src_Dest.java
package com.example.mahesh.bindicator.Dao;
import android.annotation.TargetApi;
import android.content.ActivityNotFoundException;
import android.content.Intent;
import android.database.Cursor;                    // added: required by the code below
import android.os.Bundle;                          // added: required by the code below
import android.speech.RecognizerIntent;
import android.support.v4.view.GestureDetectorCompat;
import android.support.v7.app.AppCompatActivity;
import android.speech.tts.TextToSpeech;
import android.util.Log;
import android.view.GestureDetector;
import android.view.KeyEvent;
import android.widget.EditText;                    // added: required by the code below
import java.text.SimpleDateFormat;                 // added: required by the code below
import java.util.Calendar;                         // added: required by the code below
import java.util.Locale;                           // added: required by the code below
public class Train_Src_Dest extends AppCompatActivity
implements TextToSpeech.OnInitListener,
GestureDetector.OnGestureListener,
GestureDetector.OnDoubleTapListener {
TextToSpeech textToSpeech;
private final int SPEECH_RECOGNITION_SRC_CODE = 1;
private final int SPEECH_RECOGNITION_DEST_CODE = 2;
private EditText src_txt, dest_txt;
int curr_time;
String txt_speech;
String weekDay;
Calendar calendar = Calendar.getInstance();
Cursor c = null;
Cursor c1 = null;
String side,dist_table_name;//used while data retrieve
String src;
String dest;
// GestureDetector variable
private static final String DEBUG_TAG = "Gestures";
private GestureDetectorCompat mDetector;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.train_src_dest);
// formats inferred from their usage below: hour and minute of the current time
SimpleDateFormat dfHr = new SimpleDateFormat("HH", Locale.US);
SimpleDateFormat dfMin = new SimpleDateFormat("mm", Locale.US);
String curr_hr = dfHr.format(calendar.getTime());
String curr_min = dfMin.format(calendar.getTime());
int currHour = Integer.parseInt(curr_hr)*60;
int currMin = Integer.parseInt(curr_min);
curr_time = currHour + currMin;   // e.g. 14:35 becomes 14*60 + 35 = 875 minutes
//day of week
SimpleDateFormat dayFormat = new SimpleDateFormat("EEEE", Locale.US);
weekDay = dayFormat.format(calendar.getTime());
}   // end of onCreate (remaining initialisation elided in this excerpt)
@Override
public void onDestroy() {
textToSpeech.shutdown();
super.onDestroy();
}
@Override
public void onInit(int Text2SpeechCurrentStatus) {
if (Text2SpeechCurrentStatus == TextToSpeech.SUCCESS) {
textToSpeech.setLanguage(Locale.US);
// button.setEnabled(true);
txt_speech = "Enter Source by pressing volume up button";
TextToSpeechFunction(txt_speech);
}
}
//voice recognition
@Override
public boolean dispatchKeyEvent(KeyEvent event) {
int action = event.getAction();
int keyCode = event.getKeyCode(); //get volume event
src = src_txt.getText().toString();
dest = dest_txt.getText().toString();
String sql,sql1;
int dist = 0;
Intent intent=new Intent(Train_Src_Dest.this, ShowTrainSchedule.class);
/*String[] separated = curr_time.split(":");
int CThour =Integer.parseInt(separated[0]); // this will contain "H"
int CTmin = Integer.parseInt(separated[1]); // this will contain " mm"
String[] separated1 = curr_time.split(" ");
String CTam_pm = separated1[1];//this will contain "AM/PM" */
if(dest.equalsIgnoreCase("Panvel")){
side="towards_panvel";
}
else if(src.equalsIgnoreCase("Panvel") || dest.equalsIgnoreCase("Thane") ||
src.equalsIgnoreCase("Vashi")){
side="towards_thane";
}
else if(dest.equalsIgnoreCase("Vashi")){
side="towards_vashi_v";
}
else if(src.equalsIgnoreCase("Thane")){
if(dest.equalsIgnoreCase("Airoli") || dest.equalsIgnoreCase("Rabale") ||
dest.equalsIgnoreCase("Ghansoli") || dest.equalsIgnoreCase("Koparkhairne") ||
dest.equalsIgnoreCase("Turbhe") || dest.equalsIgnoreCase("Juinagar") ||
dest.equalsIgnoreCase("Nerul") || dest.equalsIgnoreCase("Seawood") ||
dest.equalsIgnoreCase("Belapur") || dest.equalsIgnoreCase("Kharghar") ||
dest.equalsIgnoreCase("Manasarovar") || dest.equalsIgnoreCase("Khandeshwar"))
side="towards_panvel";
}
else if(src.equalsIgnoreCase("Airoli")){
if(dest.equalsIgnoreCase("Thane"))
side="towards_thane";
if(side.equalsIgnoreCase("towards_vashi_v") ||
side.equalsIgnoreCase("towards_thane_v")){
dist_table_name="Dist_min_vashi_thane";
}
else{
dist_table_name="Dist_in_min";
}
}
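The cursors c and c1 and the variables side, dist_table_name and curr_time prepared above are then used to query the schedule database. The query itself is not part of this excerpt; the sketch below shows one possible form it could take, assuming one departure-time column per station, which is our guess rather than the report's actual schema.

// Sketch only: the project's actual SQL and table layout are not shown in this excerpt.
// Assumption: each direction table (e.g. towards_panvel) has one departure-time column per station.
// Requires android.database.sqlite.SQLiteDatabase.
SQLiteDatabase db = openOrCreateDatabase("bindicator.db", MODE_PRIVATE, null);
c = db.rawQuery("SELECT * FROM " + side + " WHERE " + src + " >= " + curr_time
        + " ORDER BY " + src, null);
c1 = db.rawQuery("SELECT * FROM " + dist_table_name
        + " WHERE station = '" + dest + "'", null);
if (c.moveToFirst()) {
    int departure = c.getInt(c.getColumnIndex(src));
    txt_speech = "Next train from " + src + " leaves at "
            + (departure / 60) + " hours " + (departure % 60) + " minutes";
    TextToSpeechFunction(txt_speech);
}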
54
UserLocation.java
package com.example.mahesh.bindicator.Dao;
import android.Manifest;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.os.Build;
import android.os.Handler;
import android.os.Bundle;
import android.speech.tts.TextToSpeech;            // added: required by the code below
import android.support.v4.app.ActivityCompat;      // added: required by the code below
import android.support.v7.app.AppCompatActivity;   // added: required by the code below
import android.view.KeyEvent;
import android.widget.Toast;
import com.example.mahesh.bindicator.R;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.Locale;

// class declaration and fields restored from their usage below
public class UserLocation extends AppCompatActivity implements TextToSpeech.OnInitListener {

TextToSpeech textToSpeech;
private final int REQUEST_CODE_PERMISSION = 1;                                 // assumed value
private final String mPermission = Manifest.permission.ACCESS_FINE_LOCATION;  // assumed permission
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.user_location);
textToSpeech = new TextToSpeech(UserLocation.this, UserLocation.this);
try {
if (ActivityCompat.checkSelfPermission(this, mPermission)
!= PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this, new String[]{mPermission},
REQUEST_CODE_PERMISSION);
}
} catch (Exception e) {
e.printStackTrace();
}
}
@Override
public void onDestroy() {
textToSpeech.shutdown();
super.onDestroy();
}
}
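The excerpt ends before the recognized location is actually spoken back to the user. One way to turn the latitude/longitude obtained here into a spoken address is with the Geocoder class (already imported in the notification code below); the helper shown is an assumed sketch, not part of the original source.

// Sketch only: assumed helper, not part of the excerpt above.
// Requires android.location.Geocoder, android.location.Address, java.io.IOException, java.util.List.
private void speakCurrentLocation(double lat, double lon) {
    try {
        Geocoder geocoder = new Geocoder(this, Locale.US);
        List<Address> addresses = geocoder.getFromLocation(lat, lon, 1);
        if (addresses != null && !addresses.isEmpty()) {
            // getAddressLine(0) gives a single human-readable address string
            String where = addresses.get(0).getAddressLine(0);
            textToSpeech.speak("You are at " + where, TextToSpeech.QUEUE_FLUSH, null);
        } else {
            textToSpeech.speak("Unable to find your location", TextToSpeech.QUEUE_FLUSH, null);
        }
    } catch (IOException e) {
        textToSpeech.speak("Location service is not available", TextToSpeech.QUEUE_FLUSH, null);
    }
}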
56
Notification.java
package com.project.afinal.location_finder;
import android.annotation.TargetApi;
import android.database.Cursor;
import android.location.Address;
import android.location.Geocoder;
import java.io.IOException;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.Set;
import java.util.Timer;
import java.util.TimerTask;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_notification);
textToSpeech = new TextToSpeech(notification.this, notification.this);
s1 = (Switch)findViewById(R.id.switch1);
tv = (TextView) findViewById(R.id.tv1);
tvs = (TextView) findViewById(R.id.tv2);
linkedHashMap.put("thane_lat", new Double(19.1864587));
linkedHashMap.put("thane_lon", new Double(72.9754684));
linkedHashMap.put("airoli_lat", new Double(19.1585059));
linkedHashMap.put("airoli_lon", new Double(72.9988718));
linkedHashMap.put("rabale_lat", new Double(19.1366935));
linkedHashMap.put("rabale_lon", new Double(73.0030762));
linkedHashMap.put("ghansoli_lat", new Double(19.1164622));
linkedHashMap.put("ghansoli_lon", new Double(73.0068541));
linkedHashMap.put("koparkhairane_lat", new Double(19.1032275));
linkedHashMap.put("koparkhairane_lon", new Double(73.0114058));
linkedHashMap.put("turbhe_lat", new Double(19.0762177));
linkedHashMap.put("turbhe_lon", new Double(73.0177009));
/* linkedHashMap.put("sanpada_lat", new Double(19.0661156));
linkedHashMap.put("sanpada_lon", new Double(73.0093485));
linkedHashMap.put("vashi_lat", new Double(19.0630606));
linkedHashMap.put("vashi_lon", new Double(72.9989048));*/
linkedHashMap.put("juinagar_lat", new Double(19.0558078));
linkedHashMap.put("juinagar_lon", new Double(73.0182052));
linkedHashMap.put("nerul_lat", new Double(19.033509));
linkedHashMap.put("nerul_lon", new Double(73.0181354));
linkedHashMap.put("seawoods-darave_lat", new Double(19.0219794));
linkedHashMap.put("seawoods-darave_lon", new Double(73.0193129));
linkedHashMap.put("belapurCBD_lat", new Double(19.0189644));
linkedHashMap.put("belapurCBD_lon", new Double(73.0392069));
linkedHashMap.put("kharghar_lat", new Double(19.0264714));
linkedHashMap.put("kharghar_lon", new Double(73.0595161));
linkedHashMap.put("mansarovar_lat", new Double(19.0167379));
linkedHashMap.put("mansarovar_lon", new Double(73.0805558));
linkedHashMap.put("khandeshwar_lat", new Double(19.0074616));
linkedHashMap.put("khandeshwar_lon", new Double(73.0947554));
linkedHashMap.put("panvel_lat", new Double(18.990894));
linkedHashMap.put("panvel_lon", new Double(73.1207371));
linkedHashMap.put("hanuman mandir vadala_lat", new Double(19.0221322));
linkedHashMap.put("hanuman mandir vadala_lon", new Double(72.8660294));
linkedHashMap.put("sangam nagar taxi stand_lat", new Double(19.0216567));
linkedHashMap.put("sangam nagar taxi stand_lon", new Double(72.8695176));
linkedHashMap.put("v i t_lat", new Double(19.0214));
linkedHashMap.put("v i t_lon", new Double(72.87072));
}
@Override
public void onDestroy() {
textToSpeech.shutdown();
super.onDestroy();
}
@Override
public void onInit(int Text2SpeechCurrentStatus) {
if (Text2SpeechCurrentStatus == TextToSpeech.SUCCESS) {
textToSpeech.setLanguage(Locale.US);
if(s1.isChecked()){
txt_speech = "Notification service is already running.Now before train reaches next
station you will be notified. To disable notification press volume down button";
chk_speaker(txt_speech);
double lat = gps.getLatitude();
double lon = gps.getLongitude();
get_notification(lat,lon);
}
else {
txt_speech = "press volume up button to enable notification";
TextToSpeechFunction(txt_speech);
}
}
}
void enable_notification() {
String sp = "You are at " + result;
chk_speaker(sp);
if (p1 <= lat && lat <= t1 && t2 <= lon && lon <= p2 ) {
s1.setChecked(true);
txt_speech = "You have enabled notification.Now you will be notified before train reaches
next station .";
chk_speaker(txt_speech);
new Handler().postDelayed(new Runnable() {
@Override
public void run() {
Log.d("at line 187", "ho gaya run lat="+lat+" lon="+lon);
get_notification(lat,lon);
// show_tv2();
}
}, 4000);
}
else{
txt_speech= "You can enable notification only when you will aboard the train.";
chk_speaker(txt_speech);
}
}
@Override
public boolean dispatchKeyEvent(KeyEvent event) {
int action = event.getAction();
int keyCode = event.getKeyCode(); //get volume event
switch (keyCode) {
case KeyEvent.KEYCODE_VOLUME_UP:
if (action == KeyEvent.ACTION_DOWN) {
if(s1.isChecked()) {
txt_speech = "Notification service is already running.Now before train reaches next
station you will be notified.";
chk_speaker(txt_speech);
double lat = gps.getLatitude();
double lon = gps.getLongitude();
get_notification(lat,lon);
}
else{
enable_notification();
}
}
}
return true;
}
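get_notification() itself is not shown in this excerpt. One simple way to detect the station the train is approaching from the coordinate map filled in onCreate() is to find the nearest stored station to the current GPS fix and announce it once it is within a small radius, as in the sketch below; the method body and the distance threshold are assumptions, not the project's actual code.

// Sketch only: one possible implementation of get_notification(), not the project's actual code.
void get_notification(double lat, double lon) {
    String nearest = null;
    double best = Double.MAX_VALUE;
    for (String key : linkedHashMap.keySet()) {
        if (!key.endsWith("_lat")) continue;
        String station = key.substring(0, key.length() - 4);   // "thane_lat" -> "thane"
        double dLat = lat - linkedHashMap.get(station + "_lat");
        double dLon = lon - linkedHashMap.get(station + "_lon");
        double distSq = dLat * dLat + dLon * dLon;              // rough comparison is enough at city scale
        if (distSq < best) {
            best = distSq;
            nearest = station;
        }
    }
    // about 0.01 degrees is roughly one kilometre; the threshold here is an assumption
    if (nearest != null && best < 0.01 * 0.01) {
        chk_speaker("Next station is " + nearest);
    }
}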
61
5.3.Test Cases
Test Case 3: Display and speak train schedule
Precondition: Adequate speaker volume.
Procedure: After taking the source and destination input, the train schedule from the current time onwards is displayed and spoken by the speaker.
Expected Result: Train schedule information is to be displayed and spoken by the speaker.
Actual Result: Train schedule information is displayed and spoken by the speaker.
Status: Pass

Test Case 4: Display and speak ticket fare information
Precondition: Adequate speaker volume.
Procedure: After taking the source and destination input, the ticket fare is displayed and spoken by the speaker.
Expected Result: Ticket fare is to be displayed and spoken by the speaker.
Actual Result: Ticket fare is displayed and spoken by the speaker.
Status: Pass
63
CONCLUSION
AND
FUTURE SCOPE
64
6. Conclusion and future scope
6.1. Conclusion
The system is very useful for blind and visually impaired people who reside in Mumbai and use local trains as their means of transportation. It lets them hear the train schedule, ticket fare information and their current location, and provides location-based notifications. The system is highly scalable and user friendly. Almost all the system objectives have been met, and the system has been tested against all the stated criteria. It resolves the GUI problem of the existing application, M-Indicator, and provides a better user experience. All phases of development followed the iterative methodology described earlier. The system executes successfully, fulfilling the objectives of the project.
65
6.2. Future Scope
• Adding other features of M-Indicator such as Metro, Mono, Bus, express trains, taxi etc.
Along with local trains, blind and visually impaired people also use other local transport such as Metro, Mono, buses, taxis and auto rickshaws. The user will be able to use these features as provided by M-Indicator too.
• Multilingual support
Making the application multilingual will enable users to use the language they are comfortable with.
66
REFERENCES
67
7. References
[1] WHO. (2014, August) Visual impairment and blindness. Archived at http://www.webcitation.org/6YfcCRh9L. [Online]. Available: http://www.who.int/mediacentre/factsheets/fs282/en/
[2] Sinora Ghosalkar, Saurabh Pandey, Shailesh Padhra, Tanvi Apte, "Android Application on Examination Using Speech Technology for Blind People", International Journal of Research in Computer and Communication Technology, Vol. 3, Issue 3, March 2014.
[3] Jae Sung Cha, Dong Kyun Lim and Yong-Nyuo Shin, "Design and Implementation of a Voice Based Navigation for Visually Impaired Persons", International Journal of Bio-Science and Bio-Technology, Vol. 5, No. 3, June 2013.
[4] Siddhesh R Baravkar, Mohith R Borde, Mahendra K Nivangune, "Android Text Messaging Application For Visually Impaired People", IRACST - An International Journal (ESTIJ), Vol. 3, No. 1, February 2013.
[5] Poornima P., V. Sriteja Reddy, "ARM7 based Smart bus Passenger-Alert System using GSM with GPS based Location Identification", International Journal of Engineering Development and Research, Vol. 4, Issue 2, 2016.
[6] Renu Tarneja, Huma Khan, Prof. R. A. Agrawal, Prof. Dinesh D. Patil, "Voice Commands Control Recognition Android Apps", International Journal of Engineering Research and General Science, Volume 3, Issue 2, March-April 2015.
[8] Prof. Manisha Bansode, Shivani Jadhav and Anjali Kashyap, "Voice Recognition and Voice Navigation for Blind using GPS", International Journal of Innovative Research in Electrical, Electronics, Instrumentation and Control Engineering, Vol. 3, Issue 4, April 2015.
[9] Akshay Khatri, "Assistive Vision for the Blind", International Journal of Engineering Science Invention, ISSN (Online): 2319-6734, ISSN (Print): 2319-6726, Volume 3, Issue 5, May 2014, pp. 16-19.
[10] http://whatis.techtarget.com/definition/speech-synthesis
[11] www.youtube.com
[12] https://www.sqlite.org/
[13] http://www.sitesbay.com/java/features-of-java
[14] http://www.bespecular.com/blog/how-does-a-visually-impaired-person-use-a-smartphone/
[15] https://developer.android.com/about/index.html
69
International Journal for Research in Engineering Application & Management (IJREAM)
ISSN : 2454-9150 Vol-02, Issue 12, Mar 2017
Abstract— It is estimated that 285 million people globally are visually impaired with 39 million blind and 246 million
with low vision [1].The large number of blind and visually impaired individuals in the society has motivated research
groups to search for smart solutions that use vision-based technologies to improve their quality of life. The mobile
phones have become an essential element of any communication media. In the past few years, many standard mobile
devices have started to include screen reading software that allows blind people to use them. For instance, Google’s
Android platform and the Apple iPhone (starting with the 3GS) now include free screen readers. The iPhone has proven
particularly popular among blind users, which is why they developed VizWiz Social for it. With the availability of an
accessible platform, a number of applications have been developed for blind people, including GPS navigation applications, OCR readers, and color recognizers. In this paper, a study of various methodologies has been done, along with a survey of papers related to Android applications for the blind or visually impaired.
Keywords: Blind, Android Application, Visually impaired, Speech recognition, GPS, Navigation, location.
number of acceptable word combinations based on the rules of language and statistical information from different texts. Speech recognition systems based on hidden Markov models are today most widely applied in modern technologies. They use the word or phoneme as a unit for modeling.
The model output is hidden probabilistic functions of state and cannot be deterministically specified. The state sequence through the model is not exactly known. Speech recognition systems generally assume that the speech signal is a realization of some message encoded as a sequence of one or more symbols [2].

A. SPEECH SYNTHESIS
Speech synthesis is the computer-generated simulation of human speech [9]. It is speech generated by a computer for people with physical disabilities or visual impairment, or for people facing difficulty in reading small-sized text. Speech synthesis is also referred to as text-to-speech (TTS). In the speech synthesis process, the text is first analyzed using natural language rules. Analysis of the text is done character by character to determine the grammatical details and parts of speech, for example where sentences begin and end, the tense of a sentence, and which words are proper nouns, pronouns, numbers and so on. Clearly understanding how a word or a phrase is being used is a critical aspect of speech synthesis. Some non-trivial analysis is used to generate the appropriate sound for the text.

B. GPS
GPS is a radio navigation system using satellites, developed by the USA Department of Defense for military navigation, but it can be used by citizens with a limited range. It predicts radio coverage from satellites to a receiver and then shows the exact 3D location, speed and time. The system can be used universally, 24 hours a day, by many people. The GPS system can be divided into three segments: SS (Space Segment), CS (Control Segment), and US (User Segment). SS (Space Segment) represents the location of 24 satellites that rotate around the Earth every 12 hours. As of April 2007 there is a total of 36 GPS satellites, with 30 of them active and 6 of them preparatory satellites in case of malfunction. CS (Control Segment) represents a general observation post that manages and tracks GPS satellites. US (User Segment) represents GPS users and GPS receivers [3].

III. LITERATURE SURVEY
This section presents the survey on Android applications for blind or visually impaired people.

In paper [4] Siddhesh R Baravkar, Mohith R Borde and Mahendra K Nivangune developed an Android text messaging application. The messaging can be completely voice based. The proposed application is a messaging system which is voice enabled: the application listens to your messages and then responds with voice commands by talking. The application converts your text into voice and voice into text. For Android it is voice-to-text technology that listens to what you send and gets you connected with people.
Limitation: This application always runs in the background and hence drains a lot of battery.

In paper [5] Poornima P. and V. Sriteja Reddy proposed an ARM7 based smart bus passenger-alert system using GSM with GPS based location identification, in which they presented an intelligent real-time alarming system which senses the destination location taken as input from the passenger. The system also includes an Android mobile alert application. After the passenger reaches the destination, the bus system alerts the passenger through an SMS to the Android application on the passenger's mobile, and it is also assisted by an alarming system in case of excessive temperature or smoke identification, which can lead to fire accidents.
Limitation: In this system, tools like an IR sensor and a gas sensor have been used for navigation, making the application expensive; these are not required in our application for navigation.

In paper [6] Renu Tarneja, Huma Khan, Prof. R. A. Agrawal and Prof. Dinesh D. Patil proposed an interactive application which can run on a tablet or any Android based phone. The application helps the user to open any application as well as call any contact through voice. Users can command a mobile device to do something via speech, and these commands are then immediately executed.
Limitation: This proposed application needs to run always in

[Comparison table fragment] Passenger Bus Alert System For Easy Navigation Of Blind — NO | YES | HIGH | ZigBee provides a limited coverage and sometimes congestion may also occur.