Software Engineering - CH 2 - Requirement Analysis - Handout
Chapter Two
Modeling Software Systems
Chapter Objectives
Chapter Contents
2.1. Requirement Engineering
2.1.1 Activities in requirement analysis
2.1.2 Classical operational analysis vs. object-oriented analysis
2.2. System Design
2.2.1. Software design activity and its objective
2.2.2. Modularization techniques
2.2.3. Top-down versus bottom-up design
2.2.4. Design patterns: classical vs. object-oriented (UML)
2.3. System Implementation
This provides the appropriate mechanism for understanding what the customer wants,
analyzing needs, assessing feasibility, negotiating a reasonable solution, specifying the
solution unambiguously, validating the specification, and managing the requirements as
they are transformed into an operational system.
The requirements engineering process can be described in six distinct steps:
Requirements elicitation
Requirements analysis
Requirements specification
Requirement modeling
Requirements validation
Requirements management
The most commonly used requirements elicitation technique is to conduct a meeting or
interview. The analyst starts by asking a set of context-free questions that lead to a
basic understanding of the problem, the people who want the solution, the nature of the
solution that is desired, and the effectiveness of the first encounter. These questions focus on the
customer, the overall goals, and the benefits.
These questions help to identify all stakeholders who will have interest in the
software to be built. In addition, the questions identify the measurable benefit of
a successful implementation and possible alternatives to custom software
development.
The next set of questions enables the analyst to gain a better understanding of the
problem and the customer to voice his/her perceptions about the solution.
The final set of questions focuses on the effectiveness of the meeting. These
questions help to 'break the ice' and initiate the communication that is
essential to successful analysis.
The Q&A session should be used for the first encounter only and then replaced
by a meeting format that combines elements of problem solving, negotiation
and specification.
completed, each FAST attendee makes a list of validation criteria for the
product / system.
Finally, one or more participants are assigned the task of writing the complete draft
specification using all inputs.
The team approach in FAST provides the benefits of many points of view, instantaneous
discussion and refinement and is a concrete step toward the development of a
specification.
QFD is a quality management technique that translates the needs of the customer into
technical requirements for software. It concentrates on maximizing customer satisfaction
from the software engineering process. It emphasizes an understanding of what is
valuable to the customer and then deploys these values throughout the engineering
process.
Initially, the analyst studies the system specification and the software project plan. It
is important to understand software in a system context and to review the software scope
that was used to generate planning estimates.
Next, communication for analysis must be established to ensure that the problem is
recognized as the customers perceive it.
Problem evaluation and solution synthesis is the next major area of effort for analysis.
The analyst must define all externally observable data objects, evaluate the flow and
content of information, define and elaborate all software functions, understand software
behavior in the context of events that affect the system, establish system interface
characteristics and uncover additional design constraints. Each of these tasks serves to
describe the problem so that an overall approach or solution may be synthesized.
For example, suppose an inventory control system is required for a major supplier of auto parts. The
analyst finds that problems with the current manual system include:
Inability to obtain the status of the component rapidly,
Two-or-three day turnaround to update a card file,
Multiple reorders to the same vendor because there is no way to associate vendors
with components and so on.
Upon evaluating the current problems and desired information, the analyst begins to synthesize
one or more solutions:
To start, the data objects, processing functions and behavior of the system are
defined in detail.
Once this information is established, basic architectures for implementation are
considered.
The process of evaluation and synthesis continues until both analyst and customer feel
confident that software can be adequately specified for subsequent development steps.
During the evaluation and solution synthesis activity, the analyst creates models of the
system in an effort to better understand data and control flow, functional processing,
operational behavior and information content. The model serves as a foundation for
software design and as the basis for the creation of specifications for the software.
Detailed specifications may not be possible at this stage. The customer may be unsure of
precisely what is required, and the developer may be unsure that a specific approach will
properly accomplish function and performance. For these reasons, an alternative
approach to requirements analysis, called prototyping, is considered.
Each analysis method has a unique point of view. However, all analysis methods are related by
a set of operational principles:
The information domain of a problem must be represented and understood.
The function that the software is to perform must be defined.
The behavior of the software must be represented.
The models that depict information, function, and behavior must be partitioned in a
manner that uncovers detail in a layered fashion.
The analysis process should move from essential information toward
implementation detail.
Guiding principles for requirements analysis:
Understand the problem before you begin to create the analysis model.
Develop prototypes that enable the user to understand how human/machine
interaction will occur.
Record the origin of and the reason for every requirement.
Use multiple views of requirements.
Rank the requirements.
Work to eliminate ambiguity.
A software engineer who applies these principles can develop a software specification that
will provide an excellent foundation for design.
Specification may be seen as a representation process, regardless of the mode through which we
accomplish it. Requirements are represented in a manner that ultimately leads to successful software
implementation.
Specification Principles
A number of specification principles can be proposed:
Separate functionality from implementation.
Develop a model of the desired behavior of a system that encompasses data and
the functional responses of a system to various environments.
Specify the manner in which other system components interact with software.
Define the environment in which the system operates.
Design a model as perceived by its user rather than as a design or implementation
model.
Recognize that the specification is an abstraction of some real situation that is
normally quite complex.
The content and structure of a specification should be established in a way that
makes it amenable to change.
A project proposal typically includes the following parts:
Background (the subject area you want to work with and the organizational background)
Statement of the problem (state the problem clearly, in quantitative terms)
Justification of the problem (show the efforts already made to solve the problem and your project's contribution)
Objectives of the project (general and specific objectives)
Methodology of the project (data collection methods and why, sample selection and why, selection of the requirement analysis method and why, selection of design tools and why, implementation issues, programming language selection and why, testing tools and methodology)
Scope of the project (the boundary of your project; state the things your project will do)
Application of the project (to whom and how your project work can be applicable)
Project management (group formation, group management, group report structure)
Project budget (resources with estimations)
Time management (activities with a time schedule)
References (cite your references using a standard citation technique)
Appendixes (attach necessary documents and code scripts)
I. DFD
A data flow diagram (DFD) is a network representation of a system. It portrays the system in terms
of its component pieces, with all interfaces among the components indicated. If there is
an existing system, study the current system: identify the data flows, data stores, and
processes. The analyst should ask the users repeatedly until the required
knowledge of the current system is obtained. There are different methods to develop the current
system data flow diagram. Use the current organizational units to easily identify
information flows, data stores, and processes, such as yellow pages, model 19 forms, black carbon
copies, etc. When you develop the current system DFD, use the existing physical names.
This helps to establish good communication between the analyst and the users.
As analysis proceeds, the physical considerations become a burden. Use of physical
terms also limits the readership of the documents to those who are familiar with the
details. Then transform all the physical information into the current logical model. The
logical DFD should show only what the system performs, not how the system
operates. The DFD is made up of only four basic elements:
a) Data flow
A data flow is a pipeline through which packets of information of known composition
flow. It portrays an interface among components on a data flow diagram. Most data
flows move between processes, but they can just as well flow into or out of files, and to
and from destination boxes and source boxes respectively. A data flow is represented by a
named vector (arrow).
b) Processes
A process is a transformation of incoming data flow(s) into outgoing data flow(s). It
invariably shows some amount of work performed on the data. A process is represented by circles
(bubbles) or rectangles.
c) Files
A file is a temporary repository of data or information. It may be a tape, an area of
disk, a card data set, a chart on a wall, an index file in someone's drawer, or the
little book of deadbeat cardholders that the credit card companies issue from time to time.
It might even be a wastebasket. As long as it is a temporary repository of data, it qualifies
as a file.
Databases qualify as files under this definition, although the term database carries with it
even more connotations about physical implementation than the term file does. A file is represented
by open-ended rectangles or parallel lines.
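The element types above can be sketched as simple data structures. The following is an illustrative Python sketch, not part of any DFD standard; the class and variable names are our own assumptions:

```python
from dataclasses import dataclass
from typing import Union

@dataclass
class Process:
    """Transforms incoming data flows into outgoing ones; drawn as a bubble."""
    name: str

@dataclass
class File:
    """A temporary repository of data; drawn as an open-ended rectangle."""
    name: str

@dataclass
class DataFlow:
    """A named pipeline carrying packets of known composition; drawn as a named vector."""
    name: str
    source: Union[Process, File, str]   # a plain str stands in for a source box
    target: Union[Process, File, str]   # a plain str stands in for a destination box

# A fragment of the auto-parts inventory example: a reorder process reads the vendor file.
vendors = File("vendor file")
reorder = Process("generate reorder")
flow = DataFlow("vendor details", source=vendors, target=reorder)
print(f"{flow.name}: {flow.source.name} -> {flow.target.name}")
```

A real CASE tool would of course store much more per element; the point is only that every data flow names its interface and the components on either end.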
The analysis model encompasses representations of data objects, function, and control.
Thus, it is necessary to provide an organized approach for representing the characteristics
of each data object and control item. This is accomplished with the data dictionary.
“Data dictionary is an organized listing of all data elements that are pertinent to the
system, with precise, rigorous definitions so that both user and system analyst will have a
common understanding of inputs, outputs, components of stores and intermediate
calculations”
The data dictionary is typically implemented as part of a CASE structured analysis and design
tool. The information contained in the dictionary includes:
Name – the primary name of the data or control item, the data store or an external
entity.
Alias – other names used for the first entry.
Where used / how used – a listing of the process that uses the data or control item and
how it is used (e.g., input to the process, output from the process, as a store, as an
external entity).
Content description – a notation for representing content.
Supplementary information – other information about data types, preset values (if
known), restrictions or limitations and so forth.
As an example, if the data item telephone number is specified as an input, the data
dictionary provides a precise definition of telephone number for the DFD. In
addition, it indicates where and how this data item is used and any supplementary
information that is relevant to it.
The content description is expanded until all composite data items have been represented
as elementary items or until all composite items are represented in terms that would be
well known. The data dictionary grows rapidly in size and complexity for large
computer-based systems. CASE tools should be used, as it is extremely difficult to
maintain the dictionary manually.
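A dictionary entry for the telephone number item above might be recorded as follows. This is a minimal sketch: the field names mirror the list given earlier, while the content notation and alias values are hypothetical:

```python
# An illustrative data dictionary entry for the telephone number data item.
telephone_number_entry = {
    "name": "telephone_number",                       # primary name of the data item
    "alias": ["phone_no", "contact_number"],          # other names used for it
    "where_used": ["input to 'register borrower' process"],
    "content": "country_code + area_code + local_number",
    "supplementary": "digits only; no preset value; at most 15 digits",
}

def lookup(dictionary, name):
    """Return the entry whose primary name or alias matches `name`, else None."""
    for entry in dictionary:
        if name == entry["name"] or name in entry["alias"]:
            return entry
    return None

entry = lookup([telephone_number_entry], "phone_no")
print(entry["content"])   # country_code + area_code + local_number
```

The alias lookup illustrates why aliases are recorded at all: both user and analyst resolve different names to one common definition.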
Data modeling and ERD provide the analyst with a notation for examining data within
the context of a software application. This approach is used to create one piece of the
analysis model that can also be used for database design and to support other
requirements analysis methods.
An entity is any real-world object or event about which the system must record
information. An attribute is an item of information recorded about an entity, for
example a borrower's name or a borrower's department. A key attribute is the principal
identifier of an entity; it has a unique value for each occurrence of a record, so an
identifier such as a borrower ID serves as the key attribute of the borrowers entity. A relationship is an
indication that one entity is associated with one or more other entities.
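The borrower example can be sketched in code. This is an illustrative sketch only; the class names, field names, and sample values are our own assumptions, not part of ERD notation:

```python
from dataclasses import dataclass

@dataclass
class Borrower:
    """An entity: a real-world object about which information is recorded."""
    borrower_id: str    # key attribute: unique for each occurrence
    name: str           # ordinary attribute
    department: str     # ordinary attribute

@dataclass
class Loan:
    """A relationship: associates one borrower with a borrowed item."""
    borrower_id: str    # foreign reference to the borrower's key attribute
    item_id: str

b = Borrower("B-001", "Abebe", "Mathematics")
loan = Loan(borrower_id=b.borrower_id, item_id="BK-042")
print(loan)
```

Note how the relationship carries only key attributes: it associates occurrences of entities rather than duplicating their data.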
The process of removing internal repeating groups from complex files and setting them
up separately is called normalization. It is used to avoid data redundancy and to build a
stable data model, which can be expanded in terms of the user's requirements. If a data item is
carried by a data flow but is not used by any process, remove the form that
contains that data item from your DFD.
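A small worked example of removing a repeating group (the record layout is hypothetical): an order record that repeats item fields is split into an orders table and an order-lines table, linked by the order number:

```python
# Unnormalized: each order record carries an internal repeating group of items.
orders_unnormalized = [
    {"order_no": 1, "vendor": "ACME", "items": [("bolt", 10), ("nut", 20)]},
]

# Normalized: the repeating group is set up separately, linked by the key.
orders = [{"order_no": 1, "vendor": "ACME"}]
order_lines = [
    {"order_no": 1, "part": "bolt", "qty": 10},
    {"order_no": 1, "part": "nut", "qty": 20},
]

# Rebuilding the original view shows that no information was lost.
joined = [(o["vendor"], l["part"], l["qty"])
          for o in orders for l in order_lines
          if l["order_no"] == o["order_no"]]
print(joined)
```

The vendor name is now stored once per order instead of being implied by every item line, which is exactly the redundancy normalization removes.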
All data flow diagrams (DFDs) and entity relationship diagrams (ERDs) should be accompanied by
descriptive English. This gives the reader of the requirements engineering documents a
clear picture and creates understanding between the requirements analyst and the user of
the software.
There are two general ways of verifying a specification. One consists of observing the
dynamic behavior of the specified system in order to check whether it conforms to the
intuitive understanding we had of the behavior of the ideal system. The other consists of
analyzing the properties of the specified system that can be deduced from the
specification. The properties that are deduced are then checked against the expected
properties of the system. The effectiveness of both techniques increases with the
degree of formality of the specification.
The conventional specification methods, which are based on the structured approach, view
software in one of two ways: either a data-oriented view or a process- (action-)
oriented view. Nevertheless, data and actions are two sides of the same coin: a data item
cannot change unless an action is performed on it, and actions without associated data are
equally meaningless.
In the object-oriented approach, both data and actions are bound together to form objects,
which are instances of classes. Classes may also be related to each other through
hierarchical relationships such as inheritance. Thus, in the object-oriented approach, data and
actions are considered to be of equal importance, and neither takes precedence over
the other. In this respect the approach is superior to the structured approach.
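The binding of data and actions can be sketched as follows. The class names and figures are our own illustration, not from the handout:

```python
class Account:
    """Data (the balance) and the actions that change it are bound together."""
    def __init__(self, balance=0):
        self._balance = balance   # the data item cannot change except via the actions below

    def deposit(self, amount):
        self._balance += amount

    def balance(self):
        return self._balance

class SavingsAccount(Account):
    """Classes may be related hierarchically through inheritance."""
    def add_interest(self, rate):
        self.deposit(self.balance() * rate)

a = SavingsAccount(100)
a.add_interest(0.10)
print(a.balance())   # 110.0
```

Neither the data nor the actions take precedence: the balance exists only inside the object, and the only way to change it is through the actions the class defines.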
a) Use Case
A use case shows the interaction between external users and the system. It is used to
capture the business requirements for the system and to build a process model, which
defines the business processes in a more formal manner. A use case is a set of activities that
produce some output result. Each use case describes how the system reacts to an event
that triggers it, and contains a fairly complete description of all the
activities that occur in response to that trigger event. It is similar to the context diagram in the
structured approach.
b) Class Diagram
A class diagram shows the static nature of a system at the class level and is used to
illustrate the relationships between the classes modeled in the system. It is similar to the data
model, i.e., the entity relationship diagram (ERD).
c) Object Diagram
An object diagram shows the static nature of a system at the object level. It is used to illustrate the
relationships between the objects modeled in the system, and is used when actual instances of the
classes will better communicate the model. It is similar to the data model, i.e., the entity
relationship diagram (ERD).
d) Sequence Diagram
A sequence diagram shows the interaction between classes for a given use case, arranged by time sequence.
It models the behavior of classes within a use case. It is similar to the process model
(DFD).
e) Collaboration Diagram
A collaboration diagram shows the interaction between classes for a given use case, but not arranged by time
sequence. It is used to model the behavior of classes within a use case. It is similar to the
process model (DFD).
f) Activity Diagram
An activity diagram models a specific business process or the dynamics of a group of objects.
It provides a view of the flows and of what is going on inside a use case or among several
classes, illustrating the flow of activities in a use case.
g) Component Diagram
A component diagram shows the physical components (e.g., .exe files, .dll files) in a design and where they are located. It
illustrates the physical structure of the software.
h) Deployment Diagram
A deployment diagram shows the structure of the run-time system; for example, it can show how physical
modules of code are distributed across various hardware platforms. It shows the mapping
of software to hardware components.
Advantages of OO Approach
1) Better organization of inherent complexity: use of inheritance ensures related
concepts, resources, and other objects can be efficiently defined and used.
2) Reduced development effort through reuse: reusing object classes that have been
written, tested, and maintained by others cuts development, testing, and
maintenance time.
3) More extensible and maintainable systems: the use of OOP software helps limit
the number of potential interactions of different parts of software, ensuring that
changes to the implementation of a class can be made with little impact on the rest
of the system.
4) Objects enable programmers to customize an operating system to meet new
requirements without disrupting system integrity.
5) Because objects communicate by means of messages, it does not matter whether two
communicating objects are on the same system or on two different systems in a
network.
6) Objects pave the road to distributed computing.
In system design, a major decision is whether to buy the software or build it in house.
The main task of system design is to provide detailed specifications of the system that
can be implemented by computer programmers and technicians.
In order to achieve the above goals, the system analyst and designer should take the
following considerations into account during the system design stage.
1. Involve end users in the system design, such as in the design of outputs and inputs,
because it is the users who will work with the physical system.
2. Fulfill current and projected functional requirements. The designed system should
show how the identified functions are met. Additional functions are also added
when we automate a manual system, such as entering data into the computer
system, editing input data, and security and performance functions such as
protecting sensitive data through passwords.
3. Design all information system components
Data and Information: each data and information flow was documented
during the analysis phase, and the media were specified during the system
selection phase. Now it is time to design the style, organization, and format
of all inputs and outputs.
Data Store: specify format, organization and access methods for all files
and databases to be used in the computer based system.
End users: the roles people must play in the new system must be specified,
such as who will capture and input data, who will receive outputs, and so
on.
4. Methods and Procedures: the sequence of steps and the flow of control through the
new system must be specified. The processing methods and intermediate manual
procedures must also be documented.
5. Computer Equipment: specify the type of hardware to be purchased.
Computer Programs: complete programming specifications must be
prepared for every program that must be written.
Internal Controls: specify internal controls to ensure the security and
reliability of the system.
Architectural Design
Detailed Design
Design Testing
The output of a computer system is the primary contact between the system and most
users. The quality of this output and its usefulness determine whether the system will be
used, so it is essential to produce the best possible output. Output design considerations
include:
The timing of computer outputs is important. Outputs must be received by their
recipients while the information is pertinent to transactions or decisions; this timing
can affect how the output is designed and implemented.
The distribution of computer outputs must be sufficient to assist all relevant end
users.
The computer outputs must be acceptable to the end users who will receive them.
There are different media for presenting output, such as paper, screen output, and
secondary storage media. Hence we should specify which medium to use to present the
output. Format, on the other hand, refers to the way the information is displayed on that
medium. There are several formats you can consider for communicating information on a
medium.
Tabular columns of text and numbers are the oldest and most common format for
computer outputs.
Graphics output is becoming more popular as high-capacity computers and
specialized graphics software come on the market. To the end user, a picture can be
more valuable than words. Bar charts, pie charts, line charts, step charts,
histograms, and other graphs can help end users grasp trends and data relationships
that cannot be easily seen in tabular numbers.
c. Internal Controls
Internal controls ensure that information is delivered to the right people and protect
information from misuse and fraud. The following guidelines are offered for
output controls:
The timing and the volume of each output must be precisely specified.
The distribution of all outputs must be specified. For each output, the recipients of
all copies must be determined. A distribution log, which provides an audit trail for
the outputs, is frequently required.
Access controls are used to control the accessibility of video (online) output. For
example, a password may be required to display a certain output on a CRT
terminal.
Control totals should be incorporated into all reports: the number of records input
should equal the number of records output.
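A control total check can be as simple as the following sketch; the function and record names are illustrative only:

```python
def control_totals_ok(input_records, output_records):
    """Control total check: the number of records input must equal the number output."""
    return len(input_records) == len(output_records)

# Hypothetical part-number / quantity records flowing into and out of a report run.
batch_in = [("P-001", 4), ("P-002", 7), ("P-003", 1)]
batch_out = [("P-001", 4), ("P-002", 7), ("P-003", 1)]
print(control_totals_ok(batch_in, batch_out))   # True
```

In practice the totals are printed on the report itself so that recipients can verify that no records were dropped between input and output.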
Review existing system outputs to easily identify the outputs generated by the
system and the data elements in each output.
Add new data elements to existing outputs to meet new user requirements.
Prototype the layout for end users. There are different tools for rapid prototyping,
such as Microsoft Excel spreadsheets, CASE tools, and DBMS report
generator facilities.
III. Input Design
Input design serves to make it easy to enter data into the computer system. What can be
output depends on what has been input to the system. Important terms in input design
include:
Data Capture: refers to collecting the relevant data from the source documents
and filling it in on a computer input form.
Data Entry: the process of converting data into machine-readable form, such as
encoding data through a computer keyboard.
Data Input: refers to data in machine-readable form.
Batch methods: the source documents are collected and sent periodically, say once a
week or once a month, to the data encoders.
Online methods: data is entered directly at its origin through a computer terminal. The
most common online medium is the display terminal, which includes at least a monitor
and a keyboard connected to a computer system. No form is used to collect data from the
source documents for later data entry, and there is no data entry clerk. Data entry errors are
detected immediately during data entry by a computer edit program, which notifies the CRT
(cathode ray tube) operator to make corrections. Of course, the edit program does not
detect all data entry errors, so human checking is still important. The online data entry method is
common at the point-of-sale terminals of retail shops and groceries.
Input controls ensure the accuracy of data input to the computer. This includes:
The number of inputs should be monitored for any missing or misplaced source
documents.
Care must be taken to ensure that the data is valid, catching errors such as typing
123 instead of 132. Such checking includes:
o Completeness checks
o Limit and range checks
o Combination checks
o Self checking digits
Data validation requires that special edit programs be written to perform these checks.
The input validation requirements should be designed when the inputs
themselves are designed.
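The edit checks listed above can be sketched as small validation functions. This is an illustrative sketch; the field names are hypothetical, and the check-digit scheme shown is a toy modulus-10 rule rather than any standard algorithm:

```python
def completeness_check(record, required_fields):
    """Completeness check: every required field must be present and non-empty."""
    return all(record.get(f) not in (None, "") for f in required_fields)

def range_check(value, low, high):
    """Limit and range check: the value must fall within the allowed band."""
    return low <= value <= high

def self_check_digit_ok(code):
    """Self-checking digit (toy modulus-10 scheme, for illustration only)."""
    digits = [int(d) for d in str(code)]
    return sum(digits) % 10 == 0

order = {"part_no": "123", "qty": 5}
print(completeness_check(order, ["part_no", "qty"]),   # True
      range_check(order["qty"], 1, 999))               # True
```

An edit program would run such checks on every input record and notify the terminal operator immediately when one fails.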
Follow these steps to prototype and design your computer inputs:
Review input requirements. Check your DFD: any data flow that enters the
machine side of the system is an input to be designed. We should also check
whether the data elements are sufficient to produce the required output.
Design how the input data flow will be implemented.
o Identify source documents that need to be designed or modified.
o Determine the input method to be used.
o Determine the timing and volume of input.
o Specify internal controls and special instructions to be followed for the
input.
o Study the input data elements to determine which data really needs to be
input.
Design or prototype the source document. If a source document is used to
capture data, design that document first.
Prototype online input screens. You can produce sketches or prototypes and
show them to end users for comments before finalizing your input screen designs. You
can also use tools such as database management software to produce a
prototype screen design. This is usually done for online input and remote
batch inputs.
Methods and procedures define the sequence of events that produce outputs from their
requisite inputs. Specifically, a method is a way of doing something. A procedure is a
step-by-step plan for implementing the method. Methods and procedures can also be
described as answering the questions “who does what and when do they do it?” and “how
will it be done?”
The overall process for designing a user interface begins with the creation of
different models of system function.
The human-and-computer-oriented tasks that are required to achieve system
function are then outlined
Design issues that apply to all interface designs are considered
Tools are used to prototype and implement the design model
The result is evaluated for quality.
Initial analysis activity focuses on the profile of the users who will interact with
the system; i.e., skill level, business understanding, and general receptiveness to
the new system are recorded.
Once general requirements have been defined, a more detailed task analysis is
conducted. Those tasks that the user performs to accomplish the goals of the
system are identified, described and elaborated.
The information gathered as part of the analysis activity is used to create an
analysis model for the interface. The goal is to define a set of interface objects and
actions that enable the user to perform all defined tasks in the manner that meets
every usability goal defined for the system.
Validation focuses on:
o the ability of the interface to implement every user task correctly, to
accommodate all task variations, and to achieve all general user
requirements;
o the degree to which the interface is easy to use and easy to learn; and
o the user's acceptance of the interface as a useful tool in their work.
a) Modularization Techniques
When we decompose a system into modules, we must be able to describe the overall modular
structure precisely and state the relationships among the individual modules.
Functional Independence
b) Top-down Vs bottom-up
What strategy should we follow when we design a system: top-down or bottom-up? Both
strategies have strong and weak points. Criticisms of the top-down strategy include:
Information hiding proceeds mainly bottom-up. It suggests that we should first recognize
what we wish to encapsulate within a module and then provide an abstract interface to
define the module's boundaries as seen from its clients. Note, however, that the
decision of what to hide inside a module may depend on the result of some top-down
activity. Since information hiding has proven highly effective in supporting
design for change, program families, and reusable components, its bottom-up
philosophy should be followed in a consistent way.
Design, however, is a highly critical and creative human activity. Good designers do not
proceed in a strictly top-down or strictly bottom-up fashion. A typical design strategy
may proceed partly top-down and partly bottom-up, depending on the phase of design or
the nature of the application being designed; this is called yo-yo design. For example, we might
start decomposing a system top-down in terms of subsystems and, at some later point,
synthesize subsystems in terms of a hierarchy of information-hiding modules.
The top-down approach, however, is often useful as a way to document a design even if
the system has not been designed in a top-down fashion.
Software design is both a process and a model. The design process is a sequence of steps
that enable the designer to describe all aspects of the software to be built. The design
model is the equivalent of an architect’s plans for a house. It begins by representing the
totality of the thing to be built and slowly refines the thing to provide guidance for
constructing each detail. Similarly, the design model that is created for software provides
a variety of different views of the computer software.
to ensure the best modular design for the program.
The modules are implemented using structured programming principles.
Structured Chart
Among the structured design methods, the most popular notation is the structure
chart. A structure chart illustrates the modular design of a program: it shows how the
program has been partitioned into smaller, more manageable modules, the hierarchy and
organization of those modules, and the communication interfaces between
modules.
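The partitioning a structure chart documents can be mirrored directly in code. The following sketch uses hypothetical module names: a top "boss" module coordinates subordinate modules, and data passes only through their interfaces:

```python
# Subordinate modules: each performs one piece of the partitioned work.
def get_input(raw):
    """Afferent module: obtain and clean the input."""
    return raw.strip()

def transform(data):
    """Central transform: do the actual processing."""
    return data.upper()

def put_output(result):
    """Efferent module: format the result for output."""
    return f"REPORT: {result}"

# Boss module: the top of the structure chart; it only coordinates.
def produce_report(raw):
    return put_output(transform(get_input(raw)))

print(produce_report("  inventory status  "))   # REPORT: INVENTORY STATUS
```

Each arrow in the chart corresponds to one call and the data passed along it; no module reaches into another's internals.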
a. Identifying Objects
Look at other systems with which the system under consideration interacts
as a way of prompting potential classes and objects.
Ask what physical devices the system interacts with.
Examine the events that must be remembered and recorded (e.g., dates) and the
roles that people play, such as owner, manager, client.
Examine the physical or geographical locations of relevance and also the
organizational units, e.g., divisions and teams.
b. Define attributes
After objects are identified, their attributes should be defined. Attributes are data
variables contained within an object; they represent properties that describe the
state of the object. The fact that the internal processing and the details of the data
are hidden (or private) is known as encapsulation.
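Encapsulation can be sketched as follows; the class and attribute names are our own illustration:

```python
class Sensor:
    """Encapsulation: internal data and processing details are kept private."""
    def __init__(self):
        self.__readings = []            # hidden attribute: the object's state

    def record(self, value):
        self.__readings.append(value)   # internal detail callers never touch

    def average(self):
        """The only public view of the state: a derived value."""
        return sum(self.__readings) / len(self.__readings)

s = Sensor()
s.record(10)
s.record(20)
print(s.average())   # 15.0
```

Callers see only `record` and `average`; how the readings are stored, or even that a list is used at all, can change without affecting any client.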
c. Defining services
A class represents a kind of person, place, or thing about which the system must
capture and store information. Organize the basic classes and objects into
hierarchies so that the benefits of inheritance can be realized. This involves
identifying those aspects of objects that are common or generalized and
separating them from those that are specific; in other words, we identify
structures on classes.
Generalization example: student (a specialized class within a hierarchy).
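The student generalization can be sketched as a class hierarchy. The Person superclass and the field names here are our own assumptions, added only to illustrate the common/specific split:

```python
class Person:
    """Generalized class: aspects common to every kind of person."""
    def __init__(self, name):
        self.name = name

class Student(Person):
    """Specialized class: inherits the common aspects and adds its own."""
    def __init__(self, name, student_id):
        super().__init__(name)          # reuse the generalized part
        self.student_id = student_id    # the specific part

s = Student("Almaz", "S-100")
print(s.name, s.student_id)   # Almaz S-100
```

The common attribute (`name`) lives in the generalized class, while only what is specific to students lives in the specialized one; this is the separation the text describes.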
Error return conventions and exception handling mechanisms: the way error
conditions are reported by different functions in a program, and the way common
exception conditions are handled, should be standardized within an organization.
For example, when they encounter an error condition, different functions should
consistently return, say, a 0 or a 1.
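A sketch of such a convention (the convention itself and the function names are hypothetical, chosen only to show consistency): every function returns 0 on success and 1 on error, so callers never have to guess what a return value means:

```python
# A hypothetical organization-wide convention: 0 = success, 1 = error,
# applied consistently by every function in the program.
OK, ERROR = 0, 1

def update_record(store, key, value):
    """Update an existing record; ERROR if the key is absent."""
    if key not in store:
        return ERROR
    store[key] = value
    return OK

def delete_record(store, key):
    """Delete an existing record; ERROR if the key is absent."""
    if key not in store:
        return ERROR
    del store[key]
    return OK

store = {"A": 1}
print(update_record(store, "A", 2), delete_record(store, "B"))   # 0 1
```

Because both functions follow the same convention, error handling at every call site looks identical, which is exactly what the standardization aims for.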
A code walk-through is an informal technique for the analysis of code. A code walk-through
of a module is undertaken after the coding of the module is complete. In this
technique, after a module has been coded, members of the development team select some
test cases and simulate execution of the code by hand. Even though a code walk-through
is an informal analysis technique, several guidelines have evolved over the years for
making this simple but useful analysis technique more effective. Of course, these
guidelines are based on personal experience, common sense, and several subjective
factors, and hence should be considered as examples rather than rules to be applied
dogmatically. Some of these guidelines are given below:
The team performing a code walk-through should be neither too big nor too
small; ideally, it should consist of three to seven members.
Discussions should be focused on the discovery of errors and not on how to fix
the discovered errors.
In order to foster cooperation and avoid the feeling among the engineers that
they are being evaluated, managers should not participate in the discussions.
III. Code Inspections
When we develop a software product we not only develop the executable files and the
source code but also develop various kinds of documents such as users' manual, software
requirements specification (SRS) document, design document, test document, installation
manual, etc. All these documents are a vital part of any good software development
practice. Good documents enhance understandability and maintainability of software
product. Different types of software documentation can be broadly classified as:
• Internal documentation, and
• External documentation (supporting documents).
Internal documentation comprises the code comprehension features provided as part of the source
code itself. It is provided through appropriate module headers and
comments embedded in the source code. Internal documentation is also provided through
the use of meaningful variable names, code indentation, code structuring, use of
enumerated types and constant identifiers, use of user-defined data types, etc. Most
software development organizations usually ensure good internal documentation by
appropriately formulating their coding standards and coding guidelines.
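The internal documentation features listed above can be seen together in one short module. This is an illustrative sketch; the module name, thresholds, and report domain are our own assumptions:

```python
"""inventory_report.py -- module header: purpose and scope of the module.

Purpose: classify stock levels for the weekly inventory status report.
"""

from enum import Enum

class StockLevel(Enum):          # user-defined enumerated type, not magic numbers
    OUT_OF_STOCK = 0
    LOW = 1
    ADEQUATE = 2

REORDER_THRESHOLD = 10           # constant identifier instead of a bare literal

def classify(quantity_on_hand):
    """Map a quantity on hand to a StockLevel (thresholds are illustrative)."""
    if quantity_on_hand == 0:
        return StockLevel.OUT_OF_STOCK
    if quantity_on_hand < REORDER_THRESHOLD:
        return StockLevel.LOW
    return StockLevel.ADEQUATE

print(classify(3).name)   # LOW
```

The module header, the meaningful names, the enumerated type, the constant identifier, and the embedded comments are each a form of internal documentation; none of them requires a separate document to maintain.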