Lab 6: Information Gathering – Unobtrusive Methods
Course Code: CSE 4408
Course Title: System Analysis and Design Lab
Group Name: Team Melancholy
Group Members:
- Mahiul Kabir (220041109)
- Nayeem Hossain (220041139)
- Sohom Sattyam (220041141)
Sampling Strategy Design
Team Melancholy identified two key stakeholder groups for the ReliefNet
project. Older, Educated Villagers can provide crucial insights into
community needs, local dynamics, and long-term recovery requirements[1].
Experienced Volunteers offer valuable operational knowledge from the
field, identifying practical challenges and effective solutions in aid
distribution[2]. Both groups are targeted to ensure diverse perspectives. We
will employ a blended sampling approach combining Purposive and
Convenience methods[3]. Purposive sampling will intentionally select
knowledgeable informants (e.g. community elders and long-term volunteers)
for deep insights, while convenience sampling will include readily available
participants for broader representation[3]. This combined strategy “ensures
both rich, detailed insights and a representative understanding of broader
experiences”[4].
The sampling procedure will follow these four steps, as outlined in the
slides[5][6]:
1. Defining Inclusion Criteria: Specify participant demographics (e.g.
age, education), experience levels (e.g. years volunteering), and
geographic areas of interest[7].
2. Sourcing Participant Lists: Obtain contact lists from local NGOs,
community leaders, and disaster response coordinators[8].
3. Selecting Participants: From these lists, choose 15–20 individuals
ensuring diversity across affected regions and backgrounds[9].
4. Inviting and Scheduling: Contact potential participants via phone or
local coordinators and schedule interviews or field visits[10].
This procedure ensures we capture a variety of perspectives while focusing
on those most likely to provide relevant information[5][6].
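The selection logic in steps 1–3 can be sketched in code. Note that the candidate records, field names, and inclusion thresholds below are illustrative assumptions for demonstration, not actual ReliefNet data or criteria:

```python
import random

# Hypothetical candidate records, standing in for lists obtained from NGOs
# and coordinators (step 2). Field names and values are illustrative only.
candidates = [
    {"name": "A", "age": 61, "education": "secondary", "years_volunteering": 0, "region": "North"},
    {"name": "B", "age": 34, "education": "primary",   "years_volunteering": 6, "region": "South"},
    {"name": "C", "age": 58, "education": "tertiary",  "years_volunteering": 0, "region": "South"},
    {"name": "D", "age": 27, "education": "secondary", "years_volunteering": 4, "region": "North"},
    {"name": "E", "age": 30, "education": "none",      "years_volunteering": 1, "region": "North"},
]

def purposive_pool(people):
    """Step 1: keep only candidates meeting an inclusion criterion
    (older educated villager OR experienced volunteer). Thresholds are assumed."""
    return [p for p in people
            if (p["age"] >= 50 and p["education"] != "none")  # elder criterion
            or p["years_volunteering"] >= 3]                  # veteran criterion

def select_participants(people, target=15, seed=0):
    """Step 3: draw up to `target` people, spreading picks across regions."""
    rng = random.Random(seed)
    by_region = {}
    for p in purposive_pool(people):
        by_region.setdefault(p["region"], []).append(p)
    chosen = []
    # Round-robin over regions so no single area dominates the sample.
    while len(chosen) < target and any(by_region.values()):
        for pool in by_region.values():
            if pool and len(chosen) < target:
                chosen.append(pool.pop(rng.randrange(len(pool))))
    return chosen

participants = select_participants(candidates, target=3)
```

Purposive criteria narrow the pool first; the round-robin draw then spreads the 15–20 selections across regions, approximating the diversity requirement of step 3.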
Investigation Plan for Existing Data
Our investigation will use three existing data sources:
- System Error Logs (Quantitative): Structured logs from ReliefNet and related tools. These include numeric records of system performance (e.g. error counts, timestamps, system uptime)[11]. As such, they yield quantitative data (numerical metrics of system health).
- Volunteer Field Reports (Qualitative): Narrative reports written by volunteers in the field[12]. These contain descriptive accounts of on-the-ground challenges and successes. By nature they are qualitative (capturing experiences and observations)[13].
- NGO Aid Reports (Quantitative): Official reports from NGOs detailing aid distribution (e.g. quantities of supplies delivered by location)[14]. These present aggregated figures and patterns of resource utilization, making them quantitative records of overall aid activity.
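As a rough illustration of how the quantitative error logs could be summarized, the sketch below counts ERROR entries per day. The log-line format is an assumption made for demonstration; ReliefNet's actual logs may be structured differently:

```python
from collections import Counter
from datetime import datetime

# Illustrative log lines; the real ReliefNet log format is assumed, not known.
log_lines = [
    "2024-08-01T10:02:11 ERROR sync-failed node=12",
    "2024-08-01T10:05:40 INFO heartbeat node=12",
    "2024-08-02T09:11:03 ERROR timeout node=7",
    "2024-08-02T09:12:30 ERROR sync-failed node=7",
]

def errors_per_day(lines):
    """Count ERROR entries per calendar day -- a simple quantitative health metric."""
    counts = Counter()
    for line in lines:
        timestamp, level, *_ = line.split()
        if level == "ERROR":
            day = datetime.fromisoformat(timestamp).date().isoformat()
            counts[day] += 1
    return dict(counts)

print(errors_per_day(log_lines))  # prints {'2024-08-01': 1, '2024-08-02': 2}
```

The same pattern extends to other metrics named above (uptime windows, per-node error rates) once the real log schema is known.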
Volunteer field reports will be the primary focus for deeper analysis. These
qualitative documents can reveal common operational issues (for example,
difficulties with manual verification or communication) and volunteer-reported
needs. For instance, one field anecdote notes that “paper lists
slow verification”[15], suggesting volunteers face bottlenecks with manual
record-keeping. Such insights inform system requirements: for example, they
imply a need for mobile data entry and offline syncing to replace paper
forms. Reports also highlight the importance of connectivity; volunteers
frequently mention that Internet/radio communication is vital for
updates[16], which suggests the system UI should prioritize real-time
notification and simple communication tools. In designing UI/UX, we must
ensure the interface is extremely clear and intuitive; as one UX analysis
notes, in emergencies the system should “do the thinking for [users]” so that
“a poorly designed user interface… could lead to hesitation” or delays[17].
Finally, we note limitations of field reports: they are anecdotal and may be
incomplete or biased (only reflecting volunteers’ perspectives at specific
times), so they must be corroborated with other data. Nonetheless, they
provide valuable context that can shape requirements and interface design
to address actual field challenges[15][16].
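One simple way to begin corroborating recurring themes across many field reports is a keyword tally. The sample excerpts and the issue-keyword set below are illustrative assumptions chosen by the analyst, not actual report data:

```python
import re
from collections import Counter

# Sample excerpts standing in for volunteer field reports (illustrative text).
reports = [
    "Paper lists slow verification at the camp entrance.",
    "Radio updates were vital; paper forms delayed our evening report.",
    "Verification queues grew because the paper list was out of date.",
]

# Issue keywords are an analyst-chosen assumption, not derived from ReliefNet.
ISSUE_TERMS = {"paper", "verification", "radio", "delayed"}

def tally_issues(texts):
    """Count how often each issue keyword appears across all reports."""
    counts = Counter()
    for text in texts:
        for word in re.findall(r"[a-z]+", text.lower()):
            if word in ISSUE_TERMS:
                counts[word] += 1
    return counts

issue_counts = tally_issues(reports)
```

A tally like this only flags candidate themes; as noted above, the reports are anecdotal, so any pattern found must still be corroborated against the quantitative sources.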
Observation Planning – Analyst’s Playscript
For our analysis, we assume the role of a Field Aid Worker operating in a
disaster field office. This temporary coordination hub (near the disaster zone)
brings together NGOs, volunteers, and officials[18]. From that perspective,
the current manual workflow unfolds as follows:
1. Verbal Instructions – Tasks are conveyed orally (word-of-mouth). Aid
workers receive verbal directions from coordinators, which often leads
to miscommunications and delays[19].
2. Uncentralized Collection – Supplies are gathered without a central
inventory. Workers collect items from multiple locations or donors,
risking duplication of effort or shortages[20].
3. Inefficient Navigation – Transport routes rely on local, ad-hoc
navigation. Field workers use personal experience to reach sites,
causing slow travel and difficulty reaching remote areas[21].
4. Manual Verification – Beneficiary registration is done via paper lists.
Checking recipients against handwritten lists is time-consuming and
error-prone[22].
5. No Real-Time Tracking – Distributed aid has no immediate digital record. Because aid distribution is logged on paper, there is no real-time oversight of where resources have been sent[23].
6. Hand-Written Incident Logging – Problems and incidents are noted
by hand in notebooks. This fragmented record-keeping causes delays
in addressing issues, as notes must be later processed[24].
7. Delayed Reporting – Field information is reported later via phone
calls or paper forms. End-of-day reports are compiled manually,
resulting in significant delays and potential data-entry errors[25].
Each of these steps is inefficient and fragmented in the current workflow[26][27]. The playscript highlights the need for digitized, real-time tools to replace these cumbersome manual processes.
Observation Planning – STROBE Analysis
Our field observation of the disaster management office reveals several
organizational themes:
- Real-Time Coordination: The team emphasizes immediate collaboration. For example, open-plan desks and whiteboards are used for shift planning and strategy[28]. Internet and radio links provide continuous updates, enabling rapid response. This reflects a culture prioritizing coordination.
- Manual Process Inefficiency: Despite this, many processes remain paper-based. For instance, staff rely on handwritten beneficiary lists and logs, causing slow verification and data fragmentation[15]. This theme indicates outdated processes that hamper efficiency.
- Resource Constraints: Physical limitations are apparent. Observers noted that storage space for relief kits is limited[29], meaning supplies must often be kept in flexible configurations. Such constraints lead to improvised logistics.
These themes suggest the organization values quick communication and
planning, but is hindered by manual record-keeping and limited physical
resources[30][15]. Table 1 summarizes our STROBE observations of the field
office environment.
Table 1: STROBE Observation of the Disaster Field Office
| Aspect | Observation Detail |
| --- | --- |
| Office location | Temporary field office near the disaster zone, facilitating coordination among NGOs, volunteers, and agencies[18]. |
| Desk placement | Open desks and whiteboards arranged for real-time planning and collaboration[28]. |
| Stationery | Whiteboards, flip charts, printed maps, and paper forms present for task tracking and planning. |
| Props | Relief kits stacked and ready for deployment[31]; radios, laptops, and phones on desks for communication. |
| External sources | Continuous Internet and radio communication provide updates[32]; official guidelines may be posted on walls. |
| Lighting/Color | Bright fluorescent lighting and white walls/boards ensure clear visibility in the workspace. |
| Clothing | Volunteers and staff wear identifiable vests or uniforms for role recognition and team coordination. |
Table 1 highlights how the environment is organized to support collaborative
planning (open layouts, communication tools) and the physical realities of a
field office (limited space, basic equipment)[30]. These observations will
guide design considerations (e.g. the need for digital displays and alerts
rather than paper charts).
Synthesis with Interactive Methods
Integrating the STROBE observations with our Lab 5 interviews yields the
following anecdotes, each aligned with prior findings:
- Whiteboards for shift planning (✓ confirms): We observed wall-mounted whiteboards detailing schedules. The notes emphasize they are “crucial for effective shift assignments and real-time coordination”[33]. This confirms Lab 5 survey feedback that volunteers rely heavily on visual planning tools for organizing tasks.
- Limited storage for kits (✗ negates): The field office had minimal storage space, requiring kits to be moved frequently. The report notes “insufficient storage necessitates more flexible solutions”[29]. This contradicted earlier assumptions (from interviews) that supplies were pre-staged adequately. Instead, volunteers expressed surprise at the ad-hoc arrangements.
- Paper lists slow verification (✗ negates): Manual beneficiary lists were used at the office, and staff noted these were a “major bottleneck, causing delays in beneficiary verification”[15]. This finding challenges any prior expectation that on-site checks were efficient; interviewees now emphasize a need to digitize this process.
- Internet/radio vital for updates (✓ confirms): The observation highlighted constant radio and internet use. Field notes state this “is critical for timely communication from remote areas”[16]. This aligns with volunteer interviews: respondents confirmed that connectivity tools are essential and that any system must support real-time alerts.
- Open desk layout aids collaboration (✓ confirms): The office’s open plan with no cubicles encouraged teamwork. Observers noted this setup “significantly fosters team collaboration and quick decision-making”[34]. This confirms Lab 5 findings that team members valued the ability to communicate face-to-face across shifts.
Each anecdote above is marked with ✓ (confirms) or ✗ (negates) to indicate
whether it supports or contradicts our Lab 5 insights. Together, they
illustrate how unobtrusive observations (STROBE) complement interactive
interviews and surveys, validating some findings (✓) and revealing new
issues (✗) for the ReliefNet requirements and design.
[1]–[12], [14]–[16], [18]–[34] Lab-6.pdf
[13] Qualitative Data in Nonprofit Program Evaluation: FAQ Guide — Mission Capital. https://www.missioncapital.org/blog/qualitative-data-in-nonprofit-program-evaluation-faq-guide
[17] UX Design for Crisis Situations: Lessons from the Los Angeles Wildfires :: UXmatters. https://www.uxmatters.com/mt/archives/2025/03/ux-design-for-crisis-situations-lessons-from-the-los-angeles-wildfires.php