Week 2 Assignment Proposal

Scenario: Your Team Has Been Tasked with Delivering a Section 508-Validated Application

Your team has been tasked with delivering a Section 508-validated application for U.S. Department of Homeland Security (DHS) field officers to check potential immigrants for connections to criminal or terrorist behavior. The application must meet these requirements: conform to Section 508 accessibility guidelines; allow wireless, real-time, bidirectional data transfer and database queries with secure servers; provide a secure login that auto-locks after two minutes of non-use; and accommodate officers working at any day and time, in offices, in vehicles, or on foot along the U.S. border and in cities nationwide.

The first step is to define user needs and expectations to ensure a positive user experience. You will prepare a proposal for your company's IT executive that details user applicability and needs.

Introduction

The purpose of this proposal is to delineate the user environment and platform requirements essential for developing a secure, accessible, and efficient application for DHS field officers. Understanding these parameters ensures the technology accommodates the diverse scenarios in which officers operate, providing reliable access and functionality tailored to their operational contexts.

Operating Environments and User Profiles

The primary users of this application are DHS field officers deployed across various environments, including urban settings, border patrol routes, and remote inland locations. These users operate in conditions that often lack stable internet connectivity, necessitating robust offline capabilities with secure data synchronization when connections are available. Officers work in vehicles, on foot, at border checkpoints, and within office settings, each environment presenting unique challenges such as variable lighting, noise levels, and mobility constraints. Their operational hours span 24/7 shifts, demanding an application that supports continuous, uninterrupted access regardless of time zone or location.
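
To make the offline requirement concrete, the sketch below shows one way a field device might queue query records locally while disconnected and push them to the secure server once connectivity returns. This is a minimal illustration only: the OfflineSyncQueue class, its table layout, and the send callback are hypothetical and not part of any DHS system, and a production version would encrypt payloads at rest.

```python
import json
import sqlite3
import time

# Hypothetical offline-first queue: records created in the field are stored
# locally and pushed to the secure server when a connection is available.
class OfflineSyncQueue:
    def __init__(self, db_path="field_queue.db"):
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS pending ("
            "id INTEGER PRIMARY KEY AUTOINCREMENT, "
            "payload TEXT NOT NULL, "
            "created_at REAL NOT NULL)"
        )

    def enqueue(self, record):
        # Store the record locally; a real system would encrypt it at rest.
        self.conn.execute(
            "INSERT INTO pending (payload, created_at) VALUES (?, ?)",
            (json.dumps(record), time.time()),
        )
        self.conn.commit()

    def flush(self, send):
        # Push pending records with send(record) -> bool, deleting each one
        # only after the server confirms receipt; stop at the first failure.
        sent = 0
        rows = self.conn.execute(
            "SELECT id, payload FROM pending ORDER BY id"
        ).fetchall()
        for row_id, payload in rows:
            if not send(json.loads(payload)):
                break
            self.conn.execute("DELETE FROM pending WHERE id = ?", (row_id,))
            self.conn.commit()
            sent += 1
        return sent
```

Deleting a record only after confirmed delivery means a dropped connection mid-sync does not lose data; the next flush simply retries the remaining queue.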

Given these diverse environments, the supported platforms must include tablets and ruggedized laptops running Windows, iOS, or Android. Interfaces should be designed for simplicity, with large touch targets and clear visuals to accommodate gloved hands and limited-visibility conditions. Responsive design principles should be employed so the interface adapts seamlessly across devices and remains easy to use in rapid-response situations. Accessibility features must align with Section 508 standards, including screen reader compatibility, high-contrast modes, and keyboard navigability.
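
As one concrete check supporting the high-contrast requirement, the sketch below computes the WCAG 2.1 contrast ratio between two colors, which design reviewers could use to confirm that proposed themes meet the 4.5:1 minimum for normal text. The helper names and sample colors are illustrative assumptions, not part of an existing codebase.

```python
def relative_luminance(hex_color):
    # Relative luminance per WCAG 2.1, from an sRGB hex color such as "#1A2B3C".
    hex_color = hex_color.lstrip("#")
    linear = []
    for i in (0, 2, 4):
        c = int(hex_color[i:i + 2], 16) / 255.0
        # Linearize each sRGB channel before weighting.
        linear.append(c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4)
    r, g, b = linear
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(foreground, background):
    # Contrast ratio is (L1 + 0.05) / (L2 + 0.05), with L1 the lighter color.
    l1, l2 = relative_luminance(foreground), relative_luminance(background)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

# Example: white text on a dark navy background (placeholder theme colors).
ratio = contrast_ratio("#FFFFFF", "#0B1F3A")
print(f"contrast ratio = {ratio:.2f}:1, meets WCAG AA for normal text: {ratio >= 4.5}")
```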

Methods for Determining User Needs

To accurately capture user needs, we will sample DHS officers through a combination of surveys and semi-structured interviews. The survey will be distributed to a representative sample of officers from different operational contexts (urban, border, remote areas) to gather quantitative data on device preferences, common tasks, and perceived usability issues. Follow-up interviews will provide qualitative insights into specific challenges faced during field operations, communication needs, and accessibility requirements.

The rationale for choosing this mixed-method approach is to obtain comprehensive data that merges broad statistical trends with in-depth contextual understanding. Surveys allow us to gather data from a large, diverse group efficiently, while interviews help clarify nuanced operational challenges that are not easily captured through questionnaires.

Criteria for Data Collection and Analysis

Data collection will focus on device preferences, frequency of use, task complexity, and accessibility considerations. Questions for the survey may include: “What device do you most frequently use in the field?” “What are your biggest challenges when accessing the application?” and “Which accessibility features are most critical to your use?” For interviews, we will explore themes such as user frustrations, safety concerns, and suggestions for interface improvements.
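
To illustrate how responses to these questions might be coded for analysis, the short sketch below represents each survey response as a record with the officer's primary device, operating context, and a usability rating, then tallies device preferences and averages ratings by context. The field names and values are hypothetical placeholders, not collected data.

```python
from collections import Counter

# Hypothetical coded survey responses (placeholder data, not real results).
responses = [
    {"device": "rugged_tablet", "context": "border", "usability": 4},
    {"device": "rugged_laptop", "context": "office", "usability": 5},
    {"device": "rugged_tablet", "context": "urban", "usability": 3},
    {"device": "smartphone", "context": "remote", "usability": 2},
]

# Tally device preferences across all respondents.
device_counts = Counter(r["device"] for r in responses)

# Average usability rating within each operating context.
ratings_by_context = {}
for r in responses:
    ratings_by_context.setdefault(r["context"], []).append(r["usability"])
average_ratings = {ctx: sum(v) / len(v) for ctx, v in ratings_by_context.items()}

print(device_counts)
print(average_ratings)
```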

Data analysis will involve both descriptive and inferential statistics. Descriptive statistics will summarize the data: frequency counts and modes for categorical responses such as device preference, and means and medians for usability ratings. Inferential analysis will use chi-square tests to determine whether device preferences differ significantly across user groups, and independent-samples t-tests to assess differences in perceived usability scores. These tests are chosen for their suitability for categorical and continuous data, respectively, and indicate whether observed differences are statistically meaningful or attributable to chance.
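
A minimal sketch of the planned inferential tests appears below, using scipy.stats with made-up counts and ratings purely to show the mechanics; the group sizes, scores, and conclusions would come from the actual survey data.

```python
import numpy as np
from scipy import stats

# Hypothetical contingency table: rows are user groups (urban, border, remote),
# columns are preferred devices (tablet, laptop, smartphone).
device_preferences = np.array([
    [30, 10, 15],   # urban officers
    [45, 20, 5],    # border officers
    [12, 25, 8],    # remote officers
])

# Chi-square test of independence: is device preference related to user group?
chi2, p_chi, dof, expected = stats.chi2_contingency(device_preferences)
print(f"chi-square = {chi2:.2f}, p = {p_chi:.4f}, dof = {dof}")

# Hypothetical perceived-usability scores (1-10) for two device groups.
tablet_scores = [7, 8, 6, 9, 7, 8, 5, 7]
laptop_scores = [6, 5, 7, 6, 4, 6, 5, 6]

# Welch's independent-samples t-test: do mean usability scores differ?
t_stat, p_t = stats.ttest_ind(tablet_scores, laptop_scores, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_t:.4f}")
```

A p-value below 0.05 in either test would be read as evidence that the observed differences are unlikely to be due to chance, feeding directly into the design decisions described in the next paragraph.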

Interpreting these results allows us to tailor the application’s design features to user preferences, optimize usability, and enhance accessibility. For example, if data indicate significant usability issues with certain devices or interface features, targeted modifications can be implemented accordingly.

Supporting Resources

This proposal draws on established principles of user-centered design (Norman, 2013), accessibility standards (W3C, 2018), and usability testing methodologies (Dumas & Redish, 1999). These references inform our approach to designing an inclusive, effective application that enhances safety and operational efficiency for DHS officers.

References

  • Norman, D. A. (2013). The Design of Everyday Things: Revised and Expanded Edition. Basic Books.
  • W3C. (2018). Web Content Accessibility Guidelines (WCAG) 2.1. https://www.w3.org/TR/WCAG21/
  • Dumas, J. S., & Redish, J. C. (1999). A Practical Guide to Usability Testing. Intellect Books.
  • Shneiderman, B., Plaisant, C., Cohen, M., Jacobs, S., & Elmqvist, N. (2016). Designing the User Interface: Strategies for Effective Human-Computer Interaction. Pearson.
  • Gulliksen, J., Göransson, B., Boivie, I., Blomkvist, S., & Bengtsson, J. (2009). Key principles for user-centred systems design. Behaviour & Information Technology, 28(6), 599–609.
  • Sears, A., & Jacko, J. A. (Eds.). (2009). Human-Computer Interaction: Design Issues, Solutions, and Opportunities. CRC Press.
  • McGrenere, J., & Ho, W. (2000). Affordances: Clarifying and evolving a concept. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 179–186.
  • Hassenzahl, M. (2010). Experience Design: Technology for All the Right Reasons. Synthesis Lectures on Human-Centered Informatics, 3(1), 1–95.
  • Carroll, J. M. (1997). Human-Computer Interaction: Psychology as a Science of Design. Annual Review of Psychology, 48, 61–83.
  • ISO. (2010). ISO 9241-210:2010 Ergonomics of human-system interaction — Part 210: Human-centred design for interactive systems. International Organization for Standardization.