Prototype Version 1: Tasks for Interviewees

Task #1: Try to get in contact with a research project in your degree program

This study evaluates a system designed to help undergraduate students find research opportunities. The evaluation combines user testing with structured interview protocols, UX questionnaires, heuristic evaluation, and prototype testing. Participants complete tasks such as contacting research projects, finding FAQs, and locating research projects associated with specific professors. Data collection includes both quantitative and qualitative measures, along with observations of user interactions. The goal is to identify usability strengths, weaknesses, and areas for improvement in order to refine the design of an undergraduate research opportunity website.


In the context of enhancing undergraduate students' access to research opportunities, designing an intuitive and efficient online platform is crucial. The methodology involving user testing, heuristic evaluation, and prototype refinement aims to optimize usability, relevance, and engagement. This comprehensive approach combines practical tasks, user feedback, and systematic evaluation to inform interface improvements that align with the needs and expectations of undergraduates seeking research positions.

The initial tasks developed for participant engagement focus on core functionalities such as connecting students to research projects, accessing FAQs, and interacting with faculty profiles. Participants are guided through these tasks without assistance to observe natural interaction patterns, leveraging think-aloud protocols to capture their thought processes. Quantitative measures—such as perceived system complexity, ease of use, confidence, stress levels, and error frequency—are complemented by qualitative insights into their user experience, suggestions for improvements, and perceptions of system efficiency.
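The quantitative measures listed above can be summarized per task once questionnaires are collected. The following is a minimal sketch, assuming 5-point Likert ratings and hypothetical response data; the measure names and values are illustrative, not the study's actual dataset.

```python
from statistics import mean, stdev

# Hypothetical post-task questionnaire responses (1 = low, 5 = high)
# for the quantitative measures described above; real values would come
# from each participant's completed form.
responses = {
    "perceived_complexity": [2, 3, 2, 4, 3],
    "ease_of_use":          [4, 3, 4, 2, 3],
    "confidence":           [4, 4, 3, 3, 4],
    "stress_level":         [2, 3, 2, 4, 2],
}
# Error frequency is a count per task attempt, not a Likert rating.
errors_per_task = [0, 2, 1, 3, 1]

for measure, scores in responses.items():
    print(f"{measure}: mean={mean(scores):.2f} sd={stdev(scores):.2f}")
print(f"errors_per_task: mean={mean(errors_per_task):.2f}")
```

Reporting both the mean and the spread of each measure makes it easier to spot tasks where participants disagreed sharply, which often signals an interaction problem worth observing directly.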

Research findings reveal significant issues with dead links, aesthetic inconsistencies, and navigation confusion, all of which impair the platform's effectiveness. These problems violate established usability heuristics, including visibility of system status, error prevention, and minimalist design. For instance, participants frequently encountered dead links, leading to frustration and decreased trust in the platform. Likewise, confusing label names and cluttered interfaces hindered task success, underscoring the need for clearer navigation labels and streamlined content.

Heuristic evaluation further identified strengths such as consistency and user control, which promote a sense of familiarity and freedom in navigation. However, weaknesses like lack of helpful guidance, absence of profile features, and inadequate error feedback limited overall usability. Addressing these issues involves the integration of sign-in capabilities, refined filtering options, clearer labeling, and contextual help features.
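Heuristic findings like these are typically prioritized by severity before fixes are planned. A minimal sketch, using Nielsen's 0-4 severity scale (0 = not a problem, 4 = usability catastrophe); the specific findings and ratings below are hypothetical examples patterned on the issues described above.

```python
from collections import Counter

# Hypothetical heuristic-evaluation findings as (heuristic, severity) pairs,
# rated on Nielsen's 0-4 severity scale.
findings = [
    ("visibility of system status", 3),      # dead links give no feedback
    ("error prevention", 3),                 # dead links reachable from nav
    ("aesthetic and minimalist design", 2),  # cluttered interface
    ("help and documentation", 2),           # lack of contextual guidance
    ("consistency and standards", 0),        # noted as a strength
    ("user control and freedom", 0),         # noted as a strength
]

# Rank findings so the most severe problems are addressed first.
by_severity = sorted(findings, key=lambda f: f[1], reverse=True)
violation_count = Counter(h for h, s in findings if s > 0)

for heuristic, severity in by_severity:
    print(f"severity {severity}: {heuristic}")
```

Sorting by severity turns a flat list of evaluator notes into an ordered fix queue, which is how the sign-in, filtering, labeling, and contextual-help recommendations above would be scheduled across iterations.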

Prototype testing with paper models allowed for early validation of interface design. Participants appreciated the simplicity, intuitive interactions, and logical flow, but also pointed out limitations such as limited filters, confusing tab names, and external interruptions. Their feedback informed concrete recommendations, including adding more filter options (e.g., deadline, relevance), renaming tabs for clarity, and implementing login features for personalized experiences. The observed behaviors indicated that reducing unnecessary content and improving visual clarity would significantly enhance user engagement.
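The recommended filter options can be sketched as a small data model and filter function. This is an illustrative design under stated assumptions: the field names, professor names, and project titles below are hypothetical, not the platform's actual schema.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical model of a research-project listing, carrying the two
# recommended filter attributes: application deadline and relevance.
@dataclass
class Project:
    title: str
    professor: str
    deadline: date
    relevance: float  # match score against the student's degree, 0.0-1.0

def filter_projects(projects, before=None, min_relevance=0.0):
    """Return projects due on/before `before` (if given) with
    relevance >= min_relevance, most relevant first."""
    return sorted(
        (p for p in projects
         if (before is None or p.deadline <= before)
         and p.relevance >= min_relevance),
        key=lambda p: p.relevance,
        reverse=True,
    )

# Example usage with hypothetical listings.
projects = [
    Project("Robotics lab assistant", "Dr. Vega", date(2024, 5, 1), 0.9),
    Project("Archive digitization", "Dr. Ames", date(2024, 3, 15), 0.4),
]
top = filter_projects(projects, min_relevance=0.5)
```

Exposing deadline and relevance as explicit filter parameters directly addresses the "limited filters" complaint, and sorting by relevance keeps the strongest matches at the top of the results list.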

Overall, the iterative testing process demonstrated that user-centered design principles are essential for developing effective undergraduate research portals. The combination of direct observation, structured questionnaires, heuristic analysis, and prototyping provided comprehensive insights into user needs. By prioritizing clarity, relevance, and accessibility, the platform can better serve its target audience, facilitating meaningful research engagement and supporting academic growth for undergraduates.
