NUST Wants To Create Awareness About Its Strategic Plan
NUST wants to create awareness about its Strategic Plan, which outlines goals and KPIs. The university community and the public do not effectively read or engage with the plan. For Software Engineering and Human-Computer Interaction, you are required to create a prototype (a game) that helps stakeholders learn, memorize, and understand the strategic direction of the plan. You are free to propose your own game ideas. Deliver a functional prototype description, software architecture, user interface and HCI considerations, gameplay mechanics, assessment and feedback mechanisms, an implementation plan, and sample code snippets for the prototype.
Executive summary
This paper proposes a web-based gamified prototype, "NUST Quest", designed to make the NUST Strategic Plan accessible, memorable, and engaging for staff, students, alumni and the public. NUST Quest employs microlearning, narrative mapping, quizzes, and spaced repetition to communicate goals and KPIs. The prototype includes a clear software architecture, HCI-driven UI, gameplay mechanics tied directly to strategic-plan content, assessment and analytics for measuring learning outcomes, and implementation guidance with sample code snippets for a minimal viable product (MVP).
Learning objectives mapped to game goals
Primary learning outcomes: (1) Stakeholders can recall the five strategic goals and key KPIs; (2) Stakeholders can explain two flagship programmes and at least three KPIs; (3) Increased engagement with the full strategic plan document.
Game goals mirror these: progress through the Strategic Map, unlock Goal Badges, complete KPI Challenges, and produce a personal Action Pledge.
Core concept and mechanics
NUST Quest is a progressive narrative map where each node represents a strategic goal or KPI cluster. Players navigate the map via short sessions (5–10 minutes) composed of: interactive micro-lessons (text + visuals), scenario-based multiple-choice questions, mini-simulations (choose actions to meet KPIs), and reflection prompts requiring a written pledge. Mechanics include points, badges, streaks (spaced repetition reminders), and optional leaderboards. Correct answers yield points and reveal a short evidence card linking to the relevant section of the Strategic Plan (transparency and learn-more affordance).
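The points-and-badges mechanic above can be sketched as a small pure function. This is a minimal illustration, assuming hypothetical badge names and point thresholds; the real reward table would be authored alongside the strategic-plan content.

```javascript
// Hedged sketch of the points-and-badges mechanic for NUST Quest.
// Badge names and thresholds are illustrative assumptions, not taken from the plan.
const BADGE_THRESHOLDS = [
  { badge: 'Goal Explorer', points: 50 },
  { badge: 'KPI Challenger', points: 150 },
];

function awardPoints(profile, earned) {
  const points = profile.points + earned;
  // Unlock every badge whose threshold the new total has reached
  const badges = BADGE_THRESHOLDS
    .filter(t => points >= t.points)
    .map(t => t.badge);
  return { ...profile, points, badges };
}

// Example: a player at 45 points completes a KPI Challenge worth 10 points
const updated = awardPoints({ points: 45, badges: [] }, 10);
```

Keeping the function pure (profile in, new profile out) makes the mechanic trivial to unit-test and to replay from analytics events.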
HCI and accessibility considerations
Design follows ISO 9241-210 human-centered principles and Nielsen’s usability heuristics: clear affordances, consistent navigation, error prevention, and low cognitive load (ISO, 2010; Nielsen, 1994). The UI uses high-contrast color schemes, scalable fonts, keyboard navigation, and ARIA roles for screen readers to ensure accessibility. Content is chunked into micro-units and employs multimedia (icons, short animations) to support dual-coding (visual + verbal) for memory retention (Norman, 2013).
Software architecture and tech stack
Recommended stack for rapid prototyping: a single-page application (SPA) using React or Vue for UI, Phaser or custom canvas for lightweight minigames, Node.js + Express backend, and MongoDB for content and user profiles. Architecture layers:
- Presentation: SPA with responsive design and PWA capabilities for offline microlearning.
- Application: REST API for authentication, content delivery, and analytics.
- Data: Document DB storing strategic-plan content, question bank, user progress, and analytics events.
- Integration: OAuth for campus SSO; optional LMS (Moodle) integration via LTI.
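The application layer above can be sketched as a plain handler function that an Express route (e.g. GET /api/nodes/:id) would delegate to. This is a dependency-free illustration; the endpoint path and store shape are assumptions for the sketch.

```javascript
// Hedged sketch of the application layer: an in-memory content store standing
// in for MongoDB, and a handler an Express route could delegate to.
const contentStore = new Map([
  ['goal-1', { id: 'goal-1', title: 'Building a Vibrant Learning Environment' }],
]);

function getNodeHandler(store, nodeId) {
  const node = store.get(nodeId);
  if (!node) {
    // Unknown node id: surface a REST-style 404
    return { status: 404, body: { error: 'Unknown node' } };
  }
  return { status: 200, body: node };
}
```

Separating the handler from the web framework keeps the content-delivery logic testable without spinning up a server, which suits an MVP timeline.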
Content model and mapping
Represent strategic-plan nodes as JSON documents. Each node contains title, summary (50–120 words), media (icon, short image), question set, and suggested action. Spaced-repetition metadata (interval, ease) supports reminders. Example content structure ensures content authors can update goals/KPIs without code changes.
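The spaced-repetition metadata (interval, ease) can be updated with a simplified SM-2-style rule. The constants below are illustrative assumptions; the production scheduler may tune or replace them.

```javascript
// Hedged sketch of the spaced-repetition update (a simplified SM-2-style rule).
// `interval` is in days, `ease` is a growth multiplier, both stored on the
// node's metadata as described above. Constants are illustrative assumptions.
function updateSchedule(meta, answeredCorrectly) {
  if (!answeredCorrectly) {
    // Reset: show the node again tomorrow and make it slightly "harder"
    return { interval: 1, ease: Math.max(1.3, meta.ease - 0.2) };
  }
  // Correct answer: grow the review interval by the ease factor
  return { interval: Math.round(meta.interval * meta.ease), ease: meta.ease + 0.1 };
}
```

For example, a node reviewed correctly at `{ interval: 4, ease: 2.5 }` is next scheduled roughly ten days out, which matches the 1/7/30-day reminder cadence used elsewhere in this proposal.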
Assessment, analytics and evaluation
Assessment is formative and summative: micro-quiz scores give instant feedback; milestone tests measure recall. Analytics capture time-on-task, correct/incorrect responses, and retention via follow-up quizzes at intervals (1 day, 7 days, 30 days). Usability and learning effectiveness measured via SUS and pre/post knowledge tests (Sauro & Lewis, 2016). Success metrics: increase in strategic-plan webpage visits, quiz completion rate, improvement in recall scores, and voluntary Action Pledge submissions.
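The recall-improvement metric above can be computed from raw analytics events. Event field names here are assumptions for the sketch; the real event schema would live in the analytics layer.

```javascript
// Hedged sketch: aggregating quiz analytics events into the recall metric
// described above. Each event records a stage ('pre' or 'post') and a score
// out of 100; field names are illustrative assumptions.
function recallImprovement(events) {
  const scoresFor = stage =>
    events.filter(e => e.stage === stage).map(e => e.score);
  const mean = xs => xs.reduce((a, b) => a + b, 0) / xs.length;
  // Positive result = average post-test score exceeds average pre-test score
  return mean(scoresFor('post')) - mean(scoresFor('pre'));
}
```

The same aggregation can be re-run over the 1-, 7-, and 30-day follow-up quizzes to chart memory decay per cohort.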
Interaction flows and sample UI
Onboarding: brief tutorial, select stakeholder type (student/staff/alumni/public), and choose a learning path (overview / deep dive). Core flow: Map → Node (micro-lesson) → Challenge → Feedback + Evidence Card → Badge/Progress. UI emphasizes breadcrumbs, progress bars, and consistent placement of help and glossary.
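The core flow can be expressed as a small state machine, which keeps navigation consistent and easy to test. State names mirror the flow above; the transition table itself is an illustrative sketch.

```javascript
// Hedged sketch of the core interaction flow as a state machine.
// States mirror Map → Node → Challenge → Feedback → Badge above.
const FLOW = {
  map: 'node',           // pick a node on the Strategic Map
  node: 'challenge',     // finish the micro-lesson
  challenge: 'feedback', // submit an answer
  feedback: 'badge',     // view feedback + evidence card
  badge: 'map',          // collect progress, return to the map
};

function nextScreen(current) {
  return FLOW[current] ?? 'map'; // unknown states fall back to the map
}
```

Falling back to the map on unknown states is a deliberate error-prevention choice, in line with the usability heuristics cited earlier.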
Implementation plan and timeline
MVP (8–10 weeks): content mapping, SPA skeleton, question engine, one mini-game, basic analytics, and accessible UI. Pilot (weeks 11–14): deploy to small campus cohort, collect SUS and learning data, iterate. Scale (months 4–12): integrate SSO and LMS, add languages, industry partnerships, and mobile app packaging.
Prototype demonstration: sample code snippets
Below is a minimal JavaScript snippet illustrating a question-check and local progress save using localStorage (suitable for an MVP web prototype):
const question = {
  id: 'goal1-q1',
  text: 'Which KPI measures student retention?',
  options: ['Graduation rate', 'First-year retention', 'Research income'],
  answer: 1 // index of the correct option
};

function checkAnswer(q, selectedIndex) {
  const correct = q.answer === selectedIndex;
  // Read the running score, add 10 points for a correct answer, persist it
  const score = Number(localStorage.getItem('nust_score') || 0) + (correct ? 10 : 0);
  localStorage.setItem('nust_score', String(score));
  return { correct, score }; // simple feedback for the UI
}

// Usage
const result = checkAnswer(question, 1);
console.log(result);
Example JSON for a strategic node:
{
  "id": "goal-1",
  "title": "Building a Vibrant Learning Environment",
  "summary": "Develop blended learning, modern pedagogy, and student support to increase retention.",
  "kpis": [
    { "id": "kpi-1", "label": "First-year retention 83%" },
    { "id": "kpi-2", "label": "Graduation rate 43.2%" }
  ],
  "questions": ["goal1-q1", "goal1-q2"]
}
Evaluation and iterative improvement
Run an A/B test comparing the narrative map vs. a linear quiz to determine which yields better retention. Combine quantitative data (quiz scores, analytics) with qualitative feedback (interviews) and iterate. Use SUS and a bespoke knowledge test at 0, 7, and 30 days post-intervention to track memory decay and optimize spaced repetition (Sauro & Lewis, 2016; Hamari et al., 2014).
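Assignment for the A/B test should be deterministic per user so that returning players always see the same variant. A sketch using a simple string hash of the user id (illustrative; a production experiment would use a vetted assignment library):

```javascript
// Hedged sketch: deterministic A/B assignment for the narrative-map vs.
// linear-quiz experiment, via a simple 32-bit rolling hash of the user id.
function abVariant(userId) {
  let hash = 0;
  for (const ch of userId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // keep it an unsigned 32-bit int
  }
  return hash % 2 === 0 ? 'narrative-map' : 'linear-quiz';
}
```

Hash-based bucketing needs no stored assignment table and splits users roughly evenly, which is adequate for a pilot-scale experiment.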
Risks and mitigation
Risk: Low uptake. Mitigation: integrate with orientation, staff CPD credits, and alumni incentives. Risk: content updates out-of-sync. Mitigation: authoring dashboard and content-review workflow. Risk: privacy concerns. Mitigation: minimal data collection, opt-in analytics, and data protection compliance.
Conclusion
NUST Quest leverages gamification and human-centered design to make the Strategic Plan discoverable and memorable. The proposed architecture and MVP prioritize accessibility, measurable learning outcomes, and scalability. With iterative testing and stakeholder involvement, the prototype can significantly raise engagement with the Strategic Plan and support institutional change.
References
- Deterding, S., Dixon, D., Khaled, R., & Nacke, L. (2011). From game design elements to gamefulness: Defining "gamification". Proceedings of the 15th International Academic MindTrek Conference. https://doi.org/10.1145/2181037.2181040
- Hamari, J., Koivisto, J., & Sarsa, H. (2014). Does gamification work? — A literature review of empirical studies on gamification. Proceedings of the 47th Hawaii International Conference on System Sciences. https://doi.org/10.1109/HICSS.2014.377
- Kapp, K. M. (2012). The Gamification of Learning and Instruction. Wiley.
- Papastergiou, M. (2009). Digital game-based learning in high school computer science education: Impact on educational effectiveness and student motivation. Computers & Education, 52(1), 1–12. https://doi.org/10.1016/j.compedu.2008.06.004
- Norman, D. A. (2013). The Design of Everyday Things (Revised edition). Basic Books.
- International Organization for Standardization. (2010). ISO 9241-210:2010 Ergonomics of human-system interaction — Human-centred design for interactive systems. ISO.
- Nielsen, J. (1994). Usability Engineering. Morgan Kaufmann.
- Prensky, M. (2001). Digital Game-Based Learning. McGraw-Hill.
- Schell, J. (2014). The Art of Game Design: A Book of Lenses (2nd ed.). CRC Press.
- Sauro, J., & Lewis, J. R. (2016). Quantifying the User Experience: Practical Statistics for User Research. Morgan Kaufmann.