Week 5 Discussion 1: Questionnaire Design Using The Textbook

Using the textbook, required articles, and recommended resources, construct a 5-6 item questionnaire on a topic of your choice. Your questionnaire can include either open-ended or closed-ended (fixed-format) questions. Submit your completed questionnaire to this discussion forum. Be sure to consider the following when responding:

  • What are the strengths and weaknesses of the questionnaire overall? Consider how the items are worded and, if applicable, the response choices provided.
  • Are the questions clear and concise? Are any questions vague or unclear?
  • If applicable, are the response choices effective? Is there only one correct response choice? Are there any unintentional cues to the correct answer?
  • Do the items appear to be a good measure of what the student wants to assess?
  • How could the questionnaire be improved?

Paper for the Above Instruction

In this discussion, I will design a concise questionnaire to assess college students' attitudes toward online learning, a topic of increasing relevance in contemporary education. The goal of the questionnaire is to gather insightful data on students' perceptions, challenges, and preferences related to online education. I will develop six items, mixing closed-ended and open-ended questions, to provide a comprehensive picture of the topic while maintaining clarity and focus.
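For concreteness, the six items discussed below can also be captured as a small data structure, which makes the questionnaire easy to render or to feed into later analysis. The following is a minimal Python sketch; the field names ("kind", "options", "scale") and the intermediate frequency labels ("Rarely," "Sometimes," "Often") are illustrative assumptions, since the discussion itself specifies only the scale endpoints.

```python
# Minimal sketch of the six-item questionnaire as plain Python data.
# Field names ("kind", "options", "scale") are illustrative choices,
# not the convention of any particular survey library.
QUESTIONNAIRE = [
    {"id": 1, "kind": "closed",
     "text": "How often do you attend online classes?",
     # Intermediate labels are assumed; the discussion names only the endpoints.
     "options": ["Never", "Rarely", "Sometimes", "Often", "Always"]},
    {"id": 2, "kind": "closed",
     "text": "What is your primary device for attending online classes?",
     "options": ["Laptop", "Tablet", "Smartphone", "Other"]},
    {"id": 3, "kind": "likert",
     "text": ("On a scale of 1 to 5, how effective do you find online "
              "classes compared to in-person classes?"),
     "scale": (1, 5)},  # 1 = "Much less effective", 5 = "Much more effective"
    {"id": 4, "kind": "open",
     "text": "What challenges have you experienced while participating "
             "in online classes?"},
    {"id": 5, "kind": "closed",
     "text": "Would you prefer to continue with online classes or return "
             "to in-person learning?",
     "options": ["Continue online", "Return to in-person"]},
    {"id": 6, "kind": "open",
     "text": "Please suggest any improvements that could enhance your "
             "online learning experience."},
]

# Render the questionnaire as plain text.
for item in QUESTIONNAIRE:
    print(f"{item['id']}. {item['text']}")
    for opt in item.get("options", []):
        print(f"   - {opt}")
```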

The first item is a closed-ended question about the frequency of students' participation in online classes: "How often do you attend online classes?" with response options ranging from "Never" to "Always." This item is straightforward, providing clear, unambiguous options that are easy for respondents to choose. Its strength lies in quantifying online engagement; a potential weakness is that "attend" is left undefined, so the item captures frequency but not the quality or depth of participation.

The second question is also closed-ended: "What is your primary device for attending online classes?" with options including "Laptop," "Tablet," "Smartphone," and "Other." This question aims to identify the devices students typically use, which can influence accessibility and user experience. While effective, it could be improved by pairing "Other" with a free-text field (e.g., "Other, please specify") to capture more detailed responses.

The third item introduces a Likert-scale question: "On a scale of 1 to 5, how effective do you find online classes compared to in-person classes?" where 1 represents "Much less effective" and 5 signifies "Much more effective." This response format enables nuanced insights into perceptions of online learning's efficacy. It is clear and concise, with effective response choices that allow for easy interpretation.
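To show how responses to this Likert item might be summarized, here is a brief sketch using only the Python standard library; the response values are invented purely for demonstration.

```python
from collections import Counter
from statistics import mean, median

# Hypothetical responses to the 1-5 effectiveness item
# (1 = "Much less effective", 5 = "Much more effective").
responses = [4, 2, 3, 5, 3, 3, 4, 2, 1, 4, 3, 5]

print(f"n = {len(responses)}")
print(f"mean = {mean(responses):.2f}, median = {median(responses)}")

# Frequency distribution across the five scale points.
counts = Counter(responses)
for point in range(1, 6):
    n = counts.get(point, 0)
    print(f"{point}: {'#' * n} ({n})")
```

Reporting both the central tendency and the full distribution matters here, because a mean near the midpoint could hide a polarized split between 1s and 5s.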

The fourth question shifts to an open-ended format: "What challenges have you experienced while participating in online classes?" This encourages respondents to describe their difficulties freely, capturing rich qualitative data. While beneficial for depth, this item relies on self-report and may result in varied responses that require qualitative analysis.
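As a rough illustration of how such open-ended data might be handled, the sketch below runs a crude keyword tally over invented example answers. Real analysis would involve systematic qualitative coding, but a simple frequency pass can surface recurring themes worth coding for.

```python
import re
from collections import Counter

# Invented example answers to the open-ended challenges item.
answers = [
    "My internet connection keeps dropping during lectures.",
    "It is hard to stay motivated without classmates around.",
    "Unstable internet and too many distractions at home.",
    "Staying motivated is my biggest challenge.",
]

# Crude first pass: count word frequencies, skipping common stopwords.
stopwords = {"the", "and", "without", "during", "too", "many"}
words = Counter(
    w
    for answer in answers
    for w in re.findall(r"[a-z']+", answer.lower())
    if len(w) > 2 and w not in stopwords
)
print(words.most_common(5))  # e.g. [('internet', 2), ('motivated', 2), ...]
```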

The fifth item asks about preferences: "Would you prefer to continue with online classes or return to in-person learning?" with simple response options: "Continue online" or "Return to in-person." This question directly gauges student preferences and helps institutions anticipate future trends. Its simplicity and clarity are strengths, though adding an "Unsure" option could capture respondents who are undecided.

The sixth and final question is open-ended: "Please suggest any improvements that could enhance your online learning experience." This invites constructive feedback, offering insights into specific areas needing enhancement. Though open-ended responses can be more challenging to analyze, they are invaluable for qualitative improvements.

Overall, this questionnaire balances quantitative and qualitative items, providing a well-rounded view of students' attitudes toward online learning. The questions are generally clear and concise, with response choices designed to minimize bias and unintentional cues. To improve the questionnaire, optional demographic questions such as age or year of study could be added for more detailed analysis. Additionally, ensuring all response options are mutually exclusive and exhaustive can enhance accuracy. Avoiding leading language and providing balanced response options are essential for capturing genuine attitudes.
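The "mutually exclusive and exhaustive" principle can be made concrete with a small check. The sketch below validates a set of hypothetical age brackets for overlaps and gaps; the ranges themselves are invented for demonstration.

```python
# Check that numeric response brackets are mutually exclusive (no overlaps)
# and exhaustive (no gaps). The age ranges are invented for illustration.
brackets = [(18, 24), (25, 34), (35, 44), (45, 120)]  # inclusive bounds

def check_brackets(brackets):
    ordered = sorted(brackets)
    for (lo1, hi1), (lo2, hi2) in zip(ordered, ordered[1:]):
        if lo2 <= hi1:
            raise ValueError(f"Brackets overlap around {lo2}")
        if lo2 != hi1 + 1:
            raise ValueError(f"Gap between {hi1} and {lo2}: not exhaustive")
    return True

print(check_brackets(brackets))  # True: non-overlapping and gap-free
```

The same idea generalizes to any closed-ended item: every plausible answer should map to exactly one option, which is why catch-all choices such as "Other" or "Unsure" are often needed.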
