International Association of Educators   |  ISSN: 1949-4270   |  e-ISSN: 1949-4289

Original article | Educational Policy Analysis and Strategic Research 2023, Vol. 18(4) 256-291

Acceptance Scale of Interactive E-Books By Secondary School Students as a Digital Learning Resource: A Validity and Reliability Study

Ömer Kırbaş, Fatma Demirtaş, Fatih Doğan & Alptürk Akçöltekin

pp. 256 - 291   |  DOI: https://doi.org/10.29329/epasr.2023.631.11   |  Manu. Number: MANU-2211-20-0002.R1

Published online: December 30, 2023


Abstract

Today, the impact of information technologies can be seen in all areas of life, and using these technologies to raise the quality of education has become a necessity. Transforming books, a core component of education, into digital resources through information technologies creates a new learning environment, which in turn makes it necessary to determine how students accept such environments. This study therefore aims to develop a measurement tool to determine secondary school students' level of acceptance and use of interactive digital e-books. The Interactive E-Book Acceptance Scale for secondary school students (IE-BAS) is intended to assess both in-class activities and the efficiency of those activities. The technology acceptance model (TAM) served as the basis for developing the IE-BAS, and the process was carried out in two steps: validity analyses in Study 1 and reliability analyses in Study 2. In Study 1, the content validity and construct validity of the draft IE-BAS were examined. For content validity, an item pool of 39 items was created and submitted to a panel of 14 experts; four items were excluded because they did not reach sufficient kappa values. The remaining 35-item draft IE-BAS was then subjected to exploratory factor analysis (EFA) to establish construct validity, during which 2 further items were removed. The EFA results revealed that the 33-item IE-BAS clustered into four factors reflecting cognitive, affective, and behavioral responses. These factors correspond to the perceived usefulness (PU), perceived ease of use (PEU), attitude towards use (UA), and behavioral intention (BI) variables of the TAM, and together they explain 52.94% of the total variance.
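The expert-agreement step described above can be sketched numerically. The following is a minimal illustration only, not the authors' actual procedure or data: it shows how a Fleiss-style kappa can be computed for a panel of 14 experts classifying each draft item (e.g. as essential or not essential), so that items with weak agreement can be flagged for removal. The ratings matrix is entirely hypothetical.

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for agreement among multiple raters.

    counts: (N items x k categories) array; counts[i, j] is the number
    of raters who placed item i in category j. Row sums must all equal
    the (constant) number of raters n.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum(axis=1)[0]                      # raters per item
    # observed per-item agreement P_i
    p_i = (np.sum(counts**2, axis=1) - n) / (n * (n - 1))
    p_bar = p_i.mean()
    # chance agreement from overall category proportions
    p_j = counts.sum(axis=0) / counts.sum()
    p_e = np.sum(p_j**2)
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical panel: 5 draft items rated by 14 experts as
# "essential" vs "not essential"
ratings = np.array([
    [14, 0],
    [13, 1],
    [12, 2],
    [7, 7],    # poor agreement -> candidate for exclusion
    [14, 0],
])
print(round(fleiss_kappa(ratings), 3))  # ≈ 0.228
```

Note that a single low-agreement row drags the pooled kappa down considerably, which is why item-level agreement is usually inspected alongside the overall coefficient.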
Confirmatory factor analysis (CFA) was performed on the 33-item IE-BAS to confirm the EFA results; at this stage, 13 items were removed from the IE-BAS. The fit indices obtained for the resulting 20-item IE-BAS supported the model proposed by the EFA, indicating good model-data fit: the RMSEA, SRMR, GFI, AGFI, NFI, CFI, and RFI values were 0.043, 0.058, 0.924, 0.901, 0.909, 0.966, and 0.892, respectively. In a final validity study, the relationships among the variables of the 20-item IE-BAS were examined through structural equation modeling (SEM). The behavioral intention variable, which reflects students' behavioral response, was significantly affected by the PEU, UA, and PU variables, that is, the motivational elements representing affective and cognitive responses. In Study 2, reliability analyses of the 20-item IE-BAS were performed, covering internal consistency, the item-total score reliability coefficient, and split-half reliability. Cronbach's alpha for the overall IE-BAS was 0.914, and the Spearman-Brown and Guttman split-half coefficients were .740 and .850, respectively. These reliability coefficients were judged sufficient. The results show that the 20-item IE-BAS is a valid and reliable instrument for measuring secondary school students' acceptance of interactive e-books as a digital learning environment.
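The reliability coefficients reported above can be computed from any respondents-by-items score matrix. The sketch below is illustrative only (the response data are invented, not the study's): it computes Cronbach's alpha together with the Spearman-Brown and Guttman split-half coefficients from an odd/even item split.

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)        # per-item sample variances
    total_var = X.sum(axis=1).var(ddof=1)    # variance of the total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def split_half(X):
    """Spearman-Brown and Guttman coefficients from an odd/even item split."""
    X = np.asarray(X, dtype=float)
    h1, h2 = X[:, ::2].sum(axis=1), X[:, 1::2].sum(axis=1)
    r = np.corrcoef(h1, h2)[0, 1]            # correlation of the half scores
    spearman_brown = 2 * r / (1 + r)
    total_var = (h1 + h2).var(ddof=1)
    guttman = 2 * (1 - (h1.var(ddof=1) + h2.var(ddof=1)) / total_var)
    return spearman_brown, guttman

# Invented 5-point Likert responses: 6 students x 4 items
X = [[5, 4, 5, 4],
     [3, 3, 4, 3],
     [4, 4, 4, 5],
     [2, 3, 2, 2],
     [5, 5, 4, 5],
     [3, 2, 3, 3]]
print(round(cronbach_alpha(X), 3))  # → 0.926
```

Sample variances (`ddof=1`) are used throughout; with population variances the alpha value shifts slightly, so the convention should be kept consistent across coefficients.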

Keywords: Validity, reliability, interactive e-book, technology acceptance model


