Welcome to our Web Portfolio. This portfolio contains a collection of our work during the TECH4102 course (Evaluation in Educational Technology) at Sultan Qaboos University. We hope you find it useful and easy to navigate.

Many thanks to our instructor, Dr. Aalla Sadik, who opened our eyes to the world of educational technology as we prepare to graduate from the Instructional and Learning Technologies Department.

This portfolio was created by Hiba AL-Julandani (teacher446@gmail.com) & Tamadher AL-Za'abi (tamah4@gmail.com).

Monday, March 16, 2009

Evaluation in Educational Technology

Usability and Instructional Design Heuristics for E-Learning Evaluation
Thomas C. Reeves, Lisa Benson, Dean Elliott, Michael Grant, Doug Holschuh, Beaumie Kim, Hyeonjin Kim, Erick Lauber, and Sebastian Loh
Available at: http://eric.ed.gov/ERICDocs/data/ericdocs2sql/content_storage_01/0000019b/80/1b/19/c8.pdf
This paper describes how participants evaluated e-learning programs in terms of usability and instructional design. The study was conducted using an instrument and protocol based on Nielsen's approach to heuristic evaluation. Nielsen's protocol was modified for use by instructional designers and other experts engaged in heuristic evaluations of e-learning programs; it provides experts with guidance on what to do before, during, and after conducting the heuristic evaluation. The instrument contains fifteen aspects that help participants evaluate e-learning programs. Some of these are based on Jakob Nielsen's widely used protocol for heuristic evaluation of any type of software (http://useit.com/papers/heuristic/), and the rest are based on factors related to instructional design. The instrument examines: visibility of system status, match between system and the real world, error recovery and exiting, consistency and standards, error prevention, navigation support, aesthetics, help and documentation, interactivity, message design, learning design, media integration, instructional assessment, resources, and feedback.

Usability and Learning: A Framework for Evaluation
S. Ssemugabi, Department of Computer Studies, Walter Sisulu University, South Africa (ssemugab@wsu.ac.za)
M.R. De Villiers, School of Computing, University of South Africa, South Africa (dvillmr@unisa.ac.za)
Available at: http://www.editlib.org/?fuseaction=Reader.ViewFullText&paper_id=25488

The purpose of this study is to develop a framework for evaluating the usability of e-learning courses. The framework aims to close the gap between educational computing and human-computer interaction (HCI). To identify the main criteria for the framework, the researchers drew on literature from books, journals, and conference papers about theories and models in e-learning and HCI. The framework was then used to develop a questionnaire consisting of 66 statements, given to 61 learners at Walter Sisulu University (WSU) in East London who took the Information Systems 3 online course. The aim of the questionnaire was to determine the effectiveness of the framework and to test the usability of the course. In addition to the questionnaire, the researchers used interviews to evaluate the course. They tested twenty criteria related to the usability of the course, classified within the framework's three categories. The evaluation identified 72 usability problems in the course: 1 problem from the analysis of the closed questions in the survey, 66 problems from the analysis of the open-ended questions in the survey, and 5 problems from the interviews. The table below shows the number of problems identified for each criterion as a result of this study:


The table above shows that the "simplicity of site navigation and organization of the site" criterion had the largest number of problems, while no problems were found for the "cognitive error recognition" criterion. Overall, however, users had positive perceptions of Info3Net.


