OMDE 620 Learning and Training with Multimedia
Note:  The Learning with New Media site is not available, as it is a secure site. 

Multimedia evaluation and comparison of two programs:  Web-based Training: Learning with New Media and DNA from the Beginning

 

 

INTRODUCTION

 

The purpose of this paper is to compare two multimedia programs: Web-based Training: Learning with New Media (“WebT”), located at http://www.uni-oldenburg.de/zef/cde/media/wbt/lmnm_extra/frameset.html, and DNA from the Beginning (“DNA”), located at http://www.dnaftb.org/dnaftb/ and described as “an animated primer on the basics of DNA, genes, and heredity.”  DNA is an editor’s choice on the Multimedia Educational Resource for Learning and Online Teaching (“MERLOT”) site.

 

EVALUATION APPROACH

 

The programs were evaluated using a rubric based on the qualitative weight and sum (“QWS”) method in Baumgartner and Payr (1997).  The starting point was the 1996 European Academic Software Award (“EASA”) criteria cited by those authors.  Additional criteria were added after a review of the criteria and standards articulated by Kennedy, Petrovic, and Keppell (1998) and of the pedagogical and usability dimensions explained by Reeves and Harmon (1994).  The criteria and standards were further refined using the taxonomy discussed by Heller et al. (2001) and the usability considerations discussed by Oppermann (2002) and Lee (1999).  Innovation was omitted from the criteria because I determined that it has no intrinsic value for either program.

The structure of the evaluation rubric is shown in Tables 1 and 2 below.  First, each criterion was assigned a weight in the Weight of Criterion column using the evaluation protocol.  Then, each criterion was assigned a protocol definition in the Rating of Item column, based on how well the program met that criterion’s standards.  The evaluation of each program is expressed in the rating given to each criterion.
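To make the scoring concrete, the sketch below shows one way a QWS tally of this kind can be computed.  The symbol scale, the capping rule, and the usability ratings are taken from Tables 1 and 2; the function names, the ordinal encoding, and the Counter-based tally are illustrative assumptions of mine, not code from Baumgartner and Payr (1997).

```python
from collections import Counter

# Rating symbols from the evaluation protocol, ordered best to worst.
# 'E' (essential) appears only as a weight, never as a rating.
ORDER = ["*", "#", "+", "0"]

def capped_rating(weight: str, rating: str) -> str:
    """Apply the rubric's rule: an item cannot be rated above its weight.

    An essential ('E') weight imposes no cap, since every rating symbol
    is admissible for an essential criterion.
    """
    if weight == "E":
        return rating
    # Keep the worse (later in ORDER) of the two symbols.
    return max(weight, rating, key=ORDER.index)

def tally(rubric):
    """Count how many criteria earned each rating symbol.

    QWS deliberately keeps the result qualitative: instead of converting
    symbols to numbers and summing, it reports how often each symbol occurs.
    """
    return Counter(capped_rating(w, r) for w, r in rubric.values())

# Usability ratings transcribed from Tables 1 and 2: {criterion: (weight, rating)}.
dna = {"Navigation": ("*", "*"), "Documentation": ("+", "+"),
       "Interface": ("E", "*"), "Use of Computer": ("#", "#"),
       "Adaptability": ("*", "*"), "Accessibility": ("*", "*")}
webt = {"Navigation": ("*", "+"), "Documentation": ("+", "+"),
        "Interface": ("E", "+"), "Use of Computer": ("#", "#"),
        "Adaptability": ("*", "0"), "Accessibility": ("*", "0")}

print("DNA :", tally(dna))   # Counter({'*': 4, '+': 1, '#': 1})
print("WebT:", tally(webt))  # Counter({'+': 3, '0': 2, '#': 1})
```

Comparing the two tallies level by level mirrors the comparison developed below: DNA’s ratings cluster at the top of the scale, while WebT’s cluster at the marginal and failing levels.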

 

EVALUATION AND COMPARISON OF THE PROGRAMS

 

Usability

 

Regarding interface, Oppermann (2002) instructs us that with sufficient implicit and explicit guidance, users spend less time learning the program and more time learning the content.  A well-designed program strikes that balance, providing sufficient guidance while allowing users appropriate control.

DNA offers sufficient guidance while giving users the information they need to control their progression through the program.  WebT falls short of giving such guidance; its users therefore lack the information needed to feel in control of the program.

DNA’s users readily see how the content/concept areas relate to one another, and screens and functions are consistent throughout the lessons.  By contrast, WebT users will likely spend considerable time pondering the logic of its interface.

DNA’s animations and graphics were simple, varied, non-distracting, appropriately used, and transparently relevant to content.  According to Hasebrook (1999) users better understand concepts when words and pictures are “highly interconnected” (para. 6). 

This vital interconnection was missing in WebT.  Although the program promised animation and graphics, it was heavily text-based and did not offer a variety of learning tools.  The available tools were primitive, and their relevance to the content would most often escape users, who could spend too much time puzzling over that relevance instead of learning the content.

DNA’s page setup, font, and placement of graphics were appealing.  Text was set in a large font, was concise, and was limited to one page per sub-concept, eliminating the need to scroll.  All but a handful of graphics were of adequate size and remained sharp even when magnified.  WebT’s font was quite small and required scrolling within sub-topics, and many of its graphics were difficult to see even when magnified.

In addition to detracting from the aesthetics of the interface, WebT’s presentation would prove difficult for visually impaired users.  Therefore, DNA scored higher for accessibility; WebT did not meet standards.

Oppermann (2002) instructs us that an effective program overview is vital for navigation: it should enable users to construct a “cognitive map” (p. 239).  DNA’s effective overview permits users to readily see the interrelationship of the three main concepts.  WebT’s overview does not support such a map: it provides only three main topics, whose interrelationship is not readily apparent, and a graphic that offers no additional insight into possible learning paths.

DNA’s navigation is smooth and simple.  Users can easily move among lessons and among the features within each lesson, such as animation, audio/video, and self-test/problem.  Once a user has visited a sub-topic, its menu item changes color from blue to red.  The program would be better if it tracked users’ progress, but this is a relatively minor drawback.  All links tested, both within the program and to the Web, were operable and delivered users to the expected destinations.

By contrast, navigation in WebT was difficult: sub-topics did not appear until each topic was clicked, and clicking a sub-topic revealed sub-sub-topics.  The tracking system is maddening, so much so that an entire paragraph was devoted to decoding it.  Users would likely spend too much time deciphering the code.  While DNA’s system is not perfect, users will likely find it easy to follow.

Users find WebT’s self-tests only by accident, several clicks into the main topics, and the tests are not apparent as such because they are labeled with the term “expert.”  DNA’s self-tests, on the other hand, are prominent as Problem tabs in each lesson.

Web links in WebT created another thorn in the side.  The first link tested delivered users to a destination that did not quite match the expected one.  Sound links may take up to two minutes to load, far too long for users to wait before moving to the next part of a program.

When I clicked on WebT’s Home, Search, and Help links, nothing happened.  DNA’s Home link, while not perfectly placed, was at least reachable from lesson pages; I discovered it by chance in a banner, but its location was easy to remember.

DNA’s value is further enhanced by its documentation: it draws on high-quality sources that are well documented in the bibliography under each lesson’s Links tab.  WebT had no such bibliography.

According to Oppermann (2002), if a program offers sufficient implicit and explicit guidance, then a help system is not crucial.  Since DNA provides such guidance, the absence of a Help link does not detract from its value.  Because WebT does not offer sufficient implicit and explicit guidance, however, an operable Help system is crucial.

DNA outscored WebT on adaptability, first with regard to updating and maintenance.  Segregating Web links in the lessons’ Links and Bio tabs should make routine checks and edits fairly simple.  By contrast, WebT’s links, embedded within the text, would be cumbersome to maintain.

Second, DNA affords the opportunity for cross-curricular usage, since the program is adaptable for both advanced high school and undergraduate college students.  As discussed below under Pedagogical, because it is difficult to identify WebT’s target audience or context, it is impossible to evaluate its range of uses.

Lastly, both programs scored equally on the use-of-computer criterion; neither posed problems with this writer’s operating system.

 

Pedagogical

 

Kennedy, Petrovic, and Keppell (1998) remind us that the criterion of internal interaction with content is essential and “cannot be satisfied through basic point and click procedures” (p. 410).  DNA allows ample opportunity for meaningful internal interaction in all lessons, as users can choose animations and video clips that demonstrate concepts.  They can also explore concepts further via the lessons’ Links tabs, which contain resources such as a link to a genetics research group’s site.  Such interaction can motivate users toward exploratory learning.  WebT, on the other hand, offers meager opportunity for content interaction: users spend most of their time pointing, clicking, and passively reading text or viewing graphics with little apparent connection to the text.

Oppermann (2002) advises that feedback, especially immediate feedback, is central to learning.  DNA excels here, with self-tests in each sub-topic/lesson that give immediate feedback on each question answered and explain why some choices are incorrect.  WebT offers self-tests only at the end of each main topic, and feedback is not provided until users have answered all of the questions.  That meager feedback consists of a score and a list of correct answers, with no explanation of incorrect choices.

While neither program provides self-contained external interaction, DNA would likely inspire more discussion in class, among learners and/or between learners and instructor.  At a minimum, self-tests that explain incorrect choices give learners context for seeking guidance.  WebT barely inspires such interaction, since users are not afforded much meaningful content interaction.

DNA again outranks WebT on the learning criterion.  Its learning objectives are discernible from the topics/sub-topics menu and certainly from the self-tests, sequencing appears appropriate, and the material is well structured and cohesive.  Not so in WebT: topics, sub-topics, and sub-sub-topics do not adequately communicate learning objectives, especially since they unfold only with endless clicks.  The introduction says only that the program is a “collection of definitions, aids, tips, and practical exercises guaranteed to help you extend your knowlege (sic) in a defined way with the help of electronic media.”  It does not say what knowledge users can expect to learn.

DNA further outranks WebT in coverage.  The program first gives users an overview of genetics, then explains the significance of genes, and finally explains DNA.  A MERLOT peer review touts the coverage as complete.  WebT’s coverage cannot be adequately evaluated, since the program’s objectives are not readily discernible and it is difficult to tell whether the “collection” mentioned above is complete.  It therefore did not meet standards.

Likewise, WebT did not meet standards on the relevance criterion, since its area of teaching/learning is unclear.  The MERLOT peer review, on the other hand, states that DNA is quite a useful resource for teaching and learning in biology.

WebT’s content only marginally met standards.  First, it is difficult to judge whether the content is appropriate for particular levels, since the target users are not discernible; by contrast, the MERLOT peer review rated DNA as appropriate for its target users.  Second, WebT’s content does not provide varying depths of information for users to explore.  DNA provides such depths, including through the Links tabs.

Finally, DNA scored higher on correctness, as it had no glaring errors in spelling and the like, and its sources were well cited.  WebT, on the other hand, only partially met standards: it contains a misspelled word, “knowlege,” on one of the first pages users visit, and, as noted, it rarely cites sources.

       

CONCLUSION

 

A detailed explanation of the tally of the rubrics’ ratings is not required to convey that DNA is the superior multimedia program.  The rubric reveals that DNA received the highest possible score on every criterion.  With all due respect to the creators of WebT, Learning with New Media is not a good example of learning with new media.

 

 


TABLE 1 – DNA

 

Multimedia Evaluation

 

Evaluation Protocol

 

  • E  =  essential
  • *  =  very important (very valuable)
  • #  =  important, relevant (valuable)
  • +  =  additional, less important (marginally valuable)
  • 0  =  unimportant, irrelevant (no intrinsic value)

 

Protocol Definitions

 

  • *  =  meets standards
  • #  =  partially meets standards
  • +  =  marginally meets standards
  • 0  =  does not meet standards

 

Note:

A criterion cannot be rated higher than its given weight.

 

 

Usability

Each criterion is listed with its Weight of Criterion and Rating of Item, followed by its standards.

Navigation (Weight: *; Rating: *)

- User navigation is smooth and simple;
- content areas are well labeled and all links are in working order;
- navigation of the various program levels is easily understood; and
- the overview provides clear information about what the program expects the user to learn, and in what order.

Documentation (Weight: +; Rating: +)

- Information, documentation, and research on the page are of high quality, clearly distinguishable, and from a reputable source; and
- online help is available.

Interface (Weight: E; Rating: *)

- Software provides learning tools for multiple learning styles;
- graphics, page setup, and colors are appropriate without being overwhelming;
- information presentation is clear and concise, with a manageable cognitive load;
- media are cohesive, not a hodgepodge; and
- guidance, implicit and explicit, is sufficient while maintaining appropriate user control.

Use of Computer (Weight: #; Rating: #)

- Software facilitates the integration of technology with existing curricula; and
- is compatible with multiple operating systems.

Adaptability (Weight: *; Rating: *)

- Content is easily updated;
- is maintained on a regular basis; and
- provides the opportunity for cross-curricular usage.

Accessibility (Weight: *; Rating: *)

- Software is accessible to special target groups, including users with disabilities.


 

Pedagogical

Correctness (Weight: *; Rating: *)

- No glaring errors in mechanics, punctuation, or spelling; and
- sources of factual information are cited appropriately.

Relevance (Weight: #; Rating: #)

- Software is relevant to teaching and learning in the subject area.

Coverage (Weight: *; Rating: *)

- Subject matter is sufficiently covered.

Interaction – external (learner-learner; learner-instructor) (Weight: #; Rating: #)

- Facilitates external interaction, among learners and between learner and instructor.

Interaction – internal (learner-content) (Weight: E; Rating: *)

- User interaction with content is appropriate and provides immediate feedback, thereby creating and maintaining learner motivation and interest as well as inspiring exploratory learning.

Learning (Weight: *; Rating: *)

- Sequencing is appropriate;
- learning objectives are well defined and easily identified; and
- material is appropriately structured and organized to support the learning process.

Content (Weight: #; Rating: #)

- Content is age and grade appropriate; and
- capable of providing more in-depth information for experienced users.

 


 

TABLE 2 – WebT

 

Multimedia Evaluation

The evaluation protocol, protocol definitions, and note are the same as in Table 1.

 

Usability

Navigation (Weight: *; Rating: +)

- User navigation is smooth and simple;
- content areas are well labeled and all links are in working order;
- navigation of the various program levels is easily understood; and
- the overview provides clear information about what the program expects the user to learn, and in what order.

Documentation (Weight: +; Rating: +)

- Information, documentation, and research on the page are of high quality, clearly distinguishable, and from a reputable source; and
- online help is available.

Interface (Weight: E; Rating: +)

- Software provides learning tools for multiple learning styles;
- graphics, page setup, and colors are appropriate without being overwhelming;
- information presentation is clear and concise, with a manageable cognitive load;
- media are cohesive, not a hodgepodge; and
- guidance, implicit and explicit, is sufficient while maintaining appropriate user control.

Use of Computer (Weight: #; Rating: #)

- Software facilitates the integration of technology with existing curricula; and
- is compatible with multiple operating systems.

Adaptability (Weight: *; Rating: 0)

- Content is easily updated;
- is maintained on a regular basis; and
- provides the opportunity for cross-curricular usage.

Accessibility (Weight: *; Rating: 0)

- Software is accessible to special target groups, including users with disabilities.

 


 

Pedagogical

Correctness (Weight: *; Rating: #)

- No glaring errors in mechanics, punctuation, or spelling; and
- sources of factual information are cited appropriately.

Relevance (Weight: #; Rating: 0)

- Software is relevant to teaching and learning in the subject area.

Coverage (Weight: *; Rating: 0)

- Subject matter is sufficiently covered.

Interaction – external (learner-learner; learner-instructor) (Weight: #; Rating: +)

- Facilitates external interaction, among learners and between learner and instructor.

Interaction – internal (learner-content) (Weight: E; Rating: +)

- User interaction with content is appropriate and provides immediate feedback, thereby creating and maintaining learner motivation and interest as well as inspiring exploratory learning.

Learning (Weight: *; Rating: +)

- Sequencing is appropriate;
- learning objectives are well defined and easily identified; and
- material is appropriately structured and organized to support the learning process.

Content (Weight: #; Rating: +)

- Content is age and grade appropriate; and
- capable of providing more in-depth information for experienced users.

 

Credits:  Most of the framework for this rubric was created in collaboration with my study group colleagues.  The writer modified the rubric after submission of the group product, yielding the version used in Tables 1 and 2.

References

Bank Academy, Frankfurt, Germany.  Web-based Training: Learning with New Media.  http://www.uni-oldenburg.de/zef/cde/media/wbt/lmnm_extra/frameset.html

Baumgartner, P., & Payr, S. (1997). Methods and practice of software evaluation: The case of the European Academic Software Award (EASA). Paper presented at the ED-MEDIA 97, Charlottesville.

 

Dolan DNA Learning Center, Cold Spring Harbor Laboratory, Cold Spring Harbor, NY.  DNA from the Beginning.  http://www.dnaftb.org/dnaftb/

 

Hasebrook, J. (1999). Exploring electronic media and the human mind: A Web-based training. Paper presented at the World Conference on Internet, Intranet and World Wide Web (WebNet), Honolulu, Hawaii, 16 paragraphs.

 

Heller, R. S., Martin, D., Haneef, N., & Gievska-Krliu, S. (2001). Using a theoretical multimedia taxonomy framework. Journal of Educational Resources in Computing, 1(1), 1-22.

 

Kennedy, G., Petrovic, T., & Keppell, M. (1998). The development of multimedia evaluation criteria and a program of evaluation for computer aided learning. Paper presented at ASCILITE '98.

Lee, S. H. (1999). Usability testing for developing effective interactive multimedia software: concepts, dimensions, and procedures. Educational Technology & Society, 2(2).

 

Multimedia Educational Resource for Learning and Online Teaching.   http://www.merlot.org/merlot/index.htm

 

Oppermann, R. (2002). User-interface Design. In H. H. Adelsberger & B. Collis & J. M. Pawlowski (Eds.), Handbook on Information Technologies for Education and Training (pp. 234-248). Berlin, Heidelberg, New York: Springer.

 

Reeves, T. C., & Harmon, S. W. (1994). Systematic evaluation procedures for interactive multimedia for education and training. In S. Reisman (Ed.), Multimedia computing: Preparing for the 21st century (pp. 472-505). Hershey, PA: Idea Group Publishing.
