Table Index
Table 1
Phases of the ADDIE Model
Phase | Definition | Tasks |
Analysis | The process of defining what is to be learned | Needs Analysis, Task Analysis, Context Analysis, Performance Analysis, Resource Analysis |
Design | The process of specifying how it is to be learned | Write Objectives, Develop Assessments, Plan Instruction |
Development | The process of authoring and producing the materials | Gather Existing Materials, Develop New Materials, Work with Media, Formative Evaluation |
Implementation | The process of installing the project in the real-world context | Train Instructors, Plan Resource Allocation, Develop Maintenance System |
Evaluation | The process of determining the impact of the instruction | Summative Evaluation |
Adapted from Seels and Glasgow, 1998
Table 2
Adopter Category Descriptions
Adopter Category | Characteristics |
Innovators (first 2.5% of adopters) | Innovators are active information seekers with a high degree of media exposure and wide interpersonal networks. |
Early Adopters (second 13.5% of adopters) | Early adopters have the highest degree of opinion leadership in most social systems. They are able to communicate a subjective evaluation of an innovation to peers through interpersonal communication channels. |
Early Majority (third 34% of adopters) | Early majority members adopt new ideas just before the average members of the group and may deliberate for some time before completely adopting a new idea. |
Late Majority (fourth 34% of adopters) | Late majority members adopt new ideas just after the average members of the social system and approach innovation with a skeptical and cautious air. Often peer pressure is needed for this group to adopt. |
Laggards (last 16% of adopters) | Laggards possess no opinion leadership within the group and are often isolated from other members of the social system. Many times laggards do not adopt innovations until after they have been replaced by newer innovations. |
Adapted from Rogers, 1995
Table 3
Evaluation Levels
Level of Evaluation | Description |
Program | Program evaluation assesses the educational activities that provide services on a continuing basis and often involve curricular offerings. This may include a degree program or ongoing training program. |
Project | Project evaluation assesses activities that are funded for a defined period of time to perform a specific task. Projects may include training for new technologies in an office or an instructional module for a specific need. |
Materials | Materials evaluation assesses the merit or worth of content-related physical items. This includes websites or manuals designed for programs or projects. |
Adapted from Seels and Richey, 1994
Table 4
TNA Purposes
Purposes of TNA | Description |
Optimals | These are the desired knowledge or performance in its ideal form. |
Actuals | The existing knowledge and/or performance in the setting. |
Feelings | Opinions about the problem or task from various sources within the organization. |
Causes | Information about what various sources think is contributing to the problem. |
Solutions | This is information about ways to end or diminish the problem. |
Adapted from Rossett, 1987
Table 5
TNA Techniques
Techniques | Description |
Extant Data Analysis | The information that the company collects that represents the results of employee performance. This data already exists and should be made available by the organization. |
Needs Assessment | The step within the overall training needs assessment in which opinions on the purposes of TNA are sought from various sources. |
Subject Matter Analysis | The process in which the trainer or instructional developer seeks out the nature and shape of the bodies of knowledge that employees need to possess to do their jobs effectively. This information concerns the knowledge learners must have in order to perform their tasks. |
Adapted from Rossett, 1987
Table 6
Types of Criterion-Referenced Tests
Type of Criterion-Referenced Test | Purpose |
Entry Behaviors Test | This test is given to learners before they begin instruction in order to assess their mastery of the prerequisite skills needed for the unit of instruction. |
Pretest | Pretests are meant to profile learners with regard to the instructional analysis by determining which of the knowledge and skills to be included in the instruction they have already mastered. Comparison with the posttest is not the primary purpose, although such comparisons are often used in summative evaluation. |
Practice Tests | These tests provide learners an opportunity to practice the knowledge and skills during instruction. They encourage active learner participation and allow learners to evaluate their own performance during the instruction. |
Posttests | Posttests are given following instruction and measure the objectives included in the instruction, especially the terminal objective. They are meant to assess learner performance and may be used to identify specific areas where the instruction is not working. |
Adapted from Dick, Carey, and Carey, 2005
Table 7
Summative Evaluation Phases
Phase | Description |
Expert Judgment | During this phase evaluators determine whether instruction has the potential for meeting the needs of the organization as they were defined in the needs assessment. Activities include congruence analysis, content analysis, design analysis, and feasibility analysis. |
Field Trial | Field trials document the effectiveness of instruction with target group members in the intended instructional and performance settings. Activities include outcomes analysis, impact on learners, impact on job, impact on organization, and management analysis. |
Adapted from Dick, Carey, and Carey, 2001