Educational psychology and instructional design (ID) have had a long and fruitful relationship (Dick, 1987; Merrill, Kowallis, & Wilson, 1981). Educational psychologists like Gagné and Glaser have always shown an interest in issues of design (Gagné, 1968; Glaser, 1976); indeed, they helped establish instructional design as a field of study (Gagné, 1987; Lumsdaine & Glaser, 1960). In recent years, a growing number of cognitive psychologists have shown a renewed interest in design issues, and have tested out their ideas by developing prototype teaching models. These teaching models differ from most educational innovations in that they are well-grounded in cognitive learning theory. Examples include John Anderson's intelligent tutors (Anderson, 1987) and Brown and Palincsar's reciprocal teaching method for teaching reading (for a research review, see Rosenshine & Meister, 1994). Wilson and Cole (1991) reviewed a number of these prototype teaching models, and related them to current ID theory. This chapter continues that agenda by reviewing a number of additional teaching models, and drawing implications for the design of instruction.
Specifically, the purposes of the chapter are to:
1. Argue that the development and validation of teaching models is a legitimate research method, and has been an important vehicle for advancing knowledge in learning and instruction.
2. Show how the development of cognitive teaching models compares to the development of traditional ID theory.
3. Review a number of cognitive teaching models, and discuss a few in detail.
4. Look for insights from these cognitive teaching models that relate to instructional design.
5. Identify issues for future research.
INSTRUCTIONAL PSYCHOLOGY AND DESIGN: AN HISTORICAL OVERVIEW
To provide a context for interpreting the chapter, consider the
historical overview provided in Table 1.
Table 1. Historical overview of ID and instructional psychology.

Behaviorism:
--Relationship: ID and instructional psychology closely aligned

Information processing psychology:
--Status of ID: engaged in theory/model development
--Status of instructional psychology: moves toward the cognitive mainstream
--Relationship: ID and instructional psychology diverge

Constructivism:
--Status of ID: engaged in redefinition
--Status of instructional psychology: follows the mainstream toward constructivism
--Relationship: ID and instructional psychology engaged in more dialogue
The field of instructional design developed in the 1960s and early 1970s at a time when behaviorism still dominated mainstream psychology. ID shared those behaviorist roots and at the time was closer to mainstream psychology in the U.S. ID theorists such as Gagné, Briggs, Merrill, and Scandura were all educational psychologists. With the cognitive revolution of the 1970s, instructional psychology differentiated itself from ID and drifted toward the cognitive mainstream, leaving ID relatively isolated with its concerns of design. In a review of instructional psychology in 1981, Lauren Resnick (who only a few years earlier had developed Gagné-style learning hierarchies) observed:
An interesting thing has happened to instructional psychology. It has become part of the mainstream of research on human cognition, learning, and development. For about 20 years the number of psychologists devoting attention to instructionally relevant questions has been gradually increasing. In the past 5 years this increase has accelerated so that it is now difficult to draw a clear line between instructional psychology and the main body of basic research on complex cognitive processes. Instructional psychology is no longer basic psychology applied to education. It is fundamental research on the processes of instruction and learning. (Resnick, 1981, p. 660)
In her review, Resnick acknowledged that mainstream instructional psychologists had focused on issues of performance modeling and cognitive task analysis, neglecting the challenge of devising effective instructional strategies, models, and interventions. Even so, she did not look to the ID community to fill the need because "Instructional design theory..., which is directly concerned with prescribing interventions, has developed without much reference to cognitive psychology" (Resnick, 1981, p. 693). Hence, she excluded ID theory entirely from her review. Of course, the ID community was active during this time, including some attempts at integrating cognitive psychology into its methods (e.g., Low, 1981; Merrill, Wilson, & Kelety, 1981; Merrill, Kowallis, & Wilson, 1981)--Resnick and other mainstream psychologists just weren't reading them! This polarization between ID and psychology continued through the 1980s. In spite of efforts to move ID into the cognitive mainstream, psychologists and designers continued to move in different circles and speak somewhat different languages. Psychologists viewed designers with suspicion because of the eclectic and ad hoc nature of the ID theory base and because of the field's concern for stimulus design over cognitive processes. Likewise, the ID literature often ignored developments in cognitive theory, resulting in design theory that was generally divorced from state-of-the-art learning theory (e.g., Reigeluth, 1983, 1987).
Only recently, with the vigorous dialogue on constructivism and
situated learning, have psychologists and designers resumed a
substantive conversation (Duffy & Jonassen, 1992; Educational
Technology, April 1993 special issue on situated learning;
Wilson, in press). Psychologists such as Bransford, Perkins, Scardamalia,
and Lesgold, who have taken on the challenge of design, have run
up against many of the same problems addressed by traditional
ID theories. At the same time, the perspectives of psychologists
have stimulated reflection and renewal within the ID community.
The net result of this interplay is a renewed recognition of the
importance of design, as well as an array of new designs that
take into account new technologies and theories of learning.
THE ROLE OF TEACHING MODELS IN RESEARCH
Like other scientists, instructional psychologists develop theories and models describing the world, then use accepted methods of inquiry to test and revise those theories. Examples of appropriate research methods include controlled experiments in laboratory settings as well as ethnographic and qualitative studies in field settings. Another legitimate method of testing out concepts and strategies is to develop a prototype teaching model and assess its overall effectiveness in different settings. A teaching model incorporates a complex array of learning/instructional factors into a single working system. For example, John Anderson tested out his ideas of procedural learning by developing intelligent tutoring systems in LISP programming, geometry, and algebra (Anderson, 1987; Lewis, Milson, & Anderson, 1988); Ann Brown and her colleagues (Brown, Campione, & Day, 1981; Brown & Palincsar, 1989) developed reciprocal teaching as a means of testing their work on metacognition and reading.
The development and tryout of practical teaching models would not normally come to mind as a method of "research," yet surely such design and implementation efforts yield important new knowledge about the viability of cognitive theories and models. Perhaps such practical projects could be termed "inquiry" even if they do not fit the traditional connotation of "research." When researchers become interested in the problem of how people learn complex subject matters in realistic learning settings, practical tryout of programs and methods fills a role that no amount of theorizing or isolated-factor research can provide.
Teaching models can derive from direct empirical observation. Collins and Stevens (1982, 1983) closely observed teachers who used a Socratic dialogue approach and, based on the observed patterns, developed an instructional framework for inquiry teaching. Duffy (in press) developed a hypermedia tool to help preservice teachers develop instructional strategies. The tool displays real-life segments of master teachers' lessons, then offers critiques from a number of perspectives, including those of the observed teachers. It also provides an electronic notepad for each preservice teacher to reflect on strategies used in the teaching episodes. While one could argue that Duffy's work is merely a neutral tool for displaying observed teaching performances, the tool embodies an underlying teaching model that is heavily grounded in actual teaching performances. Such "bottom-up" approaches can complement the heavy influence of "top-down" learning theory as a basis for the design of teaching models.
In summary, the development of teaching models constitutes a unique combination of theory construction and empirical testing. Theoretical abstractions must be carried to a new level of specificity as they become instantiated into an effective teaching program. At the same time, promising theory must be tested against the demands of real-world settings. Thus the development and testing of teaching models helps triangulate findings from more traditional research methods and assures a relevance to the practice of teaching.
In the review of models below, we have purposefully selected a
range of models to illustrate the diversity found in the instructional
psychology literature. We conclude with a discussion of goals
and methods for instruction aimed at bringing some order to this diversity.

COGNITIVE LOAD THEORY
For a number of years, John Sweller, an Australian psychologist from the University of New South Wales, has examined instructional implications of a model of memory called "cognitive load theory." Cognitive load theory is based on a straightforward reading of information-processing concepts of memory, schema development, and automaticity of procedural knowledge:
--Human working memory is limited--we can only keep in mind a few things at a time. This poses a fundamental constraint on human performance and learning capacity.
--Two mechanisms to circumvent the limits of working memory are:
--Schema acquisition, which allows us to chunk information into meaningful units, and
--Automation of procedural knowledge.
The first mechanism deals primarily with processing and understanding information; the second deals with the acquisition of skills. Each mechanism helps us overcome the limits of working memory by drawing on our long-term memories, which are very detailed and powerful.
Sweller's model of instructional design is based upon these concepts:
1. Our limited working memories make it difficult to assimilate multiple elements of information simultaneously.
2. When multiple information elements interact, they must be presented simultaneously. This imposes a heavy cognitive load on the learner and threatens successful learning.
3. High levels of element "interactivity" and their resulting cognitive load can be inherent in the content--e.g., learning language grammar inherently involves more element interactivity than simple vocabulary learning. However, weak methods of presentation and instruction may result in unnecessarily high overhead. An example would be to present a student with a figure whose understanding requires repeated consultation of the text. The extra work required in decoding and translating the figure competes with the content for precious working-memory resources as the learner attempts to comprehend the material.
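The contrast between low and high element interactivity can be made concrete. (The following example is ours, offered in the spirit of Sweller and Chandler's analyses, not drawn from their materials.) Learning that iron is written "Fe" involves essentially a single element that can be learned in isolation. Understanding the algebraic rearrangement below, by contrast, requires holding several elements in mind at once:

```latex
% High element interactivity: the symbols a, b, and c, the division
% relation, and the operation "multiply both sides by b" must all be
% considered simultaneously for the step to make sense.
\[
  \frac{a}{b} = c \quad\Longrightarrow\quad a = cb
\]
```

On this analysis it is the simultaneous processing demand, not the sheer number of symbols, that generates the intrinsic cognitive load of the material.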
Cognitive load theory leads to some specific predictions for student learning:
--Simple content--i.e., content with relatively few intrinsic interactive elements--is not threatened by weak instructional methods. Learners are generally able to fit the demands of content and instruction within their working memories in such cases.
--Content containing high levels of interactivity among its elements cannot be learned effectively through weak instructional methods--that is, methods that require extra processing by learners. The demands of content and/or the method exceed the limits of the learner's working memory and learning does not occur.
Sweller's cognitive load theory has led to a number of instructional prescriptions, including:
--Carefully analyze the attention demands of instruction. Sweller's method defines "elements" and then counts the number of elements in instructional messages. Processing troubles arise when the learner must attend to too many different elements at the same time.
--Use single, coherent representations. These should allow the learner to focus attention rather than split attention between two places, e.g., between a diagram and the text or even between a diagram with labels not located close to their referents (Chandler & Sweller, 1991; see discussion in Sweller & Chandler, 1994, pp. 192-193).
--Eliminate redundancy. Redundant information between text and diagram has been shown to decrease learning. (See Saunders & Solman, 1984; Reder & Anderson, 1982; Lesh, Landau, & Hamilton, 1983; and Schooler & Engstler-Schooler, 1990 for research on redundant information on other tasks.)
--Provide for systematic problem-space exploration instead of conventional repeated practice (Pierce, Duncan, Gholson, Ray, & Kamhi, 1993).
--In multimedia instruction, present animation and audio narration (and/or text descriptions) simultaneously rather than sequentially (Mayer & Anderson, 1991, 1992; Mayer & Sims, 1994).
--Provide worked examples as alternatives to conventional problem-based instruction (Paas & Van Merriënboer, 1994; Carroll, 1994; in the area of analogical reasoning tasks, see Robins & Mayer, 1993; and Pierce et al., 1993).
In the section below, we present an overview of research on worked
examples to illustrate the implications of cognitive load theory
for instruction. For more discussion of the other instructional
strategies briefly listed above, refer to Sweller (1989) and Sweller
and Chandler (1994).
Conventional models of instruction in many domains involve the presentation of a principle, concept, or rule, followed by extensive practice on problems applying the rule. This approach at first glance seems like common sense--providing ample skills practice is "learning by doing." However, cognitive load theory suggests that such instructional approaches may actually be hurting learners' understanding of the subject matter.
Sweller and Cooper (1985) examined the cognitive-load effects of methods for teaching algebra to high-school students. They hypothesized that when learners confront a conventional end-of-chapter practice exercise, they devote too much attention to the problem goal and to relatively weak search strategies such as means-end analysis. Students already know how to use general search strategies to solve problems; what they lack is the specific understanding of how cases relate to the general rule.
Sweller and Cooper hypothesized that learners might benefit from studying worked examples until they have "mastered" them, rather than working on conventional practice problems as soon as they have "obtained a basic familiarity with new material" (p. 87). The authors developed an alternative teaching model that emphasized the study of worked examples. After learners acquire a basic understanding of the algebraic principle, they study a series of examples; then the teacher answers any questions the learners have. When the learners indicate they understand the problems, they are required to explain the goal of each sample problem and to identify the mathematical operation used in each step of the problem. The teacher provides assistance to any learners who have difficulty with the questions. Then the learners complete similar problems, repeating them until they are solved with no errors; if too much time elapses, the teacher provides the answer.
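A worked example of the sort this model asks learners to study might look like the following. (This particular problem is illustrative; it is not taken from Sweller and Cooper's materials.) The learner states the goal--isolate a--and names the operation that justifies each step:

```latex
% Goal: solve for a.
\begin{align*}
  ab + c &= d               && \text{given} \\
  ab     &= d - c           && \text{subtract } c \text{ from both sides} \\
  a      &= \frac{d - c}{b} && \text{divide both sides by } b
\end{align*}
```

Studying and explaining such a solution focuses attention on the problem states and the moves between them, rather than on searching for a solution path.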
Sweller and Cooper found that in the worked-examples model, acquisition of knowledge was significantly less time-consuming than in the conventional practice-based model. Furthermore, learners required significantly less time to solve similar problems (i.e., problems identical in structure) and made significantly fewer errors than did their counterparts. There were no significant group differences in solving novel problems. Thus learning was more efficient with no discernible loss in effectiveness. The authors concluded that "the use of worked examples may redirect attention away from the problem goal and toward problem-state configurations and their associated moves" (p. 86).
Sweller (1989) summarizes his position toward problem solving and learning by arguing that:
a. both schema acquisition and rule automation are the building blocks of skilled problem-solving performance...;
b. paradoxically, a heavy emphasis on conveying problem solving is not the best way to acquire schemas or facilitate rule automation because the means-end strategy commonly used focuses attention inappropriately and imposes a heavy cognitive load;
c. alternatives to conventional problem solving such as...worked examples must be carefully analyzed and, if necessary, modified to ensure that they, too, do not inappropriately direct attention and impose a heavy cognitive load; and
d. for the same reasons as for Point C, the format of instructional materials should be organized to minimize the need for students to attend to and mentally integrate disparate sources of information. (Sweller, 1989, p. 465, reformatted)
Sweller's critics might claim that students under the worked-example treatment were indeed actively engaging in problem-solving and practice activities, but that the nature of the practice shifted from traditional word problems to the study of worked examples. Instead of engaging in a multi-task activity (e.g., translating the word problem into one or more formulas, and performing calculations), the task narrowed to articulating the goal of the worked example and the appropriate mathematical operation. Sweller would likely agree with the critic. The point of the research is to suggest that not all "problem-solving" activities are equally effective. Some problem-solving activities actually leave learners at a loss, forcing them to resort to "weak" problem-solving methods--which they already know--rather than "strong" or domain-specific methods--which they are trying to learn. Bereiter and Scardamalia (1992) discuss this issue:
In novel situations, where no strong methods have been devised, weak methods are all anyone has. We use them all the time, whenever we are stumped. But just because everyone uses them, could hardly survive without doing so, and therefore practices them extensively, there is reason to question the value of teaching them. Teaching problem-solving skills may be an illusion, like teaching babies to talk. (Bereiter and Scardamalia, 1992, p. 528)
If our goal is to teach students certain well-defined domains such as algebra or physics, then giving them problems requiring extensive use of "weak" methods may be counterproductive and may even interfere with learning the domain.
Worked examples and self-explanations. One limitation of the Sweller and Cooper study was that only indirect inferences could be made concerning learners' cognitive processes. Chi and her colleagues (e.g., Chi, Bassok, Lewis, Reimann, & Glaser, 1989; Chi, de Leeuw, Chiu, & LaVancher, 1991; Chi & VanLehn, 1991) addressed this issue in the area of college-level physics. They analyzed the think-aloud protocols of good and poor problem solvers to identify the cognitive processes learners used in studying worked examples in physics. Learners in the study did not differ significantly in their prior knowledge of physics; instead, good and poor problem solvers were identified by their performance on objective tests.
Chi's worked examples differed from those used by Sweller and Cooper (1985) in significant ways. Sweller and Cooper presented worked-example sheets which were not part of a text and which included no verbal commentary. By contrast, the physics examples were part of the text and included step-by-step verbal commentary, although the learners had to infer the "why's and wherefore's" of each step.
Self-explanations were one kind of student response to the worked examples. "A self-explanation is a comment about an example statement that contains domain-relevant information over and above what was stated in the example line itself" (p. 69). Chi et al. (1989) found that good problem solvers generated more self-explanations than poor problem solvers and that poor problem solvers used the examples rotely as prompts for solving subsequent problems. Chi and VanLehn (1991) conjectured that "the act of self-explaining may make the tacit knowledge more explicit and available for use" (p. 101). They identified two general sources for self-explanations: "deduction from knowledge acquired earlier while reading the text part of the chapter, [and] generalization and extension of the example statements" (p. 69).
In an intervention study, Chi et al. (1991) found that high-ability and average-ability students benefited equally from being prompted to generate self-explanations. This finding counters other research on strategy training, which has found that such training generally benefits low-ability students but does not benefit, and may even interfere with, the performance of high-ability students (e.g., Brown & Campione, 1981). These discrepant findings might be partially explained by the fact that the earlier studies tended to teach skills rather than strategies (see Duffy & Roehler for a discussion of the confusion about skills and strategies).
Summary. Cognitive load theory bears a strong resemblance
to traditional instructional-design theories (Reigeluth, 1983,
1987). The prescriptions for instruction require a careful task
analysis that especially considers the memory load implications
of different content combinations and instructional methods. The
emphasis on well-defined content, worked examples, and careful
doses of presented information is reminiscent of Merrill's (1983;
Merrill & Tennyson, 1977) Rule-Example-Practice prescriptions
for teaching concepts and procedures. The emphasis on careful
control over presentation and pacing, and the strongly positive
gains attributable to managing cognitive load, serve as prudent
reminders of the importance of task and memory variables.
Collins, Brown, and colleagues (e.g., Collins, Brown, & Newman, 1989; Collins, 1991) developed an instructional model derived from the metaphor of the apprentice working under the master craftsperson in traditional societies, and from the way people seem to learn in everyday informal environments (Lave, 1988). The cognitive apprenticeship model rests on a somewhat romantic conception of the "ideal" apprenticeship as a method of becoming a master in a complex domain (Brown, Collins, & Duguid, 1989). In contrast to the classroom context, which tends to remove knowledge from its sphere of use, Collins and Brown recommend establishing settings where worthwhile problems can be worked with and solved. The need for a problem-solving orientation to education is apparent from the difficulty schools are having in achieving substantial learning outcomes (Resnick, 1989).
Emulating the best features of apprenticeships is needed because, as Gott (1988a) noted, lengthy periods of apprenticeship are becoming a rarity in industrial and military settings. She termed this phenomenon the "lost apprenticeship." She noted the effects of the increased complexity and automation of production systems. First, the need is growing for high levels of expertise in supervising and using automated work systems; correspondingly, the need for entry levels of expertise is declining. Workers on the job are more and more expected to be flexible problem solvers; human intervention is often most needed at points of breakdown or malfunction. At these points, the expert is called in. Experts, however narrow the domain, do more than apply canned job aids or troubleshooting algorithms; rather, they draw on the considerable knowledge they have internalized and use it to flexibly solve problems in real time (Gott, 1988b).
Gott's second observation relates to training opportunities. Now,
at a time when more problem-solving expertise is needed due to
the complexity of systems, fewer on-the-job training opportunities
exist for entry-level workers. There is often little or no chance
for beginning workers to acclimatize themselves to the job, and
workers very quickly are expected to perform like seasoned professionals.
True apprenticeship experiences are becoming relatively rare.
Gott calls this dilemma--more complex job requirements with less
time on the job to learn--the "lost" apprenticeship
and argues for the critical need for cognitive apprenticeships
and simulation-type training to help workers develop greater problem-solving skills.
FEATURES OF COGNITIVE APPRENTICESHIPS
The Collins-Brown model of cognitive apprenticeship incorporates the following instructional strategies or components.
1. Content: Teach tacit, heuristic knowledge as well as textbook knowledge. Collins et al. (1989) refer to four kinds of knowledge:
--Domain knowledge is the conceptual, factual, and procedural knowledge typically found in textbooks and other instructional materials. This knowledge is important, but often is insufficient to enable students to approach and solve problems independently.
--Heuristic strategies are "tricks of the trade" or "rules of thumb" that help people narrow solution paths while solving a problem. Experts usually pick up heuristic knowledge indirectly through repeated problem-solving practice; slower learners usually fail to acquire this subtle knowledge and never develop competence. There is evidence, however, that at least some heuristic knowledge can be made explicit and represented in a teachable form (Chi, Glaser, & Farr, 1988).
--Control strategies are required for students to monitor and regulate their problem-solving activity. Control strategies have monitoring, diagnostic, and remedial components; this kind of knowledge is often termed metacognition (Flavell, 1979).
--Learning strategies are strategies for learning; they may be domain, heuristic, or control strategies. Inquiry teaching to some extent directly models expert learning strategies (Collins & Stevens, 1983).
2. Situated learning: Teach knowledge and skills in contexts that reflect the way the knowledge will be useful in real life. Brown, Collins, and Duguid (1989) argue for placing all instruction within "authentic" contexts that mirror real-life problem-solving situations. Collins (1991) is less forceful, moving away from real-life requirements and toward problem-solving situations: For teaching math skills, situated learning could encompass settings "ranging from running a bank or shopping in a grocery store to inventing new theorems or finding new proofs. That is, situated learning can incorporate situations from everyday life to the most theoretical endeavors" (Collins, 1991, p. 122).
Collins cites several benefits for placing instruction within problem-solving contexts:
--Learners learn to apply their knowledge under appropriate conditions.
--Problem-solving situations foster invention and creativity.
--Learners come to see the implications of new knowledge. A common problem inherent in classroom learning is the question of relevance: "How does this relate to my life and goals?" When knowledge is acquired in the context of solving a meaningful problem, the question of relevance is at least partly answered.
--Knowledge is stored in ways that make it accessible when solving problems. People tend to retrieve knowledge more easily when they return to the setting of its acquisition. Knowledge learned while solving problems gets encoded in a way that can be accessed again in similar problem-solving situations.
3. Modeling and explaining: Show how a process unfolds and tell reasons why it happens that way. Collins (1991) cites two kinds of modeling: modeling of processes observed in the world and modeling of expert performance, including covert cognitive processes. Computers can be used to aid in the modeling of these processes. Collins stresses the importance of integrating both the demonstration and the explanation during instruction. Learners need access to explanations as they observe details of the modeled performance. Computers are particularly good at modeling covert processes that otherwise would be difficult to observe. Collins suggests that truly modeling competent performance, including the false starts, dead ends, and backup strategies, can help learners more quickly adopt the tacit forms of knowledge alluded to above in the section on content. Teachers in this way are seen as "intelligent novices" (Bransford et al., 1988). By seeing both process modeling and accompanying explanations, students can develop "conditionalized" knowledge, that is, knowledge about when and where knowledge should be used to solve a variety of problems.
4. Coaching: Observe students as they try to complete tasks and provide hints and helps when needed. Intelligent tutoring systems sometimes embody sophisticated coaching systems that model the learner's progress and provide hints and support as practice activities increase in difficulty. The same principles of coaching can be implemented in a variety of settings. Bransford and Vye (1989) identify several characteristics of effective coaches:
--Coaches need to monitor learners' performance closely enough to prevent their getting too far off base, while leaving enough room for a real sense of exploration and problem solving.
--Coaches help learners reflect on their performance and compare it to others'.
--Coaches use problem-solving exercises to assess learners' knowledge states. Misconceptions and buggy strategies can be identified in the context of solving problems.
--Coaches use problem-solving exercises to create the "teachable moment."
5. Articulation: Have students think about their actions and give reasons for their decisions and strategies, thus making their tacit knowledge more explicit. Think-aloud protocols are one example of articulation. Collins (1991) cites the benefits of added insight and the ability to compare knowledge across contexts. If learners' tacit knowledge is brought to light, that knowledge can be recruited to solve other problems.
6. Reflection: Have students look back over their efforts to complete a task and analyze their own performance. Reflection is like articulation, except it is pointed backwards to past tasks. Analyzing past performance efforts can also influence strategic goal-setting and intentional learning (Bereiter & Scardamalia, 1989). Collins and Brown (1988) suggest four kinds or levels of reflection:
--Imitation occurs when a batting coach demonstrates a proper swing, contrasting it with your swing;
--Replay occurs when the coach videotapes your swing and plays it back, critiquing and comparing it to the swing of an expert;
--Abstracted replay might occur by tracing an expert's movements of key body parts such as elbows, wrists, hips, and knees, and comparing those movements to your own;
--Spatial reification would take the tracings of body parts and plot them moving through space.
The latter forms of reflection seem to rely on technologies--video or computer--for feasible implementation.
7. Exploration: Encourage students to try out different strategies and hypotheses and observe their effects. Collins (1991) claims that through exploration, students learn how to set achievable goals and to manage the pursuit of those goals. They learn to set and try out hypotheses, and to seek knowledge independently. Real-world exploration is always an attractive option; however, constraints of cost, time, and safety sometimes prohibit instruction in realistic settings. Simulations are one way to allow exploration; hypermedia structures also allow exploration of information.
8. Sequence: Present instruction in an ordering from simple to complex, with increasing diversity, and global before local skills.
--Increasing complexity. Collins et al. (1989) point to two methods for helping learners deal with increasing complexity. First, instruction should take steps to control the complexity of assigned tasks. They cite Lave's study of tailoring apprenticeships: apprentices first learn to sew drawers, which have straight lines, few pieces of material, and no special features like zippers or pockets. They progress to more complex garments over a period of time. The second method for controlling complexity is through scaffolding. Here the cases or content remains complex, but the instructor provides the needed scaffolding for initial performances and gradually fades that support.
--Increasing diversity refers to the variety in examples and practice contexts.
--Global before local skills refers to helping learners acquire a mental model of the problem space at very early stages of learning. Even though learners are not engaged in full problem solving, through modeling and helping on parts of the task (scaffolding), they can understand the goals of the activity and the way various strategies relate to the problem's solution. Once they have a clear "conceptual map" of the activity, they can proceed to developing specific skills.
The three teaching models presented below illustrate various features of the cognitive apprenticeship model. The first two are computer-based environments: Sherlock and goal-based scenarios. The third model is the problem-based learning environment developed by medical educators at the University of Illinois. All three models build instruction around problems or cases that are faithful to real-life situations, within which learners learn the details of a subject.
What happens when you combine extensive cognitive-task analysis of a well-defined, technical domain with a situated-learning philosophy and a teaching model based on intelligent tutoring systems (ITS)? Sherlock is one answer. It is a computer-coached practice environment developed by Alan Lesgold and colleagues (e.g., Lajoie & Lesgold, 1992; Lesgold et al., 1988; Lesgold, Lajoie, Bunzo, & Eggan, 1992) to develop the troubleshooting skills of Air Force electronics technicians. Sherlock was specifically designed to teach the most difficult parts of the troubleshooter's job. Learners are presented a number of troubleshooting problems requiring two kinds of activities:
--The student solves the problem, requesting advice from the intelligent tutor/coach as necessary.
--The student reviews a record of his/her problem-solving activity, receiving constructive critique from the coach. (Gott, Lesgold, & Kane, in press)
Sherlock serves not only as an instructional environment and an assessment device, but also as a laboratory for instructional research (Lajoie & Lesgold, 1992). Assessment is interwoven with instruction so that coaching is highly individualized. This is achieved using expert systems technology to create two types of student modeling: a competency model, which is updated throughout the program, and a performance model. Together, they provide the basis for diagnosing learner problems and selecting appropriate instructional mediation relating to goals, operators, methods, and strategies.
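Sherlock's dual student models can be pictured with a minimal sketch. Everything here (the class name, the update rule, and the thresholds) is a hypothetical illustration of the general idea, a slowly updated competency model plus a per-problem performance record driving coaching decisions, and not Lesgold and colleagues' actual implementation:

```python
# Hypothetical sketch of Sherlock-style dual student modeling.
# Names, update rule, and thresholds are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class StudentModel:
    # Competency model: slowly updated proficiency estimates
    # per skill dimension (0.0 = novice, 1.0 = expert).
    competency: dict = field(default_factory=dict)
    # Performance model: evidence from the current problem only.
    performance: list = field(default_factory=list)

    def record_action(self, skill: str, success: bool) -> None:
        """Log one troubleshooting action; nudge the competency estimate."""
        self.performance.append((skill, success))
        old = self.competency.get(skill, 0.5)
        # Simple exponential update toward the observed outcome.
        self.competency[skill] = old + 0.2 * ((1.0 if success else 0.0) - old)

    def coaching_level(self, skill: str) -> str:
        """Choose how directive the coach's next hint should be."""
        c = self.competency.get(skill, 0.5)
        if c < 0.3:
            return "detailed guidance"
        elif c < 0.7:
            return "strategic hint"
        return "minimal prompt"

model = StudentModel()
model.record_action("trace_signal_path", success=False)
print(model.coaching_level("trace_signal_path"))  # prints "strategic hint"
```

The point of the sketch is the division of labor: the performance list supports post-problem review, while the competency estimates persist across problems and condition how much support the coach offers.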
A central feature of Sherlock is its intelligent hyperdisplay:
When Sherlock constructs a schematic diagram to help illustrate the advice it is providing, that diagram is organized to show expert understanding about the system with which the trainee is working....What is displayed is approximately what a trainee would want to know at that time, but every display component is "hot" and can be used as a portal to more detail or explanation. (Gott et al., in press)
Sherlock's diagrams are dynamic; that is, they are assembled at any point in the program to be sensitive to immediate conditions. The diagrams are adjusted so that conceptually central components are afforded the most space; diagram boxes and circuit paths are color-coded to reflect the learner's prior knowledge about them.
Gott, Hall, Pokorny, Dibble, & Glaser (1992) studied Sherlock learning environments to find out how students made flexible use of their knowledge in novel situations:
Time and again we observed [successful] learners access their existing mental models of equipment structure...and their schema of the troubleshooting task.... They then used these models as flexible blueprints to guide their performance as they crafted solutions to new problems. Their prior models became interpretive structures, and when these models were inadequate, better learners flexibly used them as the basis for transposed and elaborated structures that could accommodate the novel situations. They were ready and willing to construct new knowledge that was grounded in their existing representational and functional competence. (Gott et al., in press)
The current incarnation of the program, Sherlock 2, includes a number of refinements aimed at facilitating students' development of device models and transfer of knowledge. Consistent with the cognitive apprenticeship model, students have a variety of supports and reflective tools available to them:
To complement coached learning by doing, we have developed a collection of tools for post-performance reflection. One provides an intelligent replay of the trainee's actions. A trainee can "walk through" the actions he just performed while solving the problem. In addition, he can access information about what can in principle be known about the system given the actions replayed so far.... Also, he can ask what an expert might have done in place of any of his actions, get a critique of his action, and have his action evaluated by the system....Further, there is an option for side-by-side listing of an expert solution and the trainee's most recent effort. (Gott et al., in press)
One key to Sherlock's success is the extensive and sophisticated cognitive task analysis which provided "critical information necessary for developing the appropriate student models of proficiency" (Lajoie & Lesgold, 1992, p. 381). The program addresses eight dimensions of proficiency proposed by Glaser, Lesgold, and Lajoie (1987) and based on research on expert-novice differences (Lajoie & Lesgold, 1992):
1. knowledge organization & structures,
2. depth of underlying principles,
3. quality of mental models,
4. efficiency of procedures,
5. automaticity to reduce attentional demands,
6. procedural knowledge,
7. procedures for theory change, and
8. metacognitive skills.
According to Gott et al. (in press), the Sherlock team's approach to task analysis is similar to, but distinguishable from, traditional instructional-design approaches. "What is different is that the structure of learning tasks is more authentic, rooted in the needs of practice (or simulated practice) rather than being derived directly from task analysis structure...."
Research has found that learners who used Sherlock improved dramatically in their troubleshooting skills, during training as well as on a posttest (Lesgold et al., 1988; Nichols, Pokorny, Jones, Gott, & Alley, in press). Sherlock 2 yielded effect sizes on posttest measures ranging from .87 to 1.27 (Gott et al., in press).
Roger Schank and colleagues at the Institute for the Learning Sciences have developed an architecture for the design of learn-by-doing courses and simulations (Schank, Fano, Bell, & Jona, 1993/94). Goal-based scenarios constitute an alternative to traditional intelligent tutoring systems (ITS), combining elements of simulation, case-based reasoning, and traditional ITS modeling techniques. Riesbeck (in press) describes the concept of a goal-based scenario:
In a [goal-based scenario], a student is given a role to play, e.g., owner of a trucking company or chief scientist at a nuclear research installation, and interesting problems to solve or goals to achieve. The role and problems should be of real interest to the student, e.g., feeding the world, getting rich, or flying a rocket to the moon, not artificial word problems....
The student engages in a simulation in order to solve the defined problem or achieve the goal. Typically the student interacts with simulated agents and objects within a simulated environment. A goal-based scenario differs, however, from traditional simulations in a number of respects, for example:
When the student gets stuck or in trouble, a tutor, in video form, appears to offer advice, tell stories, and so on. The stories come from a multimedia archive of texts and video interviews of experts in that domain, telling personal experiences similar to the student's simulated situation. These stories are also organized for browsing in a structure we call ASK networks. (Ferguson et al., 1991, cited in Riesbeck, in press)
ASK networks are systems for indexing and archiving stories in a way that makes them useful within the goal-based scenario. Stories are brought into the simulation as the need arises, with the indexing sensitive to the learner's progress and other local conditions.
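The indexing idea behind ASK networks can be suggested with a small sketch. The feature scheme and story records below are invented for illustration; the point is only that stories tagged with situation features can be retrieved when the learner's current state overlaps them:

```python
# Illustrative sketch of ASK-network-style story indexing.
# Story records and feature names are assumptions, not the
# Institute for the Learning Sciences' actual scheme.

stories = [
    {"title": "When the shipment was late",
     "index": {"goal": "scheduling", "problem": "delay", "domain": "trucking"}},
    {"title": "Negotiating with a supplier",
     "index": {"goal": "purchasing", "problem": "cost", "domain": "trucking"}},
]

def retrieve(learner_state: dict) -> list:
    """Rank stories by how many index features match the learner's state."""
    def overlap(story):
        return sum(1 for k, v in story["index"].items()
                   if learner_state.get(k) == v)
    ranked = sorted(stories, key=overlap, reverse=True)
    return [s for s in ranked if overlap(s) > 0]

# A learner stuck on a delayed delivery in the trucking simulation:
hits = retrieve({"goal": "scheduling", "problem": "delay", "domain": "trucking"})
print(hits[0]["title"])  # prints "When the shipment was late"
```

In a real goal-based scenario the matching would be far richer, but the sketch conveys how indexing "sensitive to the learner's progress and other local conditions" can be operationalized as feature overlap between a story's index and the learner's current situation.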
Three goal-based scenarios that have been developed are briefly reported below.
Broadcast News. High school students collaborate to produce their own simulated TV news broadcast. "The student first sees a brief introduction informing him or her that he or she will be working on a newscast for a particular day in history...[say] May 21, 1991. The student is then given a rough draft of a news story that requires revisions so it can go on the air" (Schank et al., 1993/94, p. 309).
Students often lack the historical or political knowledge to understand the draft script, so they consult tools and resources within the program that provide a context for understanding the script.
The student then needs to revise the script and prepare it for broadcast. Rather than personally re-writing the draft, the student submits specifications for revision back to the writers. The program's experts can provide support and advice through this process. As in real life, the feedback from two experts may conflict, forcing the student to decide how to interpret suggestions and incorporate them into the revised script.
After the student gives final approval to a story, he or she can then choose to play the role of anchorperson for the newscast as well. The program then acts as a teleprompter and editing booth. The student reads the story as the text rolls by on the screen. A video camera controlled by the computer records the student as he or she plays the role of anchor; the computer also supplies the video accompanying the story. A complete videotape of the student's newscast is ready as soon as the newscast ends. ( Schank et al., 1993/94, pp. 309-310)
The student can watch the tape and compare it with a professional network newscast covering the same event. This comparison fosters reflection and discussion about the process and decisions made.
Sickle Cell Counselor. This is an interactive hypermedia exhibit designed for the Museum of Science and Industry in Chicago (Bell, Bareiss, & Beckwith, 1993/94; Schank et al., 1993/94, pp. 310-311). The user assumes the role of a genetic counselor advising couples about the genetic risks of their upcoming marriage. The student is able to run simulated lab tests, interact with the couple via interactive video, collect data from the couple, and offer advice. Research indicated that museum visitors spent considerably more time with the exhibit than with other exhibits and learned something about genetics (based on both self-report and performance on pre- and posttests).
YELLO. This is a program designed to teach telephone operators how to sell Yellow Pages advertising (Kass, Burke, Blevis, & Williamson, 1993/94). The program follows a framework designed for the teaching of complex social skills. Of particular interest is the interjection of stories into the practice section. The program tracks student performance and retrieves a "story" that matches the student's performance profile, based on a sophisticated indexing scheme. The story--a real-life recounting by an experienced practitioner--is then provided to the student to strengthen motivation and make the task meaningful, as well as to correct the performance.
Schank et al. (1993/94) outline the principal components of goal-based scenarios, along with criteria for good design. The following four components form the basis of a goal-based scenario.
Mission. The mission is the overall goal of the goal-based scenario. A scenario's mission may relate to process skills or outcome achievement skills. Process skills (e.g., running a trucking company, flying a plane, being a bank teller, etc.) lend themselves to role-play scenarios where the learner assumes the role of a character and learns the knowledge and skills related to that role. Outcome achievement skills (e.g., troubleshooting an engine or building a bridge) lend themselves to a scenario focusing on a specific task or achieving a particular result. By accomplishing the task, the student learns relevant skills along the way.
Mission focus. Schank et al. (1993/94) refer to the mission focus as "the underlying organization" of the activities engaged in by students (p. 327). They identified four mission foci:
--Explanation. Students are asked to account for phenomena, predict outcomes, diagnose systems, etc. Sickle Cell Counselor has an explanation mission focus.
--Control. Students are asked to run an organization or maintain and regulate a functioning system. Examples of a control focus would be managing a software project or running a nuclear power plant (Schank et al., 1993/94, p. 331).
--Discovery. Students enter a microworld and explore the features available. They may be asked to infer the microworld's governing principles or to participate in the activities available. YELLO offers an example of this type of mission focus.
--Design. Students create or compose some product or create the design specifications for some artifact. An example would be Broadcast News.
Cover story. The cover story provides the specifics of the student's role and the surrounding context. Schank et al. (1993/94) suggest designing a cover story around something the student might like to do (e.g., be president of the United States or fly an airplane) or something the student would have some strong feeling for (e.g., investigate the Chernobyl accident site, help a person threatening suicide). The details of the story are worked out in the design of the cover story, including the "setup" (explanations to students about why the scenario is important, specification of tools available in solving problems, etc.) and the "scenes" (the specific physical settings encountered in the story).
Scenario operations. Specification of scenario operations is the final stage of goal-based scenario design. Scenario operations are the discrete, specific responses required of students engaged in the program. Examples might include "adjusting a parameter with a dial, issuing a directive in a social simulation, answering a question, using a tool to shape part of an artifact, searching for a piece of information, and deciding between two alternatives" (Schank et al., 1993/94, p. 336).
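The four components might be pictured as a configuration record. The field names and the Broadcast News values below are assumptions made for illustration, not Schank et al.'s actual specification format:

```python
# Illustrative encoding of the four goal-based-scenario components
# as a configuration record, using Broadcast News as the example.
# Field names and values are hypothetical.

broadcast_news = {
    "mission": "produce a TV news broadcast for a particular day in history",
    "mission_focus": "design",  # explanation | control | discovery | design
    "cover_story": {
        "role": "newsroom editor and anchor",
        "setup": "a draft news story needs revision before air time",
        "scenes": ["newsroom", "editing booth", "anchor desk"],
    },
    "scenario_operations": [
        "consult background resources",
        "submit revision specifications to the writers",
        "read the story from the teleprompter",
    ],
}

def validate(gbs: dict) -> bool:
    """Check that all four components are present and the focus is legal."""
    required = {"mission", "mission_focus", "cover_story", "scenario_operations"}
    foci = {"explanation", "control", "discovery", "design"}
    return required <= gbs.keys() and gbs["mission_focus"] in foci

print(validate(broadcast_news))  # prints True
```

Treating the framework as a checkable record makes the design discipline visible: a scenario is incomplete until a mission, a mission focus, a cover story, and a set of scenario operations have all been specified.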
In many ways, Schank's work continues the tradition of developing instructional simulations. Simulations have been popular forms of instruction since the advent of computer-based instruction in the 1960s, including models of role-playing, control of dynamic systems, and task performance. Schank's contribution has been in the development of a sound theoretical model based on cognitive memory research, and in the creation of a design laboratory that follows a well-defined development model in creating working products. The costs of developing full-blown goal-based scenarios undoubtedly remain high, but they signal important progress in the design of instructional systems.
As noted above, intelligent tutoring systems rely on extensive and sophisticated cognitive task analysis to develop expert and student models; moreover, they are usually designed for individual learners. Like Sherlock, they tend to address well-structured problems in well-structured domains and build on a broad base of content knowledge. Problem-based learning (PBL) fills a different need, addressing ill-structured problems and/or ill-structured domains. Koschmann, Myers, Feltovich, and Barrows (1994) distinguished between an ill-structured domain, "in which no single concept, or even a small number of conceptual elements, is sufficient for capturing the workings of a typical instance of knowledge application" (p. 231), and an ill-structured problem. According to Koschmann et al. (1994, summarizing Barrows & Feltovich, 1987), ill-structured problems have the following characteristics:
--[D]efining the problem requires more information than is initially available;
--the nature of the problem unfolds over time;
--there is no single, right way to get that information;
--as new information is obtained, the problem changes;
--decisions must be made in the absence of definitive knowledge; and
--there may never be certainty about having made the right decision. (p. 231, punctuation altered and formatting added)
Problem-based learning integrates the learning of content and skills, utilizes a collaborative environment, and emphasizes "learning to learn" (as opposed to case-based "learn by doing") by placing most of the responsibility for learning on the learner rather than providing a sophisticated pre-designed instructional system.
The PBL model has been implemented in several areas of higher education, including medicine, business, education, architecture, law, engineering, and social work, as well as in high school (Savery & Duffy, in press). However, the best known applications of PBL are in medical schools, where it was developed in the 1950s, and whose graduates face particularly ill-structured problems in an ill-structured and ever-expanding domain that requires life-long learning skills. (See, e.g., Williams [1992], Savery and Duffy [in press], and Koschmann et al. [1994] for critical overviews of the use of PBL in medical schools.) More than a hundred medical programs include a PBL option (Duffy, 1994). PBL combines the teacher-directed case method, used extensively in law and business schools, with the discovery-learning philosophy of Jerome Bruner (Lipkin, 1989; Schmidt, 1989).
During the first two years of medical school, students in a PBL curriculum work in small self-directed groups to learn both content (basic science knowledge) and skills (examining and diagnosing patients, and metacognitive skills such as self-monitoring, reflection, and resource allocation). In lieu of traditional lectures and laboratory exercises, the problem-based curriculum presents a series of authentic patient problems; groups of five to seven students work intensely for about a week on each problem, diagnosing and learning to understand its causes. Authenticity is critical in motivating students and in avoiding the "construction of fictional problems...[with] symptoms that cannot coexist" (Williams, 1992, p. 404).
The facts of the problem are presented, just as they were initially to a doctor, as an incomplete set of symptoms that must be evaluated and explained. For practical reasons, the presentation is usually simulated on paper or by an actor trained as a patient. The facts of the complete case are contained in a problem-based learning module (Barrows, 1985; Distlehorst & Barrows, 1982), which "is designed to allow for free inquiry, providing responses for any question, examination, or laboratory test an examiner might request for the actual patient" (Koschmann et al., 1994, p. 241), without cueing any factors that are critical to the case (Savery & Duffy, in press).
A tutor facilitates students' negotiation of five recursive stages of the problem-based methodology (Koschmann et al., 1994):
1. Problem formulation. Students isolate important facts from their rich context, identify the problem, and generate hypotheses.
2. Self-directed learning. Group members identify and address information needed to evaluate hypotheses. This list of needed information sets the learning agenda. For example, they might research basic biological mechanisms that might underlie a patient's problems, question or "examine" the patient, review results of tests they "order," and consult with the medical faculty.
3. Problem re-examination. Group members bring to bear their findings from their self-directed learning activities--adding, deleting or revising hypotheses as warranted.
4. Abstraction. This is an articulation process (cf. Collins, Brown, & Newman, 1989) during which members compare and contrast cases, forming cognitive connections to increase the utility of the knowledge gained in specific contexts.
5. Reflection. At this point the group debriefs the experience and identifies areas for improvement in their learning processes.
The role of the tutor, according to Barrows (1992), is critical to "the success of any educational method aimed at 1) developing students' thinking or reasoning skills (problem solving, metacognition, critical thinking) as they learn, and 2) helping them to become independent, self-directed learners..." (p. 12). The tutor must not provide mini-lectures or suggest solutions but help each member of the group internalize effective metacognitive strategies by monitoring, modeling, coaching, and fading. This role includes
not only moving students through the various stages but also monitoring group process and the participation of individuals within it, guiding the development of the clinical reasoning process by strategically questioning the rationale underlying the inquiry strategy of the group or individuals, externalizing self-questioning and self-reflection by directing appropriate questions to individuals or the group as a whole and evaluating each student's development. (Koschmann et al., 1994, p. 243)
The tutor externalizes higher-order thinking that students are expected to internalize, by asking students to justify not only their inferences (e.g., "How do you know that's true?") but also any question they ask the patient (Savery & Duffy, in press; Williams, 1992). Thus the students learn to identify and challenge superficial thinking and vague notions. In PBL, the tutor carefully monitors each student's development and forces him/her to remain an active learner.
Medical students appear to be very positive about the approach, particularly in their first year. However, many issues are still to be addressed. Because so much of the work is group-oriented, PBL cannot guarantee that the student will do his/her own thinking. As Koschmann et al. noted:
Teaching methods that depend on group interaction often experience what is termed the polling problem; the opinions of individuals vary as a function of the order in which their views are gathered. Contributions of less dominant members may be suppressed or contaminated by the more dominant members; convictions of any single individual in the group may be inappropriately influenced by other members; individuals can find means to hide or ride on the coattails of other group members. The polling problem, therefore, can result in the suppression of ideas, reducing the multiplicity of viewpoints expressed. (p. 243)
Compared to the computer-based programs reviewed above, PBL is relatively ill-defined; that is, students' specific interactions cannot be prespecified. Because of this, good design and successful implementation become intertwined: design must happen constantly throughout the course of students' activities. The instructor must remain vigilant to ensure that less dominant members still have a voice within the group. Reluctant learners need to be monitored and encouraged to participate. More than some forms of controlled instruction, PBL depends on high-quality implementation by skilled instructors and participants.
PBL is also time-consuming. Problem-based activities can become tedious and boring, especially for students who have already internalized the clinical reasoning process. These complaints lead one to ask: Is it essential that students discover or seek out primary sources in the library for every case? Or might learning be just as effective but more efficient if hypermedia programs consolidated answers to relevant questions to reduce the temporal and cognitive loads, at least for some of the cases? A careful analysis to identify the specific activities most valuable in generating new knowledge seems in order (cf. Collins' epistemic games, described below).
Indeed, faculty have begun to modify PBL programs. For example, faculty at the University of New Mexico School of Medicine designed focused cases "to solve these problems by encouraging learning on a single topic rather than leading to the generation of multiple, diverse hypotheses" (Williams, 1992, pp. 403-404). Koschmann et al. (1994) are developing computer-assisted programs to augment PBL, several of which directly or indirectly reduce cognitive load. These programs will facilitate
--students' expression of their honest viewpoints,
--"a retrievable record of the group's deliberations for previously studied cases" (p. 247),
--the tutor's monitoring of the development of each person's progress,
--the collection of authentic cases,
--the selection of cases appropriate to the needs of the students,
--students' communicating outside group meetings,
--students' access to learning resources,
--students' access to information in their notes.
Koschmann et al. (1994) are designing programs that utilize groupware (Stefik & Brown, 1989), hypertext/hypermedia (Conklin, 1987), database technologies, and electronic mail.
In an article on the design of collaborative learning environments, Pea (1994) describes three metaphors of communication:
1. Communication as transmission of information. This is the dominant idea that communication conveys a message over time and distance from one person to another.
2. Communication as ritual. This refers to the participation and fellowship involved in the sharing of certain forms of expression. Participation in the performing arts such as dance, theater, and music, either as a performer or as an audience member, involves ritualistic aspects of communication. The content of the message is often less important than the medium and style of expression, informing the audience and strengthening the common bond of group membership. Ritual communication emphasizes the sharing and communal functions of communications, allowing groups to maintain a sense of identity and coherence.
3. Communication as transformation. Both the "sender" and "receiver" of information are transformed as they share a goal of learning and knowledge generation. Participants in transformative communication open themselves up and expect change to occur as part of the process. Communication thus serves as a stimulus to inquiry, observation, and reflection. Transformative communication combines aspects of knowledge sharing and group collaboration, with an emphasis on new experience and learning (Pea, 1994, pp. 287-288).
Transformative styles of communication are characteristic of learning communities, whether in schools, classrooms, workgroups, or families. Pea (1994, p. 289) notes that a number of researchers are presently moving from a cognitive-science base toward a social-cognition framework in their attempt to understand the symbols and discourses of learning communities. Cognitive apprenticeships are an example, as are Brown's (1994) communities of learning and Scardamalia & Bereiter's (1994) knowledge-building communities. The common notion is that groups of people share a goal of building meaningful knowledge representations through activities, projects, and discussion.
Transformative communication seems not to be emphasized in Sherlock or Schank's goal-based scenarios. The goal is not mutual change between communicating parties, but more computer-directed change in the student. Similarly, the PBL model expects the students to be transformed, playing roles of both senders and receivers, but the tutor is expected to remain basically detached, monitoring, coaching, and externalizing higher-order thinking. A transformative or learning-community view would suggest that the instructor is a part of the learning community, and should be an active, learning participant in the community.
The models described below are designed as tools to support knowledge-building.
What do learning communities do? Collins and colleagues (Collins & Ferguson, 1993; Morrison & Collins, in press) would respond that learning communities generate new knowledge by participating in certain defined cultural patterns or forms. The products of this work are called epistemic forms. "Completed" forms contain new knowledge and adhere to defined structures accepted by the community. Working together to generate these forms is called participating in epistemic games. The game is the set of rules or conventions that can be followed in generating a given epistemic form.
Collins and Ferguson (1993) suggest three important types of epistemic games, along with several sub-categories shown in Table 2:
1. Structural analysis games. What are the components or elements of a system?
2. Functional analysis games. How are the elements in a system related to each other?
3. Process analysis games. How does the system behave?
Each of these general game types is found in every subject matter.
Additional knowledge-building games and activities are found in Collins and Ferguson (1993) and in Jonassen, Beissner, and Yacci (1993). Domain-specific games take on very specific forms, for example, designing a research study or developing a timeline for a project. As games become more domain-specific, they typically become more valuable to participants of that work area.
CATALOGUE OF GAMES
STRUCTURAL ANALYSIS GAMES
List. Make a list of answers to a specific question.
Spatial decomposition. Break an entity down into non-overlapping parts and specify topographical relations between them.
Temporal decomposition. Make a list of sequential stages of a process.
Compare & contrast. Compare salient features of entities.
Cost-benefit. Identify the pros and cons of choices.
Primitive elements. Characterize a set of phenomena by their makeup or component elements.
Cross products. Compare listed items across a set of dimensions or attributes.
Axiom systems. Diagram the relationships between a set of formulae and their rules of inference.
FUNCTIONAL ANALYSIS GAMES
Critical-event. Identify causes leading to an event, or the consequences derived from an event.
Cause & effect. Use critical-event analysis, distinguishing between causes and preconditions. Each effect of a cause can become a cause of a new effect.
Problem-centered. Break an event stream into problems and actions taken to solve them. The side effects of the solution may cause new problems.
AND/OR graphs. Create a causal chain diagram showing the logical AND and OR relationships between links in the chain.
Form-and-function. Distinguish between an object's structure and its purpose.
PROCESS ANALYSIS GAMES
Systems-dynamics. Model a system showing how the contributory variables increase and decrease, and how they affect the system via feedback.
Aggregate behavior. Model a system showing how the interactive events between the components affect the behavior of the system.
Constraints. Model a system by creating a set of equations which describe system behavior.
Situation-action. Model a situation by a set of rules to apply in various cases. The situation can change because the world changes, or an agent takes action.
Trend/cyclical. Model the relationships between variables by showing how each changes over a period of time. Variable behavior can be linear, exponential, cyclical, or growth-shaped.
Morrison and Collins (in press) argue the following:
1. Our culture supports numerous ways of constructing knowledge--some domain-specific, and some more general.
2. These different ways of constructing knowledge, which we call epistemic games, are culturally patterned.
3. Different contexts (communities of practice) support different ways of knowing, and therefore different kinds of epistemic games. People are more or less fluent epistemically, depending largely on their contextual experiences, i.e., the sorts of subcultures and communities of practice in which they have participated.
4. An important goal of school is to help people become epistemically fluent, i.e., be able to use and recognize a relatively large number of epistemic games.
5. A key question to ask about particular environments is whether they tend to foster (or inhibit) epistemic fluency.... (Morrison & Collins, in press)
The epistemic-game framework can serve as a language for describing learning activities within constructivist learning environments (Grabinger, this volume; Wilson, in press).
According to Collins and Ferguson (1993, pp. 27-28), the playing of epistemic games exhibits the following characteristics:
1. There are constraints to playing. In playing a list game, for example, the items listed should be similar (that is, on the same scale or level) and yet distinct from one another. The lists should be comprehensive in their coverage (that is, leaving nothing important out), yet brief and succinct. These constraints can serve as the rules or criteria we use to judge the quality or appropriateness of a new list.
2. There are entry conditions that define when and where game-playing is appropriate and worthwhile. The list game, for example, becomes appropriate in response to a question such as "What is involved in X?" or "What is the nature of X?" where X can be decomposed or analyzed in simple fashion.
3. Allowable moves are the actions appropriate during the course of the game. List moves include adding a new item, consolidating two items, and rejecting or removing an item from consideration.
4. Players occasionally may transfer from one game to another. For example, a list game may shift to a hierarchy game when the structure of list elements begins to assume a form containing sub-categories.
5. Game playing results in the generation of a defined epistemic form, e.g., lists, hierarchies, processes, etc.
The Collins framework of epistemic games and forms provides a structure and language for articulating what learning communities do when they work together to generate new knowledge. Such a framework can be useful for understanding classroom and workgroup processes, but it can also serve a prescriptive or heuristic role for teachers and designers. Many teachers report that they want to teach critical thinking but have been unable to find a suitable set of strategies.
Epistemic games can be useful to teachers in either of two ways:
1. Using the framework as a diagnostic or interpretive device. Existing learning activities can be interpreted from an epistemic-game perspective, providing valuable insights into processes and interactions.
2. Targeting game-playing as a learning objective. Our students (Trigg & Sherry, 1995) are presently developing learning materials aimed at teachers, encouraging them to engage students directly in epistemic games.
While empirical research in this area is only in its beginning stages, epistemic game-playing seems a promising way to think about knowledge-generating activities. It provides a needed link between cultural forms and cognitive-epistemic points of view. Research must address many questions, such as the extent to which the games encourage knowledge generation rather than rote learning, and the types and amount of scaffolding that are desirable in various learning situations. One could imagine students mindlessly developing a list or guessing at causes where no new knowledge was generated. Rules need to be developed for playing games in a way that is conducive to knowledge generation.
TABLETOP
Tabletop (Hancock & Kaput, 1990a, 1990b; Hancock, Kaput, & Goldsmith, 1992; Kaput & Hancock, 1991) is a computer-based tool that allows users to manipulate numerical data sets. By combining features of what Perkins (1992a) calls symbol pads and construction kits, Tabletop provides a "general purpose environment for building, exploring, and analyzing databases" (Hancock et al., 1992, p. 340).
The program allows the user to construct a conventional row-and-column database and then to manipulate the data by imposing constraints on the data with animated icons. Double-clicking on an icon displays the complete record for that icon. Summary computations can be represented in a variety of formats, including scatter plots, histograms, cross tabulations, Venn diagrams, and other graphs.
Tabletop is a product of two major design goals: intelligibility and informativeness. Hancock et al. (1992) compare the role of the individual icons, which allow the learner to identify with them physically, to the role of the Turtle in Papert's Logo programming language. They theorize that the icons provide a "pivotal representation in which kinesthetic/individual understanding can be enlisted as a foundation for developing visual/aggregate-understanding" (p. 346).
Intelligibility and accessibility are also supported by other aspects of the program. For example, the user constructs and can modify graphs through a series of reversible steps and always has the full database in view. Thus, the user can observe the effect each new constraint has on each member of the database (a feature which also contributes to the program's informativeness).
The intelligibility and informativeness of the program support the learner in negotiating meaning in a real-world iterative process of construction, question-asking, and interpretation. Hancock et al. (1992) describe a case in which a student used Tabletop to graph a hypothesis about data which had not yet been entered in the database.
Tabletop was initially piloted on students aged 8 to 15, and with students aged 11 to 18. The pilot studies provided insight into the kinds of questions, problems and thinking processes that students engage in during all phases of data modeling and confirmed Hancock et al.'s (1992) belief that data creation and data analysis are inextricably intertwined. The description of the thinking students engaged in clearly reveals that in the "data definition phase" the students were drawing on the "raw data" of their individual experiences.
Tabletop was clinically tested on an eighth-grade class and a combined fifth- and sixth-grade class in six units during one school year. Field observation clearly demonstrated that Tabletop can help students develop their understanding of many kinds of graphs. However, students were less successful in:
a. using that graph to characterize group trends;
b. constructing the graph in order to generate, confirm, or disconfirm a hypothesis;
c. connecting the graph with the data structures necessary to produce it; and
d. embedding the graph in the context of a purposeful, convergent project. (pp. 361-62)
Tabletop is not designed to be a self-contained program for developing skills and concepts in data modeling. Indeed, the developers envision it as a tool in a collaborative learning environment, with students helping each other and receiving appropriate scaffolding and coaching from the teacher, as in a cognitive apprenticeship model. In the pilot and clinical tests of the program, students sometimes were unable to perceive--even with some coaching and scaffolding--that they could not create particular graphs because they had not coded the data in a relevant way. Because of time constraints, the teacher sometimes scaffolded learning by adding a relevant data field "between sessions" (Hancock, Kaput, & Goldsmith, 1992, p. 350). Although the researchers imply that it would have been preferable for the students to discover the solution for themselves, research must still address the issue of whether, when, and how much scaffolding of this sort is beneficial.
As noted above, while Tabletop was generally effective in helping students understand a variety of graphs, it was less effective in helping students use the graphs to support general conclusions. Like Duffy and Roehler (1989), Hancock, Kaput, and Goldsmith (1992) found that learning and incorporating new strategies into one's repertoire requires much time. They concluded that one year was insufficient time for students to develop "authentic, well-reasoned data modeling activities" (p. 353). Even after a year, students' projects lacked coherence and purpose, beginning "without clear questions, and end[ing] without clear answers" (p. 358).
Two lessons can be learned from Tabletop. First, it is an excellent example of a tool that allows ideas and content elements to be manipulated, tested, explored, and reflected upon. Students working with Tabletop have a qualitatively different experience than they would completing exercises at the back of the chapter. Second, students using these kinds of tools need well-designed supports, meaningful goals and projects, and attentive teachers to realize the tool's potential. Even when conditions are favorable and care is given to design and support, students cannot be expected to reach higher levels of schema acquisition and problem-solving skill simply by having experience with the tool. Learning environments that allow projects, data manipulation, and exploration require continuing attention to design in order for students to achieve learning gains (see Jonassen, 1995, for a discussion of other tools useful to learning communities).
COMPUTER-SUPPORTED INTENTIONAL LEARNING ENVIRONMENTS
Scardamalia, Bereiter, McLean, Swallow, and Woodruff (1989) observe:
There has been a history of attempts in computer-assisted instruction to give students more autonomy or more control over the course of instruction. Usually these attempts presupposed a well developed repertoire of learning strategies, skills, and goals, without providing means to foster them. (p. 51)
Scardamalia and Bereiter envision a computer-based learning environment wherein students can learn and exercise these metacognitive skills, giving the name computer-supported intentional learning environments to "environments that foster rather than presuppose the ability of students to exert intentional control over their own learning..." (Scardamalia et al., 1989, p. 52). In a series of studies, Scardamalia & Bereiter (1992) found that children were capable of generating impressive higher-order questions about a new subject, based upon their interest and background knowledge. These questions could then be used to guide students' research and exploration of the topic. Intentional learning environments are designed to support the high-level, knowledge-generating activity resulting from this question-asking process.
The authors have developed a model computer program referred to as "CSILE" (for computer-supported intentional learning environment), the acronym denoting the specific program developed in the laboratory. While CSILE exhibits a number of design features, for space reasons we focus on the program's foundation and philosophy. In an early report, Scardamalia et al. (1989) suggested eleven principles that should guide the design of intentional learning environments:
1. Make knowledge-construction activities overt.
2. Maintain attention to cognitive goals.
3. Treat knowledge deficits in a positive way.
4. Provide process-relevant feedback.
5. Encourage learning strategies other than rehearsal.
6. Encourage multiple passes through information.
7. Support varied ways for students to organize their knowledge.
8. Encourage maximum use and examination of existing knowledge.
9. Provide opportunities for reflection and individual learning styles.
10. Facilitate transfer of knowledge across contexts.
11. Give students more responsibility for contributing to each other's learning.
While many of these principles are similar to traditional instructional-design prescriptions (e.g., Reigeluth, 1983), there is an additional emphasis on knowledge construction consistent with the cognitive apprenticeship model and other constructivist learning theories.
More recently, Scardamalia and Bereiter describe three ideas that are fundamental to intentional learning environments:
1. Intentional learning. Ng and Bereiter (1991) observed students learning computer programming and found three kinds of goals:
--performance goals, i.e., task completion goals;
--instructional goals, i.e., the goals articulated by the instructor and the learning materials;
--learning goals, i.e., the specific goals for learning brought to the situation by the learner. Learning goals usually overlap with, but are not identical to, performance or instructional goals.
Intentional learning depends upon students having learning goals and finding successful avenues to learn based upon those goals.
2. The process of expertise. Scardamalia and Bereiter argue for a view of expertise in process terms rather than strictly as a performance capability. As people gain experience in a domain, simple tasks become routinized, freeing up mental resources for other tasks. If those newly available resources are reinvested back into learning more about the domain, then more and more difficult problems can be mastered. This process of expertise is equally characteristic of serious students and seasoned experts "working at the edges of their competence" (Scardamalia & Bereiter, 1994, p. 266). To become experts, students must demonstrate a disposition and commitment to engage in systematic intentional learning, in addition to having the brute cognitive capacity to learn.
3. Restructuring schools as knowledge-building communities. Scardamalia and Bereiter (1994) contrast what we call static with dynamic learning communities. Members of static communities must adapt to the environment, but once adapted, "one becomes an old timer, comfortably integrated into a relatively stable system of routines..." (pp. 266-267). Traditional schools and even child-centered, individualized instruction are often static in this sense. In contrast, a dynamic learning community requires constant readaptation to other community members. Sports and businesses are examples. "[T]he accomplishments of participants keep raising the standard that the others strive for" (p. 267). In the sciences, for example, the collective knowledge base is continually changing. The challenge in these environments is to continue growing, adapting, and contributing along with the rest of the community. Dynamic knowledge-building communities engage in the kind of transformative communication suggested by Pea (1994; see discussion above). In large part, the goal of the CSILE research agenda is to find ways to help classrooms and schools become dynamic knowledge-building communities in this respect.
Central to intentional learning environments is the cultivation of a collective knowledge base, explicitly represented in CSILE as a computer database. This knowledge base allows the creation and storage of numerous forms of representation--text, graphics, video, audio--as well as the linking of items together via a hypermedia structure. The knowledge base is built up over a period of time in response to students' questions and their subsequent investigations and reports.
Another key feature of CSILE is the "publication" process, similar to the review process of academic journals:
Students produce notes of various kinds and frequently revise them. When they think they have a note that makes a solid contribution to the knowledge base in some area, they can mark it as a candidate for publication. They then must complete a form that indicates, among other things, what they believe is the distinctive contribution of their note. After a review process (typically by other students with final clearance by the teacher), the note becomes identified as published. It appears in a different font, and users searching the database may, if they wish, restrict their search to published notes on the topic they designate. (Scardamalia & Bereiter, 1994, p. 279)
Thus CSILE emulates in many respects the activities of scholarly knowledge-building communities. Attempts at applying a CSILE-like model to higher education classrooms are reported in Grabinger (this volume).
Like problem-based learning, the CSILE model provides a concrete framework for designers and teachers seeking to break out of traditional conventions and incorporate constructivist principles of instructional design. Two remaining issues for consideration are:
--Matching intentional learning activities to pre-specified curriculum objectives. Every learning environment exists within a larger system of curriculum expectations and learning needs. A student may want to study X while the teacher thinks that Y would be a better choice. Meanwhile, the school district has a policy insisting on Z as the proper content. Negotiating between student-generated study questions and the surrounding system is an important consideration.
--Maintaining motivation. As in every learning environment, designers of intentional environments must develop methods for encouraging thoughtful collaboration while avoiding the damaging effects of competition. Cultivating a cooperative, open spirit among participants requires attention to group dynamics and the chemistry between individuals and within working groups. Maintaining motivation could be a challenge within an environment of widely diverging competencies and expectations.
In our previous review of cognitive teaching models (Wilson & Cole, 1991), we were surprised by the diversity of approach and method. In this review, too, key differences should be acknowledged among the various models. Cognitive load theory adopts a no-nonsense approach to the efficient and effective teaching of defined content. The packaged computer-based learning environments (Sherlock and the case-based scenarios) are highly effective, controlled environments that filter out much of the world's complexity and provide learners with an authentic-enough environment conducive to learning. The problem-based learning model is simpler in design yet more ambitious in the sense that it departs more radically from established instructional methods. Finally, the tools and models related to learning communities become almost anti-models in the sense that so much is left to the participants, both instructors and students.
In view of the substantial differences between models, generalizing across them is a difficult task. Several key points of reflection, however, are offered below.
Design and implementation are inseparable. There is no doubt that development and tryout of coherent models can yield important outcome information. Knowing, for example, that reciprocal teaching produces, on average, effect sizes of .32 on standardized tests and .88 on locally developed measures (Rosenshine & Meister, 1994) conveys a sense of confidence and reliability for users of the method. At the same time, some of the most valuable lessons learned may come from the real-world experience gained in setting up and administering a program. The implementation can be just as important as the theory-guided design.
An example may be taken from the 1970s research in computer-assisted instruction. One program, the TICCIT project, was shown by an NSF evaluation to achieve its objectives more successfully than traditional classroom instruction (Merrill, Schneider, & Fletcher, 1979). The program failed, however, to keep students engaged; the dropout rate was unacceptably high compared to traditional classrooms. In this case, the actual development and tryout of working models produced knowledge that would not have been anticipated ahead of time.
Another example is Clancey's (1993) Guidon-Manage research in intelligent tutoring systems (ITS). In a remarkable example of self-reflection, Clancey concludes: "After more than a decade, I felt that I could no longer continue saying that I was developing instructional programs for medicine because not a single program I worked on was in routine use..." (p. 7). What did Clancey learn from his research? Apart from his contribution to intelligent tutoring technologies, he learned that research in a laboratory differs from research in the field. "[R]esearchers must participate in the community they wish to influence..." (p. 9, italics retained). "As ITS matures, some members of our research community must necessarily broaden their goals from developing representational tools to changing practice--changing how people interact and changing their lives..." (p. 9, italics retained). Clancey then reflects on how he might approach the Guidon-Manage research differently today:
--participating with users in multidisciplinary design teams versus viewing teachers and students as my subjects,
--adopting a global view of the context...instead of delivering a program in a...box,
--being committed to provide cost-effective solutions for real problems versus imposing my research agenda on another community,
--facilitating conversations between people versus only automating human roles,...
--relating...ITS computer systems to...everyday practice...versus viewing models...as constituting the essence of expert knowledge that is to be transferred to a student, and
--viewing the group as a psychological unit versus modeling only individual behavior. (Clancey, 1993, p. 17)
Although the specific research agenda is different, the lessons Clancey learned apply very well to cognitive teaching models. Developers of teaching models need to stay close to the context of use and include implementation within their domain of interest.
As with the TICCIT and Guidon projects, research outcomes are sometimes negative, as in the failure of young students to learn abstract concepts via Tabletop, or the tendency of some experienced medical students to become bored with PBL activities. We believe that these negative findings can become extremely useful as formative evaluation data, feeding back into future implementations of the model.
Norman (1993) speaks of the power of representation as a stimulus to scientific progress. Repeatedly in the history of science, revolutionary strides are made when a new technology is developed that allows a repicturing of problems in a domain. By analogy, a similar kind of progress is made possible by research on teaching models. Through the careful articulation and construction of actual working methods, new perspectives are made possible. An actual product, once created, may be examined from a variety of angles and for a variety of purposes, many perhaps unintended by the creator. Exactly "what is learned" from research of this type may be difficult to articulate, yet remain extremely valuable to the scientific and practitioner communities.
Choosing a model is closely tied to the curriculum question. Perkins (1992b) warns of a common fallacy implied by the statement: "What we need is a new and better method. If only we had improved ways of inculcating knowledge or inducing youngsters to learn, we would attain the precise...outcomes we cherish" (p. 44). Instead, Perkins believes that "given reasonably sound methods, the most powerful choice we can make concerns not method but curriculum--not how we teach but what we choose to try to teach" (p. 44). This comment suggests that a fundamental step in instructional design involves the serious consideration of learning goals. A variety of constituencies should be included in this process, including sponsors and members of the learning community itself. Once consensus is reached about the kind of learning being sought, certain teaching models become unfeasible while others become more attractive.
A basic lesson learned from observing schools is that two teachers may be covering the same ostensible curriculum while what really is taught differs radically between them. And what any two students learn in the same teacher's class may differ just as radically. At its base, the constructivist movement in education involves curriculum reform, a rethinking of what it means to know something. A constructivist curriculum is reflected in many of the models reviewed in this chapter. Thus, if a commitment is made toward rethinking curriculum to expand the roles of knowledge construction and learning communities, then a corresponding commitment needs to be made in rethinking learning activities. Deciding upon a teaching model is not a value-neutral activity. Recognizing this puts the selection of a teaching model squarely into the political realm of policymaking. New issues become important, such as access, equity, representation, voice, and achieving consensus amid diverse perspectives.
As Reigeluth (1983) acknowledges, curriculum and instruction cannot be completely separated. There is a tendency among many institutions to give lip service to higher-order outcomes while maintaining teaching methods that specifically suppress such outcomes. Medical schools that teach students to simply memorize and take tests are an example. Another example is a military school whose mission statement prizes "creativity" in students, yet whose teaching methods and authoritarian culture strictly reinforce conformity and transmission of content.
Deciding upon a teaching model and making decisions within that framework is a highly situated activity. The success of a given implementation will depend more on the local variables than on the general variables contained in the various models described above. Put another way, "the devil is in the details." There is a way to succeed and a way to fail using a whole host of teaching models. All the models reviewed above can succeed if properly implemented. Teachers and students must see the sense of what they are doing, come to believe in the efficacy of the program, and work hard to ensure that the right outcomes are achieved.
This situational perspective conflicts with traditional views. Thinking of instructional design as a technology would lead us to think that a situation gets analyzed, which leads to a technical fix to be implemented, which leads either to a measured solution to the problem or a revision in the fix for the next cycle of intervention. A situated view of instructional design would lead to a different process:
1. A learning community examines and negotiates its own values, desired outcomes, and acceptable conventions and practices.
2. The learning community plans for and engages in knowledge-generating activities within the established framework of goals, conventions, and practices.
3. Members of the learning community, including both teachers and students, observe and monitor learning and make needed adjustments to support each other in their learning activities.
4. Participants occasionally re-examine negotiated learning goals and activities for the purpose of improving learning and maintaining a vital community of motivated learners. This may lead to new goals and methods and cultural changes at all levels, from cosmetic to foundational.
This situated, community-oriented perspective takes a more holistic approach to the design of instruction. The community is opportunistic in addressing "design" issues at any stage of planning and implementation. Community members, including students, have a voice in determining what happens to them in instruction. In return, they must show the needed commitment and disposition to behave responsibly and in support of learning.
If community members have participated in the establishment of a program, they are more likely to believe in it. If they believe in the program, the chances of success increase dramatically. As Perkins (1992b) suggests, even very imperfect instructional methods can work if the commitment is made to work together and ask the right questions in designing curriculum.
Each teaching model is a particular blend of costs and outcomes. Some kind of cost-benefit analysis--implicit or explicit--happens in designing educational programs. Surveying our teaching models reveals that some have demonstrably high development costs. Sherlock has taken a decade of patient research to achieve its present form. Once developed, however, the prototype model may be replicated at a reasonable cost. Other models, such as problem-based learning, may pose heavy demands in terms of time in the curriculum. Instructional designers (or learning-community members) must then face the questions of how and whether to implement such resource-demanding teaching methods into an existing system and curriculum.
Every decision to adopt one teaching model over another involves such weighing of pros and cons. However, while costs may be objectively measured and estimated, learning benefits are notoriously difficult to reduce to a number. This asymmetry in measurability results in a common bias: Cost differences become exaggerated, while the potential benefits, because they are harder to measure, tend to be undervalued or ignored. Comparison of alternative teaching models must give full consideration to qualitative differences in learning outcomes in addition to the more visible cost differences in time and money.
Some ideas may be borrowed and inexpensively incorporated into related products or programs. For example, if an instructor becomes excited by Schank's case-based scenarios, she may choose to incorporate case histories and classroom simulations into her teaching. While the resulting lessons may bear only a passing resemblance to the computer-based scenarios, they are heavily influenced by Schank's principles of case-based, interactive instruction. Many of the principles discussed above, including those of cognitive apprenticeships and intentional learning communities, can be efficiently adapted into instruction in a number of ways, depending on local circumstances and resources.
Instruction should support learners as they become efficient in procedural performance and deliberate in their self-reflection and understanding. Virtually all of the teaching models under review--Sweller's research notwithstanding--emphasize the grounding of instruction in complex problems, cases, or performance opportunities. Yet organizing instruction around problems and cases should not mask the importance of perception, reflection, and metacognitive activity. Indeed, these two aspects of human performance (problem solving and perception) can be seen as inherently complementary and equally necessary. Contrary to the suggestion of Dreyfus and Dreyfus (1986), experts are more than mere automatic problem-solvers. Rather, experts become experts through a progressive series of encounters with the domain, each involving an element of routine performance and a corresponding element of reflection and deliberation. This is the process of expertise spoken of by Scardamalia and Bereiter (1994) and discussed above.
Prawat (1993) makes this point well. While there is a tendency among cognitive psychologists to make problem solving central to all cognition, Prawat reminds us that schemas, ideas, and perceptual processes hold an equally important place. Learning how to see--from a variety of points of view--is as important as solving a problem once we do see. Principles of perception, whether from ecological psychology (Allen and Ott, this volume), connectionism, or aesthetics, need to have a place within successful teaching models. This includes teaching students how to represent problems and situations, but also how to appreciate and respond to the aesthetic side of the subject, how to reflect upon one's actions, and how to "raise one's consciousness" and recognize recurring themes and patterns in behavior and interactions.
Successful programs must seek to make complex performance do-able while avoiding the pitfalls of simplistic proceduralization. The art of "scaffolding" complex performance is a key problem area that, surprisingly, is still not well understood. How does a coach entice a young gymnast to perform just beyond her capacities, overcoming the fear and uncertainty that normally accompany new performances? How does the coach know just when and where to step in, preserving the integrity of the task (and the learning) while not letting the athlete fall on her head? These are questions of appropriate scaffolding or support for learning. Once a teacher comes to believe in the constructivist agenda and the importance of authentic, meaningful tasks, then the challenge of supporting novice performance within a complex environment becomes a central concern. As Sweller's research makes clear, poorly supported problem-solving activities force learners to rely on weak methods that they already know. Appropriate and wise scaffolding makes problem-solving activities more efficient because learners stay focused within the critical "development" zone between previously mastered knowledge and skills still beyond their reach (Vygotsky, 1978). Developing a technology for optimizing this kind of support is an area in need of further research and development.
This same concept of scaffolding can be directed to the implementation of the teaching model itself. Instructional designers and teachers need proper supports and aids in designing according to a particular model or tradition. At the same time, they should be cautioned against simplistically "applying" a model in a proceduralized or objectivist fashion. Postmodernists would say that in such cases, the model "does violence" to the situation. The complexities of a situation should not be reduced to the simple maxims of a teaching model. Any model forced upon a situation and made to fit will inevitably lead to unintended negative consequences. The negative fallout will occur at those points of disjuncture or lack of fit between model and situation. As we have stressed, the details of the situation need to be respected and taken into account when adapting a model to a situation.
This, perhaps, is a more appropriate way of thinking about implementation: Rather than applying a particular teaching model, a teacher necessarily adapts that model to present circumstances. Learning how to adapt abstractions to concrete realities is a worthy task for both students and teachers, and indeed, may lie at the heart of some forms of expertise.
Each of these points is worthy of continued research. As psychologists continue to develop teaching models that embody their best thinking and theories, the field of instructional design will have opportunities for reflection and growth as it re-examines its own models and methods. Our hope is that the dialogue will continue to flourish and expand, resulting in a "transformation" of both communities.
REFERENCES
Anderson, J. R. (1987). Methodologies for studying human knowledge. Behavioral and Brain Sciences, 10, 467-477.
Barrows, H. S. (1985). How to design a problem-based curriculum for the preclinical years. New York: Springer.
Barrows, H. S. (1992). The tutorial process. Springfield, IL: Southern Illinois University School of Medicine.
Barrows, H. S., & Feltovich, P. J. (1987). The clinical reasoning process. Medical Education, 21, 86-91.
Bell, B., Bareiss, R., & Beckwith, R. (1993/94). Sickle Cell Counselor: A prototype goal-based scenario for instruction in a museum environment. The Journal of the Learning Sciences, 3 (4), 347-386.
Bereiter, C., & Scardamalia, M. (1992). Cognition and curriculum. In P. Jackson (Ed.), Handbook of research on curriculum (pp. 517-542). New York: Macmillan.
Brown, A. L. (1994). The advancement of learning. Educational Researcher, 23 (3), 4-12.
Brown, A. L., & Campione, J. C. (1981). Inducing flexible thinking: A problem of access. In M. Friedman, J. P. Das, & N. O'Connor (Eds.), Intelligence and learning (pp. 515-529). New York: Plenum.
Brown, A. L., Campione, J. C., & Day, J. D. (1981). Learning to learn: On training students to learn from texts. Educational Researcher, 10 (2), 14-21.
Brown, A. L., & Palincsar, A. S. (1989). Guided, cooperative learning and individual knowledge acquisition. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser. Hillsdale, NJ: Erlbaum.
Carroll, W. M. (1994). Using worked examples as an instructional support in the algebra classroom. Journal of Educational Psychology, 86 (3), 360-367.
Chandler, P., & Sweller, J. (1991). Cognitive load theory and the format of instruction. Cognition and Instruction, 8, 293-332.
Chi, M. T. H., de Leeuw, N., Chiu, M-H, & LaVancher, C. (1991). The use of self-explanations as a learning tool. Pittsburgh, PA: The Learning Research and Development Center, University of Pittsburgh.
Chi, M., & VanLehn, K. A. (1991). The content of physics self-explanations. The Journal of the Learning Sciences, 1(1), 69-105.
Chi, M. T. H., & Bassok, M. (1989). Learning from examples via self-explanations. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 251-282). Hillsdale NJ: Erlbaum.
Chi, M. T. H., Bassok, M., Lewis, M. W., Reimann, P., & Glaser, R. (1989). Self-explanations: How students study and use examples in learning to solve problems. Cognitive Science, 13, 145-182.
Clancey, W. J. (1993). Guidon-Manage revisited: A socio-technical systems approach. Journal of Artificial Intelligence in Education, 4 (1), 5-34.
Collins, A., Brown, J. S., & Newman, S. E. (1989). Cognitive apprenticeship: Teaching the crafts of reading, writing, and mathematics. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser. Hillsdale, NJ: Erlbaum.
Collins, A., & Ferguson, W. (1993). Epistemic forms and epistemic games: Structures and strategies to guide inquiry. Educational Psychologist, 28 (1), 25-42.
Collins, A., & Stevens, A. L. (1983). A cognitive theory of inquiry teaching. In C. M. Reigeluth (Ed.), Instructional-design theories and models: An overview of their current status (pp. 247-278). Hillsdale NJ: Erlbaum.
Conklin, J. (1987). Hypertext: A survey and introduction. IEEE Computer, 20(9), 17-41.
Dick, W. (1987). A history of instructional design and its impact on educational psychology. In J. A. Glover, R. R. Ronning (Eds.), Historical foundations of educational psychology (pp. 183-200). New York: Plenum Press.
Distlehorst, L. H., & Barrows, H. S. (1982). A new tool for problem-based self-directed learning. Journal of Medical Education, 57, 466-488.
Dreyfus, H. L., & Dreyfus, S. E. (1986). Mind over machine: The power of human intuition and expertise in the era of the computer. New York: The Free Press.
Duffy, T. M. (in press). Strategic teaching framework: An instructional model for learning complex, interactive skills. In C. Dills & A. Romiszowski (Eds.), Perspectives on instructional design. Englewood Cliffs NJ: Educational Technology Publications.
Duffy, T. M. (1994, October 22). Workshop on Problem-based Learning. Denver CO: University of Colorado at Denver.
Duffy, T. M., & Jonassen, D. H. (Eds.). (1993). Constructivism and the technology of instruction: A conversation. Hillsdale, NJ: Erlbaum.
Duffy, G. G., & Roehler, L. R. (1989). Why strategy instruction is so difficult and what we need to do about it. In C. McCormick, G. Miller, & M. Pressley (Eds.), Cognitive strategy research: From basic research to educational applications (pp. 133-154). New York: Springer-Verlag.
Flavell, J. H. (1979). Metacognition and cognitive monitoring. American Psychologist, 34, 906-911.
Gagné, R. M. (1968). Learning hierarchies. Presidential Address of APA Division 15. Educational Psychologist, 6 (1), 1-9.
Gagné, R. M. (Ed.). (1987). Instructional technology: Foundations. Hillsdale NJ: Erlbaum.
Glaser, R., Lesgold, A., & Lajoie, S. P. (1987). Toward a cognitive theory for the measurement of achievement. In R. Ronning, J. Glover, J. C. Conoley, & J. C. Witt (Eds.), The influence of cognitive psychology on testing, Buros/Nebraska Symposium on Measurement: Vol. 3 (pp. 41-85). Hillsdale, NJ: Erlbaum.
Glaser, R. (1990). The reemergence of learning theory within instructional research. American Psychologist, 45 (1), 29-39.
Glaser, R. (1976). Components of a psychology of instruction: Toward a science of design. Review of Educational Research, 46, 1-24.
Gott, S. P. (1988a, April). The lost apprenticeship: A challenge for cognitive science. Paper presented at the meeting of the American Educational Research Association, New Orleans.
Gott, S. P. (1988b). Apprenticeship instruction for real-world tasks: The coordination of procedures, mental models, and strategies. In E. Z. Rothkopf (Ed.), Review of Research in Education, 15, 97-169.
Gott, S. P., Hall, E. P., Pokorny, R. A., Dibble, E., & Glaser, R. (1992). A naturalistic study of transfer: Adaptive expertise in technical domains. In D. K. Detterman & R. J. Sternberg (Eds.). Transfer on trial: Intelligence, cognition, and instruction. (pp. 258-288). Norwood NJ: Ablex Publishing Corp.
Gott, S. P., Lesgold, A., & Kane, R. S. (in press). Tutoring for transfer of technical competence. In B. G. Wilson (Ed.), Constructivist learning environments: Case studies in instructional design. Englewood Cliffs, NJ: Educational Technology Publications.
Hancock, C. M., & Kaput, J. (1990a). Annual report of hands on data (NSF MDR-#8855617). Cambridge, MA: TERC.
Hancock, C. M., & Kaput, J. (1990b). Computerized tools and the process of data modeling. In G. Booker, P. Cobb, & T. N. deMendicuti (Eds.), Proceedings of the 14th International Conference on the Psychology of Mathematics Education (Vol. 3, pp. 65-172). Mexico: The Program Committee of the 14th PME Conference.
Hancock, C., Kaput, J. J., & Goldsmith, L. T. (1992). Authentic inquiry with data: Critical barriers to classroom implementation. Educational Psychologist, 27(3), 337-364.
Jonassen, D. H. (1996). Computers in the classroom: Mindtools for critical thinking. Columbus OH: Charles Merrill.
Jonassen, D. H., Beissner, K., & Yacci, M. (1993). Structural knowledge: Techniques for representing, conveying, and acquiring structural knowledge. Hillsdale, NJ: Erlbaum.
Kaput, J. J., & Hancock, C. M. (1991). Cognitive issues in translating between semantic structure and formal record structure. In F. Furinghetti (Ed.), Proceedings of the 15th International Conference on the Psychology of Mathematics Education (Vol. 2, pp. 237-244). Genova, Italy: Dipartimento di Matematica dell' Università de Genova.
Kass, A., Burke, R., Blevis, E., & Williamson, M. (1993/94). Constructing learning environments for complex social skills. The Journal of the Learning Sciences, 3 (4), 387-427.
Koschmann, T. D. (1994). Toward a theory of computer support for collaborative learning. The Journal of the Learning Sciences, 3 (3), 219-225.
Koschmann, T. D., Myers, A. C., Feltovich, P. J., & Barrows, H. S. (1994). Using technology to assist in realizing effective learning and instruction: A principled approach to the use of computers in collaborative learning. The Journal of the Learning Sciences, 3 (3), 227-264.
Lajoie, S. P., & Lesgold, A. M. (1992). Dynamic assessment of proficiency for solving procedural knowledge tasks. Educational Psychologist, 27(3), 365-384.
Lajoie, S. P., & Lesgold, A. (1989). Apprenticeship training in the workplace: Computer coached practice environment as a new form of apprenticeship. Machine-Mediated Learning, 3, 7-28.
Lave, J. (1988). Cognition in practice: Mind, mathematics and culture in everyday life. Cambridge UK: Cambridge University Press.
Lesgold, A., Lajoie, S., Bunzo, M., & Eggan, G. (1992). A coached practice environment for an electronics troubleshooting job. In J. Larkin & R. Chabay (Eds.), Computer assisted instruction and intelligent tutoring systems: Establishing communications and collaboration (pp. 201-238). Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.
Lesh, R., Landau, M., & Hamilton, E. (1983). Conceptual models and applied mathematical problem solving research. In R. Lesh & M. Landau (Eds.), Acquisition of mathematical concepts and processes (pp. 263-343). New York: Academic Press.
Lewis, M. W., Milson, R., & Anderson, J. R. (1988). Designing an intelligent authoring system for high school mathematics ICAI: The teacher apprentice project. In G. P. Kearsley (Ed.), Artificial intelligence and instruction: Applications and methods (pp. 269-301). New York: Addison-Wesley.
Lipkin, M. (1989). Education of doctors who care. In H. G. Schmidt, M. Lipkin, Jr., M. W. de Vries, & J. M. Greep (Eds.), New directions for medical education (pp. 3-16). New York: Springer-Verlag.
Low, W. C. (1981). Changes in instructional development: The aftermath of an information processing takeover in psychology. Journal of Instructional Development, 4(2), 10-18.
Lumsdaine, A. A., & Glaser, R. (1960). Teaching machines and programmed learning. Washington, DC: National Education Association.
Mayer, R. E., & Anderson, R. B. (1991). Animations need narrations: An experimental test of a dual-coding hypothesis. Journal of Educational Psychology, 83 (4), 484-490.
Mayer, R. E., & Anderson, R. B. (1992). The instructive animation: Helping students build connections between words and pictures in multimedia learning. Journal of Educational Psychology, 84 (4), 444-452.
Mayer, R. E., & Sims, V. K. (1994). For whom is a picture worth a thousand words? Extensions of a dual-coding theory of multimedia learning. Journal of Educational Psychology, 86 (3), 389-401.
Merrill, M. D. (1983). Component display theory. In C. M. Reigeluth (Ed.), Instructional-design theories and models: An overview of their current status (pp. 282-333). Hillsdale, NJ: Lawrence Erlbaum Associates.
Merrill, M. D., Kowallis, T., & Wilson, B. G. (1981). Instructional design in transition. In F. Farley & N. Gordon (Eds.), Psychology and education: The state of the union. Chicago: McCutcheon.
Merrill, M. D., Schneider, E., & Fletcher, K. (1979). TICCIT. Englewood Cliffs NJ: Educational Technology Publications.
Merrill, M. D., Wilson, B. G., & Kelety, J. C. (1981). Elaboration theory and cognitive psychology. Instructional Science, 10, 217-235.
Merrill, M. D., & Tennyson, R. (1977). Teaching concepts: An instructional design guide (1st ed.). Englewood Cliffs NJ: Educational Technology Publications.
Morrison, D., & Collins, A. (1995). Epistemic fluency and constructivist learning environments. In B. G. Wilson (Ed.), Constructivist learning environments: Case studies in instructional design. Englewood Cliffs NJ: Educational Technology Publications.
Ng, E., & Bereiter, C. (1991). Three levels of goal orientation in learning. The Journal of the Learning Sciences, 1 (3), 243-271.
Nichols, P., Pokorny, R., Jones, G., Gott, S. P., & Alley, W. E. (in press). Evaluation of an avionics troubleshooting tutoring system. AL/HR TR-94-XX. Brooks AFB TX.
Paas, F. G. W. C., & Van Merriënboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem-solving skills: A cognitive-load approach. Journal of Educational Psychology, 86 (1), 122-133.
Pea, R. D. (1994). Seeing what we build together: Distributed multimedia learning environments for transformative communications. The Journal of the Learning Sciences, 3 (3), 285-299.
Perkins, D. N. (1992a). Technology meets constructivism: Do they make a marriage? In T. M. Duffy & D. H. Jonassen (Eds.), Constructivism and the technology of instruction (pp. 45-55). Hillsdale, NJ: Erlbaum.
Perkins, D. (1992b). Smart schools: From training memories to educating minds. New York: The Free Press (Macmillan).
Pierce, K. A., Duncan, M. K., Gholson, B., Ray, G. E., & Kamhi, A. G. (1993). Cognitive load, schema acquisition, and procedural adaptation in nonisomorphic analogical transfer. Journal of Educational Psychology, 85 (1), 66-74.
Prawat, R. S. (1993). The value of ideas: Problems versus possibilities in learning. Educational Researcher, 22 (6), 5-16.
Reder, L., & Anderson, J. (1982). Effects of spacing and embellishment on memory for main points of a text. Memory and Cognition, 10, 97-102.
Reigeluth, C. M. (Ed.). (1983). Instructional-design theories and models: An overview of their current status. Hillsdale, NJ: Erlbaum.
Reigeluth, C. M. (Ed.). (1987). Instructional theories in action: Lessons illustrating selected theories and models. Hillsdale, NJ: Erlbaum.
Riesbeck, C. K. (in press). Case-based teaching and constructivism: Carpenters and tools. In B. G. Wilson (Ed.), Constructivist learning environments: Case studies in instructional design. Englewood Cliffs, NJ: Educational Technology Publications.
Resnick, L. B. (1981). Instructional psychology. Annual Review of Psychology, 32, 659-704.
Robin, S., & Mayer, R. E. (1993). Schema training in analogical reasoning. Journal of Educational Psychology, 85 (3), 529-538.
Rosenshine, B., & Meister, C. (1994). Reciprocal teaching: A review of the research. Review of Educational Research, 64 (4), 479-530.
Saunders, R., & Solman, R. (1984). The effect of pictures on the acquisition of a small vocabulary of similar sight-words. British Journal of Educational Psychology, 54, 265-275.
Savery, J. R., & Duffy, T. M. (in press). Problem based learning: An instructional model and its constructivist framework. In B. G. Wilson (Ed.), Constructivist learning environments: Case studies in instructional design. Englewood Cliffs, NJ: Educational Technology Publications.
Scardamalia, M., Bereiter, C., McLean, R. S., Swallow, J., & Woodruff, E. (1989). Computer-supported intentional learning environments. Journal of Educational Computing Research, 5 (1), 51-68.
Scardamalia, M., & Bereiter, C. (1991). Higher levels of agency for children in knowledge building: A challenge for the design of new knowledge media. The Journal of the Learning Sciences, 1 (1), 37-68.
Scardamalia, M., & Bereiter, C. (1994). Computer support for knowledge-building communities. The Journal of the Learning Sciences, 3 (3), 265-283.
Schank, R. C., Fano, A., Bell, B., & Jona, M. (1993/1994). The design of goal-based scenarios. The Journal of the Learning Sciences, 3 (4), 305-345.
Schmidt, H. G. (1989). The rationale behind problem-based learning. In H. G. Schmidt, M. Lipkin, Jr., M. W. de Vries, & J. M. Greep (Eds.), New directions for medical education (pp. 105-111). New York: Springer-Verlag.
Schooler, J., & Engstler-Schooler, T. (1990). Verbal overshadowing of visual memories: Some things are better left unsaid. Cognitive Psychology, 22, 36-71.
Stefik, M., & Brown, J. S. (1989). Toward portable ideas. In M. Olson (Ed.), Technological support for work group collaboration (pp. 147-166). Hillsdale, NJ: Erlbaum.
Sweller, J. (1989). Cognitive technology: Some procedures for facilitating learning and problem solving in mathematics and science. Journal of Educational Psychology, 81 (4), 457-466.
Sweller, J., & Cooper, G. A. (1985). The use of worked examples as a substitute for problem solving in learning algebra. Cognition and Instruction, 2(1), 59-89.
Sweller, J., & Chandler, P. (1994). Why some material is difficult to learn. Cognition and Instruction, 12 (3), 185-233.
Trigg, M., & Sherry, L. (1995). Creating and playing epistemic games: A teacher's guide to knowledge generating activities. Manuscript in preparation.
Vygotsky, L. (1978). Mind in society. Cambridge, MA: Harvard University Press.
Williams, S. M. (1992). Putting case-based instruction into context: Examples from legal and medical education. The Journal of the Learning Sciences, 2(4), 367-427.
Wilson, B. G. (Ed.). (in press). Constructivist learning environments: Case studies in instructional design. Englewood Cliffs NJ: Educational Technology Publications.
Wilson, B. G., & Cole, P. (1991). A review of cognitive teaching models. Educational Technology Research and Development, 39(4), 47-64.
Brent Wilson (email@example.com) is associate professor of instructional technology at the University of Colorado at Denver. His interests include cognition and instruction and the design of constructivist learning environments. Peggy Cole (firstname.lastname@example.org) is instructor of English at Arapahoe Community College in Littleton, Colorado. Her professional interests include constructivist models of instructional design and the use of analogies and journals in instruction. We wish to thank Dr. David Jonassen, a former mentor and colleague, for his continued support and encouragement in the process of completing the chapter.