Evaluative Strategies for the
Networked Classroom

Margaret Downs-Gamble

Traci Gardner

Assumptions and Suggestions

      Because of the range of classes being taught in the computer classroom, it is vital that any evaluation of the CIC or of on-line classes be situated in the context of the theoretical and pedagogical assumptions of the teacher of the class. The goals of the class and of the faculty member provide a necessary filter for all the data gathered in the course of this process; divorced from the goals and assumptions under which it was completed, writing from the networked classroom can, at best, yield inconclusive results and, at worst, contradictory ones. For this reason, data must be carefully collected and managed to ensure the most accurate analysis.

       This analysis must not, however, result in drawing conclusions about the faculty members involved. The goal of this evaluation is to determine the effectiveness and efficiency of the computer-networked classroom and of computer-assisted instruction. To provide a clear and responsible evaluation, some information about the individuals teaching these classes will be needed, but this information should be presented with the same confidentiality as the evaluation of student writing and reactions.

       Because of the amount of evaluation which these classes involve and the possible burdensome nature of this evaluation, teachers and students entering the classroom should be told in advance of the necessity of gathering analytical data. Those faculty members and students who are uncomfortable with this evaluation should be allowed the alternative of shifting to a traditional classroom setting. Those faculty members currently teaching in the CIC should be informed immediately of the necessity of a variety of evaluative techniques and urged to cooperate fully. In the future, a simple information and release form should be signed by all faculty members and a similar form signed by students using the classroom to ensure that no misunderstandings about the evaluation of this environment occur.

Evaluation Strategies Overview

       The following table of evaluation strategies correlates the various types of evaluation suited for this environment and the areas in which evaluation should concentrate.

The first four categories of evaluation, several kinds of surveys and a concurrent log, are appropriate for all aspects of the computer environment. Frequency-response analysis is best suited for those environments where student response is not necessarily obligatory--primarily those areas most closely associated with class discussion. Formal writing analysis, by contrast, is best suited for those areas most closely associated with formal written expression. While formal writing skills are concentrated in areas such as word processing, the computer environment provides evidence in all areas of the extent and efficiency of critical thinking in a way that the traditional classroom cannot. Finally, linguistic analysis, which can be completed primarily through computer manipulation of the class files, can provide interesting evidence of the effect computerization has upon student empowerment and thinking.

Administrative Data Survey

       Certain basic information already available to administrators who handle class enrollments would be useful in determining how these classes meet the expectations of the students who enroll in them.

It would be ideal to gather further information on the students, such as gender or experience with computers; however, the constraints of the IMS system may not allow this sort of tracking.

Surveys of Students, Faculty, Monitors, and Administrators

       Entry, midterm, exit, and follow-up surveys collect data for comparative analysis, including evaluative, informative, and attitudinal feedback. By carefully framing the questions on the student surveys, we should be able to identify changes in student attitudes, self-assessment, and perception of literacy behaviors as well as to gather evaluative information on the applications and techniques used in the classes.

       Student surveys could measure such variables as the following:

       In addition, basic student information necessary for evaluation would also be collected on these surveys. The following data would be especially useful:

       Surveys for faculty teaching these classes would be based primarily on the questions asked of students, so that student and faculty perceptions could be compared to help determine the reliability of the data. Further, faculty surveys could gather data on support needed, effectiveness of the training, pedagogical and theoretical assumptions, and the like.

       Monitors and administrators could be informally surveyed to gather information on use, difficulties, and problem-solving.

Concurrent Logs

       In order to track use of applications, the writing process, and changes in attitude and perceptions, faculty and students could keep logs of their activities related to the class during the term. If a clear log system were adopted, asking students and faculty to follow a rather standard format, logs from a variety of classes could be compared. Such logs could be especially useful in evaluating the effectiveness and efficiency of software applications. Further, if faculty and students in non-"computer" classes were to keep logs as well, we could draw conclusions about the effect of the computer emphasis in the classes upon the writers.

       Faculty logs tracking the time spent on individual projects, together with explicitly detailed accounts of those projects, would give us a basis for assessing the computer's ability to streamline teaching and increase efficiency.
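       As an illustration only, the sketch below shows one way such a standard log format might be implemented; the field names and the CSV layout are hypothetical, not an adopted standard, and would need to be agreed upon before the logs were distributed.

    # Hypothetical concurrent-log format: one CSV row per activity entry.
    # The field names are illustrative only; an adopted standard might differ.
    import csv
    from datetime import date

    LOG_FIELDS = ["date", "course", "role", "application", "activity",
                  "minutes", "comment"]

    def append_log_entry(path, entry):
        """Append one log entry (a dict keyed by LOG_FIELDS) to a CSV log file."""
        with open(path, "a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=LOG_FIELDS)
            if f.tell() == 0:          # write the header only for a new file
                writer.writeheader()
            writer.writerow(entry)

    append_log_entry("engl1004_log.csv", {
        "date": date(1994, 2, 15).isoformat(),
        "course": "ENGL 1004",
        "role": "student",
        "application": "Daedalus InterChange",
        "activity": "real-time discussion",
        "minutes": 30,
        "comment": "drafted responses to the assigned reading",
    })

A shared format of this kind would let logs from a variety of classes be combined and compared with little additional handling.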

Frequency-Response Analysis

       In frequency-response analysis, we examine the mail archives (for Eudora Mail and for Daedalus Mail) and real-time discussion files (InterChange files in Daedalus) for information about students' discussion and reading skills which is unavailable in the traditional classroom, as well as for documentation of the computer's effect upon student empowerment.

       Frequency-response analysis examines the specific interactions among students to determine the degree of social engagement. Quantitative measures of the number of responses per student, which can be further broken down into the number of words and/or lines per student, demonstrate one of the ways that students increase the amount of writing in the computer classroom.
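       The tallying itself is mechanical once individual contributions can be attributed to speakers. The sketch below assumes a transcript in which each contribution begins with the speaker's name and a colon; actual InterChange transcripts or mail archives may be laid out differently, so the parsing rule is an assumption rather than a description of the Daedalus file format.

    # Tally responses, words, and lines per speaker in a discussion transcript.
    # Assumes each contribution begins with "Speaker Name:"; the actual
    # InterChange or mail archive layout may require a different parsing rule.
    import re
    from collections import defaultdict

    def tally_transcript(path):
        counts = defaultdict(lambda: {"responses": 0, "words": 0, "lines": 0})
        speaker = None
        with open(path) as f:
            for line in f:
                match = re.match(r"^([A-Za-z][\w .'-]*):\s*(.*)$", line)
                if match:                      # a new contribution begins
                    speaker, text = match.groups()
                    counts[speaker]["responses"] += 1
                else:                          # continuation of the last speaker
                    text = line
                if speaker is not None and text.strip():
                    counts[speaker]["words"] += len(text.split())
                    counts[speaker]["lines"] += 1
        return counts

    for name, c in sorted(tally_transcript("interchange_session1.txt").items()):
        print(f"{name:20s} {c['responses']:4d} responses "
              f"{c['words']:5d} words {c['lines']:4d} lines")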

       Since writing more does not necessarily lead to better writing, this analysis also examines the content of the discussion to evaluate the sophistication of the discussion.

       Criteria for evaluation of student responses in real-time discussion include the following:

       In addition to evaluating the sophistication of students' writing and analytical techniques, students' reading-process behaviors can also be mapped by identifying textual evidence of these ways of reading:

Furthermore, frequency-response analysis allows the evaluation of participation patterns which indicate student collaboration. Discussion files might be analyzed for these characteristics of student interaction:

By relating the speakers' gender (or any other socio-cultural marker) to the frequency-response analysis, we can determine to what extent computer discussion affects differentiation in the classroom. Furthermore, the discussion files can be analyzed for evidence of decentralizing the classroom by comparing student responses to teacher responses.
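A small extension of the tallies above can relate participation to role, gender, or any other marker the evaluator supplies. The sketch below is illustrative only: the counts, the roster, and the group labels are placeholder values standing in for whatever categories the evaluation adopts.

    # Aggregate per-speaker tallies by group (student vs. teacher, gender, or
    # any other marker the evaluator supplies) and report each group's share
    # of the discussion.  The counts and roster below are placeholder values.
    from collections import defaultdict

    def participation_by_group(counts, roster):
        """counts: {speaker: {"responses": n, "words": n}};
           roster: {speaker: group label}."""
        totals = defaultdict(lambda: {"responses": 0, "words": 0})
        for speaker, c in counts.items():
            group = roster.get(speaker, "unknown")
            totals[group]["responses"] += c["responses"]
            totals[group]["words"] += c["words"]
        all_words = sum(t["words"] for t in totals.values()) or 1
        return {group: dict(t, share_of_words=t["words"] / all_words)
                for group, t in totals.items()}

    counts = {"Instructor": {"responses": 12, "words": 640},   # placeholder
              "Pat":        {"responses": 9,  "words": 310},   # values for
              "Chris":      {"responses": 4,  "words": 95}}    # illustration
    roster = {"Instructor": "teacher", "Pat": "student", "Chris": "student"}
    for group, t in participation_by_group(counts, roster).items():
        print(f"{group:10s} {t['responses']:3d} responses  {t['words']:5d} words"
              f"  {t['share_of_words']:.0%} of discussion")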

       Finally, the discussion files can be analyzed for linguistic and paralinguistic cues which indicate engagement, authority, anxiety, and the like. These cues could then be correlated with information about gender and authority to provide further evidence of the effect of computer discussion upon student empowerment. Evaluation could consider such issues as the following:

Ideally, transcripts of non-"computer" sections of the same classes should be analyzed for the same characteristics (participation patterns, linguistic cues, etc.); however, such an evaluative technique might prove expensive and burdensome. Instead, we might focus on evaluating a sampling of computer discussions and comparing the information to published studies of the traditional classroom.

Formal Writing Analysis

       Students' writing in a variety of areas could be analyzed for the amount of writing produced, by number of pages, lines, or words, and, as above, this writing could also be examined for evidence of sophistication of writing skills. Writings could be evaluated for lexical density, readability level, and stylistic sophistication as well as for the qualities of sophistication and analytical insight listed above. Formal writing analysis should be done only on student work which has been identified as an example of formal, polished writing. Because real-time conference documents are meant neither to be polished nor to be formal, such writing analysis would be inappropriate in most instances. Some mail documents could be analyzed formally depending upon the way the texts were treated in class.
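       Some of these surface measures can be computed automatically. The sketch below approximates lexical density (content words as a proportion of all words, estimated here with a short stop-list) and the standard Flesch Reading Ease formula; both the stop-list and the syllable counter are crude stand-ins, so the figures should be treated as rough indicators rather than grades.

    # Rough, automatic measures of a formal text: lexical density and
    # Flesch Reading Ease.  The stop-list and the syllable counter are crude
    # approximations intended only to illustrate the kind of analysis.
    import re

    STOP_WORDS = {"the", "a", "an", "and", "or", "but", "of", "to", "in", "on",
                  "at", "is", "are", "was", "were", "be", "it", "that", "this",
                  "with", "as", "for", "by", "not", "i", "you", "he", "she", "we"}

    def count_syllables(word):
        """Very rough estimate: runs of vowels, minus a silent final 'e'."""
        count = len(re.findall(r"[aeiouy]+", word.lower()))
        if word.lower().endswith("e") and count > 1:
            count -= 1
        return max(count, 1)

    def text_measures(text):
        words = re.findall(r"[A-Za-z']+", text)
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        content = [w for w in words if w.lower() not in STOP_WORDS]
        syllables = sum(count_syllables(w) for w in words)
        flesch = (206.835 - 1.015 * (len(words) / max(len(sentences), 1))
                  - 84.6 * (syllables / max(len(words), 1)))
        return {"words": len(words), "sentences": len(sentences),
                "lexical_density": len(content) / max(len(words), 1),
                "flesch_reading_ease": flesch}

    with open("student_essay.txt") as f:
        print(text_measures(f.read()))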

       For consistency, the criteria for formal writing evaluation should come from the list published by the Writing Program. These standards will need to be adapted for use with hyperdocuments, but, generally, they can guide the evaluation. The grading standards from The Writing Program Guide for Students (1993-94) are as follows:

Minimum Expectations for All Work

Content
     coherent controlling purpose
     sufficient and relevant development
     strong connections to class readings and issues
     logical assertions

Organization
     clear sense of unity and order
     logical transitions between ideas
     effective opening and closing

Style
     smooth, readable prose throughout
     absence of clichés
     appropriate word choice
     avoidance of wordiness

Grammar and Mechanics
     few if any problems with
          agreement of subjects and verbs
          agreement of pronouns and antecedents
          verb forms
          sentence structure
          spelling, punctuation, and capitalization
          standard English usage and idioms

Grading Rationale

Conception: This category measures the intellectual impact of your work -- does it tell us something we don't already know? Does it take the subject beyond the obvious? The range runs from work which fails to have its own center of gravity (F), to a midpoint (C) at which the argument is more or less familiar or predictable, to work which surprises us with some aspect of the subject it reveals (A or B).

Strategy: This category covers both structure, a measure of your skill at anticipating what your audience wants to know and presenting the material strategically, and development, a test of your ability to manage the rhythmic interplay of assertion and connection with concrete details that advance and specify your view.

Style: The range of achievement in style runs from work which shows little ability to cast ideas into intelligent prose (F), to a midpoint (C) at which one finds a clear enough exposition, to work which is able to generate some excitement and effective emphasis out of its choice of syntactic structure, rhythm, word choice, and imagery (A). 

Grammar and Mechanics: This category covers grammatical problems, inappropriate traces of dialect, nonstandard punctuation, poor proofreading, and the like. Unlike the previous three categories, this one either is standard or it's not. It may act as a weight pulling the other categories down, but we assume the average (C) paper can be essentially free of this sort of "error." Poor performance in this category can limit a paper from the outset to a below-average grade.

"C" work must show perfectly acceptable performance in all four categories; + /- indicates strong or weak performance in one or more areas. Written "B" work, on the other hand, is not only grammatically and mechanically "clean," but shows some specific use of stylistic resources, some well-conceived strategy of structure and development, and a noticeable advance of the subject beyond the familiar, predictable, and commonplace. Written "A" work shows significant achievement in all four categories.

(Quoted from The Writing Program Guide for Students, Department of English, Virginia Tech, 1993-94)

Some basic analysis of texts can be completed with style and grammar checkers; however, to complete a thorough formal writing analysis, a number of papers written by students in the computer classroom and a number written by students in the traditional classroom should be read and graded by an outside committee of evaluators.

       As formal writing evaluation is completed, the evaluators need always to be aware of the pedagogical and theoretical assumptions of the teacher as well as the class for which the students were writing. A formal assessment which did not take into account such information would find a wide range of abilities--a paper produced for English 1004 would hardly compare to one written for English 5334. Similarly, the students producing a hypertext document for English 2125 should not be compared to those producing a hyperdocument for English 4214, and neither of these hyperdocuments should be compared to those being authored by students in English 5334, where a different theoretical approach is being used. 

Critical Skills Analysis

       Critical thinking and literacy skills can be explored by examining student writing for the analytical strategies listed above in the frequency-response section. Analytical skills such as intertextual analysis, application of a theoretical position, analytical reading (taxonomic), and aggressive active reading can be compared to similar findings in the traditional classroom. See above for the complete list of skills.

Linguistic Analysis

       Like critical skills analysis, linguistic analysis is discussed above in the frequency-response section. Examining students' syntactic and lexical patterns for sophistication, signs of authority or anxiety, or evidence of thinking and revising can provide data which can be compared to the self-assessments students complete in their survey responses, in addition to supplying information on the effect of the computer environment upon student empowerment. See above for the explanation of this analysis technique.
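       As a simple illustration of how such patterns might be tallied, the sketch below counts a few marker categories (first-person pronouns, hedges, and intensifiers) and average sentence length in a student's file; the word lists are hypothetical examples, not an established coding scheme, and any actual analysis would need a validated set of markers.

    # Count a few illustrative linguistic markers that might accompany
    # authority, anxiety, or engagement in a student's writing.  The marker
    # lists are hypothetical examples, not an established coding scheme.
    import re

    MARKERS = {
        "first_person": {"i", "me", "my", "mine", "we", "our"},
        "hedges": {"maybe", "perhaps", "might", "seems", "sort", "kind",
                   "probably", "guess"},
        "intensifiers": {"very", "really", "definitely", "certainly",
                         "absolutely", "clearly"},
    }

    def linguistic_profile(text):
        words = [w.lower() for w in re.findall(r"[A-Za-z']+", text)]
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        profile = {name: sum(w in wordset for w in words)
                   for name, wordset in MARKERS.items()}
        profile["words"] = len(words)
        profile["avg_sentence_length"] = len(words) / max(len(sentences), 1)
        return profile

    with open("student_mail_archive.txt") as f:
        print(linguistic_profile(f.read()))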