whitney@sciences.sdsu.edu (Roger Whitney) (08/01/90)
[Reprinted with permission from Academic Computing Magazine, March 1990]
MICROCOMPUTERS IN THE MATHEMATICAL SCIENCES: EFFECTS ON
COURSES, STUDENTS AND INSTRUCTORS
Roger E. Whitney
Computer Science Group
Mathematical Sciences Department
San Diego State University
San Diego, CA 92182
whitney@sdsu.edu
N. Scott Urquhart
Experimental Statistics
New Mexico State University
Las Cruces, New Mexico 88003
nsu@nmsu.edu
This article grows out of an effort to integrate computers into two
courses at San Diego State University. Computers were intended to
perform low level operations for the students. Our goals were to
increase motivation and understanding of concepts, and to allow
students to work realistic problems. We felt this approach would
particularly aid the weaker students whom we frequently observed
struggling with mechanical or low level operations.
Although our efforts were directed toward improving our students'
performance, we found the computer usage had more important impacts on
the courses, the instructors, and the department. The content and
intent of the courses changed, becoming more relevant to the students'
careers. The change in content made evaluation of the impact of the
computer usage on student performance difficult; the change in intent
made such measurements rather meaningless. The instructors' and the
department's work load increased, which may have significant
ramifications in the future. We did observe more enthusiasm and better
performance among some students. Our initial impression of performance
suggested that weaker students performed worse than they should have,
rather than better. We spent considerable time investigating the weaker
students' performance. From observing these students with wildly
differing computer experience, we recognize the necessity of computer
literacy. Although both of us are committed to a computer enriched
instructional approach, we decided for totally different reasons not to
continue this experiment under present conditions.
In what follows we discuss our motivation, how we used computers in our
courses, the impact this change had, and offer a way to avoid some of
the pitfalls we encountered.
Our Motivation
Our primary motivation in this project was to improve our students'
performance. For some time we had noticed that students entering our
courses were inadequately prepared and rather unmotivated. As an
example of inadequate preparation, computer science seniors commonly
struggle with any mathematical operation dealing with logarithms even
though they have taken at least a year of calculus. Our students were
rarely motivated to go beyond the minimal requirements. Even by the end
of our courses students often displayed a lack of conceptual
understanding of the material covered. Consequently they had almost no
ability to apply the course material to solve problems outside the
narrowly focused cookbook problems found in textbooks. Frequently we
noticed students memorizing pages of information to compensate for a
lack of understanding of basic concepts. Many students did not seem to
notice that the course material was relevant outside the class.
We were dismayed by this situation for two reasons: the long term
consequences and a professional sense of failure. Students eventually
must apply some course material in their jobs. Although higher
education has several objectives, an important one is to help students
learn skills which will enable them to be a productive member of the
work force. The National Research Council summarizes, "We have
inherited a mathematics curriculum conforming to the past, blind to the
future and bound by a tradition of minimal expectation....on average,
U.S. students do not master mathematical fundamentals at a level
sufficient to sustain our present technologically based society."
Students are not learning the mathematical skills society needs. This
cannot continue long without industry, education, and thus the nation
suffering.
If our students are not learning the skills we think they need, then
at least some of the blame belongs to us as educators. After years
of being fed facts, students will not magically understand how all these
facts relate and become adroit at solving problems. We must teach the
skills the students need. However, the problem is global to the
educational system, hence hard to overcome locally in a few university
courses.
A Role for Computers
In order to address the problems we were observing we turned to the
computer for help. Computers can play an important role in addressing
inadequate preparation, limited motivation, lack of conceptual
understanding, and weak problem solving skills among students. Our
hypothesis is that computers can help students learn by:
* reducing the cognitive load on a student and
* improving students' ability to complete computations.
We felt that the computer would particularly aid the weaker
students.
Little learning occurs when people are cognitively overloaded. The more
information students must process to perform a particular task, the less
they will comprehend the basic principles underlying the task. Students
with a poor grasp of low level computations face enormous obstacles in
learning computationally complex topics: They must simultaneously
process the new information and recall how to perform the low level
operations. If these low level operations can be performed easily and
readily by a machine, whether computer, calculator, or abacus, the
cognitive load for these students should be reduced. Secondly, machines
can aid the students in successfully completing computations. Learning
is hindered when students cannot separate their arithmetic errors from
those of strategy or approach. When a machine is used appropriately to
perform computations, the results of the computations are correct.
Students can then focus attention on the far more important task of
selecting the appropriate computations for the situation.
Students' motivation improves when they use computers to perform
computations. This allows them to solve realistic problems in a timely
fashion, thereby reducing repetitious and boring assignments. Until
recently it has not been practical to assign large-sized problems. For
example beginning statistics students have not been able to handle data
sets containing hundreds of observations and several variables using
manual computations. However, we gave beginning statistics students
data sets of this size which they analyzed all semester. Although the
same ideas and principles could be illustrated with a small data set of
ten to twenty observations, the students were highly motivated to
analyze their own data set. Once computers are routinely used to
perform a computation, the need for students to perform that computation
by hand diminishes; in turn, the number of "turn-the-crank" problems
assigned can be reduced.
Finally, the computer can help students learn concepts. If a
picture is worth a thousand words, then dynamic simulation may be worth
a thousand pictures. Computers can provide dynamic illustrations and
even give students control over simulations, greatly aiding
understanding of processes or concepts being learned. Students also can
perform computer based experiments to validate concepts and theorems
covered in class. This computer use shifts the emphasis from "Are these
computations performed correctly?" to "Are these the proper
computations to perform for this situation?". In turn this shifts
attention from mechanics toward understanding the underlying principles.
Reducing the amount of time needed to perform low level operations
increases time which can be invested in understanding underlying
concepts.
The skills needed to successfully accomplish problem solving defy
simple description; successful problem solvers use diverse approaches.
Nevertheless, solutions usually are based on an understanding of
important concepts, the representation of the problem in terms of these
concepts, and the execution of steps leading from the representation to
the solution. Exercises of progressive difficulty are essential to hone
these skills. Computers can help students learn by accomplishing a
solution once the problem has been represented in an appropriate manner.
Much practice is required to learn to represent problems in useful ways;
computer-based instructional materials can be organized to provide many
opportunities and paths to practice. The work reported here didn't
focus heavily on this use of computers.
The Two Courses
The two courses involved in this report are an introductory statistics
course and a course on analysis of computer algorithms. The results
reported here cover one semester. However, the software developed locally
for the algorithm course was used intermittently for several years during
its development. The goal of computer use in each was to:
* reduce cognitive load on the students,
* allow the students and the instructor to concentrate on
concepts, and
* allow the student to work realistic problems.
We will describe how this was done in each course in a later section.
No attempt was made to use the computer as a replacement for the
instructor, textbook, or lecture.
Macintosh microcomputers were used in this study, because we felt
students could master this device in a short time and their price was
not prohibitive. The goal was to reduce cognitive load, not increase
it. The computer lab used by the students contained twenty-one
Macintosh computers with hard drives. Each of the computers could
access either of two dot matrix printers via an AppleTalk network. The
room housing the equipment was open eighty-five hours per week, but was
staffed only twenty-five hours per week, ten hours by an undergraduate
teaching assistant and fifteen by faculty. The computer lab was located
across campus from the instructors' offices, a walk of ten to fifteen
minutes. When problems arose during unstaffed time, students either
walked across campus looking for help or waited until office hours.
Unfortunately, this made it hard for students to correct errors in use
of the software. We also used a Macintosh connected to a projector,
both secured on a cart. This Mac-on-a-cart was used in the classroom
to demonstrate the use of software and as an instructional tool.
Introductory Statistics
This course introduces students to the main concepts and tools of
statistics at the sophomore/junior level. It covers descriptive
statistics, sampling and probability, the binomial and normal
distributions, estimation, tests of hypotheses and confidence intervals,
inference to one and two populations, followed by a quick survey of
simple linear regression and the one-way analysis of variance. The
course had no prerequisite for computer knowledge because this study was
the first attempt at introducing computers into this course. The
mathematics prerequisite is approximately two years of high school
algebra, enforced by departmental examination. The instructor, Scott
Urquhart, was at San Diego State University on sabbatical leave from New
Mexico State University. He has many years of experience using diverse
computers for statistical analyses and using computers for helping
students learn. He had designed, implemented, and used an interactive
mainframe computer package for supporting instruction in a second
semester regression course for students who (possibly) knew nothing
about computer usage. He also teaches advanced statistical analysis in
sophisticated computing environments.
In the instructor's past experience, he found that students frequently
confused the subject (statistics) with the tool (computers). Given this
background, the plan for this experimental course was to concentrate on
computer usage for about five weeks, using statistical software as a
continuing illustration. Thereafter attention was focused on the
statistical issues of sampling and inference. The intent was to
introduce computer usage through descriptive statistics, then to
encourage students to use the computing resource to explore how the
statistical techniques work. It was hoped that this would allow less
attention to details of computing and more concentration on concepts.
Each student was given his or her own data set consisting of three parts:
* a population (of 100 wages);
* 20 samples selected out of 200 random samples, each of size
ten, which were taken from this population;
* the means and variances of the wages for each of the 200
random samples.
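The data sets just described might be generated along the following lines. This is a sketch, not the instructor's actual program; the wage distribution and the seed are illustrative assumptions, since the article does not specify how the populations were produced.

```python
import random
import statistics

random.seed(42)  # each student would get a different seed

# A population of 100 wages. The actual distribution used is not
# stated, so a uniform spread of hourly wages is assumed here.
population = [round(random.uniform(5.0, 25.0), 2) for _ in range(100)]

# 200 random samples, each of size ten, taken from this population.
samples = [random.sample(population, 10) for _ in range(200)]

# The means and variances of the wages for each of the 200 samples.
means = [statistics.mean(s) for s in samples]
variances = [statistics.variance(s) for s in samples]

# Each student worked with 20 of these samples, plus the full
# table of 200 means and variances.
student_samples = samples[:20]
```

Because every student's seed differs, every student's population, samples, and answer sheet differ, which is what made homework copying pointless.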
Each student had a unique population and thus unique samples. Most of
the thirteen homework assignments were given in terms of students' own
data. The unique data sets meant unique homework solutions, hence
students had to do their own work; substantial copying of homework
didn't occur as it often does in statistics courses at this level. (The
issue was learning, not grades.) Further comparisons of results between
students and summaries in class emphasized variations of the sort
statistics is intended to cope with. A separate computer program
written by the instructor and used here as well as at his home
institution produced individualized answer sheets.
The software used was DATA DESK, Student Version (1.0),
written by Paul and Agelia Velleman. Several features in the
Professional Version of this software had been removed to produce a
simpler and inexpensive Student Version. Functions of the DATA
DESK software have three relatively distinct areas: data entry and
manipulation, graphics, and statistical analysis. Its data-oriented
functions support data entry in a spreadsheet-like format with
many user configurable attributes. Macintosh clipboard actions are
supported and are consistent with other Macintosh applications.
Further data-oriented features include a data import facility and a
substantial facility for performing transformations and/or
restructuring of a data set. Variables also can be created from
entered patterns or according to random specifications.
The graphics-oriented functions of the DATA DESK capitalize on
the Macintosh's graphics capabilities. The major features support
box plots in several forms, histograms, and scatter plots. These
features allow resizing of displays, but not overlaying of several
plots on the same axis system nor simple labeling for axes and
titles. (The intent is for graphic results to be passed out of the
DATA DESK for subsequent editing in a word processor.)
The analysis features of the DATA DESK support most elementary
parametric statistical analyses; by prior application of a ranking
transformation several nonparametric analyses also are supported.
Currently available techniques extend through one-way analyses of
variance and multiple linear regression. More recent versions of the
DATA DESK support more complex statistical analyses. Several
features clearly are designed to encourage students to explore
statistical testing and confidence intervals.
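The ranking transformation mentioned above replaces each observation with its rank in the data set, averaging the ranks of ties, so that a subsequent "parametric" procedure behaves like a nonparametric one. A minimal sketch of such a transformation (our own illustration, not DATA DESK's implementation):

```python
def ranks(values):
    # Assign ranks 1..n to the values, giving tied values the
    # average (mid) rank of the positions they occupy.
    order = sorted(range(len(values)), key=lambda i: values[i])
    result = [0.0] * len(values)
    i = 0
    while i < len(order):
        # Find the run of values tied with values[order[i]].
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        midrank = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            result[order[k]] = midrank
        i = j + 1
    return result

print(ranks([3.2, 1.5, 4.8, 1.5]))  # → [3.0, 1.5, 4.0, 1.5]
```

Running an ordinary two-sample t test or regression on the ranked data then approximates the corresponding rank-based nonparametric analysis.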
Each student was required to purchase a copy of the DATA DESK.
The instructor then built an operational disk containing the
operating system, needed parts of the software, and that student's
personal data sets.
Students in this part of the experiment were in two lecture
sections. The first met for fifty minutes, afternoons, three times a
week. It began with thirty-one students of whom twenty-eight
completed the course. The second section met during evening hours,
for seventy-five minutes, twice a week, and started with twelve
students of whom nine completed it. These classes contained mainly
students typical of a commuter university: undergraduates, working
part time, living several miles from campus.
About half of the students came from the health sciences and
related biological areas; the remainder were nearly evenly divided
between the mathematical and social sciences. Although the course
was numbered to be at the sophomore level, about two-thirds of
each section was past that level. The two sections together
included seven students who had completed at least a bachelors
degree; one was an MD who had practiced. These were diverse
classes, but performance ranged from good to poor in every
distinguishable group. However, performance was modestly
associated with both major and level: math science students scored
about 0.3 grade points higher than the rest of the class (from C+ to
B-), and the postgraduate students scored another 0.3 higher (from
C+ to B).
In-class instruction consisted of presentation of material using
a chalk board, demonstrations using the Mac-on-a-cart, planned
discussions of the objectives and results of the homework
assignments, and spontaneous discussion of relevant questions
raised by students. The Mac-on-a-cart was taken to class except for
exam review days (3), exams (2) and the final exam. It was used
sometime during the class period most days, for as little as five
minutes or as much as forty minutes.
Some initial information was gathered about the students'
knowledge of computing. About one-third of the students had a
little experience with a Macintosh; none regarded him- or herself as
proficient. About half of the students had experience with another
micro, mainly PCs. Only about one-fifth of the students had
experience with a system larger than a microcomputer. Nine of the
students reported they had no experience with any kind of computer.
Grading initially caused understandable anxiety. An unknown
instructor was teaching a difficult course with a dramatically
different instructional approach. The announced grading plan had
points for homework, quizzes, participation, and exams, but no fixed
boundaries for specific grades. The instructor assured the classes
that the class average grade would be between 2.25 and 2.50 (C =
2.00) for all students who actively participated for the whole
semester; the final average was 2.35. Thus, comparison of grades of
students in these classes with other classes has no validity.
Shortly after mid-semester, an anonymous survey of student
attitudes was conducted; a second was conducted near the end of the
semester. Some of our observations about attitudes in this course
come from these surveys; others come from personal observation
and numerous casual conversations with the students.
Algorithms Course
The computer science course involved in this study was a senior
level computer algorithms course. It covers both important
computer algorithms and basic principles and techniques of
algorithm analysis. Typically such a course covers algorithms
concerned with sorting, graphs, string matching, geometry, and some
numerical evaluations. The analysis of an algorithm determines the
resources, usually time and space, the algorithm uses. Such an
analysis uses mathematical techniques and timing studies of the
algorithm. The mathematical analysis shows the amount of a
resource required by the algorithm as a function of the size of the
input. Often this analysis only identifies the order of magnitude of
this function. For example an algorithm with input size N may run in
a + b*N or fewer time units, where a and b are constants. In many
cases constants, such as a and b, may be extremely difficult to
determine analytically. Timing tests on machines can give an idea
of their values and how the algorithm really performs. The
algorithms course is meant to provide future programmers with
insights and tools to make informed decisions affecting the speed
and space requirements of programs.
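The timing side of this analysis can be sketched in a few lines: run an algorithm at several input sizes, then fit t = a + b*N by least squares to estimate the constants the mathematical analysis leaves undetermined. The linear search below and the chosen sizes are our own illustrative assumptions, not the course's algorithms.

```python
import time

def linear_search(data, target):
    # An O(N) algorithm: worst-case running time roughly a + b*N.
    for i, x in enumerate(data):
        if x == target:
            return i
    return -1

# Time the algorithm at several input sizes N.
sizes = [10_000, 20_000, 40_000, 80_000]
times = []
for n in sizes:
    data = list(range(n))
    start = time.perf_counter()
    linear_search(data, -1)  # worst case: target is absent
    times.append(time.perf_counter() - start)

# Fit t = a + b*N by ordinary least squares to estimate the constants.
k = len(sizes)
mean_n = sum(sizes) / k
mean_t = sum(times) / k
b = sum((n - mean_n) * (t - mean_t) for n, t in zip(sizes, times)) / \
    sum((n - mean_n) ** 2 for n in sizes)
a = mean_t - b * mean_n
print(f"estimated a = {a:.2e} s, b = {b:.2e} s per element")
```

This is essentially the workflow AL packaged behind its timer, plotter, and least-squares fitter.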
The algorithms course is difficult for students. Algorithms are a
dynamic process which can be difficult to understand. Our students
tend to have some of the mathematical deficiencies mentioned
earlier. Since the course requires understanding and performing
mathematical analysis, any deficiency of mathematical skills
hinders students in the course. Because a high level of mathematics
can be required to analyze algorithms, students sometimes can
neither perform nor understand the analysis of an algorithm. This
can be devastating to the students. Finally, students find it
difficult to relate the mathematical analysis to performance of
algorithms and programs.
The software used was MacBalsa, written by Marc Brown, and
Algorithms Lab (AL), written locally by Roger Whitney. MacBalsa is
an algorithm animation program. This program graphically
demonstrates the operation of algorithms. Each algorithm has many
different views. Algorithms can be viewed singly or in groups. The
user can step through an algorithm or run it as a movie. This program
constantly wins kudos from students. They claim it aids
understanding of the algorithms far more than lectures or the text.
The text for the course, Algorithms by Sedgewick, includes graphics
from the workstation version of MacBalsa.
Preliminary versions of AL were written for a Vax 780
minicomputer by John Donald and Roger Whitney. The software was
moved to the Macintosh using HyperCard as an interface. AL has
three tools to aid the study of algorithms: a timer, plotter, and a
least squares fitter. Students select all actions and commands by
clicking a mouse on a button or menu entry. The timer allows
students to measure the elapsed running time of an algorithm for
data sets of sizes they choose. The results (size of a data set,
associated running time for the algorithm) are displayed in a table.
Once an algorithm is timed, the two other tools support
investigation of the resulting numerical data. Although students can
examine the source code for all the algorithms in AL, they can
neither modify existing algorithms nor add their own. Providing
such a facility is a major goal for future versions, although earlier
versions of AL demonstrated the importance of keeping the
programming details of such a process to a minimum.
The software in this course was used to:
* illustrate algorithms,
* verify the mathematical analysis of algorithms,
* perform analyses not possible via mathematical means, and
* allow students to select proper algorithms for given situations.
When an algorithm was covered, MacBalsa was used in class to
demonstrate the algorithm. Students were asked to investigate its
performance using AL. If the mathematical analysis indicated that
algorithm A is faster than algorithm B, the students were asked to
verify the analysis by experimentation with AL. This helped
students translate factual information given in the text and lecture
into personal experience. When the mathematical analysis could not
determine which algorithm is faster, students were asked to
determine this experimentally. In a conventional algorithms course
students implement the algorithms and then time them. Such
implementation can take days or weeks to complete, instead of the
few minutes required when using AL. Local experience indicates
that when students implement a set of algorithms they do not
investigate their performance, and fewer algorithms can be
examined in a semester.
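An experiment of the kind AL supported might look like the sketch below: time two sorting algorithms at a small and a large input size and compare. The two textbook sorts and the sizes are our own illustrative choices, not the course's actual assignments.

```python
import random
import time

def insertion_sort(a):
    # O(N^2) on average: fast for small N, slow for large N.
    a = a[:]
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def merge_sort(a):
    # O(N log N): more overhead per element, but scales far better.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def time_sort(sort, data):
    start = time.perf_counter()
    sort(data)
    return time.perf_counter() - start

random.seed(1)
for n in (50, 5000):
    data = [random.random() for _ in range(n)]
    t_ins = time_sort(insertion_sort, data)
    t_mer = time_sort(merge_sort, data)
    print(f"N={n}: insertion {t_ins:.5f}s, merge {t_mer:.5f}s")
```

Seeing the crossover firsthand, rather than reading about it, is exactly the kind of personal experience the assignments aimed for.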
Finally, students were asked to select the best algorithm in
several typical sorting situations. They were instructed to support
their selections with suitable evidence. The situations were
designed to approximate sorting problems confronting working
programmers; the answers could not be determined exclusively by
either mathematical analysis or timing algorithms.
Many students in this course experienced difficulty performing a
mathematical analysis of an algorithm, apparently often as a
consequence of their deficiencies in mathematics. Given our
thoughts on applying computers to learning, why didn't we provide
tools to aid such students in performing the mathematical analysis
of an algorithm? The task of producing such a system was
overwhelming. Any system we designed to aid students in solving a
reasonable set of problems was beyond our resources to construct.
Two sections of the algorithms course, taught by different
instructors, used AL. About seventy students in total took the
course. All were computer science majors for whom it is a required
course; nearly all were seniors. They had extensive experience with
mainframe and minicomputers. Some had experience with several
different operating systems. Many of these students had worked as
computer professionals. However, the Macintosh computer was new
to most of them. The ability of the students varied dramatically;
some take the course several times before passing.
Results
As stated earlier the computer usage was designed to:
* reduce cognitive load on the students,
* allow the students and the instructor to concentrate on
concepts, and
* allow the student to work on realistic problems.
We had hoped the computer usage would improve student
motivation, increase students' understanding of concepts, and aid
the weaker students. We met with some success, primarily among
the better prepared students. Perhaps the most interesting results
of the computer usage are: the lack of improvement by the weaker
students, the cost of the experiment to the instructors and the
department, and the changes that occurred in the courses. The
weaker students, primarily those in the statistics course, did not
show improvement. On the surface this seems counter to our
expectation that a reduction in cognitive load should improve
learning: reducing cognitive load should aid the weaker students as
much as the strong students. Closer examination reveals the
situation to be rather complex. In comparing the statistics students
against the computer science students we conclude that the
computer imposed a large cognitive load on the weaker students. We
feel a computer literacy course would alleviate the problems the
weaker students faced with the computer.
Our experiment exacted a high cost from the instructors and
the department. Equipment had to be obtained, housed and
maintained. This required a concerted effort before the experiment
was run and requires a continuing effort into the future. The full
consequences of these costs are not yet known.
The course changed not only in content but also in intent. Clearly
some material had to be added to the courses to cover the
computer usage. The courses did focus more on conceptual
understanding of the material and thus became more useful to the
students.
Below we report how well we achieved our goals, as well as the
project's impact on the courses, the instructors, and the department.
Motivation
Motivation and enthusiasm defy simple measurement. We can
only report our observations: In both courses the students' attitude
toward the use of the computers was very positive and generally
appreciative.
The statistics students appreciated being required to learn about
microcomputers: Most were pleased with the opportunity to learn
about this new and useful tool. One student commented that in the
real world he would use the computer to perform statistics, hence
using the computer in class was the only reasonable way to learn the
subject. Others reported using the software on assignments for
other courses. One of our goals was to make the material more
relevant to the student. Having students utilize material in another
course, because of its usefulness, indicates some success in that
goal. As a result of this course several undeclared students decided
to become statistics majors. Nevertheless, the students were far
from unanimous in their evaluation of the value of computer use in
this course. They were asked to consider a future student like
themselves and to express their recommendation as to how this
course should be taught. The results were: strongly favor traditional
(noncomputer), 9; weakly favor traditional, 5; neutral, 6; weakly
favor computer-based approach, 9; and strongly favor
computer-based approach, 5. A slight plurality (19 to 14) said the course
helped them to become more interested in computers. They were
nearly equally split (15 to 17) in whether the course helped them
become more interested in statistics, but eight of them indicated
the course made them plan to take further statistics course(s), and
another nine answered with "MAYBE." Clearly the computer-based
approach appealed to and motivated some students.
The computer science students recognized and appreciated the
amount of resources directed at them, both the hardware and the
effort to produce the software. This modest effort to improve a
course was far more than they normally observe. When surveyed at
the end of the course eighty percent of seventy-three students
preferred using the software to using standard methods for timing
algorithms. Ninety-two percent of the students felt that they
learned "a lot" or "some" by timing the algorithms. The remaining
eight percent felt that they learned very little or nothing. The
sorting assignment (where students selected the best sort for each
situation) received high praise in the course survey.
Learning
The instructor of the statistics course has taught statistics for
twenty-four years. His experience with students in this type of
course was used to evaluate the performance of these students. The
better students made more progress than normal. For these students
the computer was a useful aid in learning statistics. Hence, for the
better students our hypothesized computer usage was valid.
However, the weaker students did not perform as well as weak
students in similar courses not using computers. We observed that
using the computer was an obstacle for a number of students. These
students had a hard time operating the computer and separating the
software usage from the statistical concepts. During the course of
the semester, various students would finally master the machine
and make rapid gains in statistics. The students (nine of
forty-three) who reported no computer experience at the beginning of the
course divided into two distinct groups, ones who overcame this
obstacle (three of the nine), and those for whom the computer
remained an obstacle throughout the course.
We investigated how these students performed in other courses
by examining their entire transcripts at the end of the following
semester. We found that student performance in this course was
highly correlated with past performance. There were exceptions,
some noteworthy but others attributable to causes unrelated to the
course. For example one student had just started a new business so
she spent little time on the course and performed worse than her
past record indicated she should have. Several of the students
receiving grades of D or F in this course retook it the following
semester; most received almost the same grade the second time,
even though it was taught by a different instructor and without
computer use. We also discovered a high correlation between
computer experience and the strength of the student. The stronger
students had higher GPAs and more computer experience. This makes
it impossible to judge whether the computer hampered the weaker
students, since they were the ones with the least computer
experience. A plot of students' GPA at the end of the following
semester against rank in the computer-based statistics course
displayed several outliers deserving comment. Of the four students
who did substantially better than their GPA history, three were
among the math science students who declared a concentration in
statistics by the end of the semester. Several social science
students who struggled with the computer usage fared less well
than their GPA history.
This contrasts with the effects in the computer science course.
As far as we can determine none of the algorithms students were
seriously impeded by the computer. The lab helped the students
learn about data collection and data analysis, a topic not covered in
previous years. The students also performed better at determining
which algorithm to use in a given situation. For the first time,
students could answer such questions from direct experience, rather
than only from a book or lecture as in the past. Both instructors of
this course felt that students were better prepared to use the
course material in their professional lives. The students did not
improve in their ability to perform mathematical analysis of
algorithms. However, this is not surprising as the changes in the
course were not directed toward increasing students' ability to
perform mathematical analysis. The course changes made
comparison of grades in previous classes rather meaningless.
In summary, we feel the use of computers helped the students
learn the important concepts in the courses, with the better
students benefitting the most. As seen in the statistics course, the
weaker students did not benefit from the use of the computer and
probably were the ones who least liked it. Even though the
instructor felt they performed worse, their grades were comparable
with their past performance. Given the experience with the
algorithms course, we feel that if the weak statistics students had
prior computer experience they probably would have benefitted more
from the computer use.
Effects on Courses
An interesting side effect of the computer usage was a tendency
to change the courses. The algorithms course changed in several
ways. The first and unexpected way was the need to discuss the
collection and analysis of data, since the computer science students
had no background in this area. They had to be taught what type of
data to collect, what tests to use on the data, and what the results
meant.
For the first time, we were able to have the students investigate
which algorithms would be appropriate in given situations. The
course drifted from explaining algorithms and measuring an
algorithm's mathematical performance toward asking which algorithm
performs better, why it performs better, and when it should be used. More
time was spent on how to select an algorithm, which is part of the
purpose of the analysis. The students responded enthusiastically to
this shift; they viewed it as useful training.
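An exercise of the sort described above, selecting an algorithm from measurements rather than from a book, can be sketched as follows. This is a minimal present-day illustration, not the lab software the course actually used: it times a quadratic sort against an n-log-n sort on shuffled and on nearly sorted input, where the usual asymptotic ranking reverses.

```python
import random
import time

def insertion_sort(a):
    """O(n^2) in general, but nearly O(n) on almost-sorted input."""
    a = list(a)
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def merge_sort(a):
    """O(n log n) regardless of input order."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def time_sort(sort, data):
    """Wall-clock seconds to sort a copy of the data."""
    start = time.perf_counter()
    sort(data)
    return time.perf_counter() - start

random.seed(1)
shuffled = [random.random() for _ in range(2000)]
nearly_sorted = sorted(shuffled)
nearly_sorted[0], nearly_sorted[-1] = nearly_sorted[-1], nearly_sorted[0]

for name, data in [("shuffled", shuffled), ("nearly sorted", nearly_sorted)]:
    print(name,
          "insertion: %.4fs" % time_sort(insertion_sort, data),
          "merge: %.4fs" % time_sort(merge_sort, data))
```

Deciding which timings to collect, and what the reversal on nearly sorted data means, is exactly the data-collection and data-analysis skill the lab had to teach.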
The changes in the algorithms course resulted from questioning
the relevance of a conventional course to students. The longer the
project continued, the stronger the questioning became. The field of
algorithm analysis is dominated by academic researchers who
investigate the complexity of algorithms and write research papers
and books from their results. Our students become programmers and
system analysts. The researcher and the programmer perform vastly
different tasks. It is not clear how much of the classical subject
matter is relevant to the programmer. Even the relevant parts
usually are presented in a manner that disguises their value. This
study led to a redefinition of the course for the algorithms
instructors.
The intent for the statistics course was to make it more
conceptual than similar traditional courses. The plan was for
routine computation to be relegated to the computer so students
could concentrate on understanding and interpreting results. The
stronger students embraced this approach and made substantial
progress; the weaker students were much slower to adapt to the
computer usage and resisted the move from "How do you do it?" to
"What does it mean?"
Computer Literacy
The two courses differed clearly in how successfully
computers were used as a learning tool. We feel that the biggest
difference between the courses lay in the difference in computer
literacy between the students in the two courses. Students in the
algorithms course had worked on several computers and several
operating systems although almost none of them had prior
experience with a Macintosh. By contrast, about twenty percent of
the students in the statistics course had experience with a
Macintosh, but twenty-five percent had no computer experience and
none of them had the diversity of experience typical of students in
the algorithms course.
Students in the algorithms course adapted to the Macintosh in a
week or so, even though they received less than twenty minutes of
explicit instruction on its use and about the same amount of
instruction on use of the software. By contrast, the statistics
course presented about three hours, spread over five weeks, on using
a Macintosh and allied matters such as backup and printing. Some
aspect of the statistics software was demonstrated almost every
class meeting. Features of the software, not the content of
statistics, probably received five hours of instruction spread
through the course, but concentrated more heavily during the early
part of the semester. Some students in the statistics course
mastered the Macintosh and the statistics software in a few weeks.
These probably were mostly the fourteen percent who strongly
favored the computer oriented approach. On the other hand, a number
of students remained frustrated with the computers and/or
software throughout the course. Many of these students probably
were among the twenty-six percent who indicated that computer use
was of little value or hindered their learning.
Computer literacy appears to be the fundamental difference
between students in the two courses that contributed to the
difference in the success and acceptance of computers for learning.
When students with experience in several environments had trouble,
they recognized that fact and tried something different. Students
with less experience had substantial difficulty isolating and
resolving problems. They were prone to repeating the same error
several times without recognizing the mistake, and then blaming
the computer for not doing what they wanted.
Our experience suggests some relevant topics for a course in
computer literacy. Students need to understand the role of an
operating system and how to communicate with it, whether it is
mouse driven, menu driven, or command driven. A hierarchical file
system provides a powerful way to organize files, but it embodies a
degree of sophistication which needs to be explicitly understood
before it can be used. The Macintosh paradigm of file folders and
icons slightly simplifies the use of a hierarchical file system for a
user who is familiar with a typical manual filing system. This
paradigm breaks down for a user who never has dealt with a complex
filing system because it explicitly indicates only two levels; few
manual systems have folders within folders within folders. An
experienced computer user takes many actions almost by reflex, i.e.,
without conscious thought: starting and ending use of a piece of
software; making choices or indicating actions required for the
software to continue; naming files; using a network, even if only for
printing. Novice users need to be taught such things. More conscious
actions include making backups to recover from the inevitable
mistakes we, software, and machines make. Ethical matters related
to copyright and respect for the privacy of others' work also deserve
attention.
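The folders-within-folders nesting described above, the point at which the Macintosh filing paradigm stops matching a novice's manual filing experience, can be made concrete with a small present-day sketch. The directory names here are invented for illustration; the nesting, not the names, is the point:

```python
import tempfile
from pathlib import Path

# Build a three-level hierarchy: folders within folders within folders --
# the nesting that few manual filing systems prepare a novice for.
root = Path(tempfile.mkdtemp())
deep = root / "Courses" / "Statistics" / "Homework"
deep.mkdir(parents=True)
(deep / "assignment1.txt").write_text("data analysis notes\n")

# Walk the tree, indenting each entry by its depth below the root,
# so the hierarchy the icons hide becomes visible all at once.
for path in sorted(root.rglob("*")):
    depth = len(path.relative_to(root).parts) - 1
    print("  " * depth + path.name)
```

Seeing the whole tree at once, rather than one folder window at a time, is the explicit understanding we argue a literacy course must supply.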
Effects on Instructors and Department
The use of the computers in these courses increased the amount
of time required to teach them. The increase in work load had
different sources. First, the involved faculty found themselves
operating a small computer center. This included raising funds,
purchasing equipment, installing software, debugging network
problems, hiring and training student lab assistants, deciding lab
operation policies and serving as lab proctors. Such work need not
be done by the instructor of the courses, but in this case it was. The
other source of the increased work came from integrating the
software into the course. Course lectures and assignments needed
to be changed to incorporate the use of the computers.
The department also faces an increase in work load. The first
author was the primary force behind the microcomputers project.
Funds to purchase, about $150,000 to date, and operate the computer
lab were raised by him outside of the department's budget. However,
the department must now undertake the operation and maintenance
of the lab. This represents an increased work load for an already
overworked faculty and staff and increased expenditures in an
already meager budget. This equipment was obtained at a time when
the department greatly increased its computer equipment holdings
without funding and staffing increases. It is not clear how the
department's technical staff can maintain their current work load.
The department also faces the problem of incorporating the
computers into its curriculum. The major effort has been by the two
authors, one of whom was a visitor. A few other faculty use the lab
in the manner we described. One statistics instructor jumped on the
bandwagon and is enthusiastically incorporating computers into his
second level courses. Indications are that the statistics curriculum
will undergo major changes, partly due to the encouragement of the
statistics coordinator and partly due to our initial demonstrations.
The changes in the statistics curriculum may be moving more toward
national trends than breaking new ground. It is too early to
determine if other parts of the department will use the machines as
tools for learning.
Would We Do It Again?
The answer in both cases is a qualified no, but the qualifications
are totally different for the two courses. In the algorithms course,
the qualifications center around the untenured status of the senior
author. In the statistics course, the qualifications focus on some
students' lack of computer experience.
The statistics instructor would not use computers in the
statistics course again under present conditions. The students'
problems with computers and software use must be addressed first,
not as part of a one-semester introduction to statistics. One
solution might be to train the staff sufficiently for the lab. This
would require resources that seem far beyond our department or
similar ones. Another solution would be a computer literacy course
required of all freshman students, perhaps in conjunction with
introductory English. As John Wheeler of UC Berkeley indicates,
"Students learn by doing in a meaningful context." A computer
literacy course needs motivation beyond learning how to use a
computer, which is why we feel it should be offered in conjunction
with another course. This is our preferred solution; it would allow
all subsequent courses, including statistics courses, to concentrate
on their subject matter, using available tools to enhance learning. The
second author is pursuing this approach at his home institution.
Summary
The mathematical science educational community is facing a
challenge in adapting to the demands of our rapidly changing world.
We used computers as tools for students in two courses in an
attempt to meet this challenge. We found that students improved
their grasp of the subject matter, and student motivation increased,
although not universally. The courses became far more relevant. We
also found the resources needed to successfully integrate the
computers into the curriculum were far greater than those available
to us. Nevertheless, an effective prerequisite course on computer
literacy would alleviate many of the difficulties we had in the
statistics course. In looking back on the effort, we cannot see how
institutions can afford the resources needed to integrate computers
successfully into the curriculum. Looking into the future, we cannot
see how the universities can survive without integrating computers
into the curriculum.
Note: Partial Support for this work was provided by the National
Science Foundation's College of Science Instrumentation Program
through grant #CSI-8750569 and by a donation from Apple Computer,
Inc.
We would like to thank John Donald for his insightful comments
on drafts of this paper.
CONTACTS
For information about Data Desk write to:
Odesta Corporation
4084 Commercial Avenue
Northbrook, IL 60062
(312) 498-5615
The student version of Data Desk is available at:
Kinko's Academic Courseware Exchange
255 West Stanley Avenue
P.O. Box 8000
(800) 292-6640 (in CA)
(800) 235-6919
For more information about MacBalsa write to:
Marc H. Brown
DEC Systems Research Center
130 Lytton Avenue
Palo Alto, CA 94301
(415) 853-2152
mhb@src.dec.com (e-mail)
For more information about AL write to:
Roger Whitney
Math/Computer Science
San Diego State University
San Diego, CA 92182
(619) 594-6191
whitney@sdsu.edu (e-mail)
AL is being prepared for national distribution.
REFERENCES
National Research Council. Everybody Counts: A Report to the Nation
on the Future of Mathematics Education. Washington, D.C.: National
Academic Press (1989).
John Wheeler, Private communication, July 1990.