Thematic Working Group 5

Stochastic Thinking

Theme: Computer-Based Tools

Dave Pratt

University of Warwick, UK (dave.pratt@warwick.ac.uk)

The use of computers in data handling and statistical analysis is now widespread. Affordances that digital technology brings, such as the speed and automation of calculation, make the computer a natural, even indispensable, tool in the field of stochastics. Technology has been exploited to bring new tools that simplify and extend such computations. More recently, though, the emphasis has shifted so that the power of this technology is being harnessed to support students’ conceptual development. Few would dispute that concepts such as average, graphing, probability, distribution and dispersion demand our attention, both in terms of their relevance to modern society and with respect to their inherent complexity. Indeed, other themes within Working Group 5 reaffirm the difficulty of some of these ideas. My own work with ChanceMaker, reported under the Probability theme heading, is one such attempt to design a microworld for researching children’s understanding of probability. As part of Working Group 5, we were able to use Fathom, another example where pedagogical concerns have guided the design process. We also had the opportunity to work with NetLogo, a programming language designed for exploring emergent phenomena, which provides a new vision of how we might concretize randomness and frequency distributions.

Four papers are presented under this theme in Working Group 5, and they can be categorised along the lines sketched above. The software described in the paper by Karian exploits the calculating power of computers to support high-level use of statistical techniques. The paper by Alldredge focuses more on computer-enhanced course design but likewise addresses the teaching of statistics at university level. The remaining two papers, by Abrahamson & Wilensky and by Paparistodemou, focus much more on conceptual psycho-pedagogical development, though the former emphasises learning through design and the latter learning through interaction.

Mathematica and Maple are programs that have been available for several years and provide immense power to university undergraduates and researchers through their facility for executing complex symbolic manipulation. There has been considerable interest in the potential to use this power in learning situations. For example, just as we might argue that young children can learn important ideas about the behaviour of number through using a basic calculator, we might conjecture that students using Mathematica or Maple in a suitably designed course could come to appreciate the behaviour of, for example, integrals or limits. Interest surrounds the potential for learning about how a mathematical object behaves before learning how to construct that object. Karian describes how he has taken this notion into the domain of statistics by enhancing Maple with many additional statistically oriented primitives.
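
To make the idea concrete, here is a minimal sketch, not of Karian's Maple primitives themselves but of a hypothetical analogue in Python's SymPy library, showing how a student might query the behaviour of an integral or a limit before learning to derive either by hand:

```python
# A hypothetical analogue (in Python/SymPy, not Karian's Maple) of
# exploring how a mathematical object behaves before constructing it.
import sympy as sp

x, n = sp.symbols('x n')

# How does the area under exp(-x**2) behave over the whole real line?
print(sp.integrate(sp.exp(-x**2), (x, -sp.oo, sp.oo)))  # sqrt(pi)

# How does (1 + 1/n)**n behave as n grows without bound?
print(sp.limit((1 + 1/n)**n, n, sp.oo))                 # E
```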

Alldredge and Brown examine courses designed around two packages, ActivStats and CyberStats. This is an interesting pairing since the former offers an approach based on the provision of statistical tools for exploring the world, whereas the latter takes a more traditional perspective in offering a set of content to be understood. Gender differences were observed in how these software packages influenced the association between student beliefs and course performance. Nevertheless, the conclusion that ActivStats was arguably more effective across all students supports the general approach apparent not only in Karian’s work but also in the remaining two papers.

Paparistodemou’s innovative work reports on an experiment in which very young children were offered tools for changing the way a simple video-type game operated. The tool itself can be conceptualized as a two-dimensional representation of a distribution, and so it acts both as a control over the outcome of the game and as a visual (though entirely unconventional) graphical representation of distribution. In this respect there are similarities with the way that I describe the role of the workings box in ChanceMaker (see under the Probability theme).
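
To illustrate the underlying idea (a minimal sketch only, not Paparistodemou's actual design; the outcome names and band layout are invented for the example), one can imagine a picture in which each outcome owns a region of a square, so that editing the areas of the regions is the same act as editing the probabilities that govern the game:

```python
# Minimal sketch: a picture whose regions double as a probability control.
# The band heights below are hypothetical; relative area determines how
# often each outcome occurs when the game is played.
import random
from collections import Counter

# Each outcome owns a horizontal band of the unit square; its height is
# its probability. A child "redesigns" the game by resizing the bands.
bands = {"outcome_a": 0.5, "outcome_b": 0.3, "outcome_c": 0.2}

def play_once(bands):
    """Drop a uniformly random point into the square; the band it lands in wins."""
    y = random.random()
    cumulative = 0.0
    for outcome, height in bands.items():
        cumulative += height
        if y < cumulative:
            return outcome
    return outcome  # guard against floating-point round-off near y = 1.0

# Long-run play recovers the distribution drawn in the picture.
print(Counter(play_once(bands) for _ in range(10_000)))
```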

The fourth paper, by Abrahamson and Wilensky, shares the same background as Paparistodemou’s game, being designed in a parallel-processing version of Logo. NetLogo is a relatively new programming language based on its predecessor, StarLogo. In this paper the principal author, Abrahamson, describes his own struggles to design a microworld in NetLogo. The interest surrounds his need to embrace a variety of epistemologies along the way. He found that working on the microworld addressed fundamental questions such as whether probability reflects a level of confidence or the limit of relative frequencies, whether frequencies are empirical or mathematical, why co-occurrences occur less often than individual occurrences, and how geometric determinism is connected to emergent data. Abrahamson’s paper stresses the continuing uncertainty about the epistemological foundations of probability. Conventional approaches to the teaching of probability obfuscate the epistemological tension inherent in stochastic experimentation, and elsewhere Wilensky has discussed how, in his view, this leads to "epistemological anxiety". The authors’ approach is instead to provide tools that should enable students to explore these equally intuitive epistemologies, expose them to scrutiny, and ultimately appropriate formal mathematical tools for reconciling them.
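
As a minimal sketch of two of these questions (rendered here in Python rather than in the authors' NetLogo microworld, with a fair two-coin experiment assumed purely for illustration), a short simulation makes visible both the settling of relative frequencies and the necessary rarity of co-occurrences:

```python
# Minimal sketch (Python, not the authors' NetLogo microworld): relative
# frequencies settling toward a probability, and a co-occurrence (both
# coins heads) necessarily arising less often than either event alone.
import random

trials = 100_000
heads_a = heads_b = both = 0
for _ in range(trials):
    a = random.random() < 0.5   # first coin lands heads
    b = random.random() < 0.5   # second coin lands heads
    heads_a += a
    heads_b += b
    both += a and b

print(heads_a / trials)  # near 0.5: relative frequency approaches P(A)
print(heads_b / trials)  # near 0.5: likewise for P(B)
print(both / trials)     # near 0.25: P(A and B) <= min(P(A), P(B))
```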

 
