Probabilities and statistics dominate our lives, yet few of us really understand them; here's an attempt to shed some light. Bennett (Mathematics/Jersey City State Coll.) uses practical examples to convey the history and nature of her subject. Ancient societies used dice or bones not only for gambling but to decide matters of life and death, on the theory that a random mechanism made the divine will known without human bias. Old Testament Hebrews drew lots to divide an inheritance (hence the term "lot" for a parcel of land). The I Ching is a more elaborate method of using randomizers (tossed coins or counted yarrow stalks) to solicit divine guidance. A more scientific approach to probability began with the Renaissance; Galileo's writings about dice show awareness of the concept of equal probability. Bennett spends some time demonstrating the need for careful enumeration of all the possible outcomes in estimating a probability. By the 18th century, the concept of random error had led scientists to adopt the mean of a series of measurements as the best approach to accuracy. Laplace formulated the famous bell curve to describe the likely distribution of random events, a model rapidly adopted throughout the sciences. As the science of statistics matured, random numbers were generated as a tool for analyzing the randomness of natural phenomena. Eventually these investigations, often based on "randomly" chosen data such as the heights of convicts, yielded such statistical tools as the chi-square test, which often showed that the data were not as random as originally believed. Not until the 20th century did science abandon the notion that as-yet-undiscovered laws would allow exact prediction of all natural phenomena and embrace true randomness, most strikingly in the form of quantum mechanics and chaos theory. A clear and detailed examination of the role of pure chance, with fascinating historical asides. (32 illustrations, not seen)