Today, mathematics and computer science often appear as the province of geniuses working at the very edge of human ability and imagination. Even as American high schools struggle to employ qualified math and science teachers, American popular culture has embraced math, science, and computers as a mystic realm of extraordinary intellectual power, even verging on madness. Movies like A Beautiful Mind, Good Will Hunting, and Pi all present human intelligence in the esoteric symbolism of long, indecipherable, but visually captivating equations. One has to think of such prosaic activities as paying the mortgage and grocery shopping to be reminded of the quiet and non-revelatory quality of rudimentary arithmetic. Which is not to put such labor down. Adding the price of milk and eggs in one’s head is also brain work, and we should never forget the central place of mere calculation in the development of more sophisticated areas of human knowledge.
Long before the dawn of calculators and inexpensive desktop computers, the grinding work of large problems had to be broken up into discrete, simple parts and done by hand. Where scads of numbers needed computing — for astronomical purposes at the Royal Observatory in Greenwich, England, or to establish the metric system at the Bureau du Cadastre in Paris — such work was accomplished factory-style. In his book When Computers Were Human, a history of the pre-machine era in computing, David Alan Grier quotes Charles Dickens’s Hard Times to capture the atmosphere of such workplaces: “a stern room with a deadly statistical clock in it, which measured every second with a beat like a rap upon a coffin-lid.” The most famous modern example of such work is probably Los Alamos, where scientists’ wives were recruited in the early stages to compute long math problems for the Manhattan Project.
The social history of pre-machine computing is also interesting in light of contemporary debates about gender and scientific achievement, and here Grier’s reconsideration of the past sheds useful light on the present. Former Harvard president Lawrence Summers became an academic outcast after speculating that there might be an “intrinsic” basis for the unequal numbers of men and women engaged in science and engineering at the university level. The idea that men and women are different creatures, with distinct drives and ways of thinking, is apparently so radical that even to raise it leads to the academic guillotine. And yet only a few decades ago, it was assumed by even the most civilized societies that women were not fit for serious intellectual pursuits, especially scientific ones. The occasional female endowed with truly extraordinary talent occupied the unfortunate position of the George Eliot character who tells her son: “You may try — but you can never imagine what it is to have a man’s force of genius in you, and yet to suffer the slavery of being a girl.” Note that even this extraordinary character, created by an intellectually accomplished, great female novelist, refers to genius as something particularly male.
In the history of computing, the humbler levels of scientific work were open, even welcoming, to women. Indeed, by the early twentieth century computing was thought of as women’s work and computers were assumed to be female. Respected mathematicians would blithely approximate the problem-solving horsepower of computing machines in “girl-years” and describe a unit of machine labor as equal to one “kilo-girl.” In this light, one can surely understand the desire to correct past orthodoxies about the female mind with new ones. But even as we rightly decry a past when even the most talented women were prevented from pursuing math and science in the most prestigious posts, we should remember — and honor — the crucial role of women in advancing mathematical and scientific knowledge one detailed calculation at a time.
At the beginning of the long line of women who made their marks as human computers was Nicole-Reine Lepaute. Like many women featured in Grier’s book, Lepaute enjoyed a personal connection to the intellectual world, allowing her to gain experience with scientific matters in spite of conventions that warned women away from science. She owed her education to the forbearance of understanding parents; her freedom to pursue an intellectual career to an obliging husband; and her professional position to Joseph-Jérôme de Lalande, her longtime scientific collaborator.
In a book published in 1705, using Isaac Newton’s new calculus, the English gentleman-astronomer Edmond Halley identified and predicted the return of the comet eventually named after him. But it was the French mathematician Alexis-Claude Clairaut, along with Lalande and Lepaute, who first computed the date of the comet’s perihelion with any precision. Beginning their calculations in 1757, they announced late the following year that the comet would reach perihelion in the spring of 1759. Sitting “at a common table in the Palais Luxembourg using goose-quill pens and heavy linen paper,” writes Grier, the three friends slowly computed the course of Halley’s Comet along a parabola-shaped orbit, reducing the math to an extraordinary series of baby steps.
Lalande and Lepaute focused on the orbits and gravitational pulls of Jupiter and Saturn (the three-body problem), while Clairaut focused on the comet’s orbit. “With the perspective of modern astronomy,” Grier writes, “we know that Clairaut did not account for the influences of Uranus and Neptune, two large planets that were unknown in 1757.” Still, the result of their number-crunching, though not perfect, was a tenfold improvement in accuracy over Halley’s prediction. When the comet reached its perihelion just a couple of days shy of the two-month window in which Clairaut and his colleagues said it would arrive, Clairaut’s computing method was ridiculed by one of the great intellectuals of the day, Jean d’Alembert, one of the editors of the Encyclopédie and himself an astronomer, who called the calculations more “laborious than deep.” But this has not been the verdict of history. “Beyond the simple accuracy of his result,” writes Grier, “Clairaut’s more important innovation was the division of mathematical labor, the recognition that a long computation could be split into pieces that could be done in parallel by different individuals.”
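In modern programming terms, Clairaut’s insight is what we would now call parallel decomposition. The sketch below is purely illustrative, drawn neither from Grier nor from Clairaut’s actual equations: the integrand, step size, and worker count are all invented. It shows how a long stepwise computation can be cut into independent pieces, each handled by a separate “computer,” with the partial results combined at the end.

```python
# A purely illustrative sketch of Clairaut's division of mathematical labor
# in modern terms: a long computation is cut into independent pieces, each
# handed to a separate "computer," and the partial results are combined.
# The integrand, step size, and worker count are invented for the example.
from concurrent.futures import ProcessPoolExecutor
import math

A = 0.0          # left endpoint of the integration
DX = 1e-6        # the size of each "baby step"
N = 3_000_000    # total number of steps (integrates out to x = 3)

def f(x):
    # Integrand for the arc length of the parabola y = x**2 / 2.
    return math.sqrt(1.0 + x * x)

def partial_sum(bounds):
    """One computer's share of the work: a left Riemann sum over its slice."""
    lo, hi = bounds
    return sum(f(A + i * DX) for i in range(lo, hi)) * DX

def main():
    workers = 3  # Clairaut, Lalande, and Lepaute, so to speak
    cuts = [N * k // workers for k in range(workers + 1)]
    pieces = list(zip(cuts, cuts[1:]))  # non-overlapping slices of the work
    with ProcessPoolExecutor(max_workers=workers) as pool:
        total = sum(pool.map(partial_sum, pieces))
    print(total)  # about 5.653 here; the three partial sums add up exactly

if __name__ == "__main__":
    main()
```

Each worker’s slice is independent of the others, which is exactly what made it possible for three friends at a common table, or later a roomful of human computers, to share one long calculation.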
Mme. Lepaute was central to this effort, if largely unrewarded with professional position and prestige. Lalande hired her as his assistant when he became the editor of the Connaissance des Temps, an astronomical almanac, where together they prepared tables predicting the positions of various celestial bodies: valuable but largely unappreciated work.
Nearly a century later and an ocean away, Maria Mitchell would play the next part of the willing female computer supporting the bold designs of male scientists. In the 1840s, as American manufacturing swelled to claim some 25 percent of the economy and American pride vis-à-vis Europe launched a new era of economic and political competition, a movement took hold to establish an American nautical almanac. Lacking such a publication, claimed one supporter, “our absent ships could not find their way home nor those in our ports grope to sea with any certainty of finding their way back again.” The almanac’s chief mathematician was Harvard professor Benjamin Peirce, while the computing staff consisted of several students and amateurs. Mitchell was the only woman in the group. The daughter of a banker and amateur astronomer, she was not some anonymous savant: her discovery of a new comet in 1847 brought her fame and a medal from the king of Denmark. Mitchell herself felt no need to announce her discovery, mentioning it only to her father, who quickly checked to see if the comet had been claimed by anyone else and then insisted on publicizing her accomplishment.
Mitchell proved an able computer, not out of place among the gentlemen who filled this exacting trade. She went on to become the first female professor of astronomy at Vassar College, gaining some of the recognition and opportunities that Lepaute never had. The tide was indeed slowly turning in women’s favor, though far from decisively. In the two decades following the Civil War, Grier reports, women went from holding one out of six hundred office jobs to one in fifty. The Harvard Observatory in particular found women to be especially desirable computers, since they accepted payment equal to half the going rate for men.
By the end of the nineteenth century, astronomy was no longer driving the science of computing. New scientific interests — from Darwinist anthropological investigation to modern mathematical economics to war production — would come to demand computing labor and ultimately redirect its aims. In this period, the discipline of statistics as we know it was born, reshaping the character of all kinds of social inquiry. Computing followed the growth of the social sciences: the effort to move away from broad ideas and conceptual investigations toward empirically based methodologies in pursuit of a scientific knowledge of human affairs.
Francis Galton looked to mathematics to help prove Darwin’s theory of natural selection. In one investigation, he gathered crude data on African women “endowed,” he wrote to his brother, “with that shape which European milliners so vainly attempt to imitate.” Returning to England, Galton worked for the “Committee for Conducting Statistical Inquiries Into the Measurable Characteristics of Plants and Animals,” where such efforts to support Darwinism came under the powerful influence of Karl Pearson, who introduced a breakthrough formula for correlation. Pearson was also an unusual character, a man of far-flung intellectual interests and progressive social opinions. Grier, who passes up few opportunities to enliven his history, describes Pearson’s Hampden Farm House project, where women and men worked together in an egalitarian atmosphere studying plants. On Fridays, the workers would break for what were called “biometric teas,” while calculation and number-crunching took place on weekends. One of Pearson’s larger projects collated data on some 4,000 children and parents in an attempt to demonstrate that “moral qualities” of character and intelligence were hereditary. It was a fine example of how rigorous calculation in service to misguided theories is error masquerading as a thousand facts — a problem that obviously has not gone away.
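The formula Pearson introduced, the product-moment correlation coefficient, remains the standard measure of linear association, and it is simple enough to state in a few lines of code. The sketch below is mine, with invented sample data, and is meant only to suggest the kind of arithmetic Pearson’s human computers ground through by hand:

```python
import math

def pearson_r(xs, ys):
    """Pearson's product-moment correlation coefficient:
    r = cov(x, y) / (sd(x) * sd(y)), which always falls between -1 and 1."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented data: heights of six parents and their children, in inches.
parents  = [64, 66, 67, 69, 70, 72]
children = [65, 66, 68, 68, 71, 71]
print(round(pearson_r(parents, children), 3))  # about 0.945: strongly correlated
```

A single coefficient like this, computed over thousands of parent-child pairs, was the raw material of Pearson’s hereditary claims, which is precisely why bad theory could so easily hide behind good arithmetic.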
World War I shifted the focus of computing to two kinds of questions: military problems concerning artillery trajectories and atmospheric drag, and economic problems concerning production, as the United States strove to outfit, feed, and arm the American Expeditionary Force. England’s Ministry of Munitions relied heavily on Pearson’s Biometrics Laboratory for help calculating ballistics for anti-aircraft munitions. In the United States, such work was handled at the Aberdeen Proving Ground in Maryland. The main task on both sides of the Atlantic was revising Francesco Siacci’s theory of ballistic trajectories, which worked well enough for the artillery of the nineteenth century but needed significant revision in the age of aerial warfare. Human computers struggled to calculate trajectories and end points for aerial bombs, anti-aircraft artillery, and the weaponry of aerial combat.
The Aberdeen Proving Ground was the Manhattan Project of its day. “For many years after the First World War,” said the mathematician Norbert Wiener, voicing conventional wisdom on this point, “the overwhelming majority of significant American mathematicians was to be found among those who had gone through the discipline of the Proving Ground.” But Aberdeen was not the country’s only wartime home for computers. Nearby in Washington at the Experimental Ballistics Office, Elizabeth Webb Wilson, the top mathematics student in her class at George Washington University, found employment alongside several other women, converting the raw data from Aberdeen into tables usable at the front. After the war, she looked in vain for another computing job, eventually becoming a high school mathematics teacher instead.
Wilson’s story confronts us with a paradox of social progress. In a post-feminist world, a distinguished young talent like Wilson would easily find employment working with numbers. Meanwhile, high schools go begging for anyone of Wilson’s ability — male or female — to teach mathematics. That the old system was unjust is indisputable; that the new system is better at raising up the next generation of mathematicians is a complicated question.
Economists also used computing to track domestic productivity. The punched-card tabulator, which the Census Bureau first used for the 1890 census, became an increasingly important tool for tracking retail pricing data mailed in to the Food Administration by thousands of correspondents scattered nationwide. Washington was not exactly converted overnight to such numerical representations of American economic life. As Grier describes: “The notion that the sprawling agricultural economy could be described with differential equations or probed with statistics calculations was not widely accepted in 1917-18.” The work of Henry C. Wallace (an editor and future Secretary of Agriculture) and his son Henry A. Wallace (a writer and future Vice President) during this period foreshadowed the future use of statistics to calculate everything from consumer confidence to inflation to the productivity of American manufacturing. Using carefully crunched numbers, the Wallaces tried to convince U.S. Food Administrator Herbert Hoover to guarantee a price for corn in order to shore up the related price of swine, but to no avail. Hoover feared meddling in the private sector — a stance that has of course become harder for American leaders to maintain, in part because mathematical models of the economy have grown increasingly sophisticated and thus ever more inviting to political intervention.
By the early twentieth century, the machine was catching up with the human computer, as suggested by the presence of those punched-card tabulators in Washington. While computing grew from a mere adjunct of astronomy into an essential tool of social science, the technology of computing went from a series of unreliable contraptions to more sophisticated adding machines and cash registers. Both science and business came to rely on the rapid numerical calculations that machines alone could efficiently produce.
The last hurrah for pre-machine computing was a product of the New Deal, the Mathematical Tables Project. Its task was to produce mathematical tables for use “not only by mathematicians and astronomers, but also by surveyors, engineers, chemists, physicists, biometricians, statisticians, etc.” The Work Projects Administration (W.P.A.) required this large-scale computing operation to use labor-intensive methods and limit the number of female hires to 20 percent of staff. Few of the human computers hired for the project had completed high school. As one of the early computers recounted, “arrested TB cases, epileptics, malnourished persons abounded.”
Arnold Lowan, a physicist who had fled anti-Semitic pogroms in Europe but could not find a regular teaching position in the United States, was the director of the Mathematical Tables Project. His first lieutenant was Gertrude Blanch, another Eastern European immigrant who could not find academic employment, despite a doctorate in mathematics. Blanch proved to be a true leader. While the regulations of the W.P.A. seemed well-designed for a make-work project of endless mediocrity, she and Lowan worked overtime to check calculations and ensure high-quality products free of errors. Blanch even organized a lunch-hour math curriculum for willing workers that took them from elementary arithmetic through high school algebra, trigonometry, all the way to college calculus and, finally, matrix calculations, the theory of differences, and special functions. It was the most successful mathematical tables project in history.
The arrival of World War II sounded the death knell for work-relief projects, but the Mathematical Tables Project was certified as an urgent wartime program, granting it a reprieve and a degree of respect Lowan had otherwise sought in vain. Grier notes an interesting moment of contact between Lowan and John Brainerd at the University of Pennsylvania, where a team was struggling to build what would become ENIAC, an electronic calculating machine being developed to compute ballistics for the Aberdeen Proving Ground. Brainerd was looking for highly skilled human computers, but Lowan’s group was not what he had in mind. Lowan used machines to facilitate the work of human computers; Brainerd wanted human computers to aid the work of his machine. Brainerd then met his own Nicole-Reine Lepaute figure in Adele Goldstine, who had done graduate work in mathematics and was the wife of a ballistics officer. Goldstine set up a classroom program to educate the project’s own team of computers and promptly hung a “women only” sign on the door of their lab.
At the time, there were almost no researchers whose primary interest was computing, still seen as a mere handmaiden to other, more substantial scientific interests. But this was changing fast as machines began to outperform human computers. Up until World War II, human computers had the advantage. As Grier writes: “A punched-card tabulator could work much faster than a human being, but this advantage was lost if the operator had to spend days preparing the machine.” Richard Feynman, then a junior staff member at Los Alamos, arranged a showdown between man and machine, pitting a group of human computers against the Los Alamos IBM facility with both performing a calculation for the plutonium bomb. For two days, the human computers were able to keep up with the machines. “But on the third day,” recalled one observer, “the punched-card machine operation began to move decisively ahead, as the people performing the hand computing could not sustain their initial fast pace, while the machines did not tire.” Shortly after the war, the machines took over; their human accompanists were now “operators” and “programmers.”
When Computers Were Human tells an important story. Interesting for its insights into science and computing, Grier’s book is also an impressive work of economic and social history. Once machines reduced calculation to binary logic, the simplest parts of long problems became both too voluminous and too simple for human hands. Yet in slightly more complicated form, the simple number-crunching of long problems made ideal work for the attentive and moderately educated, and it was sometimes the only work available to well-educated women. That scientists often had the benefit of highly talented and under-rewarded female minds who could not stake a claim to better-paid academic positions was an important boost to many serious intellectual enterprises. That women of the capacities of Elizabeth Webb Wilson or Gertrude Blanch are now much freer to pursue their own interests is an even greater boost to the sciences, though not without its costs.