Wednesday, March 20, 2019

British Math vs European Math



Since the seventeenth century, Britain had stood, mathematically, with its back toward Europe, scarcely deigning to glance over its shoulder at it. Back then, Isaac Newton and the German mathematician Gottfried Wilhelm von Leibniz had each, more or less independently, discovered calculus. Controversy over who deserved the credit erupted even while both men lived, then mushroomed after their deaths, with mathematicians in England and on the Continent each championing their compatriots. Newton was the premier genius of his age, the most fertile mind, with the possible exception of Shakespeare’s, ever to issue from English soil. And yet he would later be called “the greatest disaster that ever befell not merely Cambridge mathematics in particular but British mathematical science as a whole.” For to defend his intellectual honor, as it were, generations of English mathematicians boycotted Europe—steadfastly clung to Newton’s awkward notational system, ignored mathematical trails blazed abroad, professed disregard for the Continent’s achievements. “The Great Sulk,” one chronicler of these events would call it.

In calculus as in mathematics generally, the effects were felt all through the eighteenth and nineteenth centuries and on into the twentieth. Continental mathematics laid stress on what mathematicians call “rigor,” the kind to which Hardy had first been exposed through Jordan’s Cours d’analyse and which insisted on refining mathematical concepts intuitively “obvious” but often littered with hidden intellectual pitfalls. Perhaps reinforced by a strain in their national character that sniffed at Germanic theorizing and hairsplitting, the English had largely spurned this new rigor. Looking back on his Cambridge preparation, Bertrand Russell, who ranked as Seventh Wrangler in the Tripos of 1893, noted that “those who taught me the infinitesimal Calculus did not know the valid proofs of its fundamental theorems and tried to persuade me to accept the official sophistries as an act of faith. I realized that the Calculus works in practice but I was at a loss to understand why it should do so.” So, it is safe to say, were most other Cambridge undergraduates.

Calculus rests on a strategy of dividing quantities into smaller and smaller pieces that are said to “approach,” yet never quite reach, zero. Taking a “limit,” the process is called, and it’s fundamental to an understanding of calculus—but also, typically, alien and slippery territory to students raised on the firm ground of algebra and geometry. And yet, it is possible to blithely sail on past these intellectual perils, concentrate on the many practical applications that fairly erupt out of calculus, and never look back.
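To make the idea concrete, here is the limit in its most familiar role; the equation below is a standard textbook formulation, offered purely as illustration rather than quoted from the passage. The derivative of a function is defined as the value a ratio of differences approaches as the increment h shrinks toward zero without ever being set equal to it:

    % The derivative as a limit: h is made smaller and smaller, "approaching"
    % zero, yet the quotient is never evaluated at h = 0 itself.
    \[
      f'(x) \;=\; \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
    \]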

In textbooks even today you can see vestiges of the split—which neatly parallels that between Britain and the Continent in the nineteenth century: the author briefly introduces the limit, assumes a hazy intuitive understanding, then spends six chapters charging ahead with standard differentiation techniques, maxima-minima problems, and all the other mainstays of Calc 101 . . . until finally, come chapter 7 or so, he steps back and reintroduces the elusive concept, this time covering mine-strewn terrain previously sidestepped, tackling conceptual difficulties—and stretching the student’s mind beyond anything he’s used to.

Well, the first six chapters of this generic calculus text, it could be said, were English mathematics without the Continental influence. Chapter 7 was the new rigor supplied by French, German, and Swiss mathematicians. “Analysis” was the generic name for this precise, fine-grained approach. It was a world of Greek letters, of epsilons and deltas representing arbitrarily small quantities that the mathematicians nonetheless found a way to work with precisely. It was a world in which mathematics, logic, and Talmudic hairsplitting merged.
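For readers who want to see what those Greek letters actually do, the following is the standard Weierstrassian definition of a limit, supplied here as an illustration rather than taken from the text. Note that it speaks only of ordinary, finite tolerances; nothing in it is “infinitely small”:

    % Epsilon-delta definition of a limit: for every output tolerance epsilon,
    % some input tolerance delta on x suffices.
    \[
      \lim_{x \to a} f(x) = L
      \quad\Longleftrightarrow\quad
      \forall \varepsilon > 0 \;\; \exists \delta > 0 :\;\;
      0 < |x - a| < \delta \;\implies\; |f(x) - L| < \varepsilon
    \]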

First Gauss, Abel, and Cauchy had risen above the looser, intuitive nostrums of the past; later in the century, Weierstrass and Dedekind went further yet. None of them were English. And the English professed not to care. Why, before the turn of the century, Cauchy—the Cauchy, Augustin Louis Cauchy, the Cauchy who had launched the French school of analysis, the Cauchy of the Cauchy integral formula—was commonly referred to around Cambridge as “Corky.”

Since Newton’s time, British mathematics had diverged onto a decidedly applied road. Mathematical physics had become the British specialty, dominated by such names as Kelvin, Maxwell, Rayleigh, and J. J. Thomson. Pure math, though, had stultified, with the whole nineteenth century leaving England with few figures of note. “Rigor in argument,” J. E. Littlewood would recall, “was generally regarded—there were rare exceptions—with what it is no exaggeration to call contempt; niggling over trifles instead of getting on with the real job.” Newton had said it all; why resurrect these arcane fine points? Calculus, and the whole architecture of mathematical physics that emanated from it, worked.

And so, England slept in the dead calm of its Tripos system, where Newton was enshrined as God, his Principia Mathematica the Bible. “In my own Tripos in 1881, we were expected to know any lemma [a theorem needed to prove another theorem] in that great work by its number alone,” wrote one prominent mathematician later, “as if it were one of the commandments or the 100th Psalm. . . . Cambridge became a school that was self-satisfied, self-supporting, self-content, almost marooned in its limitations.” Replied a distinguished European mathematician when asked whether he had seen recent work by an Englishman: “Oh, we never read anything the English mathematicians do.”

The first winds of change came in the person of Andrew Russell Forsyth, whose Theory of Functions had begun, in 1893, to introduce some of the new thinking—though by this time it wasn’t so new anymore—from Paris, Göttingen, and Berlin. Written in a magisterial style, it burst on Cambridge, as E. H. Neville once wrote, “with the splendour of a revelation”; some would argue it had as great an influence on British mathematics as any work since Newton’s Principia. By the standards of the Continent, however, it was hopelessly sloppy and was soundly condemned there. “Forsyth was not very good at delta and epsilon,” Littlewood once said of him, referring to the Greek letters normally used for dealing with arbitrarily small quantities. Still, it helped redirect the gaze of English mathematicians toward the Continent. It charted a course to the future, but did not actually follow it.

That was left to Hardy.