Our point of view has changed—and is still changing. To Descartes’ “Give me space and motion and I will give you a world,” Einstein today might retort that altogether too much is being asked, and that the demand is in fact meaningless: without a “world”—matter—there is neither “space” nor “motion.” And to quell the turbulent, muddled mysticism of Leibniz in the seventeenth century over the mysterious √−1: “The Divine Spirit found a sublime outlet in that wonder of analysis, the portent of the ideal, that mean between being and not-being, which we call the imaginary [square] root of negative unity,” Hamilton in the 1840’s constructed a number-couple which any intelligent child can understand and manipulate, and which does for mathematics and science all that the misnamed “imaginary” ever did. The mystical “not-being” of the seventeenth-century Leibniz is seen to have a “being” as simple as ABC.
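The gist of Hamilton’s construction can be stated in one line (a modern restatement, not Bell’s own wording): a number-couple is an ordered pair of ordinary real numbers, added and multiplied by the rules

\[
(a,b) + (c,d) = (a+c,\; b+d), \qquad (a,b)\,(c,d) = (ac - bd,\; ad + bc),
\]

from which \((0,1)(0,1) = (-1,0)\); the couple \((0,1)\) thus does everything the “imaginary” square root of negative unity was ever asked to do, with no appeal to being or not-being.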
Is this a loss? Or does a modern mathematician lose anything of value when he seeks through the postulational method to track down that elusive “feeling” described by Heinrich Hertz, the discoverer of wireless waves: “One cannot escape the feeling that these mathematical formulas have an independent existence and an intelligence of their own, that they are wiser than we are, wiser even than their discoverers, that we get more out of them than was originally put into them”?
Any competent mathematician will understand Hertz’ feeling, but he will also incline to the belief that whereas continents and wireless waves are discovered, dynamos and mathematics are invented and do what we make them do. We can still dream but we need not deliberately court nightmares. If it is true, as Charles Darwin asserted, that “Mathematics seems to endow one with something like a new sense,” that sense is the sublimated common sense which the physicist and engineer Lord Kelvin declared mathematics to be.
Is it not closer to our own habits of thought to agree temporarily with Galileo that “Nature’s great book is written in mathematical symbols” and let it go at that, than to assert with Plato that “God ever geometrizes,” or with Jacobi that “God ever arithmetizes”? If we care to inspect the symbols in nature’s great book through the critical eyes of modern science we soon perceive that we ourselves did the writing, and that we used the particular script we did because we invented it to fit our own understanding. Some day we may find a more expressive shorthand than mathematics for correlating our experiences of the physical universe—unless we accept the creed of the scientific mystics that everything is mathematics and is not merely described for our convenience in mathematical language. If “Number rules the universe” as Pythagoras asserted, Number is merely our delegate to the throne, for we rule Number.
When a modern mathematician turns aside for a moment from his symbols to communicate to others the feeling that mathematics inspires in him, he does not echo Pythagoras and Jeans, but he may quote what Bertrand Russell said about a quarter of a century ago: “Mathematics, rightly viewed, possesses not only truth but supreme beauty—a beauty cold and austere, like that of sculpture, without appeal to any part of our weaker nature, without the gorgeous trappings of painting or music, yet sublimely pure, and capable of a stern perfection such as only the greatest art can show.”
Another, familiar with what has happened to our conception of mathematical “truth” in the years since Russell praised the beauty of mathematics, might refer to the “iron endurance” which some acquire from their attempt to understand what mathematics means, and quote James Thomson’s lines (which close this book) in description of Dürer’s Melencolia (the frontispiece). And if some devotee is reproached for spending his life on what to many may seem the selfish pursuit of a beauty having no immediate reflection in the lives of his fellowmen, he may repeat Poincaré’s “Mathematics for mathematics’ sake. People have been shocked by this formula and yet it is as good as life for life’s sake, if life is but misery.”
* * *
To form an estimate of what modern mathematics compared to ancient has accomplished, we may first look at the mere bulk of the work in the period after 1800 compared to that before 1800. The most extensive history of mathematics is that of Moritz Cantor, Geschichte der Mathematik, in three large closely printed volumes (a fourth, by collaborators, supplements the three). The four volumes total about 3600 pages. Only the outline of the development is given by Cantor; there is no attempt to go into details concerning the contributions described, nor are technical terms explained so that an outsider could understand what the whole story is about, and biography is cut to the bone; the history is addressed to those who have some technical training. This history ends with the year 1799—just before modern mathematics began to feel its freedom. What if the outline history of mathematics in the nineteenth century alone were attempted on a similar scale? It has been estimated that nineteen or twenty volumes the size of Cantor’s would be required to tell the story, say about 17,000 pages. The nineteenth century, on this scale, contributed to mathematical knowledge about five times as much as was done in the whole of preceding history.
The beginningless period before 1800 breaks quite sharply into two. The break occurs about the year 1700, and is due mainly to Isaac Newton (1642-1727). Newton’s greatest rival in mathematics was Leibniz (1646-1716). According to Leibniz, of all mathematics up to the time of Newton, the more important half is due to Newton. This estimate refers to the power of Newton’s general methods rather than to the bulk of his work; the Principia is still rated as the most massive addition to scientific thought ever made by one man.
Continuing back into time beyond 1700 we find nothing comparable till we reach the Golden Age of Greece—a step of nearly 2000 years. Farther back than 600 B.C. we quickly pass into the shadows, coming out into the light again for a moment in ancient Egypt. Finally we arrive at the first great age of mathematics, about 2000 B.C., in the Euphrates Valley.
The descendants of the Sumerians in Babylon appear to have been the first “moderns” in mathematics; certainly their attack on algebraic equations is more in the spirit of the algebra we know than anything done by the Greeks in their Golden Age. More important than the technical algebra of these ancient Babylonians is their recognition—as shown by their work—of the necessity for proof in mathematics. Until recently it had been supposed that the Greeks were the first to recognize that proof is demanded for mathematical propositions. This was one of the most important steps ever taken by human beings. Unfortunately it was taken so long ago that it led nowhere in particular so far as our own civilization is concerned—unless the early Greeks consciously followed the Babylonian lead, which they may well have done. They were not particularly generous to their predecessors.
Mathematics then has had four great ages: the Babylonian, the Greek, the Newtonian (to give the period around 1700 a name), and the recent, beginning about 1800 and continuing to the present day. Competent judges have called the last the Golden Age of Mathematics.
Today mathematical invention (discovery, if you prefer) is going forward more vigorously than ever. The only thing, apparently, that can stop its progress is a general collapse of what we have been pleased to call civilization. If that comes, mathematics may go underground for centuries, as it did after the decline of Babylon; but if history repeats itself, as it is said to do, we may count on the spring bursting forth again, fresher and clearer than ever, long after we and all our stupidities shall have been forgotten.
CHAPTER TWO
Modern Minds in Ancient Bodies
ZENO, EUDOXUS, ARCHIMEDES
. . . the glory that was Greece
And the grandeur that was Rome.
—E. A. POE
To APPRECIATE our own Golden Age of mathematics we shall do well to have in mind a few of the great, simple guiding ideas of those whose genius prepared the way for us long ago, and we shall glance at the lives and works of three Greeks: Zeno (495–435 B.C.), Eudoxus (408–355 B.C.), and Archimedes (287–212 B.C.). Euclid will be noticed much later, where his best work comes into its own.
Zeno and Eudoxus are representative of two vigorous opposing schools of mathematical thought which flourish today, the critical-destructive and the critical-constructive. Both had minds as penetratingly critical as their successors in the nineteenth and twentieth centuries. This statement can of course be inverted: Kronecker (1823-1891) and Brouwer (1881-    ), the modern critics of mathematical analysis—the theories of the infinite and the continuous—are as ancient as Zeno; the creators of the modern theories of continuity and the infinite, Weierstrass (1815-1897), Dedekind (1831-1916), and Cantor (1845-1918), are intellectual contemporaries of Eudoxus.
Archimedes, the greatest intellect of antiquity, is modern to the core. He and Newton would have understood one another perfectly, and it is just possible that Archimedes, could he come to life long enough to take a post-graduate course in mathematics and physics, would understand Einstein, Bohr, Heisenberg, and Dirac better than they understand themselves. Of all the ancients Archimedes is the only one who habitually thought with the unfettered freedom that the greater mathematicians permit themselves today with all the hard-won gains of twenty-five centuries to smooth their way, for he alone of all the Greeks had sufficient stature and strength to stride clear over the obstacles thrown in the path of mathematical progress by frightened geometers who had listened to the philosophers.
Any list of the three “greatest” mathematicians of all history would include the name of Archimedes. The other two usually associated with him are Newton (1642-1727) and Gauss (1777-1855). Some, considering the relative wealth—or poverty—of mathematics and physical science in the respective ages in which these giants lived, and estimating their achievements against the background of their times, would put Archimedes first. Had the Greek mathematicians and scientists followed Archimedes rather than Euclid, Plato, and Aristotle, they might easily have anticipated the age of modern mathematics, which began with Descartes (1596-1650) and Newton in the seventeenth century, and the age of modern physical science inaugurated by Galileo (1564-1642) in the same century, by two thousand years.
* * *
Behind all three of these precursors of the modern age looms the half-mythical figure of Pythagoras (569?-500? B.C.), mystic, mathematician, investigator of nature to the best of his self-hobbled ability, “one tenth of him genius, nine-tenths sheer fudge.” His life has become a fable, rich with the incredible accretions of his prodigies; but only this much is of importance for the development of mathematics as distinguished from the bizarre number-mysticism in which he clothed his cosmic speculations: he travelled extensively in Egypt, learned much from the priests and believed more; visited Babylon and repeated his Egyptian experiences; founded a secret Brotherhood for high mathematical thinking and nonsensical physical, mental, moral, and ethical speculation at Croton in southern Italy; and, out of all this, made two of the greatest contributions to mathematics in its entire history. He died, according to one legend, in the flames of his own school fired by political and religious bigots who stirred up the masses to protest against the enlightenment which Pythagoras sought to bring them. Sic transit gloria mundi.
Before Pythagoras it had not been clearly realized that proof must proceed from assumptions. Pythagoras, according to persistent tradition, was the first European to insist that the axioms, the postulates, be set down first in developing geometry and that the entire development thereafter shall proceed by applications of close deductive reasoning to the axioms. Following current practice we shall use “postulate,” instead of “axiom” hereafter, as “axiom” has a pernicious historical association of “self-evident, necessary truth” which “postulate” does not have; a postulate is an arbitrary assumption laid down by the mathematician himself and not by God Almighty.
Pythagoras then imported proof into mathematics. This is his greatest achievement. Before him geometry had been largely a collection of rules of thumb empirically arrived at without any clear indication of the mutual connections of the rules, and without the slightest suspicion that all were deducible from a comparatively small number of postulates. Proof is now so commonly taken for granted as the very spirit of mathematics that we find it difficult to imagine the primitive thing which must have preceded mathematical reasoning.
Pythagoras’ second outstanding mathematical contribution brings us abreast of living problems. This was the discovery, which humiliated and devastated him, that the common whole numbers 1,2,3, . . . are insufficient for the construction of mathematics even in the rudimentary form in which he knew it. Before this capital discovery he had preached like an inspired prophet that all nature, the entire universe in fact, physical, metaphysical, mental, moral, mathematical—everything—is built on the discrete pattern of the integers 1,2,3, . . . and is interpretable in terms of these God-given bricks alone; God, he declared indeed, is “number,” and by that he meant common whole number. A sublime conception, no doubt, and beautifully simple, but as unworkable as its echo in Plato—“God ever geometrizes,” or in Jacobi—“God ever arithmetizes,” or in Jeans—“The Great Architect of the Universe now begins to appear as a mathematician.” One obstinate mathematical discrepancy demolished Pythagoras’ discrete philosophy, mathematics, and metaphysics. But, unlike some of his successors, he finally accepted defeat—after struggling unsuccessfully to suppress the discovery which abolished his creed.
This was what knocked his theory flat: it is impossible to find two whole numbers such that the square of one of them is equal to twice the square of the other. This can be proved by a simple argument within the reach of anyone who has had a few weeks of algebra, or even by anyone who thoroughly understands elementary arithmetic. Actually Pythagoras found his stumbling-block in geometry: the ratio of the side of a square to one of its diagonals cannot be expressed as the ratio of any two whole numbers. This is equivalent to the statement above about squares of whole numbers. In another form we would say that the square root of 2 is irrational, that is, is not equal to any whole number or decimal fraction, or sum of the two, got by dividing one whole number by another. Thus even so simple a geometrical concept as that of the diagonal of a square defies the integers 1,2,3, . . . and negates the earlier Pythagorean philosophy. We can easily construct the diagonal geometrically, but we cannot measure it in any finite number of steps. This impossibility sharply and clearly brought irrational numbers and the infinite (non-terminating) processes which they seem to imply to the attention of mathematicians. Thus the square root of 2 can be calculated to any required finite number of decimal places by the process taught in school or by more powerful methods, but the decimal never “repeats” (as that for 1/7 does, for instance), nor does it ever terminate. In this discovery Pythagoras found the taproot of modern mathematical analysis.
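The simple argument referred to runs, in the standard reconstruction (a sketch, not Bell’s own wording), as follows. Suppose whole numbers p and q did exist with the square of p equal to twice the square of q, and suppose the fraction p/q already reduced to lowest terms. Then

\[
p^{2} = 2q^{2} \;\Longrightarrow\; p \text{ is even, say } p = 2r \;\Longrightarrow\; 4r^{2} = 2q^{2} \;\Longrightarrow\; q^{2} = 2r^{2} \;\Longrightarrow\; q \text{ is even,}
\]

so that p and q share the factor 2, contradicting the assumption that p/q was in lowest terms. Hence no such whole numbers exist, and the square root of 2 is irrational.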
Issues were raised by this simple problem which are not yet disposed of in a manner satisfactory to all mathematicians. These concern the mathematical concepts of the infinite (the unending, the uncountable), limits, and continuity, concepts which are at the root of modern analysis. Time after time the paradoxes and sophisms which crept into mathematics with these apparently indispensable concepts have been regarded as finally eliminated, only to reappear a generation or two later, changed but yet the same. We shall come across them, livelier than ever, in the mathematics of our time. The following is an extremely simple, intuitively obvious picture of the situation.
Consider a straight line two inches long, and imagine it to have been traced by the “continuous” “motion” of a “point.” The words in quotes are those which conceal the difficulties. Without analysing them we easily persuade ourselves that we picture what they signify. Now label the left-hand end of the line 0 and the right-hand end 2. Half-way between 0 and 2 we naturally put 1; half-way between 0 and 1 we put ½; half-way between 0 and ½ we put ¼, and so on. Similarly, between 1 and 2 we mark the place 1½, between 1½ and 2 the place 1¾, and so on. Having done this we may proceed in the same way to mark ⅓, ⅔, 1⅓, 1⅔, and then split each of the resulting segments into smaller equal segments. Finally, “in imagination,” we can conceive of this process having been carried out for all the common fractions and common mixed numbers which are greater than 0 and less than 2; the conceptual division-points give us all the rational numbers between 0 and 2. There are an infinity of them. Do they completely “cover” the line? No. To what point does the square root of 2 correspond? No point, because this square root is not obtainable by dividing any whole number by another. But the square root of 2 is obviously a “number” of some sort; its representative point lies somewhere between 1.41 and 1.42, and we can cage it down as closely as we please. To cover the line completely we are forced to imagine or to invent infinitely more “numbers” than the rationals. That is, if we accept the line as being continuous, and postulate that to each point of it corresponds one, and only one, “real number.” The same kind of imagining can be carried on to the entire plane, and farther, but this is sufficient for the moment.
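The “caging” can be made concrete in a few lines of code. The following sketch (written in Python and of course not part of the original text; the names lo, hi, and mid are merely illustrative) traps the point for the square root of 2 between ever-closer rational division-points by repeated halving, using exact fractions so that no rounding blurs the issue:

    from fractions import Fraction

    # Cage the square root of 2 between two rationals by repeated halving.
    # Start from 1 < sqrt(2) < 2, since 1*1 < 2 < 2*2.
    lo, hi = Fraction(1), Fraction(2)
    for _ in range(20):               # twenty halvings shrink the cage to 1/2**20
        mid = (lo + hi) / 2
        if mid * mid < 2:             # mid still lies to the left of sqrt(2)
            lo = mid
        else:                         # mid lies to the right; exact equality never
            hi = mid                  # occurs, since no rational squared is 2
    print(float(lo), float(hi))       # roughly 1.414213 and 1.414214

Twenty more halvings would pin the point down about a million times more closely, yet no number of halvings ever lands exactly on it; that is the whole force of Pythagoras’ discovery.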
Simple problems such as these soon lead to very serious difficulties. With regard to these difficulties the Greeks were divided, just as we are, into two irreconcilable factions; one stopped dead in its mathematical tracks and refused to go on to analysis—the integral calculus, at which we shall glance when we come to it; the other attempted to overcome the difficulties and succeeded in convincing itself that it had done so. Those who stopped committed but few mistakes and were comparatively sterile of truth no less than of error; those who went on discovered much of the highest interest to mathematics and rational thought in general, some of which may be open to destructive criticism, however, precisely as has happened in our own generation. From the earliest times we meet these two distinct and antagonistic types of mind: the justifiably cautious who hang back because the ground quakes under their feet, and the bolder pioneers who leap the chasm to find treasure and comparative safety on the other side. We shall look first at one of those who refused to leap. For penetrating subtlety of thought we shall not meet his equal till we reach the twentieth century and encounter Brouwer.