
Friday, October 4, 2019

1543



Two books published in 1543 marked a turning point, the beginning of the scientific revolution. In that year, the Flemish doctor Andreas Vesalius reported the results of his dissections of human cadavers, a practice that had been forbidden in earlier centuries. His findings contradicted fourteen centuries of received wisdom about human anatomy. In that same year, the Polish astronomer Nicolaus Copernicus finally allowed publication of his radical theory that the Earth moved around the sun. He’d waited until he was near death (and died just as the book was being published) because he’d feared that the Catholic Church would be infuriated by his demotion of the world from the center of God’s creation. He was right to be scared. After Giordano Bruno proposed, among other heresies, that the universe was infinitely large with infinitely many worlds, he was tried by the Inquisition and burned at the stake in Rome in 1600.

Archimedes

https://www.sapaviva.com/archimedes-of-syracuse/


For one thing, there are a lot of funny stories about him. Several portray him as the original math geek. For example, the historian Plutarch tells us that Archimedes could become so engrossed in geometry that it “made him forget his food and neglect his person.” (That certainly rings true. For many of us mathematicians, meals and personal hygiene aren’t top priorities.) Plutarch goes on to say that when Archimedes was lost in his mathematics, he would have to be “carried by absolute violence to bathe.” It’s interesting that he was such a reluctant bather, given that a bath is the setting for the one story about him that everybody knows. According to the Roman architect Vitruvius, Archimedes became so excited by a sudden insight he had in the bath that he leaped out of the tub and ran down the street naked shouting, “Eureka!” (“I have found it!”)
Other stories cast him as a military magician, a warrior-scientist / one-man death squad. According to these legends, when his home city of Syracuse was under siege by the Romans in 212 BCE, Archimedes — by then an old man, around seventy — helped defend the city by using his knowledge of pulleys and levers to make fantastical weapons, “war engines” such as grappling hooks and giant cranes that could lift the Roman ships out of the sea and shake the sailors from them like sand being shaken out of a shoe. As Plutarch described the terrifying scene, “A ship was frequently lifted up to a great height in the air (a dreadful thing to behold), and was rolled to and fro, and kept swinging, until the mariners were all thrown out, when at length it was dashed against the rocks, or let fall.”

In a more serious vein, all students of science and engineering remember Archimedes for his principle of buoyancy (a body immersed in a fluid is buoyed up by a force equal to the weight of the fluid displaced) and his law of the lever (heavy objects placed on opposite sides of a lever will balance if and only if their weights are in inverse proportion to their distances from the fulcrum). Both of these ideas have countless practical applications. Archimedes’s principle of buoyancy explains why some objects float and others do not. It also underlies all of naval architecture, the theory of ship stability, and the design of oil-drilling platforms at sea. And you rely on his law of the lever, even if unknowingly, every time you use a nail clipper or a crowbar.
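
Both laws are one-liners in symbols. Here is a minimal sketch in Python (the function names, tolerance, and sample numbers are my own, for illustration): the buoyant force is ρgV for the displaced volume, and two weights balance exactly when w₁d₁ = w₂d₂.

```python
RHO_WATER = 1000.0  # density of water, kg/m^3
G = 9.81            # gravitational acceleration, m/s^2

def buoyant_force(volume_displaced_m3, fluid_density=RHO_WATER):
    """Archimedes's principle: the upward force on an immersed body
    equals the weight of the fluid it displaces (rho * g * V)."""
    return fluid_density * G * volume_displaced_m3

def lever_balances(w1, d1, w2, d2, tol=1e-9):
    """Archimedes's law of the lever: weights balance iff w1*d1 == w2*d2,
    i.e. the weights are in inverse proportion to their distances."""
    return abs(w1 * d1 - w2 * d2) < tol

# A body displacing half a cubic meter of water feels ~4905 N of upthrust,
# and a 60 kg child 2 m from the fulcrum balances 120 kg seated 1 m away.
print(buoyant_force(0.5))             # 4905.0
print(lever_balances(60, 2, 120, 1))  # True
```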

Archimedes might have been a formidable maker of war machines, and he undoubtedly was a brilliant scientist and engineer, but what really puts him in the pantheon is what he did for mathematics. He paved the way for integral calculus. Its deepest ideas are plainly visible in his work, but then they aren’t seen again for almost two millennia. To say he was ahead of his time would be putting it mildly. Has anyone ever been more ahead of his time?

Two strategies appear again and again in his work. The first was his ardent use of the Infinity Principle. To probe the mysteries of circles, spheres, and other curved shapes, he always approximated them with rectilinear shapes made of lots of straight, flat pieces, faceted like jewels. By imagining more and more pieces and making them smaller and smaller, he pushed his approximations ever closer to the truth, approaching exactitude in the limit of infinitely many pieces. This strategy demanded that he be a wizard with sums and puzzles, since he ended up having to add many numbers or pieces back together to arrive at his conclusions.
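
To watch the Infinity Principle at work, here is a small Python sketch (mine, not the book's) of the move Archimedes made famous: approximate a circle with inscribed regular polygons, and let the number of straight pieces grow.

```python
import math

def inscribed_polygon_area(n_sides, radius=1.0):
    """Area of a regular n-gon inscribed in a circle: n triangular
    slices, each of area (1/2) * r^2 * sin(2*pi/n)."""
    return 0.5 * n_sides * radius**2 * math.sin(2 * math.pi / n_sides)

# More and smaller pieces push the approximation toward the true
# area pi * r^2; Archimedes himself stopped at 96 sides.
for n in (6, 24, 96, 10_000):
    print(n, inscribed_polygon_area(n))
```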

His other distinguishing stratagem was blending mathematics with physics, the ideal with the real. Specifically, he mingled geometry, the study of shapes, with mechanics, the study of motion and force. Sometimes he used geometry to illuminate mechanics; sometimes the flow went in the other direction, with mechanical arguments providing insight into pure form. It was by using both strategies with consummate skill that Archimedes was able to penetrate so deeply into the mystery of curves.

**
Mathematicians don’t come up with the proofs first. First comes intuition; rigor comes later. This role of intuition and imagination is often left out of high-school geometry courses, yet it is essential to all creative mathematics.


Archimedes concludes with the hope that “there will be some among the present as well as future generations who by means of the method here explained will be enabled to find other theorems which have not yet fallen to our share.” That almost brings a tear to my eye. This unsurpassed genius, feeling the finiteness of his life against the infinitude of mathematics, recognizes that there is so much left to be done, that there are “other theorems which have not yet fallen to our share.” We all feel that, all of us mathematicians. Our subject is endless. It humbled even Archimedes himself.

Limit (in calculus)




A limit is like an unattainable goal. You can get closer and closer to it, but you can never get all the way there.

The unattainability of the limit usually doesn’t matter. We can often solve the problems we’re working on by fantasizing that we can actually reach the limit and then seeing what that fantasy implies. In fact, many of the greatest pioneers of the subject did precisely that and made great discoveries as a result. Logical, no. Imaginative, yes. Successful, very.

A limit is a subtle concept but a central one in calculus. It’s elusive because it’s not a common idea in daily life. Perhaps the closest analogy is the Riddle of the Wall. If you walk halfway to the wall, and then you walk half the remaining distance, and then you walk half of that, and on and on, will there ever be a step when you finally get to the wall?

The answer is clearly no, because the Riddle of the Wall stipulates that at each step, you walk halfway to the wall, not all the way. After you take ten steps or a million or any other number of steps, there will always be a gap between you and the wall. But equally clearly, you can get arbitrarily close to the wall. What this means is that by taking enough steps, you can get to within a centimeter of it, or a millimeter, or a nanometer, or any other tiny but nonzero distance, but you can never get all the way there. Here, the wall plays the role of the limit. It took about two thousand years for the limit concept to be rigorously defined. Until then, the pioneers of calculus got by just fine with intuition. So don’t worry if limits feel hazy for now. We’ll get to know them better by watching them in action. From a modern perspective, they matter because they are the bedrock on which all of calculus is built.
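
Here is that argument as a few lines of Python (the function name and distances are my own): each step halves the gap, so after n steps the gap is the starting distance divided by 2ⁿ, never zero but eventually smaller than any tolerance you name.

```python
def steps_to_get_within(tolerance, start_distance=1.0):
    """Riddle of the Wall: each step halves the remaining gap, so the
    gap is never zero, yet it drops below any positive tolerance."""
    gap, steps = start_distance, 0
    while gap >= tolerance:
        gap /= 2
        steps += 1
    return steps

# Starting 1 meter from the wall:
print(steps_to_get_within(0.01))  # within a centimeter after 7 steps
print(steps_to_get_within(1e-9))  # within a nanometer after 30 steps
```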

If the metaphor of the wall seems too bleak and inhuman (who wants to approach a wall?), try this analogy: Anything that approaches a limit is like a hero engaged in an endless quest. It’s not an exercise in total futility, like the hopeless task faced by Sisyphus, who was condemned to roll a boulder up a hill only to see it roll back down again over and over for eternity. Rather, when a mathematical process advances toward a limit (like the scalloped shapes homing in on the limiting rectangle), it’s as if a protagonist is striving for something he knows is impossible but for which he still holds out the hope of success, encouraged by the steady progress he’s making while trying to reach an unreachable star.

Calculus is more than a language

Photo by Roman Mager on Unsplash


Calculus, like other forms of mathematics, is much more than a language; it’s also an incredibly powerful system of reasoning. It lets us transform one equation into another by performing various symbolic operations on it, operations subject to certain rules. Those rules are deeply rooted in logic, so even though it might seem like we’re just shuffling symbols around, we’re actually constructing long chains of logical inference. The symbol shuffling is useful shorthand, a convenient way to build arguments too intricate to hold in our heads.

If we’re lucky and skillful enough — if we transform the equations in just the right way — we can get them to reveal their hidden implications. To a mathematician, the process feels almost palpable. It’s as if we’re manipulating the equations, massaging them, trying to relax them enough so that they’ll spill their secrets. We want them to open up and talk to us.
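
As a small taste of that rule-governed symbol shuffling, here is a sketch using SymPy, an open-source computer algebra system (my choice of tool; the text doesn't name one):

```python
import sympy as sp

x = sp.symbols('x')

# Rule-governed shuffling: simplification is a chain of logical
# inferences that makes an expression spill its secret.
print(sp.simplify(sp.sin(x)**2 + sp.cos(x)**2))  # 1

# Differentiating and integrating back are inverse transformations,
# each step licensed by the rules of the calculus.
derivative = sp.diff(x**3, x)        # 3*x**2
print(sp.integrate(derivative, x))   # x**3
```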

**


In a nutshell, calculus wants to make hard problems simpler. It is utterly obsessed with simplicity. That might come as a surprise to you, given that calculus has a reputation for being complicated. And there’s no denying that some of its leading textbooks exceed a thousand pages and weigh as much as bricks. But let’s not be judgmental. Calculus can’t help how it looks. Its bulkiness is unavoidable. It looks complicated because it’s trying to tackle complicated problems. In fact, it has tackled and solved some of the most difficult and important problems our species has ever faced.

Calculus succeeds by breaking complicated problems down into simpler parts. That strategy, of course, is not unique to calculus. All good problem-solvers know that hard problems become easier when they’re split into chunks. The truly radical and distinctive move of calculus is that it takes this divide-and-conquer strategy to its utmost extreme — all the way out to infinity. Instead of cutting a big problem into a handful of bite-size pieces, it keeps cutting and cutting relentlessly until the problem has been chopped and pulverized into its tiniest conceivable parts, leaving infinitely many of them. Once that’s done, it solves the original problem for all the tiny parts, which is usually a much easier task than solving the initial giant problem. The remaining challenge at that point is to put all the tiny answers back together again. That tends to be a much harder step, but at least it’s not as difficult as the original problem was.

Thus, calculus proceeds in two phases: cutting and rebuilding. In mathematical terms, the cutting process always involves infinitely fine subtraction, which is used to quantify the differences between the parts. Accordingly, this half of the subject is called differential calculus. The reassembly process always involves infinite addition, which integrates the parts back into the original whole. This half of the subject is called integral calculus.
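
Here is a toy numerical sketch of the two phases (my own example, with made-up numbers): cut the region under y = x² between 0 and 1 into thin strips, then add the strips back up. The finer the cutting, the closer the reassembled total gets to the exact value 1/3 that integral calculus delivers.

```python
def area_under_x_squared(n_slices):
    """Cutting: split [0, 1] into n thin strips and treat each strip
    under y = x**2 as a rectangle. Rebuilding: sum the pieces."""
    width = 1.0 / n_slices
    total = 0.0
    for i in range(n_slices):
        x_mid = (i + 0.5) * width  # sample each strip at its midpoint
        total += x_mid**2 * width
    return total

# Relentless chopping closes in on the exact answer 1/3.
for n in (10, 1_000, 100_000):
    print(n, area_under_x_squared(n))
```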

The story of the laser

Photo by Gerardo Barreto on Unsplash


Using observation and experiment, scientists worked out the laws of change and then used calculus to solve them and make predictions. For example, in 1917 Albert Einstein applied calculus to a simple model of atomic transitions to predict a remarkable effect called stimulated emission (which is what the s and e stand for in laser, an acronym for light amplification by stimulated emission of radiation). He theorized that under certain circumstances, light passing through matter could stimulate the production of more light at the same wavelength and moving in the same direction, creating a cascade of light through a kind of chain reaction that would result in an intense, coherent beam. A few decades later, the prediction proved to be accurate. The first working lasers were built in the early 1960s. Since then, they have been used in everything from compact-disc players and laser-guided weaponry to supermarket bar-code scanners and medical lasers.
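
Einstein's real argument rests on rate equations for atomic populations, but the essential feature of the cascade fits in a few lines of Python (a cartoon only; the gain figure below is invented): light that stimulates more light in proportion to itself grows exponentially through the medium.

```python
import math

def amplified_intensity(initial, gain_per_cm, length_cm):
    """Cartoon of stimulated emission: if light begets more light at a
    rate proportional to the light present (dI/dx = g * I), the
    intensity grows exponentially along the gain medium."""
    return initial * math.exp(gain_per_cm * length_cm)

# Invented numbers: a weak beam through 10 cm of medium with g = 0.5 per cm
print(amplified_intensity(1.0, 0.5, 10.0))  # ~148x brighter
```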

Sunday, July 14, 2019

Without Calculus

Photo by Shubham Sharan on Unsplash


Without calculus, we wouldn’t have cell phones, computers, or microwave ovens. We wouldn’t have radio. Or television. Or ultrasound for expectant mothers, or GPS for lost travelers. We wouldn’t have split the atom, unraveled the human genome, or put astronauts on the moon. We might not even have the Declaration of Independence.

It’s a curiosity of history that the world was changed forever by an arcane branch of mathematics. How could it be that a theory originally about shapes ultimately reshaped civilization?

The essence of the answer lies in a quip that the physicist Richard Feynman made to the novelist Herman Wouk when they were discussing the Manhattan Project. Wouk was doing research for a big novel he hoped to write about World War II, and he went to Caltech to interview physicists who had worked on the bomb, one of whom was Feynman. After the interview, as they were parting, Feynman asked Wouk if he knew calculus. No, Wouk admitted, he didn’t. “You had better learn it,” said Feynman. “It’s the language God talks.”

For reasons nobody understands, the universe is deeply mathematical. Maybe God made it that way. Or maybe it’s the only way a universe with us in it could be, because nonmathematical universes can’t harbor life intelligent enough to ask the question. In any case, it’s a mysterious and marvelous fact that our universe obeys laws of nature that always turn out to be expressible in the language of calculus as sentences called differential equations. Such equations describe the difference between something right now and the same thing an instant later or between something right here and the same thing infinitesimally close by. The details differ depending on what part of nature we’re talking about, but the structure of the laws is always the same. To put this awesome assertion another way, there seems to be something like a code to the universe, an operating system that animates everything from moment to moment and place to place. Calculus taps into this order and expresses it.

Isaac Newton was the first to glimpse this secret of the universe. He found that the orbits of the planets, the rhythm of the tides, and the trajectories of cannonballs could all be described, explained, and predicted by a small set of differential equations. Today we call them Newton’s laws of motion and gravity. Ever since Newton, we have found that the same pattern holds whenever we uncover a new part of the universe. From the old elements of earth, air, fire, and water to the latest in electrons, quarks, black holes, and superstrings, every inanimate thing in the universe bends to the rule of differential equations. I bet this is what Feynman meant when he said that calculus is the language God talks. If anything deserves to be called the secret of the universe, calculus is it.
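
To make "the difference between something right now and the same thing an instant later" concrete, here is a minimal Python sketch (toy numbers of my own) that steps Newton's laws for a cannonball forward one small instant at a time:

```python
# Newton's second law for a cannonball with no air resistance:
# dx/dt = vx, dy/dt = vy, dvy/dt = -g. Step it an instant at a time.
G = 9.81    # gravitational acceleration, m/s^2
dt = 0.001  # one small 'instant', in seconds

x, y = 0.0, 0.0
vx, vy = 50.0, 50.0  # invented launch velocity components, m/s

while y >= 0.0:
    x += vx * dt     # position an instant later
    y += vy * dt
    vy -= G * dt     # velocity an instant later

print(f"range ~ {x:.0f} m")  # analytic answer: 2*vx*vy/g ~ 510 m
```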

By inadvertently discovering this strange language, first in a corner of geometry and later in the code of the universe, then by learning to speak it fluently and decipher its idioms and nuances, and finally by harnessing its forecasting powers, humans have used calculus to remake the world.

**
When I said earlier that without calculus we wouldn’t have computers and cell phones and so on, I certainly didn’t mean to suggest that calculus produced all these wonders by itself. Far from it. Science and technology were essential partners — and arguably the stars of the show. My point is merely that calculus has also played a crucial role, albeit often a supporting one, in giving us the world we know today.


Take the story of wireless communication. It began with the discovery of the laws of electricity and magnetism by scientists like Michael Faraday and André-Marie Ampère. Without their observations and tinkering, the crucial facts about magnets, electrical currents, and their invisible force fields would have remained unknown, and the possibility of wireless communication would never have been realized. So, obviously, experimental physics was indispensable here.
But so was calculus. In the 1860s, a Scottish mathematical physicist named James Clerk Maxwell recast the experimental laws of electricity and magnetism into a symbolic form that could be fed into the maw of calculus. After some churning, the maw disgorged an equation that didn’t make sense. Apparently something was missing in the physics. Maxwell suspected that Ampère’s law was the culprit. He tried patching it up by including a new term in his equation — a hypothetical current that would resolve the contradiction — and then let calculus churn again. This time it spat out a sensible result, a simple, elegant wave equation much like the equation that describes the spread of ripples on a pond. Except Maxwell’s result was predicting a new kind of wave, with electric and magnetic fields dancing together in a pas de deux. A changing electric field would generate a changing magnetic field, which in turn would regenerate the electric field, and so on, each field bootstrapping the other forward, propagating together as a wave of traveling energy. And when Maxwell calculated the speed of this wave, he found — in what must have been one of the greatest Aha! moments in history — that it moved at the speed of light. So he used calculus not only to predict the existence of electromagnetic waves but also to solve an age-old mystery: What was the nature of light? Light, he realized, was an electromagnetic wave.
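
That Aha! moment is easy to recheck today. The speed that falls out of Maxwell's wave equation is 1/√(μ₀ε₀), where μ₀ and ε₀ are the measured magnetic and electric constants; plugging in their standard values (a quick check of my own):

```python
import math

MU_0 = 4 * math.pi * 1e-7     # magnetic constant, H/m (classical value)
EPSILON_0 = 8.8541878128e-12  # electric constant, F/m

# Maxwell's wave equation predicts waves traveling at c = 1/sqrt(mu0 * eps0).
c = 1.0 / math.sqrt(MU_0 * EPSILON_0)
print(f"{c:.3e} m/s")  # ~2.998e+08 m/s, the speed of light
```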

Maxwell’s prediction of electromagnetic waves prompted an experiment by Heinrich Hertz in 1887 that proved their existence. A decade later, Nikola Tesla built the first radio communication system, and five years after that, Guglielmo Marconi transmitted the first wireless messages across the Atlantic. Soon came television, cell phones, and all the rest.

Clearly, calculus could not have done this alone. But equally clearly, none of it would have happened without calculus. Or, perhaps more accurately, it might have happened, but only much later, if at all.