Posts Tagged mathematics

Pi day—tau be or not tau be, that is the question

Math-savvy pizza and pie shops around the world will be celebrating this afternoon, 3/14, at 3:14 pm to honor the mathematical constant pi.

Rounding pi to 3.14 suffices for most rational people, but those of you who are trained mathletes might like to carry this never-ending irrational number out to 100 or even 1,000 decimal places.  If so, knock yourself out at this post by math.com.  You might as well quit at that point, because the record now stands at 50 trillion digits, held by cybersecurity analyst Timothy Mullican, who used 303 days of computation to complete the calculation, as he detailed here.
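If you would rather compute the digits than look them up, here is a minimal Python sketch (the function name `pi_digits` is my own invention, not from any of the sources above) based on Machin's classic formula, pi = 16 arctan(1/5) - 4 arctan(1/239), using scaled-integer arithmetic.  Nothing like the record-setting runs, of course, but it handles a few hundred places in a blink:

```python
def pi_digits(n):
    """Pi to n decimal places via Machin's formula:
    pi = 16*arctan(1/5) - 4*arctan(1/239), in scaled-integer arithmetic."""
    scale = 10 ** (n + 10)  # carry 10 guard digits to absorb rounding

    def arctan_inv(x):
        # arctan(1/x) * scale, summed from the alternating Taylor series
        total = power = scale // x
        k, sign = 3, -1
        while power:
            power //= x * x
            total += sign * (power // k)
            sign, k = -sign, k + 2
        return total

    pi = 16 * arctan_inv(5) - 4 * arctan_inv(239)
    digits = str(pi // 10 ** 10)  # drop the guard digits
    return digits[0] + "." + digits[1:]

print(pi_digits(50))
```

Bump `n` up to 1,000 and it still finishes in well under a second.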

A good way to build up your chops on pi is to memorize a ‘piem’, that is, a poem in which the length of each word represents a digit, for example, “Now I need a drink, alcoholic of course, after the heavy lectures involving quantum mechanics.”  See a much longer (101 digits!) piem sung by musician Andrew Huang, along with many other amazing feats related to pi, in this article by Andrew Whalen posted today by Newsweek.
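For fun, a piem decoder takes only a few lines of Python (this little `piem_to_digits` helper is my own sketch; I assume the usual convention that a 10-letter word stands for the digit 0):

```python
import re

def piem_to_digits(piem):
    """Decode a 'piem': each word's letter count gives one digit of pi."""
    words = re.findall(r"[A-Za-z]+", piem)
    return "".join(str(len(w) % 10) for w in words)  # 10 letters -> 0

print(piem_to_digits("Now I need a drink, alcoholic of course, "
                     "after the heavy lectures involving quantum mechanics."))
# 314159265358979
```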

Sadly, some mathematicians are reining in the pi parade by insisting it be doubled to the constant tau.

“To describe 3/4 of a circle in trigonometry, you would say 3/4 tau radians. But in the pi world, that’s 3/2 pi radians. ‘Blegh!’ says Prof. [Bob] Palais [Utah Valley University]. ‘People are so ingrained that they don’t even see how stupid it is.’”

For Math Fans, Nothing Can Spoil Pi Day—Except Maybe Tau Day
Wall Street Journal, 3/14/20

You’d best circle (ha ha, math joke) June 28 to celebrate Tau Day, even though that’s no reason to eat pizza or any other kind of pie.

No Comments

The hero of zero

Breaking news about nothing: Dating done with the Oxford Radiocarbon Accelerator Unit now puts the invention of the number zero 500 years earlier than previously believed.  As explained in this post by The Guardian, the hero of zero is Indian mathematician Brahmagupta who worked out this pivotal number in 628 AD.  Isn’t that something?

The development of zero in mathematics underpins an incredible range of further work, including the notion of infinity, the modern notion of the vacuum in quantum physics, and some of the deepest questions in cosmology of how the Universe arose – and how it might disappear from existence in some unimaginable future scenario.

– Hannah Devlin, The Guardian

No Comments

The increasing oppression of soul-less algorithms

As I’ve blogged before*, algorithms for engineering and statistical use are near and dear to my heart, but not when they become tools of unscrupulous or naïve manipulators. Thus an essay** published on the first of this month by The Guardian about “How algorithms rule our working lives” gave me some concern, namely that employers who rely on mathematically modelled ways of sifting through job applications tend to punish the poor.

“Like gods, these mathematical models are opaque, their workings invisible to all but the highest priests in their domain: mathematicians and computer scientists. Their verdicts, even when wrong or harmful, are beyond dispute or appeal. And they tend to punish the poor and the oppressed in our society, while making the rich richer.”

– Cathy O’Neil

Of course we mustn’t blame algorithms per se, but those who write them and/or put them to wrong use.  The University of Oxford advises mathematicians not to write evil algorithms.  This October 2015 post passes along seven utopian principles for ethical code.  Good luck with that!

P.S. A tidbit of trivia that I report in my book RSM Simplified: “algorithm” is an eponym for al-Khwarizmi, a ninth-century Persian mathematician who wrote the book on “al-jabr” (i.e., algebra).  It may turn out to be the most destructive weapon for oppression ever to emerge from the Middle East.

* Rock on with algorithms? October 2, 2012

** Adapted from Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy — a new book on business statistics coming out tomorrow by “Math Babe” Cathy O’Neil.

No Comments

Rock on with algorithms?

I started off my career as an experiment designer before the advent of cheap calculators.  Paying $400 for an HP unit that (gasp!) did logarithms went far beyond my wherewithal in 1974.  That was roughly the tuition for one college quarter at the University of Minnesota, if memory serves.  I managed to cover that cost plus room and board by working 24 hours a week washing pots and pans at a hospital kitchen.  Those were the days!

Calculating effects from the two-level factorial designs I did that summer as an intern at a chemical research lab required a lot of hand calculations—many numbers to add and subtract.  Thankfully, a fellow named Yates developed an algorithm after these experiments were invented in the 1930s.  Following his directions, one could tally things up, and even do check sums, without having to think much.  That’s what algorithms do—provide a recipe for solving problems.
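For the curious, Yates’ sums-and-differences recipe is easy to sketch in a few lines of Python (the `yates` function and the yields below are my own illustration, not from Yates himself): list the responses in standard order, make k passes of pairwise sums and differences, then divide the contrasts by 2^(k-1) to get the effects.

```python
def yates(y):
    """Yates' algorithm: effects from a 2^k factorial in standard order.

    y: responses in Yates (standard) order, e.g. (1), a, b, ab for k=2.
    Returns [grand total, effect A, effect B, effect AB, ...].
    """
    n = len(y)
    k = n.bit_length() - 1          # number of factors (n must be 2**k)
    assert 1 << k == n, "length must be a power of two"
    col = list(y)
    for _ in range(k):              # one pass of sums, then differences
        sums  = [col[i] + col[i + 1] for i in range(0, n, 2)]
        diffs = [col[i + 1] - col[i] for i in range(0, n, 2)]
        col = sums + diffs
    # first entry is the grand total; the rest are contrasts -> effects
    return [col[0]] + [c / (n // 2) for c in col[1:]]

# hypothetical yields from a 2^2 experiment in standard order (1), a, b, ab
print(yates([60, 72, 54, 68]))  # [254, 13.0, -5.0, 1.0]
```

The first number returned is the grand total; the rest are the A, B and AB effects for this made-up 2^2 experiment.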

As an engineer I have a healthy respect for algorithms, but my wife, who works as a preschool teacher, thinks this is geeky.  For example, I admired the nerdy professor in the TV show “Numb3rs” that aired a few years ago.  But every time he expounded on some algorithm that ingeniously saw the pattern of a serial criminal, she just laughed.  Ironically, she is now hooked on a show called “Person of Interest” that is based on predictive policing, that is, using algorithms to calculate a crime to come.  That scares me!

According to a new book by Christopher Steiner titled Automate This: How Algorithms Came to Rule Our World (see this Wall Street Journal review) all of us had best be on our guard against seemingly clever ways to systematically solve problems.  It seems that the engineers, mathematicians, programmers and statisticians who come up with these numerical recipes invaded Wall Street.  They became known as the “Quants”—dominating the way stocks now get traded.

The problem with all this (even I have to admit) is that these systematic approaches take all the fun out of making choices.  Do we really want algorithms to pick our soul mates, invest our money, etcetera?  I am all for algorithms like Yates’s that quickly solve mathematical problems.  A good example is the first known algorithm, recorded on clay tablets in 2500 B.C., which helped Sumerian traders divvy up a given amount of grain equally among a varying number of recipients.  However, when things become capricious, with many unknowable unknowns thrown into the mix, I’d rather make my own decisions guided by wise counsel.

There is an elephant in the room whenever it comes to discussing computer algorithms, particularly highly automated ones. Almost all such algorithms are inaccurate. They are inaccurate for many reasons, the most important of which is that human behavior is fickle. The inaccuracy could be shockingly high.

– Kaiser Fung, author of Numbers Rule Our World

I really shouldn’t bring this up, but do you suppose certain politicians might be spending a lot of money on algorithmic solutions to how they can win elections?  Do these algorithms have any qualms about turning their protagonists into nabobs of negativism?  I do not believe that an algorithm has any heart, unfortunately.  An algorithm is like Honey Badger—it just don’t care.

No Comments

New math sums digits from left to right: Does this add up as an improvement?

A recent article in my local newspaper, the Stillwater Gazette, provided enlightenment on our school district’s new way of adding numbers – from left to right, rather than right to left.  I might have to try this – maybe it will help me improve my accuracy when tallying checks on deposit slips.  (I always hand-calculate these as a way to maintain my math muscles.)

Supposedly this left-to-right approach makes it easier for children to learn, because it goes in the same direction for processing numbers as for reading words. Here’s how it works.  Let’s say that you and your spouse both collect pennies, and the first jar nets 247 cents versus 159 for the second.  How much in total can be taken to the bank? The way I learned to add, one first adds 7 and 9, recording 6 as the right-most digit (the ones column) and then carrying a 1 to the second column (the tens).  This carrying part is where I sometimes slip up, mainly due to my poor handwriting, which even I cannot always read.  The new left-to-right approach eliminates a lot of the carrying, but not all of it, I figure, as shown in the following case.  Start by adding the left-most (hundreds, in this case) column of numbers:
  247
+ 159
= 300 (2 + 1)

Do not forget to put in the zeroes to hold the place of what you just added.  Now go to the next column to the right and add it:
= 90 (4 + 5)

And so forth until there are no more columns:
= 16 (7 + 9)

Finally, tally up all the numbers you calculated:
  300
+  90
+  16
= 406

I have a feeling that the old saying about not trying to teach an old dog new tricks might be operative for me in regard to this new math.  I think I will just keep adding the old way, or admit that using a calculator or, better yet, a computerized spreadsheet for doing my deposits would be smarter.  Am I shortchanging myself (pun intended)?
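For what it’s worth, the left-to-right recipe is easy to mechanize.  Here is a minimal Python sketch (the function `add_left_to_right` is my own illustration) that peels off one place value at a time, highest first, and then tallies the partial sums just as above:

```python
def add_left_to_right(a, b):
    """Add two numbers column by column, starting from the highest place."""
    width = max(len(str(a)), len(str(b)))
    partials = []
    for pos in range(width - 1, -1, -1):    # highest place value first
        place = 10 ** pos
        digit_sum = (a // place % 10) + (b // place % 10)
        partials.append(digit_sum * place)  # e.g. 2 + 1 hundreds -> 300
    return partials, sum(partials)

partials, total = add_left_to_right(247, 159)
print(partials)  # [300, 90, 16]
print(total)     # 406
```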

P.S. This innovation in learning math struck a chord with my son Hank, who programs for Stat-Ease.  He made me aware that “endianness” is a major issue in coding.  Evidently programmers continually feud over the order in which the bytes of multi-byte numbers should be stored – most-significant byte first (big-endian) or least-significant byte first (little-endian).*  The “endian” terms come from Jonathan Swift, who mocked the pettiness of social customs, such as which end one ought to first attack when shelling an egg.

“…the primitive way of breaking Eggs, before we eat them, was upon the larger End: But his present Majesty’s Grand-father, while he was a Boy, going to eat an Egg, and breaking it according to the ancient Practice, happened to cut one of his Fingers. Whereupon the Emperor his Father published an Edict, commanding all his Subjects, upon great Penaltys, to break the smaller End of their Eggs.”
— Jonathan Swift, Gulliver’s Travels, A Voyage to Lilliput, Chapter IV.

*For more details, see Basic concepts on Endianness.
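Endianness is easy to see for yourself with Python’s standard struct module, which lets you pack the same number either end first (this little demonstration is my own; the ‘>’ and ‘<’ format prefixes select big- and little-endian byte order):

```python
import struct

value = 0x0A0B0C0D                  # one 32-bit number, four bytes
big    = struct.pack('>I', value)   # most-significant byte first
little = struct.pack('<I', value)   # least-significant byte first
print(big.hex())     # 0a0b0c0d
print(little.hex())  # 0d0c0b0a
```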

No Comments