Sunday, July 19, 2015

Review of "How not to be wrong: the hidden maths of everyday life" by Jordan Ellenberg

There are three ways to describe this book.

The way it presents itself is as a more logical, "correct" way of understanding life, using lessons from mathematics to analyse the sort of problems we come across in our lives. Early on in the book Ellenberg writes:

To paraphrase Clausewitz: Mathematics is the extension of common sense by other means (p.13)

Shortly afterwards he presents a diagram of four quadrants, dividing mathematics into the shallow and the profound, the simple and the complicated. His intention is to focus on the profound but simple aspects of mathematics that the average reader can understand and apply to their lives.

A probably fairer description of the book is as a series of often interesting observations by a mathematician on life, using mathematical concepts as a way of clarifying the points he is making. Together they don't really make a coherent whole; perhaps that doesn't matter, as they are generally fascinating reflections, although he tends to over-sell what he is actually delivering.

The book is quite long - 468 pages of small print - and I found the extended discussion of the lottery went on far too long, but most of the other topics were fascinating.

If you wanted to get behind the individual comments and identify a more general point he wants to make, it is a defence of formalism as a method of analysis. By "formalism" is meant turning something into a formal system of concepts, where we don't look for any "real life" outside of the framework - an example might be the way a court of law interprets events through a formal legal framework: if the court follows the correct procedure and finds the accused guilty of murder, the accused is a murderer, whether they killed anyone or not.

What I'll do is list a number of what I found to be the more interesting examples covered in the book, to give a flavour of what it is about. Then I will comment on where these observations get us and to what extent the book lives up to its claim "how not to be wrong", and finally we will look in more detail at the benefits and problems of formalism.

The book begins with a story of the Statistical Research Group (SRG) in the Second World War, which was asked to suggest where to put armour on planes. Too much armour and the planes would use more fuel and be too heavy, but too little and of course they would be shot down. The group was supplied with statistics of where bullets had hit the planes when they returned - most bullet holes were on the fuselage and only a few on the engine - and the suggestion was therefore to apply more armour to the fuselage. The genius insight came from Abraham Wald, however, who argued that planes hit on the engine generally never returned and hence it was the engines that needed the armour.

This is an example of survivorship bias, and it can be applied to other areas. For example, statistics for the performance of financial funds over a 10-year period may give the appearance of the investment company having exceptionally good returns on its funds - except these are the funds that haven't been killed off for poor performance. By definition, a fund that has been going 10 years is going to be successful, otherwise it would have been stopped. To get a better understanding of how well the investment company is doing you would need to look at the performance of all its funds, not just those that survived for 10 years.
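
To make the fund example concrete, here is a rough simulation sketch in Python (my own illustration with made-up numbers, not anything from the book): every fund has zero real skill, poorly performing funds get closed along the way, and yet the surviving funds' average return looks impressively positive.

    import random

    random.seed(1)
    n_funds, n_years = 1000, 10
    all_returns, surviving_returns = [], []

    for _ in range(n_funds):
        yearly = [random.gauss(0.0, 0.1) for _ in range(n_years)]  # no real skill at all
        # a fund is "killed off" the first time its cumulative return looks bad
        alive, cumulative = True, 0.0
        for r in yearly:
            cumulative += r
            if cumulative < -0.15:          # arbitrary closure threshold
                alive = False
                break
        average = sum(yearly) / n_years
        all_returns.append(average)
        if alive:
            surviving_returns.append(average)

    print("average return of every fund started:    %+.3f" % (sum(all_returns) / len(all_returns)))
    print("average return of the 10-year survivors: %+.3f" % (sum(surviving_returns) / len(surviving_returns)))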

The next example looks at a blog post by Daniel J. Mitchell, "Why Is Obama Trying to Make America More Like Sweden when Swedes Are Trying to Be Less Like Sweden?" The article basically argued that Sweden - traditionally a country with high state spending - was trying to lower state spending at the same time as Obama was trying to increase it.

This is an example of false linearity - assuming a linear relationship between factors when it is not appropriate. On reflection, the optimum amount of state spending probably isn't at an extreme but somewhere in the middle - it is perfectly possible that Sweden has too much state spending and the US not enough. If the relationship is a curve, not a line, then of course it could be true that one country needs to spend less and another needs to spend more, as both are looking for the optimal amount, which lies somewhere in the middle of the curve.

Republican economists should be aware of this because during the 1980s they frequently referred to the "Laffer curve" which plotted revenue against the tax rate - the idea being that if the tax rate was too high revenue would fall as people would no longer feel an incentive to work, whereas if it was too low then of course the amount collected would be low, so somewhere in the middle was an optimal level which would maximise the amount of revenue collected in tax.
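
The non-linearity point can be sketched with a toy Laffer-style curve (my own made-up quadratic, not real fiscal data): the same curve tells a low-tax country to raise taxes and a high-tax country to cut them, because they sit on opposite sides of the peak.

    def revenue(tax_rate):
        # toy Laffer-style curve: no revenue at 0% or 100%, peak somewhere in between
        return tax_rate * (1.0 - tax_rate)

    for country, rate in [("low-tax country", 0.25), ("high-tax country", 0.75)]:
        here = revenue(rate)
        up, down = revenue(rate + 0.05), revenue(rate - 0.05)
        advice = "raise taxes" if up > here else "cut taxes" if down > here else "stay put"
        print(f"{country} at a {rate:.0%} tax rate: {advice}")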

It is worth pausing at this point to note that a very large number of examples in the book come from the USA - from politics, history, sport, US lotteries and so on. Usually these don't particularly detract from understanding the point being made. The sport examples are probably the most challenging unless you have a good knowledge of the terminology of American football, baseball and basketball, but even with these it isn't too difficult to understand the point being made.

The next discussion concerns infinite series - what happens if you keep adding numbers and never stop - for example

0.9 + 0.09 + 0.009 + 0.0009 + 0.00009 …

or

1 - 1 + 1 - 1 + 1 - 1 …

There isn't really a topical example connected with this, but Ellenberg uses it to make the point that mathematics isn't about some special sort of reality; it is just what we define it to be. He suggests mathematicians in the past got into terrible muddles trying to decide what the meaning of some mathematical conundrum was, but that modern mathematicians no longer ask that question: we don't ask what something means, but what it makes most sense to define it as.
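
A quick way to see the difference between the two series is to look at their partial sums (a sketch of my own, in Python): the first settles down towards 1, while the second just oscillates between 1 and 0, which is why its "value" is a matter of definition rather than discovery.

    s1 = s2 = 0.0
    for n in range(1, 11):
        s1 += 9 * 10 ** -n         # 0.9 + 0.09 + 0.009 + ...
        s2 += (-1) ** (n + 1)      # 1 - 1 + 1 - 1 + ...
        print(f"after {n:2d} terms: first series = {s1:.10f}, second series = {s2:.0f}")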

The next section covers linear regression - or rather linear extrapolation - and the perhaps obvious problems it can lead to. These are the sort of studies behind headlines like "We are all going to be overweight in 2048". He quotes from Mark Twain's Life on the Mississippi, in which it is noted that the river is becoming shorter by about a mile a year, from which Twain extrapolates that around a million years ago the river would have been a million miles longer, and in seven hundred years' time it will be only a mile and three quarters long.

He even notes that the article that claimed all Americans would be overweight by 2048 also observed that black male Americans were becoming obese at a slower rate and would not all be obese until 2095 - without appearing to realise the contradiction between these two predictions.
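
The absurdity is just a straight line pushed far outside the range of the data. Here is a tiny sketch of my own, using the rough "a mile a year" rate quoted above and a made-up round number for the river's current length (not Twain's actual figures):

    shortening_per_year = 1.0       # miles per year - the rough rate quoted above
    length_now = 1_000              # made-up round number standing in for the river's length

    def naive_length(years_from_now):
        # linear extrapolation, applied well beyond where it makes any sense
        return length_now - shortening_per_year * years_from_now

    print(naive_length(-1_000_000))  # "a million years ago": about a million miles longer
    print(naive_length(2_000))       # two thousand years from now: a negative length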

He then looks at the Law of Large Numbers. The idea here is that when you have small sample sizes it is possible to get extremes in what is being measured. Take a simple example - flip a coin twice: it is quite possible that you will get heads both times, so you have a result of 100% heads. But flip the coin a thousand times - now try getting 100% heads; it is much less likely. As the size of the sample increases, the "noise" reduces and the "signal" becomes clearer. Ellenberg cites a medical study that looked at death rates from brain cancer - remarkably, South Dakota had one of the highest rates while North Dakota had one of the lowest. It wasn't that things were being done any differently in the two states; it is just that the sample sizes were so small that the results were likely to be more variable.

The same is true when measuring test results in schools - the smaller schools display greater variance, appearing at both the top and the bottom of the results tables, not because they are doing things any better or worse, but again just because with smaller numbers the results will be more variable. If I toss a coin twice and get two heads, and you toss it a thousand times and get 492 heads, that doesn't mean I'm better at tossing heads than you.
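
A small simulation of the coin-flip point (my own sketch): the proportion of heads in a couple of flips is all over the place, while in thousands of flips it hugs 50%.

    import random

    random.seed(0)

    def proportion_heads(n_flips):
        return sum(random.random() < 0.5 for _ in range(n_flips)) / n_flips

    for n in (2, 10, 100, 1_000, 100_000):
        # a few trials at each sample size, to show how spread out the results are
        trials = [proportion_heads(n) for _ in range(5)]
        print(f"{n:6d} flips:", ["%.2f" % p for p in trials])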

The next section looks at the principle: don't talk about percentages of numbers when the numbers might be negative. As an example, it tells the story of June 2011, when it was announced that the US economy had added only about 18,000 jobs nationally. The Republican Governor of Wisconsin, Scott Walker, released a statement that read "today we learned that over 50 percent of US job growth in June came from our state". Well, it is true that 9,500 jobs were created in Wisconsin, but Minnesota (with a Democrat governor) added more than 13,000, and indeed Texas, California, Michigan and Massachusetts all created more jobs than Wisconsin that month. The problem was that some states had negative job growth - they lost jobs - so it can be highly misleading to give percentages when the numbers might be negative.
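
The point is easy to see with made-up numbers (my own sketch, not the actual 2011 state-by-state figures): once some states are losing jobs, the share of net growth attributed to a single state can exceed 50%, 100%, or any figure you like.

    # hypothetical monthly job changes by state (not the real 2011 figures)
    job_changes = {"State A": 9_500, "State B": 13_000, "State C": 20_000, "State D": -24_500}

    net_growth = sum(job_changes.values())     # 18,000 net new jobs in total
    for state, change in job_changes.items():
        print(f"{state}: {change:+,} jobs = {change / net_growth:.0%} of net job growth")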

Ellenberg makes a similar, very interesting point regarding an article in the New York Times that used the work of the economists Thomas Piketty and Emmanuel Saez. The article wanted to make the point that in 2010, 93% of the additional income created that year went to the top 1% of taxpayers; per person, the top 1% got a pay rise of on average $1,019,089 while the bottom 99% received an increase of just $80 per person.

The trouble with this analysis - according to Ellenberg - is that what actually happened was that the bottom 90% had a pay cut and the top 10% had a pay rise. Merging the rise of the top 9% of the 99% into the cut of the bottom 90% gave a small overall pay rise. The figures themselves weren't wrong, but it might be seen as misleading to compare the 1% and the 99% when the real difference was between the 10% and the 90%.

Next comes the example of the "Baltimore stockbroker". You get an unsolicited newsletter from a stockbroker in Baltimore predicting that a certain stock will go up. The next week - their prediction having proved correct - they mail you again with another prediction, which again is correct, and so on for ten successive weeks, each week's prediction coming true. Finally comes an offer to invest with the company - it seems like a sure win? The problem is that this stockbroker has been sending out different predictions to 10,000 different people; each time a prediction fails, those people stop getting the mails. Each week they send out fewer mails until finally only 10 people are left who have been sent the correct prediction every week - the hope being that they will invest a lot of money.
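
The arithmetic behind the scam is simple enough to sketch (my own illustration): send half the remaining recipients "up" and half "down" each week, keep only the ones who saw the correct call, and after ten weeks a handful of people have witnessed a perfect record.

    recipients = 10_000
    for week in range(1, 11):
        # half the remaining recipients happened to receive the prediction that came true
        recipients //= 2
        print(f"after week {week:2d}: {recipients} people have seen nothing but correct predictions")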

Apparently the UK illusionist Derren Brown did a similar trick in 2008, but the real problem with this phenomenon is not its use in scams; it is that it may be an important problem for scientists, who appear - according to Ellenberg - to unconsciously play the trick on themselves.

Before getting into the details, Ellenberg first uses this to explain The Bible Code. This is the idea that secret codes are hidden in the Bible, and it made Michael Drosnin a very successful author. The idea originally came from an analysis of the Torah that apparently revealed the names of famous Jewish rabbis hidden in the Hebrew text - an analysis that was published in Statistical Science in 1994. The key to explaining such phenomena is that, given enough chances, improbable things happen. If I put my hand in a bag of letters and pull out those that spell my name, it looks amazing - until you find out how long I had been trying to get that result.

Now we come to the science problem. At a 2009 brain-mapping conference a paper was delivered describing an fMRI scan of a dead fish that was being "shown" a series of pictures of human faces - the scan data appeared to show the fish correctly identifying the emotions on the faces. The point of the paper was to expose a flaw in fMRI brain-mapping methodology: the data from an fMRI scan contains tens of thousands of data points, and if you search through that data for long enough you can find whatever you are looking for - some sequence or correlation.
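
A sketch of the underlying problem (mine, not the actual fMRI analysis): generate tens of thousands of streams of pure noise, test each one against the "signal" you are looking for, and a fair number will look impressively correlated purely by chance.

    import random

    random.seed(42)
    signal = [random.gauss(0, 1) for _ in range(20)]      # the "stimulus" we pretend to show

    def correlation(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    # 30,000 "voxels" of pure noise - none of them respond to the stimulus in any real sense
    strong = 0
    for _ in range(30_000):
        voxel = [random.gauss(0, 1) for _ in range(20)]
        if abs(correlation(signal, voxel)) > 0.6:
            strong += 1
    print(f"{strong} noise-only voxels correlate with the stimulus at |r| > 0.6")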

This takes us on to sampling, R. A. Fisher - the founder of the modern practice of statistics - and the null hypothesis. Suppose I take two groups of children and give one group cornflakes for breakfast each day while the others have toast, and at the start and end of the month I record their average running times - can I say the faster group reveals the better breakfast? Of course not, because after a month, whether the cornflakes had any effect or not, it is likely one of the groups will differ anyway. This is what is known as the null hypothesis - what does it look like if my theory isn't true? Does orange juice make me sleep better, does bacon give my hair more of a shine? The null hypothesis asks "what does it look like if there is no effect?" What Fisher devised is what is known as a "significance test" - a way to detect whether there is something statistically significant happening in the data. (Note: generally I'm taking examples directly from the book, but the cornflake example is my own; it isn't from the book.)

Ellenberg says the word "significance" is misleading - it would be better to say "statistically noticeable" or "statistically detectable" - it doesn't actually mean there is anything significant about it, but the word has stuck.
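
Here is a minimal sketch of a significance test applied to my cornflakes example (a simple permutation test of my own, with made-up running times, not anything from the book): shuffle the group labels many times and see how often a difference as large as the observed one turns up when the labels mean nothing at all.

    import random

    random.seed(7)
    cornflakes = [12.1, 11.8, 12.5, 11.9, 12.3, 12.0]   # made-up running times, in seconds
    toast      = [12.4, 12.2, 12.6, 12.1, 12.7, 12.3]

    observed = sum(toast) / len(toast) - sum(cornflakes) / len(cornflakes)

    pooled = cornflakes + toast
    trials, at_least_as_big = 10_000, 0
    for _ in range(trials):
        random.shuffle(pooled)                           # pretend the group labels were arbitrary
        a, b = pooled[:6], pooled[6:]
        if abs(sum(b) / 6 - sum(a) / 6) >= abs(observed):
            at_least_as_big += 1

    print(f"observed difference: {observed:.2f} seconds")
    print(f"how often chance alone does at least as well: {at_least_as_big / trials:.3f}")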

He gives some examples. In 1995 British women on the pill were warned that a certain type of pill doubled the risk of a blood clot, which caused a lot of panic - but the problem was that the risk of a blood clot was very low to begin with, and double a very low risk is still very low. In terms of actual figures it was something like a 2 in 7,000 risk instead of 1 in 7,000 - Ellenberg makes the point that twice a tiny number is a tiny number. Another was a US study showing at-home child care had a fatality rate seven times that of in-centre child care, but the figures were 1.6 per 100,000 and 0.23 per 100,000 - both nearly zero.

That was a slight detour before getting back to the problem of statistics and science. Ellenberg gives the example of The International Journal of Haruspicy - haruspicy being the art of predicting the future by studying sheep entrails. Imagine you do a reading and try a prediction - it doesn't work. You try something slightly different - again, no luck. After a number of attempts your prediction succeeds, so you write up what you did and publish. When Fisher devised his significance test he gave a 95% threshold for something being significant - but that still leaves a 1 in 20 chance of a "significant" result happening by luck. If we did the test 20 times then the chances are the hoped-for result would occur. If we are looking at experiments submitted to the haruspicy journal, on average 1 in 20 should get a "significant" result.
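
The 1-in-20 point is just arithmetic (a sketch of mine): if each null experiment has a 5% chance of clearing the "significant" bar by luck, then running twenty of them makes a spurious success more likely than not.

    p_false_positive = 0.05                    # the 1-in-20 significance threshold
    for n_experiments in (1, 5, 20, 100):
        p_at_least_one = 1 - (1 - p_false_positive) ** n_experiments
        print(f"{n_experiments:3d} null experiments -> "
              f"{p_at_least_one:.0%} chance of at least one 'significant' result")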

Ellenberg says that within the scientific community there is increasing concern that this is what is happening. In 2005 the biomedical researcher John Ioannidis published a paper, "Why Most Published Research Findings Are False", and in 2012 scientists at the California biotech company Amgen set out to replicate some of the most famous experimental results in the biology of cancer and, out of 53 studies, were only able to replicate 6.

The next topic is how worried we should be if we discovered our neighbour was on a terrorism shortlist. Ellenberg says we need to be careful, as there are two quite different questions.

  1. How likely is someone to get on the list, given that they are not a terrorist?
  2. How likely is it that a person is not a terrorist, if they are on the list?

Of course this just depends on how good the list is. Suppose the terrorist profile is people who like action movies, have visited the Middle East in the last five years and like music by ABBA. Meeting these criteria will get you on the list, and perhaps there is a 1 in 20,000 chance that you meet this profile. So you might think: if there is only a 1 in 20,000 chance of getting on the list, and your neighbour is on it, you ought to be worried about your neighbour.

But some reflection on the criteria might make you think: that sounds like pretty rubbish criteria - it might produce a shortlist of suspects, but little more.

This is why the second question is the more important one: is this list any good? How likely is someone actually to be a terrorist, given that they are on the list? If the criteria are rubbish, the fact your neighbour is on the list tells you very little.
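
Here is a small sketch of the two questions with made-up round numbers (my own, not Ellenberg's): even if only 1 in 20,000 innocent people fit the profile, the list is still dominated by innocent people, simply because there are so few actual terrorists to begin with.

    population = 300_000_000               # made-up round numbers throughout
    terrorists = 1_000
    innocents = population - terrorists

    p_listed_given_innocent = 1 / 20_000   # question 1: chance an innocent person fits the profile
    p_listed_given_terrorist = 0.9         # assume the profile catches most real terrorists

    listed_innocents = innocents * p_listed_given_innocent
    listed_terrorists = terrorists * p_listed_given_terrorist

    # question 2: given that someone is on the list, how likely is it they are innocent?
    p_innocent_given_listed = listed_innocents / (listed_innocents + listed_terrorists)
    print(f"people on the list: about {listed_innocents + listed_terrorists:,.0f}")
    print(f"chance that a listed person is innocent: {p_innocent_given_listed:.1%}")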

The next example concerns what is random and what we think is random. Ask someone to pick a random number between 1 and 20 - typically people pick 17. Likewise, if you ask for a random number between 0 and 9, people most often pick 7. When the election results were announced in Iran in 2009 the figures were suspicious because they showed the exact traces of invented randomness, not actual randomness - the figures from the 29 provinces contained a lot more 7s than would be expected, making it likely the figures were made up to appear random.

Going back to statistics and significant results: the problem with the 95% marker of significance is that it doesn't take into account the more general likelihood or believability of an explanation. An experiment testing the effectiveness of a drug and one testing the effectiveness of placing someone inside a mystical stone circle might both record a statistically significant result - but our response to each may differ drastically. This recognition of the difference between prior beliefs, before seeing the evidence, and posterior beliefs, after seeing it, is known as Bayesian inference, using a formula in probability called Bayes's Theorem.

If you believe in vibrational earth energy, the experiment with the stones will seem worth further investigation; if you have a materialist viewpoint, it will be seen as a lucky coincidence.

This ties in with whether we believe numbers are random or not - our prior beliefs will affect whether we think something happened by genuine chance or was deliberately contrived.

At this point Ellenberg reviews Paley's argument for God from design (the famous watchmaker argument), trying to give probabilities for whether God is a good explanation for the world - and suggests we also need to consider not just one God, but many gods, and whether the universe is a simulation run on a computer. However, after some discussion he finally admits that mathematics cannot decide on God's existence.

Ellenberg then begins an extended discussion of the lottery which I won't go into, although he does relate an interesting story that the lottery originated in Genoa in the seventeenth century, where instead of holding elections they chose councillors by drawing lots, which the citizens regularly bet on. At some point they realised they didn't need to wait for an election - they could just bet on drawing numbers - and by 1700 Genoa was running a lottery very similar to the format we have today.

Ellenberg is very interested in the lottery and spends a lot of time discussing it, and in particular a local lottery in the US in which groups of people worked together to regularly win lots of money - I have to admit I found this the least interesting part of the book and so will not go into any further details here.

There is then a fascinating discussion of mediocrity. In 1933 Horace Secrist, a professor of statistics and director of the Bureau for Business Research at Northwestern, published The Triumph of Mediocrity in Business, a 468-page book full of tables and charts, in which Secrist showed how the best and the worst stores and businesses eventually become average - he argued there appears to be an iron law of business that "mediocrity tends to become the rule" (quoted in Ellenberg, p.297).

He then turns to the British scientist and pioneering eugenicist Francis Galton - according to Ellenberg, a brilliant mathematician and meticulous collector of data about hereditary traits. Galton showed, for example, that tall parents tend to have tall children, but not as tall as themselves; similarly, short parents tend to have short children, but not as short as themselves. The iron law of mediocrity again?

Ellenberg says Galton realised what Secrist did not - traits are typically both inherited and influenced by the environment. Parents have genes from their own parents which make them tall or short, and the environment (diet, exercise and so on) further influences their height. Exceptionally tall (or short) parents had genes plus "luck" from the environment combining to give them their exceptional height. Chances are their children won't get the same degree of luck from the environment, and so will tend to be closer to the average.

This would be true of most things. Take a particularly talented writer - some of the talent would be inherited, but something more would be the luck of upbringing, some combination of factors that just happened to make them exceptional - say, the literary equivalent of throwing six sixes with dice. Their children would get the inherited talent, but it is most unlikely they would also get the particular upbringing required to make them a great writer.

In effect this is the law of large numbers again - things will in the long term tend back to the average.
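
A small simulation of the skill-plus-luck story (my own sketch): give every business a fixed "skill" and a fresh dose of luck each period, pick this period's top performers, and their next-period results drift back towards the middle even though nothing about them has changed.

    import random

    random.seed(3)
    n = 1_000
    skill = [random.gauss(0, 1) for _ in range(n)]        # fixed quality of each business

    def performance():
        # this period's result = fixed skill + fresh luck
        return [s + random.gauss(0, 1) for s in skill]

    period1, period2 = performance(), performance()

    top = sorted(range(n), key=lambda i: period1[i], reverse=True)[:50]   # best 50 in period 1
    avg1 = sum(period1[i] for i in top) / len(top)
    avg2 = sum(period2[i] for i in top) / len(top)
    print(f"top performers, period 1 average: {avg1:.2f}")
    print(f"same businesses, period 2 average: {avg2:.2f}  (closer to the overall average of about 0)")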

Unfortunately for Secrist, a mathematician called Harold Hotelling decided to speak frankly about what Secrist had "discovered". Hotelling said that all Secrist's data "proved nothing more than that the ratios in question have a tendency to wander about" - in fact the result was "mathematically obvious from general considerations, and does not need the vast accumulation of data adduced to prove it".

Hotelling pointed out that stores and businesses didn't tend to mediocrity but were just sometimes lucky and sometimes not, and over time the "luck" part of the equation caused a fair amount of movement in how well a store was doing. A business that was doing well in 1922 might do less well over the next few years - the law of mediocrity? No, because if you worked back to before 1922 those stores would have been average and then moved up to become the best - good luck got them to the top in 1922 and bad luck moved them back down again afterwards.

Of course it isn't all luck - just as Galton knew height wasn't all environment. For a store to do well it has to have good business practices, but for it to be exceptional it has to have luck as well, and once its luck is over it will go down the rankings, just as other well-run stores in receipt of good luck rise up above it.

About three quarters of the way through the book is a section entitled "Eugenics, Original Sin and this book's misleading title", in which he points out that while mathematics, on the whole, has a pretty clear-cut view of getting something right or wrong, it doesn't actually stop you being wrong: "mathematics is a way not to be wrong, but it isn't a way not to be wrong about everything." Clearly you can be great at maths and rubbish at life - and the book has a few examples of this. Similarly, the idea that the book is going to come up with an answer to the question of God - although alluded to on a number of occasions, and perhaps even hoped for at some point during the writing of the book - is something Ellenberg eventually has to admit he isn't really able to address.

In spite of this admission, there are still some interesting observations left to cover.

The next section looks at cause and correlation, which it has to be admitted is fascinating. Ellenberg gives a very nice illustration of why correlation is a problem. Imagine three people, each with different investments.

Let's say A, B, C and D are different business investments.

Laura has investments in A and B.
Sara has investments in B and C.
Tim has investments in C and D.

If C does well then - other things being equal - Sara and Tim do well.
But if B does well then Laura and Sara do well.

So if Sara does well, what does that imply about how well Tim or Laura will do? Or if Laura does well, what does that say about how well Sara will do?

Clearly even with this fairly simple model it is very difficult to work out a law of cause and effect, even though we can observe various correlations. Yet how often do we read articles in the paper about some miracle cure - some group took drug A and they were cured - but was that cause or correlation?
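
A sketch of the investment example with made-up numbers (mine, not Ellenberg's): no one's results cause anyone else's, yet the shared holdings create correlations, and the pattern of correlations alone cannot tell you what drives what.

    import random

    random.seed(5)

    def correlation(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    laura, sara, tim = [], [], []
    for _ in range(10_000):
        a, b, c, d = (random.gauss(0, 1) for _ in range(4))   # independent investment returns
        laura.append(a + b)     # Laura holds A and B
        sara.append(b + c)      # Sara holds B and C
        tim.append(c + d)       # Tim holds C and D

    print("corr(Laura, Sara) = %.2f" % correlation(laura, sara))   # about 0.5 - they share B
    print("corr(Sara, Tim)   = %.2f" % correlation(sara, tim))     # about 0.5 - they share C
    print("corr(Laura, Tim)  = %.2f" % correlation(laura, tim))    # about 0 - nothing shared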

It seems strange to us now, but a founding hero of modern statistics, R.A. Fisher, argued in 1958 that the evidence was equivocal as to whether smoking caused lung cancer or lung cancer caused smoking - he genuinely felt that while there was a correlation between the two, the causal relationship was not established.

"It isn't a way not to be wrong about everything" - indeed.

The section "it's not always wrong to be wrong" makes the reasonable point that governments can't have absolute certainty; they have to consider that a judgement which errs on the side of caution is probably the better way to go - they might sometimes be wrong, but better safe than sorry. Hence it isn't always wrong to be wrong.

The next section is about democracy and public opinion. Ellenberg makes the point that you can't just take a majority opinion on policy - a majority might be in favour of cutting taxes and also in favour of increasing public spending. That sounds as if the public doesn't really know what it wants, but further investigation reveals that individual members of the public generally have pretty consistent political opinions; it is only when they are aggregated together that you get inconsistencies.

Ellenberg then looks at electoral systems. We all know first past the post is problematic when more than two parties are involved, so what is the alternative? In a fascinating discussion he shows that each system has its own drawbacks depending on circumstances: you can come up with examples which favour one electoral system, but then other examples which reveal problems with it. Unfortunately, after a look at the different options there doesn't seem to be a single obvious right answer - or none suggested by Ellenberg, anyway.
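
One of the classic difficulties can be shown with just three voters and three candidates (a minimal Condorcet-cycle example of my own, not a case taken from the book): every candidate loses a head-to-head contest to someone, so "what the majority wants" is not even well defined.

    from itertools import permutations

    # each voter ranks the candidates from most to least preferred
    ballots = [("A", "B", "C"),
               ("B", "C", "A"),
               ("C", "A", "B")]

    def prefers(ballot, x, y):
        return ballot.index(x) < ballot.index(y)

    for x, y in permutations("ABC", 2):
        wins = sum(prefers(b, x, y) for b in ballots)
        if wins > len(ballots) / 2:
            print(f"a majority prefers {x} to {y} ({wins} of {len(ballots)} voters)")

The output is a cycle: a majority prefers A to B, B to C, and C to A - so no candidate can claim to be the majority's choice.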

Towards the end, Ellenberg addresses "formalism", which he has mentioned on and off throughout the book. The idea is that mathematics is what mathematicians define it to be; to imagine that mathematics is discovering the "truth" about something is quite mistaken, argues Ellenberg. He claims that while a few hundred years ago mathematicians might have got into all sorts of tangles by asking the "meaning" of different concepts, mathematicians have now realised it isn't the "meaning" they should be concerned about when discussing a concept, but what the best definition of the concept should be - there is no "right answer" beyond what they decide is the most useful definition.

His examples may strike many readers as not making his argument particularly well. He explains that formalism in law means that if the judge says the accused is a murderer then he is, "whether he killed anyone or not" (p.402), and that if an umpire rules a hit a home run then it is a home run, even if everyone in the stadium thinks it isn't because a teenager caught the ball and the umpire missed it.

"Formalism has an austere elegance. It appeals to people like G. H. Hardy,  Antonin Scalia and me, who relish that feeling of a nice rigid theory shut tight against contradiction" (p. 404).

Ellenberg finishes with a personal story about some work he did to project the spread of tuberculosis - the person paying him wanted to know how many people would have it by 2050. Ellenberg had to conclude from examining all the data that it was impossible to say - there were just too many variables - but this wasn't acceptable to the client, and after some discussion Ellenberg had to give him a figure. Of course it wasn't a meaningful figure; it is just that this person wouldn't accept that the right answer was that the precision he wanted couldn't be had.

He then moves on to the story of Nate Silver during the 2012 presidential election. While different pundits were saying Obama would win or Romney would win, all Silver would give were probabilities for each of the candidates - how likely, on the figures, each candidate was to win. No matter how many times he was asked "who will win?" he kept trying to explain that this was the wrong question - we can only give probabilities, not predictions - and for Ellenberg this is reason to hope that people will start to appreciate a more mathematically aware way of understanding the world.

Having reviewed many of the "mathematical" arguments and observations of the book, I will address the other two areas I mentioned at the beginning.

First - that there is more to being right than knowing mathematics.

Francis Galton (1822-1911), a mathematician quoted frequently in this book, was a proponent of eugenics and indeed the inventor of the term; he believed the "feeble-minded" and "paupers" should be forcibly sterilized.

As in most other cases of novel views, the wrong-headedness of objectors to Eugenics has been curious. The most common misrepresentations now are that its methods must be altogether those of compulsory unions, as in breeding animals. It is not so. I think that stern compulsion ought to be exerted to prevent the free propagation of the stock of those who are seriously afflicted by lunacy, feeble-mindedness, habitual criminality, and pauperism, but that is quite different from compulsory marriage. How to restrain ill-omened marriages is a question by itself, whether it should be effected by seclusion, or in other ways yet to be devised that are consistent with a humane and well-informed public opinion. I cannot doubt that our democracy will ultimately refuse consent to that liberty of propagating children which is now allowed to the undesirable classes, but the populace has yet to be taught the true state of these things. A democracy cannot endure unless it be composed of able citizens; therefore it must in self-defence withstand the free introduction of degenerate stock.

R.A. Fisher (1890-1962) is described by Ellenberg as "the founding hero of modern statistics", and argued that lung cancer might cause smoking rather than the other way round.

Is it possible, then, that lung cancer - that is to say, the pre-cancerous condition which must exist and is known to exist for years in those who are going to show overt lung cancer - is one of the causes of smoking cigarettes? I don't think it can be excluded. I don't think we know enough to say that it is such a cause. But the pre-cancerous condition is one involving a certain amount of slight chronic inflammation. The causes of smoking cigarettes may be studied among your friends, to some extent, and I think you will agree that a slight cause of irritation - a slight disappointment, an unexpected delay, some sort of a mild rebuff, a frustration - are commonly accompanied by pulling out a cigarette and getting a little compensation for life's minor ills in that way. And so, anyone suffering from a chronic inflammation in part of the body (something that does not give rise to conscious pain) is not unlikely to be associated with smoking more frequently, or smoking rather than not smoking. It is the kind of comfort that might be a real solace to anyone in the fifteen years of approaching lung cancer. And to take the poor chap's cigarettes away from him would be rather like taking away his white stick from a blind man. It would make an already unhappy person a little more unhappy than he need be.

Ellenberg has to admit:

"What can I say? Mathematics is a way not to be wrong, but it isn't a way not to be wrong about everything. (Sorry no refunds!) Wrongness is like original sin; we are born to is and it remains always with us, and constant vigilance is necessary if we mean to restrict its sphere of influence over our actions. There is real danger that, by strengthening our abilities to analyze some questions mathematically, we acquire a general confidence in our beliefs, which extends unjustifiably to those things we're still wrong about" (p.335)

The third point I'll make regards what Ellenberg calls "formalism". He says that much philosophical discussion about what mathematics is - back in the eighteenth century and perhaps before - was about what mathematical concepts "mean": things like whether you can really have the number zero, or negative numbers, or what zero to the power of zero means. He suggests modern mathematicians have given up trying to find the "meaning" of concepts and realised that all a concept means is how we choose to define it - we aren't trying to get knowledge of some hidden world of mathematics, but simply deciding for ourselves what the most useful and practical definitions are. We no longer ask what a concept means, but what the most useful definition of it is.

He gives other examples of formalism - in law, for example, provided the court has followed all the right procedures, then when it finds someone guilty of murder they are a murderer, whether they killed anyone or not.

Similarly in sport if a referee or umpire gives a ruling, that's what happened, no matter what the supporters or even video cameras saw of the event.

Ellenberg regards this "formalism" as very clear and simple - we have rules and procedures, we apply them, we get a result - everything is straightforward and everyone knows how we got our result.

But isn't something wrong with this? We might as well say that words mean whatever we want them to mean - a rose by any other name would smell as sweet. Yes, there is an arbitrariness in how we define things, but that doesn't mean the thing itself is arbitrary, or that because we define something in a certain way, that's what it is.

Some definitions are better than others because they more closely describe the thing they are defining. Judges and referees get things wrong, and they don't change reality by a wrong decision. This might be a messier and less clear-cut way of engaging with the world, but it is how the world is. A formalistic approach is mechanical, and while simplifying things can be incredibly useful, we can only simplify so far - the world itself is incredibly complex and we have to be faithful to that complexity.

The book contains a number of interesting examples of people being mathematically wrong, explaining how they were wrong and what the correct mathematical view was, but Ellenberg is all too aware that while a mathematical and scientific approach is necessary in some areas, it isn't sufficient in life as a whole.

One of the themes of the book is the question of God. Ellenberg suggests early on that the question will be addressed later, as if the sort of mathematical solutions we are being shown in the book could apply to it, but when the topic is finally discussed he has nothing very interesting to say on the subject and has to admit that this isn't something mathematics can address.

Modern life is haunted by the knowledge that things of the spirit cannot be answered by science and mathematics - there is a desperate, tragic desire in some scientists and mathematicians to extinguish our search for God and meaning, or to claim it is irrelevant and mistaken, but we know deep within ourselves that this is not the case, and in some form our search for spiritual balance and wholeness will continue.

Books like this can be entertaining and informative but this is another example of maths and science trying to reach further than they are capable of.