Sorry–it was just a little too name-droppy, heavy on dish and light on substance or intrigue. 150 or so pages in, and I gave up. Not enough to keep my interest.
I've had this sitting in my collection unfinished for years now. I was once so excited about it (I'm a huge fan of Gore on the whole), but it's essentially a book-length indictment of Bush rather than a real defense of reason-based policy. That's fine, but now it's 2010, the damage of the Bush administration is not news, and though it must never be forgotten, I'm full for now. Shelving this about 2/3 of the way through.
Excerpt: Part of what makes this book less compelling is its lack of unity of purpose. There are chapters focused on debunking attacks from religionists and theologians, then sections summarizing some of the more influential religious and philosophical movements, all peppered with a surprising amount of synopses of his previous works as well as other people's. Though when tackling a subject, Stenger can be refreshingly succinct and sharp, the book as an experience lacks a flow that leads the reader anywhere in particular. Stenger also makes an odd habit of insinuating his own works into the existing New Atheist pantheon (pardon the term)–while I have no quibbles with him being considered a “Fifth Horseman,” it was a little uncomfortable to read, several times over, that he seemed to think anyone else thought him so. This is not in any way to denigrate the value of his work and contributions, but merely a comment on his choice to write about them in this particular manner. Also somewhat jarring is Stenger's fairly awkward use of more colloquial language that often seems to come from nowhere and feels out of place in the context of the grand ideas being discussed.
For the full review, visit my column at http://www.examiner.com/examiner/x-4275-Secularism-Examiner~y2010m1d15-Book-Review-Stengers-New-Atheism–Uneven-but-uncompromising#
A brief take from my blog Near Earth Object:
I've just read Nicholson Baker's take on the first years of World War II, Human Smoke, and it is certainly unsettling. But I have come across a couple of reactions to the book of late that complain that Baker is trying to convince the reader that WWII was a bad war that should never have been fought, and that Churchill and Roosevelt were as bad as Hitler. This leads to a pretty much categorical dismissal of the entire work. Here's a bit from the New York Times review:
Muddled and often infuriating, “Human Smoke” sounds its single, solemn note incessantly, like a mallet striking a kettle drum over and over. War is bad. Churchill was bad. Roosevelt was bad. Hitler was bad too, but maybe, in the end, no worse than Roosevelt and Churchill. Jeannette Rankin, a Republican congresswoman from Montana, was good, because she cast the lone vote opposing a declaration of war against Japan. It was Dec. 8, 1941.
[ . . . ]
Almost unbelievably, Baker includes multiple instances in which Churchill and Roosevelt rejected the idea of negotiating with Hitler. Although he offers no commentary on the matter, the reader is forced to draw the conclusion that negotiation was a sensible idea cavalierly tossed aside by leaders who preferred war to peace.
(From Near Earth Object)
Apart from some interesting bits about the challenges presented by, and the romanticism associated with, various writing tools and implements, Dennis Baron's A Better Pencil: Readers, Writers, and the Digital Revolution is a very repetitive book with little to say. Essentially, Baron gives laborious, truly unnecessary explanations of some of the most common and basic means of writing — from pencils and typewriters to Facebook and IMs — fit only for those to whom these technologies are totally alien (so perhaps it will be of use to folks 500 years from now).
On the positive side, there's a point made by Baron that, while not needing a book's length to make, is important and worth remembering: Every new means of setting words down has elicited both an exciting expansion of the ability to write and publish and anxiety over the alleged dire consequences for our culture. And every time, we seem to agree that the advance was worth the ensuing mess and uncertainty. But it's fun to note that yes, even the pencil seemed a bridge too far for some folks, and we can keep that in mind when we wonder at the wisdom of Tumblr and Twitter.
Baron also uses the book as a clumsy sledgehammer to attack those he sees as Luddites and tech skeptics. I'm sympathetic to Baron's position, certainly, but it's not enough to save the book. Interestingly, Baron may be one among a very rare species: the pro-technology curmudgeon.
Skip this one, at least for the next 500 years.
Kirsch writes on a terribly important subject, if only he would keep his focus upon it. The first half or so of God against the Gods is an eye-opening exploration of the differences and conflicts between monotheistic and polytheistic religions, and certainly concludes that the polytheists, while not perfect, were on the whole far more tolerant and far less murderous than the Abrahamic religions that sought to eradicate them.
Almost equally valuable is the history lesson Kirsch provides, weaving threads of connection between the monotheism we understand today and its probable birth in ancient Egypt. We learn particularly about the somewhat final showdown in Rome between Christians (Constantine and descendants) and pagans (Julian).
If only it were so. Though a fascinating read in and of itself, the book becomes a kind of historical narrative about Constantine-era political intrigue. Yes, the religious aspect is central, but the book careens from an overview of the conflict between two theologies to a truncated history book on the bloody chess game played between Roman Augusti and Caesars. I would love to read that book, but not here.
Certainly there was more to explore beyond Julian concerning mono-vs.-polytheism, even into the modern world. Why stop so short?
In all, a worthwhile read, though expect to go somewhat off track halfway through.
From my blog Near Earth Object.
Readers of my blog may already be aware of my deep affection for the thousand-plus-page tome The Rise and Fall of the Third Reich, journalist William Shirer's invaluable 1960 history of Hitler and his Germany. It was with great delight, then, that I was made aware of a history of that history, Steve Wick's The Long Night, telling the story of Shirer's years covering the tumult in Europe, mostly from the eye of the storm itself, Berlin.
Though I feel it is missing a crucial chapter, it is a stirring tale. As Wick himself notes, it reads as much more of an adventure tale than a formal history or biography. Shirer struggles daily for over a decade with Nazi censorship, separation from his wife and child, a lack of support from his employers back home, his deep disappointment with the German people, and his own hubris and failings.
We learn a great deal about the mindset of the period, as Shirer was a tuned-in, worldly journalist who had come from extremely humble, rural beginnings. Of particular note to some of this blog's readers is Shirer's impression of the Scopes “Monkey Trial,” the event in American history that in many ways began the culture wars in which we struggle today:
As Shirer saw it, the drama unfolding in Tennessee in anticipation of the upcoming trial was reason alone to take leave of his country. “I yearned for some place, if only for a few weeks, that was more civilized, where a man could drink a glass of wine or a stein of beer without breaking the law, where you could believe and say what you wanted to about religion or anything else without being put upon, where inanity had not become a way of life, and where a writer or an artist or a philosopher, or merely a dreamer, was considered just as good as, if not better than, the bustling businessman.”
Even then, the willfully ignorant mob was making the rest of civilization feel unwelcome, just as the Tea Party imbeciles do today. Indeed, even Shirer's struggles with a supposed journalistic need for “balance” over a human being's honest impression rings true today. And like today, honesty did not always win the day over bland neutrality:
As for Hitler's speech proposing peace for Europe, Shirer knew it was a lie. He was disgusted with himself for not declaring it so flat out. But he knew he could not, nor could he find a German outside the government to say it, and the frustration ate at him. “The proposal is a pure fraud, and if I had any guts, or American journalism had any, I would have said so in my dispatch tonight,” he wrote. “But I am not supposed to be ‘editorial.' ”
But as a fan of Shirer's definitive work, I concluded my reading with a slight sting of disappointment. Wick omits from his tale the writing of Rise and Fall; the process of putting this all-important book together is almost totally absent. Wick himself tells us near the book's end that to do so would mean a wholly separate volume. “A biographer will someday write the story of the enormous hurdles Shirer had to climb to sell the book,” demurs Wick, and one can't help but wish that this hypothetical book existed within the one in hand.
What a herculean effort it must have been to pen such a book! Ten years of Shirer's life were poured into it, and its influence will be felt for generations. Surely, this story deserves to be told as fully as the formative experiences in Europe that led to the book's genesis. It is not Wick's fault that this is missing (though having the words “The Rise and Fall of the Third Reich” in the subtitle does lead one on), but its absence is palpable and deflating.
That said, the book as it is holds up, and it is a story that needed to be told. We learn so much about what it means to be a journalist, a pro-democracy American, a liberal, and a vulnerable human being caught in a volatile, insane world.
Though her politics, her insane desire to see social progress reversed, and her obvious antipathy for anyone with a college degree make me want to chop off my hands and bleed to death, this is still a useful, educational, and yes, witty and well written account of the Reagan White House - most particularly in regard to the art of writing in a political fishbowl.
I couldn't bear it. Terrible as far as I got, three chapters in. See my full critique and struggle: http://www.examiner.com/x-4275-Secularism-Examiner~y2009m12d15-Impatience-with-Frank-Schaeffer-or-my-failure-to-review-a-book
[See my full review at my Examiner.com column: http://www.examiner.com/examiner/x-4275-DC-Secularism-Examiner~y2009m5d10-Book-review-The-Atheists-Way-is-a-crucial-message-in-a-light-tome]
. . . Maisel's important mission is to help atheists face the truth of their circumstances, and in his book he gives some guidance as to what to do once those circumstances are honestly understood. His message, I found, is crucial. His execution, however, is somewhat flawed, if nobly so.
This book offers a vital message that I think any nonreligious person needs to hear, even if they don't realize they need to hear it: There is no inherent “meaning of life,” existence really is a random, pointless phenomenon, and any meaning for which we may pine must be created by ourselves. Maisel levels with the reader, and insists that we establish our own parameters and values based on our consciences and intelligence, and encourages us to live these to our best ability. . . .
More here: http://www.examiner.com/examiner/x-4275-DC-Secularism-Examiner~y2009m5d10-Book-review-The-Atheists-Way-is-a-crucial-message-in-a-light-tome
Jack Lynch's fascinating book, The Lexicographer's Dilemma, is full of original insights, refreshing perspective, and delightful trivia about our mother tongue. It spans history and academia to lend understanding to what it means for a word to be considered an “official” part of the English language. The gist, as you might surmise, is that there is no such thing as the official version of the language. Dictionaries and pedants have over the centuries set down guidelines about propriety, some more sternly than others, but on the whole, the language is an ever-evolving, gelatinous swarm of words, idioms, and ideas. Lynch would have it no other way, and has little regard for those prescriptivists who attempt to nail it down.
[Note: this review was originally published at my blog, Near-Earth(dot)com. Visit, won't you?]
To give an idea of the book's overall theme, see Lynch's take on the word/non-word “ain't,” which he describes as...
. . . the most stigmatized word in the language . . . [which] every five-year-old is taught is not a word. But why not? Just because. It originally entered the language as a contracted form of am not (passing through a phase as an't before the a sound was lengthened) and first appeared in print in 1778, in Frances Burney's novel Evelina. We have uncontroversial contractions for is not (isn't) and are not (aren't), so what's wrong with reducing am not to ain't? The problem is that it was marked as a substandard word in the nineteenth century, people have been repeating the injunction ever since, and no amount of logic can undo it. It's forbidden simply because it's been forbidden.
. . . if including everything scientific is impossible, so is excluding everything scientific. Everyone recognizes the need to include some scientific words like fruit fly, koala, carbon, and salt. But why should a lexicographer include daffodil and atom but omit brasolaeliocattleya (a kind of orchid) and graviscalar bosons (theoretical subatomic particles)? There's no difference in the character of the words, only in the familiarity of the things they identify. If some future technological breakthrough makes us all familiar with graviscalar bosons, they'll eventually show up in the major general dictionaries. Until then, they have to remain in the language's antechamber.
The notion that particular words are taboo can probably be traced back to primitive beliefs about sympathetic magic, in which language can be used to injure people at a distance. It's telling that many of our unseemly words are known as curses, since the conception of offensive language seems to have derived from a belief in the power of a malefactor to place a curse on an enemy.
Convention exists, of course it does, but convention is no more a register of rightness or wrongness than etiquette is, it's just another way of saying usage: convention is a privately agreed usage rather than a publicly evolving one. Conventions alter too, like life. . . . Imagine if we all spoke the same language, fabulous as it is, as Dickens? Imagine if the structure, meaning and usage of language was always the same as when Swift and Pope were alive. Superficially appealing as an idea for about five seconds, but horrifying the more you think about it.
If you are the kind of person who insists on this and that ‘correct use' I hope I can convince you to abandon your pedantry. Dive into the open flowing waters and leave the stagnant canals be.
But above all let there be pleasure. Let there be textural delight, let there be silken words and flinty words and sodden speeches and soaking speeches and crackling utterance and utterance that quivers and wobbles like rennet. Let there be rapid firecracker phrases and language that oozes like a lake of lava. Words are your birthright.
(From my blog Near Earth Object)
About halfway through Bill Bryson's At Home: A Short History of Private Life, one can't help but come to a couple of stark conclusions. One, that most of humanity's domestic life, for the vast majority of the time we had domestic lives, was full of suffering and misery the likes of which we moderns can barely imagine. Two, that the tiny percentage of the species blessed with an overabundance of money and/or status have not been content to simply live well, but have wasted vast economic resources to spoil and aggrandize themselves in ways that would make Ozymandias cringe.
Bryson is a wonderful writer, and his storytelling is as usual conversational while remaining high-minded, as he clearly glories in his research and discoveries while allowing the space for the reader to catch up to him.
But his subject, I suppose, necessitated the retelling of these two central themes I've mentioned: The misery of the underclasses (disease, vermin, cold, being overwhelmed by feces, etc.) and the unabated vanity of the rich (who also, it should be noted, were subject to disease and other unpleasantness, but often in Bryson's telling faced ruin by their own ignorance or hubris). But if it is necessary, it is also relentless. Story after story, anecdote after anecdote is a tale that either makes one feel deep pity for those who are crushed under the weight of their poverty or nausea over the largesse of the aristocracy. In between are the triumphs, the brilliant ideas, the advances, but it becomes almost exhausting when one contemplates the mayhem from which the victories emerge.
Here's a good summation from the book, a quote from Edmond Halley (of comet fame), that I feel gets to the heart of the long crawl of human domesticity — human daily life — over the centuries.
How unjustly we repine at the shortness of our Lives and think our selves wronged if we attain not Old Age; where it appears hereby, that the one half of those that are born are dead in Seventeen years.... [So] instead of murmuring at what we call an untimely Death, we ought with Patience and unconcern to submit to that Dissolution which is the necessary Condition of our perishable Materials.
This is reprinted from my blog Near Earth Object.
The edition that I own of William Shirer's The Rise and Fall of the Third Reich advertises that the book is one that “shocked the conscience of the world.” I saw this mainly as an indication of what the book must have meant to a public that might not have been as familiar with the crimes of the Nazis as we are today. And, accustomed as we are now to frequent and thoughtless analogies, from goofy Mel Brooks Hitler parodies to the Soup Nazi, as a society we seem to have digested this period of human history as just that, a period of history, distant and with little relevance.
I think we may be doing a disservice to ourselves. I don't mean to say that this terrible period should not be the subject of humor and satire — it must! — but having now completed Shirer's enormous book, I am beginning to think that we are forgetting too much.
It's easy to say that, for example, the tea-baggers calling Obama Hitler and comparing the health care bill to the Final Solution are out-of-bounds, an example of overheated rhetoric. But in a way, saying that these kinds of comparisons “go too far” really doesn't go nearly far enough. And it may upset some of the more bloodthirsty liberals as well to hear that, yes, even doing a Bush or Cheney-to-Hitler comparison is way, way off base.
Let's not even deal with the Obama/health care comparisons; they make no sense in the least. But the Bush/Cheney comparisons usually stem from the idea that the Bush team was imperialistic, hungry for the resources of other nations, and mainly heartless about who it hurt in its quest for power. Fine. All of that was true of Hitler. But it's also true of just about every other imperial power in human history. You can't be imperial unless you build an empire. You can't build an empire unless you take someone else's territory. You usually can't do that without committing — or at least sincerely threatening — unthinkable violence.
But we use Hitler and Nazism as the standard of human evil for good reason. The Holocaust might be the most evil, horrific event of our species' history even if it had been merely a mass extermination — but it was neither the first nor the last genocide, nor the first or last slaughter of millions, that humans have known. The Holocaust was that plus, if it can be imagined, several additional levels of cruelty; the starvation, the slavery under unimaginable conditions, the insane medical experiments, the sadism of the Nazi captors, and the raw industrialism of the killing — rounding up the populations of already-rotted-out villages and systematically executing whole neighborhoods and families at once, forcing the soon-to-be dead to jump into pits filled with their dead neighbors and relatives before they themselves were murdered.
And when all was lost for the Third Reich, it was not enough to lose the war. Had Hitler had his way in his final days and hours, the entirety of Germany would have crumbled with him, as he ordered every aspect of German life — stores, waterworks, utilities, factories — destroyed so there would be nothing left for the Allies to take. As horrifically as he had treated his enemies, he was about to let the same happen to his own “superior” people for no other reason than pride.
Perhaps it's not worth trying to figure out whether anyone in human history was “worse.” I'm no historian by any means. There are probably men and systems that were more evil but had less opportunity to do such harm (I don't put it past the likes of Al Qaeda or the regime in North Korea to behave so madly and cruelly given the means to do so), and those who may have done more damage and caused more suffering, but are not remembered in the same way. But to trivialize the terror that was Nazism in our daily parlance, to use its imagery as something applicable to our current politics, is to forget. It is to forget the tens of millions who died because of Hitler and his henchmen, and to forget the deep, unspeakable suffering of all those who found themselves beneath the Nazi boot.
And it is to forget what it is that brought Nazism to the forefront of German life. It is instructive that Hitler never succeeded in some violent takeover of Germany, despite attempts to do so. In the end, Hitler achieved power through “official” channels, bit by bit gaining the approval and acquiescence of the government and institutions, and bit by bit exploiting a frustrated and angry populace by stoking its rage, its fears, and its pride. Shirer himself, in a 1990 edition of his 1960 book, wondered whether a then-newly-reunified Germany might be ripe for another similar episode. Twenty years later, his fears have not come true. Not there, anyway.
But he was right to be watchful. To trivialize the Third Reich today is to lose sight of how it could happen again, not in the Obama-is-Hitler sense, but in the sense of a charismatic person or persons taking advantage of a weakened and frightened public and a spineless government, and doing things in their name that they did not think human beings were capable of. It can happen again, but if we don't learn the right lessons from history, we'll miss it. And it will be too late.
All that said, do your brain a favor and take the big chunk of time you'll need to read Shirer's book. Learn something, why don't you.
I can't recommend this highly enough. This is not an anti-religion screed at all, but comes at the topic of religion as a naturally emerging aspect of humanity in a thoughtful, funny, accessible way. It is “New Atheist” only in that it calls for open questioning and research of religion and its utility (and it's written by an atheist).
Very entertaining, very clever, sometimes a wee bit too cute. If you're well versed in science, it should be some pleasant amusement. If you're not, it's a great primer. Lots of wordplay and gags, but on the whole a worthwhile overview of the sciences.
This is the last of the Four Horsemen's books that I have read (and I heartily recommend all of them), and I was putting this one off because I assumed it might be something of a retread – choir-preaching, if you will. Indeed, if you have seen Hitchens debate or appear on television, you've come across a lot of what is in this book. But gathered in one unified tome, God is Not Great is an excellent read, and a quick one at that (though perhaps more like a series of related essays than a single narrative, particularly in the second-to-last section, which is something of a truncation of Jennifer Hecht's “Doubt: A History”). Happily, I can also say that even though it is Hitchens, this book, like those of Dawkins, Harris, and Dennett, is not arrogant, and it is not mean-spirited. Hitchens takes this subject very seriously, sees real consequences to superstition and theism, and makes a hard-nosed, unapologetic case. Confidence is not arrogance, and telling hard truths is not mean. You may find more to disagree with in the more nuanced political positions he takes, but his case against religion is compelling.
A love letter to the wonders of scientific exploration (else it would not be Carl Sagan!), and at the same time a serious and sympathetic look at pseudoscience and credulity, foreshadowing The Demon-Haunted World. A beautiful read, save for a long point-by-point refutation of another author's absurd work. It's there to make an important point, but it's a bit much.
I sure do wish Sagan was still around. I'm sure he had many more books in him.
Robert Wright, in his latest book The Evolution of God, promises up front that he will make a plausible case for the existence of some force or intention behind the universe that could be called “divinity,” and does so in the midst of making a different case altogether: that our notions of the illusory “one true god” (and Wright does call the idea of God an “illusion”) adapt over time to the circumstances of the people believing in him.
On the second argument, he succeeds brilliantly. Not so much in that this is a revelation (is it a surprise to anyone that religious notions change to fit the times and situations of the humans inventing them?), but in the fluid, accessible, and vivid way in which he makes his case and educates the reader. Ninety percent or so of The Evolution of God is utterly engrossing and fascinating in this way.
On the first argument, however, he fails, and it leaves one utterly puzzled.
To read the rest of this review, click here.
‘The Case for God' is a case not made, from my Examiner column.
Few religious thinkers have done more to ease the consciences of spiritual liberals, anti-fundamentalist religious moderates, and functional nonbelievers unwilling to stake any affirmatively atheistic ground than Karen Armstrong. For years she has been making the assertion that her scholarship proves that the “great” monotheisms ought not be associated with the fear, xenophobia, irrational faith in the absurd, violence, or misogyny that they so often encourage, but that they have their “true” foundations in love and tolerance–and anyone who doesn't think so hasn't been doing it right. As much as that assertion causes many skeptics to arch their eyebrows, it at least sounds like a good thing to which the faiths could aspire if they were so inclined. Alas.
Her latest book, The Case for God, is not meant to explain the various faiths' dispositions or ideological foundations, but to convince the reader that the most commonly held notions of God, those of a being that created the universe and “exists,” are false, and that in actuality, God is an unknowable, unfathomable concept for which the very term “existence” is too limited. If you think that sounds like a pretty weak basis for an argument when dealing with such a grand concept's veracity, you're right. And despite Armstrong's impressive breadth of knowledge and her nuanced grasp of various thinkers' positions throughout the generations, her case never adds up.
Part of the trouble, of course, is that her book's premise is challenged by her own explanation of what God is. It is nigh impossible for me to understand how someone can build a case for God if the central thesis is that God is an unknowable pseudo-entity-but-not-really, something that mere humans are wholly incapable of defining. Where does that leave your book?
[For the rest of this review, see my Examiner column here: http://www.examiner.com/examiner/x-4275-Secularism-Examiner~y2010m1d3-Book-review-The-Case-for-God-is-a-case-not-made]
Alan Jacobs, whom readers of this blog (all ten of you) may know from previous references to his excellent blog TextPatterns, has recently released a wonderful book about reading that I simply can't recommend highly enough. The Pleasures of Reading in an Age of Distraction is just the sort of pithy, sympathetic tract that our times demand – it encourages bibliographic exploration, celebrates chance literary encounters, and offers sincere understanding for the would-be “well-read” among us who fear missing out on an overly massive menu of “great works.”
[Note: see more of my writing at Near Earth Object.]
Those chance literary encounters are the subject of this passage, which I found so delightful, and even moving, that I thought I'd share it here.
The cultivation of serendipity is an option for anyone, but for people living in conditions of prosperity and security and informational richness it is something vital. To practice “accidental sagacity” is to recognize that I don't really know where I am going, even if I like to think I do, or think Google does; that if I know what I am looking for, I do not therefore know what I need; that I am not master of my destiny and captain of my fate; that it is probably a very good thing that I am not master of my destiny and captain of my fate. An accidental sagacity may be the form of wisdom I most need, but am least likely to find without eager pursuit. Moreover, serendipity is the near relation of Whim; each stands against the Plan. Plan once appealed to me, but I have grown to be a natural worshiper of Serendipity and Whim; I can try to serve other gods, but my heart is never in it. I truly think I would rather read an indifferent book on a lark than a fine one according to schedule and plan. And why not? After all, once upon a time we chose none of our reading: it all came to us unbidden, unanticipated, unknown, and from the hand of someone who loved us.
. . . people who know what it is like to be lost in a book, who value that experience, but who have misplaced it . . . They're the ones who need help, and want it, and are prepared to receive it. I had become one of those people myself, or was well on my way to it, when I was rescued through the novelty of reading on a Kindle. My hyper-attentive habits were alienating me further and further from the much older and (one would have thought) more firmly established habits of deep attention. I was rapidly becoming a victim of my own mind's plasticity, until a new technology helped me to remember how to do something that for years had been instinctive, unconscious, natural. I don't know whether an adult who has never practiced deep attention—who has never seriously read for information or for understanding, or even for delight—can learn how. But I'm confident that anyone who has ever had this facility can recover it: they just have to want that recovery enough to make sacrifices for it, something they will only do if they can vividly recall what that experience was like.
You don't really understand evolution until you read this book. Probably the best thing I can say about it is that it makes the thing we call “life” – usually described as this ethereal force or spark – far more concrete than one thought possible. Read this, then read The Beak of the Finch by Jonathan Weiner, and then this whole thing we call evolution will make sense on the micro and macro level.
Now don't get me wrong. He's a great writer with an amazing grasp of the subject matter. But lawd, lawd, he could have utilized endnotes or footnotes a TAD more. Very long, very drawn out, overly detailed. I must admit, I had to skim through some of the end because my brain was leaking out my ears. I loved Guns, Germs, and Steel, but this lacks the thrill of that particular discovery. Anyway, a perfect textbook for this subject, but just, well, too much.
A good read, but more importantly, a really solid education: not simply in terms of the history of doubters, but the history of, well, thought. Of philosophy. For someone who didn't quite get the education he might have liked, this book is a great tour through different ways of thinking about the world, freed from the gauze and blur of supernaturalism.
Richard Holmes' tome is aptly titled. It's a wonder, and it takes an age to read it. Right. I wanted to get that out of the way, since the fact of its lengthiness weighs on me as I consider penning a reaction to its substance. It feels really long.
But, as with many efforts, it is worth it. The Age of Wonder is an exhaustive chronicle of the Romantic era of science – indeed, the dawn of the very term. It focuses primarily on a small cluster of main “characters,” beginning with the intrepid Joseph Banks (and his utterly fascinating adventures in Tahiti) all the way through the Herschel lineage (William, his sister Caroline, and William's son John) – and just before Charles Darwin takes his voyage on the Beagle. It is a tale of presumptions shattered, egos inflated and exploded, and orthodoxies forever upended – and not just those of stodgy religionists, but of even the most open-minded of explorers and philosophers. As Humphry Davy, perhaps the most prominent of Holmes' subjects, said, “The first step towards the attainment of real discovery was the humiliating confession of ignorance.” There is a lot of that documented here.
Perhaps the most prominent theme throughout the book, with all of its detailed (often to a fault) recountings of experiments, arguments, and internal struggles, is that of the development of a professional discipline whose aim is more than the sum of its parts. What would eventually be known as science would become a practice not simply of confirming or denying the veracity of hypotheses, but perhaps the one great force ushering humanity beyond its terrestrial and provincial understanding of itself. Holmes summarizes the thinking of Samuel Taylor Coleridge on this subject:
. . . Coleridge was defending the intellectual discipline of science as a force for clarity and good. He then added one of his most inspired perceptions. He thought that science, as a human activity, ‘being necessarily performed with the passion of Hope, it was poetical'. Science, like poetry, was not merely ‘progressive'. It directed a particular kind of moral energy and imaginative longing into the future. It enshrined the implicit belief that mankind could achieve a better, happier world.
The art of living happy is, I believe, the art of being agreeably deluded; and faith in all things is superior to Reason, which, after all, is but a dead weight in advanced life, though as the pendulum to the clock in youth.
Yet it ought not to be altogether condemned. It promises prodigious faculties for locomotion, and will allow us to traverse vast tracts with ease and rapidity, and to explore unknown countries without difficulty. Why are we so ignorant of the interior of Africa? — Why do we not despatch intrepid aeronauts to cross it in every direction, and to survey the whole peninsula in a few weeks? The shadow of the first balloon, which a vertical sun would project precisely underneath it, as it glided over that hitherto unhappy country, would virtually emancipate every slave, and would annihilate slavery forever.
A side note from The Age of Wonder:
There was no general term by which these gentlemen could describe themselves with reference to their pursuits.
‘Philosophers' was felt to be too wide and lofty a term, and was very properly forbidden them by Mr. Coleridge, both in his capacity as philologer and metaphysician. ‘Savans' was rather assuming and besides too French; but some ingenious gentleman [in fact Whewell himself] proposed that, by analogy with ‘artist', they might form ‘scientist' — and added that there could be no scruple to this term since we already have such words as ‘economist' and ‘atheist' — but this was not generally palatable.
The analogy with ‘atheist' was of course fatal. Adam Sedgwick exploded: ‘Better die of this want [of a term] than bestialize our tongue by such a barbarism.' But in fact ‘scientist' came rapidly into general use from this date, and was recognised in the OED by 1840. Sedgwick later reflected more calmly, and made up for his outburst by producing a memorable image. ‘Such a coinage has always taken place at the great epochs of discovery: like the medals that are struck at the beginning of a new reign.'
This argument over a single word — ‘scientists' — gave a clue to the much larger debate that was steadily surfacing in Britain at this crucial period of transition 1830-34. Lurking beneath the semantics lay the whole question of whether the new generation of professional ‘scientists' would promote safe religious belief or a dangerous secular materialism.