Strange New Thoughts

The place where I slam down gauntlets and pick up the pieces.

Wednesday, February 01, 2023

I Will Literally Change This Title When I Literally Remember What It Was Going To Be

             Hey you, don't help them to bury the light
                Don't give in / without a fight

                  - Pink Floyd, “Hey You”, The Wall


This social media wasteland, like some post-apocalyptic scorched earth, sparsely populated with roving pockets of survivors who reminisce wistfully about things like electricity and shops, regularly echoes with the loss of things we unwisely took for granted. The conversation frequently turns not only to those things lost, but to how we lost them in the first place.


I am in love with the English language and its many dialects. Despite its shortcomings (such as the gender-neutral nouns and pronouns that frustrate many for whom English is a secondary language), it teems as much with ringing oratory and towering prose as it does with much more prosaic and parochial forms of expression. I am a guitar player, not a philologist or linguist, so it might seem funny to you that I find fault with its development. But, for the sake of brevity in an article that threatens to test your endurance, I'll just spit it out. (Figuratively.)


What, not “literally”? Why, no. I am surrounded by intelligent, well-educated people who remind me (while they think they're informing me for the first time) that languages evolve, grow, shrink, often becoming or giving birth to new ones. This morning, in an ill-advised, pre-coffee post on Facebook, I lamented the death throes of the adverb “literally” and its less-commonly abused sister adjective, “literal”. Cue the harbingers of change, whose kindly admonitions cite the highly fluid nature of words, phrases and other elements of language. Yes, I'm acutely aware, and not in some curmudgeonly reactionary way, that words change, usually losing their original meanings.


In his pithy and informative essay “The Death of Words”, C. S. Lewis (in a not quite curmudgeonly, reactionary way) laments the gradual loss of meaning that words undergo over the course of generations or centuries. (Please click on the link and read it, if you never have done, or even if you have.) My concern here is not that the word “literally” is rapidly losing its dictionary definition, but that there is no adequate word with which to replace it.


One of my favorite uses of this very specific word occurs in Sheldon Vanauken's article “The Bachelor”, first published in The Hillsdale Review in 1982, then in his book Under the Mercy in 1985, and used here without permission:


A knight bachelor was literally a free lance who used that lance under the flags of others, as opposed to a knight banneret, who led his retainers under his own flag.     

         

Now that we know the origin of the usefully evolved term “freelance”, we can touch on just how “literally” a knight could also be any sort of “lance”. Here the term is an old one, like that in which a “plump of spears” simply meant men armed with spears, or a “squadron of jet fighters” is a group of pilots, each of whom is wearing an airplane. So our knight bachelor isn't a long, pointy stick (although Don Quixote de La Mancha may come close), but a mounted man-at-arms. In an extremely narrow, useless sense, he isn't “literally” a lance of any sort.


This is not what is happening to the word “literally”. I frequently hear things (usually among the less educated) like “I literally died when I got my test back with a D minus,” or “the dance floor was literally on fire.” Now it doesn't take much mental firepower to know that the student with the bad grade isn't speaking from beyond the grave, or that the dance floor was seen to be filled with energetic gyrations rather than going up in smoke. Nor is there any genuine harm in slang or colorful figures of speech. I get it. Words change, people change, cultures change.


But is that any reason to accept all of it? Suppose somebody started using the word “awesome” to describe a stick of gum or a tennis racket, rather than a 50-foot tsunami wave, or a B-52 thundering overhead at treetop level, or John Bonham's end-of-the-world drum intro to Led Zeppelin's “When The Levee Breaks”? Oops, too late. Now if I use the word “awesome” to describe the Grand Canyon or Niagara Falls, the listener as likely as not will think I'm expressing subjective approval rather than awe.


Never mind, we have other words. I confess that I now use the word “epic” where I really mean “awesome”, even though “epic” is itself in a state of flux. A cursory perusal of the thesaurus will reveal words sufficient to convey grandeur, sublimity, majesty. Awesome, dude.


But try that with “literally”. None of the synonyms listed - “actually”, “completely”, “directly”, “plainly”, “precisely”, “really”, “simply”, “truly” - quite conveys the essence, although a couple of them could be misused as I attempted to illustrate above.


This has been bothering me for a long time, but a recent journalistic gaffe is what put me over the edge (figuratively) and made me post my grievance in a dangerously decaffeinated state. The news article in question pertained to a tiny, dangerously radioactive capsule of Caesium-137 lost in the vast Australian outback by a mining firm. Upon its recovery, the authorities reported that they had “literally found the needle in the haystack”. The metaphor is a fitting one.


Except that the word “literally” transforms a metaphor (e.g., “That certainly let the cat out of the bag”) into a lie, unless you literally had a carbon-based cat in a physical bag and someone let it out, in which case somewhere, a cat thanks you. The word “virtually” would have conveyed the essence – they “virtually found the needle in the haystack”, which illustrates the apt comparison without the unnecessary implication that a physical needle in an Australian haystack would warrant an urgent search by emergency services.


Why the fuss, then? Nobody is harmed or killed by the misuse of such a word. A “gentleman”, over the course of many generations, gradually goes from being a landowner to being a man whose manners happen to please us. A “ballad” goes from being a song or poem with a repetitive narrative structure (think “Barbara Allen” or “The Wreck Of The Edmund Fitzgerald”) to being a slow-to-medium tempo pop song. (You get to pick your own here.) So the next time someone tries to explain to me the fluid nature of spoken and written languages, I'm literally going to explode. Ka-BOOOOOOOM.


See what I did there?


No, I'm not going to erupt into a deadly, noisy fireball. “He exploded” means something. (He was very angry and lost his temper.) “He literally exploded” means something unintended, and impossible, unless he's wearing a suicide vest.


C.S. Lewis points out in many of his writings that the perceived inevitability of an event is no reason to therefore accept it. Again, I quote in the knowledge that it is easier to ask forgiveness than permission:


Detestation for any ethic which worships success is one of my chief reasons for disagreeing with most communists. In my experience they tend, when all else fails, to tell me that I ought to forward the revolution because 'it is bound to come'. One dissuaded me from my own position on the shockingly irrelevant ground that if I continued to hold it I should, in good time, be 'mown down'— argued, as a cancer might argue if it could talk, that he must be right because he could kill me.

 

(C. S. Lewis, “A Reply to Professor Haldane”, published in On Stories and Other Essays on Literature, 1966/1975, HarperCollins, pp. 117-118.)

 

It is intrinsically possible (unlike, say, teleportation, or time travel as it is popularly understood) that some sort of public movement could rally the troops and we could give the word “literally” a reprieve. Perhaps something as simple as a public statement by a public figure like Taylor Swift or LeBron James (“You're using the word 'literally' all wrong, yo!”) could give some people pause, inspiring them to think more carefully about what they're actually saying.


I'm not the least bit optimistic. Global society shows little sign of growing subtlety or nuance. But, if I were faced with a scenario in which a news journalist, outside on location reporting on a tornado, were quite literally blown away on camera by the storm (hopefully surviving to broadcast another day), I'd like to be able to say he was “literally blown away” without being misunderstood to mean that he was really, really impressed by the tornado.


I'm tenacious. I've been married for over 30 years. I stay in bands for years on end. I lived in Venezuela for nine years. I don't give up easily. So please, not another word to the effect that I might as well give up. I have other fish to fry. Mmmm...fish...




Sunday, December 26, 2021

I Walked Right Into That One

Or, A Funny Guy's Quest To Develop A Sense Of Humor


I didn't realize when I started school at age five that I was any different from other kids. My likes and dislikes were strong, and I couldn't understand why other kids didn't share my passions for, say, eagles, or reading. It was a surefire road to unpopularity, and I took it, oblivious to the risks. Fast forward to fourth grade, where I met a kid named Donald. Donald was popular, or at least he seemed so to me. (I still owe him a quarter for a Pez™ candy dispenser.) Donald was funny, always joking around, and best of all, he could laugh like Paul Winchell's Tigger (“Hoo-hoo-hoo-HOOOO!”). I resolved at once to be more like Donald, a watershed decision in my life, which has haunted me ever since. I couldn't do the Tigger laugh, but I tried to squeeze the humor out of everything in my path, which must have annoyed everything in my path. I developed the ability to identify and exploit humor wherever it could be found, but also frequently misidentified it and left a trail of damp squibs wherever I went.


If unpopularity had been my goal, I should have been preparing my acceptance speech. Even without the affected humor I would have remained eccentric, scrawny and unathletic. If there exists a primary school where such a child could lead a quiet, un-bullied life, I should be intensely curious as to its whereabouts and details. It certainly wasn't where I went to school, and the cruelty and abuse I endured make some school violence easier for me to understand than I wish it were. I must have survived, as I am writing this 50 years later. The impulse to joke about things also survived, and was so deeply ingrained that it came partly to define me, up until the time that my musical skills gradually began to provide me with an alternative path (slightly more successful) to peer acceptance.


In the 10th grade the most popular kid in my class – handsome, athletic, talented, smart – deigned to confront me in private and, point-blank, gave me both barrels about my flippancy. I was taken aback. Only my mother, herself a comic genius, had ever bothered to tell me I needn't be clever all the time. Now here was the proverbial Homecoming King, possibly risking his reputation by taking a moment to address the Class Nobody with the goal of helping me to shed some of my self-inflicted annoyingness. I could not then appreciate this unmerited gesture, and I'm sure I looked askance and muttered some vague concession. (He went on to be a pastor and I a missionary, but I digress.) I was not cured, but the self-awareness I lacked was, for the first time, at least brought to my attention. One has to start somewhere.


I stayed funny, though. A friend of mine and his girlfriend invited me to watch The Exorcist on HBO with them so as to provide comic relief. It was considered to be the scariest movie then playing, and I delivered Mystery Science Theater 3000-type running commentary to help keep them from entering too deeply into the horror (and almost certainly spoiling the film in the process). My humor sometimes proved convenient, and was allowed to stay.


At twenty-one I was lead guitarist in a fairly popular local band, itself soon to become a casualty of its own volatility. Our lead vocalist/bass player was a clever songwriter and often an insufferable jerk. One day I happened to overhear him telling another bandmate, “Blake has no sense of humor.” WHAT??? Me, Mister Hilarious, no sense of humor? I'll show him, I vowed to myself. The next time we met I saw to it that I was in rare form, firing one-liners and ripostes like naval broadsides. He laughed. I was good. I had a sense of humor! Guess I showed him, didn't I?


Uh, no. I'm not sure how or when it happened, but my insecurity endured (and doubtless endures still), blurring the line between merely being funny and taking myself far too seriously. That is until…okay, I don't know until when. My conversion to Christianity in the early 80s almost certainly played a part, as the Christian is charged “not with thinking less of one's self, but with thinking of one's self less.” I somehow began to see that the bass player's observation about my lack of a sense of humor was based not on my abilities as a real-time gag writer, but on my wounded pride, ever defensive of what dignity I could salvage as a starving musician. In short, I couldn't take a joke.


My new faith required far more of me than a mere realignment of my self-expression; it forced me to face my failings and foibles, but not, it turned out, without mercy, itself a cornerstone of Christianity. I could cease to take myself too seriously, without having to become a somber killjoy, even toward myself. Nonetheless, my default tendency toward jocosity continues to provide me with the challenge of reining it in.


One of my dearest friends, and the smartest guy I know, is, naturally, possessed of a scathing wit. One day he and I jointly concluded that, in spite of our being humorous, neither of us had a sense of humor. This was not so much an observation of any particular personality trait as an indictment of our own characters. We had our work cut out for us.


This distinction is not widely known. I recall an anecdote in which a female acquaintance of a pre-fame Steve Martin, now universally considered to be one of the funniest people in the world, observed to a friend, “Poor Steve – he has no sense of humor.” This was doubtless thought by the author to be a delicious irony, a complete failure to recognize what was soon to be a defining phenomenon in modern comedy. But in her admirable effort to be seen as having been short-sighted (and thereby having a true sense of humor), she fails in her attempt to be wrong. Steve Martin may well have taken himself seriously enough (his excellent autobiography, Born Standing Up, provides all the insight one needs to reach a conclusion), but his famed leaps into affected self-importance, cluelessness and total absence of self-awareness were coolly calculated bids to sacrifice his perceived dignity on the comedic altar. These eventually, and ironically, earned him all the respect one could possibly garner in the entertainment world, facilitating his escape-velocity ascent as an author, musician, screenwriter, director, playwright and serious dramatic actor. If Steve does indeed possess a sense of humor (a humility ideally suited to his dizzying achievements), then its connection with his mere funniness is murky at best.


Am I developing a sense of humor? As with any strong personality trait, this one occurs more readily in some people than in others. I can only identify two moral virtues in myself that come easily to me: one is consideration (i.e., I try to let other drivers change lanes when possible, or not to hinder their progress), and the other is empathy (I can feel others' pain, and can easily be moved to tears by another's). As for the rest, I am naturally impatient, perfectionist, lazy, disorganized, and sometimes just plain stupid (a moral, not mental, deficiency, defined by Christ in Mark 7:22 as “folly”). As for a sense of humor, one needn't be funny (at least not intentionally) to have one, and a good comedian might not. The two are not interchangeable. The great Alan Alda put it best: “'Stop me if you've heard this one...' Pow! Now that's funny.”



(P.S.: Donald, if you're out there, the 25¢ I owe you has, with compound interest, grown to approximately $32, not adjusted for taxes and inflation. You know where to find me.)
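
(For the numerically inclined, here is a minimal sketch of that arithmetic. The 50-year horizon and annual compounding are my assumptions; the post itself only supplies the quarter and the $32.)

```python
# Rough sketch: what annual rate turns a 4th-grade quarter into ~$32 by 2021?
# (The 50-year horizon and annual compounding are assumptions, not the post's.)
principal = 0.25      # the Pez dispenser debt, in dollars
target = 32.00        # the figure quoted in the P.S.
years = 50            # roughly 4th grade to the date of this post

# Solve target = principal * (1 + r) ** years for the implied annual rate r.
implied_rate = (target / principal) ** (1 / years) - 1
print(f"implied annual rate: {implied_rate:.1%}")        # about 10.2% per year

# Sanity check: compound the quarter forward at that rate.
balance = principal * (1 + implied_rate) ** years
print(f"balance after {years} years: ${balance:.2f}")    # about $32.00
```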

Friday, March 27, 2020

Seven Unplanned Steely Dan Moments


A veritable oxymoron, right? The Dan's notorious studio perfectionism would seem to negate the possibility of Messrs. Fagen & Becker ever releasing anything short of their customary stratospheric sheen. A keen ear, however, will detect a number of delightfully human (expensive humans, perhaps, but still human) musical moments that only serve to amp up the Steely Dan mystique.

1. “The Royal Scam”, The Royal Scam, 1976

This rags-to-riches saga features an embarrassment of riches, from Larry Carlton's subversive guitar work to some of the best orchestration on a Steely Dan record. Then, right in the middle of the guitar solo, Donald Fagen clears his throat. Not buried in the mix, as heard during the intro to Jimi Hendrix' “Purple Haze” - this one cuts right through, perfectly in time with the rhythm section. It sounds so aggressive that it actually dovetails with the grim narrative of the song. I bet it's not on the studio chart, though.

2. “Chain Lightning”, Katy Lied, 1975

Rick Derringer makes his second appearance on a Steely Dan record (the first was his slide work on “Show Biz Kids”), underscoring this creepy scenario of a couple of bystanders at a Nuremberg rally. As he digs into the second half of his bluesy 24-bar solo, he bends the 'B' string a minor third, but as it approaches its goal ('E'), it slips out from under his left hand and snaps back into its unbent state, blasting out an unintended staccato A9th chord on the open top strings. (Every guitarist here knows exactly what I'm talking about.) But it fits! A lot of guitarists punched in on 'Dan sessions (the process of resuming – on-the-fly – the recording of a track after the part you wish to keep), and it sure sounds like that's what Rick did here. As Pee Wee Herman might say after crashing his bike, “I meant to do that...”

3. “Black Friday”, Katy Lied

Before assuming his original role as Steely Dan's bassist, Walter Becker had spent years honing his guitar prowess (Donald thought he was black upon first hearing him soloing in another room at Bard College). However, it wasn't until they recorded their cover of Duke Ellington's “East St. Louis Toodle-Oo” that Walter resumed his six-string duties, spelling the wah-wah trumpet with his talk-box guitar. By the time of Katy Lied, Walter was seriously starting to exorcise his guitar demons, exemplified by his venomous solo work on “Black Friday”. After peppering the second verse with face-melting 11th chords, he launches into a murderous 20-bar solo. At the 11th bar he goes for the same 'C#'-to-'E' bend that Rick Derringer tackles on 'Chain Lightning', only, instead of the string escaping from his grasp, the note itself starts to die out. Refusing to accept defeat, Walter furiously grinds the bent 'B' string into the fret with a desperate blues vibrato, causing it to caterwaul back to life, actually increasing in volume. (This is the same technique Jimi Hendrix uses on the intro to “Foxy Lady”.) A heroic save indeed, but one not likely premeditated.

4. “Aja”, Aja, 1977

Steve Gadd's drum performance on this masterpiece is still a benchmark for aspiring drum virtuosos worldwide. What nobody was prepared for was for Gadd to deliver the final goods on the first take. Steely Dan sessions were notorious for hours of unused takes and rotating rhythm sections in search of that elusive pass, so Gadd's flawless delivery not only gave us a track for the ages, but doubtless saved many hundreds, even thousands, of dollars' worth of studio time.

5. “Your Gold Teeth”, Countdown to Ecstasy, 1973

The liner notes indicate that “(i)n this number, several members of the Dan get to 'Stretch Out'”. This they proceed to do, starting off at the sedate tempo of 133 bpm, but before they hit the first chorus they've achieved 137 bpm. By Fagen's electric piano solo on the fadeout, they've ramped up to a manic 146 bpm. Given the Dan's later insistence on click tracks and computer-quantized drum tracks (achieved with the aid of engineer Roger Nichols' WENDEL digital drum machine), this display of wanton rhythmic haste is a decided departure for our usually surgically precise heroes.

6. “Do It Again”, Can't Buy a Thrill, 1972

A clam on a Steely Dan single? The first one? (We won't count “Dallas”/”Sail the Waterway”, which was pulled by ABC on the grounds that the 'A' side sounded too “country”.) Well, it's not exactly a clunker, but it's still there. On the second chorus, when Fagen sings “You go back...”, the chord played is a Cm7, same as the other two choruses, but Walter (who may not have had enough lead vocal in his headphones) plays a G instead. So what? Well, on the other two choruses he lands on the chord root, playing a C. Fortunately, the stray note is harmonically compatible with the chord, and he proceeds to the regularly scheduled next note, unfazed. Apparently nobody noticed, or thought it important, but it remains to this day a stark testament to the limited studio time and recording budgets that the Dan would gradually abandon in favor of records whose final cost would make the most extravagant record label accountant swallow his cigar.

7. “Third World Man”, Gaucho, 1980

Another clam on a Steely Dan record? Gaucho's almost excessive polish would seem to exclude even the possibility of any flaw, but the classic-era Dan's last song, like its opening salvo, shares, amazingly enough, the exact same bass mistake heard in “Do It Again”, only this time it's session great Chuck Rainey running afoul of the chart. At 4:18, where the chord is a Bbm, Chuck inadvertently goes to F, the chord's 5th. Just as Walter did eight years before, Chuck immediately corrects course and plays the chord's root (in this case Bb), continuing smoothly to the Eb with everyone else. How this insignificant slip escaped the baleful scrutiny of Donald and producer Gary Katz might be explained by Walter's absence from the studio, notably during the mixing phase. Having broken his leg in an accident (“The car and I were attempting to occupy the same place at the same time,” he explained), he was relegated to listening to mixes over the phone from his hospital bed. Telephones, whose poor bass response is the very essence of their sound, could not convey the low-end information that might have caused Rainey to be called back to the studio to atone for this infinitesimal slip. However, by this time Steely Dan had become mired in their own perfectionism and Becker's personal problems. It seems possible that, had the stray note been, uh, noted, it would have been waved off as unworthy of further attention, given the writing on the studio walls by that time. Steely Dan would eventually rise from the ashes in a world better suited to their performing and recording needs, but it is the legacy of those first seven albums that confirms the true trajectory of that most subversive of partnerships, that of a Jew and a German set on infiltrating pop music with a musical and lyrical sophistication it never saw coming.




Thursday, January 03, 2019

There's No Such Thing As 70s Music


You'd think the dilemma was as old as the human race, but there is evidence that it's much, much more recent: Your kids think that everything you know is wrong/stupid/outdated. In the days before mass communication and the rise of popular culture, your parents' music was your music. Sure, there might be songs written for children, but these were passed on to their children when they came along. There was no constant shift in styles and trends. The word 'teenager' was centuries, then decades away.  But it was inexorably planning to take over the world. This it attempted first by nonconformity, then by rebellion, then sheer force, then finally by stealth, in which it eventually succeeded. But by now these victorious-yet-obsolete teenagers had new teens of their own, and the cycle was not so much repeated as reinvented. Raccoon coats and ukuleles were eventually supplanted by bobby sox and zoot suits, then...wait a minute. Pop culture has been synopsized by abler writers and cultural historians than myself, so before I go off on "the 78 begat the 45 begat the LP begat the 8-Track", I will narrow my focus to a particular decade whose musical impact resulted from a convergence of factors that cannot be duplicated.

My motive is admittedly ulterior: I have a 17-year-old daughter who is musically talented to the point where she is beginning to recognize that any educational and career choices she makes will need to focus on the development of her creative and performing abilities. Like a legion of other teenagers, she's pretty sure that her parents' taste in music can be little more than one huge, ghastly mistake, to be studiously avoided. Since this aesthetic judgment is based largely on non-musical criteria, there is no possibility of our agreeing that my experience and knowledge amount to any valid defense of the music I grew up with.

There is convincing evidence that one's musical tastes are most strongly formed during adolescence and young adulthood. (My experience, and almost certainly yours, should settle the question to everyone's satisfaction.) My beginnings as a musician occurred right about 1970, a time when Flower Power still asserted itself on the airwaves with songs like "Love Grows (Where My Rosemary Goes)" and Three Dog Night's "Joy To the World". By decade's end these sunny expressions had given way in turn to a stylistic gamut including glitter, prog, funk, disco, punk and New Wave, with rumblings of hip hop and techno making themselves known before 1980. A cursory examination of this quantum leap, comparable to that produced by the 1960s, is in order.

The Sixties, of course, are the musical and cultural egg from which the 'Me' decade hatched. The former has been so closely studied and analyzed as to be clinically proven to  kill 99.9% of penny loafers and poodle skirts on contact. The cultural climate that hatched ten years of fearless experimentation in music and its attendant artistic expressions (i.e., pop art, psychedelia, auto-destruction, cinema verité, etc.) led to multiple conclusions, ranging from morose disillusionment to wide-eyed optimism and the pursuit of spiritual enlightenment. Popular music had become such a reliable cash cow that record companies and executives were sufficiently convinced to sign unknown artists, performing unknown musical styles. By decade's end rock music had become big business, a commodity to be traded, with the last ten years of innovation and cash flow as its justification.

Technology, too, had changed the game almost immeasurably. The Beatles, whose contribution to the decade's musical progress cannot be adequately treated here, pioneered such studio innovations as tracking and overdubbing with headphones, recording electric instruments directly into the recording console (as opposed to with a microphone), automatic double tracking (ADT, now known as flanging), tape loops, sampling keyboards, and backwards recording, all common practice now but virtually unheard of when they were signed to the Parlophone label in 1962. The movement snowballed in their wake, resulting in records by other artists that sounded by turns bigger, brighter, more polished, more aggressive, softer, harder, more modern - in a word, more commercial. Happy listeners, happy recording artists, happy record executives - an oversimplification, yes, but a maturation of sorts, a coming-of-age for a turbulent decade's most cherished pop culture expression.

Just as casual listeners can usually approximate a song's epoch of origin by its sonic signature (i.e., the twangy, spring reverb-drenched guitars and cheesy combo organ that proclaim 'Early Sixties Surf'), there are ostensibly equivalent hallmarks of Seventies music that relegate it to stereotype status: The dry drum sounds, replete with fatback snare and toms with their bottom heads removed, monophonic synthesizers hooting out their then-futuristic sounding lines, phase shifted guitars and electric pianos slushing out the chords beneath, and - if the record company budget allowed (it often did) - a real string section sweetening the deal often beyond reason. Today virtually any producer could duplicate these sounds in an attempt to create 'Seventies Music', and whether or not the results could fool an unsuspecting listener, the latter would at least be able to conclude that the attempt was recognizable, a reasonable facsimile.

But as time marches on, so does (or did) a decade's musical diversity. Having observed the last thirty years of popular musical trends, I perceive a certain closing up, to the point of creative strangulation. Everyone around me is sick of hearing me complain about how everything I hear on the radio - and worse, from the worship teams I play on in church - has the same four chords. There are no accidentals; it's all diatonic. Everything could be played on the white keys of a keyboard, perhaps with the transpose function enabled in case one actually desired to be in a key other than 'C'. (It happens.) Mass media have tightened the noose of acceptable musical styles. Television shows attempting to find the undiscovered 'Next Big Thing' (and to humiliate those who mistakenly believed it might be them) have reduced the musical vision of a generation to a brass ring to be lunged for, rather than a grueling, joyous process in which a garage band (the social unit, not the app) might slog its way to the top. 

This is, of course, a rant, an overgeneralization of a claustrophobic trend riddled with countless, glorious exceptions. But as cream rises, so do noxious gases. Dispensing with the gases for a moment, let's critique the cream in question.  I posit here that the non-category of  'Seventies Music' boils over with an embarrassment of stylistic, harmonic, cultural, creative and accessible riches without precedent, and sadly, without hope of any similar phenomenon happening in our lifetime. Like those that gave us the Beatles and their attendant impact, the convergence of factors that led to the riotous musical supernova of the Seventies can never again be duplicated.

That the year 1970 should demarcate some sort of musical or cultural frontier bears explanation: The preceding ten years had seen cultural and political upheaval never before experienced, inflecting pop with such elements as world music, electronic instruments and processing, stylistic cross-pollination (e.g., country rock, psychedelic blues, raga rock, jazz rock, folk rock, classical rock), setting the stage for imminent breakout pop radio hits from distinct genres such as bluegrass ("Dueling Banjos"), black gospel ("Oh Happy Day"), electronica ("Popcorn", by Hot Butter) and reggae ("I Can See Clearly Now"). A musical emancipation of sorts had paved the way for virtually unrelated genres to jockey for chart position. No chord progression was too opaque, no lyric too weird, no voice too strange.

As the money-grubbing record industry plowed through the era of Watergate, the Energy Crisis and the Carter administration, even second-tier selling genres like progressive rock (think Yes, Emerson Lake & Palmer, etc.) generated sales that compared favorably with the Internet-weakened music biz of today.  (Lest anyone think I'm blaming the Web for this present musical darkness, the gradual decline I write of was well underway even before dial-up.) The bewildering divergence between successful recording artists like Elton John, Patti Smith, ABBA, Bob Marley, Barry Manilow, John Denver, Elvis Costello, the Bee Gees, Aerosmith, the Carpenters and Lou Reed (sorry to have left out so many indispensables) virtually guarantees a staggering lack of distinguishing characteristics that could render a significant portion of the decade's musical output subject to any stereotype other than (a friend suggested this, not me) cocaine usage. From the stunning beauty and complexity of Steely Dan's 'Aja' to the moronic disposability of Rick Dees' 'Disco Duck', from the raging maelstrom of the Sex Pistols' 'God Save the Queen' to the lush sonic whipped cream of 10cc's 'I'm Not In Love', from the  hippy-dippiness of Sammy Johns' 'Chevy Van' to the cloying bombast of Barry Manilow's 'Mandy', from the snappy funk of the Average White Band's 'Pick Up the Pieces' to the nihilistic futurism of Devo's cover of 'Satisfaction', there is simply no common stylistic thread running through the decade's music.

All that was going to change, though – technological innovations in the music industry would soon become catalysts for trends destined to unify musicians, producers and engineers, for better or worse. The early 80s proliferation of personal computers, drum machines, affordable multitrack recorders, MIDI devices, digital audio recording and processing, sampling synthesizers and sequencers would soon have nearly everyone – me included – scrambling to put these developments to work in the service of creativity. But there was a price beyond that of the cost of these toys: A certain sameness, a common thread. Sure, a keen ear could pick out a Fender Rhodes piano or a notch-position Stratocaster on the radio, but now even an unknown, small-time producer like me could hardly turn on the radio without hearing Yamaha's glistening-yet-sterile DX-7 electric piano sound, a fatback snare sample from Roland's GM sample set, or patch 19 on the Alesis Midiverb II reverb unit and thinking, “Hmm, those are the sounds I used on that jingle last month.” The palette of available colors was shrinking at an alarming rate. Even records that didn't use the Linn 9000 drum machine were being crafted so that a real drummer could sound fake. Not even guitars were safe. The proliferation of so-called “Super Strats” (electric guitars based on Fender's venerable Stratocaster design, but with hot-rodded electronics and locking vibrato units that allowed the humblest local player to dive-bomb like Eddie Van Halen and still stay in tune) made the guitars we saw on the now-burgeoning MTV distinguishable from each other mainly by their garish paint jobs. The custom-built pedalboards and racks of effects used by many 70s guitar greats were gradually being replaced by mass-produced multi-effects units whose sonic signature many of us could identify instantly.

And now, perhaps the most glaring example of a sound that screams “80s” like no other – gated reverb. Once Phil Collins and his cronies accidentally stumbled onto a signal path that caused the tail of the reverberation on the drums to be cut short – a sound that cannot occur in nature – producers and engineers everywhere were scrambling to duplicate this huge-yet-claustrophobic sound (perhaps most famously demonstrated on the mammoth drum break before the final chorus of Collins' “In the Air Tonight”.) By 1982 commercially available digital reverb units were putting this sound - prepackaged - into the hands of guys like me who were scrambling to sound like everybody else. Most of my surviving recordings from the 1980s may sound low-budget, but my tools, being those used by most of my colleagues, left their sonic stamp on my music in a way that carbon-dates it as surely as Rubik's Cube and parachute pants.

The ensuing decades have alternately shied away from the homogeneity of 80s music and revisited it in altered forms, grunge rock and Americana/roots rock typifying the former, hip hop and electronica affirming the latter. It is currently de rigueur to use computers in studio recording, with only the astounding diversity of available tools rescuing many of us from the sameness that made so much 80s music so easy to single out on first listen. I can use these very tools to simulate music from any decade, but if a client wants “70s music”, I need clarification. The brainy, intricate heartland prog of Kansas? The glossy, melodic hard rock of Boston? The infectious, jazzy, horn-driven pop of Chicago? The strident proto-metal of Nazareth? The folksy soft rock of America? (If you haven't spotted the pattern in this list, go drink some coffee and report back here.)

As I stated at the beginning, I would never have belabored this point if not for the frustration I've experienced in trying to pass my best knowledge and experience on to a daughter whose musical skills are a source of continual amazement to me. Aristotle quotes Plato as having pointed out “the importance of having been definitely trained from childhood to like and dislike the proper things; this is what good education means.” (Arist. Eth. Nic. 1104a.20). While even my musical education has been mostly informal, it has served me well, and to be able to pass the 70s musical torch to my own offspring would be worth more than having a hit record.


Sunday, February 19, 2017

Churchill and Trump: Separated At Birth?


I recently commented to an old friend that I didn't like President Trump – I offered no specific criticism or rationale; I simply didn't like the man. My friend responded, “Then you wouldn't like Winston Churchill, either,” or words to that effect. I assured him that I had read a number of books about Churchill, whose only real rival as the greatest man of the 20th century is probably Albert Einstein. After the conversation I thought hard: Churchill, Trump – am I missing something? Having concluded that yes, I was, I humbly submit a few parallels I have tried to draw between the two men: 


‌• Both Churchill and Trump are men steeped in history. Churchill was an historian of considerable achievement, having written, among many other things, A History of the English-Speaking Peoples and the six-volume The Second World War. Donald Trump himself has occupied history, having lived at the time of the Korean and Vietnam wars, the Space Race and the Arab Spring, all of which are historically important historical events in history. He too, has written books, including How to Get Rich and Think Big and Kick Ass in Business and Life.

‌• As a young cavalry officer and war correspondent, Churchill gallantly served his country in combat. In South Africa he was captured and interned in a POW camp, from which he escaped, which made him a bit of a hero back in England. Trump was also captured, in 2005, on tape. Both men went on to become heads of state in spite of the adversity of having been captured.

‌• Winston Churchill's wit was legendary. When he met Labour MP and Tory-hater Bessie Braddock at a party in 1946, she told him: "Winston, you are drunk." "Madam," he replied, "you are ugly, and I will be sober in the morning." Trump also is known for his razor-keen witticisms. Referring to the 9/11 tragedy, he quipped, “I was down there, and I watched our police and our firemen, down on 7-Eleven, down at the World Trade Center, right after it came down.” “7-Eleven”? Get it?

‌• Churchill married Clementine, the love of his life, in 1908, and they remained married until his death in 1965. Not to be outdone, Trump has been married three times, in addition to his various extra-marital affairs, thus proving his love over and over again.

‌• As an orator, Churchill has had few equals in modern history. His speeches, seasoned with phrases such as “their finest hour” and “blood, toil, tears and sweat”, galvanized the will of his nation to stand against the evils of Nazi Germany, and have become part of the English language. In Trump we have a latter-day Churchill, giving us such stirring statements as “Part of the beauty of me is that I am very rich,” and “I’ve said that if Ivanka weren’t my daughter, perhaps I’d be dating her."

‌• Doubtless in response to countless observations made to him by mothers about their offspring, Churchill was wont to respond, “All babies look like me.” Trump has also been likened to a baby.

‌• Like many of his time and station, Churchill was susceptible to the paternalistic racism and perceived European superiority common among such men, having once referred to Mahatma Gandhi as “that half-naked fakir.” In order to combat charges of anti-Mexican sentiment leveled against him, Trump was compelled to insist, “The best taco bowls are made in Trump Tower Grill. I love Hispanics!”

‌• Their countries both menaced by the Third Reich, Churchill and the Russian dictator Josef Stalin formed an uneasy alliance, each aiding the other materially and militarily until the Axis was defeated, leaving Britain and the Soviet Union to square off in an ideological standoff that would last for decades. Trump likewise shares a controversial relationship with Russia's Vladimir Putin. Warning against the Russian president's ordering the killing of journalists, Trump cautioned, "He's running his country and at least he's a leader, unlike what we have in this country. I think our country does plenty of killing also." That a great deal of killing takes place in the U.S. cannot seriously be questioned.

‌• Winston Churchill's record of service in public office was extensive well before he became Prime Minister: he had been, among many other things, First Lord of the Admiralty, a member of Parliament, Minister of Defence, Chancellor of the Exchequer, Secretary of State for War, President of the Board of Trade, and Home Secretary. Trump's political aspirations are also the stuff of legend, he having considered running for Governor of New York. He has been a member of several political affiliations, including the Republican, Democratic and Reform Parties, thus establishing his credentials as a public servant.

‌• Trump's respect for Winston Churchill is attested to by his reinstatement of Churchill's bust to a prominent place in the Oval Office. While I have been unable to unearth any direct quotes by Trump on Churchill, he has nonetheless expressed admiration for Saddam Hussein, Kim Jong-Un, Benito Mussolini, and Bashar al-Assad, also national leaders, as was Churchill. 


This is not a scholarly work; rather, I wished to independently affirm my friend's assertion that my distaste for President Trump must translate into an equal disdain for Churchill. I am now faced with a choice: Either admit Trump's greatness as a visionary, courageous statesman and historical figure, or relegate Churchill to the status of an ignorant, narcissistic, lewd, pompous ass. To some this dilemma may seem insuperable, but history must ultimately judge. Out of respect for the office of President of the United States, I hereby abstain. Time will tell. And tell it will.









      Wednesday, June 03, 2015

      BOY, 8, PREDICTS FUTURE

      "We're baffled," say scientists


      “Now my own suspicion is that the Universe is not only queerer than we suppose, but queerer than we can suppose.” - J. B. S. Haldane, 1892 – 1964

      Professor Haldane, an avowed atheist, frequently made observations that any thinking theist could eagerly embrace as God's truth (remember, all truth is God's truth). At the age of eight, before I could think clearly about my own faith in philosophical or metaphysical terms, I experienced a phenomenon that should have at once put me firmly in his camp. But not the atheistic one. Oh, no – this little episode spilled the beans about something apparently outside the Universe altogether.

      Before I ask you to give credence to the recollections of a middle-aged man about something that happened in 1969 (there, I just carbon-dated myself), I will attempt to establish my admittedly subjective credentials. My long-term memory is not infallible, but surely far better than my short-term one. I can remember being baptized as a baby in the Mission San Diego de Alcalá, but I have no idea where I put down the cordless drill two minutes ago, and will, if unchecked, waste the next hour looking for it in vain. My friends are frequently surprised by the details and accuracy of my recollections about our collective past. It's how I'm hardwired. The following recollection of a glimpse into the future cannot be scientifically verified, but it has remained unchanged for decades, and is very simple in sequence.

      As a child I was obsessed with birds, and any close proximity to, or better still, interaction with them filled me with delight. Our family had moved from San Diego, CA, to Bozeman, MT in the summer of '69, just in time for Neil Armstrong to invent that future dance craze, the Moonwalk. I was crestfallen to find that Montana had civilization, and worse still, schools, but I was up for any adventures my new home would offer. Conveniently enough, our first Montana home was across the street from that of three brothers who were the same age as myself and my two younger brothers, with whom we could play or feud as the moment dictated. One morning, right at that moment when we are most likely to recall a dream accurately, I dreamt the following:

      I had gone across the street to our friends' house, to find a magpie sitting on the fence in their backyard. It didn't outwardly appear to be injured or sick, but it apparently couldn't fly, either. It just perched on the back fence, defiantly emitting the standard magpie call, which I gleefully returned. Back and forth we went, until other activities took over, or, in this case, until I woke up.

      Which I did. I got up, dressed, and went across the street, that being my standard procedure at the time. And there, right out of my dream, sat the magpie. It couldn't fly, or I'm sure it would gladly have done so instead of hanging out with the likes of me. It squawked. I squawked back. This went on for a few minutes, and I told the kids there that I had, just minutes before, dreamed that “I was having an argument with a magpie!” I don't remember whether they took any notice of this, or whether they believed me. Eventually my eight-year-old attention span faltered and I spent the rest of the day in the freedom a child enjoyed in those politically incorrect days of summer autonomy.

      I didn't closely analyze this brief-yet-uncanny sequence of events, first dreamed, then realized. I was at an age at which a child has yet to adopt a skeptical attitude toward the Universe, and had indeed been brought up on a mashup of Roman Catholicism and New Age weirdness, either of which might admit a child's dream as a conduit of the supernatural. I didn't read anything into it. I just accepted that my dream had foretold an actual event, and what of it? Didn't we live in a Universe fraught with infinite possibilities? 

      Or did we? I had yet to experience the Materialist view of things, which rules out the supernatural in all its forms. No spirit world, no God, no reincarnation, no Karma, only those things that can be observed and documented using the Scientific Method. Many years passed before I ever met a proper atheist, at least in the U.S. (The ones I encountered in China nearly twenty years later didn't count, having themselves been the product of mass indoctrination by a totalitarian state.) My Universe presumed a God, held me responsible, and, worst of all, threatened to make me reincarnate until I got it right. But it also seemed to allow for less stringent manifestations of the supernatural, in this case a prosaic, pedestrian preview of the immediate future. (My dreams tend toward the wildly irrational and random; this one stuck unimaginatively to the unadorned facts.)

      The skeptics, acting on the assumption that no dream could accurately forecast the future in such concrete terms, have the answer. Billions of people dream every night, for thousands of years on end, dreaming every kind of dream that can be dreamt. (Uh oh, we're going to run out of new things to dream.) Why couldn't it be that one child out of billions might accidentally dream an uncomplicated event that just happens to occur – verbatim - minutes later? (This is known as The Law of Large Numbers.) Oh, and might not I, between waking and dreaming, have heard the distant magpie distress signals coming from across the street, and constructed a dream around them? (A few years later I developed the questionable habit of sleeping with the radio on, and that did in fact interfere with a few dreams.) And wouldn't a boy who loves birds dream about them anyway?

      Enter William of Ockham (1287 – 1347), and his secret weapon, Occam's Razor. (The spelling must here remain unexplored.) His Razor states that “among competing hypotheses that predict equally well, the one with the fewest assumptions should be selected.” In layman's terms, the simplest explanation is often (though not always) the best. The specific details of my dream varied so violently from my “normal” visions (which might range from eggs hopping along a conveyor belt and singing in three-part harmony, to terrors too unspeakable to write here) that I can't simply ignore them. Might I not simply dream I was in school, or, as I sometimes do now, that I'm at my day gig making guitar straps? I bet most of us have useless dreams - I mean, if I'm going to do schoolwork or factory work, I might as well at least get a grade or a paycheck. 

      But a flightless magpie is something few people ever see, and if I'd seen this one before, there's no way I could have forgotten it. And the location – my friend's backyard fence. The squawking back and forth. Okay, I could have gotten that from the dream, but, caught up in the moment, I was more interested in my interaction with a wild bird than I was in the dream I'd just had. Not everything can be boiled down to such a pedantic sequence. Quite simply, my dream accurately foretold a future event.

      For those who insist that my experience cannot be scientifically verified, I say, Amen. No scientific study of clairvoyance or any form of ESP has shown any scientific basis for such phenomena, and indeed, any scientist who ever undertook such an investigation in the first place should have his slide rule broken over a knee, his pocket protector ripped from his lab coat, and be banished to Ken Ham's Creation Museum, forever to wallow in the silliest of pseudoscience. For a case such as my own, we must look somewhere besides science.

      What? Isn't science the basis for, well, everything? Tax codes, Thai kickboxing regulations, bluegrass etiquette, literary criticism, the Geneva Convention, the Golden Rule? So far, so bad. Some things fall squarely outside the realm of scientific inquiry, and, as one who loves science, I will insist on this as firmly as I insist that music theory does not apply to nuclear fission or plate tectonics. To maintain that everything that has ever happened in recorded history has a purely scientific explanation is to prostrate one's self before a false god, one deaf and blind to what it means to be human, conscious, free. The transcendent works of Bach, Beethoven, da Vinci, Rembrandt, Shakespeare, Tolstoy - strictly determinist animal behavior. The selfless courage of Mother Teresa, Nathan Howe, Oskar Schindler - nothing but higher animals obeying their herd instincts. To someone who actually believes in such a world, I have much pity, but very little to say. Nothing I say here will concern them, so they may feel free to read something else.

      Much heavy weather has been made of late concerning the question of whether the existence of God, or any other supernatural entities or phenomena, can be proved or disproved by science. Here I will enlist the input of Stephen Hawking, whose scientific credentials cannot be seriously questioned. In spite of his recent atheistic stance, he nonetheless has yet to recant his statement in A Brief History of Time, where he states, "As far as we are concerned, events before the Big Bang can have no consequences, so they should not form part of a scientific model of the universe. We should therefore cut them out of the model and say that time had a beginning at the Big Bang." (emphasis mine.) If God initiated the Big Bang, science can have nothing to say about it.

      Some occurrences can be proved, but not specifically via the scientific method. But no proof, scientific or otherwise, will convince one whose assumptions will not admit truths they find unpalatable. In Harper Lee's classic novel To Kill a Mockingbird, the upstanding defense attorney Atticus Finch conclusively proves the innocence of a disabled black man falsely accused of raping a young white woman. Nonetheless, the all-white jury returns a guilty verdict, not because they have any reason to fall for the accuser's obviously fabricated testimony, but because this is Depression-era Alabama, and no black man could be believed over any white person. (To his credit, we later learn that one of the jurors dissented vehemently from his fellow white men, and only under duress cast his vote against the innocent-yet-doomed Tom Robinson.) I will here attempt to illustrate how a proof could be ironclad, yet without the aid of the scientific method:

      Suppose you and I are driving down the road in my battered Crown Victoria, and a song comes on the radio – in this case, “Shattered”, from the Rolling Stones' Some Girls album. My band played this back in 1979, and I know all the words, the lament of an Englishman navigating the perils of New York City. (And you thought Sting came up with that idea! For the record, P. G. Wodehouse was writing on that very subject 100 years ago, with hilarious results.) It's mostly a rap, just a bit of melody here and there.

      Well, I can't be prevented from rapping along with Mick Jagger, perhaps a major third lower, just for fun. “My brain's been battered / Splattered all over / Manhattan.” I'll also hammer out Charlie Watts' drum fills, note-for-note, on my steering wheel (while you wish I'd just steer the car instead.) I nail everything in the song, either to your amazement or to your annoyance, but one thing is clear to you – This guy has heard this song before. My driver's seat performance is absolute, ironclad proof that I've heard the song at least once previously.

      Or is it? Doesn't the Infinite Monkey Theorem prove that a Universe full of monkeys, provided with typewriters and reams of paper, will eventually type the complete text of Shakespeare's Hamlet? Mathematically, yes. (Maybe in Brazilian Portuguese, if they have the right sort of typewriter.)

      This, as any idiot can see, ignores the element of improbability entailed by our humanity. What if one of the monkeys, on a roll (“Get thee to a nunnery!”), accidentally starts typing the text of Hunter S. Thompson's Fear and Loathing in Las Vegas instead? Now we have to start over. Bad monkey.

      We needn't look to Shakespeare to give this asinine theory a decent burial. The chance that any of the monkeys, individually or in concert, will ever (in the course of nine hundred billion trillion years) give us even the words “CLOSE COVER BEFORE STRIKING” misses the point in most impressive intellectual fashion.
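
      (For the numerically inclined, here is a back-of-the-envelope sketch of just how remote that chance is. Every figure in it is my own assumption - a 30-key typewriter, a million monkeys, one attempt per second since the Big Bang - not anything supplied by the theorem's proponents.)

```python
import math

# Probability that one run of random keystrokes exactly reproduces the phrase.
phrase = "CLOSE COVER BEFORE STRIKING"   # 27 keystrokes, spaces included
keys = 30                                # assumed keyboard: 26 letters, space, a few extras
p_single = (1 / keys) ** len(phrase)     # about 1.3e-40

# Assumed typing pool: a million monkeys, one full attempt per second,
# typing since roughly the Big Bang (~13.8 billion years).
monkeys = 1_000_000
seconds = 13.8e9 * 365.25 * 24 * 3600
attempts = monkeys * seconds             # about 4e23 attempts

# Chance of at least one exact match (log1p/expm1 avoid rounding 1 - p down to 1.0).
p_any = -math.expm1(attempts * math.log1p(-p_single))

print(f"per-attempt probability: {p_single:.2e}")
print(f"total attempts:          {attempts:.2e}")
print(f"chance of even one hit:  {p_any:.2e}")   # about 6e-17
```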

      Back to my car: If you had some motive for supposing I could not possibly have heard “Shattered” before, you might start looking for alternate explanations for my faithful rendition: He studied the sheet music. (I like that one – I can read music!) Or perhaps I heard some other band do the song exactly like the record. Nice try. (Too close to my having heard the original.) Okay, Large Numbers to the rescue! In this, or perhaps another Universe (Multiverse is a hypothesis intended to include any possibility whatsoever), wouldn't it stand to reason that I might just happen to make all the right sounds with my mouth, and hit the steering wheel in perfect sync with Charlie's lovely, quirky drum fills? Émile Borel, with his proposed legions of monkey typists, might think so. Any court of law, however, could only convict me of having accidentally performed “Shattered” by not only violating Occam's Razor, but also bringing a really stupid assumption (i.e., that I improvised along with the song, predicting the entire lyric and much of the drum part) into the courtroom.



      By attesting to the single most supernatural-appearing event of my life, I am not here attempting to prove Christian doctrine (to which I have nonetheless devoted much of my life) – I am merely asserting that there is more to the Universe than, well, the Universe. Science can prove me neither right nor wrong, but I can think of nothing that could possibly convince me that my dream was not informed, accurately and deliberately, by something completely outside of time and space. Not just something, but (I cannot believe otherwise) Somebody.

      Saturday, January 31, 2015

      Belated Book Review: The God Delusion by Richard Dawkins

      Ever since I read Dr. Francis Collins' statement that he keeps the complete works of Richard Dawkins and Christopher Hitchens on his bookshelf (“One must dig deeply into opposing points of view in order to know whether your own position remains defensible”), I've been nagged by the feeling that I was behaving somewhat like a Hobbit (Hobbits, according to J. R. R. Tolkien, “like books which tell them what they already know.”) I sustained a blow to this mentality many years ago when I read The Creator and the Cosmos by Hugh Ross, which made short work of my fundamentalist “cosmology” (such as the idea of an Earth which is thousands, rather than billions, of years old.) I was relieved rather than shaken – who really enjoys defending the indefensible? Dr. Collins' The Language of God, which helped me to reconcile biological science (including Natural Selection) with my Christian faith, completed the basic process for me, including a firm repudiation of Intelligent Design (ID), which negates the scientific method whenever a difficulty is encountered, replacing it with a miraculous default.

      We live in a time when battle lines are drawn not only between “Creation vs. Evolution” or “Science vs. Faith” but, increasingly, “Young Earth Creation vs. Theistic Evolution”, in which Christians themselves are divided over whether a literalistic interpretation of Genesis trumps all the disciplines of mainstream science. Over and against this stands Richard Dawkins, the Chuck Norris of militant atheism, for whom any belief in God or the supernatural is not merely wrong, but at best misguided, at worst evil incarnate. His magnum opus, The God Delusion (henceforth TGD), has become something of an atheist Bible, and its enormous popularity is significant in an age when atheism has lost much of the stigma that has dogged it for centuries and kept it largely underground. Along with books such as the late Christopher Hitchens' God is Not Great: How Religion Poisons Everything, TGD has emboldened millions of people to abandon beliefs they held for reasons other than deep personal conviction (e.g., family tradition, indoctrination, etc.)

      Church, we had it coming. If half of Dawkins' observations about religious faith are true (I didn't try to keep score), then we are facing a massive reformation of our intellectual, moral and spiritual integrity. My decision to read TGD was, as I indicated, encouraged by Dr. Collins' fearless admonition to face opposing viewpoints head-on. Since there are numerous rebuttals, and even entire books, challenging or purporting to refute TGD, the easy way would have been simply to read one or more of these and use them as ammunition against the atheistic onslaught.

      Having read TGD cover-to-cover, I'm glad I finally took the high road. Dawkins writes well, has a keen sense of humor and does his best (mostly) to be fair and objective. He is an eminent scientist with what he would deem "an appetite for wonder", and he makes many fascinating observations about nature, the universe and humanity. Much to my astonishment, I began to find Dawkins likable, and I found myself laughing with him far more often than at him. He is as worthy an opponent as a thinking person of faith is likely to cross swords with.

      And here I must unsheathe my own. But wait a minute - me, an electric guitar player, take on one of the world's most formidable atheists? Who do I think I am, anyway? I'm no scientist, philosopher or theologian. I don't even have a college degree (which nonetheless didn't prevent me from once creating a university course.) My audacity usually extends as far as taking on musical challenges that I'm not 100% sure I'm qualified for. My confrontation with Richard Dawkins must be of the David vs. Goliath variety, or perhaps merely quixotic.

      There are other ways to confront him – and he is understandably fond of quoting the various cases made against him, especially since they are frequently hate-steeped invectives made all the more disagreeable by their ignorance and outright stupidity. Nobody wants to look bad in a public showdown, and by reprinting his own hate mail, Dawkins underscores his perceived moral, as well as intellectual, superiority by demonstrating the ethical and logical bankruptcy and hypocrisy of some of his more virulent foes. If I am to score any points against him, I'd better leave smear tactics to those who possess no other weapons.

      That's not to say that Dawkins never resorts to unfair tactics himself. He tries hard not to sling too much mud, except at Yahweh, the deity of the Bible, against whom he unleashes his most contemptuous and hostile assault. Incomplete and one-sided though it may be, Dawkins' case against the biblical God should be answered, although it has been treated elsewhere by thinkers far superior to me. Where he skewers statements by Augustine, Martin Luther and many others, I often find myself nodding in agreement. There is seldom, if ever, total agreement in any camp, even Dawkins' own. Far worse, to me, are his frequent attacks on straw men. Many of his favorite victims (e.g., baptism of infants, a mystically-addled Hitler, young earth creationism, experimental prayers for healing) are soft targets against which nearly any thinking religious person could make an equally compelling case. A cursory perusal of the works of C. S. Lewis would have eliminated the perceived need to include many of these in any rational attack on religious faith.

      Unfortunately, Dawkins doesn't face his most formidable adversaries head-on. His only mention of C. S. Lewis, the “apostle to the skeptics”, occurs in a cursory dismissal of Lewis' 'Trilemma' argument (Jesus as liar, lunatic or Lord), in which he asserts that Lewis “should have known better.” Likewise, Dr. Collins, our foremost advocate of BioLogos (the harmony of faith and science), warrants only passing mention, with no real attempt at refuting any of Collins' arguments for faith, despite Collins' wholesale agreement with Dawkins on the question of evolution. Here would have been excellent sport, but Dawkins either dodged these bullets or didn't bother to notice they'd been fired. I will give him the benefit of my doubt here, since the latter seems at least intrinsically plausible. Nobody's perfect.

      My own quibbles with Dawkins involve the farthest-reaching questions, which he attempts to answer but on which he repeatedly comes up short. He argues that religion is not the source of morality, but he needn't have looked any further than Romans 2:14-15 to find that not even Christianity makes such a claim (“For when Gentiles who do not have the Law do instinctively the things of the Law, these, not having the Law, are a law to themselves, in that they show the work of the Law written in their hearts, their conscience bearing witness and their thoughts alternately accusing or else defending them.”) Likewise, his attempts to explain traits such as altruism as holdovers of herd mentality, or misfirings of genetics, fail to address the moral dilemmas posed by something as subtle as cheating on an exam, or on your income taxes. Still worse is his total failure to address perhaps the Greatest Question(s) of All: What is the meaning of life, or of the universe?

      I can cut Dawkins no slack here. That is not to say that he has never pondered the question, or that he is in any way shallow – to the contrary, this guy is deep. I love his sense of awe at the beauty and complexity of the universe, and his insistence upon unraveling as much of it as he, and we, can. Before I pull the metaphysical trigger, I wish to reaffirm my agreement with him that we should aggressively oppose virtually all attempts to limit scientific research and knowledge. I share his contempt for small-mindedness, and no person of faith should fear any advance in scientific knowledge, since any such advance should automatically bolster our faith as it adds to our understanding of creation.

      Dawkins repeatedly insists that the existence or nonexistence of God is a scientific matter, but in no way can I fold, spindle or mutilate my brain into agreeing with him. Since what the Theist means by “God” is by definition something outside of time and space, it can never be detected by using the tools of science. (Here Dawkins declares himself to be at variance with fellow evolutionary biologist Stephen Jay Gould, who insisted that science has nothing to say on the matter.) Never say never, but I say never. Dawkins rightly insists on the picking apart of everything that can be observed, detected or hypothesized. Unfortunately, when something falls outside the realm of observation or other forms of verification, its hypothetical existence must be determined by other means (i.e., revelation, which Dawkins open-mindedly dismisses out of hand.) The Hot Big Bang model of creation, that most mysterious and yet foundational of natural phenomena, can and should be subject to the most rigorous research we can devise. But even if it can be shown to have resulted from a random fluctuation in a quantum anomaly, we are forced (if we are brave enough) to ask where the fluctuation and the anomaly themselves came from. Stephen Hawking's assertion that it all came from the Laws of Physics (“which have always been there”) ignores the fact that laws can do nothing without a patient to act upon. Gravity cannot be the apple it causes to fall. I have long maintained that any child could see this, but not even a wall full of academic credentials can force their recipient to see what they would rather not see.

      TGD makes blessedly little of wishful thinking as a source of belief in God (or a lack thereof), but it should be acknowledged that for every person “deceived” into believing in God through wishful thinking, there might equally be a person hoping like mad that He doesn't exist. As C. S. Lewis observes, the question does nothing to move us toward a logical conclusion, since there are plenty of wishes on both sides. Of far greater portent is the Anthropic Principle (the narrow spectrum of conditions in which carbon-based life can develop), about which Dawkins makes a good deal of heavy weather. The existence or nonexistence of planets with the right moisture, temperature, gravity, etc. to support life is the subject of extensive research and conjecture, but it only enters the theological realm as regards the likelihood, or lack thereof, of life arising spontaneously from non-life. The now-popular Multiverse scenario (in which an indefinite number of universes exist outside our own, elevating the probability of matter morphing into life somewhere) cannot be tested, verified or falsified. The vastness of our own universe surely provides scope enough for life (which we know to have occurred in our own backyard) to materialize, whether accidentally or intentionally. As a straw to be grasped in the attempt to indicate that anything might happen, given enough time and material, Multiverse is (to borrow one of Dawkins' pet names) a cop-out of unimaginable proportions, or, at best, irrelevant. Life already happened; deal with it. We must look at the final chapter of his book to see the unhinged lengths to which he will go in order to defend his position.

      In his invocation of the miraculous-as-theoretically-possible, a marble statue of the Madonna could, through a random flash-mob rebellion against normal molecular movement, wave its hand. (While somebody reliable was watching.) Rather than delve into the near-infinite absurdity of this scenario (conclusively trashed by authorities on all sides of the issue, even his own), I will simply state that Dawkins-as-final-authority is a chimera, just one more example of wishful thinking, of which most or all of us are equally guilty. This colossal cock-up should serve as a warning to those who would set up any man – be it Dawkins, Darwin, C. S. Lewis, Stephen Hawking, Einstein, Karl Marx, Francis Collins, the Dalai Lama, or the Pope – as bulletproof. (I plead guilty myself.)

      I hasten to add that Dawkins is, barring his foibles, an important figure in the worldview debate. Iron sharpens iron. One of the more delightful surprises awaiting me in TGD was his vast appreciation of P. G. Wodehouse, to his mind and mine “the greatest writer of light comedy in English”, not the least reason being (for both of us) his profound biblical, and therefore cultural, literacy. Dawkins justly, and admirably, points out that ignorance of the King James Version of the Bible, a vast source of our culture's rich verbal heritage, would be an impoverishment. For example, when Wodehouse's bumbling hero Bertie Wooster compares his own hangover to Jael driving a tent peg through Sisera's temple (the side of his head, not a building), the inside joke is lost on a biblical illiterate. Dawkins gives us a veritable laundry list of biblical tropes, including “my brother's keeper”, “coat of many colors”, “kill the fatted calf”, “the stars in their courses”, and “the patience of Job”, among dozens of others. Though these concessions in no way diminish his contempt for religion, they, perhaps unwittingly, give it more importance than he intends.

      I believe Dawkins' most glaring oversight to be his total failure to address the greatest questions ever asked: Who am I? What is the meaning of life, and of the universe? Does the universe have a purpose? Whether by brushing these aside or merely forgetting to treat them, he makes what seems to me the worst blunder a man on such a mission as his could possibly commit. I plead an unlearned man's ignorance of the minutiae of various schools of thought regarding the meaning of the universe, but with America's Founding Fathers (for whom Dawkins professes boundless sympathy), I subscribe to the proposition that certain truths can “be self-evident”. If an accidental, irrational universe can somehow “acquire” meaning and purpose, somebody had better spell the process out for me in language that even I, and others even stupider than myself, can understand.

      I applaud Dawkins for his commitment to the values of honesty, compassion and decency, even as he saws off the branch he's sitting on. As for those who attack him with far more hatred and violence than he allows himself toward even the most contemptible or misunderstood of his adversaries, I had best not say what I really think of them. My surprise at being able to enjoy so much of TGD will probably lead me to read his other books. I should very much like to meet him. If I condemn some of his more unkind characteristics, I likewise condemn them in myself (tact being a wildly variable resource in my Asperger's-infested toolbox.)

      I challenge every thinking Christian or theist to read The God Delusion; likewise I challenge all agnostic or atheistic Hobbits to venture beyond the familiar and comfortable and read C. S. Lewis' Mere Christianity or Francis Collins' The Language of God. And I would encourage us all to read The Righteous Mind by Jonathan Haidt, a Jewish atheist who demonstrates more courage and fairness in evaluating opposing sides of the ideological spectrum than I would have dared dream possible. It's a lot of work seeing both sides of the story, but for the Biblical theist it isn't supposed to be optional – “The first to put forth his case seems right, until someone else steps forward and cross-examines him.” (Proverbs 18:17). And as for the atheist or anti-theist? What have you got to be afraid of?