Strange New Thoughts

The place where I slam down gauntlets and pick up the pieces.

Friday, March 27, 2020

Seven Unplanned Steely Dan Moments


A veritable oxymoron, right? The Dan's notorious studio perfectionism would seem to negate the possibility of Messrs. Fagen & Becker ever releasing anything short of their customary stratospheric sheen. A keen ear, however, will detect a number of delightfully human (expensive humans, perhaps, but still human) musical moments that only serve to amp up the Steely Dan mystique.

1. “The Royal Scam”, The Royal Scam, 1976

This rags-to-riches saga features an embarrassment of riches, from Larry Carlton's subversive guitar work to some of the best orchestration on a Steely Dan record. Then, right in the middle of the guitar solo, Donald Fagen clears his throat. Not buried in the mix, as heard during the intro to Jimi Hendrix' “Purple Haze” - this one cuts right through, perfectly in time with the rhythm section. It sounds so aggressive that it actually dovetails with the grim narrative of the song. I bet it's not on the studio chart, though.

2. “Chain Lightning”, Katy Lied, 1975

Rick Derringer makes his second appearance on a Steely Dan record (the first was his slide work on “Show Biz Kids”), underscoring this creepy scenario of a couple of bystanders at a Nuremberg rally. As he digs into the second half of his bluesy 24-bar solo, he bends the 'B' string a minor third, but as it approaches its goal ('E'), it slips out from under his left hand and snaps back into its unbent state, blasting out an unintended staccato A9th chord on the open top strings. (Every guitarist here knows exactly what I'm talking about.) But it fits! A lot of guitarists punched in on 'Dan sessions (punching in being the process of resuming – on the fly – the recording of a track just after the last part you wish to keep), and it sure sounds like that's what Rick did here. As Pee Wee Herman might say after crashing his bike, “I meant to do that...”

3. “Black Friday”, Katy Lied

Before assuming his original role as Steely Dan's bassist, Walter Becker had spent years honing his guitar prowess (Donald thought he was black upon first hearing him soloing in another room at Bard College.) However, it wasn't until they recorded their cover of Duke Ellington's “East St. Louis Toodle-Oo” that Walter resumed his six-string duties, spelling the wah-wah trumpet with his talk-box guitar. By the time of Katy Lied, Walter was seriously starting to exorcise his guitar demons, exemplified by his venomous solo work on “Black Friday”. After peppering the second verse with face-melting 11th chords, he launches into a murderous 20-bar solo. At the 11th bar he goes for the same 'C#'-to-'E' bend that Rick Derringer tackles on “Chain Lightning”, only, instead of the string escaping from his grasp, the note itself starts to die out. Refusing to accept defeat, Walter furiously grinds the bent 'B' string into the fret with a desperate blues vibrato, causing it to caterwaul back to life, actually increasing in volume. (This is the same technique Jimi Hendrix uses on the intro to “Foxy Lady”.) A heroic save indeed, but one not likely premeditated.

4. “Aja”, Aja, 1977

Steve Gadd's drum performance on this masterpiece is still a benchmark for aspiring drum virtuosos worldwide. What nobody was prepared for was Gadd delivering the final goods on the first take. Steely Dan sessions were notorious for hours of unused takes and rotating rhythm sections in search of that elusive pass, so Gadd's flawless delivery not only gave us a track for the ages, but doubtless saved many hundreds, even thousands, of dollars' worth of studio time.

5. “Your Gold Teeth”, Countdown to Ecstasy, 1973

The liner notes indicate that “(i)n this number, several members of the Dan get to ‘Stretch Out’.” This they proceed to do, starting off at the sedate tempo of 133 bpm, but before they hit the first chorus they've achieved 137 bpm. By Fagen's electric piano solo on the fadeout, they've ramped up to a manic 146 bpm. Given the Dan's later insistence on click tracks and computer-quantized drum tracks (achieved with the aid of engineer Roger Nichols' WENDEL digital drum machine), this display of wanton rhythmic haste is a decided departure for our usually surgically precise heroes.

6. “Do It Again”, Can't Buy a Thrill, 1972

A clam on a Steely Dan single? The first one? (We won't count “Dallas”/“Sail the Waterway”, which was pulled by ABC on the grounds that the 'A' side sounded too “country”.) Well, it's not exactly a clunker, but it's still there. On the second chorus, when Fagen sings “You go back...”, the chord played is a Cm7, same as the other two choruses, but Walter (who may not have had enough lead vocal in his headphones) plays a G instead. So what? Well, on the other two choruses he lands on the chord root, playing a C. Fortunately, the stray note is harmonically compatible with the chord, and he proceeds to the regularly scheduled next note, unfazed. Apparently nobody noticed, or thought it important, but it remains, to this day, a stark testament to the limited studio time and recording budgets that the Dan would gradually abandon in favor of records whose final cost would make the most extravagant record label accountant swallow his cigar.

7. “Third World Man”, Gaucho, 1980

Another clam on a Steely Dan record? Gaucho's almost excessive polish would seem to exclude even the possibility of any flaw, but the classic-era Dan's last song, like its opening salvo, shares, amazingly enough, the exact same bass mistake heard in “Do It Again”, only this time it's session great Chuck Rainey running afoul of the chart. At 4:18, where the chord is a Bbm, Chuck inadvertently goes to F, the chord's 5th. Just as Walter did eight years before, Chuck immediately corrects course and plays the chord's root (in this case Bb), continuing smoothly to the Eb with everyone else. How this insignificant slip escaped the baleful scrutiny of Donald and producer Gary Katz might be explained by Walter's absence from the studio, notably during the mixing phase. Having broken his leg in an accident (“The car and I were attempting to occupy the same place at the same time,” he explained), he was relegated to listening to mixes over the phone from his hospital bed. Telephones, whose poor bass response is the very essence of their sound, could not convey the low end information that might have caused Rainey to be called back to the studio to atone for this infinitesimal slip. However, by this time Steely Dan had become mired in their own perfectionism and Becker's personal problems. It seems possible that, had the stray note been, uh, noted, it would have been waved off as unworthy of further attention, given the writing on the studio walls by that time. Steely Dan would eventually rise from the ashes in a world better suited to their performing and recording needs, but it is the legacy of those first seven albums that confirms the true trajectory of that most subversive of partnerships, that of a Jew and a German set on infiltrating pop music with a musical and lyrical sophistication it never saw coming.




Thursday, January 03, 2019

There's No Such Thing As 70s Music


You'd think the dilemma was as old as the human race, but there is evidence that it's much, much more recent: Your kids think that everything you know is wrong/stupid/outdated. In the days before mass communication and the rise of popular culture, your parents' music was your music. Sure, there might be songs written for children, but these were passed on to their children when they came along. There was no constant shift in styles and trends. The word 'teenager' was centuries, then decades away.  But it was inexorably planning to take over the world. This it attempted first by nonconformity, then by rebellion, then sheer force, then finally by stealth, in which it eventually succeeded. But by now these victorious-yet-obsolete teenagers had new teens of their own, and the cycle was not so much repeated as reinvented. Raccoon coats and ukuleles were eventually supplanted by bobby sox and zoot suits, then...wait a minute. Pop culture has been synopsized by abler writers and cultural historians than myself, so before I go off on "the 78 begat the 45 begat the LP begat the 8-Track", I will narrow my focus to a particular decade whose musical impact resulted from a convergence of factors that cannot be duplicated.

My motive is admittedly ulterior: I have a 17-year old daughter who is musically talented to the point where she is beginning to recognize that any educational and career choices she makes will need to focus on the development of her creative and performing abilities. Like a legion of other teenagers, she's pretty sure that her parents' taste in music can be little more than one huge, ghastly mistake, to be studiously avoided. Since this aesthetic judgment is based largely on non-musical criteria, there is no possibility of our agreeing that my experience and knowledge amount to any valid defense of the music I grew up with.

There is convincing evidence that one's musical tastes are most strongly formed during adolescence and young adulthood. (My experience, and almost certainly yours, should settle the question to everyone's satisfaction.) My beginnings as a musician occurred right about 1970, a time when Flower Power still asserted itself on the airwaves with songs like "My Love Grows (Where My Rosemary Goes)" and Three Dog Night's "Joy To the World". By decade's end these sunny expressions had given way in turn to a stylistic gamut including glitter, prog, funk, disco, punk and  New Wave, with rumblings of hip hop and techno making themselves known before 1980.  A cursory examination of this quantum leap, comparable to that produced by the 1960s, is in order.

The Sixties, of course, are the musical and cultural egg from which the 'Me' decade hatched. The former has been so closely studied and analyzed as to be clinically proven to  kill 99.9% of penny loafers and poodle skirts on contact. The cultural climate that hatched ten years of fearless experimentation in music and its attendant artistic expressions (i.e., pop art, psychedelia, auto-destruction, cinema verité, etc.) led to multiple conclusions, ranging from morose disillusionment to wide-eyed optimism and the pursuit of spiritual enlightenment. Popular music had become such a reliable cash cow that record companies and executives were sufficiently convinced to sign unknown artists, performing unknown musical styles. By decade's end rock music had become big business, a commodity to be traded, with the last ten years of innovation and cash flow as its justification.

Technology, too, had changed the game almost immeasurably. The Beatles, whose contribution to the decade's musical progress cannot be adequately treated here, pioneered such studio innovations as tracking and overdubbing with headphones, recording electric instruments directly into the recording console (as opposed to with a microphone), automatic double tracking (ADT, now known as flanging), tape loops, sampling keyboards, and backwards recording, all common practice now but virtually unheard of when they were signed to the Parlophone label in 1962. The movement snowballed in their wake, resulting in records by other artists that sounded by turns bigger, brighter, more polished, more aggressive, softer, harder, more modern - in a word, more commercial. Happy listeners, happy recording artists, happy record executives - an oversimplification, yes, but a maturation of sorts, a coming-of-age for a turbulent decade's most cherished pop culture expression.

Just as casual listeners can usually approximate a song's epoch of origin by its sonic signature (i.e., the twangy, spring reverb-drenched guitars and cheesy combo organ that proclaim 'Early Sixties Surf'), there are ostensibly equivalent hallmarks of Seventies music that relegate it to stereotype status: the dry drum sounds, replete with fatback snare and toms with their bottom heads removed, monophonic synthesizers hooting out their then-futuristic sounding lines, phase-shifted guitars and electric pianos slushing out the chords beneath, and – if the record company budget allowed (it often did) – a real string section sweetening the deal, often beyond reason. Today virtually any producer could duplicate these sounds in an attempt to create 'Seventies Music', and whether or not the results could fool an unsuspecting listener, the latter would at least be able to conclude that the attempt was recognizable, a reasonable facsimile.

But as time marches on, so does (or did) a decade's musical diversity. Having observed the last thirty years of popular musical trends, I perceive a certain closing up, to the point of creative strangulation. Everyone around me is sick of hearing me complain about how everything I hear on the radio - and worse, from the worship teams I play on in church - has the same four chords. There are no accidentals; it's all diatonic. Everything could be played on the white keys of a keyboard, perhaps with the transpose function enabled in case one actually desired to be in a key other than 'C'. (It happens.) Mass media have tightened the noose of acceptable musical styles. Television shows attempting to find the undiscovered 'Next Big Thing' (and to humiliate those who mistakenly believed it might be them) have reduced the musical vision of a generation to a brass ring to be lunged for, rather than a grueling, joyous process in which a garage band (the social unit, not the app) might slog its way to the top. 

This is, of course, a rant, an overgeneralization of a claustrophobic trend riddled with countless, glorious exceptions. But as cream rises, so do noxious gases. Dispensing with the gases for a moment, let's critique the cream in question.  I posit here that the non-category of  'Seventies Music' boils over with an embarrassment of stylistic, harmonic, cultural, creative and accessible riches without precedent, and sadly, without hope of any similar phenomenon happening in our lifetime. Like those that gave us the Beatles and their attendant impact, the convergence of factors that led to the riotous musical supernova of the Seventies can never again be duplicated.

That the year 1970 should demarcate some sort of musical or cultural frontier bears explanation: The last ten years had experienced cultural and political upheaval never before experienced, inflecting pop with such elements as world music, electronic instruments and processing, stylistic cross-pollination (i.e., country rock, psychedelic blues, raga rock, jazz rock, folk rock, classical rock), setting the stage for imminent breakout pop radio hits from distinct genres such as bluegrass ("Dueling Banjos"), black gospel ("Oh Happy Day"), electronica (Hot Butter's "Popcorn") and reggae ("I Can See Clearly Now"). A musical emancipation of sorts had paved the way for virtually unrelated genres to jockey for chart position. No chord progression was too opaque, no lyric too weird, no voice too strange.

As the money-grubbing record industry plowed through the era of Watergate, the Energy Crisis and the Carter administration, even second-tier selling genres like progressive rock (think Yes, Emerson Lake & Palmer, etc.) generated sales that compared favorably with the Internet-weakened music biz of today.  (Lest anyone think I'm blaming the Web for this present musical darkness, the gradual decline I write of was well underway even before dial-up.) The bewildering divergence between successful recording artists like Elton John, Patti Smith, ABBA, Bob Marley, Barry Manilow, John Denver, Elvis Costello, the Bee Gees, Aerosmith, the Carpenters and Lou Reed (sorry to have left out so many indispensables) virtually guarantees a staggering lack of distinguishing characteristics that could render a significant portion of the decade's musical output subject to any stereotype other than (a friend suggested this, not me) cocaine usage. From the stunning beauty and complexity of Steely Dan's 'Aja' to the moronic disposability of Rick Dees' 'Disco Duck', from the raging maelstrom of the Sex Pistols' 'God Save the Queen' to the lush sonic whipped cream of 10cc's 'I'm Not In Love', from the  hippy-dippiness of Sammy Johns' 'Chevy Van' to the cloying bombast of Barry Manilow's 'Mandy', from the snappy funk of the Average White Band's 'Pick Up the Pieces' to the nihilistic futurism of Devo's cover of 'Satisfaction', there is simply no common stylistic thread running through the decade's music.

All that was going to change, though – technological innovations in the music industry would soon become catalysts for trends destined to unify musicians, producers and engineers, for better or worse. The early 80s proliferation of personal computers, drum machines, affordable multitrack recorders, MIDI devices, digital audio recording and processing, sampling synthesizers and sequencers would soon have nearly everyone – me included – scrambling to put these developments to work in the service of creativity. But there was a price beyond that of the cost of these toys: A certain sameness, a common thread. Sure, a keen ear could pick out a Fender Rhodes piano or a notch-position Stratocaster on the radio, but now even an unknown, small-time producer like me could hardly turn on the radio without hearing Yamaha's glistening-yet-sterile DX-7 electric piano sound, a fatback snare sample from Roland's GM sample set, or patch 19 on the Alesis Midiverb II reverb unit and thinking, “Hmm, those are the sounds I used on that jingle last month.” The palette of available colors was shrinking at an alarming rate. Even records that didn't use the Linn 9000 drum machine were being crafted so that a real drummer could sound fake. Not even guitars were safe. The proliferation of so-called “Super Strats” (electric guitars based on Fender's venerable Stratocaster design, but with hot-rodded electronics and locking vibrato units that allowed the humblest local player to dive-bomb like Eddie Van Halen and still stay in tune) made the guitars we saw on the now-burgeoning MTV distinguishable from each other mainly by their garish paint jobs. The custom-built pedalboards and racks of effects used by many 70s guitar greats were gradually being replaced by mass-produced multi-effects units whose sonic signature many of us could identify instantly.

And now, perhaps the most glaring example of a sound that screams “80s” like no other – gated reverb. Once Phil Collins and his cronies accidentally stumbled onto a signal path that caused the tail of the reverberation on the drums to be cut short – a sound that cannot occur in nature – producers and engineers everywhere were scrambling to duplicate this huge-yet-claustrophobic sound (perhaps most famously demonstrated on the mammoth drum break before the final chorus of Collins' “In the Air Tonight”.) By 1982 commercially available digital reverb units were putting this sound - prepackaged - into the hands of guys like me who were scrambling to sound like everybody else. Most of my surviving recordings from the 1980s may sound low-budget, but my tools, being those used by most of my colleagues, left their sonic stamp on my music in a way that carbon-dates it as surely as Rubik's Cube and parachute pants.
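The gate-after-reverb trick described above is simple enough to sketch in a few lines of code. What follows is a toy illustration, not real studio DSP: the "sample rate", the decay rates, and the gate threshold are all arbitrary assumptions, chosen only to show how the gate kills a decaying tail dead instead of letting it fade naturally.

```python
import math
import random

# Toy sketch of gated reverb: a drum hit gets a long artificial decay,
# then a gate mutes the tail outright once it falls below a threshold.
# All parameters here are arbitrary illustrative assumptions.

SR = 1000         # toy "sample rate" (samples per second)
THRESHOLD = 0.05  # gate threshold applied to the reverb envelope

random.seed(0)

def envelope(t, rate):
    """Exponential decay envelope at time t (seconds)."""
    return math.exp(-rate * t)

gated = []
for n in range(SR):  # one second of "audio"
    t = n / SR
    hit = random.uniform(-1, 1) * envelope(t, 40)   # fast natural drum decay
    tail = random.uniform(-1, 1) * envelope(t, 3)   # long, smooth reverb decay
    wet = hit + 0.5 * tail
    # The gate: instead of letting the tail fade away, cut it off abruptly -
    # the "huge-yet-claustrophobic" sound that cannot occur in nature.
    gated.append(wet if envelope(t, 3) > THRESHOLD else 0.0)

muted = [x for n, x in enumerate(gated) if envelope(n / SR, 3) <= THRESHOLD]
print("tail is silenced abruptly once the gate closes:", all(x == 0.0 for x in muted))
```

In a real 80s signal chain the gate was a hardware unit keyed to the dry drum signal; the principle, though, is exactly this: a loud, dense reverb that terminates unnaturally fast.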

The ensuing decades have alternately shied away from the homogeneity of 80s music and revisited it in altered forms, grunge rock and Americana/roots rock typifying the former, hip hop and electronica affirming the latter. It is currently de rigueur to use computers in studio recording, with only the astounding diversity of available tools rescuing many of us from the sameness that made so much 80s music so easy to single out on first listen. I can use these very tools to simulate music from any decade, but if a client wants “70s music”, I need clarification. The brainy, intricate heartland prog of Kansas? The glossy, melodic hard rock of Boston? The infectious, jazzy, horn-driven pop of Chicago? The strident proto-metal of Nazareth? The folksy soft rock of America? (If you haven't spotted the pattern in this list, go drink some coffee and report back here.)

As I stated at the beginning, I would never have belabored this point if not for the frustration I've experienced in trying to pass my best knowledge and experience on to a daughter whose musical skills are a source of continual amazement to me. Aristotle quotes Plato as having pointed out “the importance of having been definitely trained from childhood to like and dislike the proper things; this is what good education means.” (Arist. Eth. Nic. 1104a.20). While even my musical education has been mostly informal, it has served me well, and to be able to pass the 70s musical torch to my own offspring would be worth more than having a hit record.


Sunday, February 19, 2017

Churchill and Trump: Separated At Birth?


I recently commented to an old friend that I didn't like President Trump – I offered no specific criticism or rationale; I simply didn't like the man. My friend responded, “Then you wouldn't like Winston Churchill, either,” or words to that effect. I assured him that I had read a number of books about Churchill, whose only real rival as the greatest man of the 20th century is probably Albert Einstein. After the conversation I thought hard: Churchill, Trump – am I missing something? Having concluded that yes, I was, I humbly submit a few parallels I have tried to draw between the two men: 


‌• Both Churchill and Trump are men steeped in history. Churchill was an historian of considerable achievement, having written, among many other things, A History of the English-Speaking Peoples and the six-volume The Second World War. Donald Trump himself has occupied history, having lived at the time of the Korean and Vietnam wars, the Space Race and the Arab Spring, all of which are historically important historical events in history. He too, has written books, including How to Get Rich and Think Big and Kick Ass in Business and Life.

‌• As a young cavalry officer and war correspondent, Churchill gallantly served his country in combat. In South Africa he was captured and interned in a POW camp, from which he escaped, which made him a bit of a hero back in England. Trump was also captured, in 2005, on tape. Both men went on to become heads of state in spite of the adversity of having been captured.

• Winston Churchill's wit was legendary. When he met Labour MP and Tory-hater Bessie Braddock at a party in 1946, she told him: "Winston, you are drunk." "Madam," he replied, "you are ugly, and I will be sober in the morning." Trump, too, is known for his razor-keen witticisms. Referring to the 9/11 tragedy, he quipped, “I was down there, and I watched our police and our firemen, down on 7-Eleven, down at the World Trade Center, right after it came down.” “7-Eleven”? Get it?

‌• Churchill married Clementine, the love of his life, in 1908, and they remained married until his death in 1965. Not to be outdone, Trump has been married three times, in addition to his various extra-marital affairs, thus proving his love over and over again.

• As an orator, Churchill has had few equals in modern history. His speeches, seasoned with phrases such as “their finest hour” and “blood, toil, tears and sweat”, galvanized the will of his nation to stand against the evils of Nazi Germany, and have become part of the English language. In Trump we have a latter-day Churchill, giving us such stirring statements as “Part of the beauty of me is that I am very rich,” and “I’ve said that if Ivanka weren’t my daughter, perhaps I’d be dating her."

‌• Doubtless in response to countless observations made to him by mothers about their offspring, Churchill was wont to respond, “All babies look like me.” Trump has also been likened to a baby.

‌• Like many of his time and station, Churchill was susceptible to the paternalistic racism and perceived European superiority common among such men, having once referred to Mahatma Gandhi as “that half-naked fakir.” In order to combat charges of anti-Mexican sentiment leveled against him, Trump was compelled to insist, “The best taco bowls are made in Trump Tower Grill. I love Hispanics!”

‌• Their countries both menaced by the Third Reich, Churchill and the Russian dictator Josef Stalin formed an uneasy alliance, each aiding the other materially and militarily until the Axis was defeated, leaving Britain and the Soviet Union to square off in an ideological standoff that would last for decades. Trump likewise shares a controversial relationship with Russia's Vladimir Putin. Warning against the Russian president's ordering the killing of journalists, Trump cautioned, "He's running his country and at least he's a leader, unlike what we have in this country. I think our country does plenty of killing also." That a great deal of killing takes place in the U.S. cannot seriously be questioned.

• Winston Churchill's record of service in public office was extensive well before he became Prime Minister, he having been, among many other things, First Lord of the Admiralty, a member of Parliament, Minister of Defence, Chancellor of the Exchequer, Secretary of State for War, President of the Board of Trade, and Home Secretary. Trump's political aspirations are also the stuff of legend, he having considered running for Governor of New York. He has been a member of several political affiliations, including the Republican, Democratic and Reform Parties, thus establishing his credentials as a public servant.

‌• Trump's respect for Winston Churchill is attested to by his reinstatement of Churchill's bust to a prominent place in the Oval Office. While I have been unable to unearth any direct quotes by Trump on Churchill, he has nonetheless expressed admiration for Saddam Hussein, Kim Jong-Un, Benito Mussolini, and Bashar al-Assad, also national leaders, as was Churchill. 


This is not a scholarly work; rather, I wished to independently affirm my friend's assertion that my distaste for President Trump must translate into an equal disdain for Churchill. I am now faced with a choice: Either admit Trump's greatness as a visionary, courageous statesman and historical figure, or relegate Churchill to the status of an ignorant, narcissistic, lewd, pompous ass. To some this dilemma may seem insuperable, but history must ultimately judge. Out of respect for the office of President of the United States, I hereby abstain. Time will tell. And tell it will.









Wednesday, June 03, 2015

BOY, 8, PREDICTS FUTURE

"We're baffled," say scientists


“Now my own suspicion is that the Universe is not only queerer than we suppose, but queerer than we can suppose.” - J. B. S. Haldane, 1892 – 1964

Professor Haldane, an avowed atheist, frequently made observations that any thinking theist could eagerly embrace as God's truth (remember, all truth is God's truth.) At the age of eight, before I could think clearly about my own faith in philosophical or metaphysical terms, I experienced a phenomenon that should have at once put me firmly in his camp. But not the atheistic one. Oh, no – this little episode spilled the beans about something apparently outside the Universe altogether.

Before I ask you to give credence to the recollections of a middle-aged man about something that happened in 1969 (there, I just carbon-dated myself), I will attempt to establish my admittedly subjective credentials. My long-term memory is not infallible, but surely far better than my short-term one. I can remember being baptized as a baby in the Mission San Diego de Alcalá, but I have no idea where I put down the cordless drill two minutes ago, and will, if unchecked, waste the next hour looking for it in vain. My friends are frequently surprised by the details and accuracy of my recollections about our collective past. It's how I'm hardwired. The following recollection of a glimpse into the future cannot be scientifically verified, but it has remained unchanged for decades, and is very simple in sequence.

As a child I was obsessed with birds, and any close proximity to, or better still, interaction with them filled me with delight. Our family had moved from San Diego, CA, to Bozeman, MT in the summer of '69, just in time for Neil Armstrong to invent that future dance craze, the Moonwalk. I was crestfallen to find that Montana had civilization, and worse still, schools, but I was up for any adventures my new home would offer. Conveniently enough, our first Montana home was across the street from that of three brothers who were the same age as myself and my two younger brothers, with whom we could play or feud as the moment dictated. One morning, right at that moment when we are most likely to recall a dream accurately, I dreamt the following:

I had gone across the street to our friends' house, to find a magpie sitting on the fence in their backyard. It didn't outwardly appear to be injured or sick, but it apparently couldn't fly, either. It just perched on the back fence, defiantly emitting the standard magpie call, which I gleefully returned. Back and forth we went, until other activities took over, or, in this case, until I woke up.

Which I did. I got up, dressed, and went across the street, that being my standard procedure at the time. And there, right out of my dream, sat the magpie. It couldn't fly, or I'm sure it would gladly have done so instead of hanging out with the likes of me. It squawked. I squawked back. This went on for a few minutes, and I told the kids there that I had, just minutes before, dreamed that “I was having an argument with a magpie!” I don't remember whether they took any notice of this, or whether they believed me. Eventually my eight-year-old attention span faltered and I spent the rest of the day in the freedom a child enjoyed in those politically incorrect days of summer autonomy.

I didn't closely analyze this brief-yet-uncanny sequence of events, first dreamed, then realized. I was at an age at which a child has yet to adopt a skeptical attitude toward the Universe, and had indeed been brought up on a mashup of Roman Catholicism and New Age weirdness, either of which might admit a child's dream as a conduit of the supernatural. I didn't read anything into it. I just accepted that my dream had foretold an actual event, and what of it? Didn't we live in a Universe fraught with infinite possibilities?

Or did we? I had yet to experience the Materialist view of things, which rules out the supernatural in all its forms. No spirit world, no God, no reincarnation, no Karma, only those things that can be observed and documented using the Scientific Method. Many years passed before I ever met a proper atheist, at least in the U.S. (The ones I encountered in China nearly twenty years later didn't count, having themselves been the product of mass indoctrination by a totalitarian state.) My Universe presumed a God, held me responsible, and, worst of all, threatened to make me reincarnate until I got it right. But it also seemed to allow for less stringent manifestations of the supernatural, in this case a prosaic, pedestrian preview of the immediate future. (My dreams tend toward the wildly irrational and random; this one stuck unimaginatively to the unadorned facts.)

The skeptics, acting on the assumption that no dream could accurately forecast the future in such concrete terms, have the answer. Billions of people dream every night, for thousands of years on end, dreaming every kind of dream that can be dreamt. (Uh oh, we're going to run out of new things to dream.) Why couldn't it be that one child out of billions might accidentally dream an uncomplicated event that just happens to occur – verbatim - minutes later? (This is known as The Law of Large Numbers.) Oh, and might not I, between waking and dreaming, have heard the distant magpie distress signals coming from across the street, and constructed a dream around them? (A few years later I developed the questionable habit of sleeping with the radio on, and that did in fact interfere with a few dreams.) And wouldn't a boy who loves birds dream about them anyway?
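The skeptics' back-of-the-envelope arithmetic is easy to reproduce. Here's a minimal sketch of it; the one-in-a-billion odds and the head count are pure assumptions on my part, chosen only to show how the Law of Large Numbers does the heavy lifting:

```python
# Illustrative arithmetic for the skeptics' argument above.
# Every number here is an assumption, not a measurement.

p_match = 1e-9            # assumed odds that one night's dream matches the morning verbatim
dreamers = 4_000_000_000  # rough number of people dreaming on any given night
nights = 365 * 10         # one decade of nights

expected = p_match * dreamers * nights
print(f"expected coincidental 'prophetic' dreams per decade: {expected:,.0f}")
```

Even at one-in-a-billion odds, the expected number of purely coincidental "prophetic" dreams works out to thousands per decade, which is the skeptics' whole point: somebody, somewhere, was bound to argue with that magpie.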

      Enter William of Ockham (1287 – 1347), and his secret weapon, Occam's Razor. (The spelling must here remain unexplored.) His Razor states that “among competing hypotheses that predict equally well, the one with the fewest assumptions should be selected.” In layman's terms, the simplest explanation is often (though not always) the best. The specific details of my dream varied so violently from my “normal” visions (which might range from eggs hopping along a conveyor belt and singing in three-part harmony, to terrors too unspeakable to write here) that I can't simply ignore them. Might I not simply dream I was in school, or, as I sometimes do now, that I'm at my day gig making guitar straps? I bet most of us have useless dreams - I mean, if I'm going to do schoolwork or factory work, I might as well at least get a grade or a paycheck. 

      But a flightless magpie is something few people ever see, and if I'd seen this one before, there's no way I could have forgotten it. And the location – my friend's backyard fence. The squawking back and forth. Okay, I could have gotten that from the dream, but, caught up in the moment, I was more interested in my interaction with a wild bird than I was in the dream I'd just had. Not everything can be boiled down to such a pedantic sequence. Quite simply, my dream accurately foretold a future event.

      For those who insist that my experience cannot be scientifically verified, I say, Amen. No study of clairvoyance or any other form of ESP has ever demonstrated a scientific basis for such phenomena, and indeed, any scientist who ever undertook such an investigation in the first place should have his slide rule broken over a knee, his pocket protector ripped from his lab coat, and be banished to Ken Ham's Creation Museum, forever to wallow in the silliest of pseudoscience. For a case such as my own, we must look somewhere besides science.

      What? Isn't science the basis for, well, everything? Tax codes, Thai kickboxing regulations, bluegrass etiquette, literary criticism, the Geneva Convention, the Golden Rule? So far, so bad. Some things fall squarely outside the realm of scientific inquiry, and, as one who loves science, I will insist on this as firmly as I insist that music theory does not apply to nuclear fission or plate tectonics. To maintain that everything that has ever happened in recorded history has a purely scientific explanation is to prostrate one's self before a false god, one deaf and blind to what it means to be human, conscious, free. The transcendent works of Bach, Beethoven, da Vinci, Rembrandt, Shakespeare, Tolstoy - strictly determinist animal behavior. The selfless courage of Mother Theresa, Nathan Howe, Oskar Schindler - nothing but higher animals obeying their herd instincts. To someone who actually believes in such a world, I have much pity, but very little to say.  Nothing I say here will concern them, so they may feel free to read something else.

      Much heavy weather has been made of late concerning the question of whether the existence of God, or any other supernatural entities or phenomena, can be proved or disproved by science. Here I will enlist the input of Stephen Hawking, whose scientific credentials cannot be seriously questioned. In spite of his recent atheistic stance, he has yet to recant his statement in A Brief History of Time, where he states, "As far as we are concerned, events before the Big Bang can have no consequences, so they should not form part of a scientific model of the universe. We should therefore cut them out of the model and say that time had a beginning at the Big Bang." (emphasis mine). If God initiated the Big Bang, science can have nothing to say about it.

      Some occurrences can be proved, but not specifically via the scientific method. But no proof, scientific or otherwise, will convince one whose assumptions will not admit truths they find unpalatable. In Harper Lee's classic novel To Kill a Mockingbird, the upstanding defense attorney Atticus Finch conclusively proves the innocence of a disabled black man falsely accused of raping a young white woman. Nonetheless, the all-white jury returns a guilty verdict, not because they have any reason to fall for the accuser's obviously fabricated testimony, but because this is Depression-era Alabama, and no black man could be believed over any white person. (To his credit, we later learn that one of the jurors dissented vehemently from his fellow white men, and only under duress cast his vote against the innocent-yet-doomed Tom Robinson.) I will here attempt to illustrate how a proof could be ironclad, yet without the aid of the scientific method:

      Suppose you and I are driving down the road in my battered Crown Victoria, and a song comes on the radio – in this case, “Shattered”, from the Rolling Stones' Some Girls album. My band played this back in 1979, and I know all the words, the lament of an Englishman navigating the perils of New York City. (And you thought Sting came up with that idea! For the record, P. G. Wodehouse was writing on that very subject 100 years ago, with hilarious results.) It's mostly a rap, just a bit of melody here and there.

      Well, I can't be prevented from rapping along with Mick Jagger, perhaps a major third lower, just for fun. “My brain's been battered / Splattered all over / Manhattan.” I'll also hammer out Charlie Watts' drum fills, note-for-note, on my steering wheel (while you wish I'd just steer the car instead.) I nail everything in the song, either to your amazement or to your annoyance, but one thing is clear to you – This guy has heard this song before. My driver's seat performance is absolute, ironclad proof that I've heard the song at least once previously.

      Or is it? Doesn't the Infinite Monkey Theorem prove that a Universe full of monkeys, provided with typewriters and reams of paper, will eventually type the complete text of Shakespeare's Hamlet? Mathematically, yes. (Maybe in Brazilian Portuguese, if they have the right sort of typewriter.)

      This, as any idiot can see, ignores the element of improbability entailed by our humanity. What if one of the monkeys, on a roll (“Get thee to a nunnery!”), accidentally starts typing the text of Hunter S. Thompson's Fear and Loathing in Las Vegas instead? Now we have to start over. Bad monkey.

      We needn't look to Shakespeare to give this asinine theory a decent burial. The chance that any of the monkeys, individually or in concert, will ever (in the course of nine hundred billion trillion years) give us so much as the words “CLOSE COVER BEFORE STRIKING” is so vanishingly small that the theorem misses the point in most impressive intellectual fashion.
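      Humor me while I do the arithmetic the monkeys' defenders rarely bother with. This is strictly a back-of-the-envelope sketch, and the setup is my own invention, not anyone's official theorem: a 27-key typewriter (26 letters plus a space bar), uniformly random and independent keystrokes, and a billion monkeys each finishing one 27-character attempt every second.

```python
# Back-of-the-envelope odds for the monkey typists.
# Assumptions (mine, for illustration only): 27 keys (A-Z plus space),
# random independent keystrokes, a billion monkeys, one attempt/second each.

phrase = "CLOSE COVER BEFORE STRIKING"
keys = 27
n = len(phrase)                       # 27 characters, spaces included

expected_attempts = keys ** n         # expected tries before one exact hit

monkeys = 10 ** 9                     # a generous billion typists
attempts_per_year = monkeys * 60 * 60 * 24 * 365
years_needed = expected_attempts / attempts_per_year

print(f"expected attempts: {expected_attempts:.3e}")   # roughly 4.4e38
print(f"years required:    {years_needed:.3e}")        # roughly 1.4e22
```

      Some ten billion trillion years, at a typing rate no real-world troop of monkeys will ever approach, just to land one matchbook slogan - never mind Hamlet. The Law of Large Numbers is perfectly sound mathematics; it simply has no purchase on the timescales we actually inhabit.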

      Back to my car: If you had some motive for supposing I could not possibly have heard “Shattered” before, you might start looking for alternate explanations for my faithful rendition: He studied the sheet music. (I like that one – I can read music!) Or perhaps I heard some other band do the song exactly like the record. Nice try. (Too close to my having heard the original.) Okay, Large Numbers to the rescue! In this, or perhaps another Universe (the Multiverse is a hypothesis intended to include any possibility whatsoever), wouldn't it stand to reason that I might just happen to make all the right sounds with my mouth, and hit the steering wheel in perfect sync with Charlie's lovely, quirky drum fills? Émile Borel, with his proposed legions of monkey typists, might think so. Any court of law, however, could convict me of having accidentally performed “Shattered” only by violating Occam's Razor and by bringing a really stupid assumption (i.e., that I improvised along with the song, predicting the entire lyric and much of the drum part) into the courtroom.



      By attesting to the single most supernatural-appearing event of my life, I am not here attempting to prove Christian doctrine (to which I have nonetheless devoted much of my life) – I am merely asserting that there is more to the Universe than, well, the Universe. Science can neither prove me right nor wrong, but I can think of nothing that could possibly convince me that my dream was not informed, accurately and deliberately, by something completely outside of time and space. Not just something, but (I cannot believe otherwise) Somebody.

      Saturday, January 31, 2015

      Belated Book Review: The God Delusion by Richard Dawkins

      Ever since I read Dr. Francis Collins' statement that he keeps the complete works of Richard Dawkins and Christopher Hitchens on his bookshelf (“One must dig deeply into opposing points of view in order to know whether your own position remains defensible”), I've been nagged by the feeling that I was behaving somewhat like a Hobbit (who, according to J. R. R. Tolkien, “like books which tell them what they already know.”) I sustained a blow to this mentality many years ago when I read The Creator and the Cosmos by Hugh Ross, which made short work of my fundamentalist “cosmology” (such as the idea of an Earth which is thousands, rather than billions of years old.) I was relieved, rather than shaken – who really enjoys defending the indefensible? Dr. Collins' The Language of God, which helped me to reconcile biological science (including Natural Selection) with my Christian faith, completed for me the basic process, including a firm repudiation of Intelligent Design (ID), which negates the scientific method whenever a difficulty is encountered, replacing it with a miraculous default.

      We live during a time in which battle lines are now drawn not only between “Creation vs. Evolution” or “Science vs. Faith”, but, ever increasingly, “Young Earth Creation vs. Theistic Evolution”, in which Christians themselves are divided over whether a literalistic interpretation of Genesis trumps all the disciplines of mainstream science. Over and against this stands Richard Dawkins, the Chuck Norris of militant atheism, for whom any belief in God or the supernatural is not only wrong, but at best misguided, at worst evil incarnate. His magnum opus, The God Delusion (henceforth TGD), has become sort of an atheist Bible, and its enormous popularity is significant in an age where atheism has lost much of the stigma that has dogged it for centuries and kept it largely underground. Along with books such as the late Christopher Hitchens' God is Not Great: How Religion Poisons Everything, TGD has emboldened millions of people to abandon beliefs they held for reasons other than deep personal conviction (i.e., family tradition, indoctrination, etc.)

      Church, we had it coming. If half of Dawkins' observations about religious faith are true (I didn't try to keep score), then we are facing a massive reformation of our intellectual, moral and spiritual integrity. My decision to read TGD was, as I indicated, encouraged by Dr. Collins' fearless admonition to face alternate viewpoints head-on. Since there are numerous rebuttals, and even entire books challenging or purporting to refute TGD, the easy way would have been simply to read one or more of these and use it as ammunition against the atheistic onslaught.

      Having read TGD cover-to-cover, I'm glad I finally took the high road. Dawkins writes well, has a keen sense of humor and does his best (mostly) to be fair and objective. He is an eminent scientist with what he would deem "an appetite for wonder", and he makes many fascinating observations about nature, the universe and humanity. Much to my astonishment, I began to find Dawkins likable, and I found myself laughing with him far more often than at him. He is as worthy an opponent as a thinking person of faith is likely to cross swords with.

      And here I must unsheathe my own. But wait a minute - me, an electric guitar player, take on one of the world's most formidable atheists? Who do I think I am, anyway? I'm no scientist, nor philosopher nor theologian. I don't even have a college degree (which nonetheless didn't prevent me from once creating a university course.) My audacity usually extends as far as taking on musical challenges that I'm not 100% sure I'm qualified for. My confrontation with Richard Dawkins must be of the David vs. Goliath variety, or perhaps merely Quixotic.

      There are other ways to confront him – he is justifiably predisposed to quote the various cases made against him, especially since they are frequently hate-steeped invectives made all the more disagreeable by their ignorance and outright stupidity. Nobody wants to look bad in a public showdown, and by reprinting his own hate mail, Dawkins underscores his perceived moral, as well as intellectual, superiority by demonstrating the ethical and logical bankruptcy and hypocrisy of some of his more virulent foes. If I am to score any points against him, I'd better leave smear tactics to those who possess no other weapons.

      That's not to say that Dawkins never resorts to unfair tactics himself. He tries hard not to sling too much mud, except at Yahweh, the deity of the Bible, against whom he unleashes his most contemptuous and hostile assault. Incomplete and one-sided though it may be, Dawkins' case against the biblical God should be answered, although it has elsewhere been treated by thinkers far superior to myself. Where he skewers statements by Augustine, Martin Luther and many others, I often find myself nodding in agreement. There is seldom, if ever, total agreement in any camp, even Dawkins' own. Far worse, to me, are his frequent attacks on straw men. Many of his favorite victims (e.g., infant baptism, a mystically-addled Hitler, young earth creationism, experimental prayers for healing) are soft targets against which nearly any thinking religious person could make an equally compelling case. A cursory perusal of the works of C. S. Lewis would have eliminated the perceived need to include many of these in any rational attack on religious faith.

      Unfortunately, Dawkins doesn't face his most formidable adversaries head-on. His only mention of C. S. Lewis, the “apostle to the skeptics”, occurs in a cursory dismissal of Lewis' 'Trilemma' argument (Jesus as liar, lunatic or Lord), asserting that Lewis “should have known better.” Likewise, Dr. Collins, our foremost advocate of Biologos (the harmony of faith and science), warrants only passing mention, with no real attempt at refuting any of Collins' arguments for faith, despite his wholesale agreement with Dawkins on the question of evolution. Here would have been excellent sport, but Dawkins either dodged these bullets or didn't bother to notice they'd been fired. I will give him the benefit of my doubt here, since the latter seems at least intrinsically plausible. Nobody's perfect.

      My own quibbles with Dawkins involve the farthest-reaching questions, which he seems to have attempted to answer but repeatedly comes up short. He argues that religion is not the source of morality, but he needn't have looked any further than Romans 2:14-15 to find that not even Christianity makes such a claim (“For when Gentiles who do not have the Law do instinctively the things of the Law, these, not having the Law, are a law to themselves, in that they show the work of the Law written in their hearts, their conscience bearing witness and their thoughts alternately accusing or else defending them.”) Likewise, his attempts to explain traits such as altruism as holdovers of herd mentality, or misfirings of genetics, fail to address the moral dilemmas posed by something as subtle as cheating on an exam, or on your income taxes. Still worse is his total failure to address perhaps the Greatest Question(s) of All: What is the meaning of life, or of the universe?

      I can cut Dawkins no slack here. That is not to say that he has never pondered the question, or that he is in any way shallow – to the contrary, this guy is deep. I love his sense of awe at the beauty and complexity of the universe, and his insistence upon unraveling as much of it as he, and we, can. Before I pull the metaphysical trigger, I wish to reaffirm my agreement with him that we should aggressively oppose virtually all attempts to limit scientific research and knowledge. I share his contempt for small-mindedness, and no person of faith should fear any advance in scientific knowledge, since any such advance should automatically bolster our faith as it adds to our understanding of creation.

      Dawkins repeatedly insists that the existence or nonexistence of God is a scientific matter, but in no way can I fold, spindle or mutilate my brain into agreeing with him. Since what the Theist means by “God” is by definition something outside of time and space, it can never be detected by using the tools of science. (Here Dawkins declares himself to be at variance with fellow evolutionary biologist Stephen Jay Gould, who insisted that science has nothing to say on the matter.) Never say never, but I say never. Dawkins rightly insists on the picking apart of everything that can be observed, detected or hypothesized. Unfortunately, when something falls outside the realm of observation or other forms of verification, its hypothetical existence must be determined by other means, (i.e., revelation, which Dawkins open-mindedly dismisses out of hand.) The Hot Big Bang model of creation, that most mysterious and yet foundational of natural phenomena, can and should be subject to the most rigorous research we can devise. But even if it can be shown to have resulted from a random fluctuation in a quantum anomaly, we are forced (if we are brave enough) to ask where those came from. Stephen Hawking's assertion that it all came from the Laws of Physics (“which have always been there”) ignores the fact that laws can do nothing without a patient to act upon. Gravity cannot be the apple it causes to fall. I have long maintained that any child could see this, but not even a wall full of academic credentials can force their recipient to see what they wish not to see.

      TGD makes blessedly little of wishful thinking as a source of belief in God (or a lack thereof), but it should be acknowledged that for every person “deceived” into believing in God through wishful thinking, there might equally be a person hoping like mad that He doesn't exist. As C. S. Lewis observes, the question does nothing to move us toward a logical conclusion, since there are plenty of wishes on both sides. Of far greater portent is the Anthropic Principle (the narrow spectrum of conditions in which carbon-based life can develop), about which Dawkins makes a good deal of heavy weather. The existence or nonexistence of planets with the right moisture, temperature, gravity, etc. to support life is the subject of extensive research and conjecture, but only enters into the theological realm as regards the likelihood, or lack thereof, of life arising spontaneously from non-life. The now-popular Multiverse scenario (in which an indefinite number of universes exist outside our own, elevating the probability of matter morphing into life somewhere) cannot be tested, verified, nor falsified. The vastness of our own universe surely provides scope enough for life (which we know to have occurred in our own backyard) to materialize, whether accidentally or intentionally. As a straw to be grasped in the attempt to indicate that anything might happen, given enough time and material, Multiverse is (to borrow one of Dawkins' pet names) a cop-out of unimaginable proportions, or, at best, irrelevant. Life already happened; deal with it. We must look at the final chapter of his book to see the unhinged lengths to which he will go in order to defend his position.

      In his invocation of the miraculous-as-theoretically possible, a marble statue of the Madonna could theoretically, through a random flash mob rebellion against normal molecular movement, wave its hand. (While somebody reliable was watching.) Rather than delve into the near-infinite absurdity of this scenario (conclusively trashed by authorities on all sides of the issue, even his own), I will simply state that Dawkins-as-final authority is a chimera, just one more example of wishful thinking, of which most or all of us are equally guilty. This colossal cock-up should serve as a warning to those who would set up any man, be it Dawkins, Darwin, C. S. Lewis, Stephen Hawking, Einstein, Karl Marx, Francis Collins, the Dalai Lama, or the Pope as bulletproof. (I plead guilty myself.)

      I hasten to add that Dawkins is, barring his foibles, an important figure in the worldview debate. Iron sharpens iron. One of the more delightful surprises awaiting me in TGD was his vast appreciation of P. G. Wodehouse, to his mind and mine, “the greatest writer of light comedy in English”, not the least reason being (for both of us) his profound biblical, and therefore cultural, literacy. Dawkins justly, and admirably, points out that an ignorance of the King James Version of the Bible as a vast source of our culture's rich verbal heritage would be an impoverishment. For example, Wodehouse's bumbling hero Bertie Wooster compares his own hangover to Jael driving a tent peg through Sisera's temple (the side of his head, not a building) – an inside joke entirely lost on a biblical illiterate. Dawkins gives us a veritable laundry list of biblical tropes, including “my brother's keeper”, “coat of many colors”, “kill the fatted calf”, “the stars in their courses”, and “the patience of Job”, among dozens of others. Though these concessions in no way diminish his contempt for religion, they, perhaps unwittingly, give it more importance than he intends.

      I believe Dawkins' most glaring oversight to be his total failure to address the greatest questions ever asked: Who am I? What is the meaning of life, and of the universe? Does the universe have a purpose? Whether by brushing these aside or merely forgetting to treat them, he makes what seems to me the worst blunder a man on such a mission as his could possibly commit. I plead an unlearned man's ignorance of the minutiae of various schools of thought regarding the meaning of the universe, but with America's Founding Fathers (for whom Dawkins professes boundless sympathy), I subscribe to the proposition that certain truths can “be self evident”. If an accidental, irrational universe can somehow “acquire” meaning and purpose, somebody had better spell the process out for me in language that even I, and others even stupider than myself, can understand.

      I applaud Dawkins for his commitment to the values of honesty, compassion and decency, even as he saws off the branch he's sitting on. To those who attack him with far more hatred and violence than he allows himself toward even the most contemptible or misunderstood of his adversaries, I had best not say what I really think of them. My surprise at being able to enjoy so much of TGD will probably lead me to read his other books. I should very much like to meet him. If I condemn some of his more unkind characteristics, I likewise condemn them in myself (tact being a wildly variable resource in my Asperger's-infested toolbox.)

      I challenge every thinking Christian or theist to read The God Delusion; likewise I challenge all agnostic or atheistic Hobbits to venture beyond the familiar and comfortable and read C. S. Lewis' Mere Christianity or Francis Collins' The Language of God. And I would encourage us all to read The Righteous Mind by Jonathan Haidt, a Jewish atheist who demonstrates more courage and fairness in evaluating opposing sides of the ideological spectrum than I would have dared dream possible. It's a lot of work seeing both sides of the story, but for the Biblical theist it isn't supposed to be an option - “The first to put forth his case seems right, until someone else steps forward and cross-examines him.” (Proverbs 18:17). And as for the atheist or anti-theist? What have you got to be afraid of?


      Monday, August 11, 2014

      Requiem For a Smoker

      Smoking used to be cool. Humphrey Bogart, the iconic tough good guy, the cigarette between his lips extending his world-weary cool beyond his actual person. Jimmy Page or Keith Richards, stalking the stadium stage, guitar slung low, cigarette protruding defiantly from the rebellious rock 'n' roll sneer. Franklin Delano Roosevelt, cigarette holder jauntily jutting from his confidence-inspiring grin, leading his nation through Depression and war. General Douglas MacArthur, swaggering ashore to liberate the Philippines, RayBan™ shades and corncob pipe underscoring exactly who had returned. Winston Churchill, facing the camera with his bulldog scowl and pinstripe suit, chomping his Havana cigar and wielding a Thompson submachine gun.  (The Nazis tried to use this photo as propaganda, portraying Churchill as a gangster; this backfired, partly due to the perceived glamour of the Depression-era mobsters.)

      If I have left out your favorite smoker (likely), then that only serves to point out the traditional cultural acceptability, yea, even the respectability afforded tobacco use for centuries. Even historic Christian figures including J. S. Bach, Charles Spurgeon, Dietrich Bönhoeffer, C. S. Lewis and Chuck Colson were known to light up on a regular basis. Conveniently, it would seem, there is no reference to smoking in Scripture. Instead we are left to draw inferences from passages referring to 'destroy(ing) God's temple', conveniently “countered” by Jesus' assertion that “(W)hat goes into a man's mouth does not defile him” (Matthew 15:11). Since the perceived morality of smoking is not my focus here, we can happily move on.

      At my workplace there's a gentleman on maintenance staff, possibly in his mid-60s, but appearing to be as old as 70. (He can fix anything I break, so I tell him that, as long as I work there, he'll have job security.) He's a wiry little guy who fought in Vietnam, and he smokes whenever the opportunity comes up. Here in Montana it sometimes gets down to -20º F. or even colder in the winter, and yet he dutifully steps outside and lights one up, as do his fellow co-workers/smokers. Now here's a guy who defied the odds and came back alive from 'Nam, choosing (I use that word loosely) to kill himself slowly, on the installment plan. But to add insult to injury, smoking is no longer permitted in the break room, nay, in the building at all. Not even a separate lounge for the smokers, just banishment to the great outdoors, even for one who honorably served his country, in a war he was too good for.

      It's easy to confuse legitimate health concerns with political correctness, since in this case they often overlap. What could easily pass for some sort of ideological hysteria turns out to be not only good science, but a blessing for those of us who endured years of secondhand smoke in order to play or facilitate live music. I quit smoking at 22, four years after I started, which had the unexpected effect of making me much more hostile to the presence of cigarette smoke than I had ever been before I started smoking. I never knew that one day the official consensus regarding tobacco would follow suit, kicking the hapless smoker out-of-doors, and making the memory of lighting a cigarette in a movie theater or on a commercial flight seem like the memory of performing a minstrel show in blackface and a nappy wig.

      I can think of no comparable fall from grace experienced by any other cultural phenomenon, good or bad. (Except maybe the Record Business.) The tobacco industry went, in a relatively short time, from claiming the health benefits of cigarettes, to rubbing our noses in the mortal perils of the same. That which was proudly advertised by everybody from Fred Flintstone and Barney Rubble to future president Ronald Reagan is now a vice for pariahs. Can you imagine anybody telling George Burns, Audrey Hepburn, Clint Eastwood, Frank Sinatra or Mark Twain to 'take it outside'?

      The good (?) news is, nearly everybody I listed here is dead, a few of them even due to smoking-related causes. No need to be an iconoclast – the iconic smoker is history, perhaps never to be replaced.

      Thirty years ago I smoked my last cigarette, scraped the cigarette burns off the headstock of my Fender Stratocaster (thanks to Eric Clapton for making me think my guitar should join in the fun) and survived the process of becoming an ex-smoker. And yet, thirty years later, not a day goes by that I don't find myself taking a drag from an imaginary Marlboro. It's not about the nicotine – I was never seriously tempted, say, to chew tobacco. Rather, it was the act of lighting up, something to do with my hands, something that was somehow relaxing. (The imaginary cigarette usually turns up when I'm faced with some uncomfortable or embarrassing memory.) And yet I pity those next to me in the store who lay out exorbitant sums of money for something I used to get for $10 per carton, who will likely not be permitted to smoke even in their own homes, let alone an airport, a nightclub or a restaurant.

      I'm not the least bit conflicted about the wonderful, ubiquitous freedom from secondhand smoke we experience today, just amazed at the sea change that sank such a mighty ship, now replaced by a tramp steamer, its trail of smoke disappearing forlornly over the distant horizon. May it never return, but thanks for the memories (I think.)



      Monday, June 30, 2014

      Death To Originality

      (For its own sake)


      Those who know me personally may find such an injunction coming from me to be either ironic or crazy – I don't consider myself particularly original, but many who know me do in fact find me unconventional, even uncomfortably so. My two-part heading may be taken either to mean a) that one should not try to be original simply for the sake of being original, or b) that originality itself must perhaps not even be attempted, lest something inferior result from the attempt.

      I've been stewing in this juice all day, ever since I traced it to my bungled attempt to play a particular guitar figure correctly in church this morning. By 'correctly' I mean duplicating the intro as played on the original recording of the song we were playing. The intro in question consists of playing a simple eighth note pattern against a triplet echo produced by a black box at my feet. The result is the same sort of figure heard in a song like 'Where the Streets Have No Name' by U2 or GNR's 'Welcome To The Jungle', the kind of thing I've been playing ever since Barrett Golding gave me my first digital delay line thirty years ago. But the song in question has an intro whose note sequence bears only a shadowy relationship to the song's actual harmonic content. Hearing it would cause any listener to expect the song that follows to be in the key of 'F', not the key of 'C' the rest of the song is actually in. (You may not even know what I mean by 'key', but you would nonetheless be adrift until the vocal comes in and establishes what's really going on.)

      Sounds like I'm whining, doesn't it? Well, if you don't read certain Psalms because they're “whiny”, then you needn't read this either. Still, I should have immersed myself in listening to this well-recorded, well-played bit of musical disinformation instead of listening to Frank Zappa while I cleaned the kitchen, right? (Frank himself would have something to say about that.) Should have practiced it instead of going to Barrett's barbecue? Well, that might have counteracted the lackluster intro I played today, which I attributed to a) having a cold, which always affects my ability to do anything right, even if it's easy, and b) the part's inherent semi-musicality.

      I'm so unoriginal that I'm going to pass the pen to C. S. Lewis for a minute, since, if I've ever been guilty of letting someone do my thinking for me, then he's the one I trust to do so:

      Even in literature and art, no man who bothers about originality will ever be original: whereas if you simply try to tell the truth (without caring twopence how often it has been told before) you will, nine times out of ten, become original without ever having noticed it. - C.S. Lewis, Mere Christianity (IV, 11)

      “Well, that's alright for him,” someone may say. “He's already original to begin with!” Well, yes. (He was also one of the best-read men in England.) Having an imagination is one thing, but as soon as I start thinking about my imagination, I run out of ideas. So rather than bore you, I'll try to make my point and let you get back to whatever it was.

      Originality is almost universally hailed as a virtue. And so it is. When a creative work takes us to new territory, challenges our preconceived notions and opens our eyes to new possibilities or shows us timeless truths in a new light, then originality becomes the unobtrusive servant of art. But even the most startling originality in the arts can nearly always be found to have deep roots in its medium's storied past. Pablo Picasso leaps to mind.

      It's almost cheating to cite Picasso, but nowhere is there a better example of what I'm trying to say. For many, Picasso is nearly the only 20th century artist they can even name, let alone describe. His pioneering works in Cubism, collage and sculpture are among the most easily identified departures from the status quo during the 20th century, and their fragmentation of conventional visual themes virtually defines the word “original”, as though it all sprang without warning or precedent from the mind, heart and hands of one Spaniard. Which it didn't.

      Those for whom the name “Picasso” only conjures mental images of people and objects with noses and eyes in the wrong places or pointed in wrong directions are usually ignorant of his earlier works, in which people and things are treated very realistically. He didn't break the rules until he knew them, and then not until it was time to do so. He earned the right to innovate, which he did gradually. But even his innovations, from Modernism to his fabled 'Blue Period' to Cubism and beyond, had roots. The Spanish Renaissance painter El Greco influenced him, as did African tribal masks and sculpture. I feel like I've already opened Pandora's can of wasp worms by dragging Picasso into our discussion, but I hope I've made my point. His influences are too many and varied to be ignored, but, filtered through the Picasso grid, they combine, crossbreed, tangle and weave into a body of work that has brought boundless vitality and joy to each new generation that discovers it.

      What does this mean for the contemporary artist, musician, dancer, writer or filmmaker? I hope it means that doing good work comes to be more important than reinventing the wheel. This came home to me several years ago when I was playing bass in my church, with a drummer whose traditional vocabulary seemed quite limited, but who nonetheless seemed determined to jettison what little he had. Instead of keeping time on the ride cymbal or hi hat (two instruments, BTW, that have lent themselves to daring innovation by many great drummers from Tony Williams to Travis Barker), he tended to wander off into attempts to change it up by tapping his sticks on the rims or shells of the toms. That itself has been a useful technique for occasional use by great drummers, but in our context it only served a) to call undue attention to itself, and b) to make me (and probably others) long for a regular, solid rock drum groove that would give the song, the congregation and the band the foundation we needed to drive the message home. There's a REASON that you only seldom hear that sort of thing. Why drive a nail with a flashlight when there's a hammer nearby?

      I also had a very dear friend years ago, a very skilled electric bassist who regularly performed this gig with a nationally known Christian artist. But, once free of her constraints (it was children's music), he decided that his mission was to explore strange new worlds, to seek out new life and new civilizations, to boldly go where no man has gone before. Now, if you'll scroll down a bit you'll see my essay, “How Low Should You Go?”, in which I try to make the case that the bassist's job is to make sure everyone else sounds good. My friend cared little or nothing for that – every constraint was to be thrown off, every note tried, every musical style explored, whether in or out of its natural context. I love salsa bass lines (more correctly, the tumbao pattern that helps define much afro-latin music), but not so much when they don't correspond to the rock groove the rest of us are playing.

      Musical styles are indeed born of cross-pollination – reggae, latin jazz, highlife, western swing, none of these genres could have been born in a vacuum. Curiously enough, though, they are not the results of somebody's attempt to be original. They are, in fact, the results of the time-honored musical practice of stea . . . um, borrowing from other musical traditions. Many years ago in China I was mesmerized by the background music accompanying a Chinese acrobatic routine – someone was playing some plaintive, Chinese melodies on a notch-position Stratocaster. (The same sound you hear on 'Sweet Home Alabama' or 'Sultans of Swing'.) The result was somehow ancient and hip at the same time. Alas, I had no tape recorder with me at the time. (I have since made it a point always to bring a recording device to any country I visit.)

      Chances are, your favorite music and songs are likewise Frankenstein'd together from the artiste's collective musical background. One need look no further than The Beatles to see how this can work. And while the Beatles were purposeful in their creativity, that creativity was mainly the result of their simply trying to create good music with the resources at hand.

      And then there's Christian worship music. How often do I go into a situation where the drummer has taken what would have been a good hi hat groove and instead transferred it to the bass drum for 64 measures, where it fascinates those whose idea of classic rock is 90's worship music, but for us old-school curmudgeons only points up the song's lack of craftsmanship? If the song is good enough, you don't have to do screwy things to the drum part to compensate for shoddy songwriting.

      I'm not being completely fair. Some of these songs are reasonably well written, and as such would be at home in multiple musical genres. My old friend Karen Lafferty's standard “Seek Ye First” has thrived in classical, death metal, bluegrass, electronica and bebop settings. (I'm afraid to ask her what other styles the song has sported.) She wasn't trying to be original, she was simply expressing musically what she felt God wanted her to express. The result? Millions of people worldwide hiding the words of Christ (Matthew 6:33) in their hearts, in whatever musical heart language they love best.

      Well, that paragraph didn't help me to be any more fair to anyone, so I shall instead apologize to those who may have been offended (at least I didn't name any song titles, in order to protect the guilty). Please, feel free to innovate. Just remember, though, that innovation means begging, borrowing and stealing creative ideas anywhere and everywhere you can find them. And hopefully becoming as knowledgeable about them as you can. Then “you will, nine times out of ten, become original without ever having noticed it.”