Culture, Health · Steven Gray

Intermittent fasting and the myth of "three squares a day."

Scenario #1: Standard American Life

I wake up in the morning and start my day with a bowl of cereal, toast and a glass of juice.

At noon, I eat a sandwich with chips and a Snickers bar for dessert.

At 3pm, I'm hungry and focusing on work is difficult, so I eat another Snickers and wash it down with a cup of coffee.

For dinner at 6pm, I eat a "real meal" of meat, vegetables and a dinner roll.  A piece of leftover cake follows for dessert.

I go to sleep and repeat the sequence the next day.

Scenario #2: Primal Blueprint

I wake up in the morning and start my day with some eggs and salsa or a leftover piece of meat from the night before.

At noon, I have a salad with grilled chicken strips, drizzled with olive oil and balsamic vinegar.  I eat a few squares of dark chocolate for dessert.

At 3pm, I'm feeling a bit peckish, so I toss back a handful of nuts.

For dinner at 6pm, I eat meat, vegetables and some fruit as a garnish or a dessert.

I go to sleep and repeat the sequence the next day.

Scenario #3: Primal Blueprint + Intermittent Fasting

I wake up in the morning.  I ate a big meal the night before, so I drink a cup of coffee and decide to wait until I'm hungry to eat again.

At noon, I'm hungry for lunch, so I go out to my favorite restaurant for a bunless hamburger with plenty of onions and mushrooms on top, served with a side salad or some mixed vegetables.  A few squares of dark chocolate round out the meal to satisfaction.

At 3pm, the fats and proteins in the burger are still satisfying enough that I am not hungry and can work through the afternoon without losing focus.

At 6pm, I'm hungry but not ravenous.  Dinner is another arrangement of meat and vegetables, light portions.

I go to sleep and repeat the sequence the next day, with variation in schedule and meals skipped based on hunger.

Which of these makes the most sense?  The latter two obviously offer the healthier food choices, as well as a better order in which to eat them, i.e., starting the day with protein and fat instead of simple carbs...but what's all that nonsense about skipping meals?

Question for your Sunday: Why do we eat three meals a day?  Do we eat because we're truly hungry, or because a government-recommended diet high in simple carbohydrates has conditioned us to want three meals a day?

America has an epidemic.  It isn't obesity or diabetes or heart disease; those are symptoms.

The epidemic is herd mentality.  Blind acceptance of a status quo.

A USDA stamp on a box does not make a food nutritious or ideal as an energy source.  It simply means that the food contains the required amounts of certain ingredients or "fortifications" to make it passable for sale to consumers.  60 Minutes aired a piece several months ago that showed how companies actually engineer processed foods to have the same qualities as addictive or controlled substances.  Pre-made food bought in colorful boxes is created specifically to manipulate you into feeling hungry sooner, craving that taste again, and buying more.  Think about that.

Healthy foods, that is to say, whole foods: meat and produce raised or grown without additives, are where true nutrition is to be found.  They provide the fats, proteins, vitamins and minerals necessary for human life.  And most importantly to the human experience, they provide satiety.

When your food is satisfying and provides your body with what it needs to replenish cells and nourish your muscles and organs, there is no reason to eat unless you are hungry.  The problem is that food is so easy to procure in our culture that we often forget what hunger actually feels like, which results in snacking and overeating.  One reason for this is the Western attachment to the idea of three meals a day, and the oft-repeated mantra that breakfast is somehow the most important meal of the day.

However, if your meals are complete and provide actual nutrition, you might not really be hungry first thing in the morning.  If so, don't eat!  Alternatively, if you are hungry in the morning and eat breakfast, and the satisfaction from breakfast lasts all the way until the lunch hour, do you really need to eat lunch?

This applies to any meal, or to more than one.  Modern life often requires a lot of time spent being sedentary, either working behind a desk or, in my case, sitting in a college lecture/regurgitate-lecture-on-paper environment.  The assumption that we need to constantly replenish the very minor caloric expenditure of sitting is just silly.

In a hunter-gatherer society, or at the very least a society that is not dependent on grain agriculture (something that wasn't needed until humans began congregating in cities and had to sustain large populations with cheap, bulk crops), food isn't always readily available.  That is why primitive cultures who still hunt and gather instead of relying on farming for their food tend to be incredibly healthy until "heroes from the West" descend to "civilize" them.

If you're not hungry at one of the culturally prescribed 8am/12pm/6pm meal times, do yourself a favor and just wait.  The idea that "one size fits all," that something terrible will happen if you skip a meal, is simply untrue.  What you put into your body is an individual experience, and should be a conscious choice.  If you're not hungry, no one has the right to make you eat.

Every now and again, I like to go twenty-four hours without consuming food.  I'll drink some black coffee (no sugar) or tea, but I give my body time to reset.  It accelerates fat-burning, sharpens my mind through the consequent ghrelin production, and restores insulin sensitivity.  And when I am between meals, I try not to snack; my liver needs a break now and again.  This comes in handy on long flights, where the unapologetically disgusting food served on airplanes actually makes fasting easier.

But, perhaps most importantly, it makes me appreciate food.  You have to eat properly before you can skip meals properly.  When you eat real foods like meat, fowl, fish, vegetables, fruits and nuts, your palate becomes much more sensitive; the act of enjoying a meal when genuinely hungry takes on special significance.

When you choose to set your own schedule, you are no longer one of the herd.  Your relationship with food changes.  Instead of mindlessly shoving back lab-engineered, factory-assembled crap every few hours, the food experience becomes just that: an experience.

I am fasting as I write this.  I indulged in a large meal of Indian food yesterday, liberal helpings of chicken and vegetables topped off by the heavenly Indian dessert gajar halwa.  I haven't been hungry since, so I haven't eaten.  It's been almost twenty-four hours now, and I feel fantastic.  I am awake and alert; the words are flowing freely as I write.

Respect yourself.  Respect your food.  Eat when hungry or not at all.

Further reading:

  • Mark Sisson's "Why Fast?" Series [Mark's Daily Apple]
  1. Weight Loss
  2. Cancer
  3. Longevity
  4. Brain Function
  5. Exercise
  6. Methods
Miscellany · Steven Gray

College Life

I was reminded again today that nothing is ever as easy as it seems.  One would think that, by now, one could write a paper in the well-designed and intuitive interface of Apple's word-processing software Pages and export the finished document to Word format without irreparable loss.  Not so much.  Furthermore, these files are apparently read differently by different computers, even when they are all using Word.  I summed up this afternoon's dot-doc foibles in the following Facebook status update.

Write paper in Pages, using required margin and typeface settings: exceed page count.

Review paper in Pages: gee, that's pretty!

Review paper in Word at same margin and typeface settings: fall short of page count.

Review paper on another computer, in Word, at the same margin and typeface settings: the footnote numbers have now become Roman numerals.

[Bang head against wall]

Sometimes, life imitates The Oatmeal.

Some new entries in the works for tomorrow or the next day.  As this is my "train of thought," I have some more stuff to get off my chest about communication.

Culture · Steven Gray

Easter versus Christmas

As a Christian, I support Easter.  I support it on principle.  Unlike many people, I prefer it to Christmas.  Gandhi once said, "I like your Christ, I do not like your Christians. Your Christians are so unlike your Christ."  And it is so true!  I say this as a Christian.  Christ lived his life without fanfare, without dispensing judgement on everyone he met.  He lived a life summed up by the word "love."  It staggers me how few Christians' lives can be summed up in similar terms.

Christ did not ask for his birth to be remembered.  However, he did request that we remember his death through the rite of communion.  Communion is a purely symbolic practice, and Christ laid it out in the following terms:

And when the hour was come, he sat down, and the twelve apostles with him.  And he said unto them, With desire I have desired to eat this passover with you before I suffer: For I say unto you, I will not any more eat thereof, until it be fulfilled in the kingdom of God. And he took the cup, and gave thanks, and said, Take this, and divide it among yourselves: For I say unto you, I will not drink of the fruit of the vine, until the kingdom of God shall come. And he took bread, and gave thanks, and brake it, and gave unto them, saying, This is my body which is given for you: this do in remembrance of me. Likewise also the cup after supper, saying, This cup is the new testament in my blood, which is shed for you. [Luke 22:14-20]

No such request was made for Christmas.  Christians of the more conservative persuasion often contend that the early church practice of moving into a new area, "converting" the population and remodeling the native holidays with Christian iconography, means that Christians today should not recognize these holidays at all.  Christmas, after all, is simply a replacement for pagan winter solstice holidays, and Easter, as a holiday, is tainted by lingering pagan fertility symbols.

While I would never be opposed to removing the commercial hoopla from what should be holidays of the spirit, I tend to take a different tack.  Rather than grasping at arguments about how the holidays came to be celebrated, I prefer to go back to the source material for an answer.  I realize that reading the Bible has become a novelty, what with the convenient alternatives of groupthink and popular opinion, but in the Bible, Christ himself said that he wanted his followers to remember the sacrifice he made for them.  He made no such request of Christmas.  Oddly enough, Christmas became the holiday most violently hijacked from its origins, extrapolated from the destitute birth of a child in a manger into a corporate juggernaut.

Meanwhile, Easter is simply a means for candy companies to break even through a few weeks of sugar-laden sales.  What a shame.  What a waste.

Culture · Steven Gray

"Pointing the finger..."

[Image: Rudolf Arnheim]

Sometimes I become genuinely concerned about the future of interpersonal communication between people of my own generation.  There is no shortage of ways to spread ideas, but there seems to be a lack of faculty to utilize these avenues.

Before going further, I want to preface my own thoughts with a quote from Rudolf Arnheim.  Arnheim's essays throughout the 1930s on film, mass communication and psychology were far more insightful than most of what is written on those subjects today.  The following quote comes from Arnheim's 1938 essay "A Forecast of Television":

Television is a new, hard test of our wisdom.  If we succeed in mastering the new medium it will enrich us.  But it can also put our mind to sleep.  We must not forget that in the past the inability to transport immediate experience and to convey it to others made the use of language necessary and thus compelled the human mind to develop concepts.  For in order to describe things one must draw the general from the specific; one must select, compare, think.  When communication can be achieved by pointing with the finger, however, the mouth grows silent, the writing hand stops, and the mind shrinks.

Read it again, but replace "television" with "Twitter," "Facebook" or any other social networking service that has made shorthand communication popular and accessible.  I firmly believe that these services have created problems in how people relate to each other face-to-face.

Social networks are not a problem in and of themselves.  From cuneiform inscriptions to Gutenberg's printing press to the iPad, ideas have always utilized the latest advances in technology to spread from person to person.  However, until the past few years, the communication of what happens in daily life required complete thoughts to be committed to letters or emails.

We must not forget that in the past the inability to transport immediate experience and to convey it to others made the use of language necessary and thus compelled the human mind to develop concepts.

Today, the capabilities of smartphones have finally equaled the possibilities offered by online social networks.  It is no longer necessary to harness the power of words to describe the interesting things we saw in the course of a day; we can take a photograph with a mobile device and share it with the entire world in the space of a few seconds.  I don't mean to imply that this is a good or bad thing in and of itself; it is simply the place to which we as a culture have come.

Where I see a very definite problem with social networking is the irresponsibility with which it is used by the people who have grown up with it.  The children of the Baby Boomers viewed the arrival of everything from text messaging to Facebook with varying degrees of suspicion, while their kids, who have known these advances from an early age, are not only comfortable with them, but are increasingly reliant on them to communicate.

As a result of this reliance, the "shrinkage of the mind" which Arnheim mentions is increasingly apparent in conversation.  There is an experiment which I like to perform to gauge people's use of language.  When someone mentions having seen a new film or read a book, I ask them what it is about.  If they start to tell me what happens in the plot, I stop them and say "I don't want to know what happened, I want to know what it was about; what the theme was."  And, sadly, very few people seem concerned with the true meaning of what they watch or read.  They fail to "draw the general from the specific."

I realize, and have previously written, that entertainment is less and less concerned with offering ideas that transcend aesthetics.  As such, it isn't surprising that most audiences view stories as little more than a chain of events strung together without deeper meaning.  However, I am growing concerned that an entire generation has grown up with little regard for, or even awareness of, thematics and meaning.

For in order to describe things one must draw the general from the specific; one must select, compare, think.  When communication can be achieved by pointing with the finger, however, the mouth grows silent, the writing hand stops, and the mind shrinks.

Communication is necessary to life.  But it isn't enough to "point the finger" with a photograph or a star rating.  Language, and the full usage of it, is important.  When George Orwell wrote 1984, he explored the idea of an oppressive state reducing the breadth of language in order to communicate ideas efficiently and without emotion, as detailed by the character of Syme early in the novel:

Don't you see that the whole aim of Newspeak is to narrow the range of thought?… Has it ever occurred to you, Winston, that by the year 2050, at the very latest, not a single human being will be alive who could understand such a conversation as we are having now?…The whole climate of thought will be different. In fact, there will be no thought, as we understand it now. Orthodoxy means not thinking—not needing to think. Orthodoxy is unconsciousness.

Prior to writing 1984, Orwell wrote on this subject in his 1946 essay "Politics and the English Language," in which he discussed the effects of thought upon language, and of language upon thought:

A man may take to drink because he feels himself to be a failure, and then fail all the more completely because he drinks. It is rather the same thing that is happening to the English language. It becomes ugly and inaccurate because our thoughts are foolish, but the slovenliness of our language makes it easier for us to have foolish thoughts.

I feel vindicated in my feelings on this subject when they are confirmed by a mind like Orwell's.  However, unlike me, Orwell was able to offer a solution.

The point is that the process is reversible. Modern English, especially written English, is full of bad habits which spread by imitation and which can be avoided if one is willing to take the necessary trouble. If one gets rid of these habits one can think more clearly, and to think clearly is a necessary first step towards political regeneration: so that the fight against bad English is not frivolous and is not the exclusive concern of professional writers.

It only remains to impress the importance of language upon culture: language as a living, complete, exciting way of expressing thoughts and ideas.  And in the age of convenience, when there is the constant opportunity to reduce the human experience to a shared photo or a "check-in," therein lies the challenge.

Books, Culture, Movies · Steven Gray

Sir Arthur Conan Doyle: Redux

[Image: Sir Arthur Conan Doyle]

There is nothing quite like a Victorian adventure story.  Victorian adventure novels have a unique flavor: detached, yet oddly engaging.  Often written in the first person as diary entries or a journalist's notes, they offer a distinctive perspective on adventure and action in a style that is now coming back into vogue in books like The Hunger Games and World War Z, which are reviving the art of first-person narrative.

In the world of Victorian literature, one name stands apart from the rest.  You can talk about H. Rider Haggard or Jules Verne, but Sir Arthur Conan Doyle left the greatest literary legacy of his era in the creation of Sherlock Holmes.  I could almost stop with that, because just the name "Sherlock Holmes" carries enough weight and individual associations that my thoughts on the subject are, honestly, entirely superfluous.

Much as I enjoy Doctor Who without feeling the need to identify as a "Whovian," I hold the adventures of Sherlock Holmes in a special place in my heart without calling myself a "Sherlockian" or a "Baker Street Irregular."  I enjoy good books and good films, and Doyle's stories happen to be some of the best one can ask for in either medium.  I have enjoyed them since before I was old enough to fully appreciate them.  The annotated editions are on the shelf next to me as I write this piece, and through their added maps, background information and photographs, they inspired me to take a sincere interest in the actual history of London and the life of the man who wrote the stories.

The impact of the character of Sherlock Holmes is indicative of the brilliance of Sir Arthur Conan Doyle.  Harlan Ellison, the great science fiction writer and endlessly entertaining raconteur, went so far as to make the following statement in an interview:

"You want to be smart?...Read the Arthur Conan Doyle Sherlock Holmes stories.  You read the entire canon--there aren't that many--you read the entire canon and you will be smarter than you ever need to be.  Because, every one of them is based on the idea of deductive logic.  Keep your eyes open and be alert.  That's what all good writing says: wake up and pay attention!"

Ellison was right.  If you read a Sherlock Holmes story online or on a device, make the text as small as possible and look at it statistically.  Most of the stories are made up of questions (a claim the curious can test for themselves; see the sketch after the passage below).  Holmes asks questions until the interviewees are out of answers.  When he has asked enough questions, he sifts through all of the pertinent facts in his mind and often deduces a correct conclusion without leaving his apartment.  Solving a crime was, for him, an intellectual exercise, and one in which he engaged largely for selfish reasons.  This fact was made clear in a passage from The Sign of the Four that is most often included in adaptations for its perfect summation of his character.

“My mind," he said, "rebels at stagnation. Give me problems, give me work, give me the most abstruse cryptogram or the most intricate analysis, and I am in my own proper atmosphere. I can dispense then with artificial stimulants. But I abhor the dull routine of existence. I crave for mental exaltation. That is why I have chosen my own particular profession, or rather created it, for I am the only one in the world.” 
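As an aside, the claim above that the stories are mostly questions is easy to check.  What follows is a minimal sketch in Python, assuming you have a plain-text copy of a story saved locally; the filename here is hypothetical, and the out-of-copyright texts are freely available from sources like Project Gutenberg.  It counts how many sentences end in a question mark:

import re

# Load a locally saved story; "blue_carbuncle.txt" is a stand-in name.
with open("blue_carbuncle.txt", encoding="utf-8") as f:
    text = f.read()

# Naive sentence split: any run of characters ending in ., ! or ?
sentences = re.findall(r"[^.!?]+[.!?]", text)
questions = [s for s in sentences if s.rstrip().endswith("?")]

print(f"{len(questions)} of {len(sentences)} sentences are questions "
      f"({100 * len(questions) / len(sentences):.1f}%)")

However the numbers come out for any given story, the exercise makes the structure visible: Holmes moves a case forward by interrogation.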

Holmes is incredibly nuanced and interesting as an individual, but beyond the literary skill required to create good characters, Doyle had to create a believable genius.  Holmes couldn't satisfy readers or project brilliance by simply ascribing titles and backstories to the people he observed; he had to be able to explain how he knew what he knew.  And that is where Doyle was truly brilliant.

Doyle was able to take simple elements of daily life, from splatters of mud on clothing to a dog's tooth marks on a walking stick, and extrapolate correlations and plausible causes from them in his stories.  Bear in mind, he wrote for his audience.  The distance of time between the original publication and the present day can lull modern readers into a casual acceptance of "that's just what it says," but that is a cheap form of acceptance!  Doyle made Holmes impressive because Holmes made perfect sense to readers in 1887.  His stories were authentic because they referred to tools, professions, crimes, international political climates, pets, clothing and customs with which his readers were intimately familiar.  And he did it so well that his stories were extremely popular in their day.  Had they been outlandish statements that didn't ring true with his readers, they would never have enjoyed such popularity.

Nevertheless, Holmes is still a fictional character.  Detractors of the stories will likely remind the reader that many of Holmes's deductions never reference any unspoken margin of error, and were furthermore dependent on the strictly defined social customs and not-yet-disproven pseudosciences of the Victorian age.  This is exemplified in the following passage from "The Adventure of the Blue Carbuncle," in which Holmes draws conclusions from trace clues found inside a hat.  His deductions only work in an era in which phrenology was accepted as science and women were expected to maintain their husbands' accoutrements, but Doyle's level of detail is nonetheless staggering:

“I have no doubt that I am very stupid, but I must confess that I am unable to follow you. For example, how did you deduce that this man was intellectual?”

For answer Holmes clapped the hat upon his head. It came right over the forehead and settled upon the bridge of his nose. “It is a question of cubic capacity,” said he; “a man with so large a brain must have something in it.”

“The decline of his fortunes, then?”

“This hat is three years old. These flat brims curled at the edge came in then. It is a hat of the very best quality. Look at the band of ribbed silk and the excellent lining. If this man could afford to buy so expensive a hat three years ago, and has had no hat since, then he has assuredly gone down in the world.”

“Well, that is clear enough, certainly. But how about the foresight and the moral retrogression?”

Sherlock Holmes laughed. “Here is the foresight,” said he putting his finger upon the little disc and loop of the hat-securer. “They are never sold upon hats. If this man ordered one, it is a sign of a certain amount of foresight, since he went out of his way to take this precaution against the wind. But since we see that he has broken the elastic and has not troubled to replace it, it is obvious that he has less foresight now than formerly, which is a distinct proof of a weakening nature. On the other hand, he has endeavoured to conceal some of these stains upon the felt by daubing them with ink, which is a sign that he has not entirely lost his self-respect.”

“Your reasoning is certainly plausible.”

“The further points, that he is middle-aged, that his hair is grizzled, that it has been recently cut, and that he uses lime-cream, are all to be gathered from a close examination of the lower part of the lining. The lens discloses a large number of hair-ends, clean cut by the scissors of the barber. They all appear to be adhesive, and there is a distinct odour of lime-cream. This dust, you will observe, is not the gritty, grey dust of the street but the fluffy brown dust of the house, showing that it has been hung up indoors most of the time, while the marks of moisture upon the inside are proof positive that the wearer perspired very freely, and could therefore, hardly be in the best of training.”

“But his wife—you said that she had ceased to love him.”

“This hat has not been brushed for weeks. When I see you, my dear Watson, with a week's accumulation of dust upon your hat, and when your wife allows you to go out in such a state, I shall fear that you also have been unfortunate enough to lose your wife's affection.”

“But he might be a bachelor.”

“Nay, he was bringing home the goose as a peace-offering to his wife. Remember the card upon the bird's leg.”

“You have an answer to everything. But how on earth do you deduce that the gas is not laid on in his house?”

“One tallow stain, or even two, might come by chance; but when I see no less than five, I think that there can be little doubt that the individual must be brought into frequent contact with burning tallow—walks upstairs at night probably with his hat in one hand and a guttering candle in the other. Anyhow, he never got tallow-stains from a gas-jet. Are you satisfied?”

“Well, it is very ingenious,” said I, laughing; “but since, as you said just now, there has been no crime committed, and no harm done save the loss of a goose, all this seems to be rather a waste of energy.”

Waste of energy, indeed.  But impressive, for both Doyle and Holmes.

As entertainment technology developed by leaps and bounds very soon after the introduction of Sherlock Holmes into popular literature, it is no surprise that Holmes started appearing onscreen as early as 1900.  It is hard to imagine any literary figure who has been adapted for the screen more times.  At present, Wikipedia lists seventy-three men who have played Holmes on the stage, the large and small screens, and radio.

The two actors who have most recently brought Holmes back into the public consciousness, Robert Downey, Jr. and Benedict Cumberbatch, have each reintroduced him to the world in their own way.  Their interpretations of Doyle's stories are strikingly original compared to previous adaptations, yet surprisingly respectful of Doyle in their respective steampunk and modern-day treatments of the stories.

Looking at it objectively, Guy Ritchie's first film adaptation of Holmes, starring Downey Jr., is much closer to the original material than most critics give it credit for being.  Sherlock Holmes, which I saw with my family on Christmas Day, 2009, draws much of its dialogue verbatim from Doyle's stories.  Of course, the story itself is a new narrative for Holmes, one with manifold problems, but a fun story nonetheless.  Where it succeeded most, however, was in its interpretation of Holmes himself.

In the stories, Holmes is constantly referred to by others as having skills and abilities which he used when necessary.  But Doyle was careful to structure his stories so that Holmes is never actually seen by Watson when engaged to the fullest extent of his abilities.  Holmes is shown to the readers via Watson as action in repose.  We only see him when his mind is doing the work, but throughout the short stories and novels, Holmes is talked of by others as a superb boxer, a chemist in the tradition of mad scientists, and an accomplished collegiate theatre actor who used his craft professionally to assume completely new identities while in disguise.

The screenplay of Guy Ritchie's Sherlock Holmes diverged from its source material by showing Holmes fighting and assimilating his disguises.  Whereas Watson's point of view, often catching nothing more than the aftermath of a fight or hearing the story of a journey in disguise from Holmes after the fact, is the reader's only glimpse of Holmes in the text, Ritchie's camera follows Holmes when Watson is absent.

Through this shift in viewpoint, we are treated to the Holmes that actually did exist in the text; the difference lies in which side of him we see.  Sadly, last year's sequel, A Game of Shadows, while having moments of brilliance, was markedly inferior to its predecessor, both as a film and as an adaptation of what makes Sherlock Holmes the character that he is.  When Sherlock Holmes gets too far away from London, he is no longer Holmes, and the most recent film inadvertently turned him into James Bond.  I will say, however, that the casting of Jared Harris as Moriarty is a decision for which I will never cease applauding.

Most recently, the BBC has brought an entirely new perspective to the Sherlock Holmes mythos, delivered through the mind of writer and showrunner Steven Moffat.  More and more, Moffat is styling himself as the Leonardo da Vinci of screenwriting.  He possesses a seemingly endless wellspring of creativity, and a propensity to turn viewers on their ears with plot twists, overlapping timelines and character deaths.  In the space of five years, he created and ran the underrated Jekyll, took over the writing of Doctor Who for its two most staggeringly complex seasons to date, and co-wrote the script for The Adventures of Tintin, only to leave Tintin early to become the guiding hand behind Sherlock.

True to form, Moffat wasted no time in making Sherlock thoroughly unique.  He accomplished this by doing something that no one else had done before: he placed Sherlock Holmes and John Watson in modern-day London.  Guy Ritchie and Robert Downey Jr. had created a very modern interpretation of Holmes, but they retained him in his original, Victorian environs, the overall effect being one of confinement for the character's personality.  By contrast, Moffat's reasons for total commitment to a modern setting were staggeringly obvious:

“We just decided we were going to update him properly; he’d be a modern man because he’s a modern man in the Victorian version, he’s always using newfangled things, like telegrams. He’s someone who appreciates and enjoys technology; he’s a bit of a science boffin, he’s a geek, he would do all those things. I just think it’s fun, I don’t think all the fantastic tech we’ve got limits the storytelling, I think you can use it in all sorts of ways.” [Link]

"Conan Doyle's stories were never about frock coats and gas light; they're about brilliant detection, dreadful villains and blood-curdling crimes - and frankly, to hell with the crinoline. Other detectives have cases, Sherlock Holmes has adventures, and that's what matters." [Link]

As previously stated, staggeringly obvious.  These reasons are also in keeping with the spirit of Sherlock Holmes as a character.  Just as the Victorian Holmes was always on the cutting edge of the era's science, publishing articles in print journals on the science of deduction, Moffat's Holmes does exactly the same thing, albeit with newer science and the internet.  Moffat even went so far as to placate hardcore fans with some long-awaited catharsis, allowing Sherlock to poke fun at the enduring image of himself as constantly wearing a deerstalker cap.  It could even be said that Moffat "lucked out" with the recent British involvement in the War on Terror in Afghanistan, which allowed him to retain even more of John Watson's original character as a wounded veteran fresh from the Afghan desert.

Sherlock Holmes, as Moffat indicated, is an individual who transcends the limitations of a specific time or place.  Furthermore, Moffat has shown deep respect for Doyle.  Obscure lines of dialogue and camera setups that perfectly mimic Sidney Paget's Strand illustrations make appearances in the BBC series, and are a never-ending source of delight for attentive fans.  To Moffat's further credit, he has kept the show confined to London for two seasons, with the exception of the obligatory Baskerville episode, apparently feeling no need to bloat the supposed importance of a case by giving it global or supernatural import.

The idea of Holmes as an eternally modern man is also why I can defend the Guy Ritchie adaptations, albeit to a lesser extent.  Culture evolves.  As Stephen Fry said, "Evolution is all about restless and continuous change, mutation and variation."  The more time that passes between the present day and that moment in 1886 when Doyle first put pen to paper and wrote Holmes into existence in A Study in Scarlet, the more necessary it becomes to update the adaptations to appeal to the very different culture that might be encountering the stories for the first time.

One can only hope that the Doyle estate protects Sir Arthur's stories from being tampered with or expanded by new writers (witness the recent continuation of the late Robert Ludlum's Jason Bourne novels under the authorship of Eric Van Lustbader), so that future generations may continue to experience the stories as they were written: not just as stories, but as a reflection of Victorian culture and a stellar example of the period's style of writing.

The Ritchie/Downey films have reached the widest audience in recent years, and they have their flaws.  However, they retain enough of the character's essence to make people want to read the books.  I am personally unprepared to admit that the BBC's Sherlock has any flaws, but I will concede that it is unconventional in its unabashed commitment to Holmes as a modern man.

Where too many literary fans of Doyle and Holmes make a mistake (and this holds equally true for fans of all book franchises which have been adapted for the screen) is in confusing the quality of a film or television show with the fidelity of the adaptation to its source.  Simply being different from the source material does not automatically make a film "bad" in any objective sense of screenwriting or production quality.

The root cause of many adaptations being popularly labeled "bad" is that good books tend to become the equivalent of good friends to devoted readers, and any deviation from what fans already know and love consequently feels like a very personal slight.  The more popular the book, the better the odds that subjective fan opinions will color popular opinion far more than objective reviews that weigh the adaptation on its own merit.

However, there is one point on which I believe all fans of Doyle's stories can agree.  If either of the two (soon to be three) franchises currently celebrating the writing of Sir Arthur Conan Doyle inspires its audience to seek out the source material and discover the brilliance of Doyle's work firsthand, then the adaptations, no matter how disagreeable to some fans, have succeeded.  And I think we can all be happy about that.
