Pilgrimage: A Preface

On Saturday, K and I leave for Israel with a group of young clergy and their spouses (many of whom we are already close with; the rest of whom I assume we’ll be close with by trip’s end) for just about two weeks.

In perhaps unprecedented verbosity, I’ll be attempting to post notes about each day of the journey. We’ll be in Galilee early on and spend the rest of the trip staying within the walls of Old Jerusalem and making various day trips to bibliohistorical sites (not sure whether I’m inventing words here). I’m told we’ll be hiking about 100 miles over the course of the two weeks.

It’s been a while since we made the decision to go on this trip, but it’s only now feeling real. In previous posts, I’ve expressed some of my thoughts as I’ve done the preparatory work for the trip. I hope that the mini-travelogue of the journey will bring some clarity to those thoughts and inspire new ones to share. As with K, I think my biggest fear is that the journey will not be so profound as we expect and hope for it to be. Only time will tell, but I have faith.

My second fear has been dispelled just this morning–the Church of the Holy Sepulcher had closed in protest of a tax issue in Israel, but has reopened today.

I will be spending what (little) downtime I have during the trip (that is not spent enjoying the company of my fellow pilgrims or posting my daily post to the blog) continuing to work on my novel in-progress and other creative endeavors, though I don’t expect to be posting any of that during the trip. It’s also high time for another Avarian short story to pique the interest of prospective readers, so I’ll likely be devoting some time to that after my return. I am also working on a pen & paper roleplaying game for Avar Narn, pieces of which will likely be posted to the blog (or a “living document” as I’m writing and working out the kinks in rules). In addition to providing an innovative ruleset and deep setting for fantasy roleplayers (that’s a high bar to set, these days), it will provide a resource for the background of the setting as well.

But for the immediate future, I hope you’ll join me on my pilgrimage to the Holy Land and that you’ll find something worth considering as my progressive and existential theology meets the geography and history of the place where the Bible took form, where the Israelites became a people and where God came to Earth. I hope that you’ll leave comments, thoughts and questions on the upcoming travel posts–I will endeavor to respond to anything posted to an entry by the day that follows (in local time).


Kingdom Come: Deliverance – Playing at History (an early review).

I backed Kingdom Come (KC:D) a long time ago–maybe more than two years. An open-world historical medieval RPG? Yes, please! Just the sort of thing that pulls at the desires of a person whose favorite video game is the Witcher 3 and who, for a time, was a professional student of the medieval.

There was, of course, a long roller-coaster of development that followed–teasers, delays, the realization that my computer wouldn’t be able to run the game, the revelation that it would be released on console and so my computer didn’t matter, etc., etc.

Finally, it arrived this week, and I’ve spent some significant time becoming immersed in the Bohemia of 1403. With the caveat that I’m nowhere near finished with this game, this is what I have to report so far:

If you are the type of person who plays Fallout and Skyrim on survival mode, this game will appeal to you. You must sleep and eat. Your food rots over time, and spoilt food will make you sick. Eat too much and you’ll be sluggish. Take an injury (whether in combat or not!) and you might begin to bleed. Fix it with a bandage quickly or prepare to die. Keep your weapons and armor in good repair or they’ll become ineffective. Get your clothes bloody or dirty and people will notice–and they don’t take you as seriously when they do. Carry weights are (relatively) realistic, and you improve your skills by using them–not easy to do when it comes to using a sword.

The game is relatively “on rails” for the first few hours of play–while you can do your own thing for long whiles at a time, only advancing the main quest will get you to the point where you can seriously begin to play the game. It’s a slow start that left me, at first, with an unfavorable impression of how gameplay would develop, one that is still being dispelled as I move through the game.

So far, the game doesn’t feel as “open world” as I had hoped. It is true that there are sidequests (and perhaps I just haven’t discovered many of them yet) and you can easily spend hours just “living” in the medieval world–practicing a trade, acting as a merchant, traveling and fighting bandits, etc. In a certain way, I think you could ignore the quests altogether and simply view the game as a “medieval emulator.”

Further, there seems to be an intimation that the world will be expanded and even more opportunities for self-directed tasks will become available as the game progresses. Despite my several hours of play, I’m sure that I just have not gotten that far into the story yet.

And that main story is, at least, an interesting one. Set within a discrete historical event–King Sigismund of Hungary’s invasion of Bohemia on “behalf” of his half-brother King Wenceslaus IV (“the Idle”), whom Sigismund had kidnapped–you are thrust into the world as the son of a blacksmith and the vassal of a lord loyal to Wenceslaus and targeted by Sigismund’s invading army.

The attitudes and motivations of the characters seem deep. You get the expected behavior of some nobility toward the peasantry (particularly in Sir Hans), but this is never flat or without nuance: you earn the friendship and respect of Sir Hans as the story progresses, and he is–in private at least–willing to admit his own faults and the shortcomings of his behavior. The struggle between adherence to duty and ideals when faced with the grim necessities of the day plays out on multiple levels, both personal and political. No assumption of medieval life is treated as straightforward, with a range of different lifestyles and living situations that more accurately portrays the era in a way we often miss in movies, dry history books and, especially, fantasy roleplaying, where the “medieval” is more often a pastiche or a facade than an actual description of setting.

Despite this, at least as far as I’ve played, the real joy of the game is in the way it immerses you in the historical world with a sense of realism and reasonableness. For instance, fighting several poorly armed bandits by yourself is difficult; attacking multiple well-armed or well-trained enemies (to say nothing of those who are both) is near suicidal–unless you use tricks like stealth, surprise and ambush, weakening the enemy with ranged weapons, hit-and-run tactics, and any other approach that generally makes the fight less fair. This was the reality of the Middle Ages, just as it is today–no matter how good you are, fights are brutal and deadly, and fighting honorably will likely just get you killed.

Each fight is, however, very interesting. As a student of historical medieval martial arts myself, both as a scholar (my Master’s Thesis was entitled “Shakespeare, the Sword and Self-fashioning”) and as a martial artist (mostly with the Association for Renaissance Martial Arts, or ARMA), I’m especially keen on in-game fighting that captures something of the speed, grace and precision of actual swordplay–something very difficult to do in a video game because of the infinite array of techniques, maneuvers and responses in combat with a blade. KC:D does the best I’ve seen yet, with the combat not only accounting for the directionality of attacks, but incorporating parries, feints, grappling, counter-attacks and animations that perfectly capture some of the techniques used. This is no clumsy hack-and-slash; the only video game that has even come close to this kind of swordplay was Mount & Blade (whose new edition should be out later this year). While satisfying, this also means that combat is difficult and partially based upon your own twitch skill. It should be noted that there is only one difficulty mode for the game (so far as I’ve discovered): realistic.

As a side note, I am not a fan of the Dark Souls games. I just feel that should be said when I communicate how much I’ve enjoyed the difficulty of this game.

For the first few hours of the game, I was very frustrated by the save system. The game automatically saves when you sleep, complete an important quest step, or drink Saviour Schnapps. Saviour Schnapps is expensive, takes up inventory space, and can get you drunk. At the beginning, when your skills are low and the game is at its most difficult, you will die a lot and have to replay moderate sections of the game (at least I did). As I progressed into the game and got into the mindset, I actually began to enjoy the save system. In a game that strives for immersion and realism, this save system reinforces both without becoming full-on rogue-like. You cannot get lucky for a minimal gain, save, and replay until you get the next minimal gain and save again. Three men in armor down that path? Best just to go a different way. This goes a long way toward breaking the hero mentality we usually carry with us into video games; I particularly respect that.

This is not to say that playing heroic (or superheroic) characters in games is not appropriate, good design, or fulfilling–it certainly can be. But the occasional game that makes us live in an alternate world as a regular person–even one who may be an exceptional fighter (though still clearly mortal)–provides a truly rewarding exception as well. In some sense, I do wish the game had some aspect of the fantastic to it, but that’s really only because I’m such a fan of fantasy. Realistically (and more sensibly), it’s great to see such an enjoyable game and interesting world and narrative created without any need to resort to the “unrealistic.”

As is probably indicated by the number of words I’ve dedicated to this preemptive review, I’m really enjoying this game. If you’re willing to devote the time to acclimate to this game’s approach to play–and you’re willing to accept the design principles on which the game was built–I think you’ll find a lot to enjoy here.

In some ways, at its heart, this game is a history lesson you play–one about everyday living in the medieval world.

Destiny 2: A Horror Story in Reverse

I’d fought my way through waves of countless enemies, scaled strange landscapes and tracked down my quarry, a Fallen Captain supported by underlings, powerful Servitors and other baddies all determined to end me.

Getting to the point where I’d finally cornered my prey and he could no longer flee had cost me dearly–I had no ammunition for my Power weapon, only a handful of rounds left on my Energy weapon, and my Super would not be charged for what seemed like an eternity.

Desperate, I charged in, Kinetic weapon blazing. Return fire shredded me to pieces in milliseconds, my body ripped apart. I died.

Seconds later, I was back, resurrected by my Ghost companion. Seconds after that, I was dead again, but so was one of the Captain’s minions. This process repeated again and again–I respawned, I took out one more enemy, I died.

But respawning in Destiny is not merely a handwaved mechanic–it is a conceit of the gameworld. As a Guardian of the Light, your Ghost has the ability to reconstitute your body infinitely. There is no death for a Guardian.

As I whittled down my enemy through sheer will, pure attrition and an unending supply of lives to throw at the problem, I began to think how that Fallen Captain must feel, watching as he repeatedly defeats an enemy who simply returns a few seconds later to destroy more of his brothers-in-arms. Movies like Friday the 13th and Halloween immediately came to mind–the unstoppable, unkillable force who relentlessly pursues his vengeance.

The terror and helplessness the Captain must have felt surely became too much to bear. I shortly relieved him of his worldly worries, but I can’t say that I felt good about it. Certainly not heroic (no matter what the difficulty level told me).

That’s when I realized it: Destiny 2 is not sci-fi; it’s a horror game where you play the role usually referred to as “the bad guy.” While the world does set things up as a struggle between Light and Darkness, and you are told that you’re on the side of peace, truth and justice, and your enemies do some despicable things, I’m not sure that the gameplay bears that out.

Destiny 2 was not a game I expected to give me some sort of existential crisis; I was only looking for some fun co-op with friends or a mindless activity for my hands while I listened to an audiobook. But what I got was a great uneasiness about the setting, one I can’t seem to shake.
A Rebuttal to Materialist Science

Yesterday, I came across an article (“Are you sleepwalking now?”) on the digital magazine Aeon that I could not help but respond to, because it seems to be such a patent example of someone misusing science to “prove” things well beyond science’s ken.

The article is here: https://aeon.co/essays/are-you-sleepwalking-now-what-we-know-about-mind-wandering. It is well written and certainly thought-provoking, so it’s potentially worth reading on its own. More to the point, it is required reading for this post.

To practice what I preach, here’s my fair disclosure at the beginning, in case this is the first of my posts that you’re reading: I’m a faithful progressive Christian who believes in both science and God. As an existentialist theologian and somewhat of an epistemological pessimist (I’d say “healthy skeptic”), I believe that personal consciousness and experience is the foundational starting place for examining metaphysical questions. Hence why I might take the article so personally–though I think that my arguments stand on their own, and I’m explicitly trying to go out of my way (unlike Dr. Metzinger, I think) to admit what I believe that cannot be proved and what does or does not actually meet the standards of scientific inquiry.

The article was posted on the 22nd by Dr. Thomas Metzinger, a professor at the University of Mainz, where he teaches theoretical philosophy with a focus on the philosophy of the mind (the subject of his article). He has written numerous books, given a TED talk and is undoubtedly a highly intelligent person well-versed in the subject matter.

Nevertheless, I have to take issue with the assertions he makes in his article.

The article begins with what I can only describe as a masterful metaphor for the movement of “thoughts and ideas” from un- or subconscious to conscious, one that equates them to the motion of dolphins traveling at speed, occasionally breaking the surface of the water and often under it.

From there, Metzinger poses the questions he believes he can answer. He writes, “Philosophers of mind often fall into the trap of assuming that goal-directed, rational thought is the paradigmatic case of conscious cognition. But if we are only ever partly aware of what is happening in our own minds, surely we can’t be in absolute command of our thoughts, let alone causing them? Is it ever possible to distinguish between mental actions, which we can direct and select, from the more general category of mental events, which simply happen to us? In what sense are we ever genuinely mental agents, capable of acting freely, as opposed to being buffeted by forces beyond our control?” (emphasis Metzinger’s).

This question is perhaps the most fundamental philosophical question when it comes to thinking about the mind. Experientially, I think that we can agree that we have thoughts that we would assert we have consciously and willfully called to mind and formed, and thoughts that seem to be generated spontaneously and inexplicably—in other words, the conscious and the subconscious.

The only complaint that I have with Metzinger’s formulation of these questions is the rhetoric that subtly slips in to begin his arguments from the inception of the question. On the other hand, this is easily forgivable as something most, if not all, of us are likely to do even unintentionally.

The next paragraph begins Dr. Metzinger’s tenuous assertions. Relying on the “empirical findings” of neuroscience and experimental psychology in mind-wandering, he asserts that, “Much of the time we like to describe some foundation ‘self’ as the initiator or cause of our actions, but this is a pervasive myth” (again, emphasis is Dr. Metzinger’s).

Here’s my first complaint: there is no description of these “empirical findings.” Dr. Metzinger does not explain what experiments have been conducted, whether they are peer reviewed, whether they have been replicated, what the specific results are—or, really anything other than that they exist and we should allow him to interpret them for us. This is not evidence; this is the basic rhetorical technique of asking the audience to rely on your authority as evidence enough.

The first sentence of the following paragraph gets to the heart of the matter: “Mind-wandering research suggests that we need to get rid of naïve, black-and-white distinctions such as ‘free-will’ versus ‘determinism’, ‘conscious’ versus ‘unconscious’, and what philosophers call ‘personal’ versus ‘subpersonal’ processes (roughly, accounts of cognition that look at the whole person’s reasons and beliefs, versus those based on biological or physiological functions).” What!?! How did we go from “empirical findings” suggesting that there are a lot of subconscious activities going on to positing that we should look to a solely biological basis for consciousness? This is a logical non-sequitur in the extreme.

Nevertheless, the statement is revealing: it’s a 21st Century version of the “bag of chemicals” argument made in the early 20th Century (i.e., that all of our thoughts and actions are really the result of chemical reactions in body and brain without any real volition or self) so readily rebutted by G.K. Chesterton in Orthodoxy.

Rather than solely referring to Mr. Chesterton (whose arguments should most definitely be read), I’ll point out a few of the specific problems: (1) the lack of any evidence provided for this; (2) the lack of consideration of the broader findings of neurological research (which I’ll refer to in more detail in a moment); (3) the solipsism and circularity of the argument (how is it that Dr. Metzinger is so special as to realize the falsity of the illusion and then to explain it to others by random chance of his own mental events?); (4) the complete and willful ignorance of the human experience. We might phrase the last objection in terms of Occam’s Razor: which is more likely, that when we feel we are exercising our will we actually are, or that there are multiple, subtler and (so far) inexplicable mental processes going on that cause this illusion?

In the case of neurological research that seems to point to other than a solely materialistic explanation for cognition, I’d point you to Dr. Mario Beauregard’s The Spiritual Brain: A Neuroscientist’s Case for the Existence of the Soul in counterargument. In that book, Dr. Beauregard (a neuroscientist rather than a philosopher) explains how in certain experiments regarding addiction relief, it has been shown that the active cognition of the mind can actually alter the material function of the brain over time by creating new neural pathways. The whole topic of “neuroplasticity,” which is showing us that our brains remain more subject to change in adulthood than we previously thought, seems to cut against Dr. Metzinger’s argument.

As a caveat, when Dr. Metzinger says we ought to get rid of “black-and-white distinctions,” I think he’s right in that we need more complex and nuanced ways to think about the topic of free will as some interaction between personal volition and external (perhaps even deterministic) influences. But this is nothing new in the philosophy of the mind (or theology, for that matter), and I’ve argued for such a position myself in previous posts. But when Dr. Metzinger’s seemingly-suggested resolution is to ignore one half of the equation entirely, we’re stepping backward instead of forward.

The logic further falters as Dr. Metzinger continues, writing: “As the dolphin story hints, human beings are not Cartesian egos capable of complete self-determination.” I would remind you that the dolphin story is a metaphor; by itself it cannot logically hint at anything except to the extent that the metaphor can be shown to validly represent the things it is trying to explain (and this article contains none of that).

There’s a glimmer of reason after this, though, where Dr. Metzinger says, “Nor are we primitive, robotic automata. Instead, our conscious inner life seems to be about the management of spontaneously emerging mental behavior. Most of what populates our awareness unfolds automatically, just like a heartbeat or autoimmune response, but it can still be guided to a greater or lesser degree.”

I’d like to point out in the above that Dr. Metzinger wisely uses the words “seems to be” to indicate that he is speculating here. The problem, though, is that despite these subtle hints about the actual logical foundation of his argument (being very slight), he presents most of his ideas as authoritative through the rest of the article’s language.

For the sake of time and space, I’m going to skip a few paragraphs where Dr. Metzinger discusses the positive and negative effects of daydreaming. He continues, “My view is that the mind-wandering and the DMN [what he calls the default-mode network of the active parts of the brain during rest periods] basically serve to keep our sense of self stable and in good shape. Like an automatic maintenance program, they constantly generate new stories, weaving back and forth between different time-horizons, each micro-narrative contributing to the illusion that we are actually the same person over time” (this time, emphasis is mine).

Again, Dr. Metzinger begins with words of speculation (“My view is…”) but then makes assertions as if they are fact. He’s put the cart before the horse here by assuming that the idea of the self is an illusion rather than a reality. And he’s done that without any evidence whatsoever. It seems here, as I think has become fashionable for some intellectuals investigating the still relative terra incognita of the mind, that he assumes a Buddhist sort of worldview and then forces the science to fit that mold. But the Buddhist idea that the self is an illusion is a religious and philosophical idea, not a scientific one. There is no defensible logic to starting with that assumption and working backwards. That’s simply not how science works.

The truth will out, as they say, and it certainly does in the next paragraph. Dr. Metzinger writes, “I should come clean at this point and confess that I don’t believe in any such entity or thing as ‘the self’” (emphasis mine). It’s a little late in the game here to make that confession—honest scholarship starts with a confession of biases that are known to the writer and probably unknown to the reader, so that the reader can read critically. I think that this drives home the disingenuousness of Metzinger’s burying the language of speculation beneath such extensive assertions of truth.

But it’s the assertion itself that is so ironic—who is making the confession if there is no self? The sentence, under Metzinger’s argument, is itself nonsense. And therein lies perhaps the biggest problem with the materialist approach to the mind—even the people who maintain that position cannot (and do not seem to try to) live as if it were true. The only way it is possible to interact with the world is through an understanding of self. That understanding may see itself as more or less connected to everything around it, but no one acts or thinks without reference to an “I.” If that “I” is an illusion, then there’s really no “I” to make the discovery that it is an illusion in the first place. Hence the circularity of this kind of logic.

To drive the weakness of Dr. Metzinger’s philosophy home, he then refers to “evolutionary psychology,” that perennial favorite of materialist thinkers like Richard Dawkins and Steven Pinker. Evolutionary psychology is the field of making unfalsifiable assumptions about the development of the brain (and therefore mind) according to subjectively selected “societal needs” and then presenting those assumptions as fact. Dr. Metzinger joins in by arguing about the societal role of the “fiction” of the self, how “[h]umans have evolved to be a bit like method actors,” and asserting that “The self-as-agent is just a useful fiction, a neurocomputational artefact of our evolved self-model.”

This statement is unfalsifiable by scientific method because consciousness and self are, by their very nature, subjective. And yet, Metzinger presents his assumptions as the inevitable conclusions of science, despite the fact that neither true scientific method nor basic philosophical logic would touch such a conclusion with a 10-foot pole. Further, Metzinger delicately (and probably quite deliberately) avoids issues like the “hard problem of consciousness” by simply denying that there is one.

In a further bout of spontaneous honesty, he writes: “But just as there is no ‘real’ character, there’s also no such thing as ‘a self’, and probably nothing like an immortal soul either.”

Metzinger is, for such an esteemed scholar, remarkably willing to conflate belief with fact and then to work backward from there.

I think it is sufficient to stop the detailed rebuttal of Metzinger’s argument there, as the rest of the (lengthy) article simply repeats the same logical errors, rhetorical sleight-of-hand and materialism as religious belief (in that it is the given from which all other inquiry begins) presented as science.

On the one hand, perhaps it is the arguments of the religious that have generated this kind of reactionary response. When we deny the usefulness of science because of religion (which, as I’ve often argued, we oughtn’t), it seems a natural (though not logical) response to use science to deny religion. And that’s really what these kinds of arguments are ultimately about (otherwise, why explicitly deny the existence of an immortal soul when the very argument makes such a distinction meaningless?).

Frankly, I’m tired of it, on both sides. I’m tired of atheist materialists trying to claim philosophical and metaphysical truth through science and I’m tired of fundamentalist Christians denying evolution because the Bible doesn’t mention it.

To be clear, I have no problem with atheists saying that science leads them to believe in a solely materialist explanation for existence—they’re well within their right to draw that conclusion, even if I think it is the wrong one, just as some are led to faith because of their interpretation of metaphysical likelihoods based on science. Reasonable people may disagree, as we lawyers like to say. It’s when they claim that science proves their belief that I become offended as a person of deep faith who nevertheless is willing to make careful distinction between what science shows us (and often defers to science to inform theology) and what must be left to faith and belief.

At the same time, I’m upset both by the closemindedness and bad theology of those who question science based on Scripture that in no way asserts that that’s a proper (or even valid) way to analyze the world, and by the fact that, knowing I’m a Christian, many people with whom I’d like to have a real (and respectful) conversation about these kinds of topics will not listen because they somehow assume I’m that kind of Christian.

As I’ve said many times in the past, science is simply not equipped to answer metaphysical questions, which unfortunately must be relegated to the realm of belief, conviction, uncertainty and doubt. Let’s use science to examine and explore the material world, to learn what we can about all that we can. But let’s also admit when science is of no use and properly categorize those beliefs about the metaphysical as matters of faith, no matter who they come from, believer or not.

Sacrifice and Eternity

I don’t know what the world to come will be like. I have no special insight into what happens to us and where we go when we die, or what will occupy us in eternity. Like most of us, I suppose, I have my hopes and comforting beliefs about what the life eternal will be; my stubbornesses about which I say, “If there’s not X, I’m not going,” like I have some control over the situation; those feelings that–every rare once in a while–make you think, “This. This feeling lasting forever; that must be heaven.”

I try not to cling too tightly to any preconceived notion of what awaits us, instead trying to trust in God that it will be far greater than I could imagine anyway. Frankly, I’m not too concerned about whether heaven is some entirely spiritual dimension of existence or embodied life in a world restored by God to perfection on the last day.

But there is one thing that I do believe strongly: even in the life to come, there will be sacrifice.

I don’t mean sacrifice on the cosmic scale, no dying that others may live, no giving up all that I have so that others may have something, not the sort of things that make us look at others who can make those kinds of sacrifices with such awe and respect. I mean the more mundane, everyday sacrifices we’re already called to make. The lesser sacrifices necessary to mutual relationship: I’d rather do X in our free time, but since you want to do Y, let’s do that instead and we can do X another time. That sort of thing.

Perhaps this sounds a strange thing to fixate on, but I think it’s a necessary expectation based on the few things that the scriptures and traditions of our faith do seem to tell us about the world to come. Christ promises us “eternal life.” The words used to create that phrase in the New Testament carry the connotation of not just surviving, but thriving–active, vigorous, fulfilled, unceasing, experiential life.

That means a few things. First, we will be ourselves–perfected perhaps, the dross burned away–but recognizable as ourselves. And why wouldn’t we be, if we believe that we are purposefully, “wonderfully and fearfully made?” Second, we will not be idle. I don’t know what kind of activity is planned for us–though I’m fairly certain it will not be sitting on clouds strumming harps, singing ceaseless hymns and trying not to think of toilet-paper commercials. Third, we will be together. There’s no good and dependable answer to the question of who “we” is (I believe, based on my limited knowledge of God, that all will be there eventually), so let’s sidestep that conundrum for now. Fourth, we will be in relationship with God and others forever. Again, why wouldn’t we be? The overarching narrative of both the Bible as a whole and the Gospels in particular is that ours is a God who values relationship.

Let’s look at those things together. We’ll be ourselves. That means that, just like we do now, we will have our personalities, our preferences, our likes and our dislikes (we can go down some dizzying rabbit trails trying to think about the scope and limitations of what preferences and personality traits qualify as being righteously permissible, just as we could with what kinds of activities will be permissible, but let’s not today). We will do things. As a person with preferences, there are some things I like to do better than others. And we will do them with others.

When people come together, it is natural for personalities to clash at times, for preferences to butt up against one another. Unhealthy relationships can be ruined by this, but healthy ones engage in minor sacrifice to work out such petty disputes. The relationships that do become stronger for it–when you know that I’m willing to give in for your sake sometimes and you’re willing to give in for mine at others, we know that we mean something to one another.

It is fashionable to talk about this phenomenon as a quid pro quo these days, “deposits and withdrawals from the love bank” if we’d like to resort to some especially cloying purple prose. Human nature being what it is–and some people being who they are–I suppose that sometimes that’s a fitting description. But for the best relationships, you don’t make those compromises simply to get future compromises; you do them simply because you care about the other person.

There’s every reason to think that the relationships we enjoy in the life to come will be the best of relationships. For that to be the case, we will eternally need to make some minor sacrifices for each other at various times.

So best to start practicing now, because some aspects of the life to come are not about the where, the when and the what; they’re about the how, how you see others and your relationships with them. That’s one small reason that we say that the Kingdom of Heaven is both a present reality and a future promise: there are aspects of perfected existence that we may participate in here on Earth if we’re willing to. Christ’s incarnation accomplished more than cosmic salvation–it gave us an example of how to think and act so that we do not have to wait to participate in the fullness of the life to come.

As Milton says in Paradise Lost, “The mind is its own place, and in itself can make a heaven of hell, a hell of heaven.” Jesus showed us how to do the former. Those happy little sacrifices we make for the ones we love are an essential aspect of creating that double joy of relationships that both affirm each person involved and become something greater than either. No reason to think that will change in the world to come.

History and Historicity

I wrote in a recent post about some of the difficulties with issues of history and historicity in the Old Testament I’ve had in preparing for my impending journey to Israel. Having had some time to clarify my thoughts, I thought I’d share them.

First, I want to focus on an exemplum of my thoughts and then I’ll speak more generally. Let’s begin with the Babylonian Captivity. Or, rather, a little bit before that.

In 1 Kings 18, the prophet Elijah confronts Ahab, the monarch of the Kingdom of Israel, on Mount Carmel in a rather memorable set of contests. Really, Elijah is confronting the worship of Baal in the Kingdom of Israel here, but Ahab is culpable for allowing the Israelites to stray from the worship of Yahweh alone.

The four-hundred and fifty prophets of Baal are asked to pick one of the two bulls brought to the mountain, to cut it to pieces and to lay it over the wood; Elijah–as Yahweh’s sole remaining prophet–will do the same with the other. Then they will each call upon their respective gods and see who “shows up.” As the Baalite prophets beseech their god, they get no response. With memorable taunts (Maybe your god is sleeping and needs to be awakened? Maybe he’s traveling? Maybe he’s busy defecating?), Elijah insults Baal’s prophets until it comes time for him to beseech Yahweh. When he does, the Israelite God sends his “fire” down to earth to light the prepared wood and burn up the bull carcass along with the stones, soil and water prepared around the altar. After this, the prophets of Baal are slaughtered by the gathered people.

I’m not actually interested in the historicity of this particular story but in what it tells us about the culture of the time (Ahab’s existence is attested outside of the Bible and he was probably king of Israel around the middle of the 9th Century BCE). As we find in the cultures surrounding Israel-Palestine at that time, gods were viewed as local; they were the gods of particular cities or nations. We see this explicitly in other places even in the Bible, where the Israelite God states that “he” is the God of Israel (hence the epithet “Israelite God,” I suppose).

What’s happening between the lines in this passage in Kings is a divine turf war. Baal (which is a title that means “lord” and which is borne by several distinct deity figures and used generally to mean “a god”) is a god of the Phoenicians in the city of Tyre. If you look on a map of Biblical Israel, you’ll see that Tyre is on the coast of the Mediterranean Sea (on an island, actually) just a short journey north from Mount Carmel. The question being answered by Elijah’s story is, roughly put, “If Baal is the god of Tyre, and Yahweh is the god of Israel, and they’re both geographically close to one another, which has dominion in the middle ground?” Clearly the answer is Yahweh.

I mention the above passage because it sets us up for the real point about history and historicity in the Old Testament that I want to make in this post. When in the (very early) 6th Century BCE the Babylonians under Nebuchadnezzar sacked Jerusalem and deported the Israelites to Babylon, a crisis of faith occurred. Again, as a brief aside, this event is attested in the historical record outside of the Bible. If the Israelites were to worship the God of (the nation/land) Israel, how could they do that when they’d been transported to Babylon, the land of the Babylonian gods?

And here comes the prophet Ezekiel. In the verses that open the book that bears his name, Ezekiel tells us that he has a vision of God while among the exiled Israelites in Babylon “on the banks of the Kebar River.”

In this vision, as the Biblical historian Cynthia R. Chapman says, “God gets wheels.” Literally; Ezekiel sees God enthroned upon what I can’t help thinking of as a super-high-tech, four-likeness-of-living-creature-powered motorized wheelchair. That strange image aside, the point of the vision is that the God of Israel is mobile, that God is personally and actually present with the Israelites even in their exile. As a side note, my NIV says that Ezekiel is taken back to the “Kebar River near Tel Aviv”–this should be read as Tel Abib (in modern-day Iraq) by the Chebar River.

Hearing about the underlying spiritual-cultural concerns with regards to these (and other) Old Testament passages did much to “resolve” my problem of “historicity” in the OT (for purposes of this post, I have left aside all of the issues of the construction of the Old Testament text–whether discussion of the three hypotheses of its construction or the timing of its creation).

What I find here is something that makes much more sense to me than either extreme of the historicity debate–humans writing stories of their evolving understanding of and relationship with God. These stories are neither entirely myth nor entirely history; they are stories that draw upon historical experience (and the religious issues raised by that experience), mythological content that may or may not be based in fact (I’m not worried about the answer to that), revelation of the nature of God from God (there’s that spirit-breathed bit), and human reactions and struggles in response to that revelation.

I see this especially as the Israelite understanding of the nature of God breaks free from social precedent and evolves from polytheism to henotheism to true monotheism.

In some ways, what we have in the Old Testament is the macrocosm of Jacob’s struggle with God at Penuel–a back and forth between God and man that may defy explanation but results in relationship.

Does that make interpreting the Bible difficult? Absolutely; I don’t have an answer for you on how we best sort God’s intent from the voice of the writers from the historical record from the cultural context, etc. But I’m certainly willing to say that it’s not supposed to be easy. I can’t imagine that God would decide not to directly appear before all people in an unmistakable way (which, to be clear, God hasn’t) and yet make Biblical interpretation something as simple as looking at words verbatim.

In the near future, I’m going to return to the Babylonian captivity and the Book of Job to talk a bit about theodicy in Christianity.

 

This Year in Jerusalem

In less than two months, I’m going to Israel. This will be my favorite kind of trip–a study trip with accompanying professors, homework beforehand, and the goal of coming to understand the cultures, history and geography of Israel and its surrounds to better understand scripture.

I am fortunate enough to have this opportunity because of K’s ministry; this is a trip for young clergy and their spouses. Some of my favorite people are going and we could be going to spend two weeks on the Jersey Shore (place or TV show) and I wouldn’t mind. But we’re going to Israel.

We’ve been given books to read and maps to study and mark up before we go. For me, it is a fascinating and tedious process–I see mentions of peoples or places or cultures from the ancient past and go on hours-long rabbit trails that conclude with me writing prodigious notes to myself on the backs of maps that have little to do with the main point of study. Did you know that there’s a good chance that the near-mythical “Hanging Gardens of Babylon” were actually constructed by Sennacherib in Nineveh? The same Sennacherib mentioned in the Book of Kings as carrying off people from the Kingdom of Israel and nearly destroying the Kingdom of Judah?

None of that time is wasted, though I imagine it’s going to annoy the hell out of K as I offer (unsolicited) trivia throughout our trip. This course of study has made me feel like I’m in grad school again and–being the perpetual student–I’m loving it. Even more so, it’s directed my thoughts in some ways that I believe will profoundly affect both my theological work and my fiction writing. Many of the things below will likely see their own posts and soon, but here’s some of what I’ve been brought to ponder lately:

Studying maps and geography has been eye-opening in terms of Biblical (and historical) understanding. There are so many things that become clearer about both the broader historical context and the context of passages in the Bible when you understand (however abstractly at the current juncture) the “lay of the land.” The locations and movements of the Babylonians, the Assyrians, the Egyptians, the Arameans, the Edomites and the Moabites and all the other ancient cultures of the Levant shed light on the context of the Israelite people as they journey from polytheism to henotheism to monotheism (and back and forth quite a bit). The strategic importance of the Levant on the world stage and the many times it has changed hands over the millennia is just fascinating. There is much to be said about all this (much of it already written by experts in books, so who am I to spend lots of words on it?), but suffice it to say that, though I have studied history extensively, I am only now starting to see things come into clarity as I locate events and peoples on the map relative to one another. Even motives become clearer when you put things together. All in all, I’m actually a little disappointed that, in all of my historical studies, none of my classes spent much time (if any at all) using the geography of the time and place to put events in context.

Of course, all of this study of geography makes me think of Avar Narn. This, coupled with the fact that I’ve recently spent some time looking through Tolkien’s old maps and drawings and their evolution has me going back to the layout and maps of my own fantasy world. I have posted some previous maps of parts of Avar Narn on this site, but I am afraid to say (but not really) that they will soon be obsolete and superseded as I replace them with a more thorough effort at consistently mapping the world (or at least the continent where most of my stories will be set) in line with its history, cultures, and languages. I am beginning to very much envy Martin’s choice to name everything in Westeros in English. In breaks from the map work for my impending journey, I’ve re-sketched the continent, placed it longitudinally and latitudinally, worked out air and sea currents (as best I know how) and, correspondingly, the various climate zones of the space.

Because of some changes in the history of Avar Narn which have occurred both in direct rewriting of that history and from ideas better explored (or created) in the first Avar Narn novel (still very much in progress, of course), there will be further changes to the locations of some of the nations and peoples as shown on the previous maps. Alongside that, I’ll be doing some renaming of locations to tie them in better with the linguistic work I’ve done for the setting. There will definitely be more posts about all of this in the future.

Back to the theology side, where I’ve also been thinking a lot about historicity in theology. This has been something of a journey and a struggle for me. Intellectually–as I’ve communicated through the blog before–I don’t believe that concern with historicity is the most fruitful or meaningful of concerns in understanding our faith. I believe that the text of scripture is divinely inspired–but also filled with human agency and not to be taken literally–and that it’s just not that important to know whether the Exodus actually happened as written (there’s no corroborating evidence that it did). What is important is what the story of the Exodus tells us about our God and the evolving Israelite understanding of God as a macrocosm of the way in which the individual gradually struggles with an understanding of the divine.

At the same time, I cannot deny that the historical context of the Biblical writings is illuminating in its interpretation. And in addition, I find that archeological and historical research can help us understand the human motives in the Bible and perhaps filter some of that from the divinely inspired aspects. For instance, there’s very little evidence that the conquest of Canaan as told in Joshua occurred; the gradual occupation of the land by the Israelites as told in Judges seems far more likely.

I’m fine with this; it doesn’t bother me that the Bible may not be historically accurate on all fronts–it’s not meant to be a history (well, maybe Joshua and Judges are somewhat) and has a different purpose for us that should not be confused with absolute historicity. In the same way, Genesis is not intended to be a science manual for the creation of all things.

What I’m struggling with is how emotionally bound to questions of historicity I find myself (in spite of myself). There’s a pang of upset in my stomach when I find that an event as told in the Bible is probably not actually how something happened.

I believe in the historicity of Jesus, but I’d be a Christian even if I didn’t–I believe that the story of Jesus tells us Truth about the nature of our Creator, Sustainer and Savior that is unfettered by our existential reference points. So why do I find myself caring so much about the answers to historicity? Especially when it’s clear that–in broad strokes at least–the Old Testament narrative about the Jews’ evolving relationship with God is borne out by the historical record. I think that this is a matter of wanting things to be simple, clear and absolute. And this from someone who finds great wonder in the complexity, ambiguity and mystery of existence! Perhaps that’s what it comes down to–a minor crisis of identity. And that makes me wonder whether all obsessions with the historicity of Biblical events spawn from that very human concern.

In general, I think that the Christian testimony–as an unfairly broad generalization–would be perceived as more reasonable (which I believe it inherently is) if we were able to communicate our faith in a way that incorporates questions of historicity without being dominated by them. Many a man has gone in search of Noah’s Ark (with many claims to have found it), but even an indisputable confirmation of its existence would not tell us much about who God is that the Bible doesn’t tell us already.

I’m sure you’ll hear more about that as I post (to the extent that I’m able) from Israel, where the question will, in some part, be a constant concern of mine (if not all of us traveling together). Stay tuned for more on all fronts.

See my journal of the trip here.

Punctuation

This is going to seem like a relatively random posting, but as I’ve been writing on my novel, reviewing a friend’s novel and having some discussions about Biblical interpretation, I’ve been thinking a lot about punctuation lately. Here are some of my musings:

Punctuation is critical in all forms of writing; understanding and properly using punctuation lends authority to anything you write. In my experience, most people do not use proper punctuation. I don’t mean that they make occasional mistakes in their punctuation–everyone does that. I mean that they flagrantly ignore the rules of punctuation and how to use (and intentionally misuse) those rules to greatest effect.

The most egregious culprit is the semi-colon. When I was a graduate student and teaching assistant in English, I would (usually frustratedly and spontaneously after grading the first round of tests) spend a class period reviewing grammar and punctuation with my classes, with a particular focus on the semi-colon. Much to my dismay, what I typically found is that after this session, many students would liberally disperse semi-colons throughout their writing in an effort to seem more capable writers. “What’s the problem with that?” you ask? Nothing, if done correctly. But my students seemed to sprinkle semi-colons over their papers like literary glitter without regard for whether their sentences required glitter. Have I mentioned that I hate glitter? It’s craft herpes–once you’ve contracted it, you’ll be finding it on you forever.

So my students committed a cardinal sin of writing–using something (whether punctuation, a word, a stylistic device, etc.) you don’t understand in an attempt to come across as more talented than you are. Like most good writing techniques, punctuation is most effective when subtle, when it influences the reader without their perceiving that it is doing so. Like much social subtlety, this is crass when recognized and only acceptable in polite society when carefully concealed. The improper use of punctuation breaks the illusion, making this manipulation painfully and embarrassingly clear. A misused piece of punctuation–a comma splice or an unneeded semi-colon, for instance–thrusts itself into the mind of the reader like an unwanted and socially awkward guest who cannot read the room. It breeds mistrust of the writer and should thus be avoided at all costs.

I think many of us, myself included, are embarrassed to look up rules of punctuation when we don’t know a proper usage. These are things we’re taught in elementary school, so we assume that they are so basic that a person of reasonable intelligence would not forget them. Nothing is farther from the truth. We start learning punctuation and grammar early because these things are difficult and require much practice. Writing is like a muscle, not like riding a bicycle–it atrophies if unused. Because of that, there is nothing wrong with having to refresh your memory about “basic” grammatical concepts. If it’s that big a deal, clear your browser history afterward. But, for the love of God, look up the rule in the first place if there’s any question.

The opposite of the above is, thankfully, also true. The proper use of punctuation is an extremely effective aspect of writing style. To be clear, the word “proper” as used here relies heavily on context. In (most) professional writing, rules of grammar and punctuation should be kept religiously. In fiction writing or circumstances where the perspective and mind of the author are part of the writing itself, the rules should be liberally–but carefully and thoughtfully–broken.

I came across an excellent example of this (and probably the impetus for this post) while starting to read a friend’s young adult novel. The novel (at least as far as I’ve gotten) is told in the first-person point-of-view of a sixteen-year-old young woman. The style of the writing is clipped, using short sentences, sentence fragments and well-placed punctuation to convey the fleeting, sometimes confused and quite excited thoughts of this character as she attends a sort-of debutante party that she knows represents a crucial fork in the road of her life. The character comes to life not just in her words, but in the way that the punctuation groups her thoughts into clusters, abruptly changes subjects and gives us a feel not just for what she thinks but how she thinks. That is great writing–the kind we should all strive for. I’d love to include some examples here, but it’s not my writing to share.

And in that effort, we should bear in mind that there are a number of approaches to punctuation in any writing, but fiction in particular. I would–admittedly making this up as I go–call the above example a heuristic approach to punctuation. But maybe I ought to be less pretentious and call this a “character-based” approach. Alternatives, if you like, might be to call this a “stream-of-consciousness” approach or even a “Joycean” approach. The punctuation defines the character, not the author or the style of the writing itself necessarily.

We might alternatively use a dramatic or theatrical approach. In dramatic works, actors are trained to use the punctuation as keys to the pacing, pauses and breaths in speech. Here, the punctuation serves as a code to help the written word mimic normal speech patterns. I find that most people naturally follow this approach when reading aloud, whether or not the piece is dramatic. So, using this method, good punctuation should be used to assist the flow of the text for the reader and to enhance both comprehension and enjoyment of the text. Does this sometimes overlap with the first-described approach? Probably, but not necessarily. Some people don’t think or speak in ways that are easy for others to understand, and not all points of view in narrative are going to be able to characterize and define those involved in the action described.

A more formal adherence to the “rules-as-written” of punctuation would likely serve the same function as the theatrical approach, though perhaps with a different feel. The ease of communication of content is paramount here, but should not be sacrificed for other cognitive effects that might be created in the mind of the reader through creative and effective punctuation.

I don’t think that it’s necessary, nor probably even helpful, to spend a lot of time trying to categorize your punctuational approach by the groups given above (or any others for that matter). What is important is to be intentional about your punctuation. This takes us back to Professor Brooks Landon’s comment that writing is “brain hacking.” Punctuation is an integral part of how your text creates, divides and sequences images and thoughts in the mind of the reader. Your punctuation should always be calculated to bolster the substance of the text to your desired effect. Is that easy? Hell no. But it’s certainly worth the effort.

 

Review: The Last Jedi

This is my first review of a film instead of a book, but Star Wars merits an exception, doesn’t it?

Disclaimer: I’m a huge Star Wars fan. I don’t own a lightsaber or much in the way of memorabilia; I’ve never been to a Star Wars con; and I don’t spend any time on Star Wars-specific forums or subreddits. But I’m still a huge Star Wars fan.

I grew up on the original films, and my first roleplaying game was the second edition of the old West End Games Star Wars RPG. There’s a special place for Star Wars in my heart, and it’s probably fair to say that, as a young person, it and The Lord of the Rings had the greatest influence on my fascination with fantasy and science fiction. I’m not sure I’ve played all of the Star Wars video games ever produced, but I’m sure I’m close. When Disney “reset” the canon, I began to pick up the books as well, vowing that I’d try to keep up with the universe this time in a way I never did previously.

So, like most of us, I think I went into this film with great expectations. I enjoyed The Force Awakens, but it followed too closely to the formula of A New Hope for my tastes. A few days before my trip to the theater, I heard a glowing review for the film on NPR–this only increased my anticipation.

The Last Jedi is, to date, my favorite Star Wars film. Before seeing it, I probably would have said that Rogue One was my favorite, as (predictably) I loved its grit and its willingness to take some narrative risks that the “main” films mostly shied away from.

The Last Jedi is currently my favorite Star Wars film because it does an excellent job of capturing the wonder of the original films while throwing in modern sensibilities. From the tactical gear worn by stormtroopers to the new variety of settings (like the casino-city of Canto Bight), the visuals of the film expanded on and brought the setting out of the late 70’s and early 80’s (while still sporting that retro style and incorporating the feel of McQuarrie’s art).

More important, the film moved away from pure Campbellian structure and adopted a depth and complexity that made everything feel that much more real. Both Rey and Kylo Ren have a depth to them that was lacking in previous Star Wars films, and Skywalker himself bore a combination of concealed hope, determination and burned-out jadedness that made us (me, at least) simultaneously love and hate him.

It’s quite possible that what’s going on here is that nuance is one of my very favorite things; The Last Jedi brings nuance to Star Wars in spades. One of the greatest things about the Star Wars universe is the ability to explore it–through the films, other media, roleplaying games, etc. The latest installment gives us permission to explore more than just the variety of the aliens and worlds in the setting, but a variety of moral questions and morally ambiguous characters–such as the rogue DJ.

In this, Star Wars has finally come into its adulthood. At forty years old, it’s certainly a late bloomer, but well worth the wait.

Additionally, this film follows some very interesting trends in the setting since its acquisition by Disney. The first of these is, as a friend put it, “the democratization of the Force.” We’ve seen that in the series Star Wars: Rebels, which adds several surviving Jedi other than Luke to the canon, and it’s certainly a driving force (pun intended) for Luke in this film.

For me, this is very well taken. As much as I love the Jedi as the samurai priest-knights of science-fiction bushido-Buddhism, I’ve long been of the opinion that, from the perspective of the common person in the Star Wars universe, they’re more trouble than they’re worth. From that perspective, they tend to be self-righteous, religiously fanatic, prudish and unwelcome intervenors with a tendency to bring at least as much (and possibly more) conflict as peace. Their obsession with balance in the Force makes them seemingly willing to make peace with some injustices, and the Jedi Code (to me, at least) reeks of insupportable philistinism–they are supposed to represent light and good, but are told that they should never love and should avoid attachments. Rather than embracing suffering and attempting to overcome it, they simply attempt to avoid it altogether. If the only thing we have to fear is fear itself, the Jedi Code is–again in my estimation–emblematic of the corrupting power of that meta-fear.

I realize my nerd is showing; but you knew what this was before you started reading.

As Luke says, it is time for the Jedi to die. They ought to be replaced by a new type of Jedi who eschews a rigid and inflexible Code in favor of striving for the greatest good–in favor of following the Light side of the Force with reckless abandon. But keep the lightsabers, because they’re cool. Before the film released, there was much speculation that there’d be movement toward the philosophy of the “Gray” Jedi (look it up). I think The Last Jedi has given us some indication of that.

Not to overly combine my interests in this blog, but the message of this film regarding the Force is quite apropos for the times. It is a call to move away from the uncompromising nature of fundamentalist religion and toward the truer (but more difficult) ambiguity of seeking after good and valuing Creation and relationships. It is a condemnation of the consequences of unquestioning religious fanaticism which, paradoxically, tends to ignore and reject the deeper and more important ideals on which the religion (whichever it may be) is based.

And maybe that’s what I liked so much about this film. Yes, it was a lot of fun. Yes, it was well-written (there are some arguments about this, but I stand by my statement). Yes, the characters were good. Yes, it’s Star Wars. But most important, it’s a deeper Star Wars that allows us to struggle with philosophical, moral and existential ideas rather than giving us a mythopoeic argument for a two-dimensional worldview. It’s Star Wars that is, at its core, theological.

 

The God Who Chooses Us

It’s Advent, and I’ve been thinking about the Incarnation (no surprise there).  I am less concerned with the “how” of the Incarnation and more concerned with the “why.” My faith in the sovereignty of God means that I believe that God could have invented all manner of possible solutions to the problem of sin (not that we humans have intellect sufficient to speculate very effectively about what those infinite possibilities might be).

Nevertheless, there can be no doubt that the Incarnation tells us something about the nature, purpose and personality of God. What I find in exploring these issues is one of the most profound aspects of my faith, one I’d like to share with you.

Let’s begin with the question of God’s “passibility.” This is often defined as “the ability of God to suffer,” but this is not entirely correct. The truer definition of passibility is “the ability of God to be affected by some force or influence external to God’s self.” In plainer terms: can something make God be or feel a certain thing or do a certain thing?

The question is important because it presupposes a problem: If God is “passible” there is something in the universe that is more powerful than God because it can overcome God in some way, challenging God’s sovereignty. On the other hand, if God is “impassible” and cannot be affected by any external thing, can God feel sympathy with us? Does God “feel” anything, since feelings are responses caused (at least sometimes) by external forces?

This is perhaps the most fundamental question of theology–can God be both sovereign and good? If the answer is yes, then the basic nature of existence should be one of hope. If not, despair. All aspects of theology are influenced by the answer to this question. In theodicy, the question of evil only exists in such a troublesome state if God is both good and all-powerful. If not, we have an easy explanation for the existence of evil. The meaning of scripture, of the working out of salvation, of the Incarnation, all of these turn on this answer.

Let me propose that there actually is no problem in the question of passibility, though what the solution tells us is nothing short of amazing in its furtherance of the understanding of our God. We affirm that God is sovereign over all things and cannot be unwillingly affected by something external to God’s self. But if God cannot allow God’s self to be affected by some external factor, then God would not be sovereign, for God could not overcome God’s self. The God who cannot self-determine is not sovereign.

So, it does not follow that the all-powerful God is not good or cannot feel–God has chosen to be good and God has chosen to feel, to be affected by God’s Creation. To be in active and meaningful relationship with all of Creation.

The theologian Thomas Jay Oord has very convincingly argued exactly this–that God is impassible but has affirmatively chosen to suffer with us. For me, though, the realization of this came from G.K. Chesterton’s Orthodoxy:

“Christianity is the only religion on earth that has felt that omnipotence made God incomplete. Christianity alone felt that God, to be wholly God, must have been a rebel as well as a king. Alone of all creeds, Christianity has added courage to the virtues of the Creator…But in the terrific tale of the Passion there is a distinct emotional suggestion that the author of all things (in some unthinkable way) went not only through agony, but through doubt. It is written, ‘Thou shalt not tempt the Lord thy God.’ No; but the Lord thy God may tempt Himself; and it seems as if this was what happened in Gethsemane. In a garden, Satan tempted man; and in a garden God tempted God. He passed in some superhuman manner through our human horror of pessimism. When the world shook and the sun was wiped out of heaven, it was not at the crucifixion, but at the cry from the cross: the cry which confessed that God was forsaken of God….”

The Incarnation and the crucifixion represent God’s choice about so many aspects of existence. Much has been written about its meaning as God’s choice to redeem humanity (I, personally, favor Karl Barth for this investigation and discussion), but I think that far too little has been put to paper (or screen as the case may be) about what the Incarnation says about God’s justice.

In the Incarnation we see God’s choice of (semi-)passibility as one of the few answers to the problem of evil that we humans can actually understand: No matter what suffering God has allowed to befall humanity, no matter why this suffering has been allowed (which we are ultimately incapable of explaining), God is so just that God will not allow God’s creation to suffer anything that God will not suffer with us in the most personal and intimate of ways.

In Christ’s birth, we see God’s choice to be with us, not just physically, but existentially. How amazing it is that the God of all Creation willingly suffers with us and for us. God is all-powerful; God has chosen to be good to infinite extents we cannot possibly imagine.

I invite you to keep this in your heart as we await the Christ child.