The End of Violence, Part II: Jesus and Just War

For the previous post in this series, click here.

Jesus is pretty clear about violence, it seems. We are to “turn the other cheek,” to “do unto others as we would have them do unto us” and to “love our neighbors.”

The “sheepdog” community (those tactically-trained civilians who see it as their duty to protect the unarmed masses from threats—we’ll talk more about this later) often likes to refer to Luke 22:36, where Jesus tells the disciples, “And let the one who has no sword sell his cloak and buy one…” as justification for the carrying of weapons and use of defensive force. This, I think, takes the comment out of context. Just two lines later (Luke 22:38), the disciples bring Jesus two swords and he says “that is enough.” Something non-literal, something symbolic is taking place here.

Some theologians point to the passage as Jesus ensuring that a prophecy is fulfilled, without any real intention that the disciples take up arms. Indeed, Jesus has up to this point defied the expectations of those awaiting the Messiah and avoided leading an armed rebellion against Roman overlords.

The authors of the New Bible Commentary: Revised Third Edition go even further, comparing the statement with previous times Jesus had sent the disciples out with nothing—especially not swords—and they had been provided for without fail. The statement “That is enough” is Jesus ending a conversation the disciples have failed to understand rather than commenting on the number of swords he has been brought.

This jibes well with the events in the garden of Gethsemane, when Jesus rebukes Peter for cutting the ear off of one of the men who comes to arrest Jesus, healing the man and warning that “they who take the sword shall perish by the sword”—violence begets violence (Matthew 26:52).

The entire thrust of Jesus’ ministry makes clear that Jesus would have us love, and that his conception of love does not brook violence, right?

I think that we can definitely say that Jesus tells us (and experience bears this out) that violence is never a good solution to a problem. But does that mean that the Christian should never use violence as a last resort?

An example of the other side of the coin can be found in Heinlein’s Starship Troopers, where the main character’s civics teacher, himself a military veteran, retorts to a student who complains that “violence never solves anything”: “Tell that to the Carthaginians!” (In the Punic Wars, the Romans so thoroughly devastated Carthage that it could never again rival the Republic as a competing power.)

It is true; violence does solve problems. A person you’ve beaten into submission or killed is no longer someone you have to argue with (at least not directly). But that doesn’t mean that violence is ever a good solution. Still, the exchange in Starship Troopers does, I think, lay bare the purpose of violence—to end a conflict that cannot be ended through peace, agreement and reason. Let’s explore whether that end can ever be legitimate in light of Jesus’ example.

I’d like to talk about Just War Theory or Doctrine. Just War Theory (within Christianity) has two primary concerns: when it is just to go to war (or to use violence) and how war must be ethically conducted (how violence may permissibly be used). In essence, this is the same inquiry I’m making in these posts, but I’d like to point out some places where I disagree with the commonly propounded aspects of the doctrine.

Both Augustine and Aquinas believed that war could only be justified by a proper governmental institution. To them, this was a safeguard ensuring that the aim of a war complied with the greater good of the people. Unfortunately, I think there is a greater tendency for violence authorized by the state to be unjust than to be just. In many cases, the desire for the highest degree of national security and the desire to act ethically are diametrically opposed. To make the state the arbiter of when war is just implies that the identity of the actor contributes to the righteousness of an act at least as much as the intention behind it does.

I also disagree with doctrines of Just War that assert that the punishment of a guilty party is sufficient cause for war (see below) or that a preventative war might be permissible as just. One of the conclusions that I’ll ultimately arrive at is that violence must be used only to prevent an immediate threat. A preemptive strike forecloses the possibility of attempts at peaceful resolution.

Regarding the conduct of war, experience shows that no war is just in its conduct. Even in the best of circumstances and the noblest of intents, there is death, suffering, exploitation, humiliation, fear and a whole host of other undesirable ripple effects. These are things that may be necessary, but should never be called just. Can war be conducted ethically? Yes, particularly on the individual level. In the greater scheme of things, I’m not so sure.

On the subject of justice, my reading of history seems to indicate that most wars simply set the stage for the next war. Consider the American Revolution and the War of 1812, or the First World War and the Second, the Treaty of Versailles having placed Germany in a position that allowed the rise of someone like Hitler. Shouldn’t a just war result in lasting peace? I’m not sure that there’s ever been such a thing.

That said, I don’t want to say that wars are never necessary or that they never accomplish some good. Certainly, Hitler and the Third Reich needed to be stopped because greater suffering would have resulted from their victory than from fighting them, steep as the cost was. Even less do I want to say that soldiers are evil, or even necessarily wrong, in the professional practice of violence. I hope that this will become clear as these posts continue.

So, point and counterpoint—Jesus tells us to avoid violence, but World War II gives us a seeming example of when violence proved necessary. And here’s the crux of this whole issue: we Christians want (or at least ought to want) to love as fully and deeply as Jesus did and to avoid violence, but sometimes violence seems like the best of alternatives. How do we resolve that discrepancy?

For the next post in this series, click here.

The End of Violence, Part I: Introduction

This Saturday, I’m going to a combined defensive pistol and defensive carbine class. It’s not my first tactical shooting class, and I’ve in the past been an NRA Pistol Instructor and a Texas Concealed Handgun Instructor. Regardless, the occasion seems a good one on which to share some of my thoughts about violence given my Christian faith.

This is not an easy subject and, while I’ve spent a fair amount of time studying various martial arts—krav maga, karate, sport fencing, historical European martial arts (swordplay, knife/dagger fighting and wrestling, mostly) and shooting—I can’t say that I’ve ever been in a real fight. As such, I simply don’t have access to the experience of either the event itself or the psychological aftermath. I invite those with such experience to comment on this series; I’m going to attempt to restrict myself to the abstract and philosophical side of things.

As a person of staunchly moderate political leanings and progressive theological positions, I’ve had the rare opportunity to be considered both conservative and liberal. Coming from one of the most diverse counties in the U.S. and being a theatre person with friends holding a diversity of sexual orientations and gender identities, moving to College Station put me solidly in the liberal minority, at least among the student body (I nevertheless had no problem finding like-minded people—Texas A&M is a big school, after all). Then, going to Austin for graduate school, I suddenly found myself considered a conservative by my peers.

When many of my fellow students of medieval and renaissance literature discovered that I had a license to carry a concealed handgun, they suddenly had this idea that I fancied myself a vigilante; that I wanted to live in the Old West and have a shootout at high noon; that I had naively decided that combat would be fun (or evilly decided that hurting other people could be enjoyable). When I explained myself, however, I often found them surprised.

I told them that I preferred to carry a firearm that I had trained seriously with—legally, and not all of the time (campus carry was illegal back then, of course)—because that way I knew that I would walk away from a violent confrontation (or at least had a good chance of doing so) and could therefore try every non-violent dispute-resolution technique I could think of rather than responding out of fear. Indeed, as a Resident Advisor at Texas A&M I had been trained in conflict de-escalation, and Texas requires similar training as part of the concealed handgun license coursework. I am convinced that there is no more valuable skill a person may learn—whosoever they may be—than how to communicate peaceably, respectfully, empathetically and constructively with others, even if that results only in an agreement to disagree. In the broader scheme, more training in how to relate and talk to people with competing interests would save more lives than all the firearms training in the world.

That was certainly my experience the only time I ever came even remotely close to drawing my weapon when carrying it. This was, conveniently, in Austin. K and I were living in an apartment on the southwest side in a suburban area well away from campus. Nevertheless, at about 2:30 in the morning one night, some undergrads in the next building over were blaring music, drinking heavily, and throwing beer bottles into the parking lot from their third floor balcony.

Admittedly, I am a very grumpy person when disturbed from my slumber. I got up, put some clothes on, and holstered my pistol in a concealed holster, just in case. The first move was mine, and I immediately made a mistake: upon getting close to their building, I yelled up at them to turn the music down, using no expletives but not in the friendliest of voices. Immediately, three men, all very inebriated, ran down the stairs to confront me. I stood my ground but tried to backtrack, apologizing for yelling and explaining that I had wanted to come ask them to turn the music down and stop throwing beer bottles rather than just calling the police.

They responded with threats. I kept my hands up and palms toward them in a non-threatening manner (also because it happened to be a good defensive position, just in case), but I also made clear that I was not intimidated. I repeated my request matter-of-factly, despite their threats at my mention of the police (they were happy to remind me that they outnumbered me, despite the fact that they were all practically falling over on their own and any collaboration between them was certainly out of the question). In the end, it became clear, perhaps as it should have been from the beginning, that I could not reason with them. I cautiously removed myself from the situation, returned to my apartment and called the police. The next day, I reported the confrontation (although not my possession of a firearm, which was immaterial as it was never produced) to the apartment management. The offending tenants were evicted for threatening fellow residents—a clear violation of the Texas Apartment Association form lease.

I’d like to think that, despite my rough start, the confrontation went about as well as I could have hoped—I walked away unscathed and without the regret and what-ifs that would have attached had I injured someone else, justifiably or not.

But the point of this post is not to talk about me (although I hope the long introduction has provided some background to my own biases and experience). Let’s talk about weapons, violence and Christianity. We’ll start in the next post.

Skepticism in Faith, Part II: Logical Skepticism

For the first post in this series, click here.

We talked about a general skepticism of what we can know and how we know what we know in the last post. From this point on, we’re going to take the position that, despite our inability to be absolutely certain about our knowledge, we humans are capable of gaining “functional” knowledge of at least some things—that is, knowledge that approximates capital “T” Truth closely enough that we can reasonably rely on it.

Under that position, the next point of skepticism I’d like to discuss is a healthy skepticism about the ways in which we achieve knowledge and about claims made regarding the limits (or lack thereof) of certain paths to knowledge.

Let’s talk about science. I must first admit that science does an excellent job of telling us how the world works. However, I would argue that we must maintain skepticism about the extent of science’s ability to tell us about existence, particularly when it comes to the spiritual or metaphysical.

Reputable science requires implementation of the scientific method.[1] Under the scientific method, the researcher/investigator must be able to make testable predictions about the object or process under study—a falsifiable hypothesis that may potentially be disproven through experimentation. If the predictions cannot be evaluated in a way that actually tests them, the scientific method cannot be applied.

In a way, the scientific method embodies a form of epistemological skepticism. Despite talk about the “laws” of physics and such, science doesn’t actually prove things in the way we laypersons tend to think of proof. Instead, science steadily disproves alternative explanations until we reach explanations that creep ever closer to reality without ever reaching absolute certainty (although close enough to treat them as such—by this point, Newton’s laws are as much a certainty as is possible).

Science, and particularly theoretical physics (which I greatly enjoy learning about so long as you don’t ask me to do any calculations), does often start with a theory based on observation and testing for refinement, but the testing of theories still involves attempts to disprove them to see whether they survive such analysis.

Here’s the issue where skepticism of the scientific method (as a general example of what I’m calling “logical” skepticism) comes in: some purport that science “proves” things that cannot be falsified by experimentation. Here’s a short list of examples:

(1) The existence of God. There’s not a scientifically testable hypothesis here. Yes, you can have a hypothesis, but it’s only as good as something like “I speculate that the color blue looks the same to me as it does to other people.”

(2) The materialist worldview. Again, this is a hypocritical application of science to try to “disprove” the existence of a spiritual reality; science isn’t equipped to answer those questions, and those who use materialism to assert the absence of a spiritual reality have created an atheistic religion around science; a certain threshold of intellectual honesty has been crossed. To me, just the fact that there are very intelligent scientists who say “science made me a believer” and also very intelligent scientists who say “science made me an atheist” reveals the failing of science to definitively answer such questions.

(3) Near-death or mystical religious experiences. The problem here is in the name; it’s an experience, and thus not fully communicable between individuals. That said, the thrust of materialist science has been to “prove” that such experiences are actually the result of chemicals affecting the brain (ketamine, for one) or electromagnetic effects on the same (the famous “God Helmet” experiment). Scientifically, those types of experiments are flawed in that they can demonstrate correlation but not causation (which takes us back to Mr. Hume, interestingly)—they can say, “we notice high levels of ketamine in the brains of people who later claim near-death experiences,” but they can’t logically claim that ketamine was therefore the cause. It could be that a near-death experience causes a release of ketamine in the brain; we just can’t know. Further, many experiments of this nature have been shown to be irreproducible—and reproducibility is a key requirement of the scientific method. A group of northern European scientists attempting to recreate the “God Helmet” study concluded that the results came from bad scientific method and the power of suggestion upon test subjects, not electromagnetic fields.

(4) Qualia. The “thingness” of subjective conscious experience. Both philosophy and science have thus far proved of little help in the analysis of experience. This is a natural consequence of the existential fact that we do not have the ability to share our own experiences with others and are therefore inhibited by the constraints of language from making deep comparisons of subjective experience between individuals.

Perhaps advances in science and scientific understanding will help us to answer some of the questions above with experiments I simply cannot conceive of with the knowledge available. However, I choose to believe that there is a damn good reason the most important questions are not readily answerable, and I think that that reason points to God’s purposefulness. I digress; we can discuss that another time.

It has become popular among certain scientists, like Steven Pinker, to create new fields of science starting from preconceived suppositions about the way the world works and then using the new field to support those suppositions—“evolutionary psychology” is, I think, the foremost offender here. If you’re not familiar, evolutionary psychology seeks to explain modern human psychology as the product of evolutionary pressures, much as evolutionary biology explains human physiology. Now, admittedly, the theories of evolutionary psychologists could be absolutely true (though I strongly doubt it). The problem is that they sell the field as science. We don’t know enough about the psychology of ancient Homo sapiens and its predecessors to do anything but speculate about the origin of our own psychologies, much less to create a falsifiable hypothesis that can be tested—the conditions in which to test such theories have long expired. Interesting ideas, to be sure, but it remains disingenuous to call them science.

It is only fair, as someone who believes in both science and faith and sees the relative boundaries for their application to certain questions, to apply the same sort of logical skepticism to faith.

At the end of the day, faith is the belief in certain answers to questions we cannot otherwise answer. That does not mean that we should look only to faith and tradition to answer every question about the world around us.

In the first of two points I want to make about logical skepticism in faith (with, of course, particular reference to Christianity), let’s talk about the Book of Genesis.

There is a trend among evangelical Christians, particularly in America, to believe in the literal truth of the Bible. Having read some of my other posts, you should know that I do not subscribe to, and passionately resist, such a belief as a necessary (or even beneficial) aspect of Christian faith.

Genesis gives us a creation story that, if read for allegory and metaphor, actually doesn’t clash much with what science tells us about the Big Bang, evolution and other well-supported theories about the physical origins of matter and life. Adam Hamilton has written some good work going through the ways in which faith and science coincide in Genesis; I believe that this is in his Making Sense of the Bible but I’m not sure as I write this post.

And yet, many Christians want to read Genesis as a literal explanation of Creation. Here’s where logical skepticism comes in:

First, let’s apply some logical skepticism to Biblical literalism in general. The doctrine asks us to believe that every book in the Bible was written directly by God through some form of automatic writing by the humans who penned them. I would not say that God could not do this (that would be foolish), but experience indicates that this doesn’t seem to be God’s usual modus operandi. Of course, using strict logic, this is not a question that can be definitively answered.

So, let’s consider some additional thoughts. When Jesus speaks, he usually tells stories and uses metaphor (see my earlier posts on Ambiguity in Scripture for an examination of how this makes his words more powerful and effective); rarely does he speak in a straightforward and plain manner—when he does, it is almost certainly a command to love.

If we want to resort to hard literary criticism, we can note definite stylistic differences among the books of the Bible, sometimes competing purposes or concerns (each of the Gospels recounts many of the same events but with different perspectives, motivations and goals) and even different underlying ideologies (like Platonism in Paul’s epistles). While God is certainly capable of using different approaches and different purposes between books, multiple authorial voices may be a better explanation.

Historically, we can point to the different periods of writing of the books of the Bible—Paul didn’t have access to the Gospels, for instance—and the long history of the compilation of the books that form what is accepted as the canonical Bible, with the selection of certain texts over others, concerns about forgeries, dubious authorship and poor copies all along the way. We didn’t have the Bible as we commonly think of it until rather late in the 4th century.

One that needs little explanation: If we take Jesus’s statement that we ought to cut off body parts that cause us to sin literally, we ought to have a lot more amputees.

Again, none of this disproves the position of Biblical literalism and infallibility, but the evidence taken together makes it highly unlikely that such a position can stand under its own weight.

More important, because it applies not only to the question of Biblical literalism but to theology in general, is that any theological system must maintain internal consistency; it should not contradict itself. When we take literally both the Old Testament events in which we are told that God endorses wholesale slaughter and Jesus’s command to love our neighbors as ourselves, we have problems of logical consistency.

I have heard many seemingly-commonly-held theological positions within Christianity that openly court such contradiction. Take “God cannot stand to be in the presence of sin”, for instance, a statement that is sometimes used to explain the need for Jesus’s redemption. The very statement is self-contradictory, because Jesus spends most of his time (all of it really) in the presence of sinners. If Jesus is wholly human and wholly God, the statement cannot stand. That it begins with “God cannot…” should be our first clue. We can’t rightly talk about “God could not”, though we might talk about “God does not” (or, correctly, “God does not seem to”).

To combine our skepticisms of both science and theology, when there is dispute between science and scripture, we ought to rely on the science to tell us how the world works and our faith to explain to us how existence works. I believe in evolution as the likeliest explanation for how humans became humans, but that doesn’t tell us why there are humans, or why, in a cosmic sense, there is life at all. I believe that we should incorporate new scientific understandings into our understanding of God—if God created the world in a certain way, why might God have done that? The synthesis of science and faith can do much more for us than vainly attempting to pit the two against one another.

But this brings me to the ultimate point: logical skepticism gives us some intellectual honesty. The tendency to question whether certain evidences prove something (much less how they prove it) protects us from logical fallacies.

__________________________________________________________

[1] There are some competing theories of scientific method, such as the “anything goes” approach espoused by Paul Feyerabend, but these, I think, are sufficiently held to be “out in left field” by the scientific community at large to be largely discounted.

First Day of School

Maybe “school” isn’t the right word for it; at five months (Abe) and almost two-and-a-half (Bess), there’s not going to be a lot of hardcore academics. There will, I’m sure, be the learning of things just as important—how to make friends, how to deal with the unexpected, how to adapt to unfamiliar places (already a competency for them).

Yesterday afternoon we got the call that the children had (finally) been approved to attend the Montessori School where we’d wanted to put them. Some background:

One of my partners at the law firm has her son there in the nursery; her daughter just graduated into kindergarten from the school. On top of that, this partner’s husband is a Montessori-certified teacher himself, so if it gets the stamp of approval from him, that speaks volumes. We toured the facility some months back before taking our placement and were well-satisfied.

Here’s the rub: private school is expensive. The school was solidly out of the price range for a church worker and a young attorney with a start-up law firm. But foster children are sometimes eligible to have pre-public-school education paid for, and the Montessori School just happened to be one of the two places approved by CPS for such funding.

When we first got our placement, the original CPS worker had told us that she’d filed the NCI (the funding program) paperwork for us, but that it could be 30 to 45 days before we’d get approved. “No problem,” we said, and set about using vacation time to each work half days in the office and half days at home.

This week K was bound for Dallas to attend internship orientation for her seminary program. We knew this in advance and hoped that the NCI would clear before then.

Thus, it came as a shock when we found out mid-week last week that the NCI paperwork had in fact not been submitted. Fortunately, we now have a good team behind us–our DePelchin clinician has been excellent all the way through and we now have a solid long-term CPS worker who knows the ropes.

Our CPS worker faxed K the paperwork we needed to fill out the same day it was discovered that the first worker had not submitted anything, and we were assured that things would be expedited as much as possible.

That left me taking off three days of work this week to manage the kids. I had great help from our parents (with whom it was nice to get to spend the time), but it was still exhausting. So, when we got word yesterday that they could start today, we were both relieved. I’m finally back to the office full-time, where spare moments can be devoted to writing instead of chasing little ones. At the same time, it does feel strange to spend so much time apart from them today.

I’m excited to find out how the first day went (and excited to have another full day in the office tomorrow)!

Some Clarity

A few weeks ago, K and I met with the ad litem in the kids’ case (the attorney appointed by the court to represent the best interests of the children). He’s a good guy and provided us with a lot more clarity about the situation than CPS has.

Unfortunately, the news was not the news we wanted to hear. Not only does the ad litem believe the children will be going back to family, but he indicated that they would likely go back well before the twelve months of the permanency plan are complete.

We’re likely to have Abe and Bess for a few more months, but it is very unlikely that the two will be our “forever family,” as they say. The upside is that the ad litem believes there will be a safe place with family for the kids to return to: the situation was described to us as “a good family with a wayward daughter” (the mother of the children). That being the case, it probably is in the best interest of the children to return to family members who can love and care for them. But that will not make it easy to let go.

I’m not sure if knowing this far in advance is a good thing, either. Yes, it gives us time to prepare for the day when we will have to send the kids away; if worked through properly, that could prove very helpful. Conversely, if we don’t work through the impending loss in a positive way, it could be quite the opposite. Most of all, K and I must be careful not to guard our hearts too much—we need to give these kids all the love we can in the time that we have with them. And, nothing is done until it’s done. Despite the high likelihood that the kids will go back, nothing is a sure thing yet.

This puts K and me in the awkward position of needing to decide what our plan will be in the likely event that the kids go back to family. We’ve started to discuss, but a plan is still in the works. We’ve decided it will be best to take some time off before accepting a new placement to make sure we’ve properly worked through our emotions. How much time has not been decided. With our available time away from work largely exhausted for the rest of the year, our next placement would need to be school-age children if we accept a placement sooner rather than later. If we want to try again with small kids, we’ll likely need to wait until 2017. No decision has been made about this.

In the meantime, we’re going to focus on getting and giving all the joy we can, continuing to strengthen our relationships with Abe and Bess and providing whatever we can to brighten their futures, whatever that future may be.

Skepticism in Faith, Part I: Epistemological Skepticism

I’ve said before, and will likely say many times again, that a skeptical approach is essential to true faith, as it causes us to test ourselves and our beliefs. In this series of indeterminate length, I want to look at a few types of skepticism and why they are helpful to us. We start with epistemological skepticism.

If you haven’t studied formal philosophy (and why would you; you want to have a job, right?), epistemological skepticism is a pretentious way to say doubt (skepticism) about human knowledge (the study of which is called epistemology). I am of the opinion that it is highly unlikely (perhaps impossible given the limits of our understanding) that humans have perfect knowledge of any aspect of reality.

Let me borrow an example from eighteenth-century philosopher David Hume. Let’s say you have a billiards table and balls on the table. When you shoot pool, you rely on the expectation that the angle and speed at which the cue ball hits your target ball will determine the direction and speed of that target; this is simply vector physics.
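To make that expectation concrete before we question it, here is a minimal sketch of the vector arithmetic in Python, under the simplifying assumptions of equal-mass balls, a frictionless table and no spin (the function and its idealization are mine, not Hume's):

```python
import math

def collide(cue_v, cue_pos, target_pos):
    """Idealized elastic collision of two equal-mass balls: the target
    ball takes the component of the cue ball's velocity along the line
    of centers; the cue ball keeps the perpendicular remainder."""
    # Unit vector from the cue ball to the target ball at impact
    dx, dy = target_pos[0] - cue_pos[0], target_pos[1] - cue_pos[1]
    dist = math.hypot(dx, dy)
    nx, ny = dx / dist, dy / dist
    # Project the cue ball's velocity onto the line of centers
    along = cue_v[0] * nx + cue_v[1] * ny
    target_v = (along * nx, along * ny)
    cue_v_after = (cue_v[0] - target_v[0], cue_v[1] - target_v[1])
    return cue_v_after, target_v

# A dead-on hit transfers all of the motion to the target ball...
print(collide((1.0, 0.0), (0.0, 0.0), (2.0, 0.0)))
# ...while a glancing hit splits it between the two.
print(collide((1.0, 0.0), (0.0, 0.0), (2.0, 0.7)))
```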

But think about that exchange for a minute. Why do you believe that hitting the target ball with the cue ball—or hitting the cue ball with the pool cue for that matter—will result in the struck object moving? Because every time you have done it before, that’s how it’s worked. In fact, every time you’ve applied force to any object in the physical world, it has reacted in relation to the intensity and direction of that force.

Now ask yourself whether that experience proves the relevant laws of physics. If your answer is “yes,” you’re unfortunately wrong. What you have is a one-hundred-percent correlation between the cue ball striking the target ball and the target ball moving in a specific way. Correlation is not causation. You cannot prove that the balls won’t do something different the next time they are struck, or that it isn’t sheer coincidence that they have moved the way they have.

Now, on the one hand, this is an argument ad absurdum.[1] You “know” that that’s how physics works; you’ve relied on that your whole life, and regardless of what I say here, you’re going to continue to rely on it. You should; it would be impossible to live a reasonable life without relying on that expectation.

On the other hand, it does pose some important questions: how do we know what we know? Do we know what we think we know? In short, the causalities that we rely on are really high levels of correlation that strongly imply but do not prove causality. This is just one example, and epistemological skepticism as a whole is doubt about our ability to accurately understand reality for what it actually is.

Epistemological skepticism keeps us humble—it reminds us that we may only have good approximations of answers and not answers themselves. Such a thought requires us always to revisit our ideas to determine if they may be improved, if we may edge just a little closer to actual reality, understanding that we remain ever within a cloud of uncertainty around the actual point of truth.

If we humans through our own efforts can never know exact truth, do we have any access to capital “T” Truth? God’s omniscience comprehends all things as they actually are, and God can reveal that Truth to us according to divine will. Hence the scriptures that tell us God’s understanding surpasses human understanding as the stars are far above the earth and that God’s wisdom makes fools of the (human) wise. We have, in modern society, lost much of the mystical and intuitive practice of the Christian faith.

On the flipside of this, skepticism about the quality of our knowledge also helps us discern what might be a revelation from God and what might be us fooling ourselves, or engaging in wish fulfillment, or trying to cover our own desires with God’s permissions.

More important, this kind of skepticism makes manifest the importance of where we put our faith and belief. If there’s little or nothing that we can be absolutely sure of, what statements of truth do you believe so fervently as to call them Truth and live as if they are absolutely true?

Such questions, I hope, make it clear why the Bible warns us not to judge others—can we really be so sure that our judgments are right? If this skepticism leads us to try to live in peace with one another despite our differences, it is priceless.

For the next post in this series, click here.

 


[1] Admittedly, there is a circular logic to the strictest of epistemological skepticisms—if we can’t know anything, how can we know that epistemological skepticism is a valid position? Like most philosophical statements, there is a rabbit-hole to be leapt down here into a wonderland of nuance and complexity. I’ll leave it to you to investigate further if you are so inclined.

Fortune and Glory

I am concerned about the way we talk about God’s glory in the modern church. Not because there’s something wrong with wanting to pursue God’s glory, but because I think the focus we have on God’s glory skews our theology in problematic ways.

I began preparing for this post by studying the Hebrew and Greek words in the Bible translated into English as “glory”. I thought to go through each of them, but they are similar enough in meaning as to be amenable to summary. The Hebrew words (Strong’s H155, H1926, H1935, H1984, H3367, H3519, H6286, H8597) translate to “glory, splendor, dignity” in most senses, but occasionally “reverence.” There is a strong intimation in the Hebrew (at least for H3519, the most commonly-used word) of importance and weight, as when we say that something has “gravity” in English. The Greek words (Strong’s G1391, G2744) include “a high opinion” and “splendor or brightness, as of the stars,” in addition to the specific meanings “the majesty belonging to God (or Jesus)” and “an exalted state or glorious condition to which Jesus was raised after the crucifixion and to which true Christians shall enter after the return of the savior from heaven.”

In much of the Bible, when the “glory of God” is mentioned, the intended understanding is that “glory” is an attribute of God, something that is revealed to humanity in the presence of God. I would venture to speculate that “glory” is our crude way of describing the existence-altering experience of a confrontation with the all-powerful and loving uncreated creator of all things. In other places, we are told to “give glory to God.” When the words are used in this fashion, the intent, I think, is to give reverence and deference to God, not to attempt to add to the majesty of God.

I want to dwell on that last idea for a moment, because I think that’s what’s held in mind in the modern usage of doing something “for the glory of God.” God is. When God tells us that God’s name is “I am,” we need to read the full mystery into that precise but expansive statement. God is complete in and of God’s self. Part of the theological definition of God (as omnipotent and sovereign) is that God does not need anything and is self-sufficient. By that understanding, God’s glory is something that simply is, that cannot be added to by humans, because if it could, it would no longer be complete within itself. So, to be clear, our actions do not give God glory in the sense that we add to God’s glory. And so, we must be very careful when we say that we are doing something “for” or “to” the glory of God.

The word “glory” functions in the Gospels in much the same way; when God’s glory is spoken of, the word “glory” seems to signify God’s awesome (in the classical sense) and transformative presence. On the other hand, when the words used are to “give glory” to God, the meaning is to praise. A very notable exception that seals this interpretation for me appears in John 17:24, when Jesus asks that the believers see the Glory which God has given to Jesus. This exception proves the rule because the meaning of the given glory is Jesus’s exaltation and divinity, not praise or fame or reputation. The use of the same word (in Hebrew, Greek and English) for two very different ideas is confusing.

Looking at Romans, Paul seems to have the same understanding of the usage of the word “glory,” as when he says that men “…exchanged the incorruptible glory of God for an image in the form of corruptible man…” Romans 1:23. Likewise, in Romans 4:20, Paul uses the phrase “giving glory” in the sense of praise.

In Romans 2:9-10, he states that “There will be tribulation and distress for every soul of man who does evil, of the Jew first, and also of the Greek, but glory and honor and peace to everyone who does good, to the Jew first and also to the Greek.” I believe that what Paul has in mind here is a promise of glorification in the same way that God glorified Jesus. But the inclusion of the words “honor” and “peace” makes us think of glory in the context of fame and reputation—the human understanding of the word. And therein lies the real problem.

In the scriptures, as a descriptor of God, “glory” is ontological: it is an aspect of God’s being. In human uses, “glory” is teleological: it is based upon achievement and reputation. Thus it is that Indiana Jones speaks of “fortune and glory,” the rewards of the treasure hunter—er, archeologist.

The first entry under “glory” on www.dictionary.com says: “very great praise, honor or distinction bestowed by common consent; renown.” Only farther down the list do the Biblical definitions occur.

The linguistic mistake, then, comes with the assumption that all glory comes from the opinion of others. Were that the case, we could add to God’s glory by changing the opinion of others. But, as I said above, God’s glory simply is. The pursuit of God’s glory is a pursuit of God’s presence and being, not cheerleading, or marketing or (as is the sexy term these days) “branding.”

In a sense then, it is entirely appropriate to do something for the glory of God—if the meaning is that one is moved by the experience of relationship with God to do something. But when I hear the phrase used, it seems that the usage of “for” means “for the benefit of.” And in this sense, the phrase “doing something for (or to) the glory of God” is not for God, it is for self.

Such a statement must of course be defended. Let me use an example—sports teams. When fans talk about a sports team they favor, they usually don’t say, “the Patriots won;” they say “we won,” or “my team won.” Psychologists and sociologists attribute this to a pleasure derived from associating oneself with success. Sports on some subconscious and abstract level allow us to appropriate the human glory of others and to claim it personally. This thought is supported by the prevalence of fan superstitions: lucky underwear, ritual action, or even whether one must be watching (or attending) a game in order to assist the team’s chances of success. These superstitions allow us to rationalize our appropriation of the glory of the team; we can tell ourselves that we personally (in some supernatural way, perhaps) contributed to the team’s victory.

Let’s take that back to God. If we believe that God’s glory is in the opinion of others, then by raising God’s reputation we are raising our own reputation as God’s children. There are two fallacies here: that God’s glory becomes our glory by anything other than grace and that God’s glory is dependent on something outside of God.

I’ve been working on this post for a few days now, mulling it about in my head (it still seems clear as mud). Last night I attended a non-study study group at my church led by a young pastor I greatly admire. The subject for that night and several weeks to follow was “Christian Words”: those words we use so commonly as Christians but whose meanings we often fail to think about, leading to shallow or misguided theology. Use of the word “glory” fits squarely on this list, I think.

So perhaps we are misusing words when we talk about God’s glory. That could perhaps be a minor thing except for the emphasis Christians (particularly American evangelical Christians) place on God’s glory. If we’re going to emphasize God’s glory, we’d better make damn sure we use the words right.

What I see is a belief that, perhaps second only to getting into heaven, our focus should mainly be upon God’s glory—but glory understood under the human definition, as reputation. This idea is so pervasive that I have spoken with many Christians who, some avowedly, believe that the purpose of humanity’s creation was “to give glory to God.”

This is not attractive to the unchurched. In one sense, this can be construed as postmodern—God is only as powerful as we all agree God is. Hmm. Worse, we get the image of a narcissistic God who cares only about being praised. Thankfully, neither of these ideas is theologically sound.

We need to be clear to ourselves and others about the place that God’s glory has in our theology. God does not need our praise and we cannot add to God’s glory. Therefore, God’s own glory is not God’s purpose in creation, nor some demanded obeisance from us.

Of course, it is just and right and proper for us to “give glory” (in the Biblical sense of acknowledgement and praise) to God—God has given us much to be grateful for. More important, I think, is that one who has a personal experience of God cannot but be in joyful awe.

We ought, then, to focus on helping others to experience God’s glory; that is, to have a personal experience of the transformative glory of God. It is in relationship with Jesus that God’s glory is experienced—and once experienced, one’s opinion is forever changed. That relationship, I think, is God’s purpose in creating us and should be our purpose in making disciples of others.

Writing “Race” in Fantasy

Every aspiring fantasy author or worldbuilder must eventually answer the question of what kind of sentient beings will populate his or her world. Finding myself at that juncture (again), I thought I’d write my way through the problem(s). I’ve agonized over and over again in designing Avar Narn about what “races” (they should really be called “species,” I think) would occupy that world. I’ve made changes and undone them, remade them and tweaked them over and over and (I hope) I’m ready to finally make the decision once and for all. We’ll see at the end of this post.

So what are the problems about “race” in fantasy works?

(1) Ideas of race have meaning and are problematic. Since you’re on the internet to read this, I’m going to assume that you’re aware of how big a deal race currently is in our world (and in the U.S. in particular). How we discuss and think about race is important, and it’s quite easy to make a misstep.

From one perspective, having various races (I’m just gonna say species from now on) in your fantasy world can do a lot for you.

First, the genre is called “fantasy”—readers want to see the fantastic. It’s part of the fun. Second, you have an instant source of potential conflict (and therefore plot) when you have groups of people (in this case fantastic species) who are unlike one another.

If we want to be highbrow, the encounters between different species allow us to look at “otherness” (to borrow the academic term) in a lot of interesting ways—we can analyze and critique how we (by our culture, our ideologies or our very humanity) define and react to the Other. We can, if we want to be heavy-handed, even talk about specific race-relations issues in the real world through the metaphor of created fantastic species. To be honest, I’m not sure how you could portray enslavement in a written work and not have an American reader think about the historical slavery of blacks, and the line between what is said about slavery in general and what is a specific commentary on the experience of a particular people is blurry at best.

At the same time, when we create a fantasy species, we have to bring them to life and individuate them. It’s no use saying “these people are like humans, but they’re blue and have an extra eye.” If our differences are only cosmetic, readers will be understandably disappointed in the lost opportunity. But defining peoples and cultures is difficult, and it’s tempting to resort to shorthand: “These guys are like Tolkien’s orcs, but they’re more intelligent and have a culture like ancient Egypt.” Time constraints and a desire to give the reader quick access to understanding of a story push us in this direction. But there’s a trap here—this sort of cribbing can easily drive us to base our fantastic species off of racial stereotypes.

Even Tolkien was guilty of this. He later acknowledged without reservation that the dwarves in his stories had a lot in common with European Jews. Re-read the stories (or re-watch the movies) and think about that—Tolkien’s dwarves have big noses, are geographically displaced, are often greedy and selfish. If the dwarves weren’t such beloved characters, we’d really see some elements of anti-Semitism here. I’m not saying that Tolkien was anti-Semitic; I have no idea about the answer there. But if it’s possible to say that his portrayal of the dwarves perpetuated negative Jewish stereotypes (mostly medieval ones that somehow persist in this case), something negative has been accomplished through writing, and that’s to be avoided.

(2) Clichés. Look at some of the most popular works of current fantasy fiction (A Song of Ice and Fire and The Name of the Wind both come to mind) and you’ll see settings in which you will not find elves and dwarves and Hobbits. There are several reasons for this.

If you want to have fantastic species in a setting or story, ask yourself, really: “why?” Can you tell the same type of story (or even the exact same story) with humans instead of different species? In most cases, the answer is “yes.” If there’s an Occam’s Razor of writing, maybe this is where it best fits—don’t put things in the story you don’t need. That advice sounds really good, but that doesn’t mean I can bring myself to follow it, necessarily. Sometimes there are things I want a story to have.

The more important reason, I think, that there’s a current move away from fantastic species in modern fantasy, or at least the “standard” species (elves, dwarves, halflings, etc.), is that the portrayals of these species have become hackneyed. We’ve had the same pointy-eared elves and pseudo-Norse dwarves for seventy years and, after a while, that starts to lose its fantastic luster.

This is partially a result of Tolkien’s looming presence over the genre—if you’re not doing it like him you’re not doing it right—but it’s also a result of the influence of Dungeons and Dragons. Multiple generations of fans of fantasy have grown up with the roleplaying game’s definition of elves and dwarves (influenced, of course, by Tolkien) setting the standard. We writers now must worry that, if we change the stereotype, readers will say, “that’s not what orcs are like!” while established writers (and many readers) also say, “if you’re using the same old stereotypes, you’re not writing something worthwhile.” I don’t think that the latter statement is necessarily true, but the risk of writing overly-derivative works certainly increases with the use of the “stock” fantasy species. As an aside on that note, perhaps we could argue that the “stock” species should be thought of in the same line as Commedia dell’Arte: as stereotypes that allow us to quickly pull in the reader and get on with the story. After all, avoiding an infodump is usually a good thing.

To be clear, there are modern writers doing wonderful things with (at least mostly) traditional stereotypes. The books of The Witcher world contain elves and dwarves but manage to depict them in a believable and relatable conflict with humanity (that disturbingly resembles a race-war, because it is one). Of course this works especially well for Sapkowski in the larger context of taking traditional fairy tales and twisting them for his own purposes.

(3) It’s impossible to get inside the head of ultimately alien creatures. As humans, we simply cannot fathom what it would be like to be a thousand-year-old elf with confidence in her immortality. How differently we would view the world.

To be fair, that’s a surmountable obstacle. We also cannot create a character who is actually every bit as complex and idiosyncratic as a real person. But we can create the illusion of the same. The same principle applies to writing about fantasy species (or alternatively, alien cultures in sci-fi settings)—we can create the illusion of unfathomable otherness.

Though crafting the illusion is possible, it’s nevertheless very difficult. It requires great care and thought to do well; otherwise, you end up with phenotypically-variant humans and nothing more.

It’s not enough to give them a culture based on human cultures, I think. If you’re going to create species that really deserve to be something other than humans, they should really feel different, probably even uncomfortable (but not necessarily frightening).

(4) Complicated Relationships. I grew up a big fan of the Shadowrun setting. One of the things that bugged me about it, though, is how it treated race. By this, I don’t mean the fact that there were Orks and Trolls and Elves and Dwarves, but the ethnic differences we tend to mean when we use the word “race” in a modern context. The Shadowrun rationale was just too simplistic.

The explanation went something like this: “Twentieth-century racism is a thing of the past. People don’t care about someone’s skin color anymore when the troll over there can crush you with his bare hands.” In other words, the existence of the alternative species of the Shadowrun world had completely subsumed “traditional” racism.

There’s no reason to believe that that would be the case even if people in our world were to suddenly turn into elves and such. There’d still be plenty of “good ol’ fashioned racism” to go round.

This is just an example of a problem that’s really inherent to all fantasy writing—the need to balance complexity with both the writer’s time and energy and the importance to the story.

(5) Monocultures. This relates closely to (4). Humans have a diversity of very different cultures, ideologies and values, but fantasy species tend to be portrayed as monolithic. This practice is most prevalent, at least in my experience, in roleplaying games. A setting may have many different human cultures for players to choose from for their characters, but only one for any character of a non-human species. Sometimes there are two or three options, but these are not terribly fleshed out and are based more on in-game bonuses than real cultural differences. The ad absurdum example, of course, is early D&D, where you could have either a class (magic user, thief, fighting man) or a “race” (like elf). That’s right, all elves are so similar that they need only the name of their species to define their abilities.

The point is, believable species must have individuation between groups and between individuals. If you’re using elves in your fantasy world, they shouldn’t all be flower-loving hippies (or, even more offensive, all be evil if they happen to have black skin). It takes extra time, yes, but if you’re going to be using fantastic species in your writing, they ought to be diverse like humans are diverse (or there had better be a good reason why they have a monolithic culture).

(Potential) Solutions

(1) Avoid the subject altogether. Just don’t use fantastic sentient creatures. Throw in all the griffons and gargoyles and what not that you want, but leave the thinking, feeling characters human.

(2) Cheat. Here’s what I mean: in Avar Narn, several of the fantastic species used to be human—they were reformed, accidentally or on purpose, willingly or not, by magic. That happened long enough ago that they’ve developed somewhat alien perspectives on existence, and certainly cultures that vary from those of most human cultures, but it leaves within them a core of humanity that somewhat eases the problem of creating an entirely alien culture—humans will definitely be able to relate to these beings on some level, but not completely. One of the reasons that I’ve chosen this path for some (but not all) of the fantastic species in Avar Narn is that it reinforces one of the setting’s themes—the horrible things that humans would do to themselves if given the power to reshape the world through magic.

(3) Be defiant. Just say, “Damn the torpedoes; full speed ahead!” and use the traditional fantasy “races” in your stories or setting. If you write those peoples in a believable and interesting way, and especially if the other aspects of your stories are well done, you won’t have to worry about most people complaining. Ignore the ones who complain anyway.

(4) Use alternate mythologies. Tolkien was a philologist, a student of (ancient) languages. He had spent a long time studying Old English and Old Norse, and he drew from Germanic mythology to create his elves and dwarves. But there are many other cultural mythologies from which could be drawn a plethora of new and interesting species to populate your fantasy world. There are plenty of authors, published and not, taking this tack, though, so move quick (this is what Miéville did in Perdido Street Station and the books that follow, for instance).

(5) Use Archetypes. Here I mean Jungian or Campbellian archetypes. I’m not sure that I buy into the whole “monomyth” thing, and I’m a little skeptical about there being a collective unconscious from which we separately derive the same concepts (fascinating as that idea is—especially for fantasy writers). But there are some very common “places” occupied in various mythologies around the world—there are “hidden folk” in both Scandinavian and Southeast Asian folklore, smith creatures in all manner of cultures, creatures to be sought for wisdom in many mythologies. So, find those common themes and, instead of drawing upon an existing mythology to find your fantastic species, create your own that fits the motif. Both dwarves and giants are associated with smithing in various European cultures, so what other type of creature might fit there?

(6) Do the twist. Take a traditional fantasy race and tweak it until it’s either an interesting and innovative take on the species or no longer resembles the original concept.

(7) Use biology. Look to how organisms develop and change based on environment and use that to create your fantastic creatures. There’s a caveat here, though—you’re creating a world that readers will be willing to suspend disbelief for, not an alternative science textbook (unless, of course, that’s your postmodern, avant garde sort of writing style), so use what you need and maintain plausibility to the extent you can, but don’t worry too much about having things perfect. And avoid the math. For the love of God, avoid the math.

(8) Create from scratch. This is not “create ex nihilo,” which humans are incapable of doing. But, if you take your building blocks from less-visited wells, you can create something that feels different and unique.

(9) Relax. At the end of the day, the important question is not whether you include or exclude elves in your fantasy—it’s whether you craft a world that seems to make sense (one that is at least internally consistent and plausible based on human experience), craft species and characters that are entertaining and interesting to read about, and tell stories that draw the reader in and make him not want them to end. Can you do that with stories that include elves? Yes. Can you do it with stories that have unique fantasy species? Of course. Could you do either badly? Absolutely. So suck it up, figure out what you want to have in your world, do your homework to create diverse and interesting inhabitants for your setting, and get writing.

Have I made my own decisions now? Not exactly, but at least I’ve given myself the kick in the pants I needed to make my final decisions and let it ride.

An Alternative Reading of the Fall

Here’s how, in my own semi-irreverent way, I would summarize the traditional, mainstream story of Adam and Eve’s Fall in the Garden of Eden:

“So you’ve got the first man, Adam, and the first woman, Eve[1], and they live together in paradise, and everything’s cool. God gives them one command. One! ‘Do not eat from the tree of the knowledge of good and evil.’ And God tells them, ‘don’t do it or you’ll die.’ And then comes along a serpent-thing. Maybe it’s a lizard, I don’t know; it’s kind of snake-like but it has legs. Either way, the serpent’s really the devil in disguise. The serpent tells Eve that it’s just fine if she eats the fruit of the tree of knowledge of good and evil and she won’t die. It tells her that, if she eats, she’ll be like God because she’ll know good from evil. So she eats some and gives some to Adam. And now they’ve disobeyed God, and that’s the first sin, and everything kind of sucks after that because they messed up. And so, we need Jesus.”

There are a few logical problems with this interpretation, common though it is.

First, while Adam and Eve do disobey a command from God, they cannot be held responsible for this. For the story to make any sense whatsoever, Adam and Eve must be without the knowledge of good and evil before they eat the fruit—otherwise, what’s the point in the first place? But if they do not understand good and evil when they disobey God, they cannot understand that what they are doing is wrong. No credible system of justice holds people culpable when they neither understood that their act was wrong nor intended to do wrong.

So, if eating the fruit from the tree of the knowledge of good and evil was the sin that caused the Fall, God has acted arbitrarily in declaring mankind to have fallen. I don’t believe that our God does anything arbitrarily. This by itself breaks the traditional interpretation.

It is undeniably true that Adam and Eve intentionally disobeyed God. I am not disputing that. But if they did not understand that disobeying God was wrong, we need to derive a different meaning from the story.

There is a second problem. If God is omnipotent and omniscient, and God did not want Adam and Eve to have knowledge of good and evil, why put the freaking tree in the Garden? I think that we have to assume that God is purposeful in God’s actions and that the tree is there for a reason.

It would not be fair of me to criticize the traditional reading of the Fall so strongly without offering an alternative explanation. I suggest that we read the story of Adam and Eve a little more mythologically[2]—as expressing a fundamental truth about the nature of existence in a way that shows rather than tells. In other words, let’s get metaphysical.

The scriptural passages make clear that Adam and Eve have free will—they have the ability to choose their actions without determinism from God. Otherwise, there is no need for God to give them a command and warning about the tree. While they cannot be held accountable for their actions prior to eating from the tree, it is nevertheless their intentional choice to disobey God.

It is safe to say, then, that the existence of Adam and Eve’s free will leads to their disobedience of and separation from God as an inevitable consequence—without having to bring moral judgment into the interpretation. That’s a profound assertion—free will is a fundamental aspect of God’s creation of humanity, but one that brings a set of problems with it.

If God’s goal for humanity is relationship, as I believe that it is, God must (at least under the laws of reality as we understand them; one can’t ever really say must of God in any truly absolute sense) give humans free will, because a meaningful relationship requires that both parties willingly agree to be in it. But with free will comes the possibility that one party will choose not to enter into relationship. Along the same lines, it means that humans may choose not to be righteous and obedient to God.

How can the lack of righteousness created by the gift of free will be resolved? This, I think, is one of the fundamental problems Christianity answers—through the grace, salvation and guidance of Jesus Christ, humanity may be both free and good.

Under this reading, the tree of the knowledge of good and evil is not a catalyst; it is a symbol. If God’s command and Adam and Eve’s subsequent disobedience are indicative of the “problem” caused by free will, then the eating of the fruit is symbolic of the fact that man and woman have a knowledge of good and evil and thus are responsible for the use of their will. In order to be able to choose to be good, they must have this knowledge; now they need a guide in the ways of righteousness. Having this knowledge also means that they are now culpable for their evil; now they need a savior to forgive their trespasses as they struggle to learn to be righteous. Both of these needs are fulfilled in the person of Jesus Christ, and this reading of the Fall sets us up to look for our savior almost from the moment of creation.

The writer of the Gospel of John tells us that Jesus is the Word, and that the Word was with God at the creation and was God. When we look back to the Fall with that knowledge in mind, we see a long-term plan from God to create beings that are free and independent of God (and thus capable of meaningful relationship with God) and to give them a path to become righteous of their own volition.

Along with this reading—to bring things into Wesleyan perspective—we might call the fundamental problem caused by the existence of free will “original sin,” that condition in which we will inevitably separate ourselves from God and creation in an effort to satisfy only ourselves. Having young children in my home, I can attest that, upon discovering the existence of the will, that is the path a child naturally follows. “Prevenient grace,” then, would be that grace of God that goes before us and allows us to see beyond our own selfish desires enough to do good and to seek after God.

My favorite theologians, Paul Tillich among them, advise that we ought not to define sin as particular acts, but instead as the condition of separation from God, self and others that results from certain acts. I think that my alternative reading of the Fall lends itself to that definition, which also fits well with what I called in a previous post the “positive morality” of Jesus—sin is what results when we fall short of the Great Commandment. This means that we must look to both the intent and the result of any act to determine whether it is sinful—we cannot simply categorize sin without context.

As I’m sure I’ll discuss in future posts, I think that the reading of the Fall I’ve offered gives us a more logical and more useful understanding of our place in the universe and of the nature of sin than the traditional view does. What do you think?

———————————————————————

[1] I said mainstream. I’m aware of all the Lilith legendry about her being the first wife of Adam, etc. While a fascinating story, it was probably generated by early attempts to syncretize the two accounts of humanity’s creation (Gen 1:26-27 and Gen 2:7, 18-25). I’m willing to chalk up the two accounts to sloppy editing, but I can’t deny the possibility that some theological insight is meant by the existence of the two differing accounts.

[2] The “curses” given by God either fit into a “traditional” mythological role—a pre-scientific attempt to explain why certain things are the way they are (why snakes have no legs, why we have to work for our food, why childbirth hurts so damn much, etc.), or a theological role—childbirth is a symbol that real creation sometimes requires pain and sacrifice, the requirement to work the land tests our choices when we exist in a world of limited resources, etc.

Avar Narn

[Hand-drawn black-and-white map of Avar Narn]

As most aspiring fantasy writers do, I think, I have had a fantasy world building itself inside my head for more than a decade, and for just as long I’ve called that world Avar Narn.

There have been major changes and reworkings to the world over time, and it is currently under what I hope is its last set of major revisions before I’m content enough (emphasis on enough; I’ve discovered, if only recently, that if I wait for it to be perfect, it will never go anywhere) to start writing seriously within the setting for publication. In fact, I hope soon to finish editing a few short stories set within the Avar that will be posted to the blog.

Avar Narn is an eclectic place, influenced as it is by a wide range of authors. It’s neither Tolkien nor Martin, but the influence of both is undeniable. There’s definitely a streak of Miéville (one of my favorites for so many reasons) and, I think, some tone if not theme borrowed (stolen, really) from Mark Smylie’s Artesia graphic novels and setting. Undoubtedly, my long experience with various roleplaying game settings has shaped my opinions about what makes a good fantasy setting. I could go on, but it’s perhaps best to let the world stand on its own rather than list a litany of writers whose work I strive toward but make no pretense of emulating.

Like Tolkien and Lewis, my faith is an important factor in the design of the world. Unlike those fine gentlemen, I prefer my theological assertions to be a bit less heavy-handed. I do not want to use the Biblical narratives as the core of my plots, nor do I want my works to come across as a form of thinly-veiled apologetics. My hope is that the ways my faith influences my worldview will come out in the types of stories I tell and the style in which I tell them–speaking to the human condition as I understand it without making the story an argument of faith.

I have lofty goals, as fantasy writers should; it will be for readers to judge how well I achieve those goals. Foremost must be the telling of great stories that delight and inspire the reader to think about life and existence. Along with this is the desire to participate in that ephemeral but satisfying practice of mythopoeia.

I am not an artist by any means, although my latest pursuits in world-building have given me the desire to become one. I’ve acquired a small Wacom tablet, some decent drawing pencils, an artist’s sketchbook, a subscription to an online drawing class and several instructional books. As my first significant effort, I’ve created the above hand-drawn map of the area of Avar Narn where most of my stories for the foreseeable future will take place. It’s not bad for a first attempt, though it could be much better, and I learned a lot while making it.

I’ll likely be posting more about Avar Narn soon, along with the promised short stories. For now, though, I hope this map takes you back to your childhood, looking at the maps in The Hobbit and The Lord of the Rings, or those of other fantasy stories, in wonder and excitement. It does me.