politics, ruminations

Good, Necessary, and Just

The wars about which there is the least dissent, both contemporary and historical, are those which are judged to have been good, necessary, and just. And though there can be extensive debate about how much any war fits any or all of these categories, it’s hard to doubt that a war that is seen as good, just, and necessary is a “better” war than one that fits none of those categories.

We can use Iraq as an example. Some would contend that America’s invasion of Iraq was none of the above. Not good, not necessary, not just. The broad consensus at the time, however, was that it was a good war, and if not a just war, at least one necessitated by the threat of weapons of mass destruction.

Goodness in war is something judged by external moral absolutes. America’s mythical neoconservatives like to fight wars against evil. In such a black-and-white world, all wars waged by America are inherently good. Even if one doesn’t believe that America is always on the side of the good, there are some clear situations where we unquestionably wage wars on the side of the good. World War II, generally seen as the most clear-cut war in history, saw the Allies fighting the good fight. It would be essentially impossible to define either the Nazis or the Japanese, both of whom believed they were racially superior and thus engaged in genocidal tactics, as much other than evil.

Necessity is perhaps more difficult to pin down than goodness. Realists, who believe in unwavering pragmatism in foreign policy, generally prefer to fight only the necessary wars. One can easily say that it is necessary to fight back when your territory has been invaded and your citizens are being killed. Leaving aside the Dalai Lama, who doubts the necessity of war even for self-defense, it’s generally acknowledged that a defensive war is a necessary war. More recently, it has also become recognized that in cases of genocide, war is necessary. It is by this belief that the Vietnamese invasion of Cambodia–ending the reign of the Khmer Rouge–was necessary, that United Nations intervention in Bosnia was necessitated, and that NATO action in Kosovo was legitimate.

This, however, gets to the final and most difficult point. When is a war just? Some liberal institutionalists believe that a war is only just if it has the blessing of the biggest international body of all: the UN. In this view, only the intervention in Bosnia was just. Because NATO intervention in Kosovo didn’t come with United Nations assent that it was good and necessary, the war was unjust. Others would say that assent from any existing multilateral institution can make a war just. Thus, intervention in Kosovo, because it was blessed by NATO, was more legitimate than intervention in Iraq, where assent came only from an ad-hoc “coalition of the willing.” As it was viewed at the time, Vietnam’s intervention in Cambodia was actually the least just of all of these; it was completely unilateral.

But most commentators now agree that Vietnam’s intervention in Cambodia was, if not just, at least good and necessary, and thus worthy of respect. Rarely is a war waged by anyone seen by the whole world as good, necessary, and just. In this respect, WWII is a widely recognized exception.

It should also be noted that a war that seems good and necessary, if not just, when it begins is not necessarily seen as such when it ends (or in historical hindsight). It’s hard to deny that America’s involvement in Vietnam, beginning with Eisenhower and not ending until the presidency of Gerald Ford, was initially seen as good and necessary. Good because Communism was broadly seen in America and the western world as irredeemably evil, necessary because without the war all of Asia would fall to the evil of Communism. Yet today–and in some quarters, at the start of the war–it’s recognized that it was neither good nor necessary. The Vietnamese may have embraced communism, but are widely seen to have been seeking only independence. And the domino theory–if one falls, the rest will too–is widely recognized as both unrealistic and silly.

Thus, in hindsight, Vietnam is seen as neither good nor necessary (it was never widely seen as just). It is thus widely seen as one of America’s lowest moments and worst wars. Wars that were not good, and not necessary, and not just are usually and understandably sources of national shame.

And though one could reasonably argue that all wars are a shame, it’s hard to deny that without at least goodness and necessity, or justness and goodness, or justness and necessity, a war truly is a shame.

personal, ruminations

Necessarily Callous

Current figures suggest that more than 22,000 perished in Myanmar (Burma) this weekend. Now the story seems to be the most consequential in the world.

Yesterday’s figures suggested that more than 350 perished in Myanmar (Burma) this weekend. Then the story seemed like a regrettable natural disaster.

There’s that old axiom, attributed to Josef Stalin, that “one death is a tragedy, one million is a statistic.” I think there’s undeniably something to that. But I also can’t deny that I’m staring in the face of two different numbers that make two very different impressions on me. In this case, 22,000 deaths are a tragedy and 350 is a statistic.

It’s an ugly truth that I willfully ignore disasters when damage estimates are small. Unless you know someone who lives near the site of a natural disaster, it’s easy to ignore all the reports of earthquakes, tornadoes, hurricanes, and tsunamis. It’s probably smart not to get too worked up over natural disasters we humans, by definition, have no ability to control. It may even be wise.

And yet I can’t escape the fact that doing so seems terribly, inhumanely callous.

People who’ve known hard labor know calluses. That toughening of the skin so that the pressure so often put upon it no longer causes injury. Perhaps even no feeling. The toughening can be unsightly, but it’s the body’s natural and necessary response to pressures that would otherwise cause tissues to rip and bleed. Given the choice between a callus and an injury requiring attention and rest, our bodies will usually choose to toughen rather than tear.

Perhaps, in our concern for the welfare of others, we need a similar amount of callousness. A similar detachment and unconcern that allows us to get on with what needs doing in our lives. That allows us to get up after hearing about five American deaths in Afghanistan, or the death of 30 Iraqis in an explosion, or 20 in a tsunami, or one in an industrial accident.

We have no time to mourn all these losses. We cannot, perhaps, spare the time and energy to consider, regret, and mourn every loss of life anywhere in the world. We cannot even spare the time and energy to mourn every loss of a fellow citizen of our country. Or even every loss of a fellow citizen of the city, province, or state in which we live. Sometimes, it seems like we don’t even have the ability to mourn those family members we lose.

I see the necessity of this callousness. I think it makes good practical sense as a means of survival. But that doesn’t make me any less disappointed to notice it within myself or others. Any less sure that it’s wrong to stare at immense loss and be unable to shed even a tear. Any less disappointed that I only see a tragedy when the death toll reaches 22,000. Any less sure that 350 is a tragedy. Any less disappointed when I overlook the tragedy of one.


Consuming and Creating

In school, Sunday’s the day when you have to make up for the procrastinating you did all weekend. Out of school, Sunday’s only the day when you recognize that you’ve done nothing all weekend.

Surely this doesn’t hold true for everyone, but my weekends tend to naturally fill themselves with consumption of media. All the things I didn’t get to read, watch, or listen to during the week become the priority during a distraction-less weekend. As such, the whole weekend can easily be consumed by the act of consuming.

If Clay Shirky’s assertions are to be believed–and I’m not saying they are–in previous decades all free time went to consuming. In the pre-radio age, it went mostly to consuming alcohol. In the television age, it went mostly to consuming sitcoms. But while Mr. Shirky’s certainly right that television’s role as the thing people do with down time is waning, consumption still plays a large part in modern existence.

Today one isn’t restricted to watching what’s on television, listening to what’s on the radio, or reading whatever printed matter is at hand, but we still consume a great deal. There’s little doubt that young people watch less television than they used to. But they also have the ability to spend hours in front of YouTube, a different-but-similar dummy box.

The most interesting contention that Mr. Shirky makes about the future is that we’ll be creating more and consuming less. It’s certainly a possible trend, but it’s doubtful that we’ll move from “all free time being devoted to TV” to “all free time being devoted to creating.” After all, one must consume things in order to create. Things created in a vacuum are usually uninteresting rehashes of painfully common ideas. (Something with which I’m intimately familiar…)

Surely one can go too far in consuming. A quick guesstimation says that of the 15 hours I was awake Sunday, 11 of them were devoted to the act of consuming media. Surely I learned a lot and laughed a lot, but by the end I had a bad case of consumption fatigue.

It’s possible that the mythical people who used to constantly watch TV in their free time never had a bout of consumption fatigue, but you can count me a doubter.

They probably didn’t combat consumption fatigue by creating, but it’s possible that they did. Because creating was a harder task, people could spend more time doing tasks that were not explicitly either. Cooking from a recipe is both an act of consumption (of the recipe) and creation (of foodstuffs). So too is knitting, sewing, or drawing while watching television an intermediate between the two. Then of course there’s running, hiking, biking, walking, and playing, all of which are neither consuming nor creating by any traditional understanding of the words.

The fact is, we’re not moving from a world of consuming to one of creating. At best, we’re shifting the balance slightly. It’s easier to create and share things today than at any time in the past. Today anyone can write a blog, edit a wiki, create digital art, or mash-up two old things.

But everyone has experienced creation fatigue as “writer’s block.” Or procrastination. Or a general feeling that “it’s just not coming.” We’ll never be able to create infinitely without encountering these roadblocks.

People, too, know consumption fatigue. Rarely do they identify it as such, but that general feeling of needing to get out of the house is one of many possible misdiagnoses of the problem. And I’d guess that it’s no more or less common today than it was in the past.

I don’t think the ratio of consuming and creating will change much in the future. Surely more will be publicly shared, but I’m not certain much more time will be spent on non-consuming behavior than has been in the past. And despite some bouts of consumption fatigue, I’m pretty sure I’m fine with that.

politics, ruminations

In Defense of Voting on Character

The presidential seal (public domain)

Law making, like many things in life, is about compromise. But the problems with which politicians must deal are not always about compromise. Some things are too important and too urgent to be dealt with adequately through endless compromises with other politicians and the public at large. Sometimes, in the course of running a country, laws are broken. Sometimes this is done willfully, sometimes through misunderstandings. I’d guess that it’s often done with a heavy heart.

I’m willing to guess that presidents seem to age so rapidly because they are so often forced to break laws or enter moral “gray areas” to do what they honestly feel is best for the country.

George W. Bush has deservedly gotten a lot of flak for all the laws that have been broken under his administration. Torture was once against the law. We now know that for at least a few years after September 11, it was an approved policy used by the administration. Breaking the cover of a CIA agent and subsequently lying about doing so was once grounds for imprisonment. Hundreds of other actions of questionable morality and legality have no doubt occurred.

When one makes laws, they do so by fighting over inches in the hope that with enough concerted effort they’ll make progress of feet or even yards. No legislator has ever reached the end of her career convinced that she made it all the way to the end zone. That she accomplished all that she set out to do. That the laws are all of the kind and character that she would like them to be.

But much that the president does is of a different type entirely. Surely he does sometimes engage in the same game of inches that is so often played under the Capitol’s rotunda, but that’s hardly his only duty. Sometimes he must authorize snooping in violation of laws, either foreign or domestic. Perhaps he authorizes the use of force without congressional approval. Sometimes, I’m sure, he must decide whether individual men live or die.

Most of this is hidden from both the American people and the world. Part of this is probably for fear of prosecution of the president or his administration, but the full extent of government knowledge or action cannot ever be publicly known. Then there is, of course, the usefulness of allowing the American people to think that the government doesn’t make decisions in private.

Commentators often claim that elections should be decided on issues alone. That middle class Americans should have voted en masse for John Kerry, Al Gore, Walter Mondale, and George McGovern. That people shouldn’t judge their presidential candidates on anything but their legislative agendas.

Ignoring the fact that presidents lack any meaningful power to pursue a legislative agenda, the fact remains that the presidency is a job that requires careful decisions in the face of hard choices. Decisions that cannot be predicted by a legislative agenda, and so must be judged by external factors. A candidate’s temper, history, or friends are legitimate ways for the American public to judge a president’s character. To determine how she will act when faced with urgent decisions for which laws provide no clear right answer.

Surely there’s less information about a person’s character in whether or not they wear a flag pin than there is in my little finger. And the fact that you once met a terrorist or were endorsed by a closed-minded bigot doesn’t count for much. But the notion that people shouldn’t be allowed to decide who they’ll vote for on anything more than a legislative agenda is patently absurd. I’d certainly rather have a president whose judgment I trusted than one who promised to legislate in my favor.


No Going Back

Sometimes it hits. It’s rarely anticipated. That desire to feel that feeling you felt in the past. Maybe it was your first day of school, or your first kiss, or your first home run. Maybe it was that night when you did that thing, or that afternoon when you did that other thing. Maybe it was just that one time that you don’t remember very well but do remember fondly.

But you’ll never feel quite that way ever again.

One could, of course, question whether you ever felt the way you remember yourself feeling. After all, memory is a flawed device that frequently deceives. It’s not only possible but likely that dinners at Grandma’s house were a little less magical than you remember them being. It’s hard to doubt that memory sometimes papers over the worst parts, colors in the bits that have faded with time, and generally makes events from your past look better than they really were.

But that’s a different matter. This is about how you’re no longer the same person you were ten years ago. If that’s true, you’re also not the same person you were five years ago. Or two years. Or a year. Or six months ago. Or three months ago. Or last month. Or last week. Or yesterday. Or 10 minutes ago. Or just a second ago.

This of course could lead us to ask, “Well, who are we anyway?” But again, that’ll have to be left to a different time.

The fact is, any feeling you had in the past was shaped by all the feelings you’d had until that moment. And the second you’ve had the feeling of first riding a roller coaster, you’ll never feel that way again. Your first experience of something colors the way you’ll experience that thing for the rest of your life. So does the second experience of it. Every experience changes your relationship to those you’ve had and those you’ll have in the future. Some of these changes are probably for the better; some may not be.

The reason you’ll never get to relive that moment again is not that you’ll never be 12 or 21 ever again. It’s because you’ve already experienced that. And then you’ve experienced other things. And so you’ll never feel precisely that way ever again.

This can be a sad thought. It’s not exactly exhilarating to think that you’ll never experience the joys of your childhood ever again. To think that you’ll never again feel the way you did.

But there’s no way to avoid it. You’ll never be that person again. You’ll never feel that way again. Time “marches on, whether we act as cowards or heroes.” We’ll never be the same again. There’s no going back.

american society, ruminations

Length and Strength

If you’ll indulge me, I’m going to try something. I’ll present the same argument three different ways. I hope that by the end, you’ll understand why.


The length of an argument is directly proportional to its strength.


Generally, the length of an argument is proportional to its strength. Barring excessively and pointlessly wordy arguments, five words are much less likely to convince than even fifty. Surely five words on taxation can energize those who already agree with you on the topic, but they’re much less likely to convince those who oppose you than is a thoroughly reasoned 500 words. There’s no denying that some may never be fully convinced, but they’re more likely to understand if they hear a thorough explanation than if they hear a sound bite.


I have this idea that the length of an argument is, generally speaking, directly proportional to its strength. That is: a long argument is far more likely to succeed in actually convincing someone to change their opinion than a short one. Now, having said that, I should add that not all arguments that are long will be strong. A long and rambling argument is a long and rambling argument. But given roughly constant rhetorical strength and skill, a short quip is likely to leave the opposition in opposition.

Consider: “A woman has a right to privacy.” If you’re for a woman’s “right to choose,” you’re probably convinced that that’s a good argument. But you won’t convince anyone standing outside an abortion clinic with a sign by such an argument. You may succeed, however, if you give them a longer explanation of how you feel that a woman should be guaranteed a safe medical procedure when she feels it is necessary. And that you also hope that it’s rarely necessary. Surely a sudden conversion is unlikely, but I find it hard to believe that it wouldn’t become more likely.

So too with the argument for “higher taxes,” which the political left in most countries desires. Couched in those terms, it turns off everyone but the most ardent supporters. But expanded to explain all the good that those taxes would empower the government to do on behalf of its citizens, people would become more likely to accept the argument. Soon, they too might take to the streets shouting “higher taxes.” Again, they’re not likely to convince many that way, but they’ll learn.

Much of people’s dissatisfaction with the “sound biting” of all cultural and political arguments exists because they understand the implicit logic of the relationship between length and strength. They understand that you’re much less likely to convince a person in a 30-second television commercial than in a 30-minute discussion. I think that implicit understanding should not only be illuminated, but expanded, so that everyone will finally come to understand the argument.


Infinite Information

Perhaps I’m the only one who hadn’t realized it before, but there are over six billion people in the world. Those people are, at any given time, in six billion different places, doing six billion different things, and thinking six billion different thoughts. That means that each second, 18 billion potential–but very inexact–data points are being generated. The number quickly gets into the trillions if we seek data related to, say, their health. Each of those people at each of those instants had different red blood cell counts, blood glucose levels, blood alcohol levels… I won’t even try to name all the possibilities.
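As a back-of-the-envelope sketch of that arithmetic (the three-points-per-person figure comes from the paragraph above; the per-person health-metric count is purely an illustrative assumption):

```python
# Rough arithmetic behind the "18 billion data points per second" figure.
# The inputs are illustrative assumptions, not measurements.

population = 6_000_000_000   # "over six billion people"
points_per_person = 3        # a place, an activity, a thought

points_per_second = population * points_per_person
print(points_per_second)     # 18000000000 -- the "18 billion" figure

# Add a modest number of health metrics per person (assumed: 200)
# and the per-second total lands in the trillions.
health_metrics = 200
with_health = population * (points_per_person + health_metrics)
print(with_health)           # 1218000000000 -- over a trillion
```

The point isn’t the exact totals; any remotely plausible inputs put the per-second count far beyond what one person could absorb in a lifetime.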

The simple reality is that in a given instant the world’s population is full of more information than a person could know in a lifetime. If we were to include information about other animals, the planet itself, or the universe, it becomes impossible to fathom the quantity of data that we could amass and know.

Even if we limit ourselves to information that is being recorded–written and stored, by people or computers–there’s more than a single person could reasonably expect to know. Even if we further limit ourselves to information that is available to us, there’s more than a single person could reasonably hope to know. Surely the internet’s done a lot of good things, but by making so much information available so easily it’s no longer possible for someone to have “read everything” within more than 100 feet of themselves. (Yes, I’m making the indefensible assumption that you’re never more than 100 feet from an internet connection.)

It’s because of thoughts like this that people often complain about “information overload.” With more people and more computers than ever before, there’s more stored information than ever before.

The problem with the idea of information overload is that it fails to distinguish between what a person “can know” and what they “want to know.” Those 18 billion or more data points available at any second offer precious little information that I actively “want to know.” Surely I’d think it was cool to know what a random person in India, Zimbabwe, France, or Paraguay was doing right now, but that’s different than those 18 billion semi-knowable data points.

Of course internet–or is it information?–skeptics maintain that people shouldn’t be able to know only those things they “want to know.” They lament that allowing that will create a world of small groupings of self-selected people who know roughly the same information and hold roughly the same biases about it.

It’s absolutely possible that a small circle of between 10 and 50 people could create enough information and media that you could spend all of your free time consuming nothing but the ideas and products of that small circle of people. This is what gives rise to fears of the mythical “echo chamber” that the internet is supposed to create.

Of course, such echo chambers existed before. Then, they were generally called “small towns” and the only means of escape were geographic. Today they can exist virtually, but the price of escape is much lower. A new website is a few clicks away, not a few hundred miles.

There has always been an infinite amount of information. Now much more of it is recorded, and thus far easier to know. The fact that there is more information recorded and accessible than ever before doesn’t mean that we’re automatically more informed than ever before, or smarter than ever before. Surely coping with all the data on the internet can be a daunting task. But the possibilities that all of this information offers are so great that I would never want to go back.


On Being Small

A tilt-shift photograph of a construction site (photo: Tub Gurnard)

I was just looking at some tilt-shift photos. For those who’ve never seen any (some samples are available here and here, and of course, above), the technique is a way to make real sights look like they are tiny models. Beyond simply being clever and looking cool, the technique can force you to look with new eyes.

Surely one of the greatest purposes of art is to force one to look at something anew and see it differently than they previously had. Tilt-shift photography does that very effectively. It baldly asks the difficult question “Why?” A picture of commuters shows a reality we understand well. A tilt-shift picture of commuters forces one to take a step back and ask why this thing is occurring.

Surely there are 100 very valid reasons. People need money to live in this modern world. The surest way to get money is to exchange your time for it. The surest way to be able to exchange your time for it is to regularly go to a place where there is a concentration of work in need of people with time. And thus: commuting.

It’s all so rational that it’s almost impossible to question. But, look at a tilt-shift photo of a model train pulling into a model train station with model people preparing to board and that whole logic becomes alien. These people look small and their world looks small and their routine looks small. They are small.

I am small.

Being small is generally taken in one of two ways. One is either comfortable or uncomfortable with it. I’m not sure which, if either, is preferable.

One can see their smallness in the face of the epic vastness of the universe as a great relief. Proof that the biggest mistakes you’ve ever made in your life are of little consequence. That time marches forward whether you succeed or fail, so there’s no use fretting much over your failings. This thinking can, however, give way to a nihilism that leaves me cold. An “it doesn’t matter” attitude can ignore the fact that what one does in their life does matter, if only because all those other people are just as small as they are.

Discomfort is the opposite of that nihilistic tendency. Truly, it says, that picture does make you look small, and consideration of the whole universe makes your failures small. But we don’t live from far away or behind tilt-shift photographs. We live in the world, among small inconsequential people on a small inconsequential planet. If everything’s small, it’s all really quite large.

This thinking can, however, give rise to concerns out-of-balance with reality. The kind of concern that makes even a small error seem like an utter catastrophe. I’m not comfortable with that either.

There is, I think, no great comfort in being incredibly small nor incredibly large. No great comfort to be found in being completely powerless or fully omnipotent. So I guess that means I’ll just try to be as medium-sized as I can be.


This One’s About Fear

Sign in the NY subway (photo: Just-Us-3)

The TV was showing Today–one of those typical morning fluff shows–when I woke up. They were talking to a fourth or fifth grader who rode the subway alone. You should probably know that I’ve (1) never ridden any subway alone and (2) never ridden a New York subway alone. But I was amazed this merited discussion, even on a fluff show–amazed that anyone thought it was scary enough to be notable.

Later, I walked the dog. We were about 50 feet–about 15 meters, for those on a more rational measurement system–away from the house when the dog stopped, as is his habit. Usually this is to sniff or pee; sometimes it’s to eat feces. Why the dog likes to eat poop is beyond me, but I only manage to stop him half the time. Realizing that he was trying again, I pulled on the leash. He was already eating it and managed to swallow.

As I pulled him across the street he started making strange noises. A coughing wheeze, though I’d never known that dogs could do either of the two. And he kept doing it. 

By now, I was worried. I thought about doing a tracheotomy. Then I realized that I didn’t know how to do a tracheotomy on a dog. For that matter, I didn’t even know what a tracheotomy was. I thought about going home, but I decided there was no solution to this problem there.

Maybe he’d stop breathing right there. Maybe he’d suffocate from trying to eat a turd. Of all the ways one can die, that would have to be among the most embarrassing and disgusting. Slowly, as I began to think about maybe trying the Heimlich maneuver–not that I know much more about that than tracheotomies–the wheezing abated.

During this whole incident I’d made up my mind that going home was still smarter than continuing this walk. I probably wouldn’t walk the dog again for another week. I thought, “If this can happen, I’d rather not risk it.” But now the dog was pulling to get going. So continue on the walk we did.

I don’t know what small fraction of the thoughts that ran through my head as this small old dog wheezed went through his. I have to assume that he was at least aware he was wheezing and struggling to breathe; past that, I don’t suspect he was much concerned. He certainly wasn’t as ready to give up on this walking idea as I was.

I’m trying to distill something from this story that doesn’t sound trite. That “you can’t live your life in fear” would be one easy conclusion to draw. And I’m completely convinced that that’s a valuable lesson people need to learn. But it feels too simplistic.

Perhaps I can end with this: fear is much different in abstraction than in reality. I can easily think it’s silly that you’re worried about a 9-year-old riding the subway, but that doesn’t make me less scared when something that causes me irrational fear comes along. Fear is usually irrational, but it never feels that way when you yourself are afraid. It’s not a perfect conclusion, but I’m afraid it’s the best one I’ve got.

metablogging, ruminations

Some Days

Some days I have nothing planned for this site and start to worry about it far too much. In worrying about it far too much, almost every idea I have feels forced. The ideas feel forced because (1) they are a little forced, and (2) this pointless stress tends to make me hyper-aware of any possible imperfection that can seep into what I’m doing. It’s not until a deadline finally appears to really be approaching quickly that I begin to accept anything that seems the least bit feasible.

Some days, yes today is one of those some days, I like to try odd devices that I wouldn’t usually use. Repetition is a favorite. I start consecutive paragraphs with the same word or sentence. In school, I learned that authors sometimes use this to emphasize a point. I just use it because it makes it easier to start the next paragraph.

Some days starting that next paragraph is the only thought in my head. Though the hardest “next paragraph” is usually the first one, it’s sometimes the third. You see, with the faintest spark of an idea the first paragraph is probably already written before one begins writing. There’s usually at least enough extra from the spark that launched the first paragraph to fill up a second. But by the third paragraph, if that idea really was just a faint spark, it’s likely that the idea’s dead.

Some days I push through that difficult third paragraph. If I can manage to make a third paragraph that feels alright, there’s a good chance that the next paragraphs will all come out all right and I’ll be able to sew the thing up into a nice enough package that I’m satisfied.

But some days that third paragraph doesn’t come. Some days the idea I had really was only a two-paragraph idea. In my time writing I’ve at least learned that a two-paragraph idea doesn’t get better if you try to make it look like an eight-paragraph idea. When teachers gave you back papers with a C or below, there was a good chance it was because you tried to write your whole paper with a few-paragraph idea. Teachers have a keen eye for ideas stretched too far.

Some days I wonder what a teacher would give me for this. This short essay whose sole excuse for over-stretching an idea is that that idea is what the whole thing is built on. From the title down through every paragraph you clearly see an idea being stretched and stretched and stretched. I think that some teachers would think it’s clever, this stretched-out idea. Others would probably give it a D and a curt note about trying harder next time.

Some day I’ll win those teachers over. Perhaps with a device like I just used there. I broke the repetition. Maybe now that teacher who gave me a D would say, “Oh, he knows he’s stretched this idea very thin. A+.”

Then again, maybe not.