Wednesday, October 20, 2010

Social Software: Continuing the Story

Background
In 2004, technologist Christopher Allen wrote a short article about the development of social software since the 1940s. Back in 2004--pre-Facebook, pre-YouTube, pre-social networking--most of this software was geared towards collaborative business and productivity tools (e.g., Adobe Connect, Skype, and, in a very limited way, Second Life-like environments). At the time of writing, he posed a few open-ended questions about how such software would be characterized in 2010:

"Typically, a visionary originates a term, and a community around that visionary may (or may not) adopt it. The diaspora of the term from that point can be slow, with 10 or 15 years passing before a term is more generally adopted. Once a term is more broadly adopted, it faces the risk of becoming a marketing term, corrupted into differentiating products rather than explaining ideas.

Is 'social software', which is just now gaining wide acceptance, destined for the same trash heap of uselessness as groupware? And, if so, what impact does the changing of this terminology have on the field of social software itself?"

The Re-Write:

Although social software in the opening years of the millennium consisted primarily of tools for academic and business collaboration, the popular adoption of social software for social networking and online community building--beginning with Friendster in 2002--made it the predominant form by the end of the decade. The three most important exemplars of this type of social software to emerge by the decade's close were Facebook, Twitter, and YouTube.

Facebook, which emerged from an initially crowded social networking market that included MySpace, Orkut, and Friendster, has become the de facto social network of mainstream culture--not just in the United States, where it originated, but globally, with over 150 million users in more than 60 countries. Facebook evolved into not only a service for linking people and maintaining "weak" social relationships, but also a gaming, advertising, and entertainment platform, with all of these applications reinforcing the site's collaborative, socializing nature.

YouTube, a video sharing site, launched in 2005 and by 2010 was one of the most popular sites on the web. It regularly spurs collaborative creative efforts, often in the form of video responses (replies) to original media posted by other YouTube users. Although it falls squarely within the realm of social software, the site is primarily viewed as an entertainment destination.

Twitter, a streaming feed of short text posts ("tweets") created by individual users, is a cultural data feed that incorporates everyone from average joes to politicians, entertainers, athletes, and even fictional characters. The overall effect of the site is the universal inclusion of individual messages, organized temporally but not connected by any specific theme, narrative, or prompt.

The Future: 2020

The popular social networking sites of 2010 had already shown that traditional social norms, traditions, and mores often came into conflict with the views, attitudes, and customs that users revealed about themselves. Ultimately, these sites erode not only users' notion of personal privacy, but also society's expectation of it. This will have a revolutionary effect on public-private social norms, morality, and social transparency.

Sunday, October 17, 2010

The Game of the Year that Wasn't: Medal of Honor

Afghanistan circa 1988


The Great Pashtun Hope
There was a lot of hype about EA's new Medal of Honor game. It set series pre-sale records. EA's stock price went up--before reviews of the game came out, at least. And there was the press-so-good-you-can't-buy-it controversy surrounding the ability to play as a Taliban "Opposing Force" player, modeled as accurately as possible on the current enemy of US forces in Afghanistan. EA made much of its efforts to bring as much "reality" to the game as possible, widely publicizing its use of active duty special forces service members as consultants on the game. Early previews praised the game's rendering of the Afghan theater and its no-expense-spared sound effects and soundtrack.

Then the game was released.

To be fair, the reviews haven't been horrible. It has been holding steady at 75 on Metacritic, which is a fairly strong score. Yet every reviewer seems to lament the same foibles: a short, unimpressive single-player campaign, glitchy graphics and gameplay, simple AI, and a game that is not as good in any way, shape, or form as its major first-person-shooter contemporaries such as Call of Duty: Modern Warfare 2 or Halo: Reach.

In general, I agree with these criticisms. The campaign levels are at times claustrophobic and woefully dull. I usually felt like I was playing a video game cross between Disney's Splash Mountain and whack-a-mole, forced to follow a narrow mountain path while shooting at predictable cover-and-shoot enemies in the same manner over and over again. There were occasional graphical glitches and slow frame rates--especially during the helicopter stages. While I certainly appreciated the tactical chatter my AI-controlled squadmates provided and their attempts at utilizing actual tactical movements, they were frequently dumb enough to walk right in front of my machine gun as I was firing down some canyon or into a valley of bad guys. That was OK though, because the one new thing I did learn about special forces in Afghanistan is that they are invincible and carry unlimited ammunition at all times. A game designed to be 'accurate' can't be wrong, right?

Colonel Trautman: I'm sorry I got you into this, Johnny.
Rambo: No you're not.
--Rambo III
This brings me to my biggest criticism of the game: it's not nearly realistic enough. In addition to all the US forces in Afghanistan being super-soldiers, Afghanistan has no women, no children, no dogs--no civilians whatsoever. In the game, you and your squadmates raid numerous villages with the order to essentially shoot everything that moves. Thanks to some mysterious force called "intel," you know there are "only bad guys" in these villages, and thus shooting everything that moves is the absolutely just thing to do.


Likewise--and I'm mystified how the folks at EA could have included this in the wake of the infamous WikiLeaks video--while operating as a helicopter gunner, you can essentially decimate entire villages with combinations of machine-gun and rocket fire without so much as a mention of the potential for civilian casualties. Of course, this same "intel" is constantly being criticized by your squadmates for underestimating the number of opposing forces, the nationality of those forces, the location of those forces, and even the weather, but when it comes to identifying the civilian population of an entire province as having vacated, "intel" is spot on.


By removing the moral challenges of the Afghan conflict from the narrative of the game, the folks at EA, Danger Close, et al. have abandoned the single most challenging aspect of the contemporary military conflict they wish to convey. I could handle the dumb AI and the campaign-on-rails level design, but given all the attention to and promotion of the game's "realism" and tactical accuracy, far more effort should have been spent on creating a realistic Afghan operation.


I think the game developers were at least tangentially aware of this as well. Somewhat lazily, they included a handful of moral conundrums in the game's cinematic scenes. In one scene an enlightened colonel is overruled over teleconference by an aloof, far-away general (in civilian clothes?), leading to the massacre of an unknown number of Afghan allies. Likewise, in what is perhaps a nod to Lone Survivor, when your squad comes across a shepherd at the beginning of a level, they opt to knock him unconscious rather than shoot him. Nice touches, perhaps, but they are completely removed from the gameplay itself; the player has no opportunity to exercise their own decision-making abilities at any point in the game, nor do the acts conveyed require any sort of moral reflection on why exactly this particular war is hell.


Oddly enough, the game's insanely challenging and complex multiplayer mode does a better job of conveying the challenges of modern war. Other, human players make for a more cunning, more 'human' foe. All of a sudden, the Taliban "opposing force" players act with a degree of humanity that makes them understandable as human foes, and the voice acting in this mode, alongside the explosions and gunfire, creates an urgent, visceral experience far beyond anything the single-player campaign ever offers.


 "This film is dedicated to the gallant people of Afghanistan."
--Rambo III
Danger Close and EA close the single-player campaign with a several-paragraph dedication to the men of the US Special Forces community. It's an earnest, for some tear-jerking, attempt at memorializing the figures featured in the game. And yet, and yet... The efforts the game designers took to humanize the special forces characters--closely rendered faces, great voice acting, humanizing pre-combat rituals of rubbing rabbit feet and chewing tobacco--only highlight by contrast their inability to humanize the conflict itself. While there were moments--just moments--where I felt a recognition that I was playing/fighting in a scenario alongside the in-game equivalents of contemporary, human warfighters, I never felt that the enemies represented anything more human than Halo's Covenant aliens--perhaps even less so. Likewise, with an Afghanistan populated only by targets and target shooters, the gameplay itself removed any pretension of human conflict.


What is left is at best a spiritual sequel to the Rambo films of the 1980s. Stallone wrote those films to memorialize the forgotten military heroes of a generation that largely tried to forget the conflicts they fought in. Likewise, this game makes an effort, albeit a meek one, to bring to the popular fore a conflict that so few contemporary Americans have any connection to. Yet Stallone's films never offered anything near an accurate portrait of war or of those who fight it (later films such as Black Hawk Down, We Were Soldiers, The Thin Red Line, Saving Private Ryan, and Letters from Iwo Jima all do a far better job), and this game doesn't either; much of its earnest sentiment is lost amongst the explosions and oiled set-pieces. All that one remembers is the visceral war porn that remains.


I have higher hopes for a number of games due out in the coming weeks--Fallout: New Vegas and Fable III, to name a few. While these games do not purport to represent reality or contemporary conflict, I actually expect them to feel more human, more provoking than Medal of Honor, a game that, by design, should have been so much more.
Afghanistan circa 2005
(Apparently, all the women and children moved to Pakistan?)



Wednesday, October 13, 2010

Online Community Examination: The Economist Debates

I've passively followed some of the debates on The Economist's website for a few years now. By passively, I mean as--in web parlance--a lurker. I've never posted a comment. I have voted a few times to "agree with the motion" or "disagree with the motion," but I've never felt interested, motivated, or qualified enough to contribute a written opinion. I suppose this makes me a quasi-lurker, but I digress.

In general, the community is made up of readers of The Economist and its website (the debates have been regularly plugged in the print magazine), but it can also include guest luminaries and pundits who vary with the subject of the debate.

Purpose:


The debates are designed as an enhanced forum for traditional Oxford-style debate. A motion is proposed by the "house," and the debate is over whether to "agree" or "disagree." At the end of a series of phases, a winner is declared by the editorial staff (the "moderator"). The debates tend to focus on issues tangential to current events. As such, The Economist can present a number of related archived stories, featured agree/disagree opinions by expert or celebrity commentators, and yet another forum for expert yet objective commentary on the points made in the user forums and by the featured experts. As a whole, the debate forum as The Economist presents it allows participants to further educate themselves on the facts of the situation presented, and provides exposure to the breadth of opinions offered by the (presumably) most capable pundits and by its readership. In the classical sense, the purpose of this website is education via dialectic.

Product:


Although the site is mostly textual, it does what I feel is a fairly good job of presenting the wealth of information surrounding a debate in parallel. Although dense, the site is organized professionally, adhering to design concepts argued for by Redish, Williams, etc. That said, a fair amount of literacy and openness to moderated argument is assumed of the site's readership. While flame wars have appeared in the forums, they are far more polite than what one might find in the comments section of a CNN.com article, with most forum posts beginning "Dear Sir," or "Dear Moderator." Rarely are comments directed at specific posters. (Doing so may be against the Oxford Union-style house rules that the forum is modeled on. Regardless, it is effective.)

Process:


Debates have five parts: Overview, Opening, Rebuttal, Closing, and Post-Debate. Each of the latter four phases marks the closing of the forums for the previous part, the start of a new forum, the inclusion of new expert pundits, and additional commentary by the moderator. Forum members can change their agree/disagree votes at any time, and the votes are charted daily up until the closing phase. This is very unlike other online forums that may be moderated but are not closed off into temporal sections of discussion. While an online thread about, say, Brett Favre's love life on a Vikings fan site has the potential to go on into perpetuity, this forum is designed to actually come to some sort of moderated conclusion on a social, economic, or political issue, which makes it somewhat unique.

Has a debate ever changed my opinion? Not completely. I have never found myself transitioning from "agree" to "disagree" or vice versa, but my views on the topics presented have always ended up more nuanced, more complex than they were before I followed the debates. This house proposes that this makes the community an effective, intriguing one, both for the quality of its presentation and for the occasionally thought-provoking post in the floor forum.

Monday, October 4, 2010

The Trolley Problem


The first time I sat down and took part in an undergraduate ethics seminar (plebe year, just glad not to be in Bancroft Hall), I was greeted with cookies and the following scenario:

"Suppose you are the driver of a trolley. The trolley rounds a bend, and there come into view ahead five track workmen, who have been repairing the track...[Y]ou must stop the trolley if you are to avoid running the five men down. You step on the breaks, but, alas, they don't work. Now you suddenly see a spur of track leading off to the right. You can turn the trolley onto it and thus save the five me on the straight track ahead. Unfortunately...there is one track workman on that spur of track. He can no more get off the track in time than the five can, so you will kill him if you turn the trolley onto him.

Is it morally permissible for you to turn the trolley?" (Thomson, 1985).

And such was my introduction to the question that Philippa Foot first posed when addressing the question of abortion in her 1967 essay "The Problem of Abortion and the Doctrine of the Double Effect." Although the trolley problem's ties to Foot and the subject of her original essay have been largely forgotten, the type of thought experiment posed by the question has become a critical feature of any number of fields of study. Judith Thomson, an MIT professor and Foot's contemporary, expanded on the problem considerably, crafting all sorts of morally frustrating variations. Law schools soon latched on as well, and the problem is often put to first-year law students when the distinction between what is morally just and what is morally permissible is first introduced (perhaps to hasten the moral cynicism that all law students eventually succumb to).

Study of the trolley problem is currently of interest to neuroscience, as morality thought experiments, when posed to populations observed by fMRI scanners, have revealed how the brain performs moral reasoning and, more interestingly, how members of different gender, ethnic, and social groups may actually process such questions differently than other populations. At the forefront of this type of research is a Harvard researcher named Joshua Greene, who runs that university's Moral Cognition Lab. (See exemplar studies below.)

Similarly, psychology has found all sorts of interesting uses for variations of the trolley problem. Of recent note was a study released by Cornell research psychologist David Pizarro, who varied the names used in different descriptions of the problem to implicate racial distinctions, revealing telling inclinations of certain political groups towards favoring one ethnic group over another. (See the Wired.com summary here; article cited below.)

Those interested in these fields will find each discipline's study of the problem fascinating in its own right, but the trolley problem and its frustrating rhetorical implications have come to represent the thought experiment par excellence of our contemporary age.

(Now if we could just manage to convey the same sort of thing with a video game...)


References:

Ditto, P.H., Pizarro, D.A., & Tannenbaum. (2009). Motivated moral reasoning. Psychology of Learning and Motivation, 50, 307-338. doi:10.1016/S0079-7421(08)00410-6

Foot, P. (1967). The problem of abortion and the doctrine of double effect. Oxford Review, 5. Retrieved from http://www2.econ.iastate.edu/classes/econ362/hallam/Readings/FootDoubleEffect.pdf

Greene, J.D., Nystrom, L.E., Engell, A.D., Darley, J.M., & Cohen, J.D. (2004). The neural bases of cognitive conflict and control in moral judgment. Neuron, 44(2), 389-400. doi:10.1016/j.neuron.2004.09.027

Greene, J.D., Sommerville, R.B., Nystrom, L.E., Darley, J.M., & Cohen, J.D. (2001). An fMRI investigation of emotional engagement in moral judgment. Science 293, 2105-2108. doi: 10.1126/science.1062872.

Thomson, J.J. (1985). The trolley problem. Yale Law Journal. Retrieved from http://amirim.mscc.huji.ac.il/law_ethics/docs/Thomson.pdf

Sunday, October 3, 2010

My Netiquette Top 10

1. Don't forward chain e-mails. Ever.
My dad is the worst offender I know against this big number one netiquette rule. Because my earliest email experiences involved corresponding with him, an on-again-off-again conversation that has lasted since at least 1996, this has also been one of my most persistent annoyances. To be fair, my dad's affection for chain emails is well intentioned. I get the feeling that he has actually looked at every email or link he's ever forwarded. Likewise, I remember early in the 'internet age' when getting jokes and funny little stories (see the Darwin Awards) in your email was a real treat. Friends and colleagues took the time to discuss the most recent emails at parties and lunches. This probably doesn't happen anymore. In fact, I imagine that except for a dedicated few, most of the emails sent by my dad beginning with "FW:" end up as they do in my inbox: subject lines bold and unread into perpetuity.

2. Don't friend request your boss.
This has always seemed obvious to me, but I still see people try it all the time. That said, if your boss friends you, it's probably OK to accept (cynically mandatory, actually). Friending former bosses is passable behavior as well. On the other end of the spectrum, however, is friending someone you've just been assigned to work under but have never actually met. This is the definition of cyber awkward. Expect rejection.


3. Don't start Facebook wall posts with "Dear," "To whom it may concern:," etc. or end a post with "Love," "Sincerely," "Best wishes," etc. (Unless you're being ironic.)
Facebook posts aren't letters. They aren't even email. They are wall posts, descendants of graffiti in urban spaces. Your identity is obvious as soon as your post shows up on the wall, so there's no need to identify yourself. Any niceties on the front or back of your post make it seem like you are completely lost or suffering from momentary amnesia. The bright, electric screen and pictures of your cousin's birthday party should make it obvious you're not crafting an epistolary masterpiece, Mr. Hemingway.

4. Don't harass gamer girls.
I've been playing a lot of Halo: Reach lately. Every so often, a female gamer will be part of one of the games. The second a female voice is heard by the rest of the (male) players, the discourse, gaming, and general vibe of the experience go downhill immediately.

Exemplar comments:
-'You sound hot. Where do you live?'
-'Do you have a boyfriend?'
-'I have a girlfriend, but I'd do you.'
-'How about you give me your phone number? I'll sext you.'
-'I just killed you. That kinda turns me on. Does it turn you on?'
And so on.

This really ruins the experience for all the gamers, not just the women. There are already enough experiential barriers to getting into a competitive, relatively complex game like Halo: Reach without throwing up crude social barriers to the gaming experience as well. And while not every gamer will look like the girls in this pertinent video, the gaming experience should be as open and welcoming to everyone as possible. If nothing else, having a larger number of players simply enriches the gameplay for all.

5. People don't care about your Farmville cows. (And neither should you.)
I know Zynga is making a zillion dollars a week by incentivizing time-wasting for people around the world, but the last thing I want on my Facebook news feed is to hear that your new baby calf has diarrhea, what your mafioso crew did to some speakeasy, or that your pirates just assassinated a trove of pink unicorns. Yes, I know I can block stuff like this, but should I really have to?

6. You are not your dog, Martin Luther King Jr., or a Teenage Mutant Ninja Turtle.
Your Facebook profile picture should reflect this.

7. People do not want to be fans of your employer.
Congratulations! I'm glad you got that great new job at Target, I really am. I was tired of paying your half of the rent, anyway. But this does not mean I want to become a fan of Target, and I think you owe it to me not to suggest such a thing. Please. (By the way, you still do owe me $1200 and I'm not a fan of that, either.)

8. If you're going to post a link to an article, add a personal comment about it.
It's great that you think that Tom Friedman/Michelle Malkin/Any Huffington Post Contributor's latest diatribe is so insightful.  But why?  As Yoda once said: Posting an article does not an opinion make.

9. Don't hate (on Social Media).
I've been in more than one conversation with people who state that they 'hate, just absolutely hate' Twitter/Facebook/bloggers. And yet, when the question is asked--'have you ever actually been on Twitter/Facebook/a blog?'--the answer is almost always no.

People are entitled to opinions. But uninformed opinions... Not so much.

10. Yes, Professor, I am looking at Facebook during class.
And so is everyone else.

Although the argument is sometimes made that computers in the classroom detract from the classroom experience, in computer science classes, new media classes, and the like, computer-centric classrooms are a necessary evil. In more and more college classes these days, students are bringing their laptops (and at the very least their cell phones) into class. And all of these students have Facebook or Twitter or Gmail open while they are in class. It's a fact of contemporary life. For instructors, this simply means that your margin of attention-grabbing error is smaller than ever. The best perspective, in my opinion, is to see classroom Facebook use as a symptom of a problem, not the actual problem.

The actual problem is that you haven't sold me on the idea that what you're teaching is actually important and/or interesting. Although we all know, cognitively, that we're already paying tuition, that you're going to give us grades at the end of the semester, and that this material will be on the final exam, none of it matters when the cute girl/guy from the party last night just added us as a friend and we now have access to all their pictures from last spring break. Yes, professors, this means you are competing against co-eds in swimsuits on the beach.

Time to earn your tenure.